Found 2 results

510(k) Data Aggregation

    K Number: K030457
    Device Name: REX, VERSION 3.0
    Manufacturer: POINTDX, INC
    Date Cleared: 2003-04-08 (56 days)
    Regulation Number: 892.2050

    Intended Use
    Intended Use

    REX™ 3.0 is a software package intended for viewing and manipulating DICOM-compliant medical images acquired from CT and MR scanners. REX™ 3.0 can be used for real-time image viewing, image manipulation, 3D volume rendering, virtual endoscopy, and issuance of reports.

    Device Description

    REX™ 3.0 is a tool for 3D (three dimensional) and 2D (two dimensional) viewing and manipulation of DICOM compliant CT and MR images. The proposed software provides real-time image viewing, image manipulation, 3D volume rendering, virtual endoscopy, and issuance of reports.

    AI/ML Overview

    The provided text is a 510(k) Premarket Notification Summary for the REX™ 3.0 PACS / Image Processing Software. It focuses on demonstrating substantial equivalence to predicate devices rather than providing detailed acceptance criteria and a specific study proving the device meets them in the way modern AI/medical device submissions typically do.

    Based on the provided text, here's a breakdown of the information requested, with "N/A" where the information is not available in the document:


    Acceptance Criteria and Device Performance

    1. Table of Acceptance Criteria and Reported Device Performance

    The document describes the device's performance through a comparison to predicate devices rather than against pre-defined, quantitative acceptance criteria. The "acceptance criteria" here are implicitly that the REX™ 3.0 software performs all specified functions in line with software requirements and safety standards, and is substantially equivalent to predicate devices.

    Feature/criterion and REX™ 3.0 reported performance (implicit acceptance):

    • DICOM Conformance: Conforms to DICOM Version 3.0.
    • Functional Requirements: Performs all input functions, output functions, and all required actions according to the functional requirements specified in the Software Requirements Specification. Validation testing confirmed this.
    • Non-Clinical Performance (Safety/Hazards): Potential hazards identified in the Hazard Analysis are controlled by design controls, protective measures, and user warnings. No new potential safety risks were identified.
    • Intended Use: Performs in accordance with its intended use (viewing and manipulating DICOM-compliant medical images from CT and MR scanners; real-time image viewing, image manipulation, 3D volume rendering, virtual endoscopy, and issuance of reports).
    • Equivalence to REX™ 1.0: Substantially equivalent to REX™ 1.0, with the addition of MR image analysis functions and a dual-monitor setup (one monitor for image viewing, one for report viewing).
    • Equivalence to Rapidia® V 2.0: Substantially equivalent in common features and specifications.
    • Image Sources: Supports CT and MR images (an enhancement over REX™ 1.0, which supported CT only).
    • Operating System: Operates on Windows 2000 (not Windows XP or NT, unlike Rapidia® V 2.0).
    • Multi-Planar Reformatting: Yes (an enhancement over REX™ 1.0, which did not have this).
    • Other Features (GUI, Patient Demographics, Image Communication, Image Processing, etc.): Yes (comparable to predicate devices for these listed features: GUI, platform (PC), patient demographics, networking (TCP/IP), DICOM 3.0 compliance, PNG (lossless) image compression, annotations, 3D volume rendering, still/window/level/zoom/pan/flip for image review, 2D measurements (length, area), DICOM 3.0 image input, PNG (lossless snapshot) image output, standard monitor use, patient and study browser, measurement of image intensity values (ROI), standalone software, virtual endoscopy (instant access to lesions, real-time display, internal/external viewing of hollow structures), local image storage, true color, user login, preset window and level, image conversion for browser viewing, trained physician users, volume rendering algorithms, reporting, off-the-shelf hardware, Windows 2000 OS, and DICOM compatibility).
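    Among the review tools listed above, window/level is the core display operation: it maps raw scanner values into a viewable brightness range around a chosen center (level) and width (window). The submission gives no implementation detail, so the following is only an illustrative sketch (the function name, the 8-bit output range, and the preset values are assumptions, not from the document):

```python
def apply_window_level(pixels, center, width):
    """Linearly map raw pixel values into the 0-255 display range.

    Values at or below (center - width/2) clamp to 0, values at or
    above (center + width/2) clamp to 255, and values in between are
    scaled linearly -- the standard window/level (center/width) idea.
    """
    lo = center - width / 2.0
    hi = center + width / 2.0
    out = []
    for p in pixels:
        if p <= lo:
            out.append(0)
        elif p >= hi:
            out.append(255)
        else:
            out.append(round((p - lo) / (hi - lo) * 255))
    return out

# A common CT soft-tissue preset: center 40, width 400 (illustrative).
print(apply_window_level([-1000, 40, 1000], center=40, width=400))
```

    The "preset window and level" feature in the same list amounts to shipping a few such (center, width) pairs for common tissue types.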

    2. Sample size used for the test set and the data provenance (e.g. country of origin of the data, retrospective or prospective)

    The document refers to "Validation testing" but does not specify a separate "test set" with a defined sample size for clinical or image-based performance evaluation. The "test set" is implicitly the DICOM-compliant images used during software validation, but no details are provided about their origin, number, or whether they were retrospective or prospective.

    • Sample Size: N/A (Not specified as a distinct clinical test set with a quantifiable size)
    • Data Provenance: N/A (Not specified)

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts (e.g. radiologist with 10 years of experience)

    N/A. The submission does not describe a process for establishing ground truth by expert consensus for a test set, as it is a PACS/image processing software focused on viewing and manipulation, not diagnostic interpretation or algorithm-based detection needing labeled ground truth in the context of an AI device. The validation is focused on software functionality and compliance.

    4. Adjudication method (e.g. 2+1, 3+1, none) for the test set

    N/A. Since no specific test set with expert-established ground truth is described, no adjudication method is mentioned.

    5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI assistance versus without it

    N/A. This is not an AI-assisted diagnostic device. It is a PACS/image processing software. Therefore, an MRMC study concerning AI assistance is not relevant or described.

    6. If standalone performance (i.e., algorithm-only, without human-in-the-loop) was assessed

    The device itself is described as "Standalone" software (meaning it's not embedded within a larger system). However, a "standalone algorithm performance" study related to an AI diagnostic function is not applicable here as it is not an AI diagnostic algorithm. The safety statement explicitly mentions: "Clinician interactive review/editing of data integral to use," indicating human-in-the-loop is part of its intended operational model.

    7. The type of ground truth used (expert consensus, pathology, outcomes data, etc.)

    N/A. Ground truth in the context of diagnostic accuracy is not discussed because this is an image viewing and manipulation software, not a diagnostic algorithm. The "truth" for its proper functioning is adherence to DICOM standards and its own software requirements specification.

    8. The sample size for the training set

    N/A. This is not an AI/machine learning device that relies on a "training set" in the context of deep learning models. The software performs deterministic image processing and viewing functions.

    9. How the ground truth for the training set was established

    N/A. Not applicable, as there is no training set mentioned or implied for an AI/ML model.


    Summary of the Study and Device Performance:

    The "study" described in K030457 is primarily a software validation and verification process to ensure the REX™ 3.0 software conforms to its design specifications, DICOM standards (Version 3.0), and relevant regulations. It is a non-clinical performance data assessment rather than a clinical trial or performance study involving patient data in a diagnostic context.

    The primary method to "prove" the device meets acceptance criteria (which are largely functional and safety-based for this type of software) is through:

    1. Conformance to DICOM 3.0: Stated directly in the "Non-Clinical Performance Data" section.
    2. Validation and Verification Process: PointDx followed established procedures for software development, validation, and verification which confirm that REX™ 3.0 "performs all input functions, output functions, and all required actions according to the functional requirements specified in the Software Requirements Specification."
    3. Hazard Analysis: Potential hazards were identified and controlled through design, protective measures, and user warnings, concluding that REX™ 3.0 "does not result in any new potential safety risks."
    4. Substantial Equivalence Comparison: A detailed tabular comparison against predicate devices (REX™ 1.0 and Rapidia® V 2.0) highlights that REX™ 3.0 has similar features and functionalities, with improvements such as MR image support and multi-planar reformatting compared to REX™ 1.0, and overall equivalence in common features to Rapidia® V 2.0. This comparison implicitly serves as evidence that the device meets the "acceptance criteria" of being similar in performance and safety to already cleared devices.

    In essence, the submission relies on software engineering best practices and regulatory compliance to demonstrate that the REX™ 3.0 software functions as intended and is safe, rather than a clinical study measuring diagnostic performance or accuracy against ground truth.
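    One concrete piece of the DICOM 3.0 conformance the summary leans on is the Part 10 file format itself: a conforming file begins with a 128-byte preamble followed by the magic bytes "DICM". As a hedged illustration of the kind of check a validation suite might start with (this is a general property of the standard, not a description of PointDx's actual test procedures):

```python
def looks_like_dicom(path):
    """Return True if the file carries the DICOM Part 10 signature:
    a 128-byte preamble followed by the 4-byte magic string b'DICM'.
    This is only a cheap preliminary check, not full conformance."""
    with open(path, "rb") as f:
        header = f.read(132)
    return len(header) == 132 and header[128:132] == b"DICM"
```

    Full conformance testing goes far beyond this (transfer syntaxes, information object definitions, service classes), which is why the submission cites validation against the standard as a whole rather than any single file-level check.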


    K Number: K021099
    Device Name: REX, VERSION 1.0
    Manufacturer: POINTDX, INC
    Date Cleared: 2002-06-27 (84 days)
    Regulation Number: 892.2050

    Intended Use
    Intended Use

    REX™ 1.0 is a software package intended for viewing and manipulating DICOM-compliant medical images acquired from CT scanners. REX™ 1.0 can be used for real-time image viewing, image manipulation, 3D volume rendering, virtual endoscopy, and issuance of reports.

    Device Description

    REX™ 1.0 is a tool for 3D (three dimensional) and 2D (two dimensional) viewing and manipulation of DICOM compliant CT images. The proposed software provides real-time image viewing, image manipulation, 3D volume rendering, virtual endoscopy, and issuance of reports.

    AI/ML Overview

    Here's an analysis of the provided text regarding the acceptance criteria and supporting study for the REX™ 1.0 device:

    Important Note: The provided document is a 510(k) Premarket Notification Summary from 2002. At that time, the regulatory requirements and expectations for demonstrating substantial equivalence, particularly for software devices, were different from current standards, especially for AI/ML-driven devices. This document focuses on demonstrating equivalence to a predicate device rather than presenting extensive performance studies against defined acceptance criteria in the way a modern AI/ML device would.


    1. Table of Acceptance Criteria and Reported Device Performance

    The document does not explicitly present a table of quantitative acceptance criteria and corresponding reported device performance metrics in the way modern AI/ML device submissions typically do (e.g., sensitivity, specificity, AUC, etc.). Instead, the performance is framed in terms of substantial equivalence to a predicate device and adherence to functional requirements and standards.

    Acceptance criteria (inferred from substantial equivalence and validation) and reported device performance (REX™ 1.0):

    Functional Equivalence to Predicate Device (Rapidia® V 2.0):

    • DICOM 3.0 compliance: Conforms to DICOM Version 3.0 (explicitly stated).
    • Real-time image viewing: Yes (matches predicate).
    • Image manipulation: Yes (matches predicate).
    • 3D volume rendering: Yes (matches predicate).
    • Virtual endoscopy: Yes (matches predicate); specifically, instant access to lesions by a single click, real-time display of the endoscopic view, and internal and external viewing of any hollow structures.
    • Issuance of reports: Yes (matches predicate).
    • Operating system compatibility (Windows 2000): Yes (matches predicate).
    • Patient demographics display: Yes (matches predicate).
    • TCP/IP networking: Yes (matches predicate).
    • PNG (lossless) image compression: Yes (feature present; unspecified for the predicate, but deemed acceptable for equivalence because REX™ offers a subset of features and introduces no new safety risks).
    • Annotations (marker): Yes (matches predicate).
    • Image review (still, window, level, zoom, pan, flip): Yes (matches predicate).
    • 2D measurements (length, area): Yes (matches predicate).
    • Image source: Supports CT only, unlike the predicate, which also supports MR; acceptable because REX™ offers a subset of the predicate's features.
    • Image input (DICOM 3.0): Yes (matches predicate).
    • Image output: PNG (lossless snapshots); the predicate outputs JPEG, BMP, and DICOM; considered acceptable because REX™ offers a subset of features and introduces no new safety risks.
    • Use of a standard monitor: Yes (matches predicate).
    • Patient and study browser: Yes (matches predicate).
    • Measure CT numbers (ROI): Yes (matches predicate).
    • Standalone software type: Yes (matches predicate).
    • Local image storage: Yes (matches predicate).
    • True color: Yes (matches predicate).
    • User login: Yes (unspecified for the predicate, but this is a security/access feature and does not introduce new safety risks that would preclude equivalence).
    • Preset window and level: Yes (matches predicate).
    • Image conversion (for viewing in browser): Yes (matches predicate).
    • Trained physicians as users: Yes (matches predicate).
    • Volume rendering algorithms: Yes (matches predicate).
    • Reporting algorithms: Yes (matches predicate).

    Adherence to Internal Requirements and Regulatory Practices:

    • Performance to the functional requirements specified in the SRS: Validation testing confirms that REX™ 1.0 performs all input functions, output functions, and required actions according to the Software Requirements Specification.
    • Adherence to software development practices: Followed specified Software Development Practices and a Validation and Verification Process; procedures specify the individuals responsible for developing and approving specifications, coding and testing, validation, and maintenance.
    • Hazard control: Potential hazards were identified in the Hazard Analysis and controlled by designing in controls, introducing protective measures, and warning users.
    • No new potential safety risks: Concluded that REX™ 1.0 "does not result in any new potential safety risks and performs in accordance with its intended use as well as the Rapidia® V 2.0 device currently on the market."
    • Substantial equivalence to predicate: PointDx considers the features of REX™ 1.0 substantially equivalent to the subset of features in common with Rapidia® V 2.0; the FDA ultimately agreed and determined the device to be substantially equivalent.
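    The 2D length measurement feature compared above rests on a well-defined piece of the DICOM standard: the Pixel Spacing attribute (0028,0030) gives the physical row and column spacing in millimetres, so an on-screen pixel distance converts directly to a physical length. A minimal sketch of that conversion (the function name is illustrative; nothing here comes from the submission itself):

```python
import math

def length_mm(p1, p2, pixel_spacing):
    """Physical distance in mm between two points given in
    (row, col) pixel coordinates, using the DICOM Pixel Spacing
    attribute (0028,0030): (row spacing mm, column spacing mm)."""
    row_mm, col_mm = pixel_spacing
    d_row = (p2[0] - p1[0]) * row_mm
    d_col = (p2[1] - p1[1]) * col_mm
    return math.hypot(d_row, d_col)

# A 3-pixel horizontal span at 0.5 mm/pixel spacing measures 1.5 mm.
print(length_mm((10, 10), (10, 13), (0.5, 0.5)))
```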

    The "study" proving the device meets these criteria is the non-clinical performance data and substantial equivalence comparison presented in the 510(k) summary.

    Detailed Breakdown of the Study/Evidence:

    2. Sample Size Used for the Test Set and Data Provenance:

    • Test Set Sample Size: Not explicitly stated as a separate "test set" with image counts. The validation testing mentioned refers to internal software validation. The comparison for substantial equivalence does not involve a specific clinical image dataset for performance metrics, but rather a feature-by-feature comparison against the predicate device.
    • Data Provenance: Not applicable in the context of a clinical test set. The validation would have been performed on internally generated or standard DICOM test files to verify functionality.

    3. Number of Experts Used to Establish Ground Truth for the Test Set and Qualifications:

    • Number of Experts/Qualifications: Not applicable. This submission predates the common requirement for human-in-the-loop or standalone clinical performance studies with expert ground truth for imaging software that provides basic viewing and manipulation functionalities. Substantial equivalence was primarily based on functional comparison.

    4. Adjudication Method for the Test Set:

    • Adjudication Method: Not applicable. There was no "test set" requiring adjudication in the way modern AI performance studies do. The equivalence was determined by comparing features and functionality to the predicate device.

    5. If a Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study was done, and effect size:

    • MRMC Study: No, an MRMC comparative effectiveness study was not done. The device is a "PACS / Image Processing Software" that provides viewing and manipulation tools, not an AI diagnostic aid requiring assessment of human reader improvement.

    6. If a Standalone (i.e., algorithm only without human-in-the-loop performance) was done:

    • Standalone Performance: Not applicable in the context of a diagnostic algorithm. The device itself is a standalone software, but its "performance" is about its functionality (e.g., can it render 3D, can it measure length) rather than a diagnostic output that would have standalone metrics like sensitivity/specificity. The validation testing ensured the software functions as designed.

    7. The Type of Ground Truth Used:

    • Type of Ground Truth: For the functional validation, the "ground truth" would be the software requirements specification (SRS) and DICOM 3.0 standard compliance. For the substantial equivalence, the "ground truth" was the features and specifications of the predicate device (Rapidia® V 2.0). There was no clinical ground truth (like pathology or outcomes data) used as this device is a viewing/manipulation tool, not a diagnostic algorithm.
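    The "Measure CT Numbers (ROI)" feature likewise has its "ground truth" in the DICOM standard: stored pixel values map to Hounsfield units via the Rescale Slope (0028,1053) and Rescale Intercept (0028,1052) attributes, and an ROI measurement is then a statistic over the converted values. A sketch under those DICOM semantics (the helper functions and the example intercept of -1024, a common CT convention, are assumptions, not details from the submission):

```python
def to_hounsfield(stored_values, slope=1.0, intercept=-1024.0):
    """Convert stored pixel values to Hounsfield units (CT numbers)
    using DICOM Rescale Slope (0028,1053) and Rescale Intercept
    (0028,1052): HU = stored * slope + intercept."""
    return [v * slope + intercept for v in stored_values]

def roi_mean(values):
    """Mean intensity over a region of interest."""
    return sum(values) / len(values)

# With a typical intercept of -1024, a stored value of 1024 maps to
# 0 HU (water); nearby soft tissue reads slightly above it.
print(roi_mean(to_hounsfield([1024, 1024, 1030])))
```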

    8. The Sample Size for the Training Set:

    • Training Set Sample Size: Not applicable. REX™ 1.0 is described as PACS/Image Processing Software. It's a deterministic software program for viewing and manipulation, not an AI/ML algorithm that requires a "training set" in the modern sense.

    9. How the Ground Truth for the Training Set was Established:

    • Ground Truth for Training Set: Not applicable, as there is no "training set" for this type of software.