510(k) Data Aggregation (56 days)
REX, VERSION 3.0
REX™ 3.0 is a software package intended for viewing and manipulating DICOM-compliant medical images acquired from CT and MR scanners. REX™ 3.0 can be used for real-time image viewing, image manipulation, 3D volume rendering, virtual endoscopy, and issuance of reports.
REX™ 3.0 is a tool for 3D (three dimensional) and 2D (two dimensional) viewing and manipulation of DICOM compliant CT and MR images. The proposed software provides real-time image viewing, image manipulation, 3D volume rendering, virtual endoscopy, and issuance of reports.
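The "image manipulation" functions named above include window/level adjustment, the standard linear mapping used to display CT and MR pixel values on screen. The submission does not describe its implementation; the following is a minimal illustrative sketch of the general technique (function name and parameters are assumptions, not from the document):

```python
def apply_window_level(pixel_values, center, width):
    """Map raw pixel values to the 0-255 display range using a linear
    window/level transform, as commonly used for CT/MR image display."""
    lo = center - width / 2.0
    hi = center + width / 2.0
    out = []
    for p in pixel_values:
        if p <= lo:
            out.append(0)            # below the window: fully black
        elif p >= hi:
            out.append(255)          # above the window: fully white
        else:
            out.append(round((p - lo) / (hi - lo) * 255))
    return out

# A typical CT soft-tissue window: center 40 HU, width 400 HU
print(apply_window_level([-200, 40, 300], center=40, width=400))  # → [0, 128, 255]
```

Values below the window clamp to black and values above it to white, which is why adjusting center/width changes which tissue range is visible.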
The provided text is a 510(k) Premarket Notification Summary for the REX™ 3.0 PACS / Image Processing Software. It focuses on demonstrating substantial equivalence to predicate devices rather than providing detailed acceptance criteria and a specific study proving the device meets them in the way modern AI/medical device submissions typically do.
Based on the provided text, here's a breakdown of the information requested, with "N/A" where the information is not available in the document:
Acceptance Criteria and Device Performance
1. Table of Acceptance Criteria and Reported Device Performance
The document describes the device's performance through a comparison to predicate devices rather than against pre-defined, quantitative acceptance criteria. The "acceptance criteria" here are implicitly that the REX™ 3.0 software performs all specified functions in line with software requirements and safety standards, and is substantially equivalent to predicate devices.
| Feature/Criterion | REX™ 3.0 Reported Performance (Implicit Acceptance) |
|---|---|
| DICOM Conformance | Conforms to DICOM Version 3.0. |
| Functional Requirements | Performs all input functions, output functions, and all required actions according to the functional requirements specified in the Software Requirements Specification; confirmed by validation testing. |
| Non-Clinical Performance (Safety/Hazards) | Potential hazards identified in the Hazard Analysis are controlled by design controls, protective measures, and user warnings. No new potential safety risks identified. |
| Intended Use | Performs in accordance with its intended use (viewing and manipulating DICOM-compliant medical images from CT and MR scanners, real-time image viewing, image manipulation, 3D volume rendering, virtual endoscopy, and issuance of reports). |
| Equivalence to REX™ 1.0 | Substantially equivalent to REX™ 1.0, with the addition of MR image analysis functions and a dual-monitor setup (one monitor for image viewing, one for report viewing). |
| Equivalence to Rapidia® V 2.0 | Substantially equivalent in common features and specifications. |
| Image Sources | Supports CT and MR images (an enhancement over REX™ 1.0, which supported CT only). |
| Operating System | Operates on Windows 2000 (not Windows XP or NT, unlike Rapidia® V 2.0). |
| Multi-Planar Reformatting | Yes (an enhancement over REX™ 1.0, which lacked this). |
| Other Features (GUI, Patient Demographics, Image Communication, Image Processing, etc.) | Yes; comparable to the predicate devices for the listed features: GUI; PC platform; patient demographics; networking (TCP/IP); DICOM 3.0 compliance; PNG (lossless) image compression; annotations; 3D volume rendering; still/window/level/zoom/pan/flip image review; 2D measurements (length, area); DICOM 3.0 image input; PNG (lossless) snapshot output; standard monitor use; patient and study browser; measurement of image intensity values (ROI); standalone software; virtual endoscopy (instant access to lesions, real-time display, internal/external viewing of hollow structures); local image storage; true color; user login; preset window and level; image conversion for browser viewing; trained-physician users; volume rendering algorithms; reporting; off-the-shelf hardware; Windows 2000 OS; DICOM compatibility. |
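The DICOM conformance claim in the table refers to the DICOM 3.0 standard, whose Part 10 file format prefixes every file with a 128-byte preamble followed by the 4-byte magic `DICM`. As a minimal illustration of what a conformance check on an input file involves (this sketch is not part of the submission's validation suite), a Part 10 layout test looks like:

```python
def is_dicom_part10(path):
    """Return True if the file has the DICOM Part 10 layout:
    a 128-byte preamble followed by the 4-byte magic b'DICM'."""
    with open(path, "rb") as f:
        header = f.read(132)
    return len(header) == 132 and header[128:132] == b"DICM"

# Synthetic minimal Part 10 header for demonstration
with open("example.dcm", "wb") as f:
    f.write(b"\x00" * 128 + b"DICM")
print(is_dicom_part10("example.dcm"))  # → True
```

Full conformance covers far more (data element encoding, transfer syntaxes, SOP classes); this check only validates the outermost file signature.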
2. Sample size used for the test set and the data provenance (e.g. country of origin of the data, retrospective or prospective)
The document refers to "Validation testing" but does not specify a separate "test set" with a defined sample size for clinical or image-based performance evaluation. The "test set" is implicitly the DICOM-compliant images used during software validation, but no details are provided about their origin, number, or whether they were retrospective or prospective.
- Sample Size: N/A (Not specified as a distinct clinical test set with a quantifiable size)
- Data Provenance: N/A (Not specified)
3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts (e.g. radiologist with 10 years of experience)
N/A. The submission does not describe a process for establishing ground truth by expert consensus for a test set, as it is a PACS/image processing software focused on viewing and manipulation, not diagnostic interpretation or algorithm-based detection needing labeled ground truth in the context of an AI device. The validation is focused on software functionality and compliance.
4. Adjudication method (e.g. 2+1, 3+1, none) for the test set
N/A. Since no specific test set with expert-established ground truth is described, no adjudication method is mentioned.
5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done, and if so, the effect size of how much human readers improved with AI assistance versus without it
N/A. This is not an AI-assisted diagnostic device. It is a PACS/image processing software. Therefore, an MRMC study concerning AI assistance is not relevant or described.
6. If standalone (i.e., algorithm-only, without human-in-the-loop) performance testing was done
The device itself is described as "Standalone" software (meaning it's not embedded within a larger system). However, a "standalone algorithm performance" study related to an AI diagnostic function is not applicable here as it is not an AI diagnostic algorithm. The safety statement explicitly mentions: "Clinician interactive review/editing of data integral to use," indicating human-in-the-loop is part of its intended operational model.
7. The type of ground truth used (expert consensus, pathology, outcomes data, etc.)
N/A. Ground truth in the context of diagnostic accuracy is not discussed because this is an image viewing and manipulation software, not a diagnostic algorithm. The "truth" for its proper functioning is adherence to DICOM standards and its own software requirements specification.
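Adherence to a requirements specification is directly checkable for deterministic functions like these. For example, a 2D length measurement (a feature listed in the comparison table) must scale pixel distances by the image's PixelSpacing attribute; a hypothetical requirement-style check (the function and its name are illustrative assumptions, not from the submission) might look like:

```python
import math

def length_mm(p1, p2, row_spacing_mm, col_spacing_mm):
    """Physical distance between two pixel coordinates (row, col),
    scaled by the DICOM PixelSpacing values in millimetres."""
    d_row = (p2[0] - p1[0]) * row_spacing_mm
    d_col = (p2[1] - p1[1]) * col_spacing_mm
    return math.hypot(d_row, d_col)

# Requirement-style check: 30 pixels apart at 0.5 mm spacing is 15 mm
assert math.isclose(length_mm((0, 0), (0, 30), 0.5, 0.5), 15.0)
```

For such functions, the "ground truth" is the arithmetic stated in the specification itself, which is why validation testing can substitute for clinical truthing.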
8. The sample size for the training set
N/A. This is not an AI/machine learning device that relies on a "training set" in the context of deep learning models. The software performs deterministic image processing and viewing functions.
9. How the ground truth for the training set was established
N/A. Not applicable, as there is no training set mentioned or implied for an AI/ML model.
Summary of the Study and Device Performance:
The "study" described in K030457 is primarily a software validation and verification process to ensure the REX™ 3.0 software conforms to its design specifications, DICOM standards (Version 3.0), and relevant regulations. It is a non-clinical performance data assessment rather than a clinical trial or performance study involving patient data in a diagnostic context.
The primary method to "prove" the device meets acceptance criteria (which are largely functional and safety-based for this type of software) is through:
- Conformance to DICOM 3.0: Stated directly in the "Non-Clinical Performance Data" section.
- Validation and Verification Process: PointDx followed established procedures for software development, validation, and verification which confirm that REX™ 3.0 "performs all input functions, output functions, and all required actions according to the functional requirements specified in the Software Requirements Specification."
- Hazard Analysis: Potential hazards were identified and controlled through design, protective measures, and user warnings, concluding that REX™ 3.0 "does not result in any new potential safety risks."
- Substantial Equivalence Comparison: A detailed tabular comparison against predicate devices (REX™ 1.0 and Rapidia® V 2.0) highlights that REX™ 3.0 has similar features and functionalities, with improvements such as MR image support and multi-planar reformatting compared to REX™ 1.0, and overall equivalence in common features to Rapidia® V 2.0. This comparison implicitly serves as evidence that the device meets the "acceptance criteria" of being similar in performance and safety to already cleared devices.
In essence, the submission relies on software engineering best practices and regulatory compliance to demonstrate that the REX™ 3.0 software functions as intended and is safe, rather than a clinical study measuring diagnostic performance or accuracy against ground truth.