Search Results
Found 3 results
510(k) Data Aggregation
(142 days)
Image Suite is a stand-alone radiographic imaging software designed to perform patient registration, review, report, archive, and print patient images. The software interacts with Carestream CR or DR to receive radiographic images for review, and has the ability to receive, archive, and review DICOM images from compatible third-party modalities (for example, US, MR, CT). The software can also update Carestream CR or DR device firmware and monitor device calibration. The software serves as a web server to support web clients for image viewing. Image Suite is intended for use by radiologists and trained healthcare professionals. The software presents images to medical professionals in a convenient digital medium so they can make diagnostic and/or therapeutic decisions.
This excludes mammography applications and Tablet Viewer applications in the United States.
Image Suite software is a Picture Archiving and Communication System (PACS) that allows patient registration, image acquisition, processing, reviewing, reporting, archiving and printing of radiographic images from compatible Carestream image acquisition devices, such as digital radiography (DR) detectors or computed radiography (CR) systems. In addition to being designed to function with images acquired from Carestream compatible devices, Image Suite can receive DICOM images from compatible third-party modalities (for example, Ultrasound, MRI, CT). Third party images may be viewed and archived by Image Suite, but no changes to the raw image from the third-party device are possible.
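The submission does not describe Image Suite's internal implementation, but the review-only handling of third-party DICOM images (reading pixel data for display while leaving the stored object untouched) can be illustrated with the open-source pydicom library. This is a generic sketch, not Carestream code; the file name and the rescale handling shown are assumptions.

```python
# Generic illustration (not Image Suite code): read a third-party DICOM object
# for display without altering the stored file, using the pydicom library.
import numpy as np
import pydicom

ds = pydicom.dcmread("third_party_ct_slice.dcm")   # hypothetical file from another vendor

# Apply rescale slope/intercept if present (common for CT), for display purposes only.
pixels = ds.pixel_array.astype(np.float64)
slope = float(getattr(ds, "RescaleSlope", 1.0))
intercept = float(getattr(ds, "RescaleIntercept", 0.0))
display = pixels * slope + intercept

print(ds.Modality, display.shape, display.min(), display.max())
# The dataset is only read here, so the raw image from the third-party device is unchanged.
```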
This Image Suite submission adds features that are commercially available on other Carestream products. Image Suite utilizes the same image processing software (Eclipse) as the Carestream CR and DR devices on which the features detailed in this submission were previously cleared.
These features include:
- Companion Imaging
- Multiple Looks (subset of the looks provided in the commercially available x-ray systems)
- Smart Grid (cleared under K163157).
This submission also addresses cumulative changes made since the last 510(k) submission, K140271. Over time, minor changes have been made to the Image Suite product that have not impacted the safety or effectiveness of the product.
N/A
(196 days)
The device is designed to perform radiographic x-ray examinations on all pediatric and adult patients, in all patient treatment areas.
The DRX-Revolution Mobile X-ray System is a mobile diagnostic x-ray system that utilizes digital technology for bedside or portable exams. Key components of the system are the x-ray generator, a tube head assembly (which includes the x-ray tube and collimator) that allows for multiple axes of movement, a maneuverable drive system, and touchscreen user interface(s) for user input. The system is designed with installable software for acquiring and processing medical diagnostic images outside of a standard stationary x-ray room. It is a mobile diagnostic system intended to generate and control X-rays for examination of various anatomical regions.
The provided text describes a 510(k) premarket notification for the DRX-Revolution Mobile X-ray System, which includes changes such as the addition of Smart Noise Cancellation (SNC) functionality and compatibility with a new detector (Lux 35). The study focuses on demonstrating the substantial equivalence of the modified device to a previously cleared predicate device (DRX-Revolution Mobile X-ray System, K191025).
Here's an analysis of the acceptance criteria and the study that proves the device meets them, based on the provided information:
1. A table of acceptance criteria and the reported device performance
| Acceptance Criteria (for SNC) | Reported Device Performance |
|---|---|
| At least 99% of all image pixels were within ± 1 pixel value | Achieved. The results demonstrated that at least 99% of all image pixels were within ± 1 pixel value. |
| Absolute maximum difference across all test images should be ≤ 10-pixel values | Achieved. The absolute maximum difference seen across all test images was 3-pixel values, meeting the acceptance criterion of a maximum allowable difference of 10-pixel values. |
| Noise ratio values computed for every pixel of the test images should be < 1.0 | Achieved. All noise ratio values computed for every pixel of the test images were less than 1.0, indicating that the difference in SNC noise fields between the Evolution and Revolution systems was less than the expected system noise. |
| No perceptible differences visually when compared at 200% magnification using flicker comparison | Achieved. Processed images on both systems were visually compared on a diagnostic monitor using flicker comparison, and no perceptible differences were observed when compared at 200% magnification. |
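The submission does not spell out the computation behind these criteria, but they map naturally onto a per-pixel comparison of co-registered test images from the two systems. The following is a minimal sketch, assuming aligned images of equal size and a known expected-noise map (the document does not define the noise-ratio denominator precisely, so that part is an assumption):

```python
# Sketch of the three quantitative SNC acceptance checks from the table above.
# Assumptions: the Evolution and Revolution test images are co-registered arrays
# of identical shape, and the noise ratio is the per-pixel absolute difference
# divided by the expected system noise at that pixel.
import numpy as np

def snc_acceptance_checks(evolution_img, revolution_img, expected_noise):
    """Evaluate the pixel-difference and noise-ratio acceptance criteria."""
    a = np.asarray(evolution_img, dtype=np.float64)
    b = np.asarray(revolution_img, dtype=np.float64)
    diff = np.abs(a - b)

    pct_within_one = np.mean(diff <= 1.0)          # criterion 1: >= 99% of pixels
    max_abs_diff = float(diff.max())               # criterion 2: <= 10 pixel values
    noise_ratio = diff / np.asarray(expected_noise, dtype=np.float64)

    return {
        "pct_within_one": pct_within_one,
        "criterion_1_pass": pct_within_one >= 0.99,
        "max_abs_diff": max_abs_diff,
        "criterion_2_pass": max_abs_diff <= 10.0,
        "criterion_3_pass": bool(np.all(noise_ratio < 1.0)),   # every pixel < 1.0
    }
```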
2. Sample size used for the test set and the data provenance
- Test Set Sample Size: Not explicitly stated as a number of images or patients. The study refers to "all the test images" for pixel difference analysis and "every pixel of the test images" for noise ratio calculations, implying a comprehensive evaluation of the images used.
- Data Provenance: Not explicitly stated (e.g., country of origin, retrospective/prospective). It mentions comparing images from the "in-room system (K202441)" which is the DRX-Evolution Plus system, suggesting a controlled comparison under laboratory or simulated clinical conditions rather than real-world patient data.
3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts
- This type of information is generally relevant for studies involving human interpretation and clinical endpoints. For the technical performance evaluation of Smart Noise Cancellation (SNC) described, the "ground truth" was established through quantitative technical metrics (pixel value differences, noise ratios) and visual comparison, rather than human expert consensus on clinical diagnoses. Therefore, no human experts were explicitly used to establish ground truth in the traditional sense for this specific performance evaluation.
4. Adjudication method for the test set
- Given that the performance evaluation was based on quantitative pixel-level analysis and visual comparison by presumably trained personnel rather than clinical interpretation, an adjudication method (like 2+1 or 3+1) was not applicable or described.
5. If a multi-reader, multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI vs. without AI assistance
- No MRMC comparative effectiveness study was done or described. The study focused on the technical equivalency of the SNC feature between two systems (mobile vs. in-room) and the integration of new hardware (detector), not on the impact of AI assistance on human reader performance.
6. If a standalone (i.e., algorithm-only, without human-in-the-loop) performance evaluation was done
- Yes, a standalone performance evaluation of the SNC algorithm and its image output was effectively done. The assessment involved a "pixel-by-pixel analysis" and "noise ratio metric" to compare the output of the SNC processing on the mobile system against the in-room system. This evaluated the algorithm's performance independently of human interpretation.
7. The type of ground truth used
- The ground truth for the SNC performance evaluation was established through quantitative technical metrics and visual comparison against a known reference (the in-room system's SNC output). The reference was the cleared in-room system (DRX-Evolution Plus, K202441) with SNC, which was considered the "expected" or "gold standard" performance for SNC.
8. The sample size for the training set
- The document does not provide information on the sample size for the training set. This submission is for a modification to an existing device, specifically integrating an already cleared SNC technology (from K202441) onto a mobile platform and adding a new detector. The focus is on demonstrating the equivalence of the implementations and not on the development or training of the SNC algorithm itself.
9. How the ground truth for the training set was established
- Since information on the training set for the SNC algorithm is not provided, how its ground truth was established is not detailed in this document. It's implied that the SNC algorithm itself was developed and validated in the predicate device (DRX-Evolution Plus, K202441) submission, and the current submission leverages that existing, cleared technology.
(29 days)
The device is designed to perform radiographic x-ray examinations on all pediatric and adult patients, in all patient treatment areas.
The DRX-Revolution Mobile X-ray System is a diagnostic mobile x-ray system utilizing digital radiography (DR) technology. The system consists of a self-contained x-ray generator, image receptor(s), an imaging display, and software for acquiring medical diagnostic images outside of a standard stationary x-ray room. The DRX-Revolution system incorporates a flat-panel detector that can be used wirelessly for exams such as in-bed chest projections. The system can also be used to expose CR phosphor screens or films.
The Carestream DRX-Revolution Mobile X-ray System (K191025) underwent modifications compared to its predicate device (K120062). The primary changes include a different X-ray tube supplier, additional support for DRX Plus detectors, updated image acquisition software (ImageView), and a replaced high-voltage X-ray generator.
Here's an analysis of the acceptance criteria and the study proving adherence:
1. Table of Acceptance Criteria and Reported Device Performance:
| Acceptance Criteria | Reported Device Performance |
|---|---|
| Image Quality Equivalency | "Results of this data demonstrated that image quality on the modified DRX-Revolution Mobile X-ray System is equivalent to the device on the market." "Testing demonstrates that the modified device produces diagnostic image quality that is the same or better than the predicate." "The detectors have been tested and verified to meet the requirements for integration with the DRX-Revolution Mobile X-ray System (modified) device and the DQE/MTF data demonstrates image quality is the same as or better than the predicate." "The image quality of the modified device is at least as good as or better than that of the predicate device." |
| Safety and Effectiveness Equivalency | "The modified DRX-Revolution Mobile X-ray System is substantially equivalent to the predicate device currently cleared on the market (K120062)." "The change in X-ray tube does not significantly change the functionality of the redesigned DRX-Revolution system, nor do changes significantly affect the safety or effectiveness of the device." "The generator was verified and validated and passed all testing and demonstrates there is no significant impact on clinical functionality or performance that could significantly affect safety and effectiveness." "Risks were assessed in accordance to ISO 14971 and risk control options were implemented with safety by design principles and with a risk methodology that reduces risks as far as possible." "Results of non-clinical testing demonstrate that the modified device is as safe and as effective as the predicate device." "The subject device is expected to be safe and effective for the device indications and are substantially equivalent to the predicate." |
| Maintenance of Intended Use | "In addition, the indications for use of the modified device, as described in labeling does not change as a result of the device modification(s)." "The intended use remains unchanged." |
| Fundamental Scientific Technology Equivalency | "The modified DRX-Revolution employs the same fundamental scientific technology as the predicate device." "The fundamental scientific technology of the modified device is the same and is substantially equivalent to the predicate." |
| Hardware Components Functionality (e.g., X-ray tube) | "The change in X-ray tube does not significantly change the functionality of the redesigned DRX-Revolution system, nor do changes significantly affect the safety or effectiveness of the device." |
| Detector Integration and Performance | "The detectors have been tested and verified to meet the requirements for integration with the DRX-Revolution Mobile X-ray System (modified) device and the DQE/MTF data demonstrates image quality is the same as or better than the predicate." |
| Software Functionality (ImageView) | "No changes have been made between the DRX Carestream Evolution with ImageView (K163203) and the subject device with ImageView, other than some minor changes necessary for the software to function on the subject device. The image processing between the two devices is the same. This change has no clinical impact on image diagnosis, bench testing data demonstrates substantial equivalence." |
| Generator Performance | "The High-voltage X-ray generator has been replaced. This generator is considered a 1:1 replacement, there was no change in performance specifications. The generator was verified and validated and passed all testing and demonstrates there is no significant impact on clinical functionality or performance that could significantly affect safety and effectiveness." |
2. Sample size used for the test set and the data provenance:
- Sample Size: Not explicitly stated as a number of images or cases. The document mentions a "Phantom Imaging study."
- Data Provenance: The study was a "Phantom Imaging study," which implies the use of test phantoms rather than real patient data, typically acquired in a controlled laboratory environment. The country of origin is not specified, but given Carestream's location (Rochester, New York), it is likely the US. Because this was non-clinical bench testing comparing a modified device against an already marketed predicate, the retrospective/prospective distinction does not strictly apply.
3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts:
- This information is not provided in the document. As the study was a non-clinical "phantom imaging study" evaluating technical image quality attributes, it might not have involved human expert readers establishing diagnostic ground truth in the traditional sense. The evaluation likely relied on quantitative measurements of image quality metrics.
4. Adjudication method for the test set:
- This information is not provided as the study was a phantom imaging study focusing on technical image quality. Adjudication methods like 2+1 or 3+1 are typically used in clinical studies with human readers interpreting medical images.
5. If a multi-reader, multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI vs. without AI assistance:
- No, an MRMC comparative effectiveness study was not done. The submission explicitly states: "Clinical testing was not required to establish substantial equivalence. Bench testing was sufficient to assess the device safety and effectiveness." This device is a mobile X-ray system, not an AI-powered diagnostic software.
6. If a standalone (i.e., algorithm-only, without human-in-the-loop) performance evaluation was done:
- Yes, in a sense, a "standalone" evaluation of the device's image quality was performed through the "Phantom Imaging study" and DQE/MTF data. This tested the device's inherent capability to produce images without direct human interpretation for diagnostic purposes, focusing on technical image quality attributes rather than diagnostic accuracy.
7. The type of ground truth used:
- The ground truth used was based on technical image quality attributes such as detail, sharpness, noise, and appearance of artifacts, as evaluated through a "Phantom Imaging study" and by DQE/MTF data. This is an objective technical assessment against established metrics for image quality, rather than a clinical ground truth like pathology or expert consensus on a diagnosis.
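The filing does not describe how the DQE/MTF data were generated. For context on what an MTF measurement involves, the following is a simplified sketch of the standard edge-based derivation (edge spread function → line spread function → normalized FFT magnitude) on a synthetic edge profile; the pixel pitch and edge model are illustrative assumptions, not values from the submission.

```python
# Simplified 1-D illustration of deriving an MTF curve from an edge phantom
# profile (ESF -> LSF -> |FFT|). The submission does not describe its exact
# method or tooling; this is only the textbook approach, with a synthetic edge.
import numpy as np

pixel_pitch_mm = 0.139          # assumed detector pixel pitch, illustrative only

# Synthetic edge spread function (ESF): a blurred step edge across 256 pixels.
x = np.arange(256)
esf = 0.5 * (1 + np.tanh((x - 128) / 2.5))

# Line spread function (LSF) is the derivative of the ESF.
lsf = np.gradient(esf)

# MTF is the normalized magnitude of the Fourier transform of the LSF.
mtf = np.abs(np.fft.rfft(lsf))
mtf /= mtf[0]
freqs = np.fft.rfftfreq(lsf.size, d=pixel_pitch_mm)   # cycles/mm

# Report MTF at a few spatial frequencies of interest.
for f_target in (0.5, 1.0, 2.0):
    idx = np.argmin(np.abs(freqs - f_target))
    print(f"MTF @ {freqs[idx]:.2f} cycles/mm = {mtf[idx]:.3f}")
```

A DQE calculation would additionally combine the MTF with the measured noise power spectrum and the incident X-ray fluence, which is beyond this sketch.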
8. The sample size for the training set:
- This information is not applicable/not provided. This device is a hardware X-ray system with standard image processing software, not an AI/Machine Learning algorithm that undergoes a "training" phase with a large dataset. The "ImageView" software is a web-based application to improve usability, and its image processing is stated to be the same as previously cleared versions.
9. How the ground truth for the training set was established:
- This information is not applicable/not provided for the same reasons as point 8.