Search Results

Found 3 results

510(k) Data Aggregation

    K Number
    K213568
    Manufacturer
    Carestream Health, Inc.
    Date Cleared
    2022-03-23

    (134 days)

    Product Code
    Regulation Number
    892.1720
    Intended Use

    The device is designed to perform radiographic x-ray examinations on all pediatric and adult patients, in all patient treatment areas.

    Device Description

    The DRX-Rise Mobile X-ray System is a diagnostic mobile X-ray system utilizing digital radiography technology. The DRX-Rise consists of a self-contained X-ray generator, image receptor(s), imaging display and software for acquiring medical diagnostic images outside of a standard stationary X-ray room. These components are mounted on a motorized cart that is battery powered to enable the device to be driven from location to location by user interaction. The DRX-Rise system incorporates a flat-panel detector that can be used wirelessly for exams such as in-bed chest projections. The device acquires images using Carestream's clinical acquisition software platform (ImageView) and digital flat panel detectors. ImageView is considered software that is of Moderate Level of Concern and not intended for manipulation of medical images. The DRX-Rise Mobile X-ray System is designed for digital radiography (DR) with Carestream detectors.

    AI/ML Overview

    The provided document is a 510(k) premarket notification for the DRX-Rise Mobile X-ray System, asserting its substantial equivalence to a predicate device (DRX-Revolution Mobile X-ray System, K191025). The document does not describe a study involving acceptance criteria for an AI/CADe device's performance when assisting human readers or evaluating standalone AI performance.

    Instead, the document focuses on demonstrating that the DRX-Rise Mobile X-ray System itself, as a physical medical device, is substantially equivalent to an already cleared device. This is achieved through comparisons of technological characteristics and compliance with consensus standards.

    Therefore, I cannot provide the requested information regarding acceptance criteria and studies for an AI/CADe device's performance (points 2-9) because the submission does not pertain to such a device or study.

    Here's a breakdown of what can be extracted from the provided text, related to the device itself:

    1. A table of acceptance criteria and the reported device performance:

    The document doesn't present acceptance criteria in the typical "performance target" vs. "achieved performance" format for an AI/CADe. Instead, it compares the modified device's specifications to the predicate device's specifications, arguing that any differences do not impact safety or performance.

    Criterion (Feature) | Predicate Device (DRX-Revolution Mobile X-ray System, K191025) | Modified Device (DRX-Rise Mobile X-ray System, K213568) | Impact Assessment (Implicit Acceptance Criterion)
    Indications for Use | The device is designed to perform radiographic X-ray examinations on all pediatric and adult patients, in all patient treatment areas. | Same | Substantially equivalent (same indications for use is an explicit statement of acceptance)
    Imaging Device Compatibility | Digital Radiography (DR) | Same | Substantially equivalent
    Digital Radiography Imaging Device (Detector) | DRX Plus Detectors (K150766, K153142, K183245) | Same | Substantially equivalent
    X-ray Generator Rating | 32 kW | Same | Substantially equivalent
    mAs Range (Generator) | 0.1-320 mAs | 0.1-630 mAs | The DRX-Rise (modified device) provides more power in generator output. No impact to safety/performance.
    X-ray Tube Voltage Range | 40-150 kV (1 kV steps) | 40-125 kV (1 kV steps) | 40-125 kV is the most commonly used kV range in clinical imaging. No impact to safety/performance.
    X-ray Tube Model | Canon/XRR-3336X | Canon/E7242 (X / FX / GX) | Same supplier, but a different tube model is used with the modified device. No impact to safety/performance.
    X-ray Tube Focal Spot Size | 0.6 mm and 1.2 mm | 0.6 mm and 1.5 mm | The small focal spot size is the same as the predicate; the large focal spot size is 25% larger but within the expected range for clinical imaging. No impact to safety/performance or image quality.
    System Power for Charging | Single-phase AC: 50/60 Hz, 1440 VA, 100-240 V | Same | Substantially equivalent
    Application System Software (Operator Console X-ray Control) | Carestream ImageView System software with image processing capability (K191025) | Same | Substantially equivalent
    Collapsible Column | Yes | No | The column is fixed on the modified device. No impact to safety/performance.
    Column Height | 1390 mm - 2193 mm | 1930 mm (fixed column) | No impact to safety/performance.
    Column Rotation Range | +/- 270 degrees | Same | Substantially equivalent
    Travel Method | Electric motor (battery powered) | Same | Substantially equivalent

    Where a row notes a difference, the implicit acceptance criterion is that the difference has no impact on safety or performance.
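
    As a rough illustration of how the generator rows above relate (an interpretation for context, not a calculation taken from the submission): the mAs value of an exposure is the product of tube current and exposure time, and the maximum tube current is bounded approximately by the generator rating divided by the selected tube voltage, e.g.

    \[
      \text{mAs} = I_{\text{tube}}\,[\text{mA}] \times t\,[\text{s}],
      \qquad
      I_{\max} \approx \frac{P_{\text{gen}}}{U} = \frac{32\ \text{kW}}{125\ \text{kV}} \approx 256\ \text{mA}.
    \]

    So, at the same 32 kW rating, the wider 0.1-630 mAs range of the modified device corresponds to a greater total tube charge (current times time) available per exposure.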

    2. Sample size used for the test set and the data provenance: Not applicable. This submission concerns a hardware medical device, not a performance study on a test set of images. The "test set" in this context refers to the device itself being tested against its specifications and existing standards.

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts: Not applicable. Ground truth for image interpretation by experts is not relevant to this submission.

    4. Adjudication method (e.g. 2+1, 3+1, none) for the test set: Not applicable.

    5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI versus without AI assistance: Not applicable. No AI assistance is involved.

    6. If a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done: Not applicable.

    7. The type of ground truth used (expert consensus, pathology, outcomes data, etc): Not applicable. The "ground truth" for this device's acceptance is its compliance with recognized consensus standards and its functional equivalence to a predicate device.

    8. The sample size for the training set: Not applicable. There is no mention of a training set as this is not an AI/ML device submission.

    9. How the ground truth for the training set was established: Not applicable.

    K Number
    K203159
    Device Name
    Lux 35 Detector
    Manufacturer
    Carestream Health, Inc.
    Date Cleared
    2020-12-02

    (40 days)

    Product Code
    Regulation Number
    892.1680
    Intended Use

    The device is intended to capture for display radiographic images of human anatomy including both pediatric and adult patients. The device is intended for use in general projections wherever conventional screen-film systems or CR systems may be used. Excluded from the indications for use are mammography, fluoroscopy, and angiography applications.

    Device Description

    The modified DRX Plus 3543C is a scintillator-photodetector device (Solid State X-ray Imager) utilizing an amorphous silicon flat panel image sensor. The modified detector is redesigned with the intent to reduce weight and increase durability, while utilizing a non-glass substrate material and cesium iodide scintillator. The modified detector, like the predicate, is designed to interact with Carestream's DRX-1 System (K090318).

    The modified DRX Plus 3543C Detector, like the predicate, creates a digital image from the x-rays incident on the input surface during an x-ray exposure. The flat panel imager absorbs incident x-rays and converts the energy into visible light photons. These light photons are converted into electrical charge and stored in structures called "pixels." The digital value in each pixel of the image is directly related to the intensity of the incident x-ray flux at that particular location on the surface of the detector. Image acquisition software is used to correct the digital image for defective pixels and lines on the detector, perform gain and offset correction, and generate sub-sampled preview images.
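
    To make the correction chain described above concrete (offset subtraction, gain normalization, defective-pixel replacement, and sub-sampled preview generation), here is a minimal NumPy sketch. It is an illustration only, not Carestream's acquisition software: the function name, the 3x3 median replacement, and the 4x4 preview factor are assumptions.

```python
import numpy as np
from scipy.ndimage import median_filter

def correct_raw_frame(raw, dark, flat, defect_mask, preview_factor=4):
    """Illustrative flat-panel correction chain (not the vendor pipeline)."""
    frame = raw.astype(np.float32)

    # Offset correction: subtract the dark (no-exposure) frame.
    frame -= dark

    # Gain correction: normalize by a flood-field (flat) map so a uniform
    # exposure yields a uniform image; guard against division by zero.
    frame /= np.where(flat > 0, flat, 1.0)

    # Defective pixel/line correction: replace flagged pixels with the
    # median of their 3x3 neighborhood.
    frame[defect_mask] = median_filter(frame, size=3)[defect_mask]

    # Sub-sampled preview: simple block averaging.
    h, w = frame.shape
    h_c, w_c = h - h % preview_factor, w - w % preview_factor
    preview = frame[:h_c, :w_c].reshape(
        h_c // preview_factor, preview_factor,
        w_c // preview_factor, preview_factor,
    ).mean(axis=(1, 3))

    return frame, preview
```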

    AI/ML Overview

    The provided text describes a 510(k) submission for a medical device, the Lux 35 Detector, which is a digital X-ray flat panel detector. The submission aims to demonstrate substantial equivalence to a predicate device (DRX Plus 3543 Detector). The information focuses on design modifications and non-clinical testing.

    Here's an analysis of the acceptance criteria and study details based on the provided text, highlighting where information is present and where it is not:

    Device: Lux 35 Detector (Carestream Health, Inc.)

    Study Type: Non-clinical (bench) testing, specifically a Phantom Image Study, to demonstrate substantial equivalence of image quality to a predicate device.

    1. Table of Acceptance Criteria and Reported Device Performance:

    The document doesn't explicitly state "acceptance criteria" for image quality in a tabular format with pass/fail thresholds. Instead, it provides a qualitative comparison of image attributes. The closest interpretation of "acceptance criteria" is that the modified device's image quality needed to be "equivalent to just noticeably better than" the predicate.

    Acceptance Criterion (Inferred) | Reported Device Performance (Lux 35 Detector vs. Predicate)
    Image Detail Performance | Ratings for detail were "significantly greater than 0," indicating images equivalent to or better than the predicate.
    Image Sharpness Performance | Ratings for sharpness were "significantly greater than 0," indicating images equivalent to or better than the predicate.
    Image Noise Performance | Ratings for noise were "significantly greater than 0," indicating images equivalent to or better than the predicate.
    Appearance of Artifacts | Qualitative assessment; results not numerically quantified, but implied to be equivalent or better given the overall conclusion.
    DQE (Detective Quantum Efficiency) | 55% (RQA-5, 1 cycle/mm, 2.5 µGy) for the Lux 35 vs. 26% (RQA-5, 1 cycle/mm, 3.1 µGy) for the predicate. This represents "improved image quality."
    MTF (Modulation Transfer Function) | 62% (RQA-5, 1 cycle/mm) for the Lux 35 vs. 54% (RQA-5, 1 cycle/mm) for the predicate. This represents "improved image quality."
    Overall Image Quality Comparison | "Greater than 84% of all responses were rated 0 or higher in favor of the modified DRX Plus 3543C panel." "All ratings for the attributes (detail contrast, sharpness and noise) were significantly greater than 0 indicating that the modified DRX Plus 3543C images were equivalent to just noticeably better than the predicate images." "The image quality of the modified device is at least as good as or better than that of the predicate device."
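
    For reference, the DQE and MTF rows above are standard physical detector metrics measured under an RQA-5 beam quality. DQE is conventionally expressed in terms of MTF, the incident photon fluence, and the measured noise power spectrum; a commonly used form of the relation (a textbook/IEC-style definition, not quoted from the submission) is

    \[
      \mathrm{DQE}(f) \;=\; \frac{\mathrm{SNR}^2_{\mathrm{out}}(f)}{\mathrm{SNR}^2_{\mathrm{in}}(f)}
      \;=\; \frac{\bar{S}^{\,2}\,\mathrm{MTF}^2(f)}{q\,\mathrm{NPS}(f)},
    \]

    where \(\bar{S}\) is the mean large-area signal, \(q\) the incident photon fluence, and \(\mathrm{NPS}(f)\) the noise power spectrum. Under this definition, the higher DQE reported for the Lux 35 at 1 cycle/mm, obtained at a lower air kerma (2.5 µGy vs. 3.1 µGy), indicates more efficient use of the incident x-ray quanta, which is the basis for the "improved image quality" statement.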

    2. Sample Size Used for the Test Set and Data Provenance:

    • Sample Size: Not explicitly stated. The text mentions "a Phantom Image Study" but does not quantify the number of images or runs.
    • Data Provenance: This was a non-clinical bench testing study using phantoms. Therefore, there is no patient data or geographical provenance. The study was likely conducted at Carestream's facilities. It is a prospective study in the sense that the testing was performed specifically for this submission.

    3. Number of Experts Used to Establish Ground Truth for the Test Set and Qualifications of Experts:

    • Number of Experts: Not specified. The text mentions "Greater than 84% of all responses were rated 0 or higher," implying a group of evaluators, but their number is not provided.
    • Qualifications of Experts: Not specified. It's unclear if these were radiologists, imaging scientists, or other relevant personnel.

    4. Adjudication Method for the Test Set:

    • Adjudication Method: Not specified. The phrase "Greater than 84% of all responses were rated 0 or higher" suggests individual ratings were collected, but how conflicts or multiple ratings were aggregated or adjudicated is not detailed.
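
    As an illustration of how such responses could be summarized (hypothetical numbers and method, since the submission publishes neither the raw ratings nor the statistical test): collect one comparative rating per response, where 0 means the images were judged equivalent and positive values favor the modified detector, then report the proportion of non-negative responses and test whether the mean rating is significantly greater than 0.

```python
import numpy as np
from scipy import stats

# Hypothetical comparative ratings, one per reader response:
# 0 = equivalent, >0 favors the modified detector, <0 favors the predicate.
ratings = np.array([0, 1, 1, 0, 2, 0, 1, -1, 0, 1, 2, 0, 1, 0, 1, 1])

# Proportion of responses rated 0 or higher (the ">84%"-style summary).
prop_nonnegative = np.mean(ratings >= 0)

# One-sided test of whether the mean rating exceeds 0; a Wilcoxon
# signed-rank test would be the usual non-parametric alternative
# for ordinal ratings.
result = stats.ttest_1samp(ratings, popmean=0, alternative="greater")

print(f"responses rated 0 or higher: {prop_nonnegative:.0%}")
print(f"one-sided p-value (mean rating > 0): {result.pvalue:.4f}")
```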

    5. If a Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study Was Done:

    • Answer: No. The study was a "Phantom Image Study" focused on technical image quality attributes, not human reader performance.
    • Effect Size of Human Readers: Not applicable, as no MRMC study was performed.

    6. If a Standalone (i.e., algorithm only without human-in-the-loop performance) Was Done:

    • Answer: Yes, in a sense. DQE and MTF are standalone technical performance metrics of the detector itself, independent of human interpretation. The Phantom Image Study likewise evaluates the output of the device (images) on technical attributes rather than on a human diagnostic task.

    7. The Type of Ground Truth Used:

    • Type of Ground Truth: For the phantom image study, the "ground truth" for evaluating image quality attributes (detail, sharpness, noise, artifacts) is based on technical image quality metrics (DQE, MTF) and potentially expert consensus on visual assessments of phantom images against known ideal phantom characteristics. It is not based on patient outcomes, pathology, or clinical diagnoses.

    8. The Sample Size for the Training Set:

    • Sample Size for Training Set: Not applicable. This device is a hardware component (X-ray detector) and the study described is a non-clinical evaluation of its image quality, not an AI/algorithm that requires a training set of data.

    9. How the Ground Truth for the Training Set Was Established:

    • Ground Truth Establishment for Training Set: Not applicable, as this is not an AI/algorithm that requires a training set.

    K Number
    K183245
    Date Cleared
    2019-02-08

    (79 days)

    Product Code
    Regulation Number
    892.1680
    Reference & Predicate Devices
    Reference Devices: K180809

    Intended Use

    The device is intended to capture for display radiographic images of human anatomy including both pediatric and adult patients. The device is intended for use in general projection radiographic applications wherever conventional screen-film systems or CR systems may be used. Excluded from the indications for use are mammography, fluoroscopy, and angiography applications.

    Device Description

    The Carestream DRX-1 System is a diagnostic imaging system utilizing digital radiography (DR) technology that is used with diagnostic x-ray systems. The system consists of the Carestream DRX-1 System Console (operator console), flat panel digital imager (detector), and optional tether interface box. The system can be configured to register and use either of the two new DRX Plus 2530 and DRX Plus 2530C Detectors. Images captured with a flat panel digital detector can be communicated to the operator console via tethered or wireless connection.

    AI/ML Overview

    Here's a breakdown of the acceptance criteria and the study proving the device meets them, based on the provided text:

    1. Table of Acceptance Criteria and Reported Device Performance

    The document mentions that predefined acceptance criteria were met for a range of aspects. While specific numeric targets for all criteria are not explicitly stated, the summary indicates successful performance against these criteria.

    Acceptance Criteria Category | Specific Criteria (where mentioned) | Reported Device Performance
    Non-Clinical (Bench) Testing | Weight | Met predefined acceptance criteria; demonstrated safety, effectiveness, and performance as good as or better than the predicate.
    Non-Clinical (Bench) Testing | Pixel size | Met predefined acceptance criteria; demonstrated safety, effectiveness, and performance as good as or better than the predicate.
    Non-Clinical (Bench) Testing | Resolution | Met predefined acceptance criteria; demonstrated safety, effectiveness, and performance as good as or better than the predicate.
    Non-Clinical (Bench) Testing | Pixel pitch | Met predefined acceptance criteria; demonstrated safety, effectiveness, and performance as good as or better than the predicate.
    Non-Clinical (Bench) Testing | Total pixel area | Met predefined acceptance criteria; demonstrated safety, effectiveness, and performance as good as or better than the predicate.
    Non-Clinical (Bench) Testing | Usable pixel area | Met predefined acceptance criteria; demonstrated safety, effectiveness, and performance as good as or better than the predicate.
    Non-Clinical (Bench) Testing | MTF (at various spatial resolutions) | Met predefined acceptance criteria; demonstrated safety, effectiveness, and performance as good as or better than the predicate.
    Non-Clinical (Bench) Testing | DQE (at various spatial resolutions) | Met predefined acceptance criteria; demonstrated to deliver quality images equivalent to the predicate.
    Non-Clinical (Bench) Testing | Sensitivity | Met predefined acceptance criteria; demonstrated to deliver quality images equivalent to the predicate.
    Non-Clinical (Bench) Testing | Ghosting | Met predefined acceptance criteria; demonstrated safety, effectiveness, and performance as good as or better than the predicate.
    Non-Clinical (Bench) Testing | Boot-up time | Met predefined acceptance criteria; demonstrated safety, effectiveness, and performance as good as or better than the predicate.
    Non-Clinical (Bench) Testing | Operating temperature | Met predefined acceptance criteria; demonstrated safety, effectiveness, and performance as good as or better than the predicate.
    Non-Clinical (Bench) Testing | Exposure latitude | Met predefined acceptance criteria; demonstrated safety, effectiveness, and performance as good as or better than the predicate.
    Non-Clinical (Bench) Testing | Signal uniformity | Met predefined acceptance criteria; demonstrated safety, effectiveness, and performance as good as or better than the predicate.
    Non-Clinical (Bench) Testing | Dark noise (ADC) | Met predefined acceptance criteria; demonstrated safety, effectiveness, and performance as good as or better than the predicate.
    Non-Clinical (Bench) Testing | Image quality | Demonstrated to deliver quality images equivalent to the predicate.
    Non-Clinical (Bench) Testing | Intended use | Conformed to specifications.
    Non-Clinical (Bench) Testing | Workflow-related performance | Conformed to specifications.
    Non-Clinical (Bench) Testing | Shipping performance | Conformed to specifications.
    Non-Clinical (Bench) Testing | General functionality and reliability (hardware and software) | Conformed to specifications.
    Clinical Study (Reader Study) | Diagnostic image quality (RadLex rating) | Mean RadLex rating for both the subject devices and the predicate device was "Diagnostic (3)" with very little variability.
    Clinical Study (Reader Study) | Equivalence to predicate device | Statistical tests confirmed equivalence between the mean ratings of the subject devices and the predicate, and equivalence in beam detect mode ("On" and "Off").
    Clinical Study (Reader Study) | Percentage of Diagnostic/Exemplary images (subject devices) | 98% of DRX Plus 2530 responses and 96% of DRX Plus 2530C responses were Diagnostic (3) or Exemplary (4).
    Clinical Study (Reader Study) | Comparative performance to predicate (subject vs. predicate) | 71% of DRX Plus 2530 responses and 68% of DRX Plus 2530C responses were equivalent to or favored the predicate.
    Regulatory Compliance & Safety | Conformance to specifications | Conformed to its specifications.
    Regulatory Compliance & Safety | Safety and effectiveness | Demonstrated to be as safe and effective as the predicate.
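
    The "statistical tests confirmed equivalence" row above does not name the method used. A common way to demonstrate equivalence of mean reader ratings is the two one-sided tests (TOST) procedure; the sketch below uses hypothetical paired RadLex ratings and an assumed equivalence margin, since neither the raw ratings nor the margin is reported in the text.

```python
import numpy as np
from scipy import stats

# Hypothetical paired 4-point RadLex ratings for the same cases read on
# the subject and predicate detectors (illustrative values only).
subject   = np.array([3, 3, 4, 3, 3, 4, 3, 3, 3, 4, 3, 3])
predicate = np.array([3, 3, 3, 3, 4, 4, 3, 3, 3, 3, 3, 3])
delta = 0.5  # assumed equivalence margin on the rating scale

diff = subject - predicate

# TOST: equivalence is concluded when the mean difference is shown to be
# both significantly greater than -delta and significantly less than +delta.
lower = stats.ttest_1samp(diff, popmean=-delta, alternative="greater")
upper = stats.ttest_1samp(diff, popmean=+delta, alternative="less")
p_tost = max(lower.pvalue, upper.pvalue)

print(f"mean rating difference: {diff.mean():+.2f}")
print(f"TOST p-value: {p_tost:.4f} (equivalent at alpha=0.05: {p_tost < 0.05})")
```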

    2. Sample Size Used for the Test Set and Data Provenance

    • Sample Size for Test Set: 162 acquired images (cadaver and pediatric phantom).
    • Data Provenance: The images were acquired at the University of Rochester Medical Center in Rochester, NY, using two adult cadavers and pediatric phantoms. This indicates prospective data acquisition performed specifically for the study.

    3. Number of Experts Used to Establish the Ground Truth for the Test Set and Qualifications

    • Number of Experts: Three (3)
    • Qualifications: Board-certified radiologists (no specific years of experience are mentioned).

    4. Adjudication Method for the Test Set

    The document states that the images were "evaluated by three (3) board certified radiologists using a graduated 4 point scale based on diagnostic image quality." However, it does not explicitly describe an adjudication method (such as 2+1 or 3+1 consensus). It appears that individual ratings were collected and the mean RadLex rating was then calculated, implying that each individual rating contributed rather than a formal consensus being reached for each image.

    5. If a Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study Was Done, and Effect Size

    • Was an MRMC study done? Yes, a reader study was performed comparing the investigational devices (DRX Plus 2530 and DRX Plus 2530C Detectors) to the predicate device (DRX Plus 3543 Detector) using three board-certified radiologists.
    • Effect size of human reader improvement with AI versus without AI assistance: Not applicable, as the study described is for a digital radiography detector system, not an AI-assisted diagnostic tool. The study focuses on comparing the image quality of different hardware detectors.

    6. If a Standalone (Algorithm Only Without Human-in-the-Loop Performance) Was Done

    This is not applicable as the device is a hardware detector system, not a standalone algorithm. The study evaluated the image quality produced by the hardware, which then humans interpret.

    7. The Type of Ground Truth Used

    The ground truth for the clinical study was established through expert consensus (implicit in the rating by multiple radiologists) on the "diagnostic image quality" using a RadLex rating scale. It's important to note this is not "pathology" or "outcomes data" but rather a subjective assessment of image quality by qualified experts.

    8. The Sample Size for the Training Set

    The document does not provide information regarding a training set size. This is expected as the submission is for a hardware device (detector), and the testing described focuses on its performance characteristics and image quality, not the training of an AI algorithm.

    9. How the Ground Truth for the Training Set Was Established

    Since no training set is mentioned (as this is a hardware device submission), this information is not applicable.
