K Number
K172700
Device Name
OEC One
Date Cleared
2017-11-09

(63 days)

Product Code
Regulation Number
892.1650
Panel
RA
Reference & Predicate Devices
K123603 (OEC Brivo)
Intended Use

The OEC One™ mobile C-arm system is designed to provide fluoroscopic and digital spot/film images of adult and pediatric patient populations during diagnostic, interventional, and surgical procedures. Examples of clinical applications include orthopedic, gastrointestinal, endoscopic, urologic, neurologic, critical care, and emergency procedures.

Device Description

The OEC One™ is a mobile C-arm X-ray system that provides fluoroscopic images of the patient during diagnostic, interventional, and surgical procedures, such as orthopedic, gastrointestinal, endoscopic, urologic, neurologic, critical care, and emergency procedures. These images help the physician visualize the patient's anatomy and localize clinical regions of interest. The system consists of a mobile stand with an articulating arm that supports an image display monitor (widescreen monitor) and a TechView tablet, and a "C" shaped arm.

AI/ML Overview

Based on the provided text, the device in question is the OEC One™ mobile C-arm system, which is an image-intensified fluoroscopic X-ray system. The document is a 510(k) Premarket Notification Submission, indicating that the manufacturer is seeking to demonstrate substantial equivalence to a legally marketed predicate device rather than provide evidence of a novel device's safety and effectiveness.

Therefore, the "acceptance criteria" and "study that proves the device meets the acceptance criteria" are framed within the context of demonstrating substantial equivalence to the predicate device (K123603 OEC Brivo), rather than proving the device's de novo performance against specific clinical metrics as one might expect for a new AI/CADx device.

Here's an analysis of the provided information in response to your specific questions:

1. A table of acceptance criteria and the reported device performance

The document does not present a table of specific performance acceptance criteria (e.g., sensitivity, specificity, accuracy) for a diagnostic output, as this is an imaging device rather than a diagnostic AI/CADx algorithm. Instead, the acceptance criteria are linked to demonstrating that the modified device maintains the same safety and effectiveness as the predicate device, especially considering the changes made (integration of mainframe/workstation, new display, software updates).

The "acceptance criteria" for this 510(k) appear to be:

  • Conformance to relevant safety and performance standards: the IEC 60601-1 Ed. 3 series (including IEC 60601-2-54 and IEC 60601-2-43) and all applicable 21 CFR Subchapter J performance standards.
  • Successful verification and validation: Demonstrating that the system met design input and user needs, including hazard mitigation.
  • Maintenance of comparable image quality: Assessed through engineering bench testing using anthropomorphic phantoms.
  • Compliance with software development requirements: For a "Moderate" level of concern device.
| Acceptance Criteria Category | Reported Device Performance/Evidence |
|---|---|
| Safety and Performance Standards | System tested by an NRTL and certified compliant with the IEC 60601-1 Ed. 3 series, including IEC 60601-2-54 and IEC 60601-2-43. All applicable 21 CFR Subchapter J performance standards are met. |
| Verification and Validation | Verification and validation, including hazard mitigation, were executed with results demonstrating the OEC One™ system met design input and user needs. The system was developed under GE Healthcare's Quality Management System, including design controls, risk management, and software development life cycle processes. Quality assurance measures applied: risk analysis, required reviews, design reviews, unit testing (subsystem verification), integration testing (system verification), performance testing (verification), safety testing (verification), and simulated use testing (validation). |
| Image Quality/Performance (Non-Clinical) | Additional engineering bench testing on image performance using anthropomorphic phantoms was performed. All image quality/performance testing identified for fluoroscopy in FDA's "Information for Industry: X-ray Imaging Devices - Laboratory Image Quality and Dose Assessment, Tests and Standards" was performed with acceptable results. |
| Software Compliance | Substantial equivalence was also based on software documentation for a "Moderate" level of concern device. |
| Clinical Equivalence (No Clinical Study) | "Because OEC One's modification based on the predicate device does not change the system's intended use and represent equivalent technological characteristics, clinical studies are not required to support substantial equivalence." The acceptance criterion for clinical performance was thus met by demonstrating that the modifications did not impact clinical function or safety relative to the predicate. |
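
The phantom-based bench testing cited above typically reports metrics such as contrast-to-noise ratio (CNR) measured over regions of interest in a phantom image. The submission does not include its test code; as a purely illustrative sketch (the ROI coordinates, noise model, and function name are all assumptions, not from the document), a minimal CNR computation might look like:

```python
import numpy as np

def contrast_to_noise_ratio(image, signal_roi, background_roi):
    """CNR between a signal region and a background region.

    image: 2-D array of pixel values.
    signal_roi / background_roi: (row_slice, col_slice) tuples.
    """
    signal = image[signal_roi]
    background = image[background_roi]
    noise = background.std()
    if noise == 0:
        raise ValueError("background ROI has zero variance")
    return abs(signal.mean() - background.mean()) / noise

# Synthetic "phantom": uniform noisy background with a brighter insert.
rng = np.random.default_rng(0)
phantom = rng.normal(100.0, 5.0, size=(128, 128))
phantom[40:60, 40:60] += 30.0  # simulated contrast insert

cnr = contrast_to_noise_ratio(
    phantom,
    signal_roi=(slice(40, 60), slice(40, 60)),
    background_roi=(slice(80, 120), slice(80, 120)),
)
```

With a 30-unit contrast step over noise of standard deviation 5, the measured CNR lands near 6; real phantom testing per the FDA guidance would use standardized inserts and dose conditions rather than this synthetic image.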

2. Sample size used for the test set and the data provenance (e.g. country of origin of the data, retrospective or prospective)

  • Sample Size for Test Set: Not applicable in the context of clinical images with human expert ground truth for an AI/CADx device. The testing described focuses on non-clinical engineering bench tests using anthropomorphic phantoms and system verification/validation against standards.
  • Data Provenance: The document states "Additional engineering bench testing on image performance using anthropomorphic phantoms was also performed." This implies a prospective generation of test data using physical phantoms, rather than retrospective or prospective clinical patient data. The country of origin for this testing is not explicitly stated beyond the manufacturer's location (China).

3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts (e.g. radiologist with 10 years of experience)

  • Not applicable. The testing described is primarily engineering and performance verification using phantoms and standards, not clinical image interpretation requiring expert radiologists to establish ground truth for a diagnostic task.

4. Adjudication method (e.g. 2+1, 3+1, none) for the test set

  • Not applicable, as there is no clinical image-based test set requiring human adjudication.

5. If a multi-reader, multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI vs. without AI assistance

  • No MRMC study was done. The document explicitly states: "Clinical testing: Because OEC One’s modification based on the predicate device does not change the system’s intended use and represent equivalent technological characteristics, clinical studies are not required to support substantial equivalence." This is not a study of AI assistance.

6. If a standalone (i.e. algorithm only without human-in-the-loop performance) was done

  • Not applicable. This is not an AI/CADx algorithm. The device itself is an X-ray imaging system. The software updates mentioned ("Adaptive Dynamic Range Optimization (ARDO)" and motion artifact reduction) relate to image processing within the device itself, not a separate standalone diagnostic algorithm.
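
Neither of these software features is described algorithmically in the submission. As a purely illustrative sketch of the *kind* of processing such names suggest (generic gamma tone mapping and motion-gated recursive frame averaging; every function name, parameter, and threshold here is an assumption, not GE's implementation):

```python
import numpy as np

def compress_dynamic_range(frame, gamma=0.5):
    """Generic global tone mapping (illustrative only).

    Normalizes the frame to [0, 1] and applies gamma < 1 so dark
    regions are lifted without clipping bright regions.
    """
    f = frame.astype(np.float64)
    f -= f.min()
    peak = f.max()
    if peak > 0:
        f /= peak
    return f ** gamma

def temporal_filter(prev, curr, weight=0.8, motion_thresh=0.2):
    """Recursive frame averaging that suspends averaging where the
    inter-frame difference suggests motion, to avoid ghosting/lag."""
    diff = np.abs(curr - prev)
    w = np.where(diff > motion_thresh, 0.0, weight)
    return w * prev + (1.0 - w) * curr

# Two synthetic 12-bit fluoro frames.
rng = np.random.default_rng(1)
prev_frame = compress_dynamic_range(rng.uniform(0, 4095, (64, 64)))
curr_frame = compress_dynamic_range(rng.uniform(0, 4095, (64, 64)))
filtered = temporal_filter(prev_frame, curr_frame)
```

Real fluoroscopy pipelines typically use local (spatially adaptive) compression and motion estimation rather than these global, per-pixel stand-ins.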

7. The type of ground truth used (expert consensus, pathology, outcomes data, etc)

  • The "ground truth" for the device's performance is established through:
    • Engineering benchmarks and physical phantom measurements: For image quality assessment against established standards (e.g., FDA's "Information for Industry: X-ray Imaging Devices - Laboratory Image Quality and Dose Assessment, Tests and Standards").
    • Compliance with international safety and performance standards: IEC 60601 series, 21 CFR Subchapter J.
    • Conformance to design specifications and user needs: Through verification and validation activities.

8. The sample size for the training set

  • Not applicable. This is not a machine learning or AI device that requires a training set in the conventional sense. The "software updates" mentioned are more likely based on engineering principles and signal processing than machine learning training.

9. How the ground truth for the training set was established

  • Not applicable, as there is no explicit "training set" for an AI algorithm. Software development and calibration would typically rely on engineering specifications, physical models, and potentially empirical adjustments based on performance testing.

§ 892.1650 Image-intensified fluoroscopic x-ray system.

(a) Identification. An image-intensified fluoroscopic x-ray system is a device intended to visualize anatomical structures by converting a pattern of x-radiation into a visible image through electronic amplification. This generic type of device may include signal analysis and display equipment, patient and equipment supports, component parts, and accessories.

(b) Classification. Class II (special controls). An arthrogram tray or radiology dental tray intended for use with an image-intensified fluoroscopic x-ray system only is exempt from the premarket notification procedures in subpart E of part 807 of this chapter subject to the limitations in § 892.9. In addition, when intended as an accessory to the device described in paragraph (a) of this section, the fluoroscopic compression device is exempt from the premarket notification procedures in subpart E of part 807 of this chapter subject to the limitations in § 892.9.