Search Results

Found 2 results

510(k) Data Aggregation

    K Number
    K182626
    Device Name
    OEC One
    Date Cleared
    2018-11-16

    (53 days)

    Product Code
    Regulation Number
    892.1650
    Reference & Predicate Devices
    Predicate For
    Intended Use

    The OEC One™ mobile C-arm system is designed to provide fluoroscopic and digital spot images of adult and pediatric patient populations during diagnostic, interventional, and surgical procedures. Examples of a clinical application may include: orthopedic, gastrointestinal, endoscopic, urologic, neurologic, vascular, critical care, and emergency procedures.

    Device Description

    The OEC One™ is a mobile C-arm x-ray system to provide fluoroscopic images of the patient during diagnostic, interventional, and surgical procedures such as orthopedic, gastrointestinal, endoscopic, urologic, vascular, neurologic, critical care, and emergency procedures. These images help the physician visualize the patient's anatomy and localize clinical regions of interest. The system consists of a mobile stand with an articulating arm attached to it to support an image display monitor (widescreen monitor) and a TechView tablet, and a "C" shaped apparatus that has an image intensifier on the top of the C-arm and the X-ray Source assembly at the opposite end.

    The OEC One™ is capable of performing linear motions (vertical, horizontal) and rotational motions (orbital, lateral, wig-wag) that allow the user to position the X-ray image chain at various angles and distances with respect to the patient anatomy to be imaged. The C-arm is mechanically balanced, allowing for ease of movement, and can be "locked" in place using a manually activated lock.
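The combined orbital and lateral rotations described above determine the direction of the X-ray beam through the patient. As a rough illustration only, the geometry can be sketched as composed rotation matrices; the axis conventions, function names, and starting beam direction below are assumptions for the sketch, not OEC One's actual control conventions:

```python
import numpy as np

def rotation_matrix(axis: str, deg: float) -> np.ndarray:
    """3x3 rotation about a principal axis ('x', 'y', or 'z')."""
    c, s = np.cos(np.radians(deg)), np.sin(np.radians(deg))
    mats = {
        "x": [[1, 0, 0], [0, c, -s], [0, s, c]],
        "y": [[c, 0, s], [0, 1, 0], [-s, 0, c]],
        "z": [[c, -s, 0], [s, c, 0], [0, 0, 1]],
    }
    return np.array(mats[axis])

def beam_direction(orbital_deg: float, lateral_deg: float) -> np.ndarray:
    """Direction of the X-ray beam after orbital and lateral C rotations.
    Axis assignments are illustrative, not the OEC One's convention."""
    beam0 = np.array([0.0, 0.0, 1.0])  # assume the beam starts pointing straight up
    r = rotation_matrix("y", lateral_deg) @ rotation_matrix("x", orbital_deg)
    return r @ beam0
```

With zero angles the beam is unchanged; a 90° rotation about either axis swings it into the corresponding perpendicular direction.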

    The subject device is labelled as OEC One.

    AI/ML Overview

    The provided text is a 510(k) Premarket Notification Submission for the OEC One with vascular option. This document primarily focuses on establishing substantial equivalence to a predicate device (OEC One, K172700) rather than presenting a detailed study with acceptance criteria for device performance in the context of an AI/algorithm-driven device.

    The "device" in this context is an X-ray imaging system (OEC One™ mobile C-arm system), and the changes described are hardware and software modifications to enhance vascular imaging features. It is not an AI or algorithm-only device with specific performance metrics like sensitivity, specificity, or AUC.

    Therefore, most of the requested information regarding acceptance criteria for AI performance, sample sizes for test/training sets, expert ground truth, adjudication methods, MRMC studies, or standalone algorithm performance is not applicable or cannot be extracted from this document.

    However, I can extract information related to the device's technical specifications and the testing performed to demonstrate its safety and effectiveness.

    Here is a summary of the information that can be extracted, addressing the closest relevant points:

    1. A table of acceptance criteria and the reported device performance

    The document does not provide a table of numerical acceptance criteria (e.g., sensitivity, specificity) for the device's imaging performance in relation to clinical outcomes. Instead, the acceptance criteria are generally implied by conformance to existing standards and successful completion of various engineering and verification tests. The "reported device performance" refers to the device meeting these design inputs and user needs.

    Acceptance criteria (implied) and reported device performance:

    • Compliance with medical electrical equipment standards: Certified compliant with IEC 60601-1 Ed. 3 series, including IEC 60601-2-54:2009 and IEC 60601-2-43:2010.
    • Compliance with radiation performance standards: All applicable 21 CFR Subchapter J performance standards were met.
    • Design inputs and user needs met: Verification and validation executed; results demonstrate the OEC One™ system met the design inputs and user needs.
    • Image quality and dose assessment for fluoroscopy: All image quality/performance testing identified for fluoroscopy in FDA's "Information for Industry: X-ray Imaging Devices - Laboratory Image Quality and Dose Assessment. Tests and Standards" was performed with acceptable results. This included testing using anthropomorphic phantoms.
    • Software documentation requirements for moderate level of concern: Substantial equivalence based on software documentation for a "Moderate" level of concern device.
    • Functional operation of new vascular features: The primary change was to implement vascular features (Subtraction, Roadmap, Peak Opacification, Cine Recording/Playback, Re-registration, Variable Landmarking, Mask Save/Recall, Reference Image Hold) to perform vascular procedures with the "easiest workflow and least intervention by the user" and to "further enhance the vascular workflows." Bench testing demonstrated user requirements were met.
    • Safety and effectiveness: The changes do not introduce any adverse effects nor raise new questions of safety and effectiveness.
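For context, the subtraction-based vascular features named above (Subtraction, Peak Opacification) are classical fluoroscopic image-processing operations. The following is a minimal sketch of the general technique only, not GE's implementation; the function names and sign convention are illustrative assumptions:

```python
import numpy as np

def subtract(mask: np.ndarray, contrast: np.ndarray) -> np.ndarray:
    """Digital subtraction: remove static anatomy (the mask frame) from a
    contrast-filled frame so only the opacified vessels remain."""
    return contrast.astype(np.int32) - mask.astype(np.int32)

def peak_opacification(mask: np.ndarray, frames: list) -> np.ndarray:
    """Per-pixel peak opacification across a contrast run. In X-ray,
    contrast agent darkens pixels, so the peak image keeps the most
    negative subtracted value seen at each pixel."""
    peak = np.zeros_like(mask, dtype=np.int32)
    for frame in frames:
        peak = np.minimum(peak, subtract(mask, frame))
    return peak
```

A roadmap overlay would similarly blend such a peak-opacified image under the live fluoroscopic stream; the details vary by vendor.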

    2. Sample size used for the test set and the data provenance

    • Test Set Sample Size: Not explicitly stated in terms of patient data. The testing involved "anthropomorphic phantoms" for image performance and various engineering/bench testing for functional validation. These are not "test sets" in the typical sense of a dataset for an AI algorithm.
    • Data Provenance: Not applicable as it's not patient data for AI evaluation. The testing was conducted internally at GE Hualun Medical Systems Co., Ltd.

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts

    • Number of Experts: Not applicable. Ground truth from experts is not mentioned for this type of device evaluation.
    • Qualifications of Experts: Not applicable.

    4. Adjudication method for the test set

    • Adjudication Method: Not applicable. There was no expert adjudication process described for the testing performed.

    5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI vs. without AI assistance

    • MRMC Study: No. This document describes a C-arm X-ray system, not an AI-assisted diagnostic tool that would typically undergo such a study.
    • Effect Size of Human Readers: Not applicable.

    6. If a standalone (i.e., algorithm-only, without human-in-the-loop) performance assessment was done

    • Standalone Performance: Not applicable. The device is an imaging system; its "performance" is inherently tied to image acquisition and display, which are used by a human operator/physician. The "vascular features" are software enhancements to the imaging workflow, not a standalone AI algorithm.

    7. The type of ground truth used (expert consensus, pathology, outcomes data, etc.)

    • Type of Ground Truth: For image quality, the ground truth was based on physical phantom characteristics and established technical standards (e.g., image resolution, contrast, noise, dose measurements). For functional aspects, it was based on meeting design inputs and user requirements validated through engineering tests. No expert consensus, pathology, or outcomes data were used as "ground truth" for this device's substantial equivalence declaration.
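A representative phantom-based image quality metric of the kind referenced above is the contrast-to-noise ratio (CNR) computed from regions of interest in a phantom image. The sketch below is a generic illustration assuming hypothetical ROI arrays; it is not a test method described in this submission:

```python
import numpy as np

def contrast_to_noise_ratio(signal_roi: np.ndarray, background_roi: np.ndarray) -> float:
    """CNR = |mean(signal) - mean(background)| / std(background).
    A typical bench metric for phantom-based image quality checks."""
    return abs(signal_roi.mean() - background_roi.mean()) / background_roi.std()
```

Resolution, noise, and dose measurements used in such bench testing are analogous scalar metrics compared against predefined engineering limits rather than against clinical ground truth.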

    8. The sample size for the training set

    • Training Set Sample Size: Not applicable. This document does not describe an AI model that requires a training set. The software updates are feature additions and modifications, not learned from a large dataset in the way a deep learning model would be.

    9. How the ground truth for the training set was established

    • Ground Truth Establishment: Not applicable, as there is no mention of an AI model with a training set.

    K Number
    K172700
    Device Name
    OEC One
    Date Cleared
    2017-11-09

    (63 days)

    Product Code
    Regulation Number
    892.1650
    Reference & Predicate Devices
    Predicate For
    Intended Use

    The OEC One™ mobile C-arm system is designed to provide fluoroscopic and digital spot/film images of adult and pediatric patient populations during diagnostic, interventional, and surgical procedures. Examples of a clinical application may include: orthopedic, gastrointestinal, endoscopic, urologic, neurologic, critical care, and emergency procedures.

    Device Description

    The OEC One™ is a mobile C-arm x-ray system to provide fluoroscopic images of the patient during diagnostic, interventional, and surgical procedures such as orthopedic, gastrointestinal, endoscopic, urologic, neurologic, critical care, and emergency procedures. These images help the physician visualize the patient's anatomy and localize clinical regions of interest. The system consists of a mobile stand with an articulating arm attached to it to support an image display monitor (widescreen monitor) and a TechView tablet, and a "C" shaped apparatus that has an image intensifier on the top of the C-arm and the X-ray Source assembly at the opposite end.

    AI/ML Overview

    Based on the provided text, the device in question is the OEC One™ mobile C-arm system, which is an image-intensified fluoroscopic X-ray system. The document is a 510(k) Premarket Notification Submission, indicating that the manufacturer is seeking to demonstrate substantial equivalence to a legally marketed predicate device rather than provide evidence of a novel device's safety and effectiveness.

    Therefore, the "acceptance criteria" and "study that proves the device meets the acceptance criteria" are framed within the context of demonstrating substantial equivalence to the predicate device (K123603 OEC Brivo), rather than proving the device's de novo performance against specific clinical metrics as one might expect for a new AI/CADx device.

    Here's an analysis of the provided information in response to your specific questions:

    1. A table of acceptance criteria and the reported device performance

    The document does not present a table of specific performance acceptance criteria (e.g., sensitivity, specificity, accuracy) for a diagnostic output, as this is an imaging device rather than a diagnostic AI/CADx algorithm. Instead, the acceptance criteria are linked to demonstrating that the modified device maintains the same safety and effectiveness as the predicate device, especially considering the changes made (integration of mainframe/workstation, new display, software updates).

    The "acceptance criteria" for this 510(k) appear to be:

    • Conformance to relevant safety and performance standards: IEC 60601-1 Ed. 3 series (including IEC60601-2-54 and IEC 60601-2-43), and all applicable 21 CFR Subchapter J performance standards.
    • Successful verification and validation: Demonstrating that the system met design input and user needs, including hazard mitigation.
    • Maintenance of comparable image quality: Assessed through engineering bench testing using anthropomorphic phantoms.
    • Compliance with software development requirements: For a "Moderate" level of concern device.
    Acceptance criteria categories and reported device performance/evidence:

    • Safety and Performance Standards: The system was tested by an NRTL and certified compliant with IEC 60601-1 Ed. 3 series, including IEC 60601-2-54 and IEC 60601-2-43. All applicable 21 CFR Subchapter J performance standards are met.
    • Verification and Validation: Verification and validation, including hazard mitigation, were executed with results demonstrating the OEC One™ system met design input and user needs. The system was developed under GE Healthcare's Quality Management System, including design controls, risk management, and software development life cycle processes. Quality assurance measures applied: Risk Analysis, Required Reviews, Design Reviews, Unit Testing (Sub-System verification), Integration testing (System verification), Performance testing (Verification), Safety testing (Verification), Simulated use testing (Validation).
    • Image Quality/Performance (Non-Clinical): Additional engineering bench testing on image performance using anthropomorphic phantoms was performed. All the image quality/performance testing identified for fluoroscopy in FDA's "Information for Industry: X-ray Imaging Devices - Laboratory Image Quality and Dose Assessment, Tests and Standards" was performed with acceptable results.
    • Software Compliance: Substantial equivalence was also based on software documentation for a "Moderate" level of concern device.
    • Clinical Equivalence (No Clinical Study): "Because OEC One's modification based on the predicate device does not change the system's intended use and represent equivalent technological characteristics, clinical studies are not required to support substantial equivalence." This indicates the acceptance criterion for clinical performance was met by demonstrating the modifications did not impact the clinical function or safety relative to the predicate.

    2. Sample size used for the test set and the data provenance (e.g. country of origin of the data, retrospective or prospective)

    • Sample Size for Test Set: Not applicable in the context of clinical images with human expert ground truth for an AI/CADx device. The testing described focuses on non-clinical engineering bench tests using anthropomorphic phantoms and system verification/validation against standards.
    • Data Provenance: The document states "Additional engineering bench testing on image performance using anthropomorphic phantoms was also performed." This implies a prospective generation of test data using physical phantoms, rather than retrospective or prospective clinical patient data. The country of origin for this testing is not explicitly stated beyond the manufacturer's location (China).

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts (e.g. radiologist with 10 years of experience)

    • Not applicable. The testing described is primarily engineering and performance verification using phantoms and standards, not clinical image interpretation requiring expert radiologists to establish ground truth for a diagnostic task.

    4. Adjudication method (e.g. 2+1, 3+1, none) for the test set

    • Not applicable, as there is no clinical image-based test set requiring human adjudication.

    5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI vs. without AI assistance

    • No MRMC study was done. The document explicitly states: "Clinical testing: Because OEC One’s modification based on the predicate device does not change the system’s intended use and represent equivalent technological characteristics, clinical studies are not required to support substantial equivalence." This is not a study of AI assistance.

    6. If a standalone (i.e., algorithm-only, without human-in-the-loop) performance assessment was done

    • Not applicable. This is not an AI/CADx algorithm. The device itself is an X-ray imaging system. The software updates mentioned ("Adaptive Dynamic Range Optimization (ARDO) and motion artifact reduction") relate to image processing within the device itself, not a separate standalone diagnostic algorithm.
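Dynamic range optimization of the general kind named in that quote compresses a wide detector signal range into a displayable range while preserving low-signal detail. The sketch below shows a generic log-compression approach; it is illustrative only and is not GE's ARDO or motion artifact reduction algorithm, and the bit depths are assumptions:

```python
import numpy as np

def compress_dynamic_range(img: np.ndarray, bits_in: int = 14, bits_out: int = 8) -> np.ndarray:
    """Generic log-style dynamic range compression: map a wide detector
    range (assumed 14-bit here) into an 8-bit display range, boosting
    low-signal detail relative to a linear mapping. Illustrative only."""
    max_in = 2 ** bits_in - 1
    max_out = 2 ** bits_out - 1
    out = np.log1p(img.astype(np.float64)) / np.log1p(max_in) * max_out
    return out.round().astype(np.uint8)
```

Such a mapping is monotonic, so relative brightness ordering is preserved while dark regions receive a larger share of the output range.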

    7. The type of ground truth used (expert consensus, pathology, outcomes data, etc.)

    • The "ground truth" for the device's performance is established through:
      • Engineering benchmarks and physical phantom measurements: For image quality assessment against established standards (e.g., FDA's "Information for Industry: X-ray Imaging Devices - Laboratory Image Quality and Dose Assessment, Tests and Standards").
      • Compliance with international safety and performance standards: IEC 60601 series, 21 CFR Subchapter J.
      • Conformance to design specifications and user needs: Through verification and validation activities.

    8. The sample size for the training set

    • Not applicable. This is not a machine learning or AI device that requires a training set in the conventional sense. The "software updates" mentioned are more likely based on engineering principles and signal processing than machine learning training.

    9. How the ground truth for the training set was established

    • Not applicable, as there is no explicit "training set" for an AI algorithm. Software development and calibration would typically rely on engineering specifications, physical models, and potentially empirical adjustments based on performance testing.
