
510(k) Data Aggregation

    K Number: K241996
    Device Name: ULTRA 1040
    Manufacturer: ECORAY
    Date Cleared: 2025-04-18 (283 days)
    Product Code:
    Regulation Number: 892.1720
    Reference & Predicate Devices:
    Intended Use

    The ULTRA 1040 Portable X-ray Unit is a portable X-ray device, intended for use by a qualified/trained physician or technician for the purpose of acquiring X-ray images of the patient's extremities.

    This device is not intended for mammography.

    Device Description

    This Portable X-ray Unit (Model: ULTRA 1040) consists of the following major components: an X-ray main unit, an X-ray exposure hand switch, a battery charger, and other components. The X-ray main unit emits the X-rays required for X-ray exams, the hand switch controls X-ray output, and the battery charger charges the built-in battery in the X-ray main unit. The device can be used with an X-ray detector, a computer for receiving the detector signals, and image processing software. The major components of the X-ray main unit include: handle, enclosure, control panel, system control board, high-voltage tank, collimator (beam limiter), lithium-ion battery, and the system control software running on the system control board.

    The system control software provides real-time interaction with, and control of, the circuit modules inside the portable X-ray unit. The software responds to user operations on the control panel: the user can adjust the kV and mAs parameters, and the software displays the parameters or directly loads the APR parameters. The software loads the X-ray output control data into the high-voltage generation control circuit on the system control board, drives the high-voltage tank to generate the high voltage that excites the X-ray tube to emit X-rays, switches the collimator indicator, monitors the device's working status and battery level, and controls the display of the status indicators.
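
    To make the control flow described above concrete, here is a minimal Python sketch of how such exposure-control software could be organized. It is an illustration only: the class, the APR presets, and the kV/mAs limits are assumptions made for the example and are not taken from the 510(k) summary or ECORAY's actual software.

    ```python
    # Illustrative sketch only: hypothetical names and limits, not ECORAY's
    # actual control software. It mirrors the control flow described above:
    # the operator adjusts kV/mAs or loads an APR preset, the hand switch
    # triggers an exposure, and the controller checks device status first.

    from dataclasses import dataclass

    # Hypothetical anatomically programmed radiography (APR) presets.
    APR_PRESETS = {
        "hand": {"kv": 50, "mas": 2.0},
        "wrist": {"kv": 55, "mas": 2.5},
        "ankle": {"kv": 60, "mas": 4.0},
    }


    @dataclass
    class ExposureParameters:
        kv: int = 60      # tube voltage, kV
        mas: float = 2.0  # tube current-time product, mAs


    class PortableXrayController:
        """Minimal stand-in for the system control software described above."""

        KV_RANGE = (40, 100)      # assumed limits; not from the 510(k) summary
        MAS_RANGE = (0.1, 100.0)

        def __init__(self):
            self.params = ExposureParameters()
            self.battery_percent = 100

        def adjust(self, kv=None, mas=None):
            """Apply operator adjustments from the control panel, clamped to range."""
            if kv is not None:
                self.params.kv = max(self.KV_RANGE[0], min(self.KV_RANGE[1], kv))
            if mas is not None:
                self.params.mas = max(self.MAS_RANGE[0], min(self.MAS_RANGE[1], mas))
            return self.params

        def load_apr(self, anatomy: str):
            """Load a stored APR preset instead of manual adjustment."""
            preset = APR_PRESETS[anatomy]
            return self.adjust(kv=preset["kv"], mas=preset["mas"])

        def hand_switch_pressed(self):
            """Gate the exposure on device status, then hand off to the HV circuit."""
            if self.battery_percent < 10:
                raise RuntimeError("battery too low for exposure")
            # In the real device this step writes the control data to the
            # high-voltage generation circuit; here it is just a print.
            print(f"Exposing at {self.params.kv} kV, {self.params.mas} mAs")


    if __name__ == "__main__":
        ctrl = PortableXrayController()
        ctrl.load_apr("wrist")
        ctrl.hand_switch_pressed()
    ```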

    AI/ML Overview

    The provided FDA 510(k) clearance letter for the ECORAY Ultra 1040 is a regulatory document and does not contain detailed clinical study results, particularly regarding acceptance criteria for AI-related performance, MRMC studies, or the specifics of how training and test set ground truth was established. The document focuses on demonstrating substantial equivalence to a predicate device through non-clinical testing (electrical safety, EMC, software, bench performance) and a general statement about a "task-based image quality study" for clinical adequacy.

    Therefore, I cannot provide a table of acceptance criteria and reported device performance related to AI, nor can I fully answer questions about AI-specific study design, ground truth, or MRMC studies, as this information is not present in the provided text.

    The document does mention:

    • A "comprehensive, task-based image quality study" to assess clinical adequacy (Section 8).
    • Radiologic technologists acquired images, and radiologists clinically evaluated image quality (Section 8). This implies human evaluation, but not necessarily a comparative effectiveness study with AI.
    • Software testing in accordance with IEC 62304:2006/A1:2015 (Section 7.3). This standard governs software life cycle processes for medical devices, but doesn't specify AI performance metrics.

    Based only on the provided text, here's what can be extracted and what cannot:


    1. A table of acceptance criteria and the reported device performance

    Cannot be provided for AI-related performance. The document states:

    • "The Ultra 1040 Portable X-ray Unit met bench testing acceptance criteria as defined in the test protocols." (Section 7.2)
    • "All test results were satisfying with the standards." (Section 7.1 regarding electrical, mechanical, environmental safety, and EMC).
    • For the clinical study: "radiologist clinically evaluated the image quality" to assess "clinical adequacy of the device's imaging performance." (Section 8)

    However, the specific quantitative acceptance criteria and their corresponding reported values for image quality performance or any AI-assisted diagnostic criteria are not detailed in this document. The document primarily focuses on regulatory compliance and substantial equivalence to a predicate, not detailed clinical performance metrics for an AI component.


    2. Sample size used for the test set and the data provenance

    • Test Set Size: Not specified for any clinical study.
    • Data Provenance: Not specified (e.g., country of origin). The study "collected radiographic images for relevant anatomical indications stated in the Indications for Use." (Section 8). There is no mention of whether the data were collected retrospectively or prospectively.

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts

    • Number of Experts: Not specified.
    • Qualifications of Experts: "radiologist" (Section 8). Specific experience (e.g., "10 years of experience") is not mentioned.

    4. Adjudication method (e.g. 2+1, 3+1, none) for the test set

    • Adjudication Method: Not specified. It only states "radiologist clinically evaluated the image quality."

    5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI assistance versus without it

    • MRMC Study: The document does not describe an MRMC comparative effectiveness study involving AI assistance. The clinical study mentioned is for "assessing the clinical adequacy of the device's imaging performance" (Section 8), implying the performance of the X-ray unit itself, not an AI component integrated into a diagnostic workflow with human readers.
    • Effect Size of Human Improvement with AI: Not applicable, as no MRMC study with AI assistance is described.

    6. If a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done

    • Standalone Performance: The document describes the "ULTRA 1040 Portable X-ray Unit" and its "system control software" which manages device parameters and operations (Section 4). It is a mobile X-ray system, not an AI algorithm for image analysis. The "bench test for the Ultra 1040 Portable X-ray Unit assessed radiation performance, collimator accuracy, battery performance, imaging quality, and safety" (Section 7.2). This is testing of the X-ray hardware and its basic software functions, not an AI algorithm assessing images.

    It seems this device is an X-ray imaging machine, and the "software" mentioned (Section 4) refers to the control software for the X-ray unit itself, not an AI for image interpretation or diagnosis. Therefore, a standalone performance study for an AI algorithm is not relevant based on the information provided.


    7. The type of ground truth used (expert consensus, pathology, outcomes data, etc)

    • Type of Ground Truth: For the "task-based image quality study," the ground truth for image quality appears to be established by clinical evaluation by "radiologist" (Section 8). This is a form of expert consensus or expert reading, but specifically for image quality, not for diagnostic findings like disease presence/absence.

    8. The sample size for the training set

    • Training Set Size: Not applicable. This document does not describe the development or training of an AI algorithm for image analysis. The "software" mentioned is operational control software for the X-ray machine.

    9. How the ground truth for the training set was established

    • Ground Truth Establishment for Training Set: Not applicable, as no AI algorithm training is described.