510(k) Data Aggregation

    K Number: K071894
    Date Cleared: 2007-08-16 (38 days)
    Regulation Number: 892.2050

    Intended Use

    Xebra DICOM Image Browser™ is a software application that is used for viewing medical images. The Xebra Image Viewer receives digital images and data from various sources (including but not limited to CT, MR, US, RF units, computed and direct radiographic devices, and secondary capture devices such as scanners, imaging gateways, or imaging sources). Images are stored, communicated, processed, and displayed on the local disk of a workstation and/or across computer networks at distributed locations. Tasks that users may perform when viewing images include, but are not limited to: adjustment of window width and level; image stacking; annotation and measurement of regions of interest; and inversion, rotation, and flips of images. In addition, the Xebra Image Viewer can be integrated with an institution's existing HIS, RIS, EMR, or EHR for a fully integrated electronic patient record. Typical users of the Xebra Image Viewer are trained medical professionals, including but not limited to radiologists, clinicians, technologists, and others.

    Lossy compressed mammographic images and digitized film-screen images must not be reviewed for primary image interpretation. Mammographic images may only be interpreted using an FDA-approved monitor that offers at least 5-megapixel resolution and meets other technical specifications reviewed and accepted by the FDA.
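
    The window width/level adjustment described above is a standard linear grayscale mapping applied before display. The sketch below is a minimal illustration of that mapping (plus the inversion, flip, and rotation operations) in Python using pydicom and NumPy; the file name is hypothetical and this is not the vendor's implementation.

        import numpy as np
        import pydicom

        def apply_window(pixels, center, width):
            """Map raw pixel values to 8-bit display values with a linear
            window/level transform; values outside the window are clipped."""
            low = center - width / 2.0
            high = center + width / 2.0
            windowed = np.clip(pixels.astype(np.float32), low, high)
            return ((windowed - low) / (high - low) * 255.0).astype(np.uint8)

        # Hypothetical DICOM file; any CT/MR slice with standard pixel data works.
        ds = pydicom.dcmread("example_ct_slice.dcm")
        display = apply_window(ds.pixel_array, center=40.0, width=400.0)

        # Orientation and contrast operations comparable to the viewer's tools.
        flipped = np.fliplr(display)        # horizontal flip
        rotated = np.rot90(display, k=-1)   # rotate 90 degrees clockwise
        inverted = 255 - display            # grayscale inversion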

    Device Description

    Xebra DICOM Image Browser™ is one of the components of a Picture Archiving and Communications System (PACS). Xebra DICOM Image Browser™ is a software application that provides image viewing and manipulation in a web browser. The functions of this application are applied to medical images that are acquired and stored on an image server in DICOM and/or other proprietary formats. The device does not contact the patient, nor does it control any life-sustaining devices.
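
    As a web-based viewer, a component of this kind typically retrieves DICOM objects from the image server over HTTP rather than reading them from local storage. The sketch below shows that general pattern with a WADO-URI style request; the server URL and UIDs are hypothetical placeholders, not the actual Xebra interface.

        import io

        import pydicom
        import requests

        # Hypothetical WADO-URI endpoint and identifiers; real values would come
        # from the institution's PACS and the selected study/series/object.
        WADO_URL = "https://pacs.example.org/wado"
        params = {
            "requestType": "WADO",
            "studyUID": "1.2.840.113619.2.55.3.1111",
            "seriesUID": "1.2.840.113619.2.55.3.1111.1",
            "objectUID": "1.2.840.113619.2.55.3.1111.1.1",
            "contentType": "application/dicom",
        }

        resp = requests.get(WADO_URL, params=params, timeout=30)
        resp.raise_for_status()

        # Parse the returned DICOM object from memory and hand its pixel data
        # to the display pipeline (e.g., the window/level sketch above).
        ds = pydicom.dcmread(io.BytesIO(resp.content))
        print(ds.Modality, ds.pixel_array.shape)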

    AI/ML Overview

    This submission (K071894) is for the Xebra DICOM Image Browser, a Picture Archiving and Communications System (PACS) software application. The document does not contain a specific study proving the device meets detailed acceptance criteria in the manner typically expected for diagnostic or AI-powered devices.

    Instead, this 510(k) summary focuses on demonstrating substantial equivalence to predicate devices (UniPACS Workstation K023476 and eFilm Workstation K012211) for the purpose of image viewing and manipulation. The "acceptance criteria" here concern the software's functionality and safety as a medical device rather than a performance benchmark against a specific medical condition.

    Here's an analysis based on the provided text:

    1. Table of Acceptance Criteria and Reported Device Performance:

    The document does not provide a table of quantitative acceptance criteria for image viewing performance (e.g., accuracy, sensitivity, specificity, or image quality metrics) or specific reported device performance values. The device is a PACS image browser, and its "performance" is primarily assessed by its ability to perform standard image viewing, manipulation, and integration functions.

    The acceptance criteria are implicitly met by:

    • Conforming to the functional capabilities of predicate devices: The device is stated to be "substantially equivalent" to UniPACS Workstation and eFilm Workstation, implying it performs the same functions.
    • Meeting software development and testing standards: "Xebra DICOM Image Browser ™ has been tested according to the specifications that are documented in a Software Test Plan. Testing is an integral part of Hx Technologies' software development process as described in the company's Product Development Process."
    • Adherence to voluntary standards: The device "has been and will continue to be manufactured according to the voluntary standards list in the Voluntary Standards section of the submission." (The specific standards are not listed in this excerpt).
    • Hazard Analysis: A hazard analysis classified the "Level of Concern for potential hazards" as "Minor."

    Reported Device Performance: The document does not report specific quantitative metrics of the device's performance (e.g., image clarity, speed of rendering, accuracy of measurements). Its performance is described in terms of its features and capabilities:

    • Viewing medical images from various sources (CT, MR, US, RF units, etc.).
    • Storing, communicating, processing, and displaying images.
    • User tasks: adjustment of window/level, image stacking, annotation, measurement (illustrated in the sketch after this list), inversion, rotation, and flips.
    • Integration with HIS, RIS, EMR, EHR.
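
    The measurement tooling summarized above relies on the image's spatial calibration. As a minimal illustration (assuming the DICOM PixelSpacing attribute is present; the file name and coordinates are hypothetical), a two-point distance in pixels converts to millimetres like this:

        import math

        import pydicom

        # PixelSpacing gives the physical size of one pixel as
        # [row spacing, column spacing] in millimetres.
        ds = pydicom.dcmread("example_ct_slice.dcm")
        row_mm, col_mm = (float(v) for v in ds.PixelSpacing)

        def distance_mm(p1, p2):
            """Physical distance between two (row, col) pixel coordinates."""
            d_row = (p2[0] - p1[0]) * row_mm
            d_col = (p2[1] - p1[1]) * col_mm
            return math.hypot(d_row, d_col)

        print(f"{distance_mm((100, 120), (160, 200)):.1f} mm")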

    2. Sample Size Used for the Test Set and Data Provenance:

    The document does not specify a sample size for any test set or provide details on data provenance (e.g., country of origin, retrospective/prospective). The testing mentioned is related to software verification and validation, not clinical performance evaluation with a specific patient dataset.

    3. Number of Experts Used to Establish Ground Truth and Qualifications:

    This information is not provided. As the device is a PACS viewer and not an AI/CAD system, the concept of "ground truth" for a test set in the diagnostic performance sense is not directly applicable to the information presented. The functionality testing would involve internal quality assurance and verification against software specifications.

    4. Adjudication Method for the Test Set:

    This information is not provided.

    5. Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study:

    An MRMC comparative effectiveness study was not mentioned or performed for this device. The submission does not describe any human-reader performance studies, with or without AI assistance, or any related effect sizes.

    6. Standalone (Algorithm Only) Performance Study:

    A standalone performance study, as typically understood for diagnostic algorithms, was not mentioned or performed. The device itself is a standalone software application, but its "performance" concerns its capabilities as a viewer, not those of a diagnostic algorithm.

    7. Type of Ground Truth Used:

    Ground truth, in the context of clinical accuracy (e.g., for disease detection), is not relevant to the information provided in this 510(k) summary. The "ground truth" for this device would be its adherence to DICOM standards, correct rendering of images, and accurate execution of viewing and manipulation functions. This would typically be verified through software testing against design specifications, not clinical pathology or outcomes data.
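
    In practice, that kind of verification is usually automated as tests against the design specification. The sketch below is a minimal, pytest-style example for hypothetical flip and inversion helpers; it illustrates the idea only and is not taken from the vendor's Software Test Plan.

        import numpy as np

        def invert_grayscale(image):
            """Hypothetical viewer helper: invert an 8-bit grayscale image."""
            return 255 - image

        def flip_horizontal(image):
            """Hypothetical viewer helper: mirror the image left-to-right."""
            return np.fliplr(image)

        def test_inversion_is_its_own_inverse():
            image = np.arange(16, dtype=np.uint8).reshape(4, 4)
            assert np.array_equal(invert_grayscale(invert_grayscale(image)), image)

        def test_flip_reorders_but_preserves_pixel_values():
            image = np.arange(16, dtype=np.uint8).reshape(4, 4)
            flipped = flip_horizontal(image)
            # Flipping must be reversible and must not alter any pixel value.
            assert np.array_equal(flipped[:, ::-1], image)
            assert np.array_equal(np.sort(flipped, axis=None), np.sort(image, axis=None))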

    8. Sample Size for the Training Set:

    This information is not applicable and not provided. The Xebra DICOM Image Browser is a conventional image viewer software, not an AI or machine learning algorithm that requires a training set.

    9. How the Ground Truth for the Training Set Was Established:

    This information is not applicable and not provided, as there is no training set for this type of device.
