
510(k) Data Aggregation

    K Number: K153583
    Device Name: BioVision Plus
    Date Cleared: 2016-04-01 (108 days)
    Product Code:
    Regulation Number: 892.1680
    Reference & Predicate Devices: BioVision Digital Specimen Radiography System (K091558)
    Intended Use

    The BioVision (Plus/+) Digital Specimen Radiography (DSR) System is a cabinet digital X-ray imaging system intended to generate and control X-rays for examination of harvested specimens from various anatomical regions, and to provide rapid verification that the correct tissue has been excised.

    Performing the verification directly in the same biopsy procedure room enables cases to be completed faster, thus limiting the time the patient needs to be under examination. Specimen radiography can potentially limit the number of patient recalls. This device is intended to be operated wherever the medical professionals deem appropriate, including a surgical suite or a room adjacent to a surgical suite.

    Device Description

    The BioVision(Plus) Digital Specimen Radiography (DSR) System is a standalone cabinet digital X-ray imaging system designed to provide rapid verification that the correct tissue has been excised.

    Performing the verification directly in the same procedure room enables cases to be completed faster, thus limiting the time the patient needs to be under examination. Specimen radiography can potentially limit the number of patient recalls.

    The BioVision(Plus) Digital Specimen Radiography (DSR) System employs the use of Faxitron Bioptics Vision image acquisition software. The Vision software handles the digital X-ray image acquisition, calibration, image display, image analysis and manipulation, patient database, image archiving, and transmittal.
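    The summary lists the Vision software's responsibilities without any implementation detail. Purely as an illustration of that workflow (acquire, calibrate, display and analyze, record in the patient database, archive and transmit), a minimal sketch follows; every class and function name in it is hypothetical and is not part of Faxitron's Vision software.

```python
# Hypothetical sketch of the acquisition-to-archive workflow the summary lists.
# None of these names come from the Faxitron Vision software; they only
# illustrate the ordering of steps: acquire -> calibrate -> analyze/display ->
# record in the patient database -> archive (transmittal would follow).
from dataclasses import dataclass, field
from datetime import datetime
from typing import List


@dataclass
class SpecimenImage:
    patient_id: str
    acquired_at: datetime
    pixels: List[List[float]]      # placeholder for raw detector counts
    calibrated: bool = False


def acquire_image(patient_id: str) -> SpecimenImage:
    """Stand-in for digital X-ray image acquisition."""
    return SpecimenImage(patient_id, datetime.now(), [[0.0] * 4 for _ in range(4)])


def apply_flat_field(img: SpecimenImage, gain: float = 1.0, offset: float = 0.0) -> SpecimenImage:
    """Stand-in for detector calibration (gain/offset correction)."""
    img.pixels = [[p * gain - offset for p in row] for row in img.pixels]
    img.calibrated = True
    return img


@dataclass
class PatientDatabase:
    """Stand-in for the patient database and image archive."""
    records: List[SpecimenImage] = field(default_factory=list)

    def archive(self, img: SpecimenImage) -> None:
        if not img.calibrated:
            raise ValueError("refusing to archive an uncalibrated image")
        self.records.append(img)


if __name__ == "__main__":
    db = PatientDatabase()
    image = apply_flat_field(acquire_image("PT-0001"))
    db.archive(image)
    print(f"archived {len(db.records)} image(s) for {image.patient_id}")
```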

    AI/ML Overview

    The provided text describes the BioVision(Plus) Digital Specimen Radiography (DSR) System and its comparison to a predicate device, the BioVision Digital Specimen Radiography System (K091558). However, it does not describe a study that uses AI or machine learning algorithms, nor does it provide acceptance criteria and performance data in the context of an AI device.

    Instead, this document is a 510(k) summary for a medical device that uses X-ray technology. The "study" referenced is a series of non-clinical performance data tests to demonstrate substantial equivalence to a predicate device.

    Given that the request asks for details related to AI/ML device studies (e.g., sample size for training data, number of experts, adjudication methods for ground truth, MRMC studies), and the provided text describes an X-ray imaging device without any mention of AI/ML, I cannot fulfill the request as specified.

    However, I can extract the information provided regarding the device's technical specifications and the non-clinical testing performed to establish substantial equivalence.

    Here's an attempt to answer the questions based only on the information available in the provided text, recognizing that many requested fields regarding AI/ML studies are not applicable:


    Acceptance Criteria and Study for BioVision(Plus) Digital Specimen Radiography (DSR) System

    The provided document describes the BioVision(Plus) DSR System as an X-ray imaging device, not an AI/ML device. Therefore, many of the requested fields regarding AI/ML study design are not applicable or not present in the text. The "acceptance criteria" here refer to performance standards and substantial equivalence to a predicate device, rather than diagnostic performance metrics of an AI algorithm.

    1. A table of acceptance criteria and the reported device performance

    The document does not explicitly state "acceptance criteria" in the format of defined thresholds. Instead, it details that the device was tested to perform "as well as" the predicate device and to comply with specific regulations and standards.

    Acceptance Criteria (Implied) and Reported Device Performance:

    • Image quality (spatial resolution) as good as the predicate: verified with a line pair gauge and an American College of Radiology (ACR) phantom.
    • Image quality (contrast resolution) as good as the predicate: verified using a Small Field Low Contrast Phantom.
    • Radiation safety, compliance with 21 CFR 1020.40: conforms to 21 CFR 1020.40; radiation emission does not exceed 0.5 mR/hr.
    • Electrical safety, compliance with UL 61010-1, 3rd Edition: meets and exceeds the requirements of UL 61010-1, 3rd Edition.
    • Software performance (all functionality and hazard mitigation): verification testing during coding and alpha validation of all functionality.
    • Substantial equivalence to the predicate device: documented through performance testing and validation studies.
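
    To make the implied pass/fail logic concrete, a minimal check is sketched below. Only the 0.5 mR/hr emission limit (21 CFR 1020.40) and the "as good as the predicate" comparison come from the summary; the resolution figures used in the example are hypothetical placeholders, not values reported in the 510(k).

```python
# Minimal sketch of the implied acceptance logic. The 0.5 mR/hr limit is the
# cabinet X-ray emission limit cited in the summary (21 CFR 1020.40); the
# resolution numbers below are hypothetical, not from the 510(k).
def meets_implied_criteria(measured_lp_per_mm: float,
                           predicate_lp_per_mm: float,
                           emission_mr_per_hr: float,
                           emission_limit_mr_per_hr: float = 0.5) -> bool:
    resolution_ok = measured_lp_per_mm >= predicate_lp_per_mm      # "as good as" the predicate
    radiation_ok = emission_mr_per_hr <= emission_limit_mr_per_hr  # 21 CFR 1020.40 limit
    return resolution_ok and radiation_ok


# Example with made-up measurements:
print(meets_implied_criteria(measured_lp_per_mm=10.0,   # hypothetical line pair gauge reading
                             predicate_lp_per_mm=10.0,  # hypothetical predicate value
                             emission_mr_per_hr=0.1))   # emission below the 0.5 mR/hr limit
```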

    2. Sample size used for the test set and the data provenance

    The document describes non-clinical performance and validation testing, not a clinical study with a "test set" of patient data in the typical sense for AI/ML.

    • Sample size for test set: Not applicable for a non-clinical device performance test. The testing involved phantoms (a line pair gauge, an American College of Radiology (ACR) phantom, and a Small Field Low Contrast Phantom) and engineering validation.
    • Data provenance: Not applicable. The testing was non-clinical, involving device performance measurements rather than patient data.

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts

    Not applicable. Ground truth, in the context of an AI/ML diagnostic device, involves expert interpretation of patient data or pathology. This document describes performance testing of an X-ray generator and imaging system using phantoms and engineering methods.

    4. Adjudication method for the test set

    Not applicable. There was no clinical "test set" requiring expert adjudication.

    5. Whether a multi-reader, multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of the improvement in human reader performance with AI assistance versus without it

    Not applicable. This device is a standalone X-ray imaging system, not an AI-assisted diagnostic tool.

    6. Whether a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done

    The device itself is a "standalone" X-ray imaging system. The performance testing assessed the system's ability to generate and capture images of a specified quality, without any human-in-the-loop assessment of diagnostic accuracy. The system's "algorithm" here refers to its internal software for image acquisition, calibration, display, and manipulation, not an AI algorithm for diagnosis.

    7. The type of ground truth used

    For the non-clinical performance tests, the "ground truth" was established by:

    • Known physical properties of phantoms: For spatial resolution (line pair gauge) and contrast resolution (Small Field Low Contrast Phantom); see the background sketch after this list.
    • Engineering specifications and regulatory standards: For radiation safety (21 CFR 1020.40) and electrical safety (UL 61010-1).
    • Design specifications and verification/validation results: For software functionality and hazard mitigation.
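
    As general background for the phantom-based checks above (none of this appears in the 510(k) summary), the highest spatial resolution a line pair gauge reading can confirm for a digital detector is bounded by the Nyquist limit set by pixel pitch; the pitch used in this sketch is a hypothetical example value.

```python
# Background arithmetic only, not from the 510(k): a digital detector's
# Nyquist-limited spatial resolution is 1 / (2 * pixel pitch).
pixel_pitch_mm = 0.050                                   # hypothetical 50 micron pixel pitch
nyquist_lp_per_mm = 1.0 / (2.0 * pixel_pitch_mm)
print(f"Nyquist limit: {nyquist_lp_per_mm:.1f} lp/mm")   # prints 10.0 lp/mm
```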

    8. The sample size for the training set

    Not applicable. This is not an AI/ML device, so there is no "training set."

    9. How the ground truth for the training set was established

    Not applicable. There is no training set for an AI/ML algorithm.
