
510(k) Data Aggregation

    K Number: K124051
    Device Name: THE VAULT SYSTEM
    Date Cleared: 2013-05-17 (137 days)
    Product Code: (not listed)
    Regulation Number: 892.2050
    Reference & Predicate Devices: Mimics Software (K073468); TraumaCAD Software (K073714)

    Intended Use

    The VAULT® System is intended for use as a software interface and image manipulation system for the transfer of imaging information from a medical scanner such as Computerized Axial Tomography (CT) or Magnetic Resonance Imaging (MRI). It is also intended as pre-operative software for simulating/evaluating implant placement and surgical treatment options. The physician chooses the output data file for printing and/or subsequent use in CAD modeling or CNC/rapid prototyping.
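
    The intended use mentions output files for printing, CAD modeling, or CNC/rapid prototyping. As a purely illustrative sketch (the submission does not describe the actual export code; the function name and the example triangle below are hypothetical), the snippet writes a minimal ASCII STL file, a format commonly used for rapid prototyping:

```python
# Illustrative only: a minimal ASCII STL writer, a file format commonly used
# for CAD/CNC and rapid prototyping. This is NOT the VAULT System's code;
# the function name and the example triangle are hypothetical.

def write_ascii_stl(path, triangles, solid_name="planning_output"):
    """Write triangles (each a tuple of three (x, y, z) vertices) to an ASCII STL file."""
    with open(path, "w") as f:
        f.write(f"solid {solid_name}\n")
        for v1, v2, v3 in triangles:
            # A facet normal of (0, 0, 0) is accepted by most STL readers,
            # which recompute normals from the vertex order.
            f.write("  facet normal 0 0 0\n")
            f.write("    outer loop\n")
            for x, y, z in (v1, v2, v3):
                f.write(f"      vertex {x:.6f} {y:.6f} {z:.6f}\n")
            f.write("    endloop\n")
            f.write("  endfacet\n")
        f.write(f"endsolid {solid_name}\n")

# Example: a single triangle in the x-y plane.
write_ascii_stl("plan.stl", [((0, 0, 0), (10, 0, 0), (0, 10, 0))])
```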

    Device Description

    The VAULT® System software described here was developed with reference to the FDA guidance document for industry "Guidance for the Submission of Premarket Notifications for Medical Image Management Devices" (July 27, 2000). Based on the information contained in Section G of that document, a final determination of submission content was developed. A secondary reference, "Guidance for the Content of Premarket Submissions for Software Contained in Medical Devices" (May 11, 2005), was also used and resulted in a determination of a "MODERATE" level of concern for the software.

    The VAULT® System software is made available to the user via a web-accessed software interface. The program is a surgeon-directed surgical planning package aimed primarily, but not exclusively, at trauma and orthopedic indications. After secure log-in, the user requests, creates, reviews, and finally authorizes their desired surgical plan. When authorizing, the surgeon/user may choose additional options such as implant sizing and/or various file output options.
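
    The description above outlines a request, create, review, authorize workflow. The sketch below is a minimal, hypothetical model of such a plan lifecycle; the state names, class, and transition rule are assumptions, not details from the submission:

```python
# Hypothetical sketch of the request -> create -> review -> authorize plan
# lifecycle described above. State names and transitions are assumptions,
# not taken from the 510(k) submission.
from enum import Enum, auto

class PlanState(Enum):
    REQUESTED = auto()
    CREATED = auto()
    REVIEWED = auto()
    AUTHORIZED = auto()

# Each state may only advance to the next one in the workflow.
_ALLOWED = {
    PlanState.REQUESTED: PlanState.CREATED,
    PlanState.CREATED: PlanState.REVIEWED,
    PlanState.REVIEWED: PlanState.AUTHORIZED,
}

class SurgicalPlan:
    def __init__(self):
        self.state = PlanState.REQUESTED
        self.options = {}  # e.g. implant sizing, file output choices

    def advance(self, **options):
        """Move the plan to the next workflow state, recording any options chosen."""
        nxt = _ALLOWED.get(self.state)
        if nxt is None:
            raise ValueError("Plan is already authorized")
        self.options.update(options)
        self.state = nxt

plan = SurgicalPlan()
plan.advance()                          # created
plan.advance()                          # reviewed
plan.advance(implant_size="medium",     # authorized, with options chosen
             output_format="STL")
print(plan.state, plan.options)
```

    A linear state map like this makes it straightforward to enforce that only an authorized plan proceeds to file output.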

    AI/ML Overview

    The VAULT® System Surgery Planning Software received 510(k) clearance (K124051) from the FDA. The submission focused on demonstrating substantial equivalence to predicate devices (Mimics Software, K073468, and TraumaCAD Software, K073714) rather than a direct study against predefined acceptance criteria for a novel device. Performance was evaluated through non-clinical testing.

    Here is a breakdown based on the provided document:

    1. Table of Acceptance Criteria and Reported Device Performance

    Since this is a 510(k) submission demonstrating substantial equivalence, explicit "acceptance criteria" in the sense of predefined thresholds for diagnostic performance metrics (such as sensitivity, specificity, or AUC) are not presented as they would be for a novel diagnostic AI device. Instead, the "acceptance criteria" are implied by the functional and safety requirements defined for the VAULT® System and by its performance being "equivalent" to the predicates.

    | Feature/Requirement | Acceptance Criteria (Implied) | Reported Device Performance |
    | --- | --- | --- |
    | Functional Equivalence | | |
    | Image Transfer | Transfer imaging information from CT/MRI scanners. | The VAULT® System is intended for use as a software interface and image manipulation system for the transfer of imaging information from a medical scanner such as Computerized Axial Tomography (CT) or Magnetic Resonance Imaging (MRI). |
    | Preoperative Planning | Simulate/evaluate implant placement and surgical treatment options. | It is also intended as pre-operative software for simulating/evaluating implant placement and surgical treatment options. |
    | Output File Generation | Physician chooses the output data file for printing, CAD modeling, or CNC/rapid prototyping. | The physician chooses the output data file for printing and/or subsequent use in CAD modeling or CNC/rapid prototyping. Additional options include implant sizing and/or various file output options. |
    | DICOM Image Use | Use DICOM images. | Yes, uses DICOM images (from the feature comparison table). Digital file image upload controlled by the DICOM process met specifications. The VAULT® System performs initial conversion of image files to graphical formats (JPEG, BMP, PNG, TIFF) before planning, an improvement over the predicates, which convert post-plan (a conversion sketch follows the table). |
    | Overlays & Templates | Support overlays and templates. | Yes, supports overlays and templates (from the feature comparison table). |
    | Accuracy & Integrity | | |
    | Anatomical Model Testing | Required level of accuracy and functionality for anatomical and phantom models. | Anatomical and phantom model digital file testing demonstrated the required level of accuracy and functionality. |
    | Image File Integrity | Image file integrity, accuracy, and suitability after conversion, save, and transfer operations. | Image file integrity, accuracy, and suitability following the required conversion, save, and transfer operations met all specifications. |
    | Image Calculations & Measurement | Calculations and measurement of anatomic features and landmarks meet specifications. | Image calculations and measurement of anatomic features and landmarks meet specifications. |
    | Safety | No control over life-saving devices; adherence to a safety risk/hazard analysis. | Does not control life-saving devices (from the feature comparison table). Safety requirements were developed using a safety risk/hazard analysis based on the ISO 14971:2007 approach. |
    | Software Validation | Traceability, boundary values, and stress testing per FDA guidance. | Functional requirements defined by the VAULT® System Software Requirements Specification (SRS) were tested, and traceability was documented using FDA's General Principles of Software Validation guidance. Validation included boundary values and stress testing as defined by FDA's Guidance for the Content of Premarket Submissions for Software Contained in Medical Devices. |
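
    The DICOM Image Use row notes that uploaded DICOM files are converted to common graphical formats (JPEG, BMP, PNG, TIFF) before planning. As a hedged illustration only, assuming the pydicom and Pillow libraries and hypothetical file names, a conversion of one DICOM slice to PNG might look like this; the device's actual pipeline is not described in the submission:

```python
# Illustrative only: converting a DICOM slice to a standard graphical format
# (PNG here) before planning. This is an assumed workflow using the pydicom
# and Pillow libraries, not the VAULT System's actual implementation.
import numpy as np
import pydicom
from PIL import Image

def dicom_to_png(dicom_path, png_path):
    ds = pydicom.dcmread(dicom_path)           # read the DICOM file
    pixels = ds.pixel_array.astype(np.float32)

    # Window the pixel values into 0-255 for an 8-bit grayscale image.
    lo, hi = pixels.min(), pixels.max()
    scaled = (pixels - lo) / max(hi - lo, 1e-6) * 255.0
    Image.fromarray(scaled.astype(np.uint8)).save(png_path)

dicom_to_png("slice_001.dcm", "slice_001.png")  # hypothetical file names
```

    In practice, windowing would typically use the DICOM Window Center/Width attributes rather than the raw min/max, but min/max scaling keeps the sketch self-contained.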

    2. Sample Size Used for the Test Set and Data Provenance

    The document does not specify a distinct "test set" with a particular sample size of patient data. The non-clinical performance data relied on:

    • "Anatomical and phantom model digital file testing": The exact number of models used is not specified.
    • The testing of various software functionalities (DICOM process, image file integrity, calculations, measurements); a minimal integrity-check sketch follows at the end of this section.

    The data provenance is not explicitly stated as country of origin or retrospective/prospective data for a clinical study. The testing appears to be primarily software functional and performance testing using internal data (anatomical and phantom models) rather than a large clinical dataset.
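
    One of the functional checks listed above is image file integrity after save and transfer operations. A minimal sketch of how such a check could be automated with checksums is shown below; the hashing approach, function names, and file paths are assumptions, not details from the submission:

```python
# Minimal sketch of an automated file-integrity check (checksum comparison
# before and after a save/transfer step). The hashing approach is an assumed
# illustration; the submission does not describe the actual test method.
import hashlib
import shutil

def sha256_of(path, chunk_size=1 << 20):
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def transfer_and_verify(src, dst):
    """Copy a file and confirm the bytes survived the transfer unchanged."""
    before = sha256_of(src)
    shutil.copyfile(src, dst)
    after = sha256_of(dst)
    assert before == after, f"Integrity check failed for {dst}"
    return after

transfer_and_verify("slice_001.png", "archive/slice_001.png")  # hypothetical paths
```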

    3. Number of Experts Used to Establish the Ground Truth for the Test Set and Qualifications

    The document does not describe the use of human experts to establish ground truth for a diagnostic test set in the conventional sense. The "ground truth" for the software's performance seems to be based on:

    • Specifications: Whether the software performed according to its defined functional and safety specifications ("met specifications").
    • Accuracy against known physical/digital models: For anatomical and phantom model testing, the "ground truth" would be the known parameters of these models.

    There are no details provided about experts involved in establishing this "ground truth" or their qualifications.

    4. Adjudication Method for the Test Set

    Not applicable. The document does not describe an adjudication method as would be used for a clinical study involving human readers or interpretation of results. The testing was focused on meeting software specifications.

    5. Whether a Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study Was Done, and If So, the Effect Size of Human Reader Improvement with vs. without AI Assistance

    No, an MRMC comparative effectiveness study involving human readers with and without AI assistance was not described or conducted. This submission focused on the functional equivalence of the software to existing predicate devices.

    6. Whether Standalone (i.e., Algorithm-Only, Without Human-in-the-Loop) Performance Testing Was Done

    Yes. The performance testing described is essentially "standalone" in that it evaluates the software's inherent functions (image processing, calculations, file handling) without measuring its impact on human reader performance or evaluating a human-in-the-loop scenario. The assessment is of the software itself fulfilling its defined requirements.

    7. The Type of Ground Truth Used

    The ground truth used for testing appears to be primarily:

    • Software Specifications: The software's ability to "meet specifications" for various functions (DICOM process, image integrity, calculations, measurements).
    • Reference Data/Models: For "anatomical and phantom model digital file testing," the ground truth would be the known, accurate parameters of these models against which the software's output was compared (a hypothetical tolerance check is sketched below).

    It does not mention ground truth derived from expert consensus, pathology, or outcomes data in a clinical trial setting.
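
    Where the ground truth is the known geometry of an anatomical or phantom model, verification typically reduces to comparing measured values against the known values within a tolerance. The sketch below is a hypothetical example of such a check; the landmark names, known distances, and tolerance are all assumptions:

```python
# Hypothetical accuracy check against a phantom model with known geometry.
# Landmark names, known distances, and the tolerance are assumptions made
# for illustration; none of these values appear in the submission.
import math

# Known (ground-truth) distances between landmark pairs on the phantom, in mm.
KNOWN_DISTANCES_MM = {
    ("landmark_A", "landmark_B"): 50.0,
    ("landmark_B", "landmark_C"): 32.5,
}

TOLERANCE_MM = 0.5  # assumed acceptance tolerance

def distance_mm(p1, p2):
    """Euclidean distance between two (x, y, z) points in millimetres."""
    return math.dist(p1, p2)

def check_measurements(measured_points):
    """Compare software-measured landmark distances with the known phantom values."""
    for (a, b), expected in KNOWN_DISTANCES_MM.items():
        measured = distance_mm(measured_points[a], measured_points[b])
        error = abs(measured - expected)
        status = "PASS" if error <= TOLERANCE_MM else "FAIL"
        print(f"{a}-{b}: measured {measured:.2f} mm, expected {expected:.2f} mm -> {status}")

# Example output of the planning software for the phantom (hypothetical values).
check_measurements({
    "landmark_A": (0.0, 0.0, 0.0),
    "landmark_B": (50.1, 0.0, 0.0),
    "landmark_C": (50.1, 32.2, 0.0),
})
```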

    8. The Sample Size for the Training Set

    The document does not describe a "training set" in the context of machine learning or AI algorithm development. The VAULT® System appears to be traditional, rule-based image-processing software rather than an AI/ML-driven device that requires training data. No training set size is mentioned.

    9. How the Ground Truth for the Training Set was Established

    Not applicable, as a training set for machine learning is not mentioned or implied by the device's description or the validation approach.
