510(k) Data Aggregation

    K Number: K130724
    Manufacturer:
    Date Cleared: 2013-06-28 (102 days)
    Product Code:
    Regulation Number: 892.2050
    Reference & Predicate Devices:
    Device Name: CODIAGNOSTIX IMPLANT PLANNING SOFTWARE

    Intended Use

    coDiagnostiX is an implant planning and surgery planning software tool intended for use by dental professionals who have appropriate knowledge in dental implantology and surgical dentistry. This software reads imaging information output from medical scanners such as CT or DVT scanners. It allows pre-operative simulation and evaluation of patient anatomy and dental implant placement.

    For automated manufacturing of drill guides in the dental laboratory environment, the coDiagnostiX software allows for export of data to 3D manufacturing systems. Alternatively, coDiagnostiX can provide printouts of template plans for the creation of surgical templates using a manually operated gonyX table.

    Device Description

    coDiagnostiX Implant Planning Software is designed for diagnosis of 3-dimensional datasets and precise, image-guided, reproducible preoperative planning of dental implants. The software is provided in three configurations: station, client, or server. Patient image data is received from various sources such as CT or DVT scanning. The scanned data are read with the coDiagnostiX DICOM transfer module using the DICOM format, converted into 3-dimensional datasets, and stored in a database. Surgical planning is performed through the calculation of special views, analysis of graphic data, and the placement of virtual dental implants. Additional functions are available to the user for refinement of the surgical planning, such as:

    • Active measurement tools (length and angle) for individual measurement of implant positions
    • Nerve module to assist in distinguishing the nervus mandibularis channel
    • 3D cut for a 3-dimensional cut through the jaw for fine adjustment of the implant position
    • Segmentation module for coloring several areas inside the slice dataset (e.g., jaw bone, natural tooth series, or types of tissue such as bone and skin) and creating a 3D reconstruction of the dataset
    • Parallelizing function for adjustment of adjacent images
    • Bone densitometry with a density statistic for measuring density in the area around the positioned implant; the density distribution along, and transverse to, the implant is displayed (see the sketch below)
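    As a rough illustration of the bone densitometry statistic in the last bullet, the sketch below averages Hounsfield values inside a cylinder around a planned implant axis. It assumes the CT data are already available as a numpy array in Hounsfield units with isotropic voxel spacing; the function name and parameters are hypothetical, and this is not the coDiagnostiX implementation.

```python
# Hypothetical sketch of a bone-density statistic around a planned implant axis.
# Assumes a CT volume in Hounsfield units as a numpy array with isotropic voxel
# spacing; this is not the coDiagnostiX bone densitometry implementation.
import numpy as np


def mean_density_around_implant(
    volume: np.ndarray,   # CT volume in Hounsfield units, shape (nx, ny, nz)
    apex_mm: np.ndarray,  # implant apex position in mm, shape (3,)
    tip_mm: np.ndarray,   # implant tip position in mm, shape (3,)
    radius_mm: float,     # sampling radius around the implant axis, in mm
    voxel_mm: float,      # isotropic voxel size, in mm
) -> float:
    """Average Hounsfield units inside a cylinder around the implant axis."""
    # Voxel-centre coordinates in mm for every voxel in the volume.
    grid = np.stack(
        np.meshgrid(*[np.arange(n) * voxel_mm for n in volume.shape], indexing="ij"),
        axis=-1,
    )
    axis = tip_mm - apex_mm
    length = np.linalg.norm(axis)
    direction = axis / length

    # Project each voxel onto the implant axis and measure its radial distance.
    rel = grid - apex_mm
    along = rel @ direction                                   # signed distance along the axis
    radial = np.linalg.norm(rel - along[..., None] * direction, axis=-1)

    # Keep voxels between apex and tip that lie within the sampling radius.
    inside = (along >= 0.0) & (along <= length) & (radial <= radius_mm)
    return float(volume[inside].mean())
```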

    All working steps are saved automatically to the patient file; one patient file may contain multiple surgical plan proposals, allowing the user to choose the ideal surgical plan. The guided surgical template plan, STL file, and/or guided surgical protocol are generated from the final surgical plan.
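    The DICOM import step described above (reading scanner output in DICOM format and converting it into a 3-dimensional dataset) can be illustrated with a minimal sketch. This is not the coDiagnostiX DICOM transfer module: it assumes the open-source pydicom and numpy packages, a directory of single-frame CT slices, and hypothetical names throughout.

```python
# Minimal sketch of a DICOM-to-volume conversion step.
# Assumes pydicom and numpy and a directory of single-frame CT slices;
# illustrative only, not the coDiagnostiX DICOM transfer module.
from pathlib import Path

import numpy as np
import pydicom


def load_ct_volume(dicom_dir: str) -> tuple[np.ndarray, tuple[float, float, float]]:
    """Read a directory of CT slices and stack them into a 3-D array.

    Returns the volume in Hounsfield units plus (row, column, slice) spacing in mm.
    """
    slices = [pydicom.dcmread(p) for p in Path(dicom_dir).glob("*.dcm")]
    # Sort slices along the scan axis using the z component of ImagePositionPatient.
    slices.sort(key=lambda ds: float(ds.ImagePositionPatient[2]))

    # Stack pixel arrays and convert stored values to Hounsfield units.
    volume = np.stack([ds.pixel_array.astype(np.float32) for ds in slices], axis=-1)
    slope = float(getattr(slices[0], "RescaleSlope", 1.0))
    intercept = float(getattr(slices[0], "RescaleIntercept", 0.0))
    volume = volume * slope + intercept

    # In-plane spacing from the header; slice spacing from two adjacent positions
    # (assumes at least two slices with uniform spacing).
    row_mm, col_mm = (float(v) for v in slices[0].PixelSpacing)
    slice_mm = abs(
        float(slices[1].ImagePositionPatient[2]) - float(slices[0].ImagePositionPatient[2])
    )
    return volume, (row_mm, col_mm, slice_mm)
```

    A stacked volume of this kind, together with its voxel spacing, is the sort of 3-dimensional dataset on which view calculation, virtual implant placement, and density measurements would then operate.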

    AI/ML Overview

    The provided text is a 510(k) Summary for the coDiagnostiX Implant Planning Software. It states that "Software verification and validation testing were performed to ensure that the device subject to this 510(k) Premarket Notification functions as intended and that design input matches design output" and that "The proposed software met the acceptance criteria."

    However, the document does not provide specific details regarding:

    • A table of acceptance criteria and the reported device performance. While it states the software met acceptance criteria, the criteria themselves and the performance metrics are not listed.
    • Sample size used for the test set and data provenance.
    • Number of experts used to establish ground truth for the test set and their qualifications.
    • Adjudication method for the test set.
    • Whether a multi-reader multi-case (MRMC) comparative effectiveness study was done, or its effect size.
    • Whether a standalone performance study (algorithm only) was done.
    • The type of ground truth used (e.g., expert consensus, pathology, outcomes data).
    • The sample size for the training set.
    • How the ground truth for the training set was established.

    The document primarily focuses on establishing substantial equivalence to a predicate device (K071636, IVS coDiagnostiX) based on intended use, technological characteristics, and a statement that verification and validation testing was performed. It references the "General Principles of Software Validation: Final Guidance for Industry and FDA Staff," issued on January 11, 2002, as the guiding document for testing.

    Conclusion based on the provided text:

    The document states that the software met the acceptance criteria, and verification and validation testing was performed. However, it does not provide the specific details of the acceptance criteria or the study results to prove the device met these criteria. Therefore, most of the requested information cannot be extracted from this 510(k) summary.

    Information that can be extracted (limited):

    • Acceptance Criteria Summary: The overall acceptance criterion was that "the device subject to this 510(k) Premarket Notification functions as intended and that design input matches design output."
    • Study Proving Acceptance: "Software verification and validation testing were performed."
    • Guidance followed: "Testing has been carried out in accordance with the FDA guidance document, 'General Principles of Software Validation: Final Guidance for Industry and FDA Staff,' issued on January 11, 2002."

    Table of Acceptance Criteria and Reported Device Performance (Cannot be generated from the provided text):

    Acceptance Criteria | Reported Device Performance
    Not specified       | Not specified

    Details of the Study (Mostly N/A from provided text):

    • Sample size used for the test set and the data provenance: Not specified.
    • Number of experts used to establish the ground truth for the test set and the qualifications of those experts: Not specified.
    • Adjudication method for the test set: Not specified.
    • If a multi-reader multi-case (MRMC) comparative effectiveness study was done: Not specified. The document does not describe any human reader studies.
    • If a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done: The document states "Software verification and validation testing were performed," which implies standalone testing of the software's functionality, but no specific performance metrics are provided.
    • The type of ground truth used: Not specified.
    • The sample size for the training set: Not applicable, as this is planning software, not a machine learning model that undergoes "training" in the typical AI sense. The document describes traditional software verification and validation, not AI model development.
    • How the ground truth for the training set was established: Not applicable for the same reason as above.