510(k) Data Aggregation

Found 3 results

    K Number: K172007
    Manufacturer: Duerr Dental AG
    Date Cleared: 2017-11-22 (142 days)
    Regulation Number: 872.1745
    Intended Use

    The CamX Triton HD Proxi Head is a diagnostic aid for the detection of interproximal caries lesions above the gingiva and for monitoring the progress of such lesions.

    Device Description

    The CamX Triton HD "Proxi" aids in the detection and diagnosis of proximal caries. It consists of a toothbrush-sized handpiece and a "Proxi" head. A USB cable connects the handpiece to a personal computer running PACS software such as DBSWIN, enabling communication between the PC and the handpiece. After a camera cover (a single-patient disposable sheath, K132953) is placed over the distal end and an autoclavable spacer is installed, the handpiece is positioned over the teeth to be examined. The camera works by transilluminating sound tooth enamel with infrared light; areas that scatter and reflect the light (e.g., caries lesions) appear as clearly delimited bright regions. A digital camera converts the scene into an electrical signal, which is sent over USB to the computer, converted into an image by the imaging software, and presented on a monitor in monochrome to illustrate suspected areas of decay.
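
    The imaging chain described above (infrared transillumination, digital frame capture, monochrome rendering that highlights bright regions) can be sketched in a few lines. This is a minimal illustration of the rendering idea only, not Duerr Dental's algorithm; the threshold value and function name are hypothetical.

```python
# Hypothetical sketch: a captured transillumination frame arrives as 8-bit
# grayscale intensities, and pixels that scatter/reflect the IR light
# (candidate caries lesions) show up unusually bright. The threshold is
# illustrative only.

def flag_bright_regions(frame, threshold=200):
    """Return a monochrome mask marking pixels at or above `threshold`.

    frame: 2-D list of grayscale intensities (0-255).
    Bright pixels (possible lesions) map to 255, the rest to 0.
    """
    return [[255 if px >= threshold else 0 for px in row] for row in frame]

# Tiny 3x3 frame: sound enamel transmits the light (dark pixels), while a
# lesion scatters it (bright cluster).
frame = [
    [30, 40, 35],
    [45, 230, 220],
    [38, 215, 42],
]
mask = flag_bright_regions(frame)  # -> [[0,0,0],[0,255,255],[0,255,0]]
```

The clinician, not the software, interprets the resulting monochrome image, which matches the "diagnostic aid" framing in the intended use.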

    AI/ML Overview

    This document is a 510(k) Pre-Market Notification, which focuses on demonstrating substantial equivalence to a predicate device rather than fulfilling specific acceptance criteria through a clinical study. Therefore, comprehensive information regarding acceptance criteria and a detailed study proving the device meets them, as typically found in a clinical trial report, is not present.

    However, based on the provided text, here's what can be extracted and inferred:

    1. A table of acceptance criteria and the reported device performance

    The document does not explicitly state acceptance criteria in a quantitative table format with corresponding performance results. Instead, it relies on a comparison to the predicate device and validation of key performance attributes.

    Acceptance Criteria (inferred from non-clinical testing) and Reported Device Performance:

    LED illumination and output (similar to predicate): Validation and verification test results showed that the new device and the predicate device are equivalent.
    Image quality (similar to predicate): Illumination and image quality for potential caries detection are similar for Duerr Dental AG's CamX Triton HD Proxi and KaVo's DIAGNOcam camera.
    Compliance with standards: The CamX Triton HD Proxi complies with IEC 60601-1:2005 + CORR. 1 (2006) + CORR. 2 (edition 3), IEC 60601-1-2:2014 (edition 4), and IEC 80601-2-60:2012 (first edition).
    Sterilization validation: Sterilization validation testing was successfully performed according to ANSI/AAMI/ISO 17665-1, Annex D, and the validation approach outlined in ANSI/AAMI/ISO 14937, Annex D (Approach 3).
    Biocompatibility (patient-contacting components): The patient-contacting Distance Spacer component was tested and complies with ISO 10993-10:2002 and Amendment 1, and with ISO 10993-5. (Other components are isolated from patient contact.)
    Functional principle (transillumination): The main function is based on transillumination, the same as the predicate devices.

    2. Sample size used for the test set and the data provenance (e.g., country of origin of the data, retrospective or prospective)

    The document explicitly states: "11. Clinical Data: Not required for a finding of substantial equivalence." This means a clinical test set of patient data, as would be used in a typical clinical study, was not required or performed for this submission. The performance assessment was based on non-clinical data.

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts (e.g., radiologist with 10 years of experience)

    Given that clinical data was not required, there is no mention of experts establishing ground truth for a clinical test set. The "ground truth" concept in a clinical context (e.g., confirmed caries presence) is not applicable here as no clinical study was performed.

    4. Adjudication method (e.g., 2+1, 3+1, none) for the test set

    Not applicable, as no clinical test set requiring adjudication was used.

    5. Whether a multi-reader multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI vs. without AI assistance

    Not applicable. This is a device for detecting caries, not an AI-assisted diagnostic tool for human readers in the context of an MRMC study.

    6. Whether a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done

    The device is described as "a diagnostic aid," implying human-in-the-loop use. The primary mode of operation is a digital camera converting the scene into an electrical signal, which is sent to a computer, converted into an image, and "presented on a monitor in monochrome colors to illustrate suspected areas of decay." This suggests the image is then interpreted by a dentist. Standalone algorithm performance was not reported.

    7. The type of ground truth used (expert consensus, pathology, outcomes data, etc)

    For the non-clinical testing of "illumination and image quality," the ground truth implicitly would be objective measurements and comparisons against the predicate device's characteristics and output. For biocompatibility and sterilization, the ground truth is adherence to established international standards and successful completion of specified tests.

    8. The sample size for the training set

    Not applicable. This device does not appear to be an AI/machine learning product that would require a "training set" in the conventional sense. Its function is based on transillumination and image capture.

    9. How the ground truth for the training set was established

    Not applicable, as there is no mention of a training set for an AI/machine learning model.


    K Number: K170733
    Manufacturer: Duerr Dental AG
    Date Cleared: 2017-05-01 (52 days)
    Regulation Number: 872.1800
    Intended Use

    The ScanX Intraoral View is intended to be used for scanning and processing digital images exposed on Phosphor Storage Plates (PSPs) in dental applications.

    Device Description

    The ScanX Intraoral View is a dental device that scans photostimulable phosphor storage plates that have been exposed in place of dental X-ray film, allowing the resulting images to be displayed on a personal computer monitor and stored for later retrieval. It will be used by licensed clinicians and authorized technicians for this purpose.

    AI/ML Overview

    The provided document is a 510(k) premarket notification for a dental imaging device called ScanX Intraoral View. It primarily focuses on demonstrating substantial equivalence to a predicate device rather than detailing specific acceptance criteria and a study proving the device meets those criteria from an AI/algorithm performance perspective.

    Therefore, the document does not contain the information requested regarding acceptance criteria for an AI/algorithm, a study proving device performance against such criteria, sample sizes for test/training sets, expert involvement, adjudication methods, MRMC studies, standalone performance, or ground truth establishment in the context of an AI/algorithm.

    The document describes the device as a scanner that processes digital images from Phosphor Storage Plates (PSPs) and displays them on a computer monitor. It is not an AI/ML-powered device, but rather a digital radiography system. The performance testing summarized focuses on:

    • Safety testing: Electrical safety (IEC 61010-1), Electromagnetic Compatibility (EMC) (EN 61326-1), and Laser safety (IEC 60825-1).
    • Imaging performance: Theoretical resolutions, MTF (Modulation Transfer Function), and DQE (Detective Quantum Efficiency) were measured with reference to IEC 62220-1:2003, as recommended by the FDA guidance for Solid State X-ray Imaging Devices.
    • Software validation and risk analysis were performed.
    • Cybersecurity was addressed per FDA guidance.
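
    The MTF measurement cited above is conventionally derived from an edge image: differentiate the edge spread function (ESF) to obtain the line spread function (LSF), then take the magnitude of its Fourier transform normalized to 1 at zero frequency. The sketch below follows that standard recipe in pure Python; the sample ESF values are illustrative and do not come from the submission.

```python
import cmath

def mtf_from_esf(esf):
    """Modulation transfer function from an edge spread function.

    Standard recipe (as used in edge-based MTF measurement):
    LSF = derivative of the ESF; MTF = |DFT(LSF)| normalized at DC.
    `esf` is a 1-D list of pixel intensities sampled across an edge.
    """
    # Line spread function: finite-difference derivative of the ESF.
    lsf = [esf[i + 1] - esf[i] for i in range(len(esf) - 1)]
    n = len(lsf)
    # Discrete Fourier transform magnitude at each spatial-frequency bin.
    mag = []
    for k in range(n):
        s = sum(lsf[j] * cmath.exp(-2j * cmath.pi * k * j / n)
                for j in range(n))
        mag.append(abs(s))
    dc = mag[0]
    return [m / dc for m in mag]

# Illustrative ideal step edge: its LSF is a single impulse, so the MTF is
# flat at 1.0 across all frequencies (a perfect, lossless system).
mtf = mtf_from_esf([0, 0, 0, 1, 1, 1])
```

A real detector's ESF is blurred, which makes the LSF wider and the MTF roll off at high frequencies; the IEC 62220-1 procedure adds oversampling and noise handling on top of this core idea.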

    The document explicitly states: "Summary of clinical performance testing: Not required to establish substantial equivalence." This further confirms that no clinical study (which would typically assess performance against clinical acceptance criteria) was conducted or presented in this 510(k) submission.

    In summary, the provided text does not describe an AI/ML-based device or a study proving its performance against acceptance criteria in the way your detailed questions imply for an AI/ML product.


    K Number: K161444
    Manufacturer: Duerr Dental AG
    Date Cleared: 2016-06-21 (27 days)
    Regulation Number: 892.2050
    Intended Use

    DBSWIN and VistaEasy imaging software are intended for use by qualified dental professionals for Windows-based diagnostics. The software is a diagnostic aide for licensed radiologists, dentists and clinicians, who perform the actual diagnosis based on their training, qualification, and clinical experience. DBSWIN and VistaEasy are clinical software applications that receive images and data from various imaging sources (i.e., radiography devices and digital video capture devices) that are manufactured and distributed by Duerr Dental and Air Techniques. It is intended to acquire, display, edit (i.e., resize, adjust contrast, etc.) and distribute images using standard PC hardware. In addition, DBSWIN enables the acquisition of still images from 3rd party TWAIN compliant imaging devices (e.g., generic image devices such as scanners) and the storage and printing of clinical exam data, while VistaEasy distributes to 3rd party TWAIN compliant PACS systems for storage and printing.

    DBSWIN and VistaEasy software are not intended for mammography use.

    Device Description

    DBSWIN and VistaEasy imaging software is an image management system that allows dentists to acquire, display, edit, view, store, print, and distribute medical images. DBSWIN and VistaEasy software runs on user provided PC-compatible computers and utilize previously cleared digital image capture devices for image acquisition.

    VistaEasy is included as part of DBSWIN. It provides additional interfaces for Third Party Software. VistaEasy can also be used by itself, as a defeatured version of DBSWIN.
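
    The editing functions listed in the intended use (resize, adjust contrast, etc.) are standard image operations. As an illustration of the contrast step only, here is a linear window/level mapping over 8-bit grayscale pixels; the function and its parameters are a generic sketch, not Duerr Dental's implementation.

```python
def adjust_contrast(pixels, window, level):
    """Linear window/level contrast mapping for 8-bit grayscale pixels.

    Intensities inside [level - window/2, level + window/2] are stretched
    linearly across 0-255; values outside the window clip to 0 or 255.
    """
    lo = level - window / 2
    out = []
    for p in pixels:
        v = round((p - lo) / window * 255)
        out.append(int(min(255, max(0, v))))  # clamp to the 8-bit range
    return out

# Narrowing the window around level 128 boosts mid-range contrast:
# mid-gray values spread apart while the extremes saturate.
adjusted = adjust_contrast([0, 100, 128, 156, 255], window=128, level=128)
```

The same window/level idea underlies grayscale display in DICOM viewers generally, which is consistent with the document's note that DBSWIN is DICOM compliant.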

    AI/ML Overview

    The provided text describes a 510(k) premarket notification for "DBSWIN and VistaEasy Imaging Software." This submission is a "Special 510(k) Summary" for minor modifications to an already cleared predicate device (K143290). Therefore, the document focuses on demonstrating that the new version is substantially equivalent to the previous one and primarily relies on non-clinical testing rather than extensive new clinical studies for acceptance criteria.

    Based on the provided text, a detailed breakdown of acceptance criteria and a study proving those criteria is challenging because the document primarily asserts substantial equivalence through a comparison to a predicate device and relies on generalized non-clinical testing rather than specific new performance metrics for the modified device.

    Here's an attempt to extract the requested information based on the available text:


    1. A table of acceptance criteria and the reported device performance

    The document does not explicitly state numerical acceptance criteria like sensitivity, specificity, or accuracy for diagnostic performance. Instead, the "acceptance criteria" are implied to be the continued equivalent functionality and safety to the predicate device and compliance with relevant standards. The "reported device performance" is a confirmation that these functionalities are maintained.

    Acceptance Criteria (implied) and Reported Device Performance:

    Compliance with medical device software life cycle requirements (IEC 62304): Developed in compliance with IEC 62304.
    Maintains the intended-use functionality of the predicate: "Continues to meet its performance specifications."
    Hardware compatibility interfaces (especially with 3rd-party software): "Same intended use, functionality, and hardware compatibility interfaces with 3rd party software."
    Effective and functional with image capture devices: "Bench testing, effectiveness, and functionality were successfully conducted and verified."
    DICOM compliance: DBSWIN is DICOM compliant.
    No new issues of safety or effectiveness: "The results of the testing did not raise new issues of safety or effectiveness."
    Meets minimum system requirements: A hardware requirements table is provided for various OS, CPU, RAM, etc.

    2. Sample size used for the test set and the data provenance (e.g. country of origin of the data, retrospective or prospective)

    The document does not specify a sample size for a test set in the context of clinical performance evaluation. The testing described is primarily non-clinical: "Bench testing," "Full functional software cross check testing." There is no mention of data provenance (country of origin, retrospective/prospective) because clinical data are not discussed.

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts (e.g. radiologist with 10 years of experience)

    No information is provided regarding experts, ground truth establishment, or their qualifications because the document does not describe a clinical study requiring such a test set. The software is described as a "diagnostic aide for licensed radiologists, dentists and clinicians, who perform the actual diagnosis based on their training, qualification, and clinical experience." This implies that the human expert is the ultimate arbiter of diagnosis, not the software.

    4. Adjudication method (e.g. 2+1, 3+1, none) for the test set

    No information is provided regarding adjudication methods as no clinical test set requiring such expert review is described.

    5. Whether a multi-reader multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI vs. without AI assistance

    No MRMC study was mentioned or implied. The device is imaging software, not explicitly an AI-assisted diagnostic tool in the sense of providing automated interpretations or significant decision support that would require such a study. It's a tool for acquiring, displaying, and editing images.

    6. Whether a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done

    A "standalone" performance study for an algorithm in a diagnostic sense was not done. The software's performance is gauged through non-clinical functional testing and its ability to process images. Its role is as a "diagnostic aide" to a human professional.

    7. The type of ground truth used (expert consensus, pathology, outcomes data, etc.)

    No specific type of "ground truth" (e.g., pathology, expert consensus) is mentioned because the document does not describe the evaluation of a diagnostic algorithm against such a truth. The testing focuses on functional verification and compliance with standards.

    8. The sample size for the training set

    No information about a training set is provided. This type of submission (Special 510(k)) does not typically involve the training of new algorithms but rather the verification of modified software versions against established functionalities of previously cleared devices.

    9. How the ground truth for the training set was established

    Not applicable, as no training set or specific diagnostic algorithm requiring ground truth for training is mentioned. The document describes a software update for an image management system.

