
510(k) Data Aggregation

    K Number: K050336
    Date Cleared: 2005-08-24 (195 days)
    Regulation Number: 886.1760
    Device Name: OPD-STATION SOFTWARE

    Intended Use

    The OPD-Station software is indicated for use in analyzing the corneal shape and refractive powers measured by the OPD-Scan Models ARK-9000 or ARK-10000, and to display the data in the form of maps, and manage the data.

    Device Description

    Nidek has developed a stand-alone software option for users of the OPD-Scan™ device called OPD-Station, which will run on an independent PC (i.e., separate from the OPD-Scan™ device). The OPD-Station software is able to access data measured by the OPD-Scan™ device via a separate Nidek data management software package called NAVIS.

    The OPD-Station uses the same software as that used for the OPD-Scan device so that a physician can view OPD-Scan data on their PC of choice. However, the OPD-Station software has the following new functions:

    • Maps of Point Spread Function (PSF), Modulation Transfer Function (MTF), MTF graphing, and Visual Acuity mapping
    • Improved color mapping
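
    For context on what the new PSF and MTF maps represent: in standard Fourier optics, the MTF is the normalized magnitude of the Fourier transform of the PSF. The submission does not describe Nidek's implementation; the snippet below is only an illustrative sketch of that general relationship using NumPy, and the function name and toy Gaussian PSF are hypothetical.

    ```python
    import numpy as np

    def mtf_from_psf(psf: np.ndarray) -> np.ndarray:
        """Compute an MTF map from a 2-D point spread function (PSF).

        The MTF is the magnitude of the Fourier transform of the PSF,
        normalized so the zero-frequency (DC) component equals 1.
        """
        otf = np.fft.fftshift(np.fft.fft2(psf))  # optical transfer function
        mtf = np.abs(otf)                        # modulus -> MTF
        return mtf / mtf.max()                   # normalize DC component to 1

    # Toy example: a Gaussian blur spot standing in for a measured PSF
    y, x = np.mgrid[-64:64, -64:64]
    psf = np.exp(-(x**2 + y**2) / (2 * 5.0**2))
    psf /= psf.sum()

    mtf = mtf_from_psf(psf)
    print(mtf.shape, mtf.max())  # (128, 128) 1.0
    ```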
    AI/ML Overview

    The provided text describes the Nidek Incorporated OPD-Station software, which is a standalone software option for users of the OPD-Scan™ device. The software analyzes corneal shape and refractive powers measured by the OPD-Scan Models ARK-9000 or ARK-10000, displaying the data in various maps and managing the data.

    Here's an analysis of the acceptance criteria and the study that proves the device meets them, based on the provided text:

    1. A table of acceptance criteria and the reported device performance

    The provided document is a 510(k) summary, which focuses on demonstrating substantial equivalence to predicate devices rather than setting specific performance acceptance criteria like sensitivity, specificity, or accuracy metrics. The primary "acceptance criterion" implied throughout this document is substantial equivalence to the predicate devices.

    Acceptance Criterion: Substantial equivalence to predicate devices
    Reported Device Performance: The OPD-Station software uses the same software as the OPD-Scan and adds new functions (PSF and MTF maps, improved color mapping). A comparison of technological characteristics was performed, demonstrating equivalence to marketed predicate devices. The performance data indicate the OPD-Station software meets all specified requirements and is substantially equivalent.

    2. Sample size used for the test set and the data provenance

    The document does not specify any sample size used for a test set (clinical or otherwise) or the data provenance (e.g., country of origin, retrospective vs. prospective). The performance data are described only in general terms as indicating that the device "meets all specified requirements," without detailing the nature of the data or how they were gathered.

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts

    The document does not mention any experts used to establish ground truth or their qualifications. Given that this is a 510(k) for software intended to display and manage existing data from an already cleared device (OPD-Scan), the focus is on the software's functionality and its output being consistent with the predicate device's capabilities, rather than a clinical accuracy study requiring expert adjudication of a test set.

    4. Adjudication method for the test set

    The document does not describe any adjudication method.

    5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done, what was the effect size of how much human readers improve with AI vs. without AI assistance?

    An MRMC comparative effectiveness study was not conducted or mentioned. The OPD-Station software is not described as an AI-assisted device directly improving human reader performance but rather as a tool for analyzing and displaying existing data from another ophthalmic device.

    6. If standalone (i.e., algorithm-only, without human-in-the-loop) performance was evaluated

    The document concerns standalone software ("OPD-Station software") that runs independently on a PC to analyze and display data from the OPD-Scan device. While it is standalone software, the performance described is its ability to access, process, and display OPD-Scan data; it is not "algorithm-only" performance in the sense of an AI algorithm making diagnostic decisions without human involvement. The software itself is the "device" in question, operating independently.

    7. The type of ground truth used

    The document does not specify the type of ground truth used. The verification process appears to rely on comparing the new software's functionality and output with that of the predicate devices. This implies that the "ground truth" for demonstrating equivalence would be the established functionality and output of the cleared predicate devices.

    8. The sample size for the training set

    The document does not mention a training set or its sample size. This suggests that the software development did not involve machine learning or AI models that require training sets. The software's design likely follows deterministic algorithms based on established ophthalmic principles and data processing techniques from the original OPD-Scan.

    9. How the ground truth for the training set was established

    As there is no mention of a training set, there is no information on how its ground truth would have been established.
