
510(k) Data Aggregation

    K Number: K070954
    Date Cleared: 2007-04-20 (15 days)
    Product Code:
    Regulation Number: 892.2050
    Reference & Predicate Devices:
    Predicate For: N/A
    Intended Use

    PatientGallery is a Windows-based image management system indicated for use primarily by dentists to acquire, archive, display, edit, print, email, and import/export digital images.

    Device Description

    PatientGallery Imaging Software is a Windows-based image management database, or system used primarily by Dentists for the indications of acquiring, archiving, displaying, editing, printing, emailing, and importing or exporting digital images.

    The database is organized like a file cabinet, by patient folder, and accommodates grayscale and color images, which may also include textual and graphic notes and annotations. Images may be assembled into layouts that can be customized as required. Optional modules provide editing and viewing functions.

    Image acquisition from x-ray sensors, intra-oral video cameras, digital cameras and flatbed scanners can be distributed among several workstations. The software provides direct interfaces to many industry standard devices through OEM toolkits. PatientGallery Imaging Software uses standard Windows peer-to-peer networking. By default, images are stored in the native format provided by the hardware manufacturers. PatientGallery Imaging Software may be invoked by other practice management applications so that specific patient information is accessed.

    AI/ML Overview

    The following is an analysis of the provided text regarding the acceptance criteria and study for the PatientGallery Imaging Software:

    This submission (K070954) for the PatientGallery Imaging Software does not contain the detailed information requested regarding specific acceptance criteria for diagnostic performance or a study demonstrating such performance.

    Instead, the submission focuses on the functionality and safety of the software as an image management system, demonstrating substantial equivalence to a predicate device based on its intended use and general software verification and validation.

    Therefore, many of the requested fields cannot be directly populated from the provided document. I will fill in what can be inferred or explicitly stated, and note when information is missing.


    Acceptance Criteria and Device Performance Study (K070954)

    1. Table of Acceptance Criteria and Reported Device Performance

    Functional Performance
    • Ability to acquire digital images from various sources (x-ray sensors, intra-oral video cameras, digital cameras, flatbed scanners). Reported: "The software provides direct interfaces to many industry standard devices through OEM toolkits."
    • Ability to archive digital images. Reported: "database... used primarily by Dentists for the indications of acquiring, archiving..."
    • Ability to display digital images (grayscale and color). Reported: "displaying... digital images."
    • Ability to edit digital images (including textual and graphic notes/annotations). Reported: "editing... digital images... Optional modules provide editing and viewing functions."
    • Ability to print digital images. Reported: "printing... digital images."
    • Ability to email digital images. Reported: "emailing... digital images."
    • Ability to import/export digital images. Reported: "importing or exporting digital images."
    • Ability to store images in the native format provided by hardware manufacturers. Reported: "By default, images are stored in the native format provided by the hardware manufacturers."
    • Ability to be invoked by other practice management applications to access specific patient information. Reported: "PatientGallery Imaging Software may be invoked by other practice management applications so that specific patient information is accessed."

    Safety and Reliability
    • No unacceptable hazards identified; all identified hazards appropriately mitigated. Reported: "A Hazard Analysis was performed... which led to the development of Software Requirement Specifications (SRS)."
    • Software performs as indicated in specifications. Reported: "The V&V testing was passed, demonstrating that the PatientGallery Imaging Software performs as indicated."

    Substantial Equivalence
    • Device functions as described and is substantially equivalent to the predicate device (TigerView Professional, K061035). Reported: "The information contained in this Pre-market Notification is sufficient to demonstrate that the PatientGallery Imaging Software functions as described, and is substantially equivalent to the TigerView Professional software..."

    2. Sample size used for the test set and the data provenance

    • Sample Size for Test Set: Not specified in terms of clinical images or cases. The "Test Cases" mentioned refer to software verification and validation testing, not a clinical performance study with patient data.
    • Data Provenance: Not applicable, as no clinical test set using patient data is described. The "Test Cases" would typically involve synthetic or controlled data to verify software functions.

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts

    • Not applicable. The "ground truth" for the software's functional performance would be its adherence to predefined software requirements, not clinical diagnoses established by medical experts.

    4. Adjudication method for the test set

    • Not applicable. No clinical test set requiring adjudication is described. Software V&V testing typically involves comparing actual software output to expected output based on specifications.
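This comparison of actual output to expected output can be illustrated with a minimal, hypothetical sketch of specification-based verification testing. Everything below (the requirement IDs, the `export_image` function, and the test cases) is illustrative and not taken from the submission:

```python
# A minimal sketch of specification-based software V&V testing:
# each test case pairs a (hypothetical) SRS requirement with an
# expected output, and the actual output is compared against it.

def export_image(data: bytes, fmt: str) -> str:
    """Hypothetical stand-in for an image-export function under test."""
    if fmt not in ("tiff", "jpeg"):
        raise ValueError("unsupported format")
    return f"exported {len(data)} bytes as {fmt}"

# (requirement ID, input arguments, expected output)
test_cases = [
    ("SRS-EXP-01", (b"\x00" * 4, "tiff"), "exported 4 bytes as tiff"),
    ("SRS-EXP-02", (b"\x00" * 4, "jpeg"), "exported 4 bytes as jpeg"),
]

# Record pass/fail per requirement by comparing actual vs. expected.
results = [(req_id, export_image(*args) == expected)
           for req_id, args, expected in test_cases]
all_passed = all(ok for _, ok in results)
```

The point is only the structure: each functional requirement maps to at least one test case with a predefined expected result, so "passing V&V" means every comparison succeeds.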

    5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done, and if so, the effect size of how much human readers improve with AI vs. without AI assistance

    • No. An MRMC comparative effectiveness study was not done. The device is an image management system, not an AI diagnostic aid.

    6. If a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done

    • The "V&V testing" refers to standalone software testing to ensure it meets its functional requirements. This is not a "standalone (algorithm only)" performance in the context of diagnostic accuracy, but rather functional software performance.

    7. The type of ground truth used

    • The "ground truth" for the V&V testing was the Software Requirement Specifications (SRS). The software was tested to ensure it performed according to these written specifications.

    8. The sample size for the training set

    • Not applicable. This device is an image management system, not a machine learning or AI algorithm that requires a training set of data.

    9. How the ground truth for the training set was established

    • Not applicable. No training set is used for this type of software.