
510(k) Data Aggregation

    K Number: K992654
    Manufacturer:
    Date Cleared: 1999-11-05 (88 days)
    Product Code:
    Regulation Number: 892.2050

    Reference & Predicate Devices

    Reference Devices: K980648
    Intended Use

    Plug 'n View 3D is a software application for the display and 3D visualization of medical image data derived from CT and MRI scans. It is intended for use by radiologists, clinicians and referring physicians to acquire, process, render, review, store, print and distribute DICOM 3.0 compliant image studies, utilizing standard PC hardware.

    Device Description

    Plug 'n View 3D is a software application for the display and 3D visualization of medical image data derived from CT and MRI scans. It is intended for use by radiologists, clinicians and referring physicians to acquire, process, render, review, store, print and distribute DICOM 3.0 compliant image studies, utilizing standard PC hardware.
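    The "display" functions described above center on the window/level transform radiologists use to map raw CT/MR intensities onto screen brightness. The submission gives no implementation details; the following is a minimal illustrative sketch (not from the submission), using the conventional linear window/level mapping, where "level" is the center of the displayed intensity range and "window" its width:

    ```python
    def window_level(pixel, window, level):
        """Map a raw scanner intensity to an 8-bit display value using a
        linear window/level transform: intensities below the window clamp
        to black, above it to white, and in between scale linearly."""
        lo = level - window / 2.0
        hi = level + window / 2.0
        if pixel <= lo:
            return 0
        if pixel >= hi:
            return 255
        return int((pixel - lo) / window * 255)

    # Example: a typical soft-tissue CT window (width 400, center 40)
    print(window_level(40, window=400, level=40))    # mid-gray
    print(window_level(-500, window=400, level=40))  # below the window: black
    print(window_level(500, window=400, level=40))   # above the window: white
    ```

    Interactive "real-time window-level" in a viewer like this one amounts to re-running such a mapping as the user drags the window and level values.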

    AI/ML Overview

    The provided 510(k) submission for the "Plug 'n View 3D" device does not contain a study that demonstrates the device meets specific acceptance criteria in the manner typically expected for AI/ML-based medical devices today. This submission, dated November 5, 1999, predates the common practice of extensive clinical validation studies with detailed performance metrics.

    Instead, the submission focuses on demonstrating substantial equivalence to a predicate device ("Pro Vision Diagnostic Workstation") based on technical features and intended use. The "acceptance criteria" here are implicitly the feature set and functionality of the predicate device, which the new device aims to match or exceed.

    Here's an analysis based on the provided text, addressing the requested information:

    1. A table of acceptance criteria and the reported device performance

    Since this is a submission based on substantial equivalence to a predicate device rather than performance against pre-defined clinical acceptance criteria, the "acceptance criteria" are the features and capabilities of the predicate device, and the "reported device performance" is a comparison to those features.

    Feature / Acceptance Criteria (from Predicate Device) vs. Plug 'n View 3D Performance:

    Computer platform: Pentium MMX, Windows 95, 98 or NT (different platform, but considered functionally equivalent).
    DICOM compliance: DICOM-3 for CT, MRI, NM, CR, SC and Ultrasound (single-frame) images (comparable, with added Ultrasound support).
    2D imaging: 2D image viewer with real-time window-level, zoom, pan, rotate, flip and cine; multiple grid layouts (same).
    Measurement: 2D measurement tools including line, angle and ROI statistics (same).
    Multi-Planar Reformatting (MPR): MPR into any user-defined linear plane (the predicate also offers a curved plane; a difference, but likely not a safety or effectiveness concern for the claimed use).
    Volume Rendering: volume rendering with interactive opacity/transparency control, clipping VOI, zoom, pan and rotate (same).
    Maximum Intensity Projection (MIP): MIP with interactive window-level, clipping VOI, zoom, pan and rotate (same).
    Image editing: tools for removal of obscuring anatomy (same).
    Printing: printing to standard Windows printers (the predicate has DICOM printing and non-DICOM laser imagers; a difference that does not affect core visualization).
    Ease of use: visualization presets and semi-automated steps for typical image review procedures (comparable to the predicate's customized presets).
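    For concreteness, the MIP entry above can be illustrated with a short sketch. This code is not from the submission; it assumes only a volume stored as nested Python lists indexed (slice, row, column) with integer intensities:

    ```python
    def max_intensity_projection(volume):
        """Collapse a 3D volume along the slice axis: each output pixel
        is the brightest voxel encountered along that ray, which is the
        core operation of a Maximum Intensity Projection (MIP)."""
        depth = len(volume)
        rows = len(volume[0])
        cols = len(volume[0][0])
        return [[max(volume[z][y][x] for z in range(depth))
                 for x in range(cols)]
                for y in range(rows)]

    # Toy volume: two 2x2 slices
    vol = [
        [[10, 50],
         [ 0, 99]],
        [[70,  5],
         [ 3, 99]],
    ]
    print(max_intensity_projection(vol))  # [[70, 50], [3, 99]]
    ```

    A real viewer would additionally apply the interactive window-level, clipping VOI and rotation controls the table lists, but the projection itself is this per-ray maximum.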

    2. Sample size used for the test set and the data provenance (e.g. country of origin of the data, retrospective or prospective)

    The submission does not mention any specific test set, sample size, or data provenance for a performance study. The evaluation relies on a comparison of technical specifications and features to a predicate device, not on a clinical performance study with patient data.

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts (e.g. radiologist with 10 years of experience)

    Not applicable. No ground truth establishment is described, as the submission does not involve a performance study with a test set requiring expert interpretation.

    4. Adjudication method (e.g. 2+1, 3+1, none) for the test set

    Not applicable. No adjudication method is mentioned as there is no described test set or expert review process.

    5. If a multi-reader, multi-case (MRMC) comparative effectiveness study was done, what was the effect size of human reader improvement with versus without AI assistance

    Not applicable. The "Plug 'n View 3D" is a 3D visualization and image processing tool, not an AI-assisted diagnostic device, and no MRMC study is mentioned. The submission is from 1999, prior to widespread AI in medical imaging.

    6. If a standalone study (i.e. algorithm-only performance, without a human in the loop) was done

    Not applicable. The device is a user-operated visualization tool; it is not an autonomous algorithm.

    7. The type of ground truth used (expert consensus, pathology, outcomes data, etc)

    Not applicable. No ground truth is mentioned. The device's "acceptance" is based on functional equivalence to a predicate, not diagnostic accuracy against a ground truth.

    8. The sample size for the training set

    Not applicable. As this device is a software application for image display and 3D visualization, and not an AI/ML diagnostic algorithm, there is no mention of a training set for machine learning.

    9. How the ground truth for the training set was established

    Not applicable. There is no training set mentioned.
