
Search Results

Found 2 results

510(k) Data Aggregation

    K Number: K192038
    Manufacturer:
    Date Cleared: 2019-12-02 (125 days)
    Product Code:
    Regulation Number: 892.2050
    Reference Devices: K133821, K171358

    Intended Use

    The Emprint™ Visualization Application is a stand-alone software product that allows physicians to visualize and compare CT and MRI imaging data. The display, annotation, and volume rendering of medical images aids in ablation procedures conducted using Emprint™ ablation systems. The software is not intended for diagnosis.

    Device Description

    The Emprint™ Visualization Application is a software product that achieves its medical purpose without being part of the hardware of a medical device (SaMD). The Emprint™ Visualization Application is used to support Emprint™-system ablation procedures by displaying patient CT and MRI images with modeled ablation zones/volumes. The application is a Windows™ desktop program that is installed on a hospital computer with local storage and a network connection. The software receives CT and MRI images by supporting DICOM connections with CT/MRI scanners and hospital PACS. The software's DICOM image viewer does not in any way alter the medical images. The device is designed to meet the procedure planning and evaluation needs of physicians conducting soft tissue ablation procedures using Emprint™-branded systems only.
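    The claim that the DICOM viewer never alters the stored images is worth unpacking: viewers typically apply a display-only transform (such as linear window/level mapping) to a copy of the pixel data. The sketch below illustrates that idea in Python; it is not the manufacturer's code, and the function name and window values are illustrative assumptions only.

```python
# Illustrative sketch (not Covidien/Medtronic code): a simplified linear
# window/level mapping, the kind of display transform a DICOM viewer applies.
# The stored pixel values are never modified; only the display copy changes.

def window_level(pixels, center, width):
    """Map raw pixel values to 8-bit display values with a linear window.
    Returns a new list; the input list is left untouched."""
    lo = center - width / 2.0
    hi = center + width / 2.0
    out = []
    for p in pixels:
        if p <= lo:
            out.append(0)
        elif p >= hi:
            out.append(255)
        else:
            out.append(round((p - lo) / width * 255))
    return out

# Example: a common soft-tissue CT window (center 40 HU, width 400 HU)
raw = [-1000, -160, 40, 240, 1000]      # Hounsfield units (hypothetical)
disp = window_level(raw, center=40, width=400)
# raw is unchanged after the call; only disp holds display values
```

    Because the transform produces a new list, the original dataset (the "medical image") is preserved exactly, which is one way a viewer can satisfy a "does not alter the medical images" requirement.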

    AI/ML Overview

    Based on the provided text, the Emprint™ Visualization Application is a standalone software product (Software as a Medical Device - SaMD) that allows physicians to visualize and compare CT and MRI imaging data to aid in ablation procedures. It is not intended for diagnosis. The performance testing described focuses on various aspects of software quality and usability rather than a comparative effectiveness study with human readers or a standalone AI performance evaluation for diagnostic purposes.

    Here's an analysis of the acceptance criteria and the study that proves the device meets them, according to the provided document:

    Acceptance Criteria and Reported Device Performance

    The document describes performance testing that focused on software verification and human factors engineering. Explicit quantitative acceptance criteria are not presented in a table format with corresponding performance metrics for features like sensitivity, specificity, or accuracy in a diagnostic sense. Instead, the performance is described in terms of compliance with standards and functional verification.

    Table of Acceptance Criteria and Reported Device Performance:

    Acceptance Criterion (Implicit/Explicit) | Reported Device Performance
    Functional Performance & Accuracy:
    - Ability to import and view standard DICOM images in 3 dimensions | Software receives CT and MRI images by supporting DICOM connections with CT/MRI scanners and hospital PACS. User can import standard DICOM images and view them in 3 dimensions.
    - Ability to select and view specific anatomical features | User can select and view specific anatomical features (e.g., soft-tissue lesions, anatomical landmarks).
    - Ability to measure and mark critical anatomical features/areas | User can measure and mark critical anatomical features/areas of interest. System-level testing verified the application's measurement accuracy (+/- 2 voxels).
    - Ability to overlay and position virtual images (antenna/ablation zone) | User can overlay and position virtual images of the Emprint™ ablation antenna and the anticipated thermal ablation zone onto the medical image. The device references zone charts (look-up tables) that characterize Emprint™ Ablation System performance for sizing and displaying predicted ablation zones.
    - Ability to add textual annotations | User can add textual annotations to images.
    - Ability to export annotated plans | User can export annotated plans for the patient's medical record or for use in a radiology or operating suite.
    - Ability to view and compare imported images simultaneously | User can view and compare any 2 of the imported images simultaneously; comparison of images across patients is also mentioned.
    - No alteration of medical images | The software's DICOM image viewer does not in any way alter the medical images.
    Software Quality & Compliance:
    - Compliance with NEMA PS 3.1-3.20:2016 (DICOM standard) | Demonstrated compliance. The device is a DICOM image viewer and supports DICOM connections.
    - Compliance with IEC 62304:2006 (Medical Device Software Life Cycle) | Demonstrated compliance.
    Usability/Human Factors:
    - Meets user needs and expectations | A human-factors engineering (HFE) process was followed, and simulated-use validation testing was conducted to confirm that the visualization application met user needs and expectations. Workflow and user-interface optimizations were performed based on intraprocedural use in the CT suite.
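    The summary says predicted ablation zones are sized from zone charts (look-up tables) that characterize Emprint™ Ablation System performance. A hedged sketch of how such a look-up might work is below; the chart entries, function names, and power setting are entirely hypothetical, invented for illustration, and real charts would come from the manufacturer's preclinical characterization.

```python
# Hypothetical zone-chart look-up with linear interpolation.
# All numeric values are invented for illustration only.
from bisect import bisect_left

# (ablation time in minutes) -> predicted zone diameter in mm,
# for one fixed, hypothetical power setting
ZONE_CHART_100W = [(2.0, 15.0), (4.0, 22.0), (6.0, 27.0), (10.0, 33.0)]

def predicted_diameter(chart, minutes):
    """Linearly interpolate a predicted ablation-zone diameter from a
    (time, diameter) chart. Out-of-range times clamp to the chart ends."""
    times = [t for t, _ in chart]
    if minutes <= times[0]:
        return chart[0][1]
    if minutes >= times[-1]:
        return chart[-1][1]
    i = bisect_left(times, minutes)
    (t0, d0), (t1, d1) = chart[i - 1], chart[i]
    frac = (minutes - t0) / (t1 - t0)
    return d0 + frac * (d1 - d0)

# Example: a 3-minute ablation falls midway between the 2- and 4-minute rows
diameter = predicted_diameter(ZONE_CHART_100W, 3.0)
```

    A table-driven approach like this is rule-based rather than learned, which is consistent with the summary's point that no machine-learning training set is described.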

    Study Details:

    1. Sample sizes used for the test set and data provenance:

      • The document does not specify a sample size for a "test set" in the context of image data for diagnostic performance.
      • It mentions "extensive software verification testing" including "software subsystem and system-level verification" and "simulated-use, validation testing" for human factors.
      • The data provenance of the images used in these tests (e.g., country of origin, retrospective or prospective) is not explicitly stated. The device uses patient CT and MRI images, which would come from clinical practice and would implicitly be retrospective for testing purposes unless generated specifically for the study.
    2. Number of experts used to establish the ground truth for the test set and their qualifications:

      • This information is not provided. Since the device is "not intended for diagnosis" and the testing focused on functional verification and usability (measurement accuracy, workflow), the concept of "ground truth" established by experts for diagnostic performance (e.g., presence/absence of disease) is not applicable or described in this submission summary. The "ground truth" for the measurement accuracy test (e.g., +/- 2 voxels) would likely be based on known geometric properties of test objects or simulated environments rather than expert interpretation of pathology.
    3. Adjudication method for the test set:

      • Not applicable/described as there's no mention of a diagnostic performance study requiring expert adjudication of cases.
    4. If a Multi-Reader Multi-Case (MRMC) comparative effectiveness study was done, and the effect size of how much human readers improve with AI vs. without AI assistance:

      • No, an MRMC comparative effectiveness study was not done.
      • The device is a visualization and planning tool, not an AI for diagnosis or a system designed to directly improve human reader performance in interpreting images for diagnostic tasks. Its purpose is to aid in ablation procedures by displaying, annotating, and volume rendering medical images.
    5. If a standalone (i.e., algorithm only without human-in-the-loop performance) was done:

      • The document doesn't describe a standalone performance evaluation in the typical sense of an AI algorithm making a diagnostic decision. The device itself is a "stand-alone software product" (SaMD), but its function is as a visualization and planning aid, not autonomous decision-making. Its "standalone" nature refers to it being a software application distinct from a hardware device. Performance testing focused on software functions, DICOM compliance, and measurement accuracy, which are "algorithm-only" in the sense that the software correctly performs its programmed tasks.
    6. The type of ground truth used:

      • For measurement accuracy: The document states "system-level testing was conducted to verify the application's measurement accuracy (+/- 2 voxels)". The ground truth for this would likely be an engineered or known value within test objects or simulated datasets, not expert consensus or pathological findings from real patients.
      • For functional correctness: The ground truth is the expected behavior and output of the software as per its design specifications and standard compliance (e.g., DICOM standard conformance, correct display of images, faithful representation of ablation zones based on look-up tables).
      • For usability: The ground truth is user needs and expectations, assessed through human factors engineering and simulated-use validation testing.
    7. The sample size for the training set:

      • The document does not mention a "training set" in the context of machine learning or AI models that learn from data. The device's description suggests it primarily uses rule-based logic (e.g., referencing "zone charts (look-up tables)" for ablation zones) and established imaging principles rather than a deep learning model requiring a large training dataset.
    8. How the ground truth for the training set was established:

      • Not applicable, as no training set for a machine learning model is described. The "zone charts" mentioned for ablation zone prediction are likely derived from preclinical studies or physical principles of the ablation systems, not from ground truth established by experts interpreting images.

    K Number: K180192
    Manufacturer:
    Date Cleared: 2018-03-21 (56 days)
    Product Code:
    Regulation Number: 892.2050
    Reference Devices: K133821, K171358

    Intended Use

    The Emprint™ Ablation Visualization is a stand-alone software product that allows physicians to visualize and compare CT imaging data. The display, annotation, and volume rendering of medical images aids in ablation procedures conducted using Emprint™ ablation systems. The software is not intended for diagnosis.

    Device Description

    The Emprint™ Ablation Visualization Application is a software product that achieves its medical purpose without being part of the hardware of a medical device (SaMD). The visualization application is a DICOM image viewer that allows surgeons and interventionists to utilize a health care facility's PACS (Picture Archiving and Communications System) server (or other digital media transfer process) to view and interact with CT images. Using pre-procedure, intra-procedure, and post-procedure CT images, physicians can both plan and evaluate soft-tissue ablation procedures conducted with the Emprint™ Ablation System (K133821) and the Emprint™ SX Ablation Platform (K171358). The application is designed to streamline and enhance the procedure planning, execution, and evaluation process; it is not required for the safe and effective conduct of an Emprint™ ablation procedure.

    Using the application's three ablation workflows (Liver, Lung and Kidney Ablation), physicians can prepare for or evaluate an Emprint™ system ablation procedure by viewing and annotating patient-specific anatomy. The visualization's Compare Workflow facilitates the comparison of images across patients, or the comparison of images from the same patient before and after an ablation procedure. Physicians can use the software to perform the following, key, workflow-based tasks:

    • Import standard DICOM images and render them in 3-dimensions
    • Select and view specific anatomical features (e.g., soft-tissue lesions, anatomical landmarks)
    • Measure and mark critical anatomical features / areas of interest
    • Overlay and position virtual images of the Emprint ablation antenna and the anticipated thermal ablation zone onto the medical image
    • Add textual annotations to images
    • Save annotated plans for future viewing
    • Export annotated plans for the patient's medical record or for use in a radiology or operating suite
    • View and compare any 2 of the imported DICOM datasets, simultaneously

    The visualization application is designed for installation and use on Windows™-based computers (Windows™ 7 or 10) and is compatible with procedure plans that were generated with the predicate device (Emprint™ Procedure Planning Application, K142048).

    AI/ML Overview

    The provided text describes a software device called "Emprint™ Ablation Visualization Application" and its performance testing to demonstrate substantial equivalence to a predicate device. However, it does not contain specific acceptance criteria or detailed results of a study that proves the device meets those criteria, especially in terms of metrics like sensitivity, specificity, accuracy, or comparative performance with human readers if it were an AI-driven diagnostic aid.

    The information provided focuses on the device's function as a DICOM image viewer, its intended use for visualizing and comparing CT images for ablation procedures, and general software verification testing. It explicitly states, "The software is not intended for diagnosis." This is crucial. Since it's not a diagnostic AI, the typical performance metrics associated with AI devices (like sensitivity, specificity, or MRMC studies) are not pertinent or captured in this submission.

    Therefore, many of the requested points regarding acceptance criteria and performance against those criteria cannot be directly extracted from the provided text. I will address the points that can be inferred or explicitly stated.

    Here's an analysis based on the provided text:

    1. A table of acceptance criteria and the reported device performance

    The document does not provide a table of performance-based acceptance criteria beyond general statements about software functioning and measurement accuracy.

    Acceptance Criteria (Inferred from text) | Reported Device Performance
    Proper functioning of user interface for visualization workflows | "System-level verification was conducted to fully evaluate the user interface for the visualization's workflows." (Implies successful verification.)
    Measurement accuracy | "System-level testing was conducted to verify the application's measurement accuracy (+/- 2 voxels)." (Meets this criterion.)
    Compliance with IEC 62304:2006 (Software Life Cycle Processes) | "Performance testing demonstrated the Emprint™ Ablation Visualization Application's compliance with... IEC 62304:2006."
    Compliance with NEMA PS 3.1-3.20:2014 (DICOM) | "Performance testing demonstrated the Emprint™ Ablation Visualization Application's compliance with... NEMA PS 3.1-3.20:2014" and "The subject and predicate devices are both DICOM image viewers and comply with the associated NEMA DICOM Standard."
    Meeting user needs and expectations (Human Factors) | "During the product's development, Covidien followed a human factors engineering (HFE) process and conducted simulated-use validation testing to confirm that the visualization application met user needs and expectations." (Implies successful validation.)

    2. Sample size used for the test set and the data provenance

    The document does not specify a "test set" sample size in the context of a dataset of patient images for diagnostic performance evaluation, as the device is not for diagnosis. The testing appears to be functional and human-factors-related.

    • Data Provenance: Not specified, but given it's a visualization tool and not a diagnostic AI, the provenance of "data" in the typical sense (e.g., patient cases) is not detailed. It mentions using pre-procedure, intra-procedure, and post-procedure CT images.

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts

    Not applicable. Since the device is a visualization tool and not a diagnostic AI, there is no mention of establishing ground truth using experts for diagnostic purposes. The ground truth for its functional performance would be self-evident (e.g., does it display the image correctly, does it measure accurately within a defined tolerance).

    4. Adjudication method for the test set

    Not applicable. No diagnostic ground truth was established by experts requiring adjudication.

    5. If a Multi-Reader Multi-Case (MRMC) comparative effectiveness study was done, if so, what was the effect size of how much human readers improve with AI vs without AI assistance

    No. The document explicitly states: "The software is not intended for diagnosis." Therefore, a comparative effectiveness study assessing human reader improvement with AI assistance would not be relevant or performed for this type of device.

    6. If a standalone (i.e. algorithm only without human-in-the-loop performance) was done

    The device itself is standalone software ("SaMD"). Its performance was evaluated through "software subsystem and system-level verification" and "simulated-use, validation testing." These are essentially "algorithm only" tests in the sense that they evaluate the software's inherent functions (e.g., display, annotation, measurement accuracy) rather than its interaction within a diagnostic human-in-the-loop workflow.

    7. The type of ground truth used (expert consensus, pathology, outcomes data, etc.)

    For the measurement accuracy, the "ground truth" would likely be the known voxel dimensions or a calibrated phantom. For other functional aspects (display, annotation, saving), the "ground truth" is simply whether the software performs the intended action correctly according to its specifications. It is not a diagnostic ground truth based on clinical findings or pathology.
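    How a "+/- 2 voxel" accuracy criterion might be checked against a calibrated phantom can be sketched briefly. The sketch below is an assumption-laden illustration, not the actual verification protocol: the function names, the interpretation of the tolerance (2 times the largest voxel dimension), and the phantom geometry are all hypothetical.

```python
# Hypothetical sketch of a phantom-based measurement-accuracy check.
# All names, values, and the tolerance interpretation are assumptions.
import math

def measured_mm(p0, p1, spacing_mm):
    """Physical distance between two voxel coordinates, given per-axis
    voxel spacing in millimetres."""
    return math.sqrt(sum(((a - b) * s) ** 2
                         for a, b, s in zip(p0, p1, spacing_mm)))

def within_tolerance(measured, truth, spacing_mm, tol_voxels=2):
    """Accept if |measured - truth| is within tol_voxels of the largest
    voxel dimension -- one way to read a '+/- 2 voxel' criterion."""
    return abs(measured - truth) <= tol_voxels * max(spacing_mm)

# Hypothetical phantom: two markers a known 50 mm apart,
# scanned at 1 mm x 1 mm x 2 mm voxel spacing
spacing = (1.0, 1.0, 2.0)
m = measured_mm((0, 0, 0), (30, 40, 0), spacing)   # 3-4-5 triangle: 50 mm
ok = within_tolerance(m, 50.0, spacing)
```

    The point of the sketch is that the "ground truth" here is an engineered, known distance, not an expert clinical judgment, which matches the document's reading of this test.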

    8. The sample size for the training set

    Not applicable. This is not an AI/Machine Learning device that undergoes a "training set" in the conventional sense (i.e., learning from data to make predictions or classifications). It's a DICOM viewer with visualization and measurement tools.

    9. How the ground truth for the training set was established

    Not applicable, as there is no "training set."

