
510(k) Data Aggregation

    K Number: K130383
    Date Cleared: 2013-04-09 (54 days)
    Regulation Number: 892.2050

    Intended Use

    Synapse 3D Cardiac Tools is medical imaging software used with Synapse 3D Base Tools that is intended to provide trained medical professionals with tools to aid them in reading, interpreting, reporting, and treatment planning. Synapse 3D Cardiac Tools accepts DICOM compliant medical images acquired from a variety of imaging devices including, CT, MR, NM, and XA.

    This product is not intended for use with, or for the primary diagnostic interpretation of, mammography images. In addition to the tools in Synapse 3D Base Tools, Synapse 3D Cardiac Tools provides specific clinical applications with targeted workflows, custom UI, targeted measurements, and reporting functions, including:

    • Functional cardiac analysis for CT left ventriculography images: intended to evaluate the functional characteristics of the heart
    • Functional cardiac analysis for non-contrast MR heart images: intended to evaluate the functional characteristics of the heart
    • Coronary artery analysis for CT coronary arteriography images: intended for the qualitative and quantitative analysis of coronary arteries
    • Coronary artery analysis for MR heart images: intended for the qualitative and quantitative analysis of coronary arteries
    • Calcium scoring for non-contrast CT heart images: intended for non-invasive identification and quantification of calcified atherosclerotic plaques in the coronary arteries using tomographic medical image data and clinically accepted calcium scoring algorithms
    • Cardiac Fusion: intended to analyze cardiac anatomy and pathology with a fused image of functional data (e.g., NM image, bull's-eye plot) and anatomical data
    • Aortic Valve Analysis for contrast CT heart images: intended for visualization of the heart, aorta regions, and contour of the aorta; measurement of the vicinity of the aortic valve; and measurement of the calcification area in the aorta
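The "clinically accepted calcium scoring algorithms" referenced above are not named in the summary; the most widely used is the Agatston method. The following is a minimal illustrative sketch of that method, not Fujifilm's implementation — all function names and inputs are assumptions.

```python
# Hypothetical sketch of Agatston calcium scoring, one widely used
# "clinically accepted" algorithm. Each detected lesion is scored as
# (area in mm^2) x (a weight from its peak attenuation in Hounsfield units).

def agatston_weight(peak_hu: int) -> int:
    """Density weight from the lesion's peak attenuation (HU)."""
    if peak_hu < 130:
        return 0  # below the conventional calcification threshold
    if peak_hu < 200:
        return 1
    if peak_hu < 300:
        return 2
    if peak_hu < 400:
        return 3
    return 4

def agatston_score(lesions: list[tuple[float, int]]) -> float:
    """Total score over all lesions, each given as (area_mm2, peak_hu).

    Lesions smaller than 1 mm^2 are conventionally ignored as noise.
    """
    return sum(area * agatston_weight(peak)
               for area, peak in lesions
               if area >= 1.0)

# Example: two scorable lesions plus one sub-threshold speck.
score = agatston_score([(5.0, 250), (2.0, 450), (0.5, 300)])
```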
    Device Description

    Synapse 3D Cardiac Tools is the updated version of the previously cleared Synapse 3D Cardiac Tools software (cleared by CDRH via K120636 on 07/05/2012).

    Synapse 3D Cardiac Tools is used in addition to the Synapse 3D Base Tools (K120361) to analyze the images acquired from CT and MR. Synapse 3D Cardiac Tools is intended to provide trained medical professionals with tools to aid them in reading, interpreting, and treatment planning of DICOM compliant medical images.

    Synapse 3D Cardiac Tools is an application that supports the cardiac function, cardiac fusion, and coronary artery analysis of both the computed tomography (CT) and magnetic resonance (MR) images. Synapse 3D Cardiac Tools also supports the calcium scoring for non-contrast CT images.
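As an illustration of the kind of functional measurement such cardiac analysis tools report, left-ventricular ejection fraction is derived from end-diastolic and end-systolic volumes. This sketch is not from the submission; the function name and sample values are assumptions.

```python
# Illustrative functional cardiac measurement: left-ventricular
# ejection fraction (LVEF) from chamber volumes.

def ejection_fraction(edv_ml: float, esv_ml: float) -> float:
    """LVEF (%) = (EDV - ESV) / EDV * 100."""
    if edv_ml <= 0:
        raise ValueError("end-diastolic volume must be positive")
    return (edv_ml - esv_ml) / edv_ml * 100.0

# Example: EDV 120 mL, ESV 50 mL gives an LVEF of about 58%.
ef = ejection_fraction(120.0, 50.0)
```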

    AI/ML Overview

    Here's a breakdown of the acceptance criteria and the study that proves the device meets them, based on the provided text:

    Preamble: The provided 510(k) summary focuses on demonstrating substantial equivalence to predicate devices, primarily K120636 and K120367. The testing described is general to software verification and validation, rather than a specific clinical performance study with detailed acceptance criteria for diagnostic metrics. The document emphasizes that the device is an "updated version" and an "application that supports" trained medical professionals, suggesting it's a tool for analysis rather than an autonomous diagnostic algorithm.

    1. Table of Acceptance Criteria and Reported Device Performance

    The provided 510(k) document does not contain a specific table of quantitative acceptance criteria for diagnostic performance metrics (e.g., sensitivity, specificity, accuracy) for a standalone AI algorithm. It describes general verification and validation activities for software and accuracy for measurements.

    The document states: "Pass/Fail criteria were based on the requirements and intended use of the product. Test results showed that all tests successfully passed."

    Based on the information, here's a conceptual table. Since quantitative diagnostic performance metrics are not given, the "Reported Device Performance" is inferred from the overall statement of successful testing.

    Acceptance Criteria Category | Specific Criteria (Inferred/General) | Reported Device Performance
    System Functionality | Meets Software Requirements Specification and intended use. | All tests passed successfully.
    Segmentation Accuracy | Accurate segmentation of cardiac structures (details not specified). | Achieved expected accuracy performance.
    Measurement Accuracy | Accurate measurements (e.g., ejection fraction, volumes, calcification scores, aortic valve dimensions). | Achieved expected accuracy performance.
    Interfacing | Seamless integration with DICOM-compliant systems. | All tests passed successfully.
    Usability | User-friendly interface and workflow (details not specified). | All tests passed successfully.
    Serviceability | Maintainable and serviceable (details not specified). | All tests passed successfully.
    Labeling | Complies with labeling requirements. | All tests passed successfully.
    Risk Mitigation | All identified hazards appropriately mitigated. | All tests passed successfully.
    Overall Performance | Safe and effective, substantially equivalent to predicate devices. | Demonstrated substantial equivalence; found safe and effective.

    2. Sample Size for the Test Set and Data Provenance

    The document states: "In addition, we conducted the bench performance testing using actual clinical images to help demonstrate that the proposed device achieved the expected accuracy performance."

    • Sample Size for Test Set: Not specified. The exact number of "actual clinical images" used for bench performance testing is not mentioned.
    • Data Provenance: The images were "actual clinical images." The country of origin and whether they were retrospective or prospective are not specified.

    3. Number of Experts Used to Establish Ground Truth for the Test Set and Qualifications

    The document does not specify the number of experts used or their qualifications for establishing ground truth for the "actual clinical images."

    4. Adjudication Method for the Test Set

    The document does not specify any adjudication method (e.g., 2+1, 3+1) for establishing ground truth for the "actual clinical images."

    5. If a Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study Was Done

    No. The document does not mention a Multi-Reader Multi-Case (MRMC) comparative effectiveness study. The focus is on the software's inherent functionality and measurement accuracy as a tool rather than its impact on human reader performance.

    6. If a Standalone (Algorithm Only Without Human-in-the-Loop Performance) Was Done

    The document describes "bench performance testing using actual clinical images to help demonstrate that the proposed device achieved the expected accuracy performance." This implies a standalone assessment of the device's accuracy in segmentation and measurement.

    • Yes, a standalone assessment of the device's performance was conducted, specifically for "segmentation accuracy test" and "measurement accuracy test." However, the specific metrics (e.g., Dice scores, absolute difference in measurements) and their success thresholds are not detailed.
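The Dice score named above as an example segmentation metric can be computed as follows. This sketch is purely illustrative of the metric itself; the submission does not state which metrics were used, and the representation of masks as coordinate sets is an assumption.

```python
# Illustrative Dice similarity coefficient, a common metric for
# segmentation accuracy. Masks are represented as sets of voxel
# coordinates; a real evaluation would typically use image arrays.

def dice(pred: set, truth: set) -> float:
    """Dice = 2|A ∩ B| / (|A| + |B|); 1.0 means perfect overlap."""
    if not pred and not truth:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * len(pred & truth) / (len(pred) + len(truth))

# Example: two 3-voxel masks sharing 2 voxels give 2*2/6 ≈ 0.667.
d = dice({(0, 0), (0, 1), (1, 1)}, {(0, 1), (1, 1), (1, 0)})
```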

    7. The Type of Ground Truth Used

    The document does not explicitly state the specific type of ground truth used (e.g., pathology, clinical outcomes). Given the context of "segmentation accuracy test" and "measurement accuracy test" using "actual clinical images" for cardiac analysis, the ground truth would most likely have been expert consensus or reference measurements manually performed by qualified experts on these clinical images.

    8. The Sample Size for the Training Set

    The document does not mention or specify a training set size. This is typical for 510(k) submissions for image processing tools that might use rule-based algorithms or pre-trained models. If machine learning was involved in specific features, the training data and methods are not detailed in this summary.

    9. How the Ground Truth for the Training Set Was Established

    Since a training set is not explicitly mentioned, the method for establishing its ground truth is also not described.
