
510(k) Data Aggregation

    K Number: K170952
    Device Name: syngo.CT View&GO
    Date Cleared: 2017-04-28 (28 days)
    Product Code:
    Regulation Number: 892.2050
    Reference & Predicate Devices
    Reference Devices: K971717

    Intended Use

    syngo.CT View&GO is intended for basic visualization of medical images that are used for diagnostic purposes. The software package is designed to support trained technicians and trained physicians in basic qualitative and basic quantitative measurements as well as in the analysis of clinical data that has been acquired and reconstructed on Computed Tomography scanners. The software package shall also provide the possibility to save image data and to trigger the transfer of image data to other systems, such as printers or archiving systems. The software package shall provide an interface to integrate additional advanced visualization and measurement tools.

    Basic visualization of medical images includes, for example:

    • Adjusting of windowing level presets
    • Zooming and panning of images
    • Multiplanar reconstruction (MPR) display
    • Maximum intensity projection (MIP) display
    • Volume rendering techniques (VRT) display

    Basic qualitative and basic quantitative measurements include, for example:

    • Distance measurements
    • Region of interest (ROI) measurements
    • Pixel lens to measure local HU values
    Device Description

    The application syngo.CT View&GO is intended for basic visualization of medical images that are used for diagnostic purposes. It is designed to support trained technicians and trained physicians in basic qualitative and basic quantitative measurements as well as in the analysis of clinical data that has been acquired and reconstructed on Computed Tomography scanners. The application also provides the possibility to save image data and to trigger the transfer of image data to other systems, such as printers or archiving systems. In addition, syngo.CT View&GO provides an interface to integrate additional advanced post-processing tools through its plug-in functionality.
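
    The submission does not describe how the image transfer is implemented, but transfers to an archiving system are typically DICOM C-STORE operations. The sketch below is only a generic illustration using the pydicom/pynetdicom libraries, not a description of the product's mechanism; the file name, host, port, and AE titles are placeholders.

```python
from pydicom import dcmread
from pynetdicom import AE
from pynetdicom.sop_class import CTImageStorage

ds = dcmread("ct_slice.dcm")                 # assumed local DICOM CT image

ae = AE(ae_title="VIEWER")                   # hypothetical calling AE title
ae.add_requested_context(CTImageStorage)

# Placeholder archive endpoint; in practice this comes from site configuration.
assoc = ae.associate("pacs.example.org", 11112, ae_title="ARCHIVE")
if assoc.is_established:
    status = assoc.send_c_store(ds)          # push the image to the archive
    print("C-STORE status:", status.Status if status else "no response")
    assoc.release()
```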

    The application provides basic visualization features, for example:

    • Adjusting of windowing level presets
    • Zooming and panning of images
    • Multiplanar reconstruction (MPR) display
    • Maximum intensity projection (MIP) display
    • Volume rendering techniques (VRT) display
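
    To make these operations concrete, the sketch below applies a linear window (center/width) to a CT volume in Hounsfield units and derives an axial MIP and a simple MPR re-slice with NumPy. It is a minimal, hypothetical illustration of the underlying math, not the product's implementation; the array shape and window preset values are assumptions.

```python
import numpy as np

def apply_window(hu: np.ndarray, center: float, width: float) -> np.ndarray:
    """Map HU values to a [0, 1] display range using a linear window (center/width)."""
    lo, hi = center - width / 2.0, center + width / 2.0
    return np.clip((hu - lo) / (hi - lo), 0.0, 1.0)

# Assumed CT volume: (slices, rows, cols) in Hounsfield units.
volume = np.random.randint(-1000, 1500, size=(120, 256, 256)).astype(np.float32)

# Example preset (assumed values): a "lung" window, center -600 HU, width 1500 HU.
lung_display = apply_window(volume, center=-600.0, width=1500.0)

# Maximum intensity projection along the slice axis (axial MIP).
mip_axial = volume.max(axis=0)     # shape (256, 256)

# Simple multiplanar reconstruction: a coronal view is a re-slicing of the
# volume along a different axis (no interpolation in this sketch).
coronal = volume[:, 128, :]        # shape (120, 256)
```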

    Furthermore, basic qualitative and quantitative measurements are supported, for example:

    • Distance measurements
    • Region of interest (ROI) measurements
    • Pixel lens to measure local HU values
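
    These measurements reduce to simple arithmetic on the reconstructed voxel grid. The sketch below is a hypothetical illustration rather than the device's code: it computes a distance from two points using the in-plane pixel spacing and summarizes HU values inside a circular ROI. The spacing, coordinates, and slice contents are assumed values.

```python
import numpy as np

# Assumed axial CT slice in Hounsfield units and its pixel spacing (mm).
slice_hu = np.random.randint(-1000, 1500, size=(512, 512)).astype(np.float32)
row_spacing_mm, col_spacing_mm = 0.7, 0.7   # assumed DICOM PixelSpacing

def distance_mm(p0, p1):
    """Euclidean distance between two (row, col) points, scaled to millimetres."""
    dr = (p1[0] - p0[0]) * row_spacing_mm
    dc = (p1[1] - p0[1]) * col_spacing_mm
    return float(np.hypot(dr, dc))

def roi_stats(center, radius_px):
    """Mean and standard deviation of HU values inside a circular ROI."""
    rows, cols = np.ogrid[:slice_hu.shape[0], :slice_hu.shape[1]]
    mask = (rows - center[0]) ** 2 + (cols - center[1]) ** 2 <= radius_px ** 2
    return float(slice_hu[mask].mean()), float(slice_hu[mask].std())

print(distance_mm((100, 120), (180, 300)))   # distance measurement
print(roi_stats((256, 256), radius_px=20))   # ROI mean/std of HU values
print(float(slice_hu[256, 256]))             # "pixel lens": local HU value
```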

    syngo.CT View&GO also provides an interface, through its plug-in functionality, to extend the application with additional advanced post-processing tools.
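
    The submission does not document the plug-in interface itself, so the following is only a generic sketch of how such an extension point is commonly structured: a host that registers tools implementing a small contract. All names here (AdvancedTool, register_tool, ThresholdTool) are hypothetical and are not the syngo.CT View&GO API.

```python
from typing import Dict, Protocol
import numpy as np

class AdvancedTool(Protocol):
    """Hypothetical contract an advanced post-processing plug-in would fulfil."""
    name: str
    def run(self, volume: np.ndarray) -> np.ndarray: ...

_REGISTRY: Dict[str, AdvancedTool] = {}

def register_tool(tool: AdvancedTool) -> None:
    """The host records the tool so it can later be offered to the user."""
    _REGISTRY[tool.name] = tool

class ThresholdTool:
    """Example plug-in: flags voxels above a fixed HU threshold (illustrative only)."""
    name = "bone-threshold"
    def run(self, volume: np.ndarray) -> np.ndarray:
        return (volume > 300).astype(np.float32)   # assumed threshold for dense bone

register_tool(ThresholdTool())
result = _REGISTRY["bone-threshold"].run(np.zeros((8, 8, 8), dtype=np.float32))
```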

    AI/ML Overview

    The provided text outlines the acceptance criteria and the study conducted for the syngo.CT View&GO device. However, it's important to note that this device is a software application for basic medical image visualization and measurements, not an AI/ML diagnostic algorithm. Therefore, many of the typical acceptance criteria and study components associated with AI/ML devices (e.g., specific performance metrics like sensitivity/specificity, expert ground truth for AI output, MRMC studies for AI assistance) are not detailed in this submission. The "study" here primarily refers to non-clinical verification and validation testing to demonstrate functional performance and safety in comparison to a predicate device, rather than a clinical efficacy study often seen with AI models.

    Here's a breakdown based on the provided text, addressing the requested information as much as possible within the context of this submission:


    Acceptance Criteria and Device Performance

    Given that syngo.CT View&GO is a medical image visualization and measurement software, the acceptance criteria are framed in terms of functional performance, adherence to standards, and risk mitigation, rather than diagnostic accuracy metrics. The document emphasizes "all software specifications have met the acceptance criteria" and "the subject device performs as intended." Specific quantitative performance metrics (like sensitivity, specificity, or AUC) are not directly applicable or reported for this type of device's primary functions.

    Table of Acceptance Criteria and Reported Device Performance:

    Acceptance Criteria Category: Functional Performance
    Specific Criteria (implicit/explicit from text):
    • Basic visualization features (adjusting windowing level presets, zooming/panning, MPR, MIP, and VRT display) function as intended.
    • Basic qualitative and quantitative measurements (distance, ROI, pixel lens for HU values) function as intended.
    • Ability to save image data.
    • Ability to trigger transfer of image data to other systems (printers, archiving systems).
    • Interface for integrating additional advanced visualization and measurement tools (plug-in functionality).
    • Improved Shaded Surface Display (Endoscopic View/Fly Through) functions as intended.
    • Workflow improvements (Tool Box/Favorite Tools, Distribution Step) function as intended.
    Reported Device Performance and Evidence:
    • "Performance tests were conducted to test the functionality of the syngo.CT View&GO. These tests have been performed to test the ability of the included features. The results of these tests demonstrate that the subject device performs as intended."
    • "Verification and Validation testing for the endoscopic view feature was conducted to demonstrate successful software integration and performance..."
    • "The result of all conducted testing was found acceptable to support the claim of substantial equivalence."

    Acceptance Criteria Category: Software Quality
    Specific Criteria:
    • Conformance with software specifications.
    • Proper software integration.
    • All identified risks are mitigated.
    Reported Device Performance and Evidence:
    • "All verification testing has been completed and meets Siemens acceptance criteria."
    • "All the software specifications have met the acceptance criteria."
    • "Verification and Validation testing supports the claims of substantial equivalence."
    • "The modifications described in this Premarket Notification were supported with verification/validation testing."

    Acceptance Criteria Category: Safety & Effectiveness
    Specific Criteria:
    • The device is comparable to the predicate device in terms of technological characteristics, safety, and effectiveness.
    • Risk analysis is completed, and identified hazards are mitigated.
    • Conformance to general safety and effectiveness concerns (e.g., labeling, warnings).
    • Compliance with relevant recognized standards (DICOM, ISO 14971, IEC 62304, IEC 62366-1, AAMI/ANSI ES60601-1).
    Reported Device Performance and Evidence:
    • "The testing results support that all the software specifications have met the acceptance criteria."
    • "Testing for verification and validation of the device was found acceptable to support the claims of substantial equivalence."
    • "The risk analysis was completed and risk control implemented to mitigate identified hazards. The testing supports that all software specifications have met the acceptance criteria."
    • "syngo.CT View&GO is designed to fulfill recognized and established industry practice and standards."
    • (The submission includes a table listing compliance with the specific standards.)

    Acceptance Criteria Category: Cybersecurity
    Specific Criteria:
    • Process for preventing unauthorized access, modification, misuse, or denial of use of information.
    Reported Device Performance and Evidence:
    • "Cybersecurity information in accordance with the guidance document 'Content of Premarket Submissions for Management of Cybersecurity in Medical Devices,' issued on October 2, 2014, is included within this submission."

    Study Details (Non-Clinical Validation)

    1. Sample Size Used for the Test Set and Data Provenance:

      • The document states: "Non-clinical tests (integration and functional) were conducted for syngo.CT View&GO during product development." and "Performance tests were conducted to test the functionality of the syngo.CT View&GO."
      • Specific sample sizes (e.g., number of images or datasets) for these non-clinical tests are not provided in the submitted text.
      • Data Provenance: Not explicitly stated (e.g., country of origin, retrospective/prospective). Given that the testing is "non-clinical" and the product is PACS-like viewing software, it likely involves internal test data or anonymized/synthetic datasets rather than clinical patient data used for diagnostic performance evaluation. The testing addresses functionality, not diagnostic accuracy.
    2. Number of Experts Used to Establish Ground Truth for the Test Set and Qualifications:

      • Since this is primarily a functional verification and validation of a visualization and measurement software (not a diagnostic AI), the concept of "ground truth" in the clinical diagnostic sense (e.g., disease presence/absence confirmed by pathology) is not applicable here.
      • The "ground truth" for these tests would likely involve predefined expected outputs/behaviors for various functions, verified by software testers and potentially design engineers, rather than clinical experts establishing a medical diagnosis.
      • Therefore, this information is not provided nor would it be expected for this type of submission.
    3. Adjudication Method for the Test Set:

      • Not applicable as it's not a study requiring adjudication of clinical findings or algorithm outputs. The verification and validation process would involve comparing software behavior to predefined requirements.
    4. If a Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study was Done:

      • No, an MRMC comparative effectiveness study was not done. This type of study is typically performed to demonstrate the impact of a diagnostic AI system on human reader performance. syngo.CT View&GO is a visualization and measurement tool, not a diagnostic AI that assists in interpretation. Its purpose is to provide basic tools for trained technicians and physicians, assuming they already have the necessary knowledge to use these tools for analysis.
    5. If a Standalone (i.e., algorithm only without human-in-the-loop performance) Study was Done:

      • This is not a standalone diagnostic algorithm. It is a software application designed to be used by humans (trained technicians and physicians) for visualization and measurement. The "performance tests" mentioned are essentially "standalone" in the sense that they test the software's functions in isolation to ensure they meet specifications, but not in the context of an AI-driven "decision-making" process.
    6. The Type of Ground Truth Used:

      • For this software, the "ground truth" for testing is the design specifications and functional requirements of the software. For example, a distance measurement tool would be tested against known distances in test images (see the sketch after this list), or a MIP function would be verified to correctly project maximum intensity pixels from a volume. It is not clinical diagnosis, pathology, or outcomes data.
    7. The Sample Size for the Training Set:

      • This device is not an AI/ML model that undergoes a "training" phase with a dataset. It is traditional software.
      • Therefore, there is no training set or sample size for it.
    8. How the Ground Truth for the Training Set Was Established:

      • As there is no training set for this traditional software, this question is not applicable.
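
    As a concrete illustration of item 6, functional verification of a measurement tool can be expressed as a test against the design specification: the expected value comes from known geometry rather than from clinical ground truth. The sketch below is hypothetical; the tolerance, spacing, and function under test are assumptions, not Siemens' actual test procedure.

```python
import math

def measure_distance_mm(p0, p1, pixel_spacing_mm):
    """Function under test (assumed implementation): point-to-point distance in mm."""
    dr = (p1[0] - p0[0]) * pixel_spacing_mm[0]
    dc = (p1[1] - p0[1]) * pixel_spacing_mm[1]
    return math.hypot(dr, dc)

def test_distance_against_known_geometry():
    # Ground truth comes from the specification: a 3-4-5 right triangle in pixels
    # with 1.0 mm spacing must yield 5.0 mm (within an assumed tolerance).
    measured = measure_distance_mm((0, 0), (3, 4), pixel_spacing_mm=(1.0, 1.0))
    assert abs(measured - 5.0) < 0.01

test_distance_against_known_geometry()
```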