Search Results
Found 2 results
510(k) Data Aggregation
(268 days)
SYNGO.VIA WEB VIEWER
syngo.via WebViewer is a software-only device indicated for reviewing medical images from syngo.via. It supports interpretation and evaluation within healthcare institutions, for example, in Radiology, Nuclear Medicine and Cardiology environments (supported image types: CT, MR, CR, DR, DX, PET). It is not intended for storage or distribution of medical images.
syngo.via WebViewer is an option for the syngo.via system and cannot be run without it. It uses a client-server architecture, and the client is intended to run on web clients connected to the healthcare institution's IT infrastructure, where the customer will ensure HIPAA compliance. The communication of syngo.via WebViewer with connected medical IT systems is done via standard interfaces such as, but not limited to, DICOM.
The system is not intended for the display of digital mammography images for diagnosis.
The syngo.via WebViewer is a software-based Picture Archiving and Communications System (PACS) used with the syngo.via system. The syngo.via WebViewer provides secure access to rendered medical image data and basic image manipulation through web browsers and mobile devices within the hospital network.
It extends the syngo.via WebViewer software application previously cleared under K111079. Newly supported image types are PET and X-ray images. It also provides functionality for displaying images via a mobile application on an iPad.
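To make the client-server arrangement concrete, the sketch below shows a generic pattern for a browser or mobile client requesting a server-rendered frame over HTTPS inside the hospital network. It is loosely modeled on the DICOMweb rendered-retrieval pattern; the host, endpoint, parameters, and token handling are hypothetical illustrations, not the actual syngo.via WebViewer interface.

```python
import requests

# Hypothetical values for illustration only; not part of the actual product.
BASE_URL = "https://webviewer.hospital.local/api"
SESSION_TOKEN = "example-session-token"

def fetch_rendered_frame(study_uid: str, series_uid: str, instance_uid: str,
                         window_center: int, window_width: int) -> bytes:
    """Ask the server for a rendered (already windowed and compressed) frame.

    The heavy lifting (loading, windowing, compressing the DICOM frame) happens
    on the server; the client only receives display-ready bytes.
    """
    url = (f"{BASE_URL}/studies/{study_uid}/series/{series_uid}"
           f"/instances/{instance_uid}/rendered")
    response = requests.get(
        url,
        params={"windowCenter": window_center, "windowWidth": window_width},
        headers={"Authorization": f"Bearer {SESSION_TOKEN}"},
        timeout=10,
    )
    response.raise_for_status()
    return response.content  # e.g., JPEG bytes ready to draw in the client
```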
The Siemens syngo.via WebViewer (K130998) is PACS viewing software. The provided document does not contain acceptance criteria or a study that directly demonstrates the device meets specific performance criteria through metrics like sensitivity, specificity, or accuracy for diagnostic tasks. Instead, the submission focuses on establishing substantial equivalence to a predicate device (syngo.via WebViewer, K111079) based on its intended use, technical characteristics, and the results of non-clinical software verification and validation.
1. Table of Acceptance Criteria and Reported Device Performance:
Because the device is PACS viewing software, the acceptance criteria are not expressed in terms of diagnostic performance metrics (e.g., sensitivity, specificity) but rather in terms of functional performance, adherence to standards, and safety. The document states that "software verification and validation (Unit Test Level, Integration Test Level and System Test Level) was performed for all newly developed components and the complete system according to the following standards." The table below summarizes the implied acceptance criteria from the non-clinical tests and the device's adherence; an illustrative interface-level check of the kind implied by the DICOM rows follows the table:
| Acceptance Criterion (Implied from Standards & V&V) | Reported Device Performance |
|---|---|
| Adherence to DICOM Standard | Software verification and validation performed |
| Adherence to ISO/IEC 15444-1:2005 + TC 1:2007 (JPEG 2000) | Software verification and validation performed |
| Adherence to ISO/IEC 10918-1:1994 + TC 1:2005 (JPEG) | Software verification and validation performed |
| Adherence to HL7 [2006] | Software verification and validation performed |
| Adherence to IEC 62304:2006 (Medical device software) | Software verification and validation performed |
| Adherence to IEC 62366:2007 (Usability) | Software verification and validation performed |
| Adherence to ISO 14971:2007 (Risk Management) | Software verification and validation performed; risk analysis performed to identify potential hazards |
| Adherence to IEC 60601-1-4:2000 (Safety) | Software verification and validation performed |
| Secure access to rendered medical image data | Ensured via syngo.via WebViewer Data Management |
| Basic image manipulation functionality | Provided via web browsers and mobile devices |
| Compatibility with supported image types (CT, MR, CR, DR, DX, PET) | DICOM-formatted images supported |
| Compatibility with connected medical IT systems via standard interfaces (e.g., DICOM, HL7) | Communication via standard interfaces |
| System safety and effectiveness | Instructions for use, cautions, and warnings in labeling; risk management process followed |
| Substantial equivalence to predicate device | Confirmed through comparison of intended use and technical characteristics |
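As a concrete illustration of the interface-level checks implied by the DICOM rows above, the sketch below performs a DICOM C-ECHO (Verification SOP Class) against a connected node using the open-source pynetdicom library. The host, port, and AE titles are placeholders; this is an assumed example of the kind of test that might be run, not a test taken from the submission.

```python
from pynetdicom import AE

VERIFICATION_SOP_CLASS = "1.2.840.10008.1.1"  # DICOM Verification SOP Class UID

def verify_dicom_connectivity(host: str, port: int, called_aet: str = "ANY-SCP") -> bool:
    """Return True if the remote DICOM node accepts an association and answers C-ECHO."""
    ae = AE(ae_title="WEBVIEWER-TEST")          # placeholder calling AE title
    ae.add_requested_context(VERIFICATION_SOP_CLASS)
    assoc = ae.associate(host, port, ae_title=called_aet)
    if not assoc.is_established:
        return False
    status = assoc.send_c_echo()                # C-ECHO-RQ / C-ECHO-RSP exchange
    assoc.release()
    return getattr(status, "Status", None) == 0x0000  # 0x0000 means Success

# Example usage with placeholder connection details:
# verify_dicom_connectivity("pacs.hospital.local", 104, called_aet="ARCHIVE")
```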
2. Sample Size Used for the Test Set and Data Provenance:
The document does not describe a clinical study or a test set of medical images with a specific sample size used to evaluate diagnostic performance. The validation mentioned is "software verification and validation (Unit Test Level, Integration Test Level and System Test Level)", which refers to engineering and software quality assurance testing rather than a clinical performance study using patient data. Therefore, there is no information on sample size or data provenance (e.g., country of origin, retrospective/prospective).
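As an illustration of what a unit-level check of the kind described above might look like, the sketch below tests a window/level (VOI LUT) implementation against the linear window function defined in DICOM PS3.3 C.11.2.1.2. The function under test and the test values are this document's own assumptions, not Siemens code or test data.

```python
import unittest

def apply_window(x: float, center: float, width: float,
                 y_min: float = 0.0, y_max: float = 255.0) -> float:
    """Map a stored pixel value to a display value with the DICOM linear window."""
    if x <= center - 0.5 - (width - 1) / 2:
        return y_min
    if x > center - 0.5 + (width - 1) / 2:
        return y_max
    return ((x - (center - 0.5)) / (width - 1) + 0.5) * (y_max - y_min) + y_min

class TestWindowing(unittest.TestCase):
    def test_values_below_window_map_to_minimum(self):
        self.assertEqual(apply_window(-200, center=40, width=400), 0.0)

    def test_values_above_window_map_to_maximum(self):
        self.assertEqual(apply_window(1000, center=40, width=400), 255.0)

    def test_window_midpoint_maps_to_mid_gray(self):
        # x = center - 0.5 is the midpoint of the linear ramp.
        self.assertAlmostEqual(apply_window(39.5, center=40, width=400), 127.5)

if __name__ == "__main__":
    unittest.main()
```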
3. Number of Experts Used to Establish the Ground Truth for the Test Set and Qualifications of Those Experts:
This level of detail is not provided as the submission focuses on software validation and substantial equivalence, not a clinical study involving ground truth establishment by experts for diagnostic performance.
4. Adjudication Method for the Test Set:
Not applicable, as no clinical test set for diagnostic performance requiring expert adjudication is described.
5. Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study:
No MRMC comparative effectiveness study is mentioned or appears to have been performed for this 510(k) submission. The document focuses on showing substantial equivalence to a predicate device rather than demonstrating a performance improvement with or without AI assistance.
6. Standalone Performance Study:
No standalone (algorithm-only, without human-in-the-loop) performance study is described, as the device is medical image viewing software, not an AI diagnostic algorithm. The "software-only device" designation refers to its deployment model, not to functioning as an autonomous diagnostic tool.
7. Type of Ground Truth Used:
Ground truth as understood in the context of diagnostic accuracy studies (e.g., pathology, expert consensus) is not mentioned. The "ground truth" for the software validation activities would be the expected functional behavior and adherence to standards, checked against specified requirements.
8. Sample Size for the Training Set:
Not applicable. The device is a viewing software, not an AI/ML algorithm that requires a training set of data.
9. How the Ground Truth for the Training Set Was Established:
Not applicable, as no training set is relevant for this type of device.
(25 days)
SYNGO.VIA WEB VIEWER
syngo.via WebViewer is intended to be a software-only solution for reviewing medical images from syngo.via. The system cannot be used as a stand-alone device; it is intended to be an option for the syngo.via system only. syngo.via WebViewer is not intended for storage or distribution of medical images from one medical device to another. syngo.via WebViewer uses a client-server architecture, and the client is intended to run on web clients connected to the healthcare institution IT infrastructure, where the customer has to ensure HIPAA compliance. syngo.via WebViewer supports interpretation and evaluation of examinations within healthcare institutions, for example, in Radiology, Nuclear Medicine and Cardiology environments. The communication of syngo.via WebViewer with connected medical IT systems is done via standard interfaces such as, but not limited to, DICOM. The system is not intended for the display of digital mammography images for diagnosis in the U.S.
This premarket notification covers Siemens' PACS syngo.via WebViewer. syngo.via WebViewer is intended to be a software-only solution for reviewing medical images from syngo.via. The system cannot be used as a stand-alone device; it is intended to be an option for the syngo.via system only. syngo.via WebViewer is not intended for storage or distribution of medical images from one medical device to another. syngo.via WebViewer uses a client-server architecture, and the client is intended to run on web clients connected to the healthcare institution IT infrastructure, where the customer has to ensure HIPAA compliance. syngo.via WebViewer supports interpretation and evaluation of examinations within healthcare institutions, for example, in Radiology, Nuclear Medicine and Cardiology environments. The communication of syngo.via WebViewer with connected medical IT systems is done via standard interfaces such as, but not limited to, DICOM. The system is not intended for the display of digital mammography images for diagnosis in the U.S. The system is a software-only medical device. It defines minimum requirements for the hardware it runs on; the hardware itself is not considered a medical device and is not in the scope of this 510(k) submission. The system supports the physician in diagnosis and treatment planning.
The provided text is a 510(k) Summary for the Siemens syngo.via WebViewer. This document describes the device, its intended use, and its substantial equivalence to predicate devices. However, this document does not contain explicit acceptance criteria or a study demonstrating that the device meets such criteria.
The 510(k) submission process for a Picture Archiving and Communications System (PACS) like syngo.via WebViewer primarily focuses on establishing substantial equivalence to existing legally marketed devices, rather than conducting new clinical trials with specific statistical performance metrics and acceptance criteria as might be expected for an AI-powered diagnostic device.
Here's a breakdown of why the requested information is largely absent based solely on the provided text, and what could be inferred or is generally understood for this type of device:
1. A table of acceptance criteria and the reported device performance
- Acceptance Criteria: Not explicitly stated in the document. For a PACS viewer, acceptance criteria would typically revolve around functional performance (e.g., image display accuracy, speed, compliance with DICOM standards, user interface usability, data integrity, security).
- Reported Device Performance: Not quantitatively reported in the document. The document affirms that the device "passed all necessary verification and validation steps to demonstrate safety and effectiveness" and "does not introduce any new significant potential safety risks and is substantially equivalent to and performs as well as the predicate devices." This is a qualitative statement of performance relative to predicates rather than measured performance against specific criteria.
2. Sample size used for the test set and the data provenance
- Test Set Sample Size: Not specified. Validation for a PACS viewer typically involves testing against various types of DICOM images and objects (CT, MR, SC, PDF) to ensure correct display and functionality. The "sample" here would be representative medical images, but the document doesn't quantify how many were used.
- Data Provenance: Not specified. It's implied that various types of DICOM images would be used for testing, but their origin (country, retrospective/prospective) is not detailed.
3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts
- Ground Truth Experts: Not applicable or specified. For a PACS viewer, "ground truth" isn't generally established by experts in the same way it would be for a diagnostic AI algorithm. The viewer's "ground truth" is adherence to DICOM standards for image display and manipulation, and the successful execution of its functions. The verification and validation would be conducted by software testers and engineers, likely with medical domain knowledge.
4. Adjudication method for the test set
- Adjudication Method: Not applicable or specified. Again, for a PACS viewer, the "test set" isn't typically adjudicated like an AI diagnostic outcome. Functional and performance testing would involve objective checks against specifications and industry standards.
5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done, and if so, the effect size of how much human readers improve with AI vs. without AI assistance
- MRMC Study: No. The syngo.via WebViewer is a medical image viewing and management system, not an AI-assisted diagnostic tool designed to improve human reader performance. Therefore, an MRMC study and an "effect size of how much human readers improve with AI vs. without AI assistance" are not relevant to this device as described.
6. If a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done
- Standalone Performance: Not applicable as a diagnostic algorithm. The device itself is described as a "software-only solution for reviewing medical images" that "cannot be used as stand-alone device," meaning it requires the syngo.via system to function. It operates as a standalone software component in its specified role, but its "performance" lies in displaying images and managing data, not in making diagnoses without a human.
7. The type of ground truth used
- Type of Ground Truth: Not applicable in the traditional sense of a diagnostic device (e.g., pathology, outcomes data). The "ground truth" for a PACS viewer essentially refers to the correct and accurate display of DICOM images as per specifications and standards, and the proper functioning of its features (e.g., measurements, windowing). This is verified through comparison against known correct outputs or adherence to technical specifications; a minimal sketch of such a check appears after this breakdown.
8. The sample size for the training set
- Training Set Sample Size: Not applicable. The syngo.via WebViewer is not described as an AI/ML device that requires a "training set" in the context of machine learning model development. It's a traditional software application.
9. How the ground truth for the training set was established
- Ground Truth for Training Set: Not applicable, as there is no "training set."
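As noted in item 7 above, the "ground truth" for a viewer is correct display and correct behavior of features such as measurements, verified against known correct outputs. A minimal sketch of such an objective check follows, assuming a hypothetical distance-measurement helper rather than any actual syngo.via WebViewer code.

```python
import math

def measure_distance_mm(p1, p2, pixel_spacing):
    """Distance between two image points in millimetres, using DICOM Pixel
    Spacing given as (row spacing, column spacing)."""
    row_spacing, col_spacing = pixel_spacing
    dy = (p2[0] - p1[0]) * row_spacing   # row index difference -> vertical mm
    dx = (p2[1] - p1[1]) * col_spacing   # column index difference -> horizontal mm
    return math.hypot(dx, dy)

# Objective check against a known correct output: two points 30 columns apart
# at 0.5 mm column spacing must measure exactly 15.0 mm.
assert math.isclose(measure_distance_mm((10, 10), (10, 40), (0.5, 0.5)), 15.0)
```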
Summary based on the provided document:
The 510(k) summary focuses on demonstrating the substantial equivalence of the syngo.via WebViewer to previously cleared predicate devices (Siemens syngo.x and Siemens InSpace 4D). It highlights similarities in intended use, technological characteristics (image communication, processing, supported image types, user interface, hardware), and adherence to relevant industry standards (IEC 62304 for the software lifecycle, ISO 14971 for risk management, IEC 60601-1-6 for usability, and DICOM, HL7, JPEG, and JPEG 2000 for communication and compression).
The "study" referenced implicitly is the verification and validation activities performed during the device's development cycle, which are stated to be compliant with QSR design processes, IEC 62304, ISO 14971, and IEC 60601-1-1-6. These activities would have ensured that the software functions as intended and meets its specifications, including accurate rendering of medical images and adherence to interface standards. However, direct evidence of specific acceptance criteria met through a formal study with statistical outcomes is not part of this 510(k) summary.