The syngo.MR post-processing applications are software applications to be used for viewing and evaluating the designated images provided by a magnetic resonance diagnostic device. Each of the software applications comprising the syngo.MR post-processing applications has its own indications for use.
syngo.MR Neurology is syngo-based post-processing software for viewing, manipulating, and evaluating MR neurological images.
syngo.MR Oncology is syngo-based post-processing software for viewing, manipulating, and evaluating MR oncological images.
syngo.MR Neurology and syngo.MR Oncology are syngo.via-based post-processing software / applications to be used for viewing and evaluating MR images provided by a magnetic resonance diagnostic device and enabling structured evaluation of MR images.
syngo.MR Neurology and syngo.MR Oncology comprise the following:
syngo.MR Neurology covers single and engine applications:
• syngo.MR Neuro Perfusion
• syngo.MR Neuro Perfusion Mismatch
• syngo.MR Neuro fMRI
• syngo.MR Tractography
• syngo.MR Neuro Perfusion Engine
• syngo.MR Neuro Perfusion Engine Pro (NEW)
• syngo.MR Neuro 3D Engine
• syngo.MR Neuro Dynamics (NEW)
syngo.MR Oncology covers single and engine applications:
• syngo.MR Onco
• syngo.MR 3D Lesion Segmentation
• syngo.MR Tissue4D
• syngo.MR Onco Engine
• syngo.MR Onco Engine Pro (NEW)
• syngo.MR OncoCare (NEW)
The provided text is a 510(k) summary for the syngo.MR Neurology and syngo.MR Oncology post-processing software. This document primarily focuses on establishing substantial equivalence to a predicate device rather than detailing specific acceptance criteria and a standalone study proving the device meets those criteria.
Therefore, much of the requested information regarding specific acceptance criteria, detailed study results, sample sizes, ground truth establishment for a test set, and multi-reader multi-case studies is not present in this document.
However, based on the provided text, here's what can be extracted and what information is missing:
1. A table of acceptance criteria and the reported device performance
This document does not provide specific quantitative acceptance criteria or a table of reported device performance metrics such as sensitivity, specificity, or accuracy. The clearance is based on substantial equivalence to a predicate device (syngo.MR Post-Processing Software Version SMRVA16B, K133401) under the regulation for Picture Archiving and Communication Systems (PACS) (21 CFR 892.2050). That clearance pathway typically rests on functional equivalence and safety rather than on a diagnostic performance study against a clinical gold standard for the post-processing applications themselves.
2. Sample size used for the test set and the data provenance (e.g. country of origin of the data, retrospective or prospective)
This information is not provided in the document. The document refers to "non-clinical data" suggesting an equivalent safety and performance profile, but it does not detail a specific test set, its size, or provenance for a clinical performance study.
3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts (e.g. radiologist with 10 years of experience)
This information is not provided in the document. Since no specific clinical performance study is detailed, there's no mention of experts or their qualifications for establishing ground truth.
4. Adjudication method (e.g. 2+1, 3+1, none) for the test set
This information is not provided in the document.
5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI versus without AI assistance
This information is not provided in the document. The document focuses on the capabilities of the post-processing software for viewing, manipulating, and evaluating images, and it does not describe a comparative effectiveness study involving human readers with and without AI assistance for improved diagnostic accuracy.
6. If a standalone study (i.e. algorithm-only performance, without human-in-the-loop) was done
A standalone performance study against a clinical ground truth for diagnostic parameters (like sensitivity/specificity) is not detailed in this document. The description of the device is as "post-processing software / applications to be used for viewing and evaluating," implying human interpretation of the processed images.
7. The type of ground truth used (expert consensus, pathology, outcomes data, etc)
This information is not provided in the document.
8. The sample size for the training set
Given that this is a 510(k) for a PACS-related software and not a de novo AI diagnostic device, there is no mention of a training set in the context of machine learning model development. The "syngo.MR Neurology" and "syngo.MR Oncology" applications and their sub-components are described as post-processing software that perform tasks like visualization of temporal variations, calculation of differences between lesions, workflow-oriented visualization of fMRI, and 3D tractographic data utilization. While these involve algorithms, the document doesn't frame them as AI models requiring a training set in the typical sense for a diagnostic claim.
9. How the ground truth for the training set was established
Since there is no mention of a training set or machine learning model development, this information is not applicable/provided in the document.
Summary of what the document indicates regarding "acceptance criteria" and "study":
The "acceptance criteria" for this 510(k) appear to be based on the general safety and effectiveness of the device, primarily by demonstrating substantial equivalence to an already cleared predicate device (syngo.MR Post-Processing Software Version SMRVA16B, K133401). The "study" that proves the device meets these criteria is an internal assessment against recognized standards and the predicate device.
- General Safety and Effectiveness Concerns: The document states that the device labeling contains instructions for use and warnings, and that risk management is ensured via a Risk Analysis compliant with ISO 14971:2007. Risks are controlled through measures in software development, software testing, and product labeling.
- Conformance to Standards: The device conforms to applicable FDA recognized and international IEC, ISO, and NEMA standards.
- Substantial Equivalence: The primary "proof" is the comparison to the predicate device. The new functionalities (syngo.MR Neuro Dynamics and syngo.MR OncoCare) are stated to give the device "greater capabilities than the predicate" but that "the Intended Use, the basic technological characteristics and functionalities remain the same."
- Conclusion: Siemens believes the device "do[es] not raise new questions of safety or effectiveness and are substantially equivalent to the currently marketed device."
In essence, this 510(k) submission relies on the established safety and performance profile of a predicate device and adherence to general device safety and quality standards, rather than presenting a de novo clinical performance study with defined acceptance criteria for diagnostic accuracy.
§ 892.2050 Medical image management and processing system.
(a)
Identification. A medical image management and processing system is a device that provides one or more capabilities relating to the review and digital processing of medical images for the purposes of interpretation by a trained practitioner of disease detection, diagnosis, or patient management. The software components may provide advanced or complex image processing functions for image manipulation, enhancement, or quantification that are intended for use in the interpretation and analysis of medical images. Advanced image manipulation functions may include image segmentation, multimodality image registration, or 3D visualization. Complex quantitative functions may include semi-automated measurements or time-series measurements.
(b)
Classification. Class II (special controls; voluntary standards—Digital Imaging and Communications in Medicine (DICOM) Std., Joint Photographic Experts Group (JPEG) Std., Society of Motion Picture and Television Engineers (SMPTE) Test Pattern).