K Number
K151115
Device Name
MR Core Software
Manufacturer
Date Cleared
2015-06-03 (37 days)

Product Code
Regulation Number
892.2050
Panel
RA
Reference & Predicate Devices
Predicate: Softread Software (K040305); Reference: Myrian (K091001)
Intended Use

MR Core is an option within Vitrea® that allows the examination of a series of medical images obtained from MRI scanners.

The option also enables clinicians to compare multiple series for the same patient, side-by-side, and switch to other integrated applications to further examine the data.

Device Description

MR Core allows intuitive navigation, quantification, and manipulation of medical images obtained from MRI scanners. This application enables clinicians to compare multiple series of the same patient, side-by-side, and switch to other integrated applications to further examine the data. It provides rich clinical tools to review images for efficient and effective patient care.
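The summary does not describe how MR Core handles image series internally. As a rough illustration of the generic task any such DICOM viewer performs (loading an MRI series and stacking its slices in acquisition order), here is a minimal sketch; it assumes pydicom and numpy are available, and `series_dir` is a hypothetical path, not anything named in the submission:

```python
# Minimal sketch (not from the 510(k)) of generic DICOM series handling:
# read every file in a directory and order the slices into one volume.
from pathlib import Path

import numpy as np
import pydicom

def load_mr_series(series_dir: str) -> np.ndarray:
    """Read all DICOM files in a directory and stack the slices in order."""
    slices = [pydicom.dcmread(p) for p in Path(series_dir).glob("*.dcm")]
    # Sort by InstanceNumber so the stacked volume is anatomically coherent.
    slices.sort(key=lambda ds: int(ds.InstanceNumber))
    return np.stack([ds.pixel_array for ds in slices])

volume = load_mr_series("series_dir")  # hypothetical path
print(volume.shape)  # (number of slices, rows, columns)
```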

AI/ML Overview

The provided document is a 510(k) summary for a medical device called "MR Core software." This document outlines the device's indications for use, its comparison to predicate and reference devices, and a summary of non-clinical tests conducted. However, it explicitly states that "The subject of traditional 510(k) notification, MR Core software, did not require clinical studies to support safety and effectiveness of the software."

Therefore, I cannot extract from this document acceptance criteria or details of a study proving the device meets such criteria. The document primarily focuses on demonstrating substantial equivalence to a predicate device (Softread Software, K040305) and a reference device (Myrian, K091001) through technological and functional comparisons, not through a standalone clinical performance study with specific acceptance criteria.

Despite the lack of explicit clinical performance study data, here's what can be inferred or stated based on the provided text regarding verification and validation:

1. Table of Acceptance Criteria and Reported Device Performance:

Since no clinical study with explicit performance metrics is mentioned, a table of acceptance criteria and reported device performance from a clinical study cannot be created. The document instead shows that functional equivalence was demonstrated and design controls were followed. Implicit "acceptance criteria" can be derived from the non-clinical tests described, which confirm that the software operates according to its requirements and fulfills its intended uses.

| Acceptance Criteria (Inferred from Non-Clinical Tests) | Reported Device Performance (as described in the document) |
|---|---|
| Software fully satisfies all expected system requirements and features. | Test cases executed against system features and requirements; Requirements Traceability Matrix (RTM) reviewed for coverage (see the traceability sketch after this table). |
| Software conforms to user needs and intended use. | Workflow testing conducted, providing evidence that system requirements and features were implemented, reviewed, and met. |
| Experienced medical professionals confirm MR Core software fulfills its intended uses. | All validators confirmed that the MR Core software fulfills its intended uses. |
| Benefits outweigh risks, and residual risks are acceptable. | Each risk assessed; benefits outweigh risks; all risks reduced as low as possible; overall residual risk deemed acceptable. |
| New features operate according to their requirements. | Software testing completed to ensure new features operate according to requirements. |
| Compliance with recognized consensus standards (DICOM, ISO 14971, IEC 62304). | MR Core software complies with the listed voluntary recognized consensus standards. |
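The RTM review cited in the first row can be illustrated with a short coverage check: every requirement must be exercised by at least one executed test case. This is a generic sketch, not the manufacturer's actual process; all requirement and test-case IDs are hypothetical:

```python
# Generic sketch of an RTM coverage check: flag any requirement
# that no executed test case traces back to. IDs are hypothetical.
requirements = {"REQ-001", "REQ-002", "REQ-003"}

# Each executed test case lists the requirement IDs it verifies.
test_cases = {
    "TC-01": {"REQ-001"},
    "TC-02": {"REQ-001", "REQ-003"},
}

covered = set().union(*test_cases.values())
uncovered = requirements - covered
if uncovered:
    print("RTM gap, requirements without test coverage:", sorted(uncovered))
else:
    print("All requirements traced to at least one test case.")
```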

2. Sample Size Used for the Test Set and Data Provenance:

  • Sample Size for Test Set: Not explicitly stated. The document mentions "previously acquired medical images" were used for verification, validation, and evaluation. It does not provide the number of images or cases.
  • Data Provenance: Not explicitly stated. The medical images were "previously acquired," but their country of origin or whether they were retrospective or prospective is not mentioned.

3. Number of Experts Used to Establish Ground Truth for the Test Set and Qualifications:

  • Number of Experts: "Experienced medical professionals" evaluated the application during external validation. The exact number is not specified.
  • Qualifications of Experts: They are referred to as "experienced medical professionals." No specific qualifications (e.g., radiologist with X years of experience) are provided.

4. Adjudication Method for the Test Set:

  • Adjudication Method: Not explicitly mentioned. The document states, "All validators confirmed that the MR Core software fulfills its intended uses." This suggests a consensus or individual validation approach, but no formal adjudication process (like 2+1 or 3+1) is described.

5. If a Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study Was Done:

  • No, an MRMC comparative effectiveness study was not done. The document explicitly states that the MR Core software "did not require clinical studies to support safety and effectiveness of the software." The validation involved experienced medical professionals reviewing the application, which is more akin to usability testing than a comparative effectiveness study measuring improved reader performance with or without AI assistance.

6. If a Standalone (i.e., algorithm only without human-in-the-loop performance) Was Done:

  • The device is "Radiological Image Processing Software" and its intended use involves "examination and manipulation of a series of medical images obtained from MRI scanners" by clinicians. The features described (viewing, manipulation, analysis tools) are meant to be used by clinicians. The "external validation" involved "experienced medical professionals [evaluating] the application." Therefore, the focus is on a human-in-the-loop performance, not a standalone algorithm's diagnostic performance. There is no mention of an algorithm-only standalone performance evaluation.

7. The Type of Ground Truth Used:

  • Given the nature of the device (image viewing and manipulation tools) and the non-clinical validation, the "ground truth" was centered on expert confirmation of functional correctness and fulfillment of intended use. This means the "ground truth" was established by expert consensus/opinion regarding whether the software tools worked as expected and met user needs, rather than a definitive medical diagnosis (e.g., pathology, outcomes data).

8. The Sample Size for the Training Set:

  • Not applicable / not provided. The document describes a traditional 510(k) for a software device that primarily offers image viewing and manipulation functionalities, not an AI/ML algorithm that requires a training set. The descriptions relate to software engineering practices (requirements, code reviews, verification, validation) rather than machine learning model development.

9. How the Ground Truth for the Training Set Was Established:

  • Not applicable / not provided. As there is no mention of a training set for an AI/ML model, there is no discussion of how any "ground truth" for such a set was established.

§ 892.2050 Medical image management and processing system.

(a) Identification. A medical image management and processing system is a device that provides one or more capabilities relating to the review and digital processing of medical images for the purposes of interpretation by a trained practitioner of disease detection, diagnosis, or patient management. The software components may provide advanced or complex image processing functions for image manipulation, enhancement, or quantification that are intended for use in the interpretation and analysis of medical images. Advanced image manipulation functions may include image segmentation, multimodality image registration, or 3D visualization. Complex quantitative functions may include semi-automated measurements or time-series measurements.

(b) Classification. Class II (special controls; voluntary standards—Digital Imaging and Communications in Medicine (DICOM) Std., Joint Photographic Experts Group (JPEG) Std., Society of Motion Picture and Television Engineers (SMPTE) Test Pattern).
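The regulation above names "semi-automated measurements" among the complex quantitative functions. As a toy illustration only (none of this comes from the regulation or the submission), such a measurement typically combines user input with automatic computation, for example a user-drawn rectangular ROI whose statistics the software then calculates:

```python
# Toy illustration of a "semi-automated measurement": the user supplies
# a rectangular ROI, the software computes the statistic. All names here
# are hypothetical and not drawn from the regulation or the submission.
import numpy as np

def roi_mean_intensity(slice_2d: np.ndarray, rows: slice, cols: slice) -> float:
    """Mean pixel intensity inside a user-drawn rectangular ROI."""
    return float(slice_2d[rows, cols].mean())

# Synthetic 12-bit image standing in for one MRI slice.
image = np.random.default_rng(0).integers(0, 4096, size=(256, 256))
print(roi_mean_intensity(image, slice(100, 140), slice(80, 120)))
```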