K Number
K142919
Device Name
EXA
Manufacturer
Viztek Inc.
Date Cleared
2014-12-16

(69 days)

Product Code
Regulation Number
892.2050
Panel
RA
Reference & Predicate Devices
Viztek OPAL-RAD™ (predicate)
Intended Use

EXA™ is a software device that receives digital images and data from various sources (e.g., CT scanners, ultrasound systems, R/F units, computed and direct radiographic devices, secondary capture devices, scanners, imaging gateways, or other imaging sources). Images and data can be stored, communicated, processed, and displayed within the system and/or across computer networks at distributed locations. Lossy compressed mammographic images are not intended for diagnostic review. Mammographic images should only be viewed with a monitor approved by the FDA for viewing mammographic images. For primary diagnosis, post-processed DICOM "for presentation" images must be used. Typical users of this system are trained professionals, nurses, and technicians.
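The intended use places two machine-checkable restrictions on primary diagnostic review: the image must not be lossy-compressed, and it must be a DICOM "for presentation" image. A minimal sketch of such a screen is below; the transfer syntax UIDs are standard DICOM identifiers, but `is_diagnostic_quality` is a hypothetical helper for illustration, not part of the EXA product.

```python
# Transfer syntaxes that permit lossy compression (standard DICOM UIDs).
LOSSY_TRANSFER_SYNTAXES = {
    "1.2.840.10008.1.2.4.50",  # JPEG Baseline (Process 1)
    "1.2.840.10008.1.2.4.51",  # JPEG Extended (Process 2 & 4)
    "1.2.840.10008.1.2.4.81",  # JPEG-LS Near-Lossless
    "1.2.840.10008.1.2.4.91",  # JPEG 2000 (lossy allowed)
}


def is_diagnostic_quality(transfer_syntax: str, presentation_intent: str) -> bool:
    """Return True if an image may be used for primary diagnostic review:
    not lossy-compressed, and flagged 'FOR PRESENTATION' in Presentation
    Intent Type (0008,0068)."""
    if transfer_syntax in LOSSY_TRANSFER_SYNTAXES:
        return False
    return presentation_intent == "FOR PRESENTATION"


# A lossy JPEG mammogram is rejected; an uncompressed "for presentation"
# image (Explicit VR Little Endian) passes.
print(is_diagnostic_quality("1.2.840.10008.1.2.4.50", "FOR PRESENTATION"))  # False
print(is_diagnostic_quality("1.2.840.10008.1.2.1", "FOR PRESENTATION"))     # True
```

A real viewer would read these values from the file meta header and dataset rather than take them as strings, but the gating logic is the same.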

Device Description

EXA™ is a suite of web-based PACS applications developed specifically to handle the DICOM protocol, both for transmitting and for viewing DICOM images and data elements. The applications were designed so that the PACS can be accessed from any Microsoft Windows computer with internet connectivity, and they offer an interface that becomes intuitive after brief initial learning. They are not intended for use on mobile devices. The EXA™ applications handle DICOM images from all common modalities, including MR, CT, CR, US, MG, and many others. These images can be viewed, annotated, transmitted to other facilities, printed, animated, and stored using the EXA™ RAD suite.
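A PACS organizes the images it receives by the DICOM information model (Study > Series > SOP Instance, each keyed by a UID). The toy index below sketches that structure; the class and method names are illustrative and not taken from the EXA software.

```python
from collections import defaultdict


class DicomIndex:
    """Minimal in-memory index mirroring the Study > Series > Instance
    hierarchy a PACS uses to file received DICOM objects."""

    def __init__(self):
        # study_uid -> series_uid -> list of SOP Instance UIDs
        self._studies = defaultdict(lambda: defaultdict(list))

    def store(self, study_uid: str, series_uid: str, sop_instance_uid: str) -> None:
        """Register a received instance under its study and series."""
        self._studies[study_uid][series_uid].append(sop_instance_uid)

    def query_series(self, study_uid: str) -> list:
        """Return the series UIDs known for a study (a C-FIND-style query)."""
        return list(self._studies[study_uid].keys())


idx = DicomIndex()
idx.store("1.2.3", "1.2.3.1", "1.2.3.1.1")
idx.store("1.2.3", "1.2.3.1", "1.2.3.1.2")
idx.store("1.2.3", "1.2.3.2", "1.2.3.2.1")
print(idx.query_series("1.2.3"))  # ['1.2.3.1', '1.2.3.2']
```

A production archive would persist this index and validate UIDs, but the hierarchy it navigates is the same one DICOM query/retrieve operations traverse.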

AI/ML Overview

The provided document is a 510(k) premarket notification for a Picture Archiving and Communications System (PACS) named EXA™. This document focuses on demonstrating substantial equivalence to a predicate device (Viztek OPAL-RAD™) through software validation and a comparison of characteristics rather than a clinical study with specific acceptance criteria related to diagnostic performance.

Therefore, the requested information regarding acceptance criteria for device performance, sample sizes for test and training sets, expert qualifications, ground-truth establishment, or clinical effectiveness studies (MRMC or standalone AI) is not explicitly present or applicable in the context of this 510(k) submission.

Here's a breakdown based on the information available:

1. Table of Acceptance Criteria and Reported Device Performance

Not applicable in the conventional sense of diagnostic performance metrics (e.g., sensitivity, specificity). The acceptance criteria were based on software verification and validation, ensuring the device meets its predetermined software requirements specifications and performs equivalently to the predicate device in terms of functionality and image handling.

| Acceptance Criteria (Implied) | Reported Device Performance (Implied) |
| --- | --- |
| Software meets predetermined requirements | Software verification and validation performed; device meets its predetermined Software Requirements Specifications. |
| Equivalent functionality to predicate device | Comparison to the predicate device indicates similar functionality, with described improvements. |
| Safe and effective as predicate device | Concluded to be as safe and effective as the predicate device based on software validation and comparison. |
| Image quality equal to or better than predicate | Clinical images collected (likely for comparison purposes) demonstrate equal or better image quality compared to the predicate. |

2. Sample size used for the test set and the data provenance

  • Test Set Sample Size: Not explicitly stated. The testing involved "Software Verification and Validation" and "Clinical images collected." It's likely these were internal tests rather than a formal clinical study with a defined patient cohort.
  • Data Provenance: Not specified. "Clinical images collected" suggests real-world data, but details about country of origin or whether it was retrospective/prospective are not provided. Given the nature of a PACS, the images themselves would originate from various imaging modalities.

3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts

Not applicable. The submission focuses on software functionality and image handling, not on diagnostic accuracy requiring expert ground truth in a clinical trial. The "clinical images collected" were likely used to verify image quality and system functionality, not for diagnostic performance evaluation against an expert-established ground truth.

4. Adjudication method for the test set

Not applicable. No formal adjudication method is mentioned as there's no clinical diagnostic performance study described.

5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done, what was the effect size of how much human readers improve with AI vs. without AI assistance?

No MRMC comparative effectiveness study was done. This device is a PACS, which is infrastructure for displaying and managing medical images, not an AI-assisted diagnostic tool.

6. If a standalone study (i.e., algorithm-only, without human-in-the-loop performance) was done

Not applicable. This is not an AI algorithm with standalone performance. It is a PACS system.

7. The type of ground truth used

Not applicable. For a PACS, the "ground truth" would relate to the accurate storage, retrieval, processing, and display of image data conforming to DICOM standards and system specifications, rather than a clinical diagnosis.
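For storage and retrieval, that kind of "ground truth" reduces to a bit-exactness check: the bytes retrieved must match the bytes stored. The sketch below illustrates one way to verify a round trip via checksums; `store` and `verify_roundtrip` are stand-ins for illustration, not EXA APIs.

```python
import hashlib

_archive = {}  # stand-in for the PACS archive backend


def store(sop_uid: str, data: bytes) -> str:
    """Store instance bytes and return their SHA-256 digest,
    recorded at ingest time as the integrity reference."""
    _archive[sop_uid] = data
    return hashlib.sha256(data).hexdigest()


def verify_roundtrip(sop_uid: str, expected_digest: str) -> bool:
    """Retrieve the instance and confirm it is bit-identical
    to what was originally stored."""
    retrieved = _archive[sop_uid]
    return hashlib.sha256(retrieved).hexdigest() == expected_digest


digest = store("1.2.3.1.1", b"\x00\x01" * 512)  # fake pixel data
print(verify_roundtrip("1.2.3.1.1", digest))  # True
```

Lossless handling is exactly what makes such a check meaningful; for lossy-compressed objects (excluded from primary mammography review above) byte-level identity would not be expected after transcoding.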

8. The sample size for the training set

Not applicable. This is not an AI device that requires a training set in the machine learning sense. The "training" for such a system refers to software development and testing based on predetermined requirements.

9. How the ground truth for the training set was established

Not applicable.

Summary of the Study that Proves the Device Meets Acceptance Criteria:

The study that proves the device meets the (implied) acceptance criteria is described as Software Verification and Validation and Risk Analysis.

  • Methodology: The submission states, "The results of software validation and comparison to our predicate device indicates that the new device is as safe and effective as our predicate device. Clinical images collected demonstrate equal or better image quality as compared to our predicate." It further clarifies, "The testing showed that the device meets its predetermined Software Requirements Specifications."
  • Conclusion: Based on the software validation and risk analysis, Viztek Inc. concluded that EXA™ is as safe and effective as the predicate device, has few technological differences, and has the same indications for use, thus rendering it substantially equivalent to the predicate device.

Essentially, the "study" was a comprehensive software testing and comparison effort, confirming that the new PACS system performed its intended functions in a manner comparable to or better than a previously cleared PACS, and that all software requirements were met. It did not involve a clinical utility study to evaluate diagnostic accuracy or improvements for human readers.

§ 892.2050 Medical image management and processing system.

(a) Identification. A medical image management and processing system is a device that provides one or more capabilities relating to the review and digital processing of medical images for the purposes of interpretation by a trained practitioner of disease detection, diagnosis, or patient management. The software components may provide advanced or complex image processing functions for image manipulation, enhancement, or quantification that are intended for use in the interpretation and analysis of medical images. Advanced image manipulation functions may include image segmentation, multimodality image registration, or 3D visualization. Complex quantitative functions may include semi-automated measurements or time-series measurements.

(b) Classification. Class II (special controls; voluntary standards—Digital Imaging and Communications in Medicine (DICOM) Std., Joint Photographic Experts Group (JPEG) Std., Society of Motion Picture and Television Engineers (SMPTE) Test Pattern).