K Number
K043412
Date Cleared
2004-12-20

(10 days)

Product Code
Regulation Number
892.2050
Panel
RA
Reference & Predicate Devices
K011694, K942346
Intended Use

ViewStation supports image and information flow among health care facility personnel. ViewStation can be used whenever digital images and associated data are the means for communicating information. ViewStation is not intended for use in diagnosis.

The intended use of ViewStation is to provide health care facility personnel with an effective means to utilize patient images during the course of therapy or treatment. ViewStation allows users to import, view, annotate, manipulate, enhance, manage, and archive patient images. Images and associated information are stored in a database, providing users access to the information necessary to perform their functions.

Device Description

The primary function of ViewStation is to provide a means to more effectively manage image information in a therapy or treatment environment. ViewStation provides the ability to import, view, annotate, manipulate, enhance, and archive patient images during the course of therapy, treatment, and follow-up.

ViewStation imports existing digital images acquired or generated by other products. ViewStation retains the original image, which was acquired or generated by a third party product. With these facts in mind, the goal of ViewStation is to make electronic patient image information more accessible throughout the department. IMPAC is providing a tool to increase department productivity since digital images, unlike films, do not have to be physically transferred from one station to another.
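
As an illustration only, the kind of import-and-index workflow described above might look like the sketch below, which copies the original DICOM file unmodified and records only its metadata in a small index database. The use of pydicom, the SQLite schema, and the directory names are assumptions made for the example, not details from the submission.

```python
# Minimal sketch of an import-and-index workflow of the kind the description
# implies: the original DICOM file is retained byte-for-byte, and only metadata
# is indexed. pydicom, the schema, and the paths are illustrative assumptions.
import shutil
import sqlite3
from pathlib import Path

import pydicom  # third-party DICOM parser, assumed available


def import_image(src: Path, archive_dir: Path, db: sqlite3.Connection) -> None:
    ds = pydicom.dcmread(src, stop_before_pixels=True)  # read metadata only
    dest = archive_dir / f"{ds.SOPInstanceUID}.dcm"
    shutil.copy2(src, dest)  # retain the original file unmodified
    db.execute(
        "INSERT OR REPLACE INTO images (sop_uid, patient_id, modality, path) "
        "VALUES (?, ?, ?, ?)",
        (str(ds.SOPInstanceUID), str(ds.PatientID), str(ds.Modality), str(dest)),
    )
    db.commit()


if __name__ == "__main__":
    conn = sqlite3.connect("viewstation_index.db")  # hypothetical index database
    conn.execute(
        "CREATE TABLE IF NOT EXISTS images "
        "(sop_uid TEXT PRIMARY KEY, patient_id TEXT, modality TEXT, path TEXT)"
    )
    archive = Path("archive")
    archive.mkdir(exist_ok=True)
    for f in Path("incoming").glob("*.dcm"):  # hypothetical intake directory
        import_image(f, archive, conn)
```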

AI/ML Overview

The ViewStation is an Image Management System which is explicitly not intended for diagnostic use but rather for managing images and information flow in a healthcare facility. Given this, the submission does not contain a study involving clinical efficacy or diagnostic performance. Instead, the "acceptance criteria" and "study" are focused on demonstrating that the updated software maintains the safety and effectiveness of the predicate device for its intended non-diagnostic use.

Here's a breakdown:

1. A table of acceptance criteria and the reported device performance

The submission does not present a table of acceptance criteria in the traditional sense of diagnostic performance metrics (e.g., sensitivity, specificity, AUC). Instead, the "acceptance criteria" are implied by the requirements for regulatory compliance, internal quality standards, and successful software development and testing. The "reported device performance" is the successful completion of these processes, affirming that the updated ViewStation maintains its intended non-diagnostic functionality and safety.

Each aspect of acceptance/performance is listed below with the reported performance or method of meeting it:

Safety and Effectiveness: The product change does not diminish safety or effectiveness. A System Hazard Analysis (SHA2102) was performed, documented, reviewed, and implemented. Hazard identification was traced through evaluation, design, specification, implementation, and testing. The Design Review Team confirmed no increased health or safety risk.

Intended Use: Indications for use are identical to the predicate device. "The total sum of all feature enhancements does not affect the intended use of ViewStation."

Technological Characteristics: "Technological characteristics remain principally the same." "Evolutionary product changes do not raise any new questions of safety and effectiveness, nor do the changes require novel methods of verification or validation."

Basic Functionality: "The sum of the changes does not affect the basic functionality of ViewStation." ViewStation remains dedicated to providing healthcare personnel with a means to import, view, annotate, manage, and archive patient images.

Software Quality: Developed according to the IMPAC Software Design Control Procedure (SDCP). The IMPAC Quality System complies with ISO 9001:2000, ISO 13485:2003, ISO 14971:2000, EN 60601-1-4:1996, ISO/IEC 9003:2004, and 93/42/EEC.

Verification and Validation: A Traceability Matrix was created. A System Test Plan covered full application, integration, and system testing. Test Procedures capture detailed parameters, results, and certification. A test certification statement confirms that the planned testing completed successfully. Design Reviews were performed at each phase. (A toy traceability sketch follows this list.)

Algorithm/Technical Changes: Engineering was performed to ensure algorithms and all other technical changes function exactly as intended. Testing demonstrated successful implementation.

Regulatory Compliance: Submitted under 510(k) Premarket Notification as substantially equivalent to the predicate devices (K011694 and K942346).
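
As a toy illustration of the traceability described in the Verification and Validation entry, the sketch below maps requirement identifiers to the test cases that verify them. All identifiers and descriptions are hypothetical; the actual Traceability Matrix and Test Procedures are not reproduced in the summary.

```python
# Toy requirements-to-tests traceability matrix. Requirement and test
# identifiers are hypothetical examples, not content from the submission.
REQUIREMENTS = {
    "REQ-001": "Imported images are archived without modification",
    "REQ-002": "Annotations are stored separately from original pixel data",
    "REQ-003": "Archived images remain retrievable by patient identifier",
}

TEST_COVERAGE = {
    "TC-101": ["REQ-001"],
    "TC-102": ["REQ-002"],
    "TC-103": ["REQ-001", "REQ-003"],
}


def trace() -> dict:
    """Map each requirement to the test cases that verify it."""
    matrix = {req: [] for req in REQUIREMENTS}
    for test, reqs in TEST_COVERAGE.items():
        for req in reqs:
            matrix[req].append(test)
    return matrix


if __name__ == "__main__":
    for req, tests in trace().items():
        status = "covered" if tests else "NOT COVERED"
        print(f"{req}: {status} by {', '.join(tests) or 'no tests'}")
```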

2. Sample size used for the test set and the data provenance (e.g. country of origin of the data, retrospective or prospective)

This information is not applicable and therefore not provided in the submission. Since the device is explicitly not intended for diagnosis and the changes are evolutionary software updates to an existing image management system, no clinical "test set" of patient data (images) was used to evaluate diagnostic performance. The testing performed was related to software verification and validation, hazard analysis, and functional integrity.

3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts (e.g. radiologist with 10 years of experience)

This information is not applicable and therefore not provided in the submission. As no clinical "test set" using patient data for diagnostic evaluation was involved, no experts were required to establish ground truth for such a purpose. The "experts" involved would be software engineers, quality assurance personnel, and potentially medical professionals (users) providing feedback on the system's usability and functionality, but not establishing diagnostic ground truth.

4. Adjudication method (e.g. 2+1, 3+1, none) for the test set

This information is not applicable and therefore not provided in the submission. Adjudication methods are typically used in studies involving expert review of diagnostic performance. The testing described focuses on software functionality, safety, and compliance with quality systems, not diagnostic accuracy.

5. If a multi-reader, multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI assistance versus without

A Multi-Reader, Multi-Case (MRMC) comparative effectiveness study was not done. The ViewStation is an image management system whose intended use statement explicitly says it is "not intended for use in diagnosis." Therefore, there is no AI component for diagnostic assistance and no study evaluating reader improvement with or without AI.

6. If a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done

This information is not applicable. The ViewStation is a software system with human-in-the-loop functionality, and not a standalone diagnostic algorithm. Its purpose is to manage images for human users.

7. The type of ground truth used (expert consensus, pathology, outcomes data, etc)

This information is not applicable. The device is not for diagnosis, so there is no "ground truth" related to disease presence or absence established from pathology, expert consensus, or outcomes data. The "ground truth" for the software testing would be the expected functional behavior and safety requirements defined in the design specifications.
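
To make that distinction concrete, the hedged sketch below shows the kind of functional verification this implies: the "expected behavior" comes from a design requirement (here, a hypothetical requirement that an archived copy be byte-identical to the source), not from any clinical ground truth. The helper function and requirement are illustrative assumptions, runnable with pytest.

```python
# Illustrative functional verification against a design requirement rather
# than clinical ground truth. The requirement and helper are hypothetical.
import shutil
from pathlib import Path


def archive_copy(src: Path, dest_dir: Path) -> Path:
    """Copy a file into the archive directory without altering its contents."""
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / src.name
    shutil.copy2(src, dest)
    return dest


def test_archived_copy_is_byte_identical(tmp_path: Path) -> None:
    # "Ground truth" here is the specified behavior, not a diagnostic label.
    src = tmp_path / "image.dcm"
    src.write_bytes(b"\x00" * 132 + b"DICM" + b"\x01\x02\x03")  # stand-in payload
    dest = archive_copy(src, tmp_path / "archive")
    assert dest.read_bytes() == src.read_bytes()
```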

8. The sample size for the training set

This information is not applicable and therefore not provided in the submission. The ViewStation is conventional image management software, not a machine learning or AI-driven diagnostic algorithm that requires a "training set" of data in the typical sense. The software's development is guided by established engineering principles and quality systems rather than data-driven machine learning.

9. How the ground truth for the training set was established

This information is not applicable. As there is no "training set" in the context of machine learning, there is no ground truth establishment for such a set. The "ground truth" for software development is based on user requirements, regulatory standards, and design specifications.

§ 892.2050 Medical image management and processing system.

(a) Identification. A medical image management and processing system is a device that provides one or more capabilities relating to the review and digital processing of medical images for the purposes of interpretation by a trained practitioner of disease detection, diagnosis, or patient management. The software components may provide advanced or complex image processing functions for image manipulation, enhancement, or quantification that are intended for use in the interpretation and analysis of medical images. Advanced image manipulation functions may include image segmentation, multimodality image registration, or 3D visualization. Complex quantitative functions may include semi-automated measurements or time-series measurements.

(b) Classification. Class II (special controls; voluntary standards—Digital Imaging and Communications in Medicine (DICOM) Std., Joint Photographic Experts Group (JPEG) Std., Society of Motion Picture and Television Engineers (SMPTE) Test Pattern).