The RadWorks Medical Imaging Software, from Applicare Medical Imaging, B.V., when installed on an appropriate hardware platform, is intended to provide capability for the acceptance, display, storage, and digital processing of medical images. Options allow for additional capability, including transmission of images over local area networks or public communications channels, digitization of film images, acceptance of digital images directly from different medical image modalities, and quality control review and revision of studies.
The RadWorks Quality Control Module is intended to be used by authorized staff to perform various quality control operations on RadWorks imaging studies before they are made available to other locations on the network. These operations include confirming or editing patient characteristics, reviewing the status history of the study, adding or removing images, combining with another study, renumbering images, editing patient orientation information, and setting or editing routing information.
The provided text does not contain information about specific acceptance criteria or an explicit study proving that the device meets those criteria. The submission is focused on demonstrating substantial equivalence to a predicate device, as confirmed by the FDA's letter (K982862).
Here's an analysis based on the information provided, highlighting what is present and what is missing:
1. Table of Acceptance Criteria and Reported Device Performance
Not explicitly provided. The document describes the "RadWorks Medical Imaging Software with Quality Control Module" as having various quality control operations (confirming/editing patient characteristics, reviewing status history, adding/removing images, combining studies, renumbering images, editing orientation, setting/editing routing information). However, it does not state specific performance metrics (e.g., accuracy, speed, user-friendliness) for these operations, nor does it define acceptance criteria for such metrics.
2. Sample Size Used for the Test Set and Data Provenance
Not explicitly provided. The document mentions "Software testing of the new module followed Applicare's normal procedures" and that "a software test plan is developed, containing a detailed description of relevant test procedures." However, it does not specify the details of the test set, including its sample size or data provenance (e.g., country of origin, retrospective/prospective nature).
3. Number of Experts Used to Establish Ground Truth for the Test Set and Qualifications
Not applicable/Not explicitly provided. Since neither specific performance claims nor a detailed test set is described for the Quality Control Module's operations, there is no mention of experts being used to establish a "ground truth" for the test set. The module's functions are primarily for data manipulation and quality control, not diagnostic interpretation requiring expert consensus on complex medical conditions.
4. Adjudication Method for the Test Set
Not applicable/Not explicitly provided. As no expert review or diagnostic assessment is detailed, an adjudication method is not mentioned.
5. If a Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study Was Done
No. An MRMC study typically compares human readers' diagnostic performance with and without AI assistance. The RadWorks Medical Imaging Software with Quality Control Module is described as a tool for managing and manipulating medical images, not for diagnostic interpretation. Therefore, an MRMC study is not relevant to its stated function and was not conducted or reported.
6. If a Standalone (Algorithm Only Without Human-in-the-Loop Performance) Was Done
Not explicitly provided within the context of performance metrics. The software performs operations for authorized staff, implying a human-in-the-loop interaction rather than a fully autonomous diagnostic or analytical algorithm. The testing described focuses on software functionality, not algorithmic performance in a standalone capacity.
7. The Type of Ground Truth Used
Not explicitly provided. Given the nature of the software (image management and quality control), "ground truth" would likely relate to the correct execution of software functions (e.g., if an image was successfully moved, if patient characteristics were correctly edited, if routing information was accurately set). This would be verified through functional testing rather than clinical ground truth like pathology or outcomes data.
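To make the distinction concrete, the sketch below shows what a functional check of one quality control operation (editing patient characteristics) might look like. This is purely illustrative and is not the submission's test procedure: the 510(k) does not describe Applicare's test tooling, and the edit_patient_characteristics helper, attribute choices, and use of the modern pydicom library are assumptions for demonstration only.

```python
# Illustrative sketch only: a minimal functional test of a hypothetical
# "edit patient characteristics" QC operation, using pydicom to stand in
# for a DICOM study object. Not part of the original submission.
from pydicom.dataset import Dataset


def edit_patient_characteristics(ds: Dataset, name: str, patient_id: str) -> Dataset:
    """Hypothetical QC operation: overwrite patient demographics on a study."""
    ds.PatientName = name
    ds.PatientID = patient_id
    return ds


def test_edit_patient_characteristics() -> None:
    # Build a small in-memory dataset standing in for a received study.
    ds = Dataset()
    ds.PatientName = "DOE^JANE"
    ds.PatientID = "000000"
    ds.StudyDescription = "CHEST PA"

    edit_patient_characteristics(ds, name="SMITH^JOHN", patient_id="123456")

    # "Ground truth" here is simply the expected post-condition of the edit.
    assert str(ds.PatientName) == "SMITH^JOHN"
    assert ds.PatientID == "123456"
    # Unrelated attributes must remain untouched.
    assert ds.StudyDescription == "CHEST PA"


if __name__ == "__main__":
    test_edit_patient_characteristics()
    print("functional check passed")
```

In this framing, the pass/fail criterion is the post-condition of the operation itself, which is the sense in which "ground truth" applies to image management software rather than to a clinical reference standard.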
8. The Sample Size for the Training Set
Not applicable. The RadWorks Quality Control Module is described as software for performing various operations on imaging studies. It is not an AI/ML algorithm that learns from a "training set" in the conventional sense (i.e., a dataset used to train a predictive model). Its functions are programmed, not learned.
9. How the Ground Truth for the Training Set Was Established
Not applicable. As it's not a machine learning model, there is no "training set" or ground truth establishment for such a set.
Summary of what is present:
- The 510(k) submission is for a modification to an existing device (K962699).
- The modified device (RadWorks Medical Imaging Software with Quality Control Module) adds specific quality control operations (confirming/editing patient characteristics, reviewing status history, adding/removing images, combining studies, renumbering images, editing orientation, setting/editing routing information).
- The submission asserts that the technological characteristics of the modified device are "identical" to the original.
- Software testing followed Applicare's internal procedures, including development of a test plan that specifies what is to be tested, the expected results, when and by whom testing is performed, the resources used, and how results are recorded.
- The conclusion is that the intended use is the same as the predicate, and technological characteristics are sufficient to demonstrate substantial equivalence.
In essence, this 510(k) submission primarily relies on demonstrating substantial equivalence to a predicate device and adherence to internal software testing procedures, rather than presenting a performance study with specific acceptance criteria, ground truth, or statistical analysis of algorithmic performance.
§ 892.2050 Medical image management and processing system.
(a) Identification. A medical image management and processing system is a device that provides one or more capabilities relating to the review and digital processing of medical images for the purposes of interpretation by a trained practitioner of disease detection, diagnosis, or patient management. The software components may provide advanced or complex image processing functions for image manipulation, enhancement, or quantification that are intended for use in the interpretation and analysis of medical images. Advanced image manipulation functions may include image segmentation, multimodality image registration, or 3D visualization. Complex quantitative functions may include semi-automated measurements or time-series measurements.
(b) Classification. Class II (special controls; voluntary standards—Digital Imaging and Communications in Medicine (DICOM) Std., Joint Photographic Experts Group (JPEG) Std., Society of Motion Picture and Television Engineers (SMPTE) Test Pattern).