Harmony is a comprehensive software platform intended for use in importing, processing, measurement, analysis, and storage of clinical images and videos of the eye, as well as in management of patient data, diagnostic data, clinical information, and reports from ophthalmic diagnostic instruments, through either a direct connection with the instruments or through computerized networks.
Harmony is a modification of Synergy ODM, cleared in K151952. The differences between the new version and the currently cleared version are modifications to the GUI using a responsive design and a change of front-end platform from Microsoft Silverlight to HTML5.
Harmony is used together with a number of computerized digital imaging devices, including:
- Optical Coherence Tomography devices
- Mydriatic retinal cameras
- Non-mydriatic retinal cameras
- Biomicroscopes (slit lamps)
This document (K182376) is a 510(k) premarket notification for a software platform called Harmony. It primarily focuses on demonstrating substantial equivalence to a predicate device (Topcon Corporation Synergy ODM, K151952) rather than presenting new performance study data with acceptance criteria.
The key takeaway is that no performance study data were required or provided to prove the device meets acceptance criteria, as this submission is for a modification to an already cleared device. The manufacturer states that "Software validation and verification demonstrate that Harmony performs as intended and meets its specifications." This implies that the acceptance criteria are tied to the software's functional specifications, which were validated internally; no detailed performance study against specific clinical or diagnostic metrics is presented here.
Therefore, many of the requested sections regarding a study that proves the device meets acceptance criteria cannot be extracted from this document because such a study was not deemed necessary for this particular submission.
Here's a breakdown of what can be inferred and what cannot:
1. A table of acceptance criteria and the reported device performance:
- Acceptance Criteria: Not explicitly stated with specific numerical targets. Based on the "Performance Data" section, the acceptance criteria are implicitly tied to the software's functional specifications and its ability to perform as intended. These would likely include criteria such as:
- Successful import, processing, measurement, analysis, and storage of clinical images and videos of the eye.
- Proper management of patient data, clinical information, and reports from ophthalmic diagnostic instruments.
- Successful connection (direct or network) with ophthalmic diagnostic instruments.
- Correct execution of viewing operations (zoom, pan, contrast/brightness adjustment, drawing tools).
- Accurate line and area measurement capabilities.
- Accurate Cup to Disc ratio and MPS (Macular Photocoagulation Study) measurements.
- Proper functioning of network and security features (web-based access, DICOM communication).
- Correct printing, archiving, and backup functionality.
- Reported Device Performance: The document states, "Software validation and verification demonstrate that Harmony performs as intended and meets its specifications." No specific performance metrics (e.g., accuracy, sensitivity, specificity, resolution, speed) are provided.
Table (based on inference):

| Acceptance Criterion (inferred from functionality) | Reported Device Performance (inferred) |
|---|---|
| Successful import, processing, measurement, analysis, and storage of ophthalmic data | Performs as intended |
| Management of patient and clinical data, and reports from ophthalmic instruments | Meets specifications |
| Direct/network connection with ophthalmic instruments | Performs as intended |
| Correct viewing operations, image enhancements, and drawing tools | Meets specifications |
| Accurate line/area, Cup-to-Disc ratio, and MPS measurements | Performs as intended |
| Proper network and security functionality (web-based access, DICOM) | Meets specifications |
| Correct printing, archiving, and backup functionality | Performs as intended |
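To make the measurement-related criteria concrete, a sketch of the kind of deterministic check that software verification would run against specification-defined expected values is shown below. This is purely illustrative: the function name and tolerance are assumptions, not taken from the submission, and the actual Harmony implementation is not described in the document.

```python
def cup_to_disc_ratio(cup_diameter_mm: float, disc_diameter_mm: float) -> float:
    """Vertical cup-to-disc (C/D) ratio: cup diameter divided by disc diameter.

    Illustrative only -- not Harmony's actual implementation.
    """
    if disc_diameter_mm <= 0:
        raise ValueError("disc diameter must be positive")
    if cup_diameter_mm < 0 or cup_diameter_mm > disc_diameter_mm:
        raise ValueError("cup diameter must lie between 0 and the disc diameter")
    return cup_diameter_mm / disc_diameter_mm

# Verification-style check against a hand-computed expected value:
# a 0.6 mm cup within a 1.5 mm disc gives a C/D ratio of 0.4.
assert abs(cup_to_disc_ratio(0.6, 1.5) - 0.4) < 1e-9
```

A verification suite of such checks, run against the functional specification, is the plausible substance behind the statement that Harmony "performs as intended and meets its specifications."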
2. Sample size used for the test set and the data provenance:
- Sample Size: Not specified. As no performance study with a test set was required, this information is not provided. The "validation and verification" likely refer to internal software testing rather than a clinical study with a patient data test set.
- Data Provenance: Not specified.
3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts:
- Not applicable/Not specified, as no clinical test set requiring expert ground truth establishment was described.
4. Adjudication method for the test set:
- Not applicable/Not specified, as no clinical test set requiring adjudication was described.
5. Whether a multi-reader multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI versus without AI assistance:
- No MRMC comparative effectiveness study was done or reported. The device is described as a "comprehensive software platform" and "image management system," not an AI diagnostic tool designed to assist human readers in a diagnostic capacity that would typically warrant such a study.
6. Whether a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done:
- Not applicable/Not specified. The device is a "software platform intended for use in importing, processing, measurement, and storage...as well as in management of patient data, clinical information, reports." This describes an infrastructure and management tool, not an algorithm with standalone diagnostic performance.
7. The type of ground truth used (expert consensus, pathology, outcomes data, etc.):
- Not applicable/Not specified. The "ground truth" in this context would likely be the expected functional behavior and output of the software, verified through software testing, rather than clinical ground truth from patient data.
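Where "ground truth" is the expected functional behavior rather than clinical truth, verification typically takes the form of unit tests comparing software output to specification-defined expected values. A minimal, hypothetical sketch of that pattern follows; the function, coordinates, and spacing values are invented for illustration and do not come from the submission.

```python
import math

def line_length_mm(p1, p2, pixel_spacing_mm):
    """Euclidean length of a line annotation, converting pixel coordinates
    to millimeters using the image's pixel spacing (the quantity DICOM
    carries in the Pixel Spacing attribute, tag (0028,0030)).

    Illustrative only -- not Harmony's actual implementation.
    """
    dx = (p2[0] - p1[0]) * pixel_spacing_mm[0]
    dy = (p2[1] - p1[1]) * pixel_spacing_mm[1]
    return math.hypot(dx, dy)

# The "ground truth" here is the specification's expected value:
# a 30-by-40 pixel segment at 0.1 mm/pixel spacing spans 5.0 mm.
assert abs(line_length_mm((0, 0), (30, 40), (0.1, 0.1)) - 5.0) < 1e-9
```

Each functional requirement (measurements, viewing operations, DICOM transfer, and so on) would get analogous expected-versus-actual checks during software verification.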
8. The sample size for the training set:
- Not applicable/Not specified. This device is not described as an AI or machine learning model that would require a training set in the typical sense. It is a software platform for managing images and data.
9. How the ground truth for the training set was established:
- Not applicable/Not specified, as there is no mention of a training set.
§ 892.2050 Medical image management and processing system.
(a) Identification. A medical image management and processing system is a device that provides one or more capabilities relating to the review and digital processing of medical images for the purposes of interpretation by a trained practitioner of disease detection, diagnosis, or patient management. The software components may provide advanced or complex image processing functions for image manipulation, enhancement, or quantification that are intended for use in the interpretation and analysis of medical images. Advanced image manipulation functions may include image segmentation, multimodality image registration, or 3D visualization. Complex quantitative functions may include semi-automated measurements or time-series measurements.
(b) Classification. Class II (special controls; voluntary standards—Digital Imaging and Communications in Medicine (DICOM) Std., Joint Photographic Experts Group (JPEG) Std., Society of Motion Picture and Television Engineers (SMPTE) Test Pattern).