K Number: K181345
Manufacturer: NIDEK
Date Cleared: 2018-07-19 (59 days)
Product Code:
Regulation Number: 892.2050
Panel: OP
Reference & Predicate Devices: FORUM, FORUM Archive, FORUM Archive and Viewer (Carl Zeiss Meditec AG; predicate); NIDEK reference devices K132323, K113451, K152729, K133132, K173980
Intended Use

The Image Filing Software NAVIS-EX is a software system intended for use to store, manage, process, measure, analyze and display patient data and clinical information from computerized diagnostic instruments through networks. It is intended to work with compatible NIDEK ophthalmic devices.

Device Description

The NAVIS-EX is a client-server application. Patient information and examination data are managed in a server database; these data are saved to the database from a device connected to the NAVIS-EX or from software related to the NAVIS-EX. On the client, the examination data can be displayed and analyzed, and the images can be processed. Those results can also be printed or transferred to an external system in the form of a report. The NAVIS-EX system includes optional components: the AL-Scan Viewer, the CEM Viewer, and the Data Acquisition Service (DAS).
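
To make the client-server flow described above concrete, here is a minimal, purely illustrative sketch. It is not NIDEK's implementation; every name in it (ExamServer, PatientRecord, build_report, the field names) is hypothetical, and it only mirrors the described flow of examination data being stored in a server-side database and later retrieved by a client to produce a report.

```python
# Illustrative sketch only -- NOT NIDEK's implementation. All names are hypothetical.
from dataclasses import dataclass, field
from datetime import date
from typing import Dict, List


@dataclass
class Examination:
    device: str                      # acquisition instrument that produced the data
    exam_date: date
    measurements: Dict[str, float]   # simplified stand-in for examination data


@dataclass
class PatientRecord:
    patient_id: str
    name: str
    exams: List[Examination] = field(default_factory=list)


class ExamServer:
    """Hypothetical stand-in for the server-side database."""

    def __init__(self) -> None:
        self._records: Dict[str, PatientRecord] = {}

    def store(self, record: PatientRecord) -> None:
        self._records[record.patient_id] = record

    def fetch(self, patient_id: str) -> PatientRecord:
        return self._records[patient_id]


def build_report(record: PatientRecord) -> str:
    """Client-side step: format stored examinations as a text report."""
    lines = [f"Report for {record.name} ({record.patient_id})"]
    for exam in record.exams:
        lines.append(f"  {exam.exam_date} {exam.device}: {exam.measurements}")
    return "\n".join(lines)


if __name__ == "__main__":
    server = ExamServer()
    server.store(PatientRecord("P001", "Example Patient", [
        Examination("axial-length scanner", date(2018, 7, 19), {"AL_mm": 23.5}),
    ]))
    print(build_report(server.fetch("P001")))
```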

AI/ML Overview

The provided 510(k) submission for the NIDEK Image Filing Software NAVIS-EX focuses on establishing substantial equivalence to a predicate device (FORUM, FORUM Archive, FORUM Archive and Viewer by Carl Zeiss Meditec AG). It does not contain an independent clinical study with specific acceptance criteria and detailed performance data of the kind typical for a diagnostic artificial intelligence (AI) device.

Instead, the submission primarily relies on:

  1. Comparison of Technological Characteristics: Demonstrating that the NAVIS-EX has similar functions (filing, external interface, image acquisition, image processing) to the predicate and reference devices.
  2. Compliance with Standards: Stating that testing according to ISO 14971 (risk management), AAMI/ANSI/IEC 62304 (software life cycle processes), and IEC 62366-1 (usability) was performed and showed the device performs as intended and is safe.

Therefore, many of the requested details about acceptance criteria, specific device performance metrics, sample sizes, ground truth establishment, and MRMC studies are not available in this document.

Here's a breakdown of what can be gleaned from the document regarding your request:


1. Table of Acceptance Criteria and the Reported Device Performance:

The document does not explicitly state quantitative acceptance criteria or report specific performance metrics (e.g., accuracy, sensitivity, specificity, AUC) for the NAVIS-EX as an AI/CAD-like device. The "performance" claimed is primarily functional equivalence and safety as demonstrated by compliance with general medical device standards.

| Acceptance Criteria (Implied) | Reported Device Performance |
| --- | --- |
| Performs as intended (functionality and safety) | Demonstrated by compliance with ISO 14971, AAMI/ANSI/IEC 62304, and IEC 62366-1. |
| Substantially equivalent technological characteristics | Functional comparisons show similar features for image filing, display, search, zoom, print, external interfaces, image acquisition, and image processing relative to the predicate and reference devices; minor differences are argued not to raise new safety or efficacy issues. |

2. Sample Size Used for the Test Set and Data Provenance:

No information regarding a specific "test set" in terms of patient data or images used for clinical performance evaluation is provided. The testing mentioned refers to engineering and software validation tests against standards, not a clinical performance study on a dataset.


3. Number of Experts Used to Establish the Ground Truth for the Test Set and Qualifications:

Not applicable, as no clinical test set using expert-established ground truth is described in this submission.


4. Adjudication Method for the Test Set:

Not applicable.


5. If a Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study was done, and effect size:

No MRMC comparative effectiveness study is mentioned in this submission. The device is image filing software, not a diagnostic AI that would typically undergo such a study to demonstrate improved reader performance.


6. If a Standalone (i.e., algorithm only without human-in-the-loop) Performance Study was done:

No standalone clinical performance study is described. The "performance" assessment is based on functional equivalence and safety/software standards compliance.


7. The Type of Ground Truth Used:

Not applicable, as no clinical performance study with defined ground truth is described.


8. The Sample Size for the Training Set:

Not applicable, as this is not an AI/ML device that requires a training set in the conventional sense for diagnostic performance.


9. How the Ground Truth for the Training Set was Established:

Not applicable.


Summary of the Study that Proves the Device Meets Acceptance Criteria:

The "study" that proves the device meets its (implied) acceptance criteria is a combination of:

  • Functional Benchmarking/Comparison: The device's features for image filing, management, processing, and display were compared against those of a predicate device (Carl Zeiss Meditec AG's FORUM) and several NIDEK reference devices (K132323, K113451, K152729, K133132, K173980). This comparison (detailed in the tables on pages 6-7 of the document) served to demonstrate that the NAVIS-EX has substantially similar technological characteristics.
  • Compliance with International Standards: The submission states that testing according to ISO 14971 (Medical devices – Application of risk management to medical devices), AAMI/ANSI/IEC 62304 (Medical device software – Software life cycle processes), and IEC 62366-1 (Medical devices – Application of usability engineering to medical devices) was performed. These tests are intended to ensure the software is safe, functions reliably, and is usable, thus implying the performance necessary for its intended use without raising new safety or effectiveness concerns.

The submission concludes that the "test results and comparison results show that the proposed device is substantially equivalent to the predicate device in performance." This indicates that the implied acceptance criteria were met by demonstrating an equivalent level of function and safety through these comparisons and standards compliance, rather than through the quantitative clinical performance metrics typical of AI diagnostic tools.

§ 892.2050 Medical image management and processing system.

(a) Identification. A medical image management and processing system is a device that provides one or more capabilities relating to the review and digital processing of medical images for the purposes of interpretation by a trained practitioner of disease detection, diagnosis, or patient management. The software components may provide advanced or complex image processing functions for image manipulation, enhancement, or quantification that are intended for use in the interpretation and analysis of medical images. Advanced image manipulation functions may include image segmentation, multimodality image registration, or 3D visualization. Complex quantitative functions may include semi-automated measurements or time-series measurements.

(b) Classification. Class II (special controls; voluntary standards—Digital Imaging and Communications in Medicine (DICOM) Std., Joint Photographic Experts Group (JPEG) Std., Society of Motion Picture and Television Engineers (SMPTE) Test Pattern).
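
The special controls above cite interoperability and display standards (DICOM, JPEG, SMPTE test pattern) rather than quantitative performance criteria. As a minimal sketch of what handling DICOM data can look like in practice (this assumes the third-party pydicom package and a hypothetical local file named example.dcm; neither appears in the 510(k) or the regulation), the snippet below reads a DICOM object and prints a few standard header elements:

```python
# Illustrative only: assumes pydicom is installed and "example.dcm" exists locally.
import pydicom

# Read the DICOM dataset (header plus pixel data, if present) from disk.
ds = pydicom.dcmread("example.dcm")

# Print a few common header elements; the strings are standard DICOM
# data-element keywords that pydicom exposes as dataset attributes.
for keyword in ("PatientID", "Modality", "StudyDate", "SOPClassUID"):
    print(keyword, "=", ds.get(keyword, "<absent>"))
```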