IDx-DR is indicated for use by healthcare providers to automatically detect more than mild diabetic retinopathy in adults diagnosed with diabetes who have not been previously diagnosed with diabetic retinopathy. IDx-DR is indicated for use with the Topcon NW400.
The IDx-DR device consists of several component parts. A camera is attached to a computer, where the IDx-DR Client is installed. Guided by the Client, users acquire two fundus images per eye, which are dispatched to IDx-Service. IDx-Service is installed on a server hosted at a secure datacenter. From IDx-Service, images are transferred to IDx-DR Analysis. No information other than the fundus images is required to perform the analysis. IDx-DR Analysis, which runs on dedicated servers hosted in the same secure datacenter as IDx-Service, processes the fundus images and returns information on the exam quality and the presence or absence of mtmDR to IDx-Service. IDx-Service then transports the results to the IDx-DR Client, which displays them to the user.
Here's an analysis of the acceptance criteria and study information for the IDx-DR device, based on the provided text:
1. Table of Acceptance Criteria and Reported Device Performance
The provided 510(k) summary (K203629) states that the device modifications do not affect clinical performance and refers to the predicate device (DEN180001) for clinical trial details. Therefore, the acceptance criteria and reported device performance are identical to the predicate device. To provide complete information, one would need to refer to the DEN180001 submission. However, based solely on the provided document K203629, the table would look like this:
| Acceptance Criterion | Reported Device Performance (from K203629) |
|---|---|
| Auto-detect more than mild diabetic retinopathy (mtmDR) | Not explicitly stated in K203629. K203629 states: "The device modifications do not affect clinical performance." Performance is considered "Equivalent" to predicate device DEN180001. |
| Refer to an eye care professional for mtmDR detected | Not explicitly stated in K203629. K203629 states: "The device modifications do not affect clinical performance." Performance is considered "Equivalent" to predicate device DEN180001. |
| Rescreen in 12 months for mtmDR not detected | Not explicitly stated in K203629. K203629 states: "The device modifications do not affect clinical performance." Performance is considered "Equivalent" to predicate device DEN180001. |
| Insufficient image quality identified | Implied as an output, but no performance metric given. K203629 states: "The device modifications do not affect clinical performance." Performance is considered "Equivalent" to predicate device DEN180001. |
Important Note: To get the actual numerical acceptance criteria (e.g., sensitivity and specificity thresholds) and the reported performance values, the DEN180001 submission would need to be reviewed. The K203629 document does not provide those details for the current submission.
2. Sample Size Used for the Test Set and Data Provenance
Since the current submission (K203629) states that "The determination of substantial equivalence is not based on an assessment of clinical performance data" and refers to DEN180001 for clinical trial details, this information is not available in the provided text.
3. Number of Experts Used to Establish the Ground Truth for the Test Set and Their Qualifications
This information is not provided in the K203629 document. It would be found in the clinical trial details for the predicate device (DEN180001).
4. Adjudication Method for the Test Set
This information is not provided in the K203629 document. It would be found in the clinical trial details for the predicate device (DEN180001).
5. If a Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study Was Done, and the Effect Size of How Much Human Readers Improve with AI vs. Without AI Assistance
A Multi-Reader Multi-Case (MRMC) comparative effectiveness study comparing human readers with AI assistance versus without AI assistance is not mentioned in the furnished K203629 document. The document explicitly states that the substantial equivalence determination is not based on new clinical performance data and refers to the predicate device's clinical trial.
6. If a Standalone (i.e., algorithm only without human-in-the-loop performance) Study Was Done
The K203629 document describes the IDx-DR Analysis component as "Software that analyzes the patient's images and determines exam quality and the presence/absence of diabetic retinopathy." This implies a standalone algorithmic assessment. However, the performance metrics of this specific version of the standalone algorithm are not presented in this document, as it relies on the predicate device's clinical performance. The "Outputs" section of Table 1 supports the standalone nature of the output, as it directly states the detection of DR and referral decisions.
7. The Type of Ground Truth Used
This information is not provided in the K203629 document. It would be found in the clinical trial details for the predicate device (DEN180001). Typically, for diabetic retinopathy, ground truth is established by a panel of expert ophthalmologists or retina specialists through consensus reading of images, potentially correlated with other clinical findings.
8. The Sample Size for the Training Set
The document does not specify the sample size for the training set. It mentions "Future algorithm improvements will be made under a consistent medically relevant framework" and "A protocol was provided to mitigate the risk of algorithm changes," but no details on training data for the current or previous versions are given.
9. How the Ground Truth for the Training Set Was Established
The document does not provide details on how the ground truth for the training set was established.
June 10, 2021
Digital Diagnostics Inc.
Ashley Miller
Regulatory Affairs Manager
2300 Oakdale Blvd.
Coralville, Iowa 52241
Re: K203629
Trade/Device Name: IDx-DR
Regulation Number: 21 CFR 886.1100
Regulation Name: Retinal Diagnostic Device
Regulatory Class: Class II
Product Code: PIB
Dated: May 11, 2021
Received: May 11, 2021
Dear Ashley Miller:
We have reviewed your Section 510(k) premarket notification of intent to market the device referenced above and have determined the device is substantially equivalent (for the indications for use stated in the enclosure) to legally marketed predicate devices marketed in interstate commerce prior to May 28, 1976, the enactment date of the Medical Device Amendments, or to devices that have been reclassified in accordance with the provisions of the Federal Food, Drug, and Cosmetic Act (Act) that do not require approval of a premarket approval application (PMA). You may, therefore, market the device, subject to the general controls provisions of the Act. Although this letter refers to your product as a device, please be aware that some cleared products may instead be combination products. The 510(k) Premarket Notification Database located at https://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfpmn/pmn.cfm identifies combination product submissions.

The general controls provisions of the Act include requirements for annual registration, listing of devices, good manufacturing practice, labeling, and prohibitions against misbranding and adulteration. Please note: CDRH does not evaluate information related to contract liability warranties. We remind you, however, that device labeling must be truthful and not misleading.
If your device is classified (see above) into either class II (Special Controls) or class III (PMA), it may be subject to additional controls. Existing major regulations affecting your device can be found in the Code of Federal Regulations, Title 21, Parts 800 to 898. In addition, FDA may publish further announcements concerning your device in the Federal Register.
Please be advised that FDA's issuance of a substantial equivalence determination does not mean that FDA has made a determination that your device complies with other requirements of the Act or any Federal statutes and regulations administered by other Federal agencies. You must comply with all the Act's requirements, including, but not limited to: registration and listing (21 CFR Part 807); labeling (21 CFR Part 801); medical device reporting (reporting of medical device-related adverse events) (21 CFR 803) for devices or postmarketing safety reporting (21 CFR 4, Subpart B) for combination products (see https://www.fda.gov/combination-products/guidance-regulatory-information/postmarketing-safety-reporting-combination-products); good manufacturing practice requirements as set forth in the quality systems (QS) regulation (21 CFR Part 820) for devices or current good manufacturing practices (21 CFR 4, Subpart A) for combination products; and, if applicable, the electronic product radiation control provisions (Sections 531-542 of the Act); 21 CFR 1000-1050.
Also, please note the regulation entitled, "Misbranding by reference to premarket notification" (21 CFR Part 807.97). For questions regarding the reporting of adverse events under the MDR regulation (21 CFR Part 803), please go to https://www.fda.gov/medical-device-safety/medical-device-reporting-mdr-how-report-medical-device-problems.
For comprehensive regulatory information about medical devices and radiation-emitting products, including information about labeling regulations, please see Device Advice (https://www.fda.gov/medical-devices/device-advice-comprehensive-regulatory-assistance) and CDRH Learn (https://www.fda.gov/training-and-continuing-education/cdrh-learn). Additionally, you may contact the Division of Industry and Consumer Education (DICE) to ask a question about a specific regulatory topic. See the DICE website (https://www.fda.gov/medical-device-advice-comprehensive-regulatory-assistance/contact-us-division-industry-and-consumer-education-dice) for more information or contact DICE by email (DICE@fda.hhs.gov) or phone (1-800-638-2041 or 301-796-7100).
Sincerely,
Elvin Ng
Assistant Director
DHT1A: Division of Ophthalmic Devices
OHT1: Office of Ophthalmic, Anesthesia, Respiratory, ENT and Dental Devices
Office of Product Evaluation and Quality
Center for Devices and Radiological Health
Enclosure
Indications for Use
510(k) Number (if known) K203629
Device Name IDx-DR
Indications for Use (Describe)
IDx-DR is indicated for use by healthcare providers to automatically detect more than mild diabetic retinopathy in adults diagnosed with diabetes who have not been previously diagnosed with diabetic retinopathy. IDx-DR is indicated for use with the Topcon NW400.
| Type of Use (Select one or both, as applicable) | |
|---|---|
| X Prescription Use (Part 21 CFR 801 Subpart D) | Over-The-Counter Use (21 CFR 801 Subpart C) |
CONTINUE ON A SEPARATE PAGE IF NEEDED.
This section applies only to requirements of the Paperwork Reduction Act of 1995.
DO NOT SEND YOUR COMPLETED FORM TO THE PRA STAFF EMAIL ADDRESS BELOW.
The burden time for this collection of information is estimated to average 79 hours per response, including the time to review instructions, search existing data sources, gather and maintain the data needed and complete and review the collection of information. Send comments regarding this burden estimate or any other aspect of this information collection, including suggestions for reducing this burden, to:
Department of Health and Human Services
Food and Drug Administration
Office of Chief Information Officer
Paperwork Reduction Act (PRA) Staff
PRAStaff@fda.hhs.gov
"An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB number."
V. 510(k) Summary
I. Submitter
Digital Diagnostics Inc.
2300 Oakdale Blvd.
Coralville, IA 52241
Phone: 319-248-5620

Contact Person: Ashley Miller
Date Prepared: December 4, 2020
II. Device
Name of Device: IDx-DR
Common or Usual Name: Diabetic Retinopathy Detection Device
Classification Name: Retinal diagnostic software device
Regulatory Class: II
Regulation: 21 CFR 886.1100
Product Code: PIB
III. Predicate Device
IDx-DR, Diabetic Retinopathy Detection Device, DEN180001
This predicate has not been subject to a design-related recall.
No reference devices were used in this submission.
IV. Indications for Use
IDx-DR is indicated for use by healthcare providers to automatically detect more than mild diabetic retinopathy in adults diagnosed with diabetes who have not been previously diagnosed with diabetic retinopathy. IDx-DR is indicated for use with the Topcon NW400.
The Indications for Use statement is identical to the predicate device.
V. Device Description
The IDx-DR device consists of several component parts (see Figure 1 below). A camera is attached to a computer, where the IDx-DR Client is installed. Guided by the Client, users acquire two fundus images per eye, which are dispatched to IDx-Service. IDx-Service is installed on a server hosted at a secure datacenter. From IDx-Service, images are transferred to IDx-DR Analysis. No information other than the fundus images is required to perform the analysis. IDx-DR Analysis, which runs on dedicated servers hosted in the same secure datacenter as IDx-Service, processes the fundus images and returns information on the exam quality and the presence or absence of mtmDR to IDx-Service. IDx-Service then transports the results to the IDx-DR Client, which displays them to the user.
Image: diagram of the IDx secure server system showing the flow of data from the patient's fundus camera and the IDx-DR Client on the customer's computer to the IDx web service and IDx-DR Analysis on the IDx secure servers, with results returned to the customer's computer.
Figure 1: IDx-DR Components
The component parts of IDx-DR illustrated above are summarized as follows:
- IDx-DR Analysis: Software that analyzes the patient's images and determines exam quality and the presence/absence of diabetic retinopathy.
- IDx-DR Client: A software application component running on a computer, usually connected to the fundus camera, at the customer site. Using this software, the customer can transfer images to IDx-DR Analysis via IDx-Service and receive results back.
- IDx-Service: IDx-Service comprises a general exam analysis service delivery software package. IDx-Service contains a webserver front-end that securely handles incoming requests, a database that stores customer information, and a logging system that records information about each transaction through IDx-Service. IDx-Service is also primarily responsible for device cybersecurity.
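The components above exchange data in a simple round trip: the Client submits an exam to IDx-Service, IDx-Service forwards the images to IDx-DR Analysis, and the Client later retrieves the result. The sketch below illustrates that flow in Python; the endpoint paths, JSON field names, and polling behavior are assumptions made for illustration and are not the actual IDx-Service interface.

```python
# Illustrative sketch of the Client -> IDx-Service -> IDx-DR Analysis round trip.
# Endpoint paths, JSON field names, and status values are assumptions for illustration;
# they are not the actual IDx-Service API.
import time
import requests

SERVICE_URL = "https://idx-service.example.com"  # hypothetical service host


def submit_exam(image_paths):
    """Upload the acquired fundus images and return a submission identifier."""
    files = [("images", open(path, "rb")) for path in image_paths]
    response = requests.post(f"{SERVICE_URL}/exams", files=files, timeout=60)
    response.raise_for_status()
    return response.json()["submission_id"]


def fetch_result(submission_id, poll_interval_s=5):
    """Poll IDx-Service until IDx-DR Analysis has produced a result for the exam."""
    while True:
        response = requests.get(f"{SERVICE_URL}/exams/{submission_id}", timeout=60)
        response.raise_for_status()
        body = response.json()
        if body["status"] == "ready":
            # The result carries exam quality and the mtmDR detected / not detected output.
            return body["result"]
        time.sleep(poll_interval_s)
```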
VI. Comparison of Technological Characteristics with the Predicate Device
IDx-DR, the subject device of this 510(k), has the same intended use and indications for use as the predicate IDx-DR device cleared under DEN180001.
Artificial intelligence software as a medical device is the main technological principle for both the subject and predicate devices. The software as a medical device uses artificial intelligence technology to analyze specific disease features from fundus retinal images for diagnostic screening of diabetic retinopathy. The subject and predicate devices are based on the same general technological elements:
- Fundus camera to obtain retinal images
- IDx-DR Client installed on a computer to guide the user to acquire images using the fundus camera, transfer the images to IDx-DR Analysis via IDx-Service, and receive the results
- IDx-DR Analysis to analyze the patient's images for exam quality and the presence/absence of diabetic retinopathy
- IDx-Service to facilitate secure transfer of exam data from IDx-DR Client to IDx-DR Analysis and the results from IDx-DR Analysis back to IDx-DR Client
The major technological differences that exist between each component of the subject and predicate devices are described below.
IDx-Service
- The subject device allows the submission of DICOM images
- The subject device provides improved feedback to the client by distinguishing between the states "submission not found" and "submission not ready" (see the sketch after this list)
- The subject device tracks image analysis statistics, such as analysis start and end times
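As a hedged illustration of the improved feedback, a client might act on the two states as follows; the status strings and return values are assumptions for illustration, not the device's literal messages.

```python
# Hypothetical handling of the two feedback states distinguished by the subject device.
def interpret_submission_status(status: str) -> str:
    if status == "submission not found":
        # IDx-Service has no record of the exam, so the Client should resubmit it.
        return "resubmit exam"
    if status == "submission not ready":
        # The exam exists but IDx-DR Analysis has not finished; the Client keeps polling.
        return "poll again later"
    # Any other state is treated as a completed submission with a displayable result.
    return "display result"
```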
IDx-DR Client
- The subject device does not require the user to "refresh" the user interface to display new exams
- The subject device incorporates customer configuration options (see the sketch after this list):
  - Submission of DICOM images
  - Output filename structure
  - Highlight most recent exam
  - Masking of exam result on the user interface (the result is unchanged in the final report)
  - Dark mode for viewing based on customer preference
- The subject device incorporates a guided workflow in IDx-DR Client to guide the user through image acquisition/submission and display step-by-step instructions directly on the user interface
- The subject device incorporates local image retention
- The subject device incorporates a training mode
- The subject device provides in-exam image quality feedback and allows the user to resubmit images when applicable
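For illustration only, the customer configuration options listed above could be represented as a small settings structure; the field names and defaults below are assumptions, not the product's actual configuration schema.

```python
# Hypothetical representation of the IDx-DR Client customer configuration options.
from dataclasses import dataclass


@dataclass
class ClientConfiguration:
    allow_dicom_submission: bool = False               # submission of DICOM images
    output_filename_pattern: str = "{exam_id}_{date}"  # output filename structure (placeholder pattern)
    highlight_most_recent_exam: bool = True            # highlight most recent exam in the exam list
    mask_exam_result_on_ui: bool = False               # mask result on the UI; the final report is unchanged
    dark_mode: bool = False                            # dark mode for viewing based on customer preference
```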
IDx-DR Analysis
- The subject device determines image fixations when the number of left and right eye images does not meet protocol requirements
Table 1 provides a comparison between the technical characteristics and indications for use of the subject and predicate devices.
| | Subject Device: IDx-DR | Predicate Device: IDx-DR, DEN180001 | Discussion |
|---|---|---|---|
| Component Software Versions | IDx-DR Client v3.2.0; IDx-DR Analysis v2.1.1; IDx-Service v1.1.2. See above for the major technological differences between each component of the subject and predicate device. | IDx-DR Client v2.0.1; IDx-DR Analysis v2.0.1; IDx-Service v1.0.0 | Substantially Equivalent. The changes described above do not significantly affect the use of the device, clinical functionality, nor performance of the device as supported by software verification and validation testing. |
| Technological Principle | Artificial intelligence software as a medical device. | Artificial intelligence software as a medical device. | Equivalent |
| Indications for Use | For use by healthcare providers to automatically detect more than mild diabetic retinopathy in adults diagnosed with diabetes who have not been previously diagnosed with diabetic retinopathy. | For use by healthcare providers to automatically detect more than mild diabetic retinopathy in adults diagnosed with diabetes who have not been previously diagnosed with diabetic retinopathy. | Equivalent |
| Indicated Camera | Topcon NW400 fundus camera | Topcon NW400 fundus camera | Equivalent |
| Inputs | Macula and disc centered color fundus images with 45° field of view, 2 per eye. | Macula and disc centered color fundus images with 45° field of view, 2 per eye. | Equivalent |
| Outputs | Detection of diabetic retinopathy and referral decision: mtmDR detected: refer to an eye care professional; mtmDR not detected: rescreen in 12 months; insufficient image quality | Detection of more than mild diabetic retinopathy (mtmDR) and referral decision: mtmDR detected: refer to an eye care professional; mtmDR not detected: rescreen in 12 months; insufficient image quality | Equivalent |
| Architecture | User facing client software transfers images to and receives results from analysis software through a web server. | User facing client software transfers images to and receives results from analysis software through a web server. | Equivalent |
| Workflow | The graphical user interface includes on-screen prompts to guide the user through the image acquisition workflow one image at a time and submission of the exam. | Labeling (the Quick Reference Guide) guides the user through the image acquisition workflow and submission of the exam. | Substantially Equivalent. The labeling is being incorporated directly into the user interface and instructions are displayed as the user progresses through each step for improved interaction with the device. The overall workflow and directions for use do not change. |

Table 1: Comparison of the Subject and Predicate Device
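The three device outputs in Table 1 map directly to on-screen recommendations. The sketch below shows that mapping; the enum names and recommendation strings are illustrative, not the device's literal wording.

```python
# Illustrative mapping from the device outputs listed in Table 1 to the recommendations
# shown to the user; names and strings are assumptions for illustration.
from enum import Enum


class ExamOutput(Enum):
    MTMDR_DETECTED = "mtmDR detected"
    MTMDR_NOT_DETECTED = "mtmDR not detected"
    INSUFFICIENT_IMAGE_QUALITY = "insufficient image quality"


RECOMMENDATION = {
    ExamOutput.MTMDR_DETECTED: "Refer to an eye care professional",
    ExamOutput.MTMDR_NOT_DETECTED: "Rescreen in 12 months",
    # Per the Client workflow described above, insufficient-quality exams can be
    # reacquired and resubmitted when applicable.
    ExamOutput.INSUFFICIENT_IMAGE_QUALITY: "Reacquire images and resubmit the exam",
}
```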
VII. Performance Data
The following performance data were provided in support of the substantial equivalence determination.
A. Summary of Non-clinical Studies
Software
IDx-DR (software version 2) was identified as having a major level of concern as defined in the FDA guidance document Guidance for the Content of Premarket Submissions for Software Contained in Medical Devices. The software documentation includes:
- Software/Firmware Description
- Device Hazard Analysis
- Software Requirement Specifications
- Architecture Design Chart
- Software Design Specifications
- Traceability
- Software Development Environment Description
- Revision Level History
- Unresolved Anomalies
- Cybersecurity
A comprehensive risk analysis was performed on IDx-DR with identification and detailed description of the hazards, their causes and severity, as well as acceptable methods for control of the identified hazards. A description of acceptable verification and validation activities, at the unit, integration, and system level, including test protocols with pass/fail criteria and a report of the results, was provided. The expected impact of various hardware features on performance was assessed and minimum specifications for acceptable images for analysis were specified.
The cybersecurity considerations of data confidentiality, data integrity, data availability, denial of service attacks, and malware were adequately addressed utilizing platform controls, application controls, and procedure controls, and evidence was provided for the intended performance of the controls. Risks related to failure of various software components and their potential impact on patient reports and operator failures were also adequately addressed in the risk analysis. This software documentation information provided sufficient evidence of safe and effective software performance.
A full characterization of the technical parameters of all of the components of the software, including a description of the algorithms that analyze the patient's images to determine exam quality and the diagnostic screening of diabetic retinopathy, has been provided. IDx-DR requires one optic disc centered image and one macula centered image from a fundus camera with at least 22 pixels per degree on the retina, which corresponds to a field-of-view diameter of approximately 1,000 pixels for a 45 degree field of view image.
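As a quick arithmetic check of that requirement, 22 pixels per degree over a 45 degree field of view gives a minimum field-of-view diameter of 990 pixels, i.e. roughly 1,000 pixels:

```python
# Minimum field-of-view diameter implied by the stated resolution requirement.
pixels_per_degree = 22
field_of_view_degrees = 45
min_fov_diameter_px = pixels_per_degree * field_of_view_degrees
print(min_fov_diameter_px)  # 990 pixels, i.e. roughly 1000 pixels across the 45° field of view
```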
The IDx-DR artificial intelligence device design has the ability to perform analysis on the specific disease features that are important to a retina specialist for diagnostic screening of diabetic retinopathy. Future algorithm improvements will be made under a consistent medically relevant framework. A protocol was provided to mitigate the risk of algorithm changes leading to changes in the device technical specifications, which would lead to changes in false positive or false negative results. These changes could significantly affect clinical functionality or performance specifications directly associated with the intended use of the device. The protocol specifies the level of change in device specifications that could significantly affect the safety or effectiveness of the device, triggering the requirement for a new 510(k) premarket notification submission before commercial introduction. The protocol incorporates a risk management approach and other approaches provided in the FDA guidance document Deciding When to Submit a 510(k) for a Software Change to an Existing Device: Guidance for Industry and FDA Staff in development, validation, and execution of the device changes.
Usability
Usability validation testing was performed under simulated-use to assess the user interface (IDx-DR Client) of the subject device. The testing was performed in an environment equivalent to the intended use environment of IDx-DR with subjects that had no prior experience using the IDx-DR Client. The critical task for the IDx-DR system is the ability to capture four images of sufficient quality. The purpose of the usability validation test plan was to demonstrate that the intended image capture workflow and training methodology can successfully be used by the intended operators to capture four retinal images. The results of the usability validation study indicated that no existing critical tasks were impacted by the modification and no new critical tasks were introduced, and demonstrated that previously untrained camera operators can capture four retinal images of sufficient quality following the imaging protocol and using the indicated camera system and standardized training and operating materials.
B. Summary of Clinical Studies
The determination of substantial equivalence is not based on an assessment of clinical performance data. The device modifications do not affect clinical performance. Refer to DEN180001 for details about the pivotal clinical trial of the IDx-DR device.
VIII. Conclusions
IDx-DR is substantially equivalent to the predicate IDx-DR device cleared under DEN180001. The subject and predicate devices have the same indications for use, technological characteristics, and performance specifications.
§ 886.1100 Retinal diagnostic software device.
(a) Identification. A retinal diagnostic software device is a prescription software device that incorporates an adaptive algorithm to evaluate ophthalmic images for diagnostic screening to identify retinal diseases or conditions.
(b) Classification. Class II (special controls). The special controls for this device are:
(1) Software verification and validation documentation, based on a comprehensive hazard analysis, must fulfill the following:
(i) Software documentation must provide a full characterization of technical parameters of the software, including algorithm(s).
(ii) Software documentation must describe the expected impact of applicable image acquisition hardware characteristics on performance and associated minimum specifications.
(iii) Software documentation must include a cybersecurity vulnerability and management process to assure software functionality.
(iv) Software documentation must include mitigation measures to manage failure of any subsystem components with respect to incorrect patient reports and operator failures.
(2) Clinical performance data supporting the indications for use must be provided, including the following:
(i) Clinical performance testing must evaluate sensitivity, specificity, positive predictive value, and negative predictive value for each endpoint reported for the indicated disease or condition across the range of available device outcomes.
(ii) Clinical performance testing must evaluate performance under anticipated conditions of use.
(iii) Statistical methods must include the following:
(A) Where multiple samples from the same patient are used, statistical analysis must not assume statistical independence without adequate justification.
(B) Statistical analysis must provide confidence intervals for each performance metric.
(iv) Clinical data must evaluate the variability in output performance due to both the user and the image acquisition device used.
(3) A training program with instructions on how to acquire and process quality images must be provided.
(4) Human factors validation testing that evaluates the effect of the training program on user performance must be provided.
(5) A protocol must be developed that describes the level of change in device technical specifications that could significantly affect the safety or effectiveness of the device.
(6) Labeling must include:
(i) Instructions for use, including a description of how to obtain quality images and how device performance is affected by user interaction and user training;
(ii) The type of imaging data used, what the device outputs to the user, and whether the output is qualitative or quantitative;
(iii) Warnings regarding image acquisition factors that affect image quality;
(iv) Warnings regarding interpretation of the provided outcomes, including:
(A) A warning that the device is not to be used to screen for the presence of diseases or conditions beyond its indicated uses;
(B) A warning that the device provides a screening diagnosis only and that it is critical that the patient be advised to receive followup care; and
(C) A warning that the device does not treat the screened disease;
(v) A summary of the clinical performance of the device for each output, with confidence intervals; and
(vi) A summary of the clinical performance testing conducted with the device, including a description of the patient population and clinical environment under which it was evaluated.
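Special control (2)(iii)(B) above requires a confidence interval for each reported performance metric. Purely as an illustration of that requirement (no performance data from this submission are shown), a 95% Wilson score interval for a proportion such as sensitivity or specificity could be computed as in the sketch below; the counts in the usage example are invented.

```python
# Illustrative 95% Wilson score confidence interval for a binomial proportion
# (e.g., sensitivity or specificity), as called for by special control (2)(iii)(B).
from math import sqrt


def wilson_interval(successes: int, total: int, z: float = 1.96):
    """Return the (lower, upper) Wilson score interval for successes/total."""
    if total <= 0:
        raise ValueError("total must be positive")
    p = successes / total
    denominator = 1 + z**2 / total
    centre = (p + z**2 / (2 * total)) / denominator
    half_width = (z * sqrt(p * (1 - p) / total + z**2 / (4 * total**2))) / denominator
    return centre - half_width, centre + half_width


# Invented counts for illustration: 87 of 100 disease-positive exams correctly detected.
print(wilson_interval(87, 100))  # roughly (0.79, 0.92)
```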