K Number
K202097
Device Name
Fundus Camera
Manufacturer
Huvitz Co., Ltd.
Date Cleared
2021-02-02

(188 days)

Product Code
Regulation Number
886.1120
Panel
OP
Reference & Predicate Devices
Nidek AFC-330 Non-Mydriatic Auto Fundus Camera (K113451)
Intended Use

The HFC-1 fundus camera is intended to capture digital images of the anterior segment and retina of the eye without the use of a mydriatic agent. It is intended for use as an aid to clinicians in the evaluation and diagnosis of ocular health.

Device Description

The HFC-1 Fundus Camera captures, stores, and displays color fundus images with a built-in 20-megapixel color sensor and up to a 45-degree field of view. It is designed as a non-contact, non-invasive, high-resolution digital imaging device. Its retinal imaging system provides digital images of the eyes to assist physicians in diagnostic examinations. The anterior segment of the eye is illuminated by IR light, and the retina is illuminated by a white LED emitted by the fundus illumination optical system. The fundus observation/photography optical system obtains an image with image sensors, and images are observed and manipulated on the display panel.

AI/ML Overview

The Huvitz Co., Ltd. HFC-1 Fundus Camera is intended to capture digital images of the anterior segment and retina of the eye without the use of a mydriatic agent, aiding clinicians in evaluating and diagnosing ocular health.

The device's performance was evaluated through a series of bench tests, including electrical and mechanical safety testing, electromagnetic compatibility, light hazard testing, and disinfection tests, all adhering to relevant international standards. The primary effectiveness study involved comparing the HFC-1's image quality and technical features against a predicate device, the Nidek AFC-330 Non-Mydriatic Auto Fundus Camera (K113451), and assessing its conformity to ISO 10940:2009 (Ophthalmic Instruments-Fundus Cameras).

Here's a breakdown of the acceptance criteria and study details:

1. Table of Acceptance Criteria and Reported Device Performance:

| Test list | Acceptance Criteria | Reported Device Performance | Pass/Fail |
|---|---|---|---|
| Resolution | ① Center: 60 line pairs/mm or more; ② Middle: more than 40 line pairs/mm; ③ Around: 25 line pairs/mm or more (established based on ISO 10940) | Center: 62 (6G3E), 70.23 (-1G6E); Middle: 41 (6G2E), 62.59 (-1G5E); Around: 28 (6G1E), 39.41 (-1G1E) | Pass |
| Image capture angle | 45° ± 5% (normal mode), i.e., 42.75° ~ 47.25° (ø 787.0 ~ 869.8) (established based on ISO 10940) | 43.1° (ø 790, r 395) | Pass |
| Pupil diameter | ① 4.0 mm or more (normal mode); ② 3.3 mm or more (minimum pupil measurement mode) | ① Pass (possible to shoot model eye with 4.0 mm pupil diameter); ② Pass (possible to shoot model eye with 3.3 mm pupil diameter) | Pass |
| Pixel pitch of sensor in fundus | 3.69 µm ± 7% (3.4317 ~ 3.9483 µm) (according to ISO 10940) | 3.53 µm | Pass |
| Light intensity control | 10 steps; each level of light intensity should be well operated and well controlled | Pass (each level of light intensity was well operated and well controlled) | Pass |
| Objective lens reflected light and black spot | The difference between the circumference and 10 should be less (established considering the opinion of Huvitz senior engineers and researchers) | Pass (result met the test standard) | Pass |
| Working distance | Capture fundus image: 33 mm ± 1 mm | Pass (result met the test standard) | Pass |
| Diopter adjustment range | Total: -33D ~ +33D; (1) without correction lens: -13D ~ +13D; (2) with correction lens: +7D ~ +33D; (3) with compensation lens: -33D ~ -7D | Pass (captured image was clear within ranges) | Pass |
| Moving range (body) | Front/back: 70 mm ± 5 mm; right/left: 100 mm ± 5 mm; top/bottom: 30 mm ± 5 mm | Front/back: 70 mm; left/right: 102 mm; up/down: 30.5 mm | Pass |
| Moving range (chin rest) | Top/bottom of chin rest: 62 mm ± 5 mm | Up/down: 65 mm | Pass |
| Auto tracking | Top/bottom: 30 mm ± 1 mm; right/left: 10 mm ± 1 mm; front/rear: 10 mm ± 1 mm | Top/bottom: 30 mm; right/left: 11 mm; front/back: 10 mm | Pass |
| Sleep mode | 5 min ± 5 s (established considering the opinion of Huvitz senior engineers and researchers) | Pass (result met the test standard) | Pass |
| LCD tilting angle | 70° ± 5% (66.5° ~ 73.5°) (established considering the opinion of Huvitz senior engineers and researchers) | 71° | Pass |
| Cornea flare | The ring of light is located at the center of the mask; width and upper/lower, left/right sides should be constant when rotated (2nd step) (established considering the opinion of Huvitz senior engineers and researchers) | Pass (result met the test standard) | Pass |
| Lens flare | The ring of light is located at the center of the mask; width and upper/lower, left/right sides should be constant when rotated (3rd step) (established considering the opinion of Huvitz senior engineers and researchers) | Pass (result met the test standard) | Pass |
| Image quality comparison test (HFC-1 vs. AFC-330) | "Supportive of equivalence of HFC-1 to the predicate device with regard to image quality" (implicit criterion: image quality comparable to the predicate) | Result was supportive of equivalence of the HFC-1 to the predicate device with regard to image quality | Pass |
| Resolving power, field of view, and panorama function comparison test (HFC-1 vs. AFC-330) | "HFC-1 is as effective as AFC-330" (implicit criterion: performance comparable to the predicate in these aspects) | HFC-1 is as effective as the AFC-330; the test demonstrates that the HFC-1 has a panorama function like the AFC-330 | Pass |
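
The percentage and absolute tolerances in the table translate directly into numeric pass/fail ranges (for example, 45° ± 5% gives 42.75° to 47.25°, and 3.69 µm ± 7% gives 3.4317 to 3.9483 µm). The short Python sketch below is illustrative only and not part of the submission; it encodes a few of the quantitative criteria from the table as ranges and checks the reported values against them. The helper names `pct_range` and `abs_range` are hypothetical.

```python
# Illustrative sketch (not from the 510(k)): turn the tabulated tolerances
# into (low, high) ranges and check the reported values against them.

def pct_range(nominal, pct):
    """Return (low, high) for a nominal value with a +/- percentage tolerance."""
    delta = nominal * pct / 100.0
    return nominal - delta, nominal + delta

def abs_range(nominal, tol):
    """Return (low, high) for a nominal value with a +/- absolute tolerance."""
    return nominal - tol, nominal + tol

# (criterion, (low, high), reported value) -- numbers taken from the table above
checks = [
    ("Image capture angle (deg)",    pct_range(45.0, 5.0), 43.1),   # 42.75 .. 47.25
    ("Pixel pitch (um)",             pct_range(3.69, 7.0), 3.53),   # 3.4317 .. 3.9483
    ("LCD tilting angle (deg)",      pct_range(70.0, 5.0), 71.0),   # 66.5 .. 73.5
    ("Chin rest travel (mm)",        abs_range(62.0, 5.0), 65.0),   # 57 .. 67
    ("Body front/back travel (mm)",  abs_range(70.0, 5.0), 70.0),   # 65 .. 75
]

for name, (low, high), reported in checks:
    verdict = "Pass" if low <= reported <= high else "Fail"
    print(f"{name}: {reported} within [{low:.4g}, {high:.4g}] -> {verdict}")
```

Run as written, each check prints "Pass", matching the Pass/Fail column of the table.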

2. Sample size used for the test set and the data provenance (e.g. country of origin of the data, retrospective or prospective)

  • Sample Size: The document does not specify a distinct "test set" in terms of patient images or clinical cases for the performance evaluation. Instead, the performance tests relied on model eyes, standardized targets (e.g., USAF chart, scales), and physical measurements of the device.
  • Data Provenance: The testing appears to be prospective bench testing conducted by the manufacturer, Huvitz Co., Ltd., which is based in Gyeonggi-do, Republic of Korea. No patient data or clinical data from specific countries are mentioned for these performance tests.

3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts (e.g., radiologist with 10 years of experience)

  • For the objective quantitative tests (e.g., resolution, image capture angle, pupil diameter, pixel pitch, working distance, moving range, auto tracking, sleep mode, LCD tilting angle, cornea flare, lens flare), the acceptance criteria were established based on ISO 10940 Standard or opinions of Huvitz senior engineers and researchers. There is no mention of external experts or their specific qualifications for establishing ground truth for these objective measurements.
  • For the Image Quality Comparison Test, images from the HFC-1 and the predicate device were "shown to the physician for comparison in image quality." The document does not specify the number of physicians, their qualifications, or how their comparisons were aggregated to form a "ground truth" or judgment.

4. Adjudication method (e.g. 2+1, 3+1, none) for the test set

  • For the objective bench tests, the adjudication method was none in the sense of expert consensus. The results were compared directly against pre-defined numerical or descriptive criteria derived from ISO standards or internal expert opinion.
  • For the Image Quality Comparison Test, the document states images were "shown to the physician for comparison," but it does not describe an adjudication method (e.g., majority vote, consensus meeting) for interpreting the physician's comparison.

5. If a multi-reader, multi-case (MRMC) comparative effectiveness study was done, and if so, the effect size of how much human readers improve with AI vs. without AI assistance

  • No MRMC comparative effectiveness study was done. The device is a fundus camera, which is an imaging device, not an AI-powered diagnostic algorithm designed to assist human readers. The effectiveness study focused on the image capture capabilities and image quality of the camera itself.

6. If a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done

  • This question is not applicable as the HFC-1 Fundus Camera is an imaging device, not an AI algorithm. Its performance is about its ability to capture images, not interpret them.

7. The type of ground truth used (expert consensus, pathology, outcomes data, etc)

  • For the objective performance tests, the "ground truth" was based on internationally recognized standards (ISO 10940) and internal engineering specifications/expert opinions.
  • For the Image Quality Comparison Test, the "ground truth" was based on the direct visual comparison of images by an unnamed physician. This is closer to a subjective expert assessment rather than objective and independently verified ground truth like pathology.

8. The sample size for the training set

  • No training set is mentioned or applicable, as the HFC-1 Fundus Camera is an imaging device, not a machine learning algorithm that requires training data.

9. How the ground truth for the training set was established

  • Not applicable as there is no training set for this device.

§ 886.1120 Ophthalmic camera.

(a) Identification. An ophthalmic camera is an AC-powered device intended to take photographs of the eye and the surrounding area.

(b) Classification. Class II (special controls). The device, when it is a photorefractor or a general-use ophthalmic camera, is exempt from the premarket notification procedures in subpart E of part 807 of this chapter subject to the limitations in § 886.9.