The EyeBOX is intended to measure and analyze eye movements as an aid in the diagnosis of concussion, also known as mild traumatic brain injury (mTBI), within one week of head injury in patients 5 through 67 years of age in conjunction with a standard neurological assessment of concussion.
A negative EyeBOX classification may correspond to eye movement that is consistent with a lack of concussion. A positive EyeBOX classification corresponds to eye movement that may be present in patients both with and without concussion.
Oculogica's EyeBOX is an eye-tracking device with custom software. The device comprises a host PC with an integrated touchscreen interface for the operator; an eye-tracking camera; an LCD stimulus screen and head-stabilizing rest (chin rest and forehead rest) for the patient; and a data processing algorithm. The algorithm detects subtle changes in eye movements resulting from concussion. The eye-tracking task takes about 4 minutes to complete: the patient watches a video move around the perimeter of an LCD monitor positioned in front of them while a high-speed near-infrared (IR) camera records gaze positions 500 times per second. The post-processed data are analyzed automatically to produce one or more outcome measures.
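The filing does not describe the post-processing pipeline itself. As a rough illustration of one step such a pipeline typically includes, the sketch below flags blinks in a 500 Hz gaze stream as runs of lost (NaN) samples exceeding a minimum duration. The function name, the 50 ms threshold, and the NaN convention are hypothetical assumptions for illustration, not Oculogica's implementation.

```python
import math

SAMPLE_RATE_HZ = 500   # the EyeBOX camera records gaze 500 times per second
MIN_BLINK_MS = 50      # hypothetical minimum blink duration

def detect_blinks(gaze_x, min_blink_ms=MIN_BLINK_MS):
    """Return (start, end) sample indices of runs of lost gaze data.

    A blink is approximated as a contiguous run of NaN samples lasting
    at least `min_blink_ms`; real trackers also use pupil-size signals.
    """
    min_samples = int(min_blink_ms * SAMPLE_RATE_HZ / 1000)
    blinks, run_start = [], None
    for i, x in enumerate(gaze_x):
        if math.isnan(x):
            if run_start is None:
                run_start = i
        else:
            if run_start is not None and i - run_start >= min_samples:
                blinks.append((run_start, i))
            run_start = None
    # close a run that extends to the end of the recording
    if run_start is not None and len(gaze_x) - run_start >= min_samples:
        blinks.append((run_start, len(gaze_x)))
    return blinks
```

For example, a trace with 40 consecutive dropped samples (80 ms at 500 Hz) yields one detected blink, while a 5-sample (10 ms) dropout is ignored as noise.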
The provided text describes a 510(k) premarket notification for the Oculogica EyeBOX, Model OCL 02.5. This submission is for a modification to a previously cleared device (EyeBOX Model OCL 02). The document emphasizes that the new device has the same intended use, principles of operation, and similar technological characteristics as the predicate, and that the changes do not raise new questions of safety or effectiveness.
Therefore, the performance data presented is primarily focused on demonstrating that the modifications to the device do not adversely impact performance, rather than establishing new acceptance criteria or a comprehensive study proving the device meets those criteria from scratch. The document explicitly states: "comprehensive testing demonstrates that these changes do not adversely impact performance."
Given this context, here's a breakdown of the information requested, based on the provided text:
1. A table of acceptance criteria and the reported device performance
The document does not explicitly state acceptance criteria for diagnostic performance (e.g., sensitivity, specificity, accuracy) for the EyeBOX Model OCL 02.5, as it is a modification submission aiming to demonstrate that performance is not adversely affected by changes. The reported performance relates to the functionality of the new camera system.
| Acceptance Criteria (for new camera system) | Reported Device Performance (for new camera system) |
| --- | --- |
| Spatial precision of the new camera met performance requirements. | Bench testing on an artificial eye demonstrated that the spatial precision met performance requirements. |
| Step response of the new camera met performance requirements. | Bench testing on an artificial eye demonstrated that the step response met performance requirements. |
| Reliable detection of blinks. | Testing in N=84 human participants demonstrated that the new camera and analysis could reliably detect blinks. |
| Reliable detection of gaze position across the range of gaze positions measured by the device. | Testing in N=84 human participants demonstrated that the new camera and analysis could reliably detect gaze position across this range. |
| Electromagnetic emissions and immunity per IEC 60601-1-2:2014. | Testing performed to IEC 60601-1-2:2014. |
| Light hazard protection for ophthalmic instruments per ANSI Z80.36-2016. | Testing performed to ANSI Z80.36-2016. |
| Software functionality. | Software verification and user testing performed. |
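The filing does not define "spatial precision," but in eye-tracking bench tests precision is conventionally reported as the root-mean-square of sample-to-sample angular differences while an artificial eye is held fixed. A minimal sketch of that conventional metric follows; this is a generic formula, not Oculogica's actual test procedure.

```python
import math

def rms_s2s_precision(angles_deg):
    """RMS of successive-sample differences in gaze angle (degrees).

    Conventional precision metric for a fixating (artificial) eye:
    lower values mean less sample-to-sample jitter.
    """
    diffs = [b - a for a, b in zip(angles_deg, angles_deg[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))
```

For instance, a recorded angle alternating between 0.00 and 0.01 degrees has an RMS sample-to-sample precision of 0.01 degrees.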
2. Sample size used for the test set and the data provenance
- Sample Size for test set: N=84 human participants for testing the new camera's reliability in detecting blinks and gaze position.
- Data Provenance: Not explicitly stated (e.g., country of origin, retrospective/prospective). It mentions "human participants," implying prospective testing.
3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts
Not applicable/provided for the described testing. The testing of the new camera's performance (blink and gaze detection) does not appear to involve expert-established ground truth for performance against a diagnosis of concussion. The core diagnostic performance is based on the predicate device's clearance.
4. Adjudication method for the test set
Not applicable/provided for the described testing. The testing focused on technical performance metrics (blink/gaze detection) rather than diagnostic accuracy requiring adjudication.
5. If a multi-reader, multi-case (MRMC) comparative effectiveness study was done, and if so, the effect size of how much human readers improve with AI versus without AI assistance
Not applicable. The document describes a device that "measures and analyzes eye movements as an aid in the diagnosis of concussion." It's an AI-based diagnostic aid, but the reported testing for this 510(k) submission is to show that a modified version of the device performs equivalently to the predicate device, not a comparative effectiveness study with human readers.
6. If a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done
Yes, implicitly, given the device's function. The EyeBOX relies on a data processing algorithm that automatically analyzes the recorded eye movements to produce a BOX score. The initial clearance (K191183 for Model OCL 02) would have established the standalone performance of this algorithm. The current submission confirms that changes to hardware and software (not to the core algorithm's diagnostic output for a given eye-movement pattern) do not negatively impact this.
7. The type of ground truth used (expert consensus, pathology, outcomes data, etc.)
For the specific performance testing described for the OCL 02.5 model (new camera system), the ground truth for "reliably detect blinks and gaze position" would be the actual occurrence of blinks and gaze positions, likely established by controlled experimental conditions or other objective measurements. For the overall diagnostic capability of the EyeBOX (as established by the predicate device), the document states it's an "aid in the diagnosis of concussion...in conjunction with a standard neurological assessment of concussion." This implies that the ground truth for concussion diagnosis in the original predicate study would have involved clinical assessment outcomes.
8. The sample size for the training set
Not mentioned in the provided text, as this document refers to a modification of a previously cleared device. The training set information would have been part of the original K191183 submission for EyeBOX Model OCL 02.
9. How the ground truth for the training set was established
Not mentioned in the provided text for the same reason as point 8.
§ 882.1455 Traumatic brain injury eye movement assessment aid.
(a) Identification. A traumatic brain injury eye movement assessment aid is a prescription device that uses a patient's tracked eye movements to provide an interpretation of the functional condition of the patient's brain. This device is an assessment aid that is not intended for standalone detection or diagnostic purposes.
(b) Classification. Class II (special controls). The special controls for this device are:
(1) Clinical performance data under anticipated conditions of use must evaluate tracked eye movement in supporting the indications for use and include the following:
(i) Evaluation of sensitivity, specificity, positive predictive value, and negative predictive value using a reference method of diagnosis;
(ii) Evaluation of device test-retest reliability; and
(iii) A description of the development of the reference method of diagnosis, which may include a normative database, to include the following:
(A) A discussion of how the clinical work-up was completed to establish the reference method of diagnosis, including the establishment of inclusion and exclusion criteria; and
(B) If using a normative database, a description of how the “normal” population was established, and the statistical methods and model assumptions used.
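Special control (1)(i) names the four standard metrics computed from a 2×2 comparison of device classifications against the reference method of diagnosis. For reference, the definitions are sketched below with made-up counts; this is a generic illustration, not data from this submission.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 confusion-matrix metrics against a reference diagnosis.

    tp/fp/fn/tn are counts of true positives, false positives,
    false negatives, and true negatives, respectively.
    """
    return {
        "sensitivity": tp / (tp + fn),  # true-positive rate
        "specificity": tn / (tn + fp),  # true-negative rate
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical counts, for illustration only:
m = diagnostic_metrics(tp=40, fp=10, fn=5, tn=45)
```

With those hypothetical counts, sensitivity is 40/45, specificity 45/55, PPV 0.8, and NPV 0.9. Note that PPV and NPV, unlike sensitivity and specificity, depend on the prevalence of concussion in the study population.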
(2) Software verification, validation, and hazard analysis must be performed. Software documentation must include a description of the algorithms used to generate device output.
(3) Performance testing must demonstrate the electrical safety and electromagnetic compatibility (EMC) of the device.
(4) The patient-contacting components of the device must be demonstrated to be biocompatible.
(5) A light hazard assessment must be performed for all eye-tracking and visual display light sources.
(6) Labeling must include:
(i) A summary of clinical performance testing conducted with the device, including sensitivity, specificity, positive predictive value, negative predictive value, and test-retest reliability;
(ii) A description of any normative database that includes the following:
(A) The clinical definition used to establish a “normal” population and the specific selection criteria;
(B) The format for reporting normal values;
(C) Examples of screen displays and reports generated to provide the user results and normative data;
(D) Statistical methods and model assumptions; and
(E) Any adjustments for age and gender.
(iii) A warning that the device should only be used by trained healthcare professionals;
(iv) A warning that the device does not identify the presence or absence of traumatic brain injury or other clinical diagnoses;
(v) A warning that the device is not a standalone diagnostic; and
(vi) Any instructions to convey to patients regarding the administration of the test and collection of test data.
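The regulation requires test-retest reliability to be evaluated and labeled but does not prescribe a statistic. For a binary positive/negative output, agreement between two administrations on the same subjects is often summarized with Cohen's kappa, sketched below; the choice of kappa is an assumption for illustration, not something the regulation or filing specifies.

```python
def cohens_kappa(first, second):
    """Chance-corrected agreement between two binary test administrations.

    `first` and `second` are equal-length lists of 0/1 classifications
    from the same subjects tested twice.
    """
    n = len(first)
    observed = sum(a == b for a, b in zip(first, second)) / n
    p1 = sum(first) / n   # positive rate on first administration
    p2 = sum(second) / n  # positive rate on second administration
    expected = p1 * p2 + (1 - p1) * (1 - p2)  # agreement expected by chance
    return (observed - expected) / (1 - expected)
```

Kappa is 1.0 for perfect agreement and 0.0 when agreement is no better than chance, which makes it more informative than raw percent agreement when one class dominates.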