K Number
K213795
Manufacturer
VideaHealth, Inc.
Date Cleared
2022-04-21 (136 days)
Product Code
MYN
Regulation Number
892.2070
Reference & Predicate Devices
N/A
Intended Use

Videa Caries Assist is a computer-assisted detection (CADe) device that analyzes intraoral radiographs to identify and localize carious lesions. Videa Caries Assist is indicated for use by board licensed dentists for the concurrent review of bitewing (BW) radiographs acquired from adult patients aged 22 years or older.

Device Description

Videa Caries Assist (VCA) software is a cloud-based AI-powered medical device for the automatic detection of carious lesions in dental radiographs. The device itself is available as a service via an API (Application Programming Interface) behind a firewalled network. Provided proper authentication and a bitewing image, the device returns a set of bounding boxes representing the carious lesions detected. VCA is accessed by the dental practitioner through their Dental Viewer. From within the Dental Viewer the user can upload a radiograph to VCA and then review the results. The device outputs a binary indication to identify the presence or absence of findings. If findings are present, the device outputs the coordinates of the bounding boxes for each finding. If no findings are present, the device outputs a clear indication that there are no carious lesions identified.

AI/ML Overview

Here's a summary of the acceptance criteria and the study details for the Videa Caries Assist device, based on the provided document:

Acceptance Criteria and Device Performance

| Metric | Acceptance Criteria (Implicit) | Reported Device Performance (Standalone Study) | Reported Device Performance (Reader Study - Aided) | Reported Device Performance (Reader Study - Unaided) |
|---|---|---|---|---|
| Overall average AFROC FOM | Improvement over unaided reads | 0.740 (95% CI: 0.721, 0.760) | 0.739 (95% CI: 0.705, 0.773) | 0.667 (95% CI: 0.633, 0.701) |
| Difference in overall average AFROC FOM (Aided - Unaided) | > 0 | N/A | 0.072 (95% CI: 0.047, 0.097; p < 0.0001) | N/A |
| Overall average Se (image-based) | N/A | 70.8% (95% CI: 68.0, 73.7) | N/A | N/A |
| Overall average PPV (image-based) | N/A | 59.5% (95% CI: 56.5, 62.5) | N/A | N/A |
| Overall average Se (lesion-based, pooled) | N/A | 73.6% (95% CI: 71.1, 76.0) | N/A | N/A |
| Overall average PPV (lesion-based, pooled) | N/A | 64.9% (95% CI: 62.3, 67.6) | N/A | N/A |

Note: The document primarily focuses on demonstrating superiority in the reader study using AFROC FOM as the primary endpoint. While standalone metrics are reported, explicit acceptance criteria for these specific values are not provided within this document. The implicit acceptance criterion for the clinical reader study was that the AFROC FOM for aided reads is statistically significantly superior to unaided reads.
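
The submission does not restate the formula behind the AFROC FOM; for readers unfamiliar with it, a standard formulation (not quoted from the document) defines the FOM as the area under the AFROC curve, which plots the lesion localization fraction (LLF) against the image-level false positive fraction (FPF):

```latex
% Standard (unweighted) AFROC figure of merit -- a conventional definition,
% not taken from the 510(k) submission itself.
\mathrm{FOM}_{\mathrm{AFROC}}
  = \int_{0}^{1} \mathrm{LLF}\,(\mathrm{FPF}) \, d\,\mathrm{FPF},
\qquad
\mathrm{LLF} = \frac{\#\{\text{correctly localized lesions}\}}{\#\{\text{lesions}\}}
```

Under this convention a FOM of 1.0 corresponds to every lesion being localized before any false-positive mark appears on a normal image, and the empirical curve is extended to (1, 1) before the area is computed.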

Study Details

  1. Sample size used for the test set and the data provenance:

    • Standalone Study: 1034 adult radiographs.
      • Data Provenance: Collected from 10 US sites.
    • Retrospective/Prospective: Not explicitly stated; the phrase "collected from" suggests the data were collected retrospectively for this study.
    • Clinical Data (Reader Study): 226 adult radiographs.
      • Data Provenance: Collected from 10 US sites.
    • Retrospective/Prospective: Not explicitly stated; the phrase "collected from" suggests the data were collected retrospectively for this study.
  2. Number of experts used to establish the ground truth for the test set and the qualifications of those experts:

    • Standalone Study: Three US board-certified dentists.
    • Clinical Data (Reader Study): Three US board-certified dentists.
  3. Adjudication method for the test set:

    • The document states that the test set was "ground-truthed by three US board-certified dentists." It does not explicitly detail an adjudication method beyond this, such as "2+1" or "3+1" (e.g., if at least two out of three agreed, or a tie-breaker by a chief expert). It implies a consensus by the three experts.
  4. Whether a multi-reader, multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improved with AI versus without AI assistance:

    • Yes, a fully crossed randomized, multiple reader multiple case (MRMC) controlled study was performed.
    • Effect Size: The overall average AFROC FOM for readers aided by VCA was 0.739, compared to 0.667 for unaided reads. The difference was 0.072 (95% CI: 0.047, 0.097; p < 0.0001). This indicates an improvement of 0.072 in AFROC FOM when readers used AI assistance.
  5. Whether a standalone study (i.e., algorithm-only performance without a human in the loop) was done:

    • Yes, a "Bench Testing (Standalone Study)" was conducted. The performance metrics are reported in the table above, including an overall average AFROC FOM of 0.740, image-based Sensitivity of 70.8%, and PPV of 59.5%.
  6. The type of ground truth used (expert consensus, pathology, outcomes data, etc.):

    • Expert Consensus: The ground truth for both the standalone and reader study test sets was established by "three US board-certified dentists." This implies expert consensus based on their review of the radiographs.
  7. The sample size for the training set:

    • The document does not explicitly state the sample size for the training set. It only mentions "Supervised Deep Learning" as the development technology.
  8. How the ground truth for the training set was established:

    • The document does not explicitly state how the ground truth for the training set was established. It only mentions "Supervised Deep Learning," which necessitates a labeled training set, but the process of labeling is not described.


April 21, 2022


VideaHealth, Inc.
% Donna-Bea Tillman
Senior Consultant
Biologics Consulting Group
1555 King St., Suite 300
Alexandria, VA 22314

Re: K213795

Trade/Device Name: Videa Caries Assist
Regulation Number: 21 CFR 892.2070
Regulation Name: Medical Image Analyzer
Regulatory Class: Class II
Product Code: MYN
Dated: March 23, 2022
Received: March 24, 2022

Dear Donna-Bea Tillman:

We have reviewed your Section 510(k) premarket notification of intent to market the device referenced above and have determined the device is substantially equivalent (for the indications for use stated in the enclosure) to legally marketed predicate devices marketed in interstate commerce prior to May 28, 1976, the enactment date of the Medical Device Amendments, or to devices that have been reclassified in accordance with the provisions of the Federal Food, Drug, and Cosmetic Act (Act) that do not require approval of a premarket approval application (PMA). You may, therefore, market the device, subject to the general controls provisions of the Act. Although this letter refers to your product as a device, please be aware that some cleared products may instead be combination products. The 510(k) Premarket Notification Database located at https://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfpmn/pmn.cfm identifies combination product submissions. The general controls provisions of the Act include requirements for annual registration, listing of devices, good manufacturing practice, labeling, and prohibitions against misbranding and adulteration. Please note: CDRH does not evaluate information related to contract liability warranties. We remind you, however, that device labeling must be truthful and not misleading.

If your device is classified (see above) into either class II (Special Controls) or class III (PMA), it may be subject to additional controls. Existing major regulations affecting your device can be found in the Code of Federal Regulations, Title 21, Parts 800 to 898. In addition, FDA may publish further announcements concerning your device in the Federal Register.

Please be advised that FDA's issuance of a substantial equivalence determination does not mean that FDA has made a determination that your device complies with other requirements of the Act or any Federal statutes and regulations administered by other Federal agencies. You must comply with all the Act's requirements, including, but not limited to: registration and listing (21 CFR Part 807); labeling (21 CFR Part 801); medical device reporting (reporting of medical device-related adverse events) (21 CFR 803) for devices or postmarketing safety reporting (21 CFR 4, Subpart B) for combination products (see https://www.fda.gov/combination-products/guidance-regulatory-information/postmarketing-safety-reportingcombination-products); good manufacturing practice requirements as set forth in the quality system (QS) regulation (21 CFR Part 820) for devices or current good manufacturing practices (21 CFR 4, Subpart A) for combination products; and, if applicable, the electronic product radiation control provisions (Sections 531-542 of the Act); 21 CFR 1000-1050.

Also, please note the regulation entitled, "Misbranding by reference to premarket notification" (21 CFR Part 807.97). For questions regarding the reporting of adverse events under the MDR regulation (21 CFR Part 803), please go to https://www.fda.gov/medical-device-safety/medical-device-reportingmdr-how-report-medical-device-problems.

For comprehensive regulatory information about radiation-emitting products, including information about labeling regulations, please see Device Advice (https://www.fda.gov/medicaldevices/device-advice-comprehensive-regulatory-assistance) and CDRH Learn (https://www.fda.gov/training-and-continuing-education/cdrh-learn). Additionally, you may contact the Division of Industry and Consumer Education (DICE) to ask a question about a specific regulatory topic. See the DICE website (https://www.fda.gov/medical-device-advice-comprehensive-regulatoryassistance/contact-us-division-industry-and-consumer-education-dice) for more information or contact DICE by email (DICE@fda.hhs.gov) or phone (1-800-638-2041 or 301-796-7100).

Sincerely,

Laurel Burk, Ph.D. Assistant Director Diagnostic X-ray Systems Team Division of Radiological Health OHT7: Office of In Vitro Diagnostics and Radiological Health Office of Product Evaluation and Quality Center for Devices and Radiological Health

Enclosure


Indications for Use

510(k) Number (if known) K213795

Device Name Videa Caries Assist

Indications for Use (Describe)

Videa Caries Assist is a computer-assisted detection (CADe) device that analyzes intraoral radiographs to identify and localize carious lesions. Videa Caries Assist is indicated for use by board licensed dentists for the concurrent review of bitewing (BW) radiographs acquired from adult patients aged 22 years or older.

Type of Use (Select one or both, as applicable)
☑ Prescription Use (Part 21 CFR 801 Subpart D)
☐ Over-The-Counter Use (21 CFR 801 Subpart C)

CONTINUE ON A SEPARATE PAGE IF NEEDED.

This section applies only to requirements of the Paperwork Reduction Act of 1995.

DO NOT SEND YOUR COMPLETED FORM TO THE PRA STAFF EMAIL ADDRESS BELOW.

The burden time for this collection of information is estimated to average 79 hours per response, including the time to review instructions, search existing data sources, gather and maintain the data needed and complete and review the collection of information. Send comments regarding this burden estimate or any other aspect of this information collection, including suggestions for reducing this burden, to:

Department of Health and Human Services
Food and Drug Administration
Office of Chief Information Officer
Paperwork Reduction Act (PRA) Staff
PRAStaff@fda.hhs.gov

"An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB number."


In accordance with 21 CFR 807.87(h) and 21 CFR 807.92 the 510(k) Summary for the Videa Caries Assist device is provided below.

1. SUBMITTER

Applicant: VideaHealth, Inc., 19 Kingston St, Floor 3, Boston, MA 02114; +1 617-340-9940; florian.hillen@videahealth.io

Contact: Florian Hillen, CEO and Founder, VideaHealth, Inc.; 617-528-8643; florian.hillen@videahealth.io

Submission Correspondent: Donna-Bea Tillman, Ph.D., Senior Consultant, Biologics Consulting, 1555 King St, Suite 300, Alexandria, VA 22314; 410-531-6542; dtillman@biologicsconsulting.com

Date Prepared: March 23, 2022

2. DEVICE

Device Trade Name: Videa Caries Assist
Device Common Name: Medical Image Analyzer
Classification Name: 21 CFR 892.2070, Analyzer, Medical Image
Regulatory Class: 2
Product Code: MYN

3. PREDICATE DEVICE

Predicate Device: P980025 Logicon Caries Detection Software (Carestream Dental LLC)


4. DEVICE DESCRIPTION

Videa Caries Assist (VCA) software is a cloud-based AI-powered medical device for the automatic detection of carious lesions in dental radiographs. The device itself is available as a service via an API (Application Programming Interface) behind a firewalled network. Provided proper authentication and a bitewing image, the device returns a set of bounding boxes representing the carious lesions detected.

VCA is accessed by the dental practitioner through their Dental Viewer. From within the Dental Viewer the user can upload a radiograph to VCA and then review the results. The device outputs a binary indication to identify the presence or absence of findings. If findings are present, the device outputs the coordinates of the bounding boxes for each finding. If no findings are present, the device outputs a clear indication that there are no carious lesions identified.
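
The summary characterizes VCA only as an authenticated API that accepts a bitewing image and returns bounding boxes; no endpoint, request format, or response schema is published here. The sketch below is purely illustrative of that request/response pattern: the URL, token handling, and JSON field names are assumptions for illustration, not VideaHealth's actual interface.

```python
# Illustrative sketch only: the endpoint URL, auth scheme, and response schema
# below are hypothetical stand-ins, not the actual Videa Caries Assist API.
import requests

API_URL = "https://api.example-dental-cade.com/v1/analyze"  # hypothetical endpoint
API_TOKEN = "YOUR_API_TOKEN"  # placeholder credential issued to the viewer integration


def analyze_bitewing(image_path: str) -> list[dict]:
    """Upload a bitewing radiograph and return detected-lesion bounding boxes."""
    with open(image_path, "rb") as f:
        resp = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            files={"image": f},
            timeout=30,
        )
    resp.raise_for_status()
    result = resp.json()
    # Assumed shape: {"findings_present": bool, "boxes": [{"x": .., "y": .., "w": .., "h": ..}]}
    if not result.get("findings_present"):
        print("No carious lesions identified.")
        return []
    return result["boxes"]


if __name__ == "__main__":
    for box in analyze_bitewing("bitewing_example.png"):
        print(f"Suspected lesion at x={box['x']}, y={box['y']}, w={box['w']}, h={box['h']}")
```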

5. INTENDED USE/INDICATIONS FOR USE

Videa Caries Assist is a computer-assisted detection (CADe) device that analyzes intraoral radiographs to identify and localize carious lesions. Videa Caries Assist is indicated for use by board licensed dentists for the concurrent review of bitewing (BW) radiographs acquired from adult patients aged 22 years or older.

6. SUBSTANTIAL EQUIVALENCE

Comparison of Indications

Logicon Caries Detector and Videa Caries Assist both analyze dental radiographs and identify regions of interest. Logicon Caries Detector aids in the diagnosis of caries that have penetrated into the dentin for each tooth, whereas Videa Caries Assist detects all types of carious lesions. However, both devices are only intended as an aid to the physician and are not intended to replace the diagnosis by the physician. The differences in Indications for Use do not constitute a new intended use, as both devices are intended to assist dental professionals by identifying and marking Regions of Interest (ROI) in dental radiographs.

Technological Comparisons

Table 1 compares the key technological features of the subject device to the predicate device (Logicon Caries Detector, P980025).

| | Proposed Device | Predicate Device |
|---|---|---|
| 510(k) Number | TBD | P980025 |
| Applicant | VideaHealth, Inc. | Carestream Dental LLC |
| Device Name | Videa Caries Assist | Logicon Caries Detector |
| Classification Regulation | 892.2070 | 892.2070 |
| Product Code | MYN | MYN |
| Indications for Use | Videa Caries Assist is a computer-assisted detection (CADe) device that analyzes intraoral radiographs to identify and localize carious lesions. Videa Caries Assist is indicated for use by board licensed dentists for the concurrent review of bitewing (BW) radiographs acquired from adult patients aged 22 years or older. | Logicon Caries Detector is a software device that is an aid in the diagnosis of caries that have penetrated into the dentin on un-restored proximal surfaces of secondary dentition through the statistical analysis of digital intra-oral radiographic imagery. The device provides additional information for the clinician to use in his/her diagnosis of a tooth surface suspected of being carious. It is designed to work in conjunction with an existing CareStream Dental RVG Digital X-Ray Radiographic System with Dental Imaging Software (DIS) for Windows XP or higher. |
| Image Modality | X-Ray | X-Ray |
| Study Type | Bitewing images | Digital intra-oral radiographic imagery |
| Clinical Finding | Active and secondary caries at all penetration depths | Caries penetrating into dentin |
| Tooth Surface | Proximal, Buccal/Lingual, Occlusal, Root, Cervical | Proximal |
| Clinical Output | Message indicating if and how many carious lesions were detected; set of togglable bounding boxes around suspected lesions | Message indicating if a carious lesion was detected; an outline of the potential lesion site is shown |
| Patient Population | Adults ≥ 22 years of age | Adults ≥ 22 years of age |
| Intended User | US licensed dentists | Dentists |
| Development Technology | Supervised Deep Learning | Computer Vision |
| Image Source | X-Ray Sensor | X-Ray Sensor |
| Image Viewing | Image Viewer | CareStream Dental RVG Digital X-Ray Radiographic System with Dental Imaging Software (DIS) |

Table 1: Technological Comparison


7. PERFORMANCE DATA

Biocompatibility, Sterilization, and Reprocessing

Not applicable. The subject device is a software-only device. There are no direct or indirect patient-contacting components of the subject device. There are no sterile or reprocessed components.

Electrical Safety and Electromagnetic Compatibility (EMC)

Not applicable. The subject device is a software-only device. It contains no electric components, generates no electrical emissions, and uses no electrical energy of any type.

Software Verification and Validation Testing

Software verification and validation testing were conducted and documentation was provided as recommended by FDA's Guidance for Industry and FDA Staff, "Guidance for the Content of Premarket Submissions for Software Contained in Medical Devices." The software for this device was considered a moderate level of concern.

Bench Testing (Standalone Study)

A Standalone Performance Assessment was conducted to measure and report the performance of Videa Caries Assist by itself, in the absence of any interaction with a dentist. The dataset was 1034 adult radiographs collected from 10 US sites that were ground-truthed by three US board-certified dentists. The patients in the dataset were 53% female and 47% male, with 55% having age 22-40, 34% age 41-60, 9% age 61-75, and 2% over age 76. The number of lesions per image was: 0 lesions (39%), 1 lesion (22%), 2-3 lesions (26%), and 4+ lesions (13%). Image sensors included in the study were: DEXIS Platinum, DEXIS Titanium, Gendex GXS-700, Kodak RVG 6100, RVG 5200, RVG 6200, and Schick 33.

The standalone overall average Alternative Free-response Receiver Operating Characteristic Figure of Merit (AFROC FOM) was found to be 0.740 (95% confidence interval: 0.721, 0.760) with a corresponding average image-based Sensitivity of 70.8% and PPV of 59.5% (Table 2).

| | Mean | 95% Confidence Interval |
|---|---|---|
| Overall average FOM | 0.740 | (0.721, 0.760) |
| Overall average Se - image-based (%) | 70.8 | (68.0, 73.7) |
| Overall average PPV - image-based (%) | 59.5 | (56.5, 62.5) |
| *Overall average Se (%) - lesion-based (pooled)* | *73.6* | *(71.1, 76.0)* |
| *Overall average PPV (%) - lesion-based (pooled)* | *64.9* | *(62.3, 67.6)* |

Table 2: Standalone AFROC FOM, image-based Se, and PPV. Lesion-based estimates in italics.
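
For reference, the sensitivity and PPV figures in Table 2 follow the standard detection-metric definitions (these relations are general, not quoted from the submission); in the image-based analysis each radiograph contributes one decision, while the pooled lesion-based analysis counts individual lesion detections across all images.

```latex
% Standard definitions of sensitivity (Se) and positive predictive value (PPV)
% for a detection task; TP, FP, FN are true positives, false positives, and
% false negatives at the image or lesion level, respectively.
\mathrm{Se} = \frac{TP}{TP + FN},
\qquad
\mathrm{PPV} = \frac{TP}{TP + FP}
```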


We observed a False Positive Fraction (FPF) of 0.335 and a Non-Lesion Fraction (NLF) of 0.599. Comparing these results with the results from the reader study shows a decrease in the absolute number of false positives per image.

VCA's standalone performance was assessed against the following potential subject and image confounders: age, sex, number of lesions per image, image quality, imaging sensor model, effective resolution, bit-depth, and image size. We observed very good generalizability for all confounders except image sensor model, with all reported confidence intervals including 0.74, the average AFROC FOM calculated on the full dataset. We did observe a somewhat lower average AFROC FOM of 0.608 for the Schick 33 sensor. However, a sub-analysis of the reader study results for the Schick 33 images (n=28) found lower unaided reader performance for this sensor than the mean across all sensors, and an improvement in mean AFROC FOM for aided (0.706) versus unaided (0.614) reads that was very similar to that seen for the entire study dataset, alleviating any concerns.

Animal Testing

Not applicable. Animal studies are not necessary to establish the substantial equivalence of this device.

Clinical Data (Reader Study)

A fully crossed, randomized, multiple reader multiple case (MRMC) controlled study was performed to determine whether the diagnostic accuracy of readers aided by VCA is superior to reader accuracy when unaided by VCA, as determined by the AFROC Figure of Merit (AFROC FOM). The hypotheses to be tested were:

H0: AFROC FOM_aided - AFROC FOM_unaided ≤ 0

H1: AFROC FOM_aided - AFROC FOM_unaided > 0

where AFROC FOM_aided is the population-mean AFROC FOM for aided reads, and AFROC FOM_unaided is the corresponding population-mean AFROC FOM for unaided reads.
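
The document specifies this one-sided hypothesis but not the variance model used to test it; a fully crossed MRMC design is typically analyzed with an Obuchowski-Rockette or DBM model that accounts for both reader and case variability. The sketch below is a deliberately simplified, hypothetical illustration of the one-sided comparison using only per-reader FOM pairs (with made-up values), not a reproduction of the study's actual analysis.

```python
# Simplified, hypothetical illustration of testing H1: FOM_aided - FOM_unaided > 0.
# A real fully crossed MRMC analysis (e.g., Obuchowski-Rockette / DBM) also models
# case-level variability; this sketch only performs a paired comparison across
# readers, using made-up per-reader FOM values.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_readers = 21                                              # as in the reader study
fom_unaided = rng.normal(0.667, 0.03, n_readers)            # hypothetical per-reader FOMs
fom_aided = fom_unaided + rng.normal(0.072, 0.02, n_readers)

diff = fom_aided - fom_unaided
t_stat, p_one_sided = stats.ttest_rel(fom_aided, fom_unaided, alternative="greater")

# Two-sided 95% confidence interval on the mean reader-level difference
ci_low, ci_high = stats.t.interval(
    0.95, df=n_readers - 1, loc=diff.mean(), scale=stats.sem(diff)
)
print(f"mean difference = {diff.mean():.3f}, "
      f"95% CI = ({ci_low:.3f}, {ci_high:.3f}), one-sided p = {p_one_sided:.4g}")
```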

The dataset was 226 adult radiographs collected from 10 US sites that were ground-truthed by three US board-certified dentists. The patients in the dataset were 55% female and 45% male, with 49% having age 22-40, 38% age 41-60, 11% age 61-75, and 6% over age 76. Image sensors included in the study were: DEXIS Platinum, DEXIS Titanium, Gendex GXS-700, Kodak RVG 6100, RVG 5200, RVG 6200, and Schick 33.

The overall average AFROC FOM for reads aided by VCA was 0.739 as compared to 0.667 for unaided reads (Table 3). The difference was 0.072 (95% CI: 0.047, 0.097; p < 0.0001), thus rejecting the null hypothesis. This demonstrates that the performance of readers assisted by VCA was better than that of readers who were unassisted, thus meeting the primary study objective. The improvement in aided reader performance over unaided reader performance was seen for each of the 21 readers.


| | Aided | Unaided | Difference |
|---|---|---|---|
| Overall average FOM | 0.739 | 0.667 | 0.072 |
| 95% Confidence Interval | (0.705, 0.773) | (0.633, 0.701) | (0.047, 0.097) |

Table 3: Overall Image-based AFROC FOM for Aided vs Unaided reads

The image-based AFROC curve for aided and unaided reads is shown below in Figure 1.

Figure 1: Image-based AFROC curve of aided (blue) and unaided (orange) reading modalities.

[Figure: sensitivity (0.0-1.0) versus false positive fraction (0.0-1.0); the aided modality shows higher sensitivity at any given false positive fraction.]

Sensitivity was significantly increased for aided versus unaided reads for both the reader-averaged and the lesion-based analyses. The Non-Lesion Fraction (NLF) is a measure of the average number of false positive lesions expected per image. NLF was reduced in aided reads for both normal (no lesions) and abnormal (one or more lesions) images. The False Positive Fraction (FPF) is a measure of how many normal images have at least one false positive. Although the FPF is slightly increased for aided reads (0.33) versus unaided reads (0.29), this difference is small and not statistically significant (95% CI [-0.03, 0.12]). Furthermore, the device did demonstrate an overall improvement in reader performance as demonstrated by the primary endpoint analysis.
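
For clarity, the NLF and FPF described above correspond to the usual FROC-style summary measures; the forms below are standard and are stated here only to make the narrative concrete, not quoted from the submission:

```latex
% FROC-style summary measures consistent with the descriptions in the text
% (standard forms, not quoted from the submission).
\mathrm{NLF} = \frac{\#\{\text{false-positive (non-lesion) marks}\}}{\#\{\text{images}\}},
\qquad
\mathrm{FPF} = \frac{\#\{\text{normal images with at least one false-positive mark}\}}{\#\{\text{normal images}\}}
```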

Conclusion

The predicate Logicon Caries Detector (P980025) and the proposed Videa Caries Assist device have the same intended use, as they are both computer-assisted detection devices that accept dental radiographs as inputs and use Supervised Deep Learning to identify and highlight ROIs. Although there are technological differences, as discussed above these differences in technological characteristics do not raise different questions of safety and effectiveness, as the overall functionality as a reading aid for dental radiographs and the utility within the associated clinical workflows offered to dental professionals by Videa Caries Assist and Logicon Caries Detector are the same.

Software testing verified the device functioned as intended. The results of the Standalone Performance Assessment and Clinical Performance Assessment demonstrate that the performance of Videa Caries Assist is comparable to that of Logicon Caries Detector. Therefore, Videa Caries Assist can be found substantially equivalent to Logicon Caries Detector.

§ 892.2070 Medical image analyzer.

(a) Identification. Medical image analyzers, including computer-assisted/aided detection (CADe) devices for mammography breast cancer, ultrasound breast lesions, radiograph lung nodules, and radiograph dental caries detection, is a prescription device that is intended to identify, mark, highlight, or in any other manner direct the clinicians' attention to portions of a radiology image that may reveal abnormalities during interpretation of patient radiology images by the clinicians. This device incorporates pattern recognition and data analysis capabilities and operates on previously acquired medical images. This device is not intended to replace the review by a qualified radiologist, and is not intended to be used for triage, or to recommend diagnosis.

(b) Classification. Class II (special controls). The special controls for this device are:

(1) Design verification and validation must include:

(i) A detailed description of the image analysis algorithms including a description of the algorithm inputs and outputs, each major component or block, and algorithm limitations.

(ii) A detailed description of pre-specified performance testing methods and dataset(s) used to assess whether the device will improve reader performance as intended and to characterize the standalone device performance. Performance testing includes one or more standalone tests, side-by-side comparisons, or a reader study, as applicable.

(iii) Results from performance testing that demonstrate that the device improves reader performance in the intended use population when used in accordance with the instructions for use. The performance assessment must be based on appropriate diagnostic accuracy measures (e.g., receiver operator characteristic plot, sensitivity, specificity, predictive value, and diagnostic likelihood ratio). The test dataset must contain a sufficient number of cases from important cohorts (e.g., subsets defined by clinically relevant confounders, effect modifiers, concomitant diseases, and subsets defined by image acquisition characteristics) such that the performance estimates and confidence intervals of the device for these individual subsets can be characterized for the intended use population and imaging equipment.

(iv) Appropriate software documentation (e.g., device hazard analysis; software requirements specification document; software design specification document; traceability analysis; description of verification and validation activities including system level test protocol, pass/fail criteria, and results; and cybersecurity).

(2) Labeling must include the following:

(i) A detailed description of the patient population for which the device is indicated for use.

(ii) A detailed description of the intended reading protocol.

(iii) A detailed description of the intended user and user training that addresses appropriate reading protocols for the device.

(iv) A detailed description of the device inputs and outputs.

(v) A detailed description of compatible imaging hardware and imaging protocols.

(vi) Discussion of warnings, precautions, and limitations must include situations in which the device may fail or may not operate at its expected performance level (e.g., poor image quality or for certain subpopulations), as applicable.

(vii) Device operating instructions.

(viii) A detailed summary of the performance testing, including: test methods, dataset characteristics, results, and a summary of sub-analyses on case distributions stratified by relevant confounders, such as lesion and organ characteristics, disease stages, and imaging equipment.