SugarBug is a radiological, automated, concurrent read, computer-assisted detection software intended to aid in the detection and segmentation of caries on bitewing radiographs. The device provides additional information for the dentist to use in their diagnosis of a tooth surface suspected of being carious. Sugarbug is intended to be used on patients 18 years and older. The device is not intended as a replacement for a complete dentist's review or their clinical judgment that takes into account other relevant information from the image, patient history, and actual in vivo clinical assessment.
SugarBug is a software as a medical device (SaMD) that uses machine learning to label features that the reader should examine for evidence of decay. SugarBug uses a convolutional neural network to perform a semantic segmentation task: the algorithm evaluates every pixel in an image and assigns it a probability of containing decay. A threshold is then used to determine which pixels are labeled in the device's output. The software reads the selected image using local processing; images are not imported or sent to a cloud server at any time during routine use.
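To make the pixel-labeling step concrete, the snippet below is a minimal, illustrative sketch of converting a per-pixel probability map into a binary overlay mask with a fixed threshold. The function name, the 0.5 cutoff, and the NumPy-based workflow are assumptions for illustration; the submission does not disclose SugarBug's actual implementation or operating threshold.

```python
import numpy as np

def probabilities_to_mask(prob_map: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Convert a per-pixel caries probability map into a binary label mask.

    prob_map  : 2-D array of values in [0, 1], one probability per pixel
    threshold : pixels with probability >= threshold are marked as suspected decay
    """
    if prob_map.ndim != 2:
        raise ValueError("expected a single-channel 2-D probability map")
    return (prob_map >= threshold).astype(np.uint8)

# Tiny synthetic probability map (not real model output)
probs = np.array([[0.10, 0.62, 0.95],
                  [0.05, 0.48, 0.71],
                  [0.02, 0.03, 0.55]])
mask = probabilities_to_mask(probs, threshold=0.5)
print(mask)  # 1s mark the pixels an overlay would highlight
```

In a concurrent-read workflow of this kind, such a mask would typically be rendered as an overlay on the radiograph for the dentist to review alongside their own read.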
Here's a breakdown of the acceptance criteria and the study details for the SugarBug (1.x) device, based on the provided FDA 510(k) clearance letter:
1. Acceptance Criteria and Reported Device Performance
The direct "acceptance criteria" are not explicitly stated in a quantitative table for this device. However, based on the clinical study results and the stated objectives, the implicit acceptance criteria would have been:
- Statistically significant improvement in overall diagnostic performance (wAFROC-AUC) for aided readers compared to unaided readers.
- Demonstrated improvement in lesion-level sensitivity for aided readers.
- Maintained or improved lesion annotation quality (DICE scores) with aid.
- Standalone performance metrics (sensitivity, FPPI, DICE coefficient) within an acceptable range.
Here's a table summarizing the reported device performance against these implicit criteria (an illustrative sketch of how the DICE metric is computed follows the table):
| Metric | Acceptance Criteria (Implicit) | Reported Unaided Reader Performance | Reported Aided Reader Performance | Reported Difference (Aided vs. Unaided) | Statistical Significance | Standalone Device Performance |
|---|---|---|---|---|---|---|
| MRMC Study (Aided vs. Unaided) | ||||||
| wAFROC-AUC (Primary Endpoint) | Statistically significant improvement with aid | 0.659 (0.611, 0.707) | 0.725 (0.683, 0.767) | 0.066 (0.030, 0.102) | p = 0.001 (Significant) | N/A |
| Lesion-Level Sensitivity | Statistically significant improvement with aid | 0.540 (0.445, 0.621) | 0.674 (0.615, 0.728) | 0.134 (0.066, 0.206) | Significant | N/A |
| Mean FPPI | Maintain or improve (small or negative difference) | 0.328 (0.102, 0.331) | 0.325 (0.128, 0.310) | -0.003 (-0.103, 0.086) | Not statistically significant (small improvement) | N/A |
| Mean DICE Scores (Readers) | Improvement in lesion delineation | 0.695 (0.688, 0.702) | 0.740 (0.733, 0.747) | 0.045 (0.035, 0.055) | N/A (modest improvement) | N/A |
| Standalone Study | ||||||
| Lesion-level sensitivity | Acceptable range | N/A | N/A | N/A | N/A | 0.686 (0.655, 0.717) |
| Mean FPPI | Acceptable range | N/A | N/A | N/A | N/A | 0.231 (0.111, 0.303) |
| DICE coefficient (vs. ground truth) | Acceptable range | N/A | N/A | N/A | N/A | 0.746 (0.724, 0.768) |
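For reference, the DICE coefficient reported above is a standard overlap metric between a predicted segmentation and the ground-truth annotation, defined as 2|A∩B| / (|A| + |B|). The sketch below is illustrative only and is not taken from the submission; the mask representation and the handling of empty masks are assumptions.

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """DICE = 2 * |pred AND truth| / (|pred| + |truth|) on binary masks."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    denominator = pred.sum() + truth.sum()
    if denominator == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * intersection / denominator

# Toy masks (not study data): overlap of 2 pixels, 3 + 2 labeled pixels in total
pred  = np.array([[1, 1, 0], [0, 1, 0], [0, 0, 0]])
truth = np.array([[1, 1, 0], [0, 0, 0], [0, 0, 0]])
print(dice_coefficient(pred, truth))  # 0.8
```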
2. Sample Size Used for the Test Set and Data Provenance
- Sample Size for Test Set (MRMC Study): 300 bitewing radiographic images.
- Sample Size for Test Set (Standalone Study): 400 de-identified images.
- Data Provenance: Retrospectively collected from routine dental examinations of patients aged 18 and older in the US. The images were sampled to be representative of a range of x-ray sensor types: Vatech HD (29%), iSensor H2 (11%), Schick 33 (45%), Dexis Platinum (15%).
3. Number of Experts Used to Establish the Ground Truth for the Test Set and Their Qualifications
- Number of Experts: 3 US licensed general dentists.
- Qualifications: Mean of 27 years of clinical experience.
4. Adjudication Method for the Test Set
- Adjudication Method (Ground Truth): Consensus labels of the 3 US licensed general dentists.
5. Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study
- Was a MRMC study done? Yes.
- Effect size of human reader improvement with AI assistance vs. without:
- wAFROC-AUC Improvement: 0.066 (0.030, 0.102) with a p-value of 0.001.
- Lesion-Level Sensitivity Improvement: 0.134 (0.066, 0.206).
6. Standalone (Algorithm Only) Performance Study
- Was a standalone study done? Yes.
- Performance metrics (an illustrative sketch of how these detection metrics are typically computed follows this list):
- Lesion-level sensitivity: 0.686 (0.655, 0.717)
- Mean FPPI: 0.231 (0.111, 0.303)
- DICE coefficient versus ground truth: 0.746 (0.724, 0.768)
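As a rough illustration of how these two detection metrics are typically defined (the submission does not spell out its exact lesion-matching rules, so the per-image bookkeeping below is an assumption): lesion-level sensitivity is the fraction of ground-truth lesions correctly localized, and FPPI is the mean number of false-positive marks per image.

```python
from dataclasses import dataclass

@dataclass
class ImageResult:
    """Per-image detection tallies (hypothetical structure, for illustration)."""
    lesions_total: int     # ground-truth lesions on this image
    lesions_detected: int  # ground-truth lesions correctly localized
    false_positives: int   # marks that match no ground-truth lesion

def lesion_sensitivity(results: list[ImageResult]) -> float:
    detected = sum(r.lesions_detected for r in results)
    total = sum(r.lesions_total for r in results)
    return detected / total if total else 0.0

def mean_fppi(results: list[ImageResult]) -> float:
    return sum(r.false_positives for r in results) / len(results)

# Toy example over three images (not study data)
results = [ImageResult(2, 2, 0), ImageResult(1, 0, 1), ImageResult(0, 0, 0)]
print(lesion_sensitivity(results))  # ~0.667: 2 of 3 lesions found
print(mean_fppi(results))           # ~0.333: 1 false positive across 3 images
```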
7. Type of Ground Truth Used
- Type of Ground Truth: Expert consensus (established by 3 US licensed general dentists).
8. Sample Size for the Training Set
- The document does not explicitly state the sample size used for the training set. It only describes the test sets.
9. How the Ground Truth for the Training Set Was Established
- The document does not explicitly state how the ground truth for the training set was established. It notes only that the standalone test data were "collected and labeled in the same procedure as the MRMC study," implying expert consensus was used for that test set, but it does not specify the process for the training data.
FDA 510(k) Clearance Letter - SugarBug (1.x)
U.S. Food & Drug Administration
10903 New Hampshire Avenue
Silver Spring, MD 20993
www.fda.gov
Doc ID # 04017.08.02
November 7, 2025
Bench7 Inc.
℅ Adam Heroux
Regulatory Consultant
Highland Biomedical Inc.
4190 Grove St
DENVER, CO 80211
Re: K250264
Trade/Device Name: SugarBug (1.x)
Regulation Number: 21 CFR 892.2070
Regulation Name: Medical Image Analyzer
Regulatory Class: Class II
Product Code: MYN
Dated: January 24, 2025
Received: October 7, 2025
Dear Adam Heroux:
We have reviewed your section 510(k) premarket notification of intent to market the device referenced above and have determined the device is substantially equivalent (for the indications for use stated in the enclosure) to legally marketed predicate devices marketed in interstate commerce prior to May 28, 1976, the enactment date of the Medical Device Amendments, or to devices that have been reclassified in accordance with the provisions of the Federal Food, Drug, and Cosmetic Act (the Act) that do not require approval of a premarket approval application (PMA). You may, therefore, market the device, subject to the general controls provisions of the Act. Although this letter refers to your product as a device, please be aware that some cleared products may instead be combination products. The 510(k) Premarket Notification Database available at https://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfpmn/pmn.cfm identifies combination product submissions. The general controls provisions of the Act include requirements for annual registration, listing of devices, good manufacturing practice, labeling, and prohibitions against misbranding and adulteration. Please note: CDRH does not evaluate information related to contract liability warranties. We remind you, however, that device labeling must be truthful and not misleading.
If your device is classified (see above) into either class II (Special Controls) or class III (PMA), it may be subject to additional controls. Existing major regulations affecting your device can be found in the Code of Federal Regulations, Title 21, Parts 800 to 898. In addition, FDA may publish further announcements concerning your device in the Federal Register.
Additional information about changes that may require a new premarket notification are provided in the FDA guidance documents entitled "Deciding When to Submit a 510(k) for a Change to an Existing Device" (https://www.fda.gov/media/99812/download) and "Deciding When to Submit a 510(k) for a Software Change to an Existing Device" (https://www.fda.gov/media/99785/download).
Your device is also subject to, among other requirements, the Quality System (QS) regulation (21 CFR Part 820), which includes, but is not limited to, 21 CFR 820.30, Design controls; 21 CFR 820.90, Nonconforming product; and 21 CFR 820.100, Corrective and preventive action. Please note that regardless of whether a change requires premarket review, the QS regulation requires device manufacturers to review and approve changes to device design and production (21 CFR 820.30 and 21 CFR 820.70) and document changes and approvals in the device master record (21 CFR 820.181).
Please be advised that FDA's issuance of a substantial equivalence determination does not mean that FDA has made a determination that your device complies with other requirements of the Act or any Federal statutes and regulations administered by other Federal agencies. You must comply with all the Act's requirements, including, but not limited to: registration and listing (21 CFR Part 807); labeling (21 CFR Part 801); medical device reporting (reporting of medical device-related adverse events) (21 CFR Part 803) for devices or postmarketing safety reporting (21 CFR Part 4, Subpart B) for combination products (see https://www.fda.gov/combination-products/guidance-regulatory-information/postmarketing-safety-reporting-combination-products); good manufacturing practice requirements as set forth in the quality systems (QS) regulation (21 CFR Part 820) for devices or current good manufacturing practices (21 CFR Part 4, Subpart A) for combination products; and, if applicable, the electronic product radiation control provisions (Sections 531-542 of the Act); 21 CFR Parts 1000-1050.
All medical devices, including Class I and unclassified devices and combination product device constituent parts are required to be in compliance with the final Unique Device Identification System rule ("UDI Rule"). The UDI Rule requires, among other things, that a device bear a unique device identifier (UDI) on its label and package (21 CFR 801.20(a)) unless an exception or alternative applies (21 CFR 801.20(b)) and that the dates on the device label be formatted in accordance with 21 CFR 801.18. The UDI Rule (21 CFR 830.300(a) and 830.320(b)) also requires that certain information be submitted to the Global Unique Device Identification Database (GUDID) (21 CFR Part 830 Subpart E). For additional information on these requirements, please see the UDI System webpage at https://www.fda.gov/medical-devices/device-advice-comprehensive-regulatory-assistance/unique-device-identification-system-udi-system.
Also, please note the regulation entitled, "Misbranding by reference to premarket notification" (21 CFR 807.97). For questions regarding the reporting of adverse events under the MDR regulation (21 CFR Part 803), please go to https://www.fda.gov/medical-devices/medical-device-safety/medical-device-reporting-mdr-how-report-medical-device-problems.
For comprehensive regulatory information about medical devices and radiation-emitting products, including information about labeling regulations, please see Device Advice (https://www.fda.gov/medical-devices/device-advice-comprehensive-regulatory-assistance) and CDRH Learn (https://www.fda.gov/training-and-continuing-education/cdrh-learn). Additionally, you may contact the Division of Industry and Consumer Education (DICE) to ask a question about a specific regulatory topic. See the DICE website (https://www.fda.gov/medical-devices/device-advice-comprehensive-regulatory-assistance/contact-us-division-industry-and-consumer-education-dice) for more information or contact DICE by email (DICE@fda.hhs.gov) or phone (1-800-638-2041 or 301-796-7100).
Sincerely,
Lu Jiang, Ph.D.
Assistant Director
Diagnostic X-ray Systems Team
DHT8B: Division of Radiologic Imaging
Devices and Electronic Products
OHT8: Office of Radiological Health
Office of Product Evaluation and Quality
Center for Devices and Radiological Health
Enclosure
DEPARTMENT OF HEALTH AND HUMAN SERVICES
Food and Drug Administration
Form Approved: OMB No. 0910-0120
Expiration Date: 07/31/2026
See PRA Statement below.
Indications for Use
510(k) Number (if known)
K250264
Device Name
SugarBug (1.x)
Indications for Use (Describe)
SugarBug is a radiological, automated, concurrent read, computer-assisted detection software intended to aid in the detection and segmentation of caries on bitewing radiographs. The device provides additional information for the dentist to use in their diagnosis of a tooth surface suspected of being carious. Sugarbug is intended to be used on patients 18 years and older. The device is not intended as a replacement for a complete dentist's review or their clinical judgment that takes into account other relevant information from the image, patient history, and actual in vivo clinical assessment.
Type of Use (Select one or both, as applicable)
☑ Prescription Use (Part 21 CFR 801 Subpart D)
☐ Over-The-Counter Use (21 CFR 801 Subpart C)
CONTINUE ON A SEPARATE PAGE IF NEEDED.
This section applies only to requirements of the Paperwork Reduction Act of 1995.
DO NOT SEND YOUR COMPLETED FORM TO THE PRA STAFF EMAIL ADDRESS BELOW.
The burden time for this collection of information is estimated to average 79 hours per response, including the time to review instructions, search existing data sources, gather and maintain the data needed and complete and review the collection of information. Send comments regarding this burden estimate or any other aspect of this information collection, including suggestions for reducing this burden, to:
Department of Health and Human Services
Food and Drug Administration
Office of Chief Information Officer
Paperwork Reduction Act (PRA) Staff
PRAStaff@fda.hhs.gov
"An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB number."
FORM FDA 3881 (8/23)
510(k) Summary
510(k)# K250264
03NOV2025
Contact Details - 21 CFR 807.92(a)(1)
Applicant Information:
Bench7 Inc.
929 Parkview Drive
Bismarck ND 58501 United States
701-426-4614
Dr. John Persson
john@bench7tech.com
Correspondent:
Highland Biomedical Inc.
4190 Grove St
Denver CO 80211 United States
617-823-0515
Adam Heroux
adam.heroux@highlandbiomed.com
Device Name - 21 CFR 807.92(a)(2)
Device Trade Name: SugarBug (1.x)
Common Name: Medical image analyzer
Classification Name: Medical Image Analyzer
Regulation Number: 21 CFR 892.2070
Product Code(s): MYN
Legally Marketed Predicate Devices - 21 CFR 807.92(a)(3)
Predicate # K222746
Predicate Trade Name: Overjet Caries Assist
Product Code: MYN
Device Description Summary - 21 CFR 807.92(a)(4)
SugarBug is a software as a medical device (SaMD) that uses machine learning to label features that the reader should examine for evidence of decay. SugarBug uses a convolutional neural network to perform a semantic segmentation task: the algorithm evaluates every pixel in an image and assigns it a probability of containing decay. A threshold is then used to determine which pixels are labeled in the device's output. The software reads the selected image using local processing; images are not imported or sent to a cloud server at any time during routine use.
Intended Use/Indications for Use - 21 CFR 807.92(a)(5)
SugarBug is a radiological, automated, concurrent read, computer-assisted detection software intended to aid in the detection and segmentation of caries on bitewing radiographs. The device provides additional information for the dentist to use in their diagnosis of a tooth surface suspected of being carious. Sugarbug is intended to be used on patients 18 years and older. The device is not intended as a replacement for a complete dentist's review or their clinical judgment that takes into account other relevant information from the image, patient history, and actual in vivo clinical assessment.
Indications for Use Comparison - 21 CFR 807.92(a)(5)
The indications for use for the subject device are substantially similar to those of the predicate device. The primary differences relate to the indicated patient ages and radiograph types. While minor differences exist between the proposed and predicate devices, these differences do not raise new questions of safety or effectiveness and do not change the device's fundamental intended use.
Technological Comparison - 21 CFR 807.92(a)(6)
The following information provides a summary of how the technological characteristics of the devices compare. The SugarBug software is similar to the predicate device in the following respects:
- Same intended use environment and user
- Similar output of caries detection via segmentation
The SugarBug software has the following differences from the predicate device:
- Different processing location - SugarBug processes images locally without requiring a cloud connection
- Different radiograph type - SugarBug is for use with digital files of bitewing radiographs, while the predicate processes both bitewing and periapical radiographs
- Different processing input - SugarBug processes screen captures as opposed to image file input
- Different dental X-ray sensors included in the product development and validation
- Different minimum image resolution - SugarBug can process images with a resolution as low as 300 pixels, while the predicate requires a minimum resolution of 500 pixels
SugarBug has undergone software and performance testing to ensure that any differences in the technological characteristics do not raise different questions of safety and effectiveness.
Non-Clinical and/or Clinical Tests Summary & Conclusions - 21 CFR 807.92(b)
Summary of Non-clinical Testing:
Non-clinical testing consisted of Software Verification and Validation testing at the unit, integration, and system levels to demonstrate that software requirements were implemented.
Standalone testing was conducted on a dataset of 400 de-identified images collected and labeled using the same procedure as the MRMC study discussed in the Clinical Testing Summary. Of the 400 images, 192 contained caries (481 total lesions) while 208 were caries free. SugarBug's lesion-level sensitivity and mean FPPI were 0.686 (0.655, 0.717) and 0.231 (0.111, 0.303), respectively. The DICE coefficient versus ground truth was 0.746 (0.724, 0.768).
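As a back-of-envelope plausibility check (an illustration based only on the reported point estimates, and assuming FPPI was averaged over all 400 images, which the summary does not state), these figures imply roughly the following counts:

```python
# Rough counts implied by the reported standalone point estimates (illustrative only)
total_lesions = 481   # lesions contained in the 192 caries-positive images
total_images = 400    # full standalone test set
sensitivity = 0.686
fppi = 0.231

print(round(sensitivity * total_lesions))  # ~330 lesions correctly localized
print(round(fppi * total_images))          # ~92 false-positive marks across the set
```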
Summary of Clinical Testing:
Clinical Study Design and Objectives:
A retrospective, multi-reader, multi-case (MRMC) study was conducted to compare the diagnostic performance of dental practitioners (readers) when aided by the SugarBug software to their performance when unaided. 12 US licensed dentists served as readers, each evaluating the same set of 300 bitewing radiographs under two conditions: (1) unaided and (2) aided by SugarBug. Readers were asked to identify areas of suspected decay by filling in the area with an annotation tool. A washout period of 30 days was applied between the two reading sessions to minimize recall bias. Images were presented to each reader in a randomized order. The reading sequence (aided first versus unaided first) was also randomized. Each reader was asked to provide a confidence score for each lesion they labeled. The primary objective was to determine whether SugarBug improves diagnostic performance, as measured by weighted Alternative Free-response Receiver Operating Characteristic (wAFROC) area under the curve (AUC). Secondary objectives included evaluating reader changes in sensitivity, specificity, and annotation quality (DICE scores), as well as assessing standalone model performance of SugarBug.
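The weighted AFROC (wAFROC) figure of merit used as the primary endpoint is not defined in the document. The sketch below shows one common empirical formulation, in which each lesion's confidence rating is compared against the highest-rated false positive on each caries-free image, with lesion weights summing to 1 per diseased image. The data layout, equal per-lesion weighting, and tie handling are assumptions for illustration; they are not the study's actual statistical implementation, which would normally be run with validated MRMC analysis software.

```python
NEG_INF = float("-inf")  # rating for unmarked lesions and images with no false positives

def wafroc_fom(normal_fp_ratings: list[float],
               diseased_lesion_ratings: list[list[float]]) -> float:
    """Empirical weighted-AFROC figure of merit (illustrative, equal lesion weights).

    normal_fp_ratings       : per caries-free image, the highest false-positive rating
                              (NEG_INF if the image was left unmarked)
    diseased_lesion_ratings : per caries-containing image, one rating per ground-truth
                              lesion (NEG_INF for lesions the reader did not mark)
    """
    def kernel(fp: float, tp: float) -> float:
        if tp > fp:
            return 1.0
        if tp == fp:
            return 0.5
        return 0.0

    total = 0.0
    for fp in normal_fp_ratings:
        for lesions in diseased_lesion_ratings:
            weight = 1.0 / len(lesions)  # equal weights within a diseased image
            total += sum(weight * kernel(fp, tp) for tp in lesions)
    return total / (len(normal_fp_ratings) * len(diseased_lesion_ratings))

# Toy example: 2 caries-free and 2 caries-containing images (not study data)
print(wafroc_fom([30.0, NEG_INF], [[80.0, NEG_INF], [55.0]]))  # 0.8125
```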
Clinical Study Population and Data Collection:
300 de-identified bitewing radiographic images of patients aged 18 and older from the US were included. No procedures were performed solely for this study. All images were retrospectively collected from routine dental examinations. The images were sampled to be representative of a range of x-ray sensor types. Within the set, the relative representation of sensor types was: Vatech HD (29%), iSensor H2 (11%), Schick 33 (45%), Dexis Platinum (15%). The patients' ages ranged from 18 to 87 with a mean age of 42. Approximately 47% of the patients were male and 53% were female. 133 images in the dataset contained caries while 167 were caries free. Annotations were carried out by 12 US licensed dentists. Ground truth was established by the consensus labels of 3 US licensed general dentists with an average of 27 years of clinical experience.
Clinical Study Results:
- Primary Endpoint (wAFROC-AUC): The mean unaided reader wAFROC-AUC was 0.659 (0.611, 0.707), while the mean aided reader wAFROC-AUC was 0.725 (0.683, 0.767), an improvement of 0.066 (0.030, 0.102) with a p-value of 0.001. This demonstrates a statistically significant improvement in diagnostic accuracy when readers were aided by SugarBug.
- Lesion-Level Sensitivity and FPPI: Aided readers' lesion-level mean sensitivity was 0.674 (0.615, 0.728) versus 0.540 (0.445, 0.621) for unaided readers, a statistically significant improvement of 0.134 (0.066, 0.206). Aided readers showed a mean FPPI of 0.325 (0.128, 0.310) versus 0.328 (0.102, 0.331) for unaided readers, a very small improvement of -0.003 (-0.103, 0.086) that was not statistically significant.
- DICE Scores (Readers): Mean DICE scores (lesion annotation similarity relative to ground truth) were 0.695 (0.688, 0.702) for unaided readings and 0.740 (0.733, 0.747) for aided readings, a mean difference of 0.045 (0.035, 0.055). Although this difference is modest, it suggests an improvement in lesion delineation.
Clinical Study Conclusions:
The study results indicate that SugarBug-aided readers exhibit statistically significant improvements in overall diagnostic performance (wAFROC-AUC) for the detection of dental caries compared to unaided readers.
Overall Conclusions:
Non-clinical and clinical testing have demonstrated that SugarBug's performance is substantially equivalent to that of the predicate device.
§ 892.2070 Medical image analyzer.
(a) Identification. Medical image analyzers, including computer-assisted/aided detection (CADe) devices for mammography breast cancer, ultrasound breast lesions, radiograph lung nodules, and radiograph dental caries detection, is a prescription device that is intended to identify, mark, highlight, or in any other manner direct the clinicians' attention to portions of a radiology image that may reveal abnormalities during interpretation of patient radiology images by the clinicians. This device incorporates pattern recognition and data analysis capabilities and operates on previously acquired medical images. This device is not intended to replace the review by a qualified radiologist, and is not intended to be used for triage, or to recommend diagnosis.
(b) Classification. Class II (special controls). The special controls for this device are:
(1) Design verification and validation must include:
(i) A detailed description of the image analysis algorithms including a description of the algorithm inputs and outputs, each major component or block, and algorithm limitations.
(ii) A detailed description of pre-specified performance testing methods and dataset(s) used to assess whether the device will improve reader performance as intended and to characterize the standalone device performance. Performance testing includes one or more standalone tests, side-by-side comparisons, or a reader study, as applicable.
(iii) Results from performance testing that demonstrate that the device improves reader performance in the intended use population when used in accordance with the instructions for use. The performance assessment must be based on appropriate diagnostic accuracy measures (e.g., receiver operator characteristic plot, sensitivity, specificity, predictive value, and diagnostic likelihood ratio). The test dataset must contain a sufficient number of cases from important cohorts (e.g., subsets defined by clinically relevant confounders, effect modifiers, concomitant diseases, and subsets defined by image acquisition characteristics) such that the performance estimates and confidence intervals of the device for these individual subsets can be characterized for the intended use population and imaging equipment.
(iv) Appropriate software documentation (e.g., device hazard analysis; software requirements specification document; software design specification document; traceability analysis; description of verification and validation activities including system level test protocol, pass/fail criteria, and results; and cybersecurity).
(2) Labeling must include the following:
(i) A detailed description of the patient population for which the device is indicated for use.
(ii) A detailed description of the intended reading protocol.
(iii) A detailed description of the intended user and user training that addresses appropriate reading protocols for the device.
(iv) A detailed description of the device inputs and outputs.
(v) A detailed description of compatible imaging hardware and imaging protocols.
(vi) Discussion of warnings, precautions, and limitations must include situations in which the device may fail or may not operate at its expected performance level (e.g., poor image quality or for certain subpopulations), as applicable.
(vii) Device operating instructions.
(viii) A detailed summary of the performance testing, including: test methods, dataset characteristics, results, and a summary of sub-analyses on case distributions stratified by relevant confounders, such as lesion and organ characteristics, disease stages, and imaging equipment.