The ULab Systems UDesign is intended for use as a medical front-end device providing tools for management of orthodontic models, systematic inspection, detailed analysis, treatment simulation and virtual design of a series of dental casts, which may be used for sequential aligner trays or retainers, based on 3D models of the patient's dentition before the start of an orthodontic treatment. It can also be applied during the treatment to inspect and analyze the progress of the treatment. It can be used at the end of the treatment to evaluate whether the outcome is consistent with the planned/desired treatment objectives. The use of ULab Systems UDesign requires the necessary training and domain knowledge in the practice of orthodontics, as well as dedicated training in the use of the software.
The ULab Systems UDesign is orthodontic diagnosis and treatment simulation software for use by dental professionals. UDesign imports patient 3-D digital scans and allows the user to diagnose the orthodontic treatment needs of the patient and rapidly develop a treatment plan. The output of the treatment plan may be downloaded as files in standard stereolithographic (STL) format for fabrication of dental casts, which may be used to fabricate sequential aligner trays or retainers.
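The summary above says the treatment-plan output is downloaded as stereolithographic (STL) files for cast fabrication. As a point of reference for that format (not ULab's code; the function and solid name here are hypothetical), a minimal sketch of the ASCII STL variant looks like this — each facet is a unit normal plus three vertices:

```python
# Hypothetical sketch of the ASCII STL format mentioned in the summary.
# Not ULab code; illustrates the file structure only.

def write_ascii_stl(name, facets):
    """facets: list of (normal, (v1, v2, v3)) tuples of 3-component floats."""
    lines = [f"solid {name}"]
    for normal, verts in facets:
        lines.append(f"  facet normal {normal[0]:.6e} {normal[1]:.6e} {normal[2]:.6e}")
        lines.append("    outer loop")
        for v in verts:
            lines.append(f"      vertex {v[0]:.6e} {v[1]:.6e} {v[2]:.6e}")
        lines.append("    endloop")
        lines.append("  endfacet")
    lines.append(f"endsolid {name}")
    return "\n".join(lines)

# One triangular facet in the z=0 plane, normal pointing along +z.
tri = [((0.0, 0.0, 1.0), ((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)))]
stl_text = write_ascii_stl("cast", tri)
print(stl_text.splitlines()[0])  # -> solid cast
```

A real dental-cast export would contain many thousands of such facets; the format itself carries only triangle geometry, which is why it is well suited as a vendor-neutral hand-off to fabrication equipment.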
The provided text does not contain acceptance criteria or a study that proves the device meets specific acceptance criteria in the typical sense of a clinical performance study. Instead, it discusses the software validation and verification.
Here's a breakdown of the requested information based on the provided text, highlighting what is present and what is absent:
Acceptance Criteria and Device Performance Study Information
1. A table of acceptance criteria and the reported device performance
The document states: "All test results met acceptance criteria, demonstrating the ULab Systems UDesign performs as intended, raises no new or different questions of risk and is substantially equivalent to the predicate device."
- Acceptance Criteria: Not explicitly detailed in the document. It mentions "acceptance criteria" generally for software and integration verification and validation testing, but specific metrics (e.g., accuracy percentages, error margins for tooth movement) are not provided.
- Reported Device Performance: Again, not explicitly detailed. The statement is a general affirmation that the software performed as intended during the validation process. There are no quantitative performance metrics (e.g., this software can predict tooth movement with X% accuracy).
2. Sample size used for the test set and the data provenance (e.g. country of origin of the data, retrospective or prospective)
- Sample Size for Test Set: Not specified. The document does not refer to a "test set" in the context of patient data, but rather "testing" for software validation.
- Data Provenance: Not specified.
3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts (e.g. radiologist with 10 years of experience)
- Not applicable as no "test set" with ground truth established by experts is described for performance evaluation. The "ground truth" mentioned in the context of the device's function is the patient's dentition (3D models).
4. Adjudication method (e.g. 2+1, 3+1, none) for the test set
- Not applicable. No expert adjudication process for a test set is described.
5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done, what was the effect size of how much human readers improve with AI vs. without AI assistance?
- No MRMC comparative effectiveness study was done or reported. The document focuses on the software as a tool for "orthodontic diagnosis and treatment simulation" and does not describe studies involving human readers or AI assistance in decision-making or improvement. The device itself performs calculations and simulations, not interpretation for a human reader to then act upon (though it informs the orthodontist).
6. If a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done
- The document describes the ULab Systems UDesign as "orthodontic diagnosis and treatment simulation software." Its purpose is to output treatment plans, which may be used to fabricate dental casts. While the software itself performs calculations and simulations (a standalone algorithmic function), the use of ULab Systems UDesign "requires the necessary training and domain knowledge in the practice of orthodontics," as well as dedicated training in the use of the software. This indicates that it is a tool for a trained professional, implying a human-in-the-loop process for clinical application, even if the core simulation is standalone. However, no specific "standalone performance study" in the sense of accuracy against a clinical outcome is described. The performance data section refers only to "Software and integration verification and validation testing."
7. The type of ground truth used (expert consensus, pathology, outcomes data, etc.)
- For the software functions, the "ground truth" would be the mathematical and logical correctness of its calculations and simulations based on the input 3D models of the patient's dentition. The document doesn't detail how this "ground truth" was established or used for the software validation, other than stating testing was performed. It's safe to assume it's based on known geometric principles and potentially clinical scenarios, but not explicitly stated as expert consensus, pathology, or outcomes data.
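Since the document only states that testing met acceptance criteria without detailing the criteria, a hedged illustration may help: geometric-correctness checks of the kind suggested above are often written as automated tests with a numeric tolerance. The sketch below is illustrative only (it is not ULab's test suite, and the tolerance is a hypothetical acceptance criterion): a simulated rigid tooth movement, here a rotation, must preserve pairwise distances between landmark points.

```python
# Illustrative sketch, NOT ULab's actual verification tests.
# Checks that a simulated rigid movement (rotation about z) preserves
# inter-point distances to within a hypothetical acceptance tolerance.
import math

def rotate_z(p, theta):
    """Rotate a 3D point about the z axis by theta radians."""
    x, y, z = p
    c, s = math.cos(theta), math.sin(theta)
    return (c * x - s * y, s * x + c * y, z)

def dist(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))

points = [(1.0, 0.0, 0.0), (0.0, 2.0, 0.0), (0.0, 0.0, 3.0)]
moved = [rotate_z(p, math.radians(15)) for p in points]

TOL = 1e-9  # hypothetical acceptance criterion
for i in range(len(points)):
    for j in range(i + 1, len(points)):
        assert abs(dist(points[i], points[j]) - dist(moved[i], moved[j])) < TOL
```

An actual verification suite for such software would presumably combine many checks of this kind (geometric invariants, collision detection, export fidelity) with clinically motivated scenarios, but the document does not describe its contents.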
8. The sample size for the training set
- Not applicable. The document describes software for design and simulation, not a machine learning or AI model that requires a "training set" of data in the typical sense. The "software and integration verification and validation testing" would be for the developed software code.
9. How the ground truth for the training set was established
- Not applicable, as there is no "training set" for an AI model mentioned.
Summary of what's provided:
The submission focuses on the regulatory clearance process for a medical device (software) based on:
- Its intended use and indications for use.
- Comparisons to a predicate device (3Shape Ortho System) showing substantial equivalence in technological characteristics.
- General affirmance that "Software and integration verification and validation testing was performed" in accordance with FDA guidance, and "All test results met acceptance criteria."
The document does not delve into the clinical performance metrics of the software in real-world or simulated orthodontic treatment accuracy, nor does it describe AI-specific validation or clinical efficacy studies with expert review or patient outcomes. The "performance data" discussed is about software engineering validation, not clinical performance for treatment simulation or diagnosis accuracy.
§ 872.5470 Orthodontic plastic bracket.
(a) Identification. An orthodontic plastic bracket is a plastic device intended to be bonded to a tooth to apply pressure to a tooth from a flexible orthodontic wire to alter its position.
(b) Classification. Class II.