exoplan is medical software intended to support the planning of dental implants by visualizing the implant placement within images of the patient's anatomy. The process is based on CT/CBCT data sets originating from other medical devices and can be supported by optical scans of the patient's anatomy as well as a virtual prosthetic proposal.
exoplan allows the design of surgical guides to support the placement of endosseous dental implants in guided surgery. The design of surgical guides is based on 3D surface data representing the patient's situation and on approved implant positions. The software exports the planning and design results as geometrical data and a digital 3D model of the surgical guide to support the manufacture of a separate physical product. exoplan does not extend or change the indications of dental implants. Using a surgical guide designed with the software does not change the due diligence required compared to conventional (non-guided) surgery.
The software is intended to be used only by dental professionals with sufficient medical training in dental implantology and surgical dentistry in office environments suitable for reading diagnostic dental DICOM data sets. exoplan shall not be used for any purpose other than planning dental implant placement or design of surgical guides.
exoplan is a standalone software application for the purpose of pre-operative implant planning and design of surgical guides to support the surgical intervention.
The software application runs on off-the-shelf PC hardware with a current Microsoft Windows operating system (7, 8.1, or 10, 64-bit), an off-the-shelf GPU card, and otherwise standard peripheral components.
The device allows importing 3D CT data and dental scan data (e.g., scans of teeth, dental impressions, or stone models) from compatible intraoral or desktop scanners. While the planning of the implant position is mainly based on the information in the CT data, the design of a surgical guide is based on the STL data of the dental scan. Both modalities are registered to a common coordinate system to ensure that the implant positions defined by a user can be used for the design of a surgical guide.
exoplan uses so-called Implant Libraries that contain information provided by the original manufacturer of the respective physical implant and of reconstruction parts placed on top of the implant, such as stock abutments or titanium bases. The libraries also contain drilling sleeves and surgical kit items, such as drills and drill handles. The libraries are digitally signed, so any modification of library content or of the referenced library parts or files is detected by exoplan, reported to the user, and documented in an Implant Planning Report or Surgical Protocol document.
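The submission does not describe the registration algorithm itself, but the principle of bringing optical scan data into the CT coordinate frame can be illustrated with a standard rigid (point-pair) alignment. The sketch below is an assumption-based illustration using NumPy and the Kabsch algorithm; the function name and landmark values are hypothetical and not taken from the exoplan implementation.

```python
# Minimal sketch of rigid (3-point / best-fit style) registration of optical-scan
# points into the CT coordinate frame via the Kabsch algorithm. Illustrative only;
# the document does not disclose exoplan's actual registration method.
import numpy as np

def rigid_transform(src: np.ndarray, dst: np.ndarray):
    """Find rotation R and translation t minimizing ||R @ src_i + t - dst_i||."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)      # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                 # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

# Example: three corresponding landmarks picked on the scan and in the CT data
# (hypothetical coordinates in mm).
scan_pts = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 10.0, 0.0]])
ct_pts   = np.array([[1.0, 2.0, 3.0], [11.0, 2.0, 3.0], [1.0, 12.0, 3.0]])
R, t = rigid_transform(scan_pts, ct_pts)
aligned = scan_pts @ R.T + t                 # scan points expressed in CT coordinates
print(np.allclose(aligned, ct_pts))          # True for this toy example
```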
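The submission states only that the libraries are digitally signed and that modifications are detected, reported, and documented; it does not disclose the signing scheme. As a hedged illustration, the sketch below verifies a detached RSA/SHA-256 signature with the Python `cryptography` package; the file names, key format, and the `report_to_user_and_log` handler are hypothetical.

```python
# Minimal sketch of detecting tampering in a signed library file. The detached
# RSA/SHA-256 signature scheme and all file names are assumptions for illustration.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

def library_is_untampered(library_path: str, signature_path: str,
                          public_key_pem_path: str) -> bool:
    with open(library_path, "rb") as f:
        data = f.read()
    with open(signature_path, "rb") as f:
        signature = f.read()
    with open(public_key_pem_path, "rb") as f:
        public_key = serialization.load_pem_public_key(f.read())
    try:
        # Raises InvalidSignature if the file content no longer matches
        # what the manufacturer originally signed.
        public_key.verify(signature, data, padding.PKCS1v15(), hashes.SHA256())
        return True
    except InvalidSignature:
        return False

# if not library_is_untampered("implant_lib.xml", "implant_lib.sig", "vendor_pub.pem"):
#     report_to_user_and_log()   # hypothetical handler: warn the user and document it
```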
exoplan has no contact with the patient.
The provided document is a 510(k) summary for the medical software "exoplan 2.3". It outlines the device's indications for use, technical characteristics, and a comparison to a predicate device. The document does not contain details about a clinical study involving human readers or a multi-reader multi-case (MRMC) study. It primarily focuses on the device's technical performance and verification/validation steps.
Here's an analysis of the acceptance criteria and the study information that is available in the document:
1. A table of acceptance criteria and the reported device performance:
The document includes a table titled "Summary of tests and accuracy results" on page 5, which lists several tested accuracies (acting as acceptance criteria) and their reported results.
| # | Tested Accuracy | Reported Device Performance |
|---|---|---|
| 1 | Visualization of iso-surface of CT data accuracy | 10% of the maximum voxel size (e.g., for 1 mm voxel size, accuracy is 0.1 mm) |
| 2 | Density threshold accuracy in 3D CT data | 10% of the maximum voxel size for a perfectly selected threshold |
| 3 | Accuracy of distance measurement in 3D CT data, rendering modes "iso-surface" and "solid" | General dimensional measurement accuracy of 0.01 mm; clicking on a DICOM object in iso-surface or solid rendering mode has an additional limitation of 10% of the voxel size |
| 4 | Accuracy of distance measurement in STL meshes | Achievable accuracy of 0.01 mm |
| 5 | Accuracy of angle measurement | Achievable accuracy of 0.5° |
| 6 | CT data alignment accuracy, 3 Point Alignment & Best Fit Alignment | 3 Point Alignment: achievable accuracy of 1 mm. Best Fit Alignment: achievable accuracy depends on input data resolution; 0.2 mm for the data set used |
| 7 | Implant placement accuracy | Achievable accuracy of 0.3 mm relative to CT data |
| 8 | Collision detection accuracy, visual | Collisions are detected: implant/implant, marked nerve/implant, collision object mesh/implant |
| 9 | Collision detection accuracy | 0.01 mm |
| 10 | Drill sleeve placement accuracy | Accuracy of placing a sleeve along the implant axis: 0.01 mm |
| 11 | Drill sleeve rotation accuracy | Achievable repetitious accuracy of 1° |
| 12 | Accuracy of surgical guide bottom | 0.1 mm in smooth areas without undercuts (relative to optical scan data plus user-defined offset) |
| 13 | Accuracy of merged surgical guide parts | 0.01 mm in smooth areas (relative to the surface of the merged parts) |
| 14 | Accuracy of merged surgical guide bottom | 0.01 mm in smooth areas (relative to the surface of the merged bottom) |
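The voxel-size-based criteria in rows 1-3 can be made concrete with a short worked check. The sketch below assumes a hypothetical anisotropic CT voxel and a made-up measured deviation; neither value comes from the submission.

```python
# Worked check of the voxel-size-derived tolerances in rows 1-3 above
# (10% of the maximum voxel size). The measured deviation is an
# illustration value, not data from the submission.
def voxel_tolerance(voxel_sizes_mm):
    """Acceptance tolerance = 10% of the largest voxel dimension."""
    return 0.10 * max(voxel_sizes_mm)

tol = voxel_tolerance([0.3, 0.3, 1.0])   # anisotropic CT voxel, max 1.0 mm
print(tol)                               # 0.1 mm, matching the example in row 1

measured_deviation_mm = 0.07             # hypothetical iso-surface deviation
print(measured_deviation_mm <= tol)      # True -> within the stated accuracy
```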
2. Sample size used for the test set and the data provenance:
The document states: "Software verification and validation is performed in accordance with the applicable guidance document... Prior to release of exoplan the verification and validation of the device has been completed." It also mentions "The selected tests verify accuracies of critical items in the whole workflow of exoplan".
However, the document does not explicitly state the sample size for the test set used for these accuracy measurements. It also does not specify the data provenance (e.g., country of origin, retrospective or prospective) for the CT/CBCT or optical scan data used in these tests. It only mentions the data types (CT/CBCT, optical scans) and that they originate "from other medical devices".
3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts:
The document does not mention the use of experts or their qualifications for establishing ground truth for the technical accuracy tests. The described tests appear to be technical verification and validation, likely against known or measured physical/digital standards, rather than expert-derived diagnoses or interpretations.
4. Adjudication method for the test set:
Since the document does not describe expert involvement in establishing ground truth for the technical tests, there is no adjudication method mentioned.
5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done, and if so, the effect size of how much human readers improve with AI vs. without AI assistance:
The document explicitly states on page 5: "Clinical testing is not a requirement and has not been performed." This indicates that no MRMC or other human-in-the-loop performance study has been conducted or reported in this submission. Therefore, no effect size of human reader improvement with AI assistance is provided.
6. If standalone (i.e., algorithm-only, without human-in-the-loop) performance testing was done:
The provided "Summary of tests and accuracy results" table (page 5) describes the performance of the "exoplan" software itself, independent of human interaction beyond input provision. These are essentially standalone performance metrics focusing on the accuracy of its various computational and visualization functions (e.g., measurement accuracy, alignment accuracy, collision detection accuracy).
7. The type of ground truth used:
For the technical accuracy tests, the ground truth appears to be engineering ground truth or digital ground truth, derived from:
- Pre-defined voxel sizes and known physical dimensions for measurements.
- Known "perfectly selected thresholds" for density.
- Physical or digital models with established properties for alignment, implant placement, and guide fit.
- The software's internal calculations and comparisons for collision detection and merging parts.
There is no mention of expert consensus, pathology, or outcomes data being used as ground truth for these technical accuracy tests.
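For context, verification against such an engineering or digital ground truth typically amounts to comparing software-reported values with known reference values and tolerances. The sketch below is a generic illustration under that assumption; the test names, reference values, and reported values are hypothetical and are not figures from the submission (only the tolerance magnitudes echo the table above).

```python
# Generic sketch of verification against an engineering/digital ground truth:
# compare software-reported values with known reference values from a test
# phantom or synthetic data set. All numbers below are illustrative assumptions.
TESTS = [
    # (test name, reference value, software-reported value, tolerance)
    ("distance measurement in STL mesh (mm)", 12.000, 12.004, 0.01),
    ("angle measurement (deg)",               30.00,  30.30,  0.5),
    ("implant placement vs. CT data (mm)",     0.000,  0.21,  0.3),
]

for name, reference, reported, tolerance in TESTS:
    error = abs(reported - reference)
    status = "PASS" if error <= tolerance else "FAIL"
    print(f"{name}: error={error:.3f}, tolerance={tolerance} -> {status}")
```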
8. The sample size for the training set:
The document is a 510(k) summary for a "Picture archiving and communications system" that assists with dental implant planning. While it's software-based, the description of its functionalities (e.g., using "Implant Libraries" which contain manufacturers' data) suggests it might not be a machine learning/AI device in the sense that it requires a "training set" in the conventional machine learning context. The document does not mention any training set size because it doesn't describe an AI model that undergoes a training phase from data.
9. How the ground truth for the training set was established:
As no training set is mentioned or implied for a machine learning model, this information is not applicable and not provided in the document. The software appears to be rule-based or model-based, relying on geometric, anatomical, and manufacturer-provided data rather than learning from a large labeled dataset.
§ 892.2050 Medical image management and processing system.
(a) Identification. A medical image management and processing system is a device that provides one or more capabilities relating to the review and digital processing of medical images for the purposes of interpretation by a trained practitioner of disease detection, diagnosis, or patient management. The software components may provide advanced or complex image processing functions for image manipulation, enhancement, or quantification that are intended for use in the interpretation and analysis of medical images. Advanced image manipulation functions may include image segmentation, multimodality image registration, or 3D visualization. Complex quantitative functions may include semi-automated measurements or time-series measurements.
(b) Classification. Class II (special controls; voluntary standards—Digital Imaging and Communications in Medicine (DICOM) Std., Joint Photographic Experts Group (JPEG) Std., Society of Motion Picture and Television Engineers (SMPTE) Test Pattern).