510(k) Data Aggregation (317 days)
Avatar Medical Software V1
AVATAR MEDICAL Software V1 is intended as a medical imaging system that allows the processing, review, analysis, communication and media interchange of multi-dimensional digital images acquired from CT or MR imaging devices. It is also intended as software for preoperative surgical planning, and as software for the intraoperative display of the aforementioned multi-dimensional digital images. AVATAR MEDICAL Software V1 is designed for use by health care professionals and is intended to assist the clinician who is responsible for making all final patient management decisions.
The Avatar Medical Software V1 (AMS V1) is a software-only device that allows trained medical professionals to review CT and MR image data in three-dimensional (3D) format and/or in a virtual reality (VR) interface. The 3D and VR images are accessible through the software desktop application and, if desired, through compatible VR headsets, which are used for preoperative surgical planning and for display during intervention/surgery.
The AMS V1 product is to be used to assist in medical image review. Intended users are trained medical professionals, including imaging technicians, clinicians and surgeons.
AMS V1 includes two main software-based user interface components, the Desktop Interface and the VR Interface. The Desktop Interface runs on a compatible off-the-shelf (OTS) workstation provided by the hospital and is accessed only by authorized personnel. The Desktop Interface contains a graphical user interface from which a user can retrieve DICOM-compatible medical images locally or from a Picture Archiving and Communication System (PACS). Retrieved CT and MR images can be viewed in 2D and 3D formats. Users can make measurements and annotations and apply fixed and manual image filters.
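As a purely illustrative aside, the sketch below shows the kind of computation a ruler-style measurement tool performs on a DICOM slice. It is a minimal, hypothetical example using pydicom and NumPy; the file path, point coordinates, and function name are assumptions and are not taken from the AMS V1 implementation.

```python
# Minimal, hypothetical sketch of a point-to-point (ruler-style) measurement on a DICOM slice.
# The file path and pixel coordinates below are placeholders.
import numpy as np
import pydicom

def linear_measurement_mm(dicom_path, point_a, point_b):
    """Return the in-plane distance in millimetres between two (row, col) pixel points."""
    ds = pydicom.dcmread(dicom_path)
    row_spacing_mm, col_spacing_mm = map(float, ds.PixelSpacing)  # mm per pixel (row, column)
    d_row = (point_a[0] - point_b[0]) * row_spacing_mm
    d_col = (point_a[1] - point_b[1]) * col_spacing_mm
    return float(np.hypot(d_row, d_col))

if __name__ == "__main__":
    distance = linear_measurement_mm("phantom_slice.dcm", point_a=(120, 80), point_b=(120, 200))
    print(f"Measured length: {distance:.2f} mm")
```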
The VR Interface is accessible via a compatible OTS headset, allowing users to review the medical images in a VR format. VR formats can be viewed only when the user connects a compatible VR headset directly to the workstation being used to view the Desktop Interface. Additionally, AMS V1 enables the intended users to remotely stream the Desktop Interface to another workstation on the same local area network (LAN).
The 3D images generated using AMS V1 are intended to be used in relation to surgical procedures in which CT or MR images are used for preoperative planning and/or during intervention/surgery.
The intraoperative use of the AMS V1 solely corresponds to the two following cases:
- Display of the AMS V1 Desktop Interface on existing monitors/screens in the operating room
- Use in a non-sterile image review room accessible from the operating room during the procedure (AMS V1 operates on VR headsets which are not approved to be used in the sterile environment of the operating room)
Here's a detailed breakdown of the acceptance criteria and the study that proves the device meets them, based on the provided text:
1. Table of Acceptance Criteria and Reported Device Performance
Feature/Function | Tool in Subject Device: AMS V1 | Tool in Reference Device: Osirix MD (K101342) | Acceptance Criteria | Reported Device Performance |
---|---|---|---|---|
Linear Measurements (polylines) | Curve | Close Polygon | No statistical difference between the distributions of measurements obtained with AMS V1 and the reference device Osirix MD, as evaluated by t-test statistics, for a series of objects in reference MR and CT phantom images. | No statistically significant difference was found between AMS V1 and Osirix MD measurements; the criterion was met. |
Diameter Measurements | Ruler | Length | No statistical difference between the distributions of measurements obtained with AMS V1 and the reference device Osirix MD, as evaluated by t-test statistics, for a series of objects in reference MR and CT phantom images. | No statistically significant difference was found between AMS V1 and Osirix MD measurements; the criterion was met. |
Display Quality (Luminance, Contrast) | N/A | N/A (evaluated against guidance) | Successful visual evaluation of luminance and contrast against the AAPM guidance recommendation. | "The quality of the display was successfully evaluated against the AAPM guidance recommendation for visual evaluation of luminance and contrast." The criterion was met. |
Optical Testing (VR Platforms) | N/A | N/A (evaluated against standard) | Homogeneity of luminance, resolution, and contrast acceptable per IEC 63145-20-20 in the center of the displays for the specified VR platforms. | Optical testing on compatible VR platforms per IEC 63145-20-20 passed as expected; homogeneity of luminance, resolution, and contrast was acceptable in the center of the displays. The criterion was met. |
Filter Technology Functionality | Image filters | (Similar to the cleared device Osirix MD (K101342)) | Opacity and color of specific voxels in the image demonstrated to be controllable as intended by the filtering principle. | Assessed by visual inspection using a reference DICOM; the opacity and color of specific voxels were shown to be controllable as intended by the filtering principle, which is similar to the cleared device Osirix MD (K101342). The criterion was met. |
VR Experience Fluidity | N/A | N/A | Average frames per second (FPS) above a specified threshold on the minimal hardware configuration. | The average FPS exceeded the specified threshold on the minimal hardware configuration. The criterion was met. |
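To make the statistical acceptance criterion above concrete, here is a minimal, hypothetical sketch of how two sets of phantom measurements could be compared with a t-test. The measurement values, the choice of an independent two-sample test, and the 0.05 significance level are assumptions, since the submission does not state the exact test variant or threshold used.

```python
# Hypothetical illustration of the t-test acceptance criterion: compare measurements of the
# same phantom objects obtained with the subject device and with a reference device.
# The values below are made up for demonstration purposes.
from scipy import stats

ams_v1_mm = [25.1, 25.3, 24.9, 25.2, 25.0, 25.4]     # subject-device measurements (mm)
reference_mm = [25.0, 25.2, 25.1, 25.3, 24.8, 25.2]  # reference-device measurements (mm)

t_statistic, p_value = stats.ttest_ind(ams_v1_mm, reference_mm)

# A p-value above the chosen significance level (e.g. 0.05) means no statistically
# significant difference is detected between the two distributions of measurements.
alpha = 0.05
print(f"t = {t_statistic:.3f}, p = {p_value:.3f}")
if p_value > alpha:
    print("No statistically significant difference between the devices' measurements.")
else:
    print("Statistically significant difference detected.")
```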
The source document provides only sparse detail for points 2-9 below; the information given is limited to what is explicitly stated or can be directly inferred from the document.
2. Sample Size Used for the Test Set and Data Provenance
The document states that tests for linear and diameter measurements were "performed for a series of objects in reference MR and CT phantom images." This indicates that phantom data was used for these specific tests. The exact sample size (number of phantom images or objects within them) for the test set is not specified.
The data provenance is described as "reference MR and CT phantom images," which implies controlled, synthetic data rather than patient data from a specific country or collected retrospectively/prospectively.
3. Number of Experts Used to Establish the Ground Truth for the Test Set and Qualifications of Those Experts
The document does not specify the number of experts used or their qualifications for establishing ground truth on the test set. For the measurement tests, the ground truth was based on "reference MR and CT phantom images," meaning the inherent dimensions of the phantom objects served as the ground truth. For visual evaluations (display quality, filter functionality), it can be inferred that qualified personnel performed these, but no details are provided.
4. Adjudication Method for the Test Set
The document does not specify an adjudication method. For the measurement tests, the ground truth was objective (phantom dimensions), and for visual assessments, it seems a direct assessment against standards or intended functionality was performed, without mention of a multiple-reader adjudication process.
5. If a Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study Was Done, and If So, the Effect Size of How Much Human Readers Improve with AI vs. without AI Assistance
No MRMC comparative effectiveness study was done or reported in this document. The study focused on standalone performance evaluation as described below.
6. If a Standalone (i.e., Algorithm-Only, Without Human-in-the-Loop) Performance Evaluation Was Done
Yes, a standalone performance evaluation was done. The performance data section states that "Measurement performance testing was conducted by leveraging reference digital phantoms and the comparison with a cleared device (Osirix MD K101342)." This compares the algorithm's measurement capabilities against a reference, which is a standalone assessment. Similarly, the display quality, optical testing, filter technology, and VR fluidity assessments focus on the device's inherent performance.
7. The Type of Ground Truth Used (expert consensus, pathology, outcomes data, etc.)
The ground truth used for specific tests was:
- Phantom images (for linear and diameter measurements), where the actual dimensions of the objects in the phantoms served as the objective truth.
- The AAPM guidance recommendation and the IEC 63145-20-20 standard (for display quality and optical testing).
- A reference DICOM with intended filtering principles (for filter technology functionality); a sketch of this kind of voxel filtering appears after this list.
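As a purely illustrative sketch of the filtering principle noted above (mapping voxel intensity to opacity and color), the following hypothetical NumPy example builds a simple intensity-window transfer function over a synthetic volume; the intensity range, color, and array shapes are assumptions and do not reflect the AMS V1 implementation.

```python
# Hypothetical transfer-function sketch: assign opacity and color to voxels based on their
# intensity, so that a chosen intensity range is highlighted and all other voxels are hidden.
import numpy as np

def apply_intensity_filter(volume, low, high, color=(1.0, 0.2, 0.2)):
    """Return per-voxel RGBA values: voxels inside [low, high] receive the given color
    and full opacity; all other voxels are fully transparent."""
    mask = (volume >= low) & (volume <= high)
    rgba = np.zeros(volume.shape + (4,), dtype=np.float32)
    rgba[mask, 0:3] = color  # color of the selected voxels
    rgba[mask, 3] = 1.0      # opacity of the selected voxels
    return rgba

if __name__ == "__main__":
    # Synthetic 3D volume standing in for a reference DICOM series.
    rng = np.random.default_rng(0)
    volume = rng.integers(0, 1000, size=(64, 64, 64))
    rgba = apply_intensity_filter(volume, low=300, high=500)
    print("Visible voxels:", int(rgba[..., 3].sum()))
```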
8. The Sample Size for the Training Set
The document does not provide any information regarding the sample size for a training set. This submission focuses on verification and validation testing, which suggests that the core algorithms may have been developed previously; details about their training data are not included in this 510(k) summary.
9. How the Ground Truth for the Training Set Was Established
The document does not provide any information on how the ground truth for a training set was established, as it does not discuss a training set.