K Number
K160407
Device Name
spineEOS
Manufacturer
Date Cleared
2016-04-08

(52 days)

Product Code
Regulation Number
892.2050
Panel
RA
Reference & Predicate Devices
Intended Use

Using 3D data and models obtained with the sterEOS workstation, spineEOS software is indicated for assisting healthcare professionals in viewing and measuring images, as well as in preoperative planning of spine surgeries. The device includes tools for measuring spine anatomical components for placement of surgical implants. Clinical judgment and experience are required to properly use the software.

Device Description

spineEOS 1.0 allows surgeons to perform preoperative surgical planning of spine surgeries in cases of Adolescent Idiopathic Scoliosis (AIS) or spinal deformity. The software provides surgical tools for correcting the curvature, placing cages, and planning osteotomies. The images displayed are X-rays from the EOS System (K152788) and a 3D model of the spine from the sterEOS Workstation (K141137). spineEOS also displays preoperative parameters compared with reference values, as well as the updated values of those parameters after planning. spineEOS is accessible on any computer via the ONEFIT Management System (Class I device, product code LMD, 510(k) exempt), which provides a secure interface and storage through authentication mechanisms.

AI/ML Overview

The FDA 510(k) summary for spineEOS provides some information regarding its performance data, but it does not contain a detailed study with acceptance criteria, specific reported device performance metrics, sample sizes, or information about experts and ground truth as requested.

The document primarily focuses on establishing substantial equivalence to a predicate device (Surgimap 2.0) by comparing intended use, indications, and technological characteristics.

Here's an analysis of what is available and what is missing from the provided text, structured according to your request:

1. A table of acceptance criteria and the reported device performance

  • Missing from the document. The document states: "Software verification and validation testing were conducted and documentation was provided as recommended by FDA's Guidance for Industry and FDA Staff, 'Guidance for the Content of Premarket Submissions for Software Contained in Medical Devices'." However, specific acceptance criteria or detailed results of these tests (e.g., accuracy of measurements, success rate of planning tools) are not provided in this 510(k) summary.

2. Sample size used for the test set and the data provenance (e.g. country of origin of the data, retrospective or prospective)

  • Missing from the document. The summary mentions "Software verification and validation testing," but does not specify the sample size of any test set or the provenance of the data used for such testing.

3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts (e.g. radiologist with 10 years of experience)

  • Missing from the document. There is no mention of experts, ground truth establishment, or their qualifications for any validation testing.

4. Adjudication method (e.g. 2+1, 3+1, none) for the test set

  • Missing from the document. No information about adjudication methods for a test set is provided.

5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done, and if so, the effect size of how much human readers improve with AI assistance versus without it

  • Missing from the document. The document makes no mention of a multi-reader multi-case (MRMC) comparative effectiveness study. The focus is on demonstrating equivalence to the predicate device's existing functionality rather than quantifying human performance improvements.

6. If standalone (i.e., algorithm-only, without human-in-the-loop) performance testing was done

  • Implied, but not detailed. The "Software verification and validation testing" would typically include standalone testing of the algorithms and software functionalities, but the specifics of these tests and their results are not detailed. spineEOS is described as "assisting healthcare professionals," implying a human-in-the-loop device; standalone testing of its components would nonetheless be part of standard V&V. Again, no specific results are provided.

7. The type of ground truth used (expert consensus, pathology, outcomes data, etc.)

  • Missing from the document. As no specific performance study is detailed, the type of ground truth used is not mentioned.

8. The sample size for the training set

  • N/A (or not explicitly stated as a "training set"). The spineEOS is a software for viewing, measuring, and planning based on existing 3D data and models (from sterEOS workstation). It's not described as a machine learning device that requires a distinct "training set" in the sense of a deep learning model. Its validation would focus on the accuracy of its measurements and the functionality of its planning tools against known standards or expert opinion, not on learning from a dataset.

9. How the ground truth for the training set was established

  • N/A. Since a classical machine learning "training set" is not explicitly mentioned or implied for this type of device, the method for establishing its ground truth is not applicable in that context.

§ 892.2050 Medical image management and processing system.

(a)
Identification. A medical image management and processing system is a device that provides one or more capabilities relating to the review and digital processing of medical images for the purposes of interpretation by a trained practitioner of disease detection, diagnosis, or patient management. The software components may provide advanced or complex image processing functions for image manipulation, enhancement, or quantification that are intended for use in the interpretation and analysis of medical images. Advanced image manipulation functions may include image segmentation, multimodality image registration, or 3D visualization. Complex quantitative functions may include semi-automated measurements or time-series measurements.

(b)
Classification. Class II (special controls; voluntary standards—Digital Imaging and Communications in Medicine (DICOM) Std., Joint Photographic Experts Group (JPEG) Std., Society of Motion Picture and Television Engineers (SMPTE) Test Pattern).