K Number: K213975
Manufacturer:
Date Cleared: 2022-05-06 (137 days)
Product Code:
Regulation Number: 892.2050
Panel: RA (Radiology)
Reference & Predicate Devices:

Intended Use

The KEOPS Balance Analyzer 3D is intended for assisting healthcare professionals in viewing and measuring images as well as planning spine surgeries. The device allows surgeons and service providers to perform spine-related measurements on images and to plan surgical procedures. The device also includes tools for measuring anatomical components for the design and placement of surgical implants. Clinical judgment and experience are required to properly use the software.

Device Description

The KEOPS Balance Analyzer 3D (KBA3D) is a software solution developed for the medical community. It is intended for viewing images, performing spine-related measurements, and planning surgical procedures. Supported image formats include standard file types such as JPEG, TIFF, PNG, and DICOM. Images can be stored in the KEOPS database, and measurements can be made using generic measuring and surgical tools within overlays of the images. The KEOPS Balance Analyzer 3D offers the ability to plan spine surgical procedures such as osteotomies of the spine and templating of implants (rods). The KEOPS Balance Analyzer 3D is web-based software.

AI/ML Overview

The provided text describes the performance tests and acceptance criteria for the KEOPS Balance Analyzer 3D (KBA3D) software, which is intended to assist healthcare professionals in viewing and measuring images and in planning spine surgeries.

The acceptance criteria, and the study evidence showing the device meets them, are analyzed below based on that text:

Acceptance Criteria and Device Performance Study for KEOPS Balance Analyzer 3D

The KEOPS Balance Analyzer 3D (KBA3D) underwent non-clinical performance testing to demonstrate its accuracy, precision, and reliability in various functionalities.

1. Table of Acceptance Criteria and Reported Device Performance

| Functionality Tested | Acceptance Criteria | Reported Device Performance |
| --- | --- | --- |
| Anatomical Parameters Measurement | Acceptable variations of values between original and "worst case variation" coordinates. | "Comparison of original and 'worst case variation' coordinates demonstrated acceptable variations of values and validated the accuracy of the software in measuring anatomical parameters." |
| Data Point Acquisition Repeatability/Reliability (see the sketch after this table) | Barycenter location within 3 pixels (or less). | "Data point acquisition was determined to be repeatable and reliable based on measurements showing location of the barycenter to be within the acceptance criteria of 3 pixels (or less)." |
| Spino-Pelvic Parameters Calculation | No difference in results compared to manual MS Excel calculations. | "Calculation of spino-pelvic parameters showed no difference in results between the KBA3D software and manual calculations using MS Excel." |
| Surgical Planning & Simulation Reliability | Planned corrections by the software, compared to actual post-operative radiographic images, met the acceptance criteria. | "The reliability of the KBA3D surgical planning and simulation algorithm was demonstrated by comparing the planned corrections by the software to the actual corrections obtained from postoperative radiographic images. The acceptance criteria were met." |
| 2D3D Reconstruction Accuracy | Allowable maximum deviation of 5 mm (10% of the average dimension of a vertebral body) between 3D models from CT scans and 3D reconstructions from X-ray images. | "The reliability of KBA3D 2D3D reconstruction algorithm was validated by comparing 3D models from CT scans to 3D reconstructions from the KBA3D software from X-ray images; the results met the acceptance criteria of an allowable maximum deviation of 5mm (which is 10% of the average dimension of a vertebral body)." |
| Software Usability | No measurable, clinically relevant differences between user groups (first-time users vs. an implied experienced-user benchmark). | "The usability of the software was validated for the process of acquisition and intuitiveness of use by orthopaedic spine surgeons using the software for the first time. The results showed no measurable, clinically relevant differences between the user groups." |
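
The 3-pixel repeatability criterion can be expressed concretely. The following is a minimal sketch, not the manufacturer's protocol: it assumes repeated placements of a single landmark are available as pixel coordinates and checks each placement's distance from their barycenter against the 3-pixel limit. The summary does not define the exact reference for the barycenter comparison, so this is one plausible reading, and the coordinates are hypothetical.

```python
import numpy as np

# Hypothetical repeated placements (x, y in pixels) of one anatomical
# landmark across independent acquisition sessions.
placements = np.array([
    [412.0, 310.5],
    [413.2, 309.8],
    [411.4, 311.1],
    [412.6, 310.0],
])

# Barycenter (mean position) of the placements.
barycenter = placements.mean(axis=0)

# Euclidean distance of each placement from the barycenter.
distances = np.linalg.norm(placements - barycenter, axis=1)

# Acceptance criterion from the 510(k) summary: within 3 pixels (or less).
ACCEPTANCE_PX = 3.0
print(f"max deviation from barycenter: {distances.max():.2f} px")
assert (distances <= ACCEPTANCE_PX).all(), "repeatability criterion not met"
```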

2. Sample Size Used for the Test Set and Data Provenance

The document does not explicitly state the specific sample sizes for each of the performance tests. However, it mentions:

  • For Usability Study: "A study was conducted with fifteen spine surgeons to validate the usability of the KBA3D software." This indicates a test set of 15 users.
  • For 2D3D Reconstruction: Comparison was done between "3D models from CT scans to 3D reconstructions from the KBA3D software from X-ray images." This implies a dataset of both CT and X-ray images.
  • For Surgical Simulation: Comparison of "planned corrections by the software to the actual corrections obtained from postoperative radiographic images." This implies a dataset of pre-operative (for planning) and post-operative images; the underlying paired comparison is sketched below.

Data Provenance: The document does not specify the country of origin of the data or whether the data was retrospective or prospective. It describes the data as "2D Images (Profile and Face)" and "postoperative radiographic images" for the reconstruction and simulation tests respectively, and general "images" for measurement.
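
For the surgical-simulation comparison referenced above, the underlying arithmetic is a paired comparison of planned versus achieved correction per case. The sketch below illustrates this under assumed data; all angle values are hypothetical, and no pass/fail threshold is asserted because the summary does not quantify the acceptance criteria.

```python
import numpy as np

# Hypothetical paired measurements (degrees): correction planned by the
# software pre-operatively vs. correction measured on the post-operative
# radiograph for the same case.
planned_correction = np.array([22.0, 15.5, 30.0, 18.0, 25.5])
actual_correction = np.array([20.5, 16.0, 27.5, 18.5, 24.0])

# Per-case disagreement between plan and outcome.
errors = np.abs(planned_correction - actual_correction)

# The summary states only that "the acceptance criteria were met" without
# quantifying them, so the comparison is summarised descriptively.
print(f"mean |planned - actual| = {errors.mean():.1f} deg")
print(f"max  |planned - actual| = {errors.max():.1f} deg")
```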

3. Number of Experts Used to Establish the Ground Truth for the Test Set and Qualifications

  • For Usability Study: "fifteen spine surgeons" were used. Their specific qualifications (e.g., years of experience) are not detailed beyond "orthopaedic spine surgeons." These individuals were the "users" of the software, testing its intuitiveness and acquisition process, rather than primarily establishing ground truth for the technical performance metrics.
  • For Technical Performance Ground Truth:
    • Anatomical Measurements and Spino-Pelvic Parameters: Ground truth appears to have been established through "mathematical calculations" and "manual calculations using MS Excel," implying established analytical methods rather than expert consensus on images (the spino-pelvic geometry is sketched at the end of this section).
    • 2D3D Reconstruction: Ground truth was "3D models from CT scans." This suggests that high-fidelity CT data served as the reference for evaluating the accuracy of 3D reconstruction from 2D X-rays. While experts might have reviewed these, the ground truth source itself is described as the CT data.
    • Surgical Simulation: Ground truth was "actual corrections obtained from postoperative radiographic images." This represents real-world outcomes data which is then compared against the software's predicted outcomes.

The document does not specify the number of experts (e.g., radiologists, orthopedic surgeons) used solely to establish ground truth for the image-based measurements or reconstructions, nor their detailed qualifications, other than the 15 spine surgeons for the usability study.
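
The "mathematical calculations" for spino-pelvic parameters are standard geometric angle definitions measured from lateral-radiograph landmarks, and they satisfy the well-known identity pelvic incidence (PI) = pelvic tilt (PT) + sacral slope (SS), exactly the kind of relationship a spreadsheet cross-check can verify. The sketch below illustrates these standard definitions on hypothetical landmark coordinates; it is not the KBA3D implementation.

```python
import numpy as np

def angle_deg(v1, v2):
    """Unsigned angle between two 2-D vectors, in degrees."""
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

# Hypothetical landmarks from a lateral (profile) radiograph,
# in an image coordinate frame with x pointing right and y pointing up.
s1_anterior = np.array([102.0, 240.0])   # anterior corner of the S1 endplate
s1_posterior = np.array([138.0, 252.0])  # posterior corner of the S1 endplate
femoral_head = np.array([95.0, 205.0])   # centre of the femoral heads

s1_mid = (s1_anterior + s1_posterior) / 2.0
endplate = s1_posterior - s1_anterior

# Sacral slope: angle of the S1 endplate to the horizontal.
ss = angle_deg(endplate, np.array([1.0, 0.0]))

# Pelvic tilt: angle between the vertical and the line from the
# femoral-head centre to the S1 endplate midpoint.
pt = angle_deg(s1_mid - femoral_head, np.array([0.0, 1.0]))

# Pelvic incidence: angle between the perpendicular to the S1 endplate
# and that same femoral-head line.
perp = np.array([-endplate[1], endplate[0]])
pi_angle = angle_deg(s1_mid - femoral_head, perp)

# Geometric identity used as an internal consistency check: PI = PT + SS.
print(f"SS={ss:.1f} PT={pt:.1f} PI={pi_angle:.1f} (deg)")
assert abs(pi_angle - (pt + ss)) < 0.5, "PI should equal PT + SS"
```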

4. Adjudication Method for the Test Set

The document does not describe any specific adjudication method (e.g., 2+1, 3+1) for establishing ground truth for the test data. The ground truth appears to be derived from objective calculations (mathematical, Excel), reference imaging (CT scans), or real-world post-operative outcomes.

5. If a Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study was Done

A Multi-Reader Multi-Case (MRMC) comparative effectiveness study, which typically evaluates how much human readers improve with AI assistance versus without it, was not described for the KBA3D. The usability study involved human users (15 spine surgeons) but focused on the software's intuitiveness and acquisition process, not on a direct comparison of diagnostic or planning accuracy with and without AI assistance. The performance testing instead addresses the software's inherent accuracy and reliability.

6. If a Standalone (i.e., Algorithm Only, Without Human-in-the-Loop) Performance Study was Done

Yes. The performance characteristics described, such as "Anatomical parameters measurements were verified via mathematical calculations," "3D algorithm reconstruction on 2D Images were verified by comparing vertebral body dimensions," and "Accuracy and precision testing were conducted," largely describe standalone algorithm performance. Although the device is intended to assist human users, its core functionalities (measurements, reconstructions, calculations) were validated algorithmically against established ground truths or comparative data, demonstrating the algorithm's performance independent of a human in the loop.

7. The Type of Ground Truth Used

The types of ground truth used include:

  • Mathematical Calculations: For anatomical parameters and spino-pelvic parameters.
  • CT Scan Data: For validating 3D reconstruction from 2D X-ray images; the CT-derived models serve as a higher-fidelity reference (a deviation check is sketched after this list).
  • Outcomes Data: "actual corrections obtained from postoperative radiographic images" for validating surgical simulation.
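
The 5 mm reconstruction criterion reduces to a simple distance check once corresponding points on the CT-derived model and the 2D/3D reconstruction are available in a common frame. The sketch below assumes pre-registered corresponding landmarks in millimetres and computes the maximum Euclidean deviation; the actual validation may have involved registration or surface-distance steps not described in the summary, and all point values here are hypothetical.

```python
import numpy as np

# Hypothetical corresponding 3-D points (in mm): vertebral-body landmarks
# from the CT-derived reference model and from the 2D/3D reconstruction.
ct_points = np.array([
    [10.0,  4.0, 120.0],
    [42.0,  5.5, 121.0],
    [11.0, 33.0, 119.0],
    [43.0, 34.5, 120.5],
])
recon_points = np.array([
    [11.2,  4.8, 121.5],
    [41.1,  6.0, 119.8],
    [12.4, 31.9, 120.2],
    [42.0, 35.8, 122.0],
])

# Per-landmark Euclidean deviation between reconstruction and reference.
deviations = np.linalg.norm(recon_points - ct_points, axis=1)

# Acceptance criterion from the summary: maximum deviation of 5 mm
# (about 10% of an average vertebral-body dimension).
MAX_DEVIATION_MM = 5.0
print(f"max deviation: {deviations.max():.2f} mm")
assert deviations.max() <= MAX_DEVIATION_MM, "reconstruction exceeds 5 mm"
```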

8. The Sample Size for the Training Set

The document does not specify the sample size used for the training set of the KEOPS Balance Analyzer 3D. The focus of this 510(k) summary is on the testing and validation of the device's performance, not its development or training phase.

9. How the Ground Truth for the Training Set was Established

Since the training set size is not mentioned, the method for establishing its ground truth is also not described in this document.

§ 892.2050 Medical image management and processing system.

(a) Identification. A medical image management and processing system is a device that provides one or more capabilities relating to the review and digital processing of medical images for the purposes of interpretation by a trained practitioner of disease detection, diagnosis, or patient management. The software components may provide advanced or complex image processing functions for image manipulation, enhancement, or quantification that are intended for use in the interpretation and analysis of medical images. Advanced image manipulation functions may include image segmentation, multimodality image registration, or 3D visualization. Complex quantitative functions may include semi-automated measurements or time-series measurements.

(b) Classification. Class II (special controls; voluntary standards—Digital Imaging and Communications in Medicine (DICOM) Std., Joint Photographic Experts Group (JPEG) Std., Society of Motion Picture and Television Engineers (SMPTE) Test Pattern).