K Number
K191247
Date Cleared
2019-11-15

(190 days)

Product Code
Regulation Number
888.3660
Panel
OR
Reference & Predicate Devices
Intended Use

SmartSPACE Shoulder 3D Positioner

SmartSPACE Shoulder System instrumentation consists of a patient-specific 3D positioner. It has been specially designed to assist in the intraoperative positioning of glenoid components used with total anatomic or reverse shoulder arthroplasty procedures using anatomic landmarks that are identifiable on patient-specific preoperative CT scans.

SmartSPACE Shoulder Planner software

SmartSPACE Shoulder Planner software is a medical device for surgeons composed of one software component. It is intended to be used as a pre-surgical planner for shoulder orthopedic surgery.

SmartSPACE Shoulder Planner software runs on standard personal and business computers running Microsoft Windows operating system.

The software supports the DICOM standard for importing the patient's CT (Computed Tomography) images. Only the CT modality can be loaded into the SmartSPACE Shoulder Planner software.
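As a rough illustration of the import behavior described above, the sketch below uses the pydicom library to read a DICOM series and reject non-CT slices. The function name, directory layout, and error handling are hypothetical and are not taken from the SmartSPACE software.

```python
# Minimal sketch of DICOM CT import with a modality check (assumption, not the vendor's code).
from pathlib import Path

import pydicom


def load_ct_series(series_dir: str):
    """Read a DICOM series from a directory, accepting only CT slices."""
    slices = []
    for path in sorted(Path(series_dir).glob("*.dcm")):
        ds = pydicom.dcmread(path)
        modality = getattr(ds, "Modality", "unknown")
        if modality != "CT":
            # Mirrors the stated restriction: only the CT modality can be loaded.
            raise ValueError(f"Unsupported modality {modality!r} in {path.name}")
        slices.append(ds)
    # Order slices along the patient z-axis using ImagePositionPatient.
    slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))
    return slices
```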

SmartSPACE Shoulder Planner software allows the surgeon to visualize, measure, reconstruct, annotate and edit anatomic data.

It allows the surgeon to design shoulder patient-specific instrumentation based on the pre-surgical plan. The software leads to the generation of a surgical report along with a 3D file of the shoulder patient-specific instrumentation.
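For illustration only: a planned guide mesh could be written to a standard 3D file format such as STL with a general-purpose mesh library like trimesh. The placeholder geometry and file name below are hypothetical; the actual export format and pipeline used by the SmartSPACE Shoulder Planner are not described in the document.

```python
# Illustrative only: write a planned guide mesh to an STL file with trimesh.
# The triangle below is a placeholder for the patient-specific guide geometry.
import numpy as np
import trimesh

vertices = np.array([[0.0, 0.0, 0.0],
                     [10.0, 0.0, 0.0],
                     [0.0, 10.0, 0.0]])   # placeholder geometry (mm)
faces = np.array([[0, 1, 2]])

guide_mesh = trimesh.Trimesh(vertices=vertices, faces=faces)
guide_mesh.export("patient_specific_guide.stl")  # 3D file for downstream manufacturing
```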

SmartSPACE Shoulder Planner software does not include any system to manufacture the shoulder patient-specific instrumentation.

SmartSPACE Shoulder Planner software is to be used for adult patients only and should not be used for diagnostic purposes.

Device Description

The SmartSPACE Shoulder System consists of the SmartSPACE Shoulder Planner software and a 3D positioner. It assists the user in planning reverse and anatomic total shoulder arthroplasty and allows the surgical plan to be translated intraoperatively, using the 3D positioner for glenoid K-wire placement.

AI/ML Overview

The provided text is a 510(k) Summary for the SmartSPACE Shoulder System, which includes planning software and a 3D positioner. It describes the device's intended use and compares it to a predicate device, BLUEPRINT Patient Specific Instrumentation (K162800). The summary discusses performance data but does not state quantitative acceptance criteria, nor does it describe a study that explicitly demonstrates the device "meets acceptance criteria" in the way a clinical study typically would for metrics such as sensitivity, specificity, accuracy, or reader improvement in AI contexts.

Instead, the document focuses on demonstrating substantial equivalence to the predicate device through non-clinical studies and verification/validation activities. The "Performance Testing" section is the closest equivalent to a study showing the device meets certain criteria, but it covers primarily engineering and software validation tests, not a traditional clinical performance study with human readers and expert ground truth, as might be expected for an AI device evaluated in a Multi-Reader Multi-Case (MRMC) study.

Therefore, many of the requested fields cannot be directly extracted from the provided text. I will fill in the available information and explicitly state when information is not present.

Here's the analysis based on the provided document:


Device: SmartSPACE Shoulder System (SmartSPACE Shoulder Planner software and SmartSPACE Shoulder 3D Positioner)

1. Table of Acceptance Criteria and Reported Device Performance

The document does not present a formal table of acceptance criteria with numerical performance targets and corresponding results for the SmartSPACE Shoulder System in the sense of clinical accuracy or reader performance typically reported for AI/software devices. Instead, it lists types of validation tests, each with a general requirement that results "should be compliant."

| Acceptance Criteria (Stated Goal / Type of Validation) | Reported Device Performance (as stated) |
|---|---|
| Glenoid Version and Inclination Angle Validation Test: the version and inclination angle between the reference method and the software calculation should be compliant | Compliance implied; no specific values given |
| Patient-Specific Guiding Wire Test: version angle error, inclination angle error, and entry point error should be compliant | Compliance implied; no specific values given |
| Segmentation Accuracy Test: mean distance error in the surgical zone between the 3D reconstruction and the reference reconstruction should be compliant | Compliance implied; no specific values given |
| Cadaver Validation Study: pre-operative surgical plan compared to post-operative implant position should be compliant | Compliance implied; no specific values given |
| Software Verification and Validation Testing (in accordance with IEC 62304) | Conducted in accordance with the requirements of IEC 62304; successful completion implied |
| Biocompatibility Testing (per ISO 10993-1) | Conducted to ensure the biocompatibility of the materials; successful completion implied |
| Sterilization & Shelf-life Testing (validate the end-user sterilization protocol) | Performed to validate the end-user sterilization protocol; successful completion implied |
| Human Factors Validation Testing (users able to use the software and guides as intended with the available instructions for use and training videos) | Users were able to use the software and guides as intended with the available instructions for use and training videos; successful completion implied |
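The submission does not report the numerical thresholds behind "should be compliant," but the error metrics named in the first two rows are straightforward to define. The following sketch is illustrative only (the exact definitions used in the submission are not given); it computes an unsigned angular error between a reference axis and a software-computed axis, and a Euclidean entry-point error.

```python
# Illustrative definitions of the angular and entry-point error metrics named above.
import numpy as np


def angle_error_deg(ref_axis, test_axis):
    """Unsigned angle in degrees between a reference axis and a computed axis."""
    a = np.asarray(ref_axis, dtype=float)
    b = np.asarray(test_axis, dtype=float)
    a /= np.linalg.norm(a)
    b /= np.linalg.norm(b)
    # Axes have no preferred direction, so take the absolute dot product.
    cos_theta = np.clip(abs(np.dot(a, b)), -1.0, 1.0)
    return float(np.degrees(np.arccos(cos_theta)))


def entry_point_error_mm(ref_point, test_point):
    """Euclidean distance in millimetres between reference and planned entry points."""
    diff = np.asarray(ref_point, dtype=float) - np.asarray(test_point, dtype=float)
    return float(np.linalg.norm(diff))
```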

2. Sample Size Used for the Test Set and Data Provenance

  • Sample Size: Not specified quantitatively. The document states that studies were "performed on cadaveric specimen or performed by using patients' data" but does not give the number of cadavers or patient datasets used for any specific test.
  • Data Provenance: Not specified (e.g., country of origin). The document mentions cadaveric specimens and patient data but does not state whether the data were retrospective or prospective.

3. Number of Experts Used to Establish the Ground Truth for the Test Set and Qualifications of those Experts

Not specified. The document refers to "reference method" in the validation tests but does not detail how this reference or "ground truth" was established, nor does it mention the number or qualifications of any experts involved in defining it for the testing.

4. Adjudication Method for the Test Set

Not specified.

5. If a Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study was done, and the effect size of how much human readers improve with AI vs without AI assistance

No. A Multi-Reader Multi-Case (MRMC) comparative effectiveness study comparing human readers with and without AI assistance is not mentioned, and no effect sizes for human reader improvement are reported. The device's validation focuses on the technical accuracy of planning and positioning components, not on aiding human interpretation or diagnosis.

6. If a Standalone (i.e. algorithm only without human-in-the-loop performance) was done

In part. The software component (SmartSPACE Shoulder Planner software) was validated in a standalone manner with respect to its calculations and functionality (e.g., the Glenoid Version and Inclination Angle Validation Test, the Patient-Specific Guiding Wire Test, and the Segmentation Accuracy Test), each compared against a "reference method." However, the overall device is an instrumentation system that includes both the planning software and a patient-specific 3D positioner for intraoperative use, and validation tests such as the Cadaver Validation Study involve both components in a practical setting.
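A standalone comparison of this kind can be pictured as a simple batch harness that runs the software's calculation on a set of test cases and compares the results to the reference method. Everything below is hypothetical: the case format, the compute_version_inclination callable, and the 5-degree tolerance are assumptions, not values from the submission.

```python
# Hypothetical standalone validation harness for glenoid angle calculations.
from statistics import mean


def validate_angles(cases, compute_version_inclination, tolerance_deg=5.0):
    """cases: iterable of (ct_path, ref_version_deg, ref_inclination_deg)."""
    version_errors, inclination_errors = [], []
    for ct_path, ref_version, ref_inclination in cases:
        version, inclination = compute_version_inclination(ct_path)
        version_errors.append(abs(version - ref_version))
        inclination_errors.append(abs(inclination - ref_inclination))
    return {
        "mean_version_error_deg": mean(version_errors),
        "mean_inclination_error_deg": mean(inclination_errors),
        "all_within_tolerance": all(
            e <= tolerance_deg for e in version_errors + inclination_errors
        ),
    }
```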

7. The Type of Ground Truth Used

The ground truth or "reference method" for the validation tests appears to be based on:

  • Engineering measurements/calculations (for glenoid angles, guiding wire parameters, and segmentation accuracy).
  • Comparison of pre-operative plans with post-operative implant positions in cadaveric studies.

The document does not indicate the use of pathology or outcomes data as ground truth, which is expected given that this is a surgical planning and guidance system, not a diagnostic device.
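For example, a segmentation "reference reconstruction" ground truth lends itself to a mean surface-distance comparison. The sketch below is an assumption about methodology, not the submission's actual procedure: it computes the mean nearest-neighbour distance from the software's 3D reconstruction to a reference reconstruction, with both point sets already restricted to the surgical zone.

```python
# Assumed methodology sketch: mean nearest-neighbour surface distance (mm).
import numpy as np
from scipy.spatial import cKDTree


def mean_distance_error_mm(recon_points, reference_points):
    """Mean distance from each reconstructed surface point to the
    nearest point on the reference reconstruction."""
    tree = cKDTree(np.asarray(reference_points, dtype=float))
    distances, _ = tree.query(np.asarray(recon_points, dtype=float))
    return float(distances.mean())
```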

8. The Sample Size for the Training Set

Not applicable/Not specified. The document does not describe the use of machine learning or AI models developed using a training set. This is a software and instrumentation system that relies on pre-programmed algorithms and patient-specific anatomical data derived from CT scans for planning, not a learning algorithm that would require a "training set" in the conventional machine learning sense.

9. How the Ground Truth for the Training Set Was Established

Not applicable, as no machine learning training set is described.

§ 888.3660 Shoulder joint metal/polymer semi-constrained cemented prosthesis.

(a) Identification. A shoulder joint metal/polymer semi-constrained cemented prosthesis is a device intended to be implanted to replace a shoulder joint. The device limits translation and rotation in one or more planes via the geometry of its articulating surfaces. It has no linkage across-the-joint. This generic type of device includes prostheses that have a humeral resurfacing component made of alloys, such as cobalt-chromium-molybdenum, and a glenoid resurfacing component made of ultra-high molecular weight polyethylene, and is limited to those prostheses intended for use with bone cement (§ 888.3027).
(b) Classification. Class II. The special controls for this device are:
(1) FDA's:
(i) “Use of International Standard ISO 10993 ‘Biological Evaluation of Medical Devices—Part I: Evaluation and Testing,’ ”
(ii) “510(k) Sterility Review Guidance of 2/12/90 (K90-1),”
(iii) “Guidance Document for Testing Orthopedic Implants with Modified Metallic Surfaces Apposing Bone or Bone Cement,”
(iv) “Guidance Document for the Preparation of Premarket Notification (510(k)) Application for Orthopedic Devices,” and
(v) “Guidance Document for Testing Non-articulating, ‘Mechanically Locked’ Modular Implant Components,”
(2) International Organization for Standardization's (ISO):
(i) ISO 5832-3:1996 “Implants for Surgery—Metallic Materials—Part 3: Wrought Titanium 6-aluminum 4-vanadium Alloy,”
(ii) ISO 5832-4:1996 “Implants for Surgery—Metallic Materials—Part 4: Cobalt-chromium-molybdenum casting alloy,”
(iii) ISO 5832-12:1996 “Implants for Surgery—Metallic Materials—Part 12: Wrought Cobalt-chromium-molybdenum alloy,”
(iv) ISO 5833:1992 “Implants for Surgery—Acrylic Resin Cements,”
(v) ISO 5834-2:1998 “Implants for Surgery—Ultra-high Molecular Weight Polyethylene—Part 2: Moulded Forms,”
(vi) ISO 6018:1987 “Orthopaedic Implants—General Requirements for Marking, Packaging, and Labeling,” and
(vii) ISO 9001:1994 “Quality Systems—Model for Quality Assurance in Design/Development, Production, Installation, and Servicing,” and
(3) American Society for Testing and Materials':
(i) F 75-92 “Specification for Cast Cobalt-28 Chromium-6 Molybdenum Alloy for Surgical Implant Material,”
(ii) F 648-98 “Specification for Ultra-High-Molecular-Weight Polyethylene Powder and Fabricated Form for Surgical Implants,”
(iii) F 799-96 “Specification for Cobalt-28 Chromium-6 Molybdenum Alloy Forgings for Surgical Implants,”
(iv) F 1044-95 “Test Method for Shear Testing of Porous Metal Coatings,”
(v) F 1108-97 “Specification for Titanium-6 Aluminum-4 Vanadium Alloy Castings for Surgical Implants,”
(vi) F 1147-95 “Test Method for Tension Testing of Porous Metal,”
(vii) F 1378-97 “Standard Specification for Shoulder Prosthesis,” and
(viii) F 1537-94 “Specification for Wrought Cobalt-28 Chromium-6 Molybdenum Alloy for Surgical Implants.”