K Number
K170144
Device Name
StentBoost Live
Date Cleared
2017-06-07

(141 days)

Product Code
Regulation Number
892.1650
Panel
RA
Reference & Predicate Devices
Predicate For
N/A
Intended Use

StentBoost Live is intended to assist the physician during percutaneous coronary interventions.
StentBoost Live provides real-time enhanced visualization of stents.
StentBoost Live provides real-time enhanced visualization of stents in relation to coronary vessels.
StentBoost Live assists in the treatment of cardiovascular diseases by visualizing the placement and deployment of coronary stents.
StentBoost Live is suitable for use with the entire adult human population.
StentBoost Live is a software medical device and does not come in contact with a human subject.

Device Description

StentBoost Live is a software medical device intended to provide enhanced visualization of stents in coronary vessels in real-time. It supports the physician in placing and deploying stents.
StentBoost Live connects to a Philips interventional X-ray System and uses X-ray-generated data as input.
StentBoost Live uses radiopaque balloon markers to provide real-time stent enhanced visualization by displaying a motion compensated average stent image.
StentBoost Live integrates with the Philips interventional X-ray System, allowing StentBoost Live to be powered on and off automatically. In addition, user interactions can be controlled from the table side of the X-ray System.
StentBoost Live provides the option for the physician to create DICOM compatible snapshots and movies of the enhanced image for reporting and archiving when connected to a picture archiving and communication system (PACS).
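The marker-based enhancement described above can be illustrated with a minimal sketch: each frame is aligned so that its radiopaque balloon marker pair lands on the reference pair, and the aligned frames are averaged so the stent reinforces while moving background blurs out. Everything here is illustrative (the function names, the two-point similarity transform, and the nearest-neighbor resampling are assumptions, not Philips' implementation):

```python
import numpy as np

def similarity_from_marker_pair(src, dst):
    """2x2 rotation-scale matrix R and translation t mapping the source
    marker pair onto the destination pair (derived from the two points)."""
    (s1, s2), (d1, d2) = np.asarray(src, float), np.asarray(dst, float)
    sv, dv = s2 - s1, d2 - d1
    scale = np.hypot(*dv) / np.hypot(*sv)
    ang = np.arctan2(dv[1], dv[0]) - np.arctan2(sv[1], sv[0])
    c, s = scale * np.cos(ang), scale * np.sin(ang)
    R = np.array([[c, -s], [s, c]])
    t = d1 - R @ s1
    return R, t

def enhance_stent(frames, marker_pairs):
    """Average frames after aligning each frame's marker pair to the
    first frame's pair (nearest-neighbor resampling, dependency-free)."""
    ref = marker_pairs[0]
    h, w = frames[0].shape
    ys, xs = np.mgrid[0:h, 0:w]
    acc = np.zeros((h, w), float)
    for frame, pair in zip(frames, marker_pairs):
        # Map reference (x, y) coordinates into this frame's coordinates.
        R, t = similarity_from_marker_pair(ref, pair)
        pts = np.stack([xs.ravel().astype(float), ys.ravel().astype(float)])
        src = (R @ pts + t[:, None]).round().astype(int)
        sx = np.clip(src[0], 0, w - 1).reshape(h, w)
        sy = np.clip(src[1], 0, h - 1).reshape(h, w)
        acc += frame[sy, sx]
    return acc / len(frames)
```

In a real-time system this loop would run per incoming frame with interpolated resampling; the sketch only shows the registration-then-average idea.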

AI/ML Overview

Here's a breakdown of the acceptance criteria and the study information for the Philips StentBoost Live, based on the provided document:

1. Table of Acceptance Criteria and Reported Device Performance

The document does not explicitly present a "table of acceptance criteria" with numerical performance metrics against which the device was measured for a specific clinical task. Instead, it describes various verification and validation activities conducted to ensure the device performs as intended and is safe and effective. The device performance is generally stated in terms of its ability to meet documented requirements and user needs.

| Acceptance Criteria Category (Derived) | StentBoost Live Performance (Reported in Document) |
| --- | --- |
| Software life cycle compliance | Complies with IEC 62304 |
| Usability engineering application | Complies with IEC 62366-1 |
| Risk management | Complies with ISO 14971 |
| Labeling/symbols | Complies with ISO 15223-1 |
| DICOM compatibility | Complies with NEMA PS 3.1–3.20 (2016) |
| General verification testing | Passed tests for system-level requirements, risk control measures, privacy, security, reliability, performance, and interoperability |
| Algorithm verification (marker detection) | Demonstrated sufficient marker pair detection performance |
| Algorithm verification (benchmark vs. predicate) | Benchmarked against the predicate device (StentBoost R4) to ensure sufficient functioning of algorithms; specific metrics not provided, implying comparable or improved performance |
| Usability validation | Performed with representative intended users in a simulated environment; results support the intended use, user needs, and claims |
| Expert opinion validation | Performed in a simulated environment with certified interventional cardiologists, who evaluated a wide range of previously acquired X-ray data; implies positive expert evaluation, specific metrics not given |
| In-house simulated validation | Validation protocols addressing clinical user needs and device navigation workflow were executed and passed by experienced Clinical Marketing Specialists |

2. Sample Size Used for the Test Set and Data Provenance

  • Test Set Sample Size:
    • Algorithm Verification Testing: "previously acquired x-ray data" was used. The exact number of cases or images is not specified.
    • Expert Opinion Validation: "Previously acquired x-ray data was used to allow the expert to evaluate a wide range of variance e.g. region to treat, detector format, patient, acquisition angles." The exact number of cases is not specified.
    • Usability Validation: Conducted with "representative intended users" in a simulated environment. The number of users or sessions is not specified.
    • In-house Simulated Validation: Validation protocols were performed. The number of cases or scenarios is not specified.
  • Data Provenance: The document states "previously acquired x-ray data" was used for algorithm and expert opinion validation. No information is provided regarding the country of origin of the data or whether it was collected retrospectively or prospectively, although the phrase "previously acquired" implies retrospective data.

3. Number of Experts Used to Establish Ground Truth for the Test Set and Qualifications

  • Number of Experts: "certified interventional cardiologists" were used for expert opinion validation. The specific number of cardiologists is not stated.
  • Qualifications of Experts: "certified interventional cardiologists." For in-house simulated validation, "experienced Clinical Marketing Specialists with clinical knowledge gained from work experience and hospital visits" were used, and stated to be "equivalent to the intended operator profiles." No years of experience are specified for either type of expert.
  • Ground Truth Establishment for Test Set: For algorithm verification, the ground truth for marker detection and benchmarking against the predicate was likely established through internal methods by Philips. For expert opinion validation, the experts themselves performed the evaluation of the device's visualization capabilities, effectively establishing a subjective "ground truth" on performance in a simulated clinical context.

4. Adjudication Method for the Test Set

The document does not explicitly describe an adjudication method (e.g., 2+1, 3+1) for the expert opinion validation or any other test in which multiple experts might have been involved. The "expert opinion validation" implies experts individually evaluated the device, but how disagreements were resolved or consensus was reached is not detailed.

5. Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study

No multi-reader, multi-case (MRMC) comparative effectiveness study comparing human readers with AI assistance versus without AI assistance was reported. The evaluation seems to focus on the standalone performance and usability of StentBoost Live. The document states: "StentBoost Live did not require clinical trials to establish substantial equivalence to the currently marketed StentBoost R4."

6. Standalone Performance Study (Algorithm Only)

Yes, a standalone performance study in the form of "Algorithm verification testing" was done. This testing involved:

  • "determining marker pair detection performance"
  • "benchmark testing against the predicate device, StentBoost R4."

These tests evaluated the algorithms using "previously acquired x-ray data" to ensure their "sufficient functioning."
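The document gives no numeric criteria for this verification. One plausible way a marker-pair detection benchmark could be scored against reference annotations is sketched below; the pixel tolerance, the order-insensitive pairing rule, and the function names are assumptions for illustration, not Philips' protocol:

```python
import numpy as np

def pair_detected(pred, truth, tol=2.0):
    """A detection counts if each predicted marker lies within `tol`
    pixels of an annotated marker (either assignment of the two points)."""
    p, t = np.asarray(pred, float), np.asarray(truth, float)
    d_same = max(np.linalg.norm(p[0] - t[0]), np.linalg.norm(p[1] - t[1]))
    d_swap = max(np.linalg.norm(p[0] - t[1]), np.linalg.norm(p[1] - t[0]))
    return min(d_same, d_swap) <= tol

def detection_rate(predictions, annotations, tol=2.0):
    """Fraction of frames in which the marker pair was detected."""
    hits = sum(pair_detected(p, t, tol) for p, t in zip(predictions, annotations))
    return hits / len(annotations)
```

A benchmark against a predicate would then compare `detection_rate` for the new and predicate algorithms on the same previously acquired data.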

7. Type of Ground Truth Used

  • Algorithm Verification: The ground truth for marker pair detection was likely defined by Philips engineers based on expected or known marker locations in the "previously acquired x-ray data." For benchmarking, the predicate device's output served as a reference.
  • Expert Opinion Validation: The ground truth was established by the subjective evaluation and professional judgment of the "certified interventional cardiologists" regarding the "enhanced visualization of stents." This is a form of expert consensus/opinion in a simulated environment.
  • Usability and In-House Simulated Validation: The "ground truth" was defined by predefined user needs, clinical workflows, and safety mitigations, where successful execution against these criteria constituted validation.

8. Sample Size for the Training Set

The document does not provide any information regarding the sample size used for the training set of the StentBoost Live algorithms. It is inferred that algorithms were trained, but details about the training data are not disclosed.

9. How Ground Truth for the Training Set Was Established

The document does not contain information on how the ground truth was established for any training set.

§ 892.1650 Image-intensified fluoroscopic x-ray system.

(a) Identification. An image-intensified fluoroscopic x-ray system is a device intended to visualize anatomical structures by converting a pattern of x-radiation into a visible image through electronic amplification. This generic type of device may include signal analysis and display equipment, patient and equipment supports, component parts, and accessories.

(b) Classification. Class II (special controls). An arthrogram tray or radiology dental tray intended for use with an image-intensified fluoroscopic x-ray system only is exempt from the premarket notification procedures in subpart E of part 807 of this chapter subject to the limitations in § 892.9. In addition, when intended as an accessory to the device described in paragraph (a) of this section, the fluoroscopic compression device is exempt from the premarket notification procedures in subpart E of part 807 of this chapter subject to the limitations in § 892.9.