K Number
K170130
Date Cleared
2017-06-02

(140 days)

Product Code
Regulation Number
892.1650
Panel
RA
Reference & Predicate Devices
Intended Use

Dynamic Coronary Roadmap is intended to assist the physician during percutaneous coronary interventions in correlating the device position to the coronary vasculature, by providing a motion compensated overlay of this coronary vasculature. Dynamic Coronary Roadmap is suitable for use with the entire adult human population.

Dynamic Coronary Roadmap is a software medical device and does not come in contact with a human subject.

Device Description

The Dynamic Coronary Roadmap is a software medical device intended to provide a real-time and dynamic angiographic roadmap of coronary arteries. The angiographic roadmap is automatically generated from previously acquired diagnostic coronary angiograms during the same procedure.

Dynamic Coronary Roadmap uses coronary angiograms, acquired during a PCI procedure, to automatically generate a dynamic angiographic roadmap of the coronary vasculature. This roadmap is then overlaid on the live fluoroscopy images during device navigation. Dynamic Coronary Roadmap works in combination with a Philips interventional X-ray system. The user interface of Dynamic Coronary Roadmap guides the physician through the workflow, so only minimal additional user interaction from the tableside is required. The following design features support the physician in this workflow (a conceptual sketch of the overlay step follows the list):

  • Dynamic angiographic roadmap creation; this technique allows the physician to automatically construct a 2D dynamic angiographic roadmap of the coronary vasculature from a diagnostic coronary angiogram.
  • Live guidance; this technique provides continuous overlay of the dynamic angiographic roadmap on live fluoroscopic images.
  • X-ray system integration; this provides seamless integration with the Philips interventional X-ray system. The clinical product supports:
    • Automatic power ON or OFF; this allows the software medical device to always be available by automatically powering ON and OFF with the X-ray system.
    • 3D Automatic Position Control (APC); this allows the C-arm to automatically move to a nearby available dynamic angiographic roadmap so that it can be reused for live guidance.
    • Table-side control; this provides the physician with an efficient workflow during interventional procedures. The most frequently used functions that require user interaction beyond the normal X-ray system interaction can be controlled from the tableside of the X-ray system.
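
To make the overlay concept above more concrete, the following is a minimal, purely illustrative sketch of a motion-compensated roadmap overlay loop. It is not the Philips Dynamic Coronary Roadmap implementation: the roadmap mask, the per-frame displacement estimate, and all function names are assumptions introduced here only to show the general idea of translating a pre-computed vessel roadmap and blending it onto each live fluoroscopy frame.

```python
# Illustrative sketch only -- NOT the Philips DCR implementation.
# Assumes a pre-computed 2D vessel roadmap mask and a per-frame 2D
# displacement estimate (e.g., from catheter-tip tracking); both are
# hypothetical inputs introduced here for explanation.
import numpy as np

def shift_roadmap(roadmap: np.ndarray, dx: int, dy: int) -> np.ndarray:
    """Translate the binary roadmap mask by (dx, dy) pixels, zero-padding the edges."""
    shifted = np.zeros_like(roadmap)
    h, w = roadmap.shape
    src_y = slice(max(0, -dy), min(h, h - dy))
    src_x = slice(max(0, -dx), min(w, w - dx))
    dst_y = slice(max(0, dy), min(h, h + dy))
    dst_x = slice(max(0, dx), min(w, w + dx))
    shifted[dst_y, dst_x] = roadmap[src_y, src_x]
    return shifted

def overlay_frame(fluoro: np.ndarray, roadmap: np.ndarray,
                  dx: int, dy: int, alpha: float = 0.4) -> np.ndarray:
    """Blend the motion-compensated roadmap onto one live fluoroscopy frame."""
    moved = shift_roadmap(roadmap, dx, dy).astype(float)
    # Highlight roadmap pixels while keeping the underlying fluoroscopy visible.
    return (1.0 - alpha * moved) * fluoro + alpha * moved * fluoro.max()

# Toy usage: a 512x512 frame sequence with a fixed roadmap and made-up motion.
rng = np.random.default_rng(0)
roadmap = np.zeros((512, 512))
roadmap[200:210, 50:450] = 1.0                                   # fake vessel segment
for t in range(5):
    frame = rng.uniform(0.0, 1.0, (512, 512))                    # fake fluoroscopy frame
    dx, dy = int(5 * np.sin(t)), int(3 * np.cos(t))              # fake cardiac motion
    composite = overlay_frame(frame, roadmap, dx, dy)
```

In practice the motion compensation would be far richer than a per-frame translation (cardiac and respiratory motion, device tracking), but the overlay principle is the same: estimate the motion, move the roadmap, then blend it onto the live frame.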
AI/ML Overview

Here's a breakdown of the acceptance criteria and the study information for the Dynamic Coronary Roadmap, based on the provided text:

1. Table of Acceptance Criteria and Reported Device Performance

The document does not explicitly present a table of quantitative acceptance criteria with corresponding performance metrics. Instead, it describes general categories of testing and validation, concluding that the device "passed" and "conforms" to requirements.

| Acceptance Criteria Category | Reported Device Performance | Outcome |
|---|---|---|
| General Verification Testing (system-level requirements, risk control measures, privacy, security, functional and non-functional aspects) | "Software verification has been performed to cover the system level requirements... as well as the identified risk control measures... and the Privacy and Security requirements. These protocols address functional and non-functional aspects of DCR such as reliability, performance and interoperability." | "passed"; results "support the safety and effectiveness of the product." |
| Algorithm Verification Testing (registration of the catheter tip and guide wire) | "Dedicated algorithm verification testing has been performed with standard angiographic and fluoroscopy x-ray data to ensure sufficient functioning of the algorithms. An indirect registration verification was performed focusing only on the registration of the catheter tip and the guide wire." | "passed"; results "support the safety and effectiveness of the product." |
| Usability Validation (intended use, user needs, claims) | "Performed with representative intended users in a simulated environment." | "passed"; results "support the safety and effectiveness of the product." |
| Expert Opinion Validation (evaluation of variance, e.g., region to treat, detector format, patient, acquisition angles) | "Performed in a simulated environment with certified interventional cardiologists. Standard angiographic and fluoroscopy x-ray data was used to allow the expert to evaluate a wide range of variance e.g. region to treat, detector format, patient, acquisition angles." | "passed"; results "support the safety and effectiveness of the product." |
| In-house Simulated Validation (device navigation workflow, Instructions for Use, safety mitigations) | "Protocols were created to address each clinical user need in the form of a device navigation workflow. Additional protocols were created to ensure that the Instructions for Use is written on the correct detail level as well as the verification of the effectiveness of the safety mitigations. The protocols were executed in a simulated clinical setting." | "passed"; results "support the safety and effectiveness of the product." |

2. Sample Size for the Test Set and Data Provenance

The document states:

  • Test Set Description: "Standard angiographic and fluoroscopy x-ray data" was used for algorithm verification and expert opinion validation, allowing for evaluation of "a wide range of variance e.g. region to treat, detector format, patient, acquisition angles."
  • Sample Size: The exact number of cases or data points in the test set is not specified.
  • Data Provenance: The document does not explicitly state the country of origin or whether the data was retrospective or prospective. It only mentions "standard angiographic and fluoroscopy x-ray data."

3. Number of Experts Used to Establish Ground Truth for the Test Set and Their Qualifications

  • Number of Experts: The document states that "certified interventional cardiologists" participated in the Expert Opinion Validation. Although it refers to "the expert" in the singular, the plural wording suggests more than one evaluator was involved; the exact number of experts is not specified.
  • Qualifications: "Certified interventional cardiologists." No specific years of experience are mentioned.
  • For In-house Simulated Validation: "Experienced Clinical Marketing Specialists with clinical knowledge gained from work experience and hospital visits." These test participants "have experience in the relevant clinical area and therefore are considered equivalent to the intended operator profiles."

4. Adjudication Method for the Test Set

The document does not describe a formal adjudication method (e.g., 2+1, 3+1, none) for the test set. The "Expert Opinion Validation" involved experts evaluating performance, but the process for resolving disagreements or establishing a single ground truth from multiple expert opinions is not detailed.

5. Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study

  • Was it done? No, the document does not mention a multi-reader multi-case (MRMC) comparative effectiveness study.
  • Effect Size: Therefore, no effect size of AI assistance versus without AI assistance is reported.

6. Standalone Performance Study (Algorithm Only)

Yes, a standalone performance assessment of the algorithm was part of the "Algorithm Verification Testing." This testing was performed "with standard angiographic and fluoroscopy x-ray data to ensure sufficient functioning of the algorithms." It specifically focused on "registration of the catheter tip and the guide wire," implying an evaluation of the algorithm's output independently.
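
As a way to picture what such an indirect registration verification could look like, here is a hypothetical sketch: it measures the distance between annotated device-tip positions on live frames and the overlaid roadmap centerline, and checks the mean against a tolerance. The data structures, the tolerance value, and the pass/fail rule are all assumptions for illustration, not the protocol or acceptance threshold used in the submission.

```python
# Hypothetical sketch of an indirect registration check -- not the
# protocol or threshold actually used in the submission. Assumes annotated
# 2D device-tip positions per frame and the roadmap centerline as a point set.
import numpy as np

def tip_to_roadmap_error(tip_xy: np.ndarray, centerline_xy: np.ndarray) -> float:
    """Distance (pixels) from one annotated tip position to the nearest roadmap point."""
    d = np.linalg.norm(centerline_xy - tip_xy, axis=1)
    return float(d.min())

def registration_passes(tips: np.ndarray, centerline: np.ndarray,
                        tol_px: float = 2.0) -> bool:
    """Pass if the mean per-frame tip-to-roadmap distance stays within a tolerance.

    tol_px is an arbitrary illustrative tolerance, not a cleared acceptance value.
    """
    errors = [tip_to_roadmap_error(t, centerline) for t in tips]
    return float(np.mean(errors)) <= tol_px

# Toy usage with fabricated coordinates.
centerline = np.stack([np.linspace(50, 450, 400), np.full(400, 205.0)], axis=1)
tips = np.array([[100.0, 206.0], [180.0, 204.5], [260.0, 205.8]])   # annotated tip positions
print(registration_passes(tips, centerline))
```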

7. Type of Ground Truth Used

The concept of "ground truth" seems to be established through:

  • Expert Consensus/Opinion: For the "Expert Opinion Validation," the performance of the device was evaluated against the assessment of certified interventional cardiologists in a simulated environment. This implies that their collective judgment served as the benchmark.
  • Internal Validation/Simulated Clinical Setting: For the "In-house Simulated Validation," established protocols representing clinical user needs and safety mitigations were used, with evaluation performed by experienced clinical marketing specialists. This suggests a pre-defined set of expected outcomes or correct performance against which the device was measured.
  • System Requirements/Risk Control Measures: For "General Verification Testing," the device's performance was measured against "system level requirements" and "identified risk control measures."

8. Sample Size for the Training Set

The document does not specify the sample size used for the training set. It focuses on verification and validation data.

9. How the Ground Truth for the Training Set Was Established

The document does not describe how the ground truth for any training set was established, as it does not elaborate on the development or training of the algorithms, only their verification and validation.

§ 892.1650 Image-intensified fluoroscopic x-ray system.

(a) Identification. An image-intensified fluoroscopic x-ray system is a device intended to visualize anatomical structures by converting a pattern of x-radiation into a visible image through electronic amplification. This generic type of device may include signal analysis and display equipment, patient and equipment supports, component parts, and accessories.

(b) Classification. Class II (special controls). An arthrogram tray or radiology dental tray intended for use with an image-intensified fluoroscopic x-ray system only is exempt from the premarket notification procedures in subpart E of part 807 of this chapter subject to the limitations in § 892.9. In addition, when intended as an accessory to the device described in paragraph (a) of this section, the fluoroscopic compression device is exempt from the premarket notification procedures in subpart E of part 807 of this chapter subject to the limitations in § 892.9.