
510(k) Data Aggregation

    K Number
    K172307
    Date Cleared
    2017-10-24

    (85 days)

    Product Code
    Regulation Number
    892.1650
    Reference & Predicate Devices
    Device Name:

    Dynamic Coronary Roadmap 2.0

    Intended Use

    Dynamic Coronary Roadmap is intended to assist the physician during percutaneous coronary interventions in correlating the device position to the coronary vasculature, by providing a motion compensated overlay of this coronary vasculature.

    The FFR/iFR Roadmap feature is intended to assist the physician during percutaneous coronary interventions in relating the intravascular blood pressure measurement to its anatomical location. FFR/iFR roadmap visualizes the position of the pressure wire and the coronary artery on an X-ray image at the moment that an intravascular blood pressure measurement was performed, as well as the intravascular blood pressure measurement values themselves.

    Dynamic Coronary Roadmap is suitable for use with the entire adult human population.

    Device Description

    Dynamic Coronary Roadmap is a software medical device intended to provide a real-time and dynamic angiographic roadmap of coronary arteries.

    The angiographic roadmap is automatically generated from previously acquired diagnostic coronary angiograms during the same procedure.

    Dynamic Coronary Roadmap overlays the angiographic roadmap on live 2D fluoroscopic images, thereby assisting the physician in navigating devices, e.g. (guide) wires, catheters, through the coronary arteries.

    Dynamic Coronary Roadmap is to be used in combination with a Philips Interventional X-ray system.

    When also used in conjunction with a compatible intravascular blood pressure measurement system, Dynamic Coronary Roadmap offers an FFR / iFR Roadmap feature. This feature co-registers the information of the blood pressure within a coronary artery with the corresponding X-ray image of the pressure wire within that coronary artery.
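The overlay mechanics described above (a vessel roadmap shifted by an estimated motion offset and blended onto the live fluoroscopic frame) can be illustrated with a minimal sketch. This is not Philips' implementation; the function name, the translation-only motion model, and the blending scheme are simplifying assumptions for illustration:

```python
import numpy as np

def overlay_roadmap(live_frame, roadmap_mask, offset, alpha=0.5):
    """Blend a motion-compensated roadmap mask onto a live fluoro frame.

    live_frame   : 2-D grayscale array (values in 0..1)
    roadmap_mask : 2-D binary array marking vessel pixels
    offset       : (dy, dx) integer translation, e.g. from motion tracking
    alpha        : blend weight of the roadmap highlight
    """
    dy, dx = offset
    h, w = roadmap_mask.shape
    shifted = np.zeros_like(roadmap_mask)
    # Shift the mask by (dy, dx), filling exposed borders with zeros.
    dst_y = slice(max(dy, 0), h + min(dy, 0))
    dst_x = slice(max(dx, 0), w + min(dx, 0))
    src_y = slice(max(-dy, 0), h + min(-dy, 0))
    src_x = slice(max(-dx, 0), w + min(-dx, 0))
    shifted[dst_y, dst_x] = roadmap_mask[src_y, src_x]
    # Alpha-blend a bright highlight wherever the shifted roadmap is set.
    out = live_frame.copy()
    out[shifted > 0] = (1 - alpha) * out[shifted > 0] + alpha * 1.0
    return out
```

In a real system the offset would be re-estimated per frame (e.g. from cardiac and respiratory motion), and the roadmap would be a full vessel rendering rather than a binary mask.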

    AI/ML Overview

    Here's a breakdown of the acceptance criteria and study information for the Dynamic Coronary Roadmap 2.0 device, based on the provided text:

    1. Table of Acceptance Criteria and Reported Device Performance

    Acceptance Criteria Category | Specific Criteria | Reported Device Performance
    Software Verification | Implementation of all System Requirements Specification. | All executed tests passed.
     | Implementation of identified safety risk control measures. | All executed tests passed.
     | Implementation of Privacy and Security requirements. | All executed tests passed.
    Usability Validation | Predefined criteria for mean task completion rates. | Exceeded predefined criteria.
     | Predefined criteria for system usability scores. | Exceeded predefined criteria.
     | Predefined criteria for net promoter scores. | Exceeded predefined criteria.
    Expert Opinion Validation | Predefined acceptance criteria for expert analysis of preclinical datasets. | Actual acceptance scores exceeded predefined acceptance criteria.
    In-house Simulated Use Validation | Successful execution of validation protocols for device navigation workflow, user needs, intended use, and safety/security effectiveness. | All executed validation protocols were passed.
    Compliance with Consensus Standards | Adherence to a list of specified FDA-recognized consensus standards (e.g., IEC 62304, IEC 62366-1, ISO 14971, ISO 15223-1, NEMA PS 3.1-3.20). | Demonstrates compliance.
    Compliance with FDA Guidance Documents | Adherence to a list of specified FDA guidance documents. | Demonstrates compliance.
    Safety and Effectiveness | Comparable safety and effectiveness to the predicate device. | Substantially equivalent; does not raise new safety/effectiveness concerns.
    Intended Use | Conforms to the stated intended use. | Conforms to intended use.
    User Needs and Claims | Conforms to user needs and claims. | Conforms to user needs and claims.

    2. Sample Size Used for the Test Set and Data Provenance

    The document does not explicitly state a specific "sample size" in terms of number of cases or images for the validation studies. It refers to:

    • Usability validation: Performed with certified interventional cardiologists in a simulated environment.
    • Expert opinion validation: Experts analyzed "a wide range of pre-clinical datasets" in a simulated environment.
    • In-house simulated use validation: Performed with "experienced Clinical Marketing specialists."

    Data Provenance: The studies were conducted in a simulated environment using pre-clinical datasets for the expert opinion validation. No information is provided about the country of origin of the data or whether it was retrospective or prospective.

    3. Number of Experts Used to Establish the Ground Truth for the Test Set and Qualifications of Those Experts

    • Usability validation: "Certified interventional cardiologists" were used. The number is not specified.
    • Expert opinion validation: "Certified interventional cardiologists" were used. The number is not specified.
    • In-house simulated use validation: "Experienced Clinical Marketing specialists" were used, who fulfill the intended user profile based on clinical knowledge gained from work experience and hospital visits. The number is not specified.

    The specific "ground truth" establishment for a test set (e.g., expert consensus) isn't detailed for a diagnostic-style performance evaluation, as this is an assistive medical device rather than a diagnostic one. The focus is on usability, safety, and effectiveness in assisting physicians.

    4. Adjudication Method for the Test Set

    No specific adjudication method (e.g., 2+1, 3+1) is mentioned. The validation approaches (usability, expert opinion, in-house simulated use) suggest a qualitative assessment rather than a quantitative ground truth adjudicated by multiple experts for diagnostic performance.

    5. Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study

    No MRMC comparative effectiveness study was mentioned. The device's substantial equivalence was demonstrated through non-clinical performance testing and comparison to a predicate device, not by showing human readers improve with AI assistance. The device is for assistance, not a standalone diagnostic tool.

    6. Standalone (Algorithm Only Without Human-in-the-Loop Performance) Study

    A standalone performance study of the algorithm's diagnostic capabilities was not conducted or reported. The device is explicitly designed as an assistance tool for physicians during percutaneous coronary interventions, providing a "motion compensated overlay" and "visualization of the pressure wire." The testing focused on its function as an assistive device with human interaction (usability, expert opinion, simulated use).

    The document states: "Algorithm verification was not warranted as there are no algorithms implemented for the FFR/iFR Roadmap feature and no changes have been made to the existing Roadmapping algorithm that impacted its performance compared to the currently marketed predicate device Dynamic Coronary Roadmap 1.0 (K170130)." This indicates that the core roadmapping algorithm was considered established from the predicate device and did not require re-verification, and the new FFR/iFR feature didn't involve new algorithms in the same way.

    7. The Type of Ground Truth Used

    The "ground truth" for the performance evaluation can be best characterized as:

    • User Performance Metrics: For usability testing (task completion rates, usability scores, net promoter scores).
    • Expert Opinion/Consensus: For expert opinion validation, where certified interventional cardiologists analyzed preclinical datasets against predefined acceptance criteria.
    • Conformance to User Needs/Intended Use: For in-house simulated use validation, ensuring the device met its intended purpose and user requirements.
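The usability metrics listed above (mean task completion rate, system usability scores, net promoter scores) are standard survey measures rather than anything specific to this submission. A minimal sketch of how two of them are typically computed, using hypothetical data and function names:

```python
def net_promoter_score(ratings):
    """NPS on 0-10 ratings: % promoters (9-10) minus % detractors (0-6)."""
    n = len(ratings)
    promoters = sum(r >= 9 for r in ratings)
    detractors = sum(r <= 6 for r in ratings)
    return 100.0 * (promoters - detractors) / n

def mean_task_completion(results):
    """Mean completion rate over per-task pass/fail outcomes."""
    return sum(results) / len(results)

# Hypothetical session data from a simulated-use study.
nps = net_promoter_score([10, 9, 8, 3])          # -> 25.0
rate = mean_task_completion([True, True, False, True])  # -> 0.75
```

Acceptance would then amount to comparing such values against the predefined thresholds the document mentions (which it does not disclose).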

    This is distinct from a ground truth established for diagnostic accuracy, which would typically involve pathology, clinical outcomes, or comprehensive expert consensus on findings.

    8. Sample Size for the Training Set

    The document does not provide any information about the sample size used for a training set. Given that "Algorithm verification was not warranted" and no changes were made to the core roadmapping algorithm, it's implied that any algorithm training for the core functionality would have occurred for the predicate device (Dynamic Coronary Roadmap 1.0) and is not discussed here for 2.0. The new FFR/iFR feature is described as co-registration and visualization, rather than a new machine learning algorithm requiring a separate training set.

    9. How the Ground Truth for the Training Set Was Established

    Since no information is provided about a training set for Dynamic Coronary Roadmap 2.0's new features, or retraining of existing algorithms, the method for establishing ground truth for a training set is not detailed in this submission.


    K Number
    K170130
    Date Cleared
    2017-06-02

    (140 days)

    Product Code
    Regulation Number
    892.1650
    Reference & Predicate Devices
    Device Name:

    Dynamic Coronary Roadmap

    Intended Use

    Dynamic Coronary Roadmap is intended to assist the physician during percutaneous coronary interventions in correlating the device position to the coronary vasculature, by providing a motion compensated overlay of this coronary vasculature. Dynamic Coronary Roadmap is suitable for use with the entire adult human population.

    Dynamic Coronary Roadmap is a software medical device and does not come in contact with a human subject.

    Device Description

    The Dynamic Coronary Roadmap is a software medical device intended to provide a real-time and dynamic angiographic roadmap of coronary arteries. The angiographic roadmap is automatically generated from previously acquired diagnostic coronary angiograms during the same procedure.

    Dynamic Coronary Roadmap uses coronary angiograms, acquired during a PCI procedure, to automatically generate a dynamic angiographic roadmap of the coronary vasculature. This roadmap is then overlaid on the live fluoroscopy images during device navigation. Dynamic Coronary Roadmap works in combination with a Philips interventional X-ray system. The user interface of Dynamic Coronary Roadmap guides the physician through the workflow and minimal additional user interaction from the tableside is required. The following design features support the physician with this:

    • Dynamic angiographic roadmap creation; this technique allows the physician to automatically construct a 2D dynamic angiographic roadmap of the coronary vasculature from a diagnostic coronary angiogram.
    • Live guidance; this technique provides continuous overlay of the dynamic angiographic roadmap on live fluoroscopic images.
    • X-ray system integration; this provides the physician with a seamless integration with the Philips interventional X-ray system. The clinical product supports:
      • Automatic power ON or OFF; this allows the software medical device to always be available by automatically powering ON and OFF with the X-ray system.
      • 3D Automatic Position Control (APC); this allows the C-arm to automatically move to a nearby available dynamic angiographic roadmap to be able to reuse this for live guidance.
      • Table-side control; this provides the physician with an efficient workflow during interventional procedures. The most frequently used functions that require additional user interaction beyond the normal X-ray system interaction can be controlled from the tableside of the X-ray system.
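The 3D Automatic Position Control behaviour described above, where the C-arm moves to a nearby available roadmap for reuse, amounts to a nearest-neighbour search over acquisition geometry. A minimal sketch under the assumption that each stored roadmap is tagged with its (rotation, angulation) pair; the data layout and function name are hypothetical:

```python
import math

def nearest_roadmap(current_angles, stored_roadmaps):
    """Pick the stored roadmap whose acquisition angles lie closest to the
    current C-arm position, by Euclidean distance in (rotation, angulation)
    degrees. `stored_roadmaps` is a list of dicts with an "angles" key."""
    def angular_dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return min(stored_roadmaps,
               key=lambda rm: angular_dist(rm["angles"], current_angles))

# Hypothetical usage: two stored roadmaps, C-arm currently near the first.
stored = [{"id": "LAO30", "angles": (30.0, 0.0)},
          {"id": "RAO10", "angles": (-10.0, 20.0)}]
best = nearest_roadmap((28.0, 2.0), stored)  # -> the "LAO30" roadmap
```

A production system would additionally gate on whether the angular distance is small enough for the roadmap to remain geometrically valid.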
    AI/ML Overview

    Here's a breakdown of the acceptance criteria and the study information for the Dynamic Coronary Roadmap, based on the provided text:

    1. Table of Acceptance Criteria and Reported Device Performance

    The document does not explicitly present a table of quantitative acceptance criteria with corresponding performance metrics. Instead, it describes general categories of testing and validation, concluding that the device "passed" and "conforms" to requirements.

    Acceptance Criteria Category | Reported Device Performance
    General Verification Testing (system-level requirements, risk control, privacy, security, functional and non-functional aspects) | "Software verification has been performed to cover the system level requirements... as well as the identified risk control measures... and the Privacy and Security requirements. These protocols address functional and non-functional aspects of DCR such as reliability, performance and interoperability." Outcome: "passed" and "support the safety and effectiveness of the product."
    Algorithm Verification Testing (registration of catheter tip and guide wire) | "Dedicated algorithm verification testing has been performed with standard angiographic and fluoroscopy x-ray data to ensure sufficient functioning of the algorithms. An indirect registration verification was performed focusing only on the registration of the catheter tip and the guide wire." Outcome: "passed" and "support the safety and effectiveness of the product."
    Usability Validation (intended use, user needs, claims) | "Performed with representative intended users in a simulated environment." Outcome: "passed" and "support the safety and effectiveness of the product."
    Expert Opinion Validation (evaluation of variance, e.g., region, detector, patient, angles) | "Performed in a simulated environment with certified interventional cardiologists. Standard angiographic and fluoroscopy x-ray data was used to allow the expert to evaluate a wide range of variance e.g. region to treat, detector format, patient, acquisition angles." Outcome: "passed" and "support the safety and effectiveness of the product."
    In-house Simulated Validation (device navigation workflow, Instructions for Use, safety mitigations) | "Protocols were created to address each clinical user need in the form of a device navigation workflow. Additional protocols were created to ensure that the Instructions for Use is written on the correct detail level as well as the verification of the effectiveness of the safety mitigations. The protocols were executed in a simulated clinical setting." Outcome: "passed" and "support the safety and effectiveness of the product."

    2. Sample Size for the Test Set and Data Provenance

    The document states:

    • Test Set Description: "Standard angiographic and fluoroscopy x-ray data" was used for algorithm verification and expert opinion validation, allowing for evaluation of "a wide range of variance e.g. region to treat, detector format, patient, acquisition angles."
    • Sample Size: The exact number of cases or data points in the test set is not specified.
    • Data Provenance: The document does not explicitly state the country of origin or whether the data was retrospective or prospective. It only mentions "standard angiographic and fluoroscopy x-ray data."

    3. Number of Experts Used to Establish Ground Truth for the Test Set and Their Qualifications

    • Number of Experts: The document states that "certified interventional cardiologists" participated in the Expert Opinion Validation. While it mentions "the expert" (singular) evaluating, it implicitly suggests a group of experts due to the nature of validation. However, the exact number of experts is not specified.
    • Qualifications: "Certified interventional cardiologists." No specific years of experience are mentioned.
    • For In-house Simulated Validation: "Experienced Clinical Marketing Specialists with clinical knowledge gained from work experience and hospital visits." These test participants "have experience in the relevant clinical area and therefore are considered equivalent to the intended operator profiles."

    4. Adjudication Method for the Test Set

    The document does not describe a formal adjudication method (e.g., 2+1, 3+1, none) for the test set. The "Expert Opinion Validation" involved experts evaluating performance, but the process for resolving disagreements or establishing a single ground truth from multiple expert opinions is not detailed.

    5. Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study

    • Was it done? No, the document does not mention a multi-reader multi-case (MRMC) comparative effectiveness study.
    • Effect Size: Therefore, no effect size of AI assistance versus without AI assistance is reported.

    6. Standalone Performance Study (Algorithm Only)

    Yes, a standalone performance assessment of the algorithm was part of the "Algorithm Verification Testing." This testing was performed "with standard angiographic and fluoroscopy x-ray data to ensure sufficient functioning of the algorithms." It specifically focused on "registration of the catheter tip and the guide wire," implying an evaluation of the algorithm's output independently.
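The indirect registration verification described above, focused on the catheter tip and guide wire, would in its simplest form reduce to measuring how far an annotated device position lies from the overlaid roadmap. A hypothetical sketch of such a metric (the function name and the point-to-centerline distance measure are assumptions, not the submission's actual protocol):

```python
import math

def registration_error(tip_xy, roadmap_points):
    """Distance in pixels from an annotated catheter-tip position to the
    nearest point on the overlaid roadmap centerline.

    tip_xy         : (x, y) annotated device position on the live frame
    roadmap_points : iterable of (x, y) points sampled along the roadmap
    """
    return min(math.dist(tip_xy, p) for p in roadmap_points)

# Hypothetical check: error against a two-point centerline sample.
err = registration_error((0.0, 0.0), [(3.0, 4.0), (10.0, 0.0)])  # -> 5.0
```

A verification campaign would aggregate such errors over many frames and acquisitions and compare them against a predefined tolerance, which this document does not disclose.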

    7. Type of Ground Truth Used

    The concept of "ground truth" seems to be established through:

    • Expert Consensus/Opinion: For the "Expert Opinion Validation," the performance of the device was evaluated against the assessment of certified interventional cardiologists in a simulated environment. This implies that their collective judgment served as the benchmark.
    • Internal Validation/Simulated Clinical Setting: For the "In-house Simulated Validation," established protocols representing clinical user needs and safety mitigations were used, with evaluation performed by experienced clinical marketing specialists. This suggests a pre-defined set of expected outcomes or correct performance against which the device was measured.
    • System Requirements/Risk Control Measures: For "General Verification Testing," the device's performance was measured against "system level requirements" and "identified risk control measures."

    8. Sample Size for the Training Set

    The document does not specify the sample size used for the training set. It focuses on verification and validation data.

    9. How the Ground Truth for the Training Set Was Established

    The document does not describe how the ground truth for any training set was established, as it does not elaborate on the development or training of the algorithms, only their verification and validation.

