K Number
K243842
Date Cleared
2025-03-06 (83 days)

Product Code
Regulation Number
870.1425
Panel
CV
Reference & Predicate Devices
Intended Use

The IOPS (Intra-Operative Positioning System) is intended for the evaluation of vascular anatomy as captured via 3D modeling from previously acquired scan data. It is intended for real-time tip positioning and navigation using sensor-equipped compatible catheters and guidewires used in endovascular interventions in the peripheral, aortic, and aortic side branch vasculature. The system is indicated for use as an adjunct to fluoroscopy. The IOPS does not make a diagnosis.

Device Description

The Intra-Operative Positioning System (IOPS) consists of a surgical navigation technology and a number of associated accessories. The navigation technology is a non-contact, reusable, multi-patient-use device. The associated accessories are single-use devices provided sterile (EtO).

The IOPS displays the position and orientation of sensor-equipped catheters, guidewires, and the tracking pad using electromagnetic tracking technology. The system enables mapping of the patient's vascular system from previously acquired CT scan data. IOPS registers the location and orientation of the sensors in real time, superimposing the navigated catheters and guidewires onto the patient's vascular map displayed on a monitor. The system is for use as an adjunct to fluoroscopy and does not make a diagnosis.
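
The summary does not describe the registration method itself. As a purely illustrative, hedged sketch, the generic point-based rigid registration technique (Kabsch/SVD) below shows one way tracker-frame sensor readings could be mapped onto a CT-derived model frame; the fiducial coordinates, the catheter-tip reading, and the function name `rigid_registration` are hypothetical and not drawn from the submission.

```python
# Hypothetical illustration of point-based rigid registration (Kabsch/SVD),
# a generic technique for mapping electromagnetic-tracker coordinates onto a
# CT-derived model frame. Not taken from the 510(k) submission.
import numpy as np

def rigid_registration(tracker_pts: np.ndarray, model_pts: np.ndarray):
    """Least-squares rigid transform (R, t) mapping tracker_pts -> model_pts.

    Both arrays are Nx3 with corresponding rows (e.g., fiducials on the
    tracking pad located both by the tracker and in the CT scan).
    """
    c_t = tracker_pts.mean(axis=0)
    c_m = model_pts.mean(axis=0)
    H = (tracker_pts - c_t).T @ (model_pts - c_m)      # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))             # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_m - R @ c_t
    return R, t

# Example: map a live sensor (catheter-tip) reading into the model frame.
tracker_fiducials = np.array([[0, 0, 0], [50, 0, 0], [0, 50, 0], [0, 0, 50]], float)
true_R = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)   # simulated ground truth
model_fiducials = tracker_fiducials @ true_R.T + np.array([10.0, 20.0, -5.0])
R, t = rigid_registration(tracker_fiducials, model_fiducials)
tip_tracker = np.array([12.0, 3.0, 40.0])              # tip position from the EM sensor
tip_model = R @ tip_tracker + t                        # overlay point on the 3D vascular map
print(np.round(tip_model, 2))
```

In practice, the residual error of such a registration would itself be characterized on a phantom before any navigation overlay is relied upon.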

The associated accessories include:

  • Guidewire
  • Catheters
  • Fiducial Tracking Pad
  • Guidewire Handle
AI/ML Overview

The provided text is a 510(k) summary for the Intra-Operative Positioning System (IOPS®). It outlines the device, its indications for use, and a comparison to predicate devices, focusing on the justification for substantial equivalence. However, it does not contain a detailed study report with specific acceptance criteria and performance data in the structured format requested.

The document primarily focuses on regulatory approval based on demonstrating substantial equivalence to existing devices, highlighting that changes in indications for use and software versions do not raise new questions of safety or effectiveness. It states that "Software verification was conducted to establish equivalency to the predicate device in safety and effectiveness" and "Design Validation, including usability testing, was performed to support the inclusion of all peripheral, aortic and aortic side branch vasculature vessels in the indications for use statement and to establish equivalency to the predicate device in safety and effectiveness."

Without a specific performance study report, I cannot provide a table of acceptance criteria and reported device performance, nor can I detail the sample sizes, data provenance, expert qualifications, or adjudication methods for a test set, nor can I provide information on MRMC studies or effect sizes. The document explicitly states "No animal testing was completed as a part of this submission," and "No clinical testing was completed as a part of this submission."

Therefore, based solely on the provided text, a comprehensive answer to your request is not possible. The information regarding acceptance criteria and performance data, as typically found in a detailed study, is not present here.

However, I can interpret what is provided and explain why the specific details you're asking for are missing from this type of FDA document:

This document is an FDA 510(k) clearance letter and summary. Its purpose is to demonstrate "substantial equivalence" of a new device to an already legally marketed predicate device, not necessarily to present a full, detailed clinical or performance study with all the metrics you've requested.

The "performance data" section states:

  • Bench Testing: "Software verification was conducted to establish equivalency to the predicate device in safety and effectiveness." "Design Validation, including usability testing, was performed to support the inclusion of all peripheral, aortic and aortic side branch vasculature vessels in the indications for use statement and to establish equivalency to the predicate device in safety and effectiveness." It mentions that "physician targeting of three regions for cannulation" and "technicians were given test case scenarios and asked to place the tracking pad" were part of these studies.
  • Animal Testing: "No animal testing was completed as a part of this submission."
  • Clinical Testing: "No clinical testing was completed as a part of this submission."

This implies that the "proof" for meeting acceptance criteria was primarily based on:

  1. Software Verification: Ensuring the new software version functions correctly and reliably, likely against pre-defined software requirements and potentially compared to the predicate's software performance. The acceptance criteria would be internal software validation metrics (e.g., bug rates, functional correctness tests).
  2. Design Validation / Usability Testing (Benchtop): Assessing if the device, with its expanded indications, can be used safely and effectively as intended. This would involve simulated use cases. Acceptance criteria would likely relate to successful navigation/targeting within specified tolerances, ease of use, and lack of critical errors.
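
The summary provides no quantitative thresholds, so the following is only a hedged sketch of how a benchtop targeting-accuracy criterion of the kind described in item 2 above might be evaluated; the 2.0 mm tolerance, the phantom target coordinates, and the helper `targeting_errors` are invented placeholders rather than values or methods from the submission.

```python
# Hypothetical sketch of a benchtop acceptance check: compare navigated tip
# positions against known phantom targets and flag whether every trial falls
# within a tolerance. The 2.0 mm threshold and the data are illustrative
# placeholders, not values from the submission.
import numpy as np

TOLERANCE_MM = 2.0  # assumed placeholder; the 510(k) summary states no threshold

def targeting_errors(navigated: np.ndarray, targets: np.ndarray) -> np.ndarray:
    """Euclidean distance (mm) between each navigated tip position and its target."""
    return np.linalg.norm(navigated - targets, axis=1)

# Simulated trials: rows are (x, y, z) in mm for three cannulation regions.
targets = np.array([[10.0, 5.0, 2.0], [40.0, 12.0, 8.0], [75.0, 3.0, 20.0]])
navigated = targets + np.array([[0.4, -0.3, 0.8], [1.1, 0.2, -0.5], [-0.6, 0.9, 0.3]])

errors = targeting_errors(navigated, targets)
print("per-target error (mm):", np.round(errors, 2))
print("acceptance met:", bool(np.all(errors <= TOLERANCE_MM)))
```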

Based on the available text, here's what can be inferred or stated directly, along with what is missing:


Acceptance Criteria and Device Performance (Inferred/General from Text):

| Acceptance Criteria Category | Reported Device Performance / Justification |
| --- | --- |
| Overall Equivalence | "The successful completion of non-clinical testing demonstrates that IOPS performs as intended and is substantially equivalent to the predicate device." |
| Software Performance | "Software verification was conducted to establish equivalency to the predicate device in safety and effectiveness." Specific metrics (e.g., accuracy, precision, latency) are not provided in this summary. |
| Design Validation / Usability (for expanded indications) | "Design Validation, including usability testing, was performed to support the inclusion of all peripheral, aortic and aortic side branch vasculature vessels in the indications for use statement and to establish equivalency to the predicate device in safety and effectiveness." "These studies included physician targeting of three regions for cannulation: branches of the aortic arch, distal branches of the descending aorta, and branches of the peripheral vasculature." "In addition, technicians were given test case scenarios and asked to place the tracking pad in each of the three proposed expanded indications regions." Specific performance outcomes (e.g., targeting accuracy, success rates) and quantitative acceptance thresholds are not detailed in this summary. |
| Safety | "There are no hardware changes to IOPS and its associated accessories and therefore Electrical, Mechanical and Thermal Safety Testing is not needed." "The change in the indications for use does not raise new or different questions of safety or effectiveness." |

Detailed Study Information (Based on provided text):

  1. A table of acceptance criteria and the reported device performance:

    • As shown above, the document infers acceptance criteria from the statements about "software verification" and "design validation/usability testing" for "equivalency" and "safety and effectiveness" for the expanded indications.
    • Specific quantitative acceptance criteria (e.g., "accuracy > X mm") and their corresponding reported values are NOT detailed in this summary document. This level of detail would typically be found in the full submission, not the public-facing summary.
  2. Sample sizes used for the test set and the data provenance:

    • Sample Size: Not specified for the software verification or design validation/usability testing. It mentions "physician targeting of three regions" and "technicians were given test case scenarios," implying a small, representative sample, likely for a benchtop study, but no numbers are given.
    • Data Provenance: Implied to be prospective benchtop testing conducted specifically for this submission. Country of origin is not explicitly stated, but the company is US-based.
  3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts:

    • The document refers to "physician targeting" and "technicians." It does not specify the number or detailed qualifications beyond "physician" and "technician."
    • No "ground truth" in the sense of a diagnostic consensus is mentioned, as the device "does not make a diagnosis." The "ground truth" for these performance tests would likely be the known anatomical targets in the 3D models and potentially physical measurements of the device's accuracy during tracking.
  4. Adjudication method (e.g. 2+1, 3+1, none) for the test set:

    • Not applicable/Not mentioned. The testing described is operational/performance testing, not diagnostic interpretation requiring adjudication among readers.
  5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI vs. without AI assistance:

    • No MRMC study was done, nor is the device described as an AI or diagnostic device that assists human readers. The device is a "real time tip positioning and navigation system" that is an "adjunct to fluoroscopy." It does not involve human readers interpreting images with or without AI assistance.
  6. If a standalone (i.e., algorithm-only, without human-in-the-loop) performance assessment was done:

    • The "Software verification" and underlying accuracy measurements (though not detailed here) would represent the standalone performance of the algorithm for tracking and positioning without human interaction, but its intended use is always human-in-the-loop as a navigation aid.
  7. The type of ground truth used (expert consensus, pathology, outcomes data, etc):

    • For this type of navigation device, the "ground truth" would be the physical accuracy of the electromagnetic tracking system relative to a known model or phantom, and the real-time overlay accuracy onto pre-acquired 3D CT scan data. This is an engineering and spatial-accuracy ground truth, not a diagnostic or pathological ground truth as seen in imaging AI; a minimal sketch of such a spatial-accuracy check appears after this list.
  8. The sample size for the training set:

    • Not applicable/Not mentioned as this device is not described as a machine learning/AI device that requires a training set of images or data in the conventional sense. Its function is based on electromagnetic tracking and anatomical registration, not on learning from a large dataset to identify patterns or make predictions.
  9. How the ground truth for the training set was established:

    • Not applicable, as there is no mention of a training set or machine learning components requiring labeled data.
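
To make the notion of an engineering/spatial ground truth concrete, here is a minimal hypothetical sketch of measuring tracking error against a phantom whose marker positions are known exactly, assuming the tracker frame has already been registered to the phantom frame. The coordinates, the simulated noise level, and the FRE/TRE labels are illustrative assumptions, not measurements from the submission.

```python
# Hypothetical sketch of the engineering/spatial ground truth described in item 7:
# fiducial registration error (FRE) and target registration error (TRE) computed
# against a phantom with exactly known marker positions. All values are simulated.
import numpy as np

def rms_error(measured: np.ndarray, truth: np.ndarray) -> float:
    """Root-mean-square Euclidean error (mm) between measured and true points."""
    return float(np.sqrt(np.mean(np.sum((measured - truth) ** 2, axis=1))))

rng = np.random.default_rng(0)
phantom_fiducials = np.array([[0, 0, 0], [60, 0, 0], [0, 60, 0], [0, 0, 60]], float)
phantom_targets = np.array([[20, 20, 10], [35, 10, 45]], float)  # vessel-branch surrogates

# Tracked positions = phantom ground truth + simulated EM measurement noise,
# assuming tracker and phantom frames are already registered to each other.
tracked_fiducials = phantom_fiducials + rng.normal(0, 0.5, phantom_fiducials.shape)
tracked_targets = phantom_targets + rng.normal(0, 0.5, phantom_targets.shape)

print("FRE (mm):", round(rms_error(tracked_fiducials, phantom_fiducials), 2))
print("TRE (mm):", round(rms_error(tracked_targets, phantom_targets), 2))
```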

§ 870.1425 Programmable diagnostic computer.

(a) Identification. A programmable diagnostic computer is a device that can be programmed to compute various physiologic or blood flow parameters based on the output from one or more electrodes, transducers, or measuring devices; this device includes any associated commercially supplied programs.

(b) Classification. Class II (performance standards).