510(k) Data Aggregation (532 days)
The Cardiovascular Suite 4.2.1 is a software program intended to aid trained healthcare professionals in the quantitative analysis of vascular ultrasound images in adults, particularly for the measurement of the brachial artery diameter and its changes, the carotid artery diameter and its changes, the carotid intima-media thickness (IMT), and for carotid plaque analysis.
The Cardiovascular Suite 4.2.1 is software indicated for estimating early cardiovascular parameters by identifying and tracking the edges of the arteries in sequences of ultrasound images, or in single images, of the longitudinal section of the vessel. The software consists of two main functional measurement modules: 1) FMD-Studio, which measures Flow-Mediated Dilation (FMD) of the brachial artery by processing sequences of ultrasound images; and 2) Carotid-Studio, which measures, by processing sequences of ultrasound images, the carotid intima-media thickness and the instantaneous carotid diameter, which, combined with a pressure estimate, can provide arterial elasticity parameters. On single images, the software also provides a tool for plaque measurement and quantification. The system can process previously recorded video files or directly process the video output of an ultrasound system in real time.
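The document does not disclose the actual edge-detection algorithm used by the software. Purely as a hypothetical sketch of the kind of processing such modules perform (Python/NumPy; the function names, the gradient-peak edge rule, and all parameters below are our own assumptions, not the vendor's method), the following locates the vessel walls along one image column, builds a per-frame diameter series, and derives FMD% and distension from it:

```python
import numpy as np

def wall_positions(column: np.ndarray, lumen_center: int) -> tuple[int, int]:
    """Locate the near and far vessel walls along one image column as the
    strongest intensity gradients above and below an approximate lumen
    center (bright wall -> dark lumen -> bright wall). Hypothetical sketch."""
    grad = np.gradient(column.astype(float))
    near = int(np.argmin(grad[:lumen_center]))                # bright-to-dark edge
    far = lumen_center + int(np.argmax(grad[lumen_center:]))  # dark-to-bright edge
    return near, far

def diameter_series(frames: np.ndarray, col: int, lumen_center: int,
                    mm_per_px: float) -> np.ndarray:
    """Per-frame lumen diameter (mm) measured on a single column."""
    diams = []
    for frame in frames:
        near, far = wall_positions(frame[:, col], lumen_center)
        diams.append((far - near) * mm_per_px)
    return np.array(diams)

def fmd_percent(diam: np.ndarray, n_baseline: int) -> float:
    """FMD% = 100 * (peak post-occlusion diameter - baseline diameter) / baseline."""
    baseline = diam[:n_baseline].mean()
    return 100.0 * (diam[n_baseline:].max() - baseline) / baseline

def distension(diam_one_cycle: np.ndarray) -> float:
    """Instantaneous diameter change (systolic - diastolic) over one cardiac cycle."""
    return float(diam_one_cycle.max() - diam_one_cycle.min())
```

The FMD% definition above is the conventional one; the distension value is what, combined with a pressure estimate, would feed the arterial elasticity parameters the description mentions.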
Here's a summary of the acceptance criteria and the study details for the Cardiovascular Suite 4.2.1, based on the provided document:
Acceptance Criteria and Device Performance
The acceptance criteria are implied by the precision and accuracy metrics evaluated during the validation testing. The device is considered to meet these criteria if its performance falls within acceptable ranges for repeatability and agreement with an expert's measurements. The table below summarizes the reported device performance, which serves as the fulfillment of these implied criteria.
Table of Acceptance Criteria and Reported Device Performance
| Measurement/Metric | Acceptance Criterion (implied by reported performance) | Reported Device Performance (Cardiovascular Suite 4.2.1) |
|---|---|---|
| FMD Studio Precision (coefficient of variation, CV) | | |
| Intra-observer intra-session FMD% variability | ≤ 10% | 9.9% ± 8.4% (reported as 10%) |
| Intra-observer inter-session FMD% variability | ≤ 13% | 12.9% ± 11.6% (reported as 13%) |
| Shear rate measurement precision | ≤ 2.3% (specifically stated) | 2.3% |
| Carotid Studio Precision (coefficient of variation, CV) | | |
| Intra-session diameter variation | ≤ 2% | 2% |
| Intra-session IMT | ≤ 6% | 6% |
| Inter-session diameter variation | ≤ 3% | 3% |
| Inter-session diameter variation (cardiac cycle) | ≤ 12% | 12% |
| Inter-session IMT | ≤ 6% | 6% |
| Plaque geometric and statistical measurements (single image) | ≤ 10% | < 10% for each measurement |
| Carotid Analyzer accuracy (IMT) | Bias ± SD (specifically stated) | 0.006 ± 0.039 mm |
| Carotid Analyzer accuracy (diameter) | Bias ± SD (specifically stated) | 0.060 ± 0.110 mm |
| Carotid Analyzer accuracy (distension) | Bias ± SD (specifically stated) | 0.016 ± 0.039 mm |
| FMD Analyzer accuracy (% diameter variation) | Error ≤ 0.013% (specifically stated) | 0.013% |
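The precision figures above are coefficients of variation (CV) over repeated measurements. The document does not give the exact formula used, but a CV is conventionally computed as 100 × SD / mean; a minimal sketch with made-up data (all values below are hypothetical, chosen only to illustrate the "mean ± SD of per-subject CVs" form such as "9.9% ± 8.4%"):

```python
import numpy as np

def coefficient_of_variation(repeats) -> float:
    """CV (%) of repeated measurements of one quantity: 100 * SD / mean."""
    repeats = np.asarray(repeats, dtype=float)
    return 100.0 * repeats.std(ddof=1) / repeats.mean()

# Hypothetical data: 10 subjects, each with 3 repeated FMD% measurements.
rng = np.random.default_rng(0)
all_repeats = rng.normal(loc=6.0, scale=0.5, size=(10, 3))

# Study-level summary as mean ± SD of the per-subject CVs.
subject_cvs = np.array([coefficient_of_variation(r) for r in all_repeats])
print(f"intra-session CV: {subject_cvs.mean():.1f}% ± {subject_cvs.std(ddof=1):.1f}%")
```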
Study Information
1. Sample sizes used for the test set and the data provenance:
- FMD Studio Precision: 135 healthy volunteers. The data provenance is from seven Italian centers (prospective study, country of origin: Italy).
- Carotid Studio Precision: 10 healthy volunteers. The provenance is not explicitly stated, but the design implies a controlled, prospective study.
- Compatible Ultrasound Devices Accuracy: No overall sample size is stated numerically; the document refers to a "full set of images" and to "each sub-set of images coming from each of the 15 devices." This suggests a test set of images from 15 different ultrasound devices, with a total of 120 carotid artery images (60 online/60 offline) and 120 brachial artery images (60 online/60 offline). The provenance is not explicitly stated.
2. Number of experts used to establish the ground truth for the test set and the qualifications of those experts:
- Carotid Studio Precision: Two Clinical Operators (Opr 1 and Opr 2) performed measurements. Their specific qualifications (e.g., "radiologist with 10 years of experience") are not provided.
- Compatible Ultrasound Devices Accuracy: An unspecified "expert" provided the gold-standard manual measurements. The specific qualifications of this expert are not provided.
3. Adjudication method for the test set:
- Carotid Studio Precision: For the "precision" part of the Carotid Studio module, Opr 1 and Opr 2 each measured vessels three times. Opr 1 repeated the analysis during a second session. This suggests direct comparison of individual measurements for variability, rather than a formal adjudication of a ground truth.
- Compatible Ultrasound Devices Accuracy: The device's measurements were "compared with gold-standard measurements manually obtained by an expert." This implies a direct comparison rather than a consensus or adjudication among multiple readers/experts.
4. Whether a multi-reader, multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI versus without AI assistance:
- No MRMC comparative effectiveness study is described in which human readers' performance with and without AI assistance is evaluated. The studies focus on the precision and accuracy of the device itself, or on comparison between the device and expert/gold-standard measurements.
5. Whether a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done:
- Yes. The performance testing described for both FMD Studio and Carotid Studio precision, and the accuracy comparisons for both modules, appear to evaluate the standalone performance of the Cardiovascular Suite 4.2.1 software. For instance, the "accuracy of the following measurement of the CVS software was evaluated" and "The measurements were carried out by our software and compared with gold-standard measurements manually obtained by an expert." This indicates standalone algorithm performance evaluation; a minimal sketch of this kind of agreement analysis follows this item.
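The bias ± SD accuracy figures in the table (e.g., 0.006 ± 0.039 mm for IMT) are agreement statistics of this software-versus-expert kind. A minimal sketch of how such a comparison is conventionally computed (the paired values below are hypothetical; the document does not specify the exact statistical procedure used):

```python
import numpy as np

def bias_and_sd(software, gold) -> tuple[float, float]:
    """Bland-Altman-style agreement: mean (bias) and SD of paired differences."""
    diff = np.asarray(software, float) - np.asarray(gold, float)
    return float(diff.mean()), float(diff.std(ddof=1))

# Hypothetical paired IMT readings (mm): software output vs. expert manual measurement.
sw     = np.array([0.61, 0.72, 0.55, 0.80, 0.66])
expert = np.array([0.60, 0.71, 0.56, 0.79, 0.67])
bias, sd = bias_and_sd(sw, expert)
print(f"IMT agreement: {bias:.3f} ± {sd:.3f} mm")
```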
6. The type of ground truth used (expert consensus, pathology, outcomes data, etc.):
- FMD Studio Accuracy: evaluated on "synthetic image sequences," for which the true measurement values are known by construction (a toy illustration follows this list).
- Carotid Studio Accuracy: agreement with an "RF-based gold standard" (likely referring to radiofrequency-based quantitative ultrasound measurements) and with manual measurements obtained by an expert.
- Precision Studies: the ground truth for the precision studies is the measurement itself; the focus is on consistency across repeated measurements by the same or different operators.
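Because a synthetic sequence is generated with a known true value, measurement error can be computed exactly. A toy illustration of that validation idea (entirely our own construction; the document does not describe how its synthetic sequences were generated):

```python
import numpy as np

def synthetic_column(true_diameter_px: int, height: int = 200,
                     noise_sd: float = 5.0, seed: int = 0) -> np.ndarray:
    """One column of a synthetic vessel image: two bright walls whose
    leading edges are exactly `true_diameter_px` apart, plus noise."""
    rng = np.random.default_rng(seed)
    col = np.full(height, 40.0)                       # tissue background
    top = height // 2 - true_diameter_px // 2
    col[top:top + 2] = 200.0                          # near wall (2 px thick)
    col[top + true_diameter_px:top + true_diameter_px + 2] = 200.0  # far wall
    return col + rng.normal(0.0, noise_sd, height)

def measured_diameter_px(col: np.ndarray) -> int:
    """Crude stand-in for an edge detector: distance between the leading
    edges of the two brightest wall responses."""
    idx = np.sort(np.argsort(col)[-4:])               # four brightest pixels
    return int(idx[2] - idx[0])                       # far edge minus near edge

truth = 60                                            # known diameter (pixels)
est = measured_diameter_px(synthetic_column(truth))
print(f"true: {truth} px, measured: {est} px, error: {est - truth} px")
```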
7. The sample size for the training set:
- The document does not explicitly state the sample size used for training the algorithm. It focuses on validation testing.
8. How the ground truth for the training set was established:
- The method for establishing ground truth for any training set is likewise not elaborated upon in this document.