Search Results

Found 2 results

510(k) Data Aggregation

    K Number
    K251293


    Device Name
    CardioVision
    Manufacturer
    Date Cleared
    2025-11-21

    (210 days)

    Product Code
    Regulation Number
    870.2200
    Age Range
    All
    Reference & Predicate Devices
    Predicate For
    N/A
    Tags
    AI/ML, SaMD, IVD (In Vitro Diagnostic), Therapeutic, Pediatric, Diagnostic, PCCP Authorized, Third-party, Expedited review
    Intended Use

    The iCardio.ai CardioVision™ AI is an automated machine learning–based decision support system, indicated as a diagnostic aid for patients undergoing an echocardiographic exam consisting of a single PLAX view in an outpatient environment, such as a primary care setting.

    When utilized by an interpreting clinician, this device provides information that may be useful in detecting moderate or severe aortic stenosis. iCardio.ai CardioVision™ AI is indicated in adult populations over 21 years of age. Patient management decisions should not be made solely on the results of the iCardio.ai CardioVision™ AI analysis. iCardio.ai CardioVision™ AI analyzes a single cine-loop DICOM of the parasternal long axis (PLAX).

    Device Description

    The iCardio.ai CardioVision™ AI is a standalone image analysis software developed by iCardio.ai Corporation, designed to assist in the review of echocardiography images. It is intended for adjunctive use with other physical vital sign parameters and patient information, but it is not intended to independently direct therapy. The device facilitates determining whether an echocardiographic exam is consistent with aortic stenosis (AS) by providing classification results that support clinical decision-making.

    The iCardio.ai CardioVision™ AI takes as input a DICOM-compliant, partial or full echocardiogram study, which must include at least one parasternal long-axis (PLAX) view of the heart and at least one full cardiac cycle. The device uses a set of convolutional neural networks (CNNs) to analyze the image data and estimate the likelihood of moderate or severe aortic stenosis. The output consists of a binary classification of "none/mild" or "moderate/severe," indicating whether the echocardiogram is consistent with moderate or severe aortic stenosis. In cases where the image quality is insufficient, the device may output an "indeterminate" result.
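
    The classification flow described above can be summarized in a short sketch. This is a hypothetical illustration only: the quality gate, threshold value, and scoring functions below are placeholders standing in for the device's fixed CNNs, not the vendor's implementation.

```python
import numpy as np

# Assumed, illustrative constants -- the cleared device's actual thresholds are not public.
AS_THRESHOLD = 0.5        # decision threshold, fixed before validation per the description
MIN_QUALITY = 0.2         # quality gate that routes poor images to "indeterminate"

def quality_score(cine_loop: np.ndarray) -> float:
    """Placeholder image-quality check (in practice this could be its own CNN)."""
    spread = float(cine_loop.max()) - float(cine_loop.min())
    return float(cine_loop.std()) / (spread + 1e-8)

def as_probability(cine_loop: np.ndarray) -> float:
    """Placeholder for the CNN ensemble that scores the likelihood of moderate/severe AS."""
    return float(np.clip(cine_loop.mean() / 255.0, 0.0, 1.0))

def classify_plax(cine_loop: np.ndarray) -> str:
    """Return 'indeterminate', 'moderate/severe', or 'none/mild' for one PLAX cine loop."""
    if quality_score(cine_loop) < MIN_QUALITY:
        return "indeterminate"
    return "moderate/severe" if as_probability(cine_loop) >= AS_THRESHOLD else "none/mild"

if __name__ == "__main__":
    # Fake cine loop: (frames, height, width), 8-bit pixels covering one cardiac cycle.
    loop = np.random.randint(0, 256, size=(30, 224, 224), dtype=np.uint8)
    print(classify_plax(loop))
```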

    The CNNs and their thresholds are fixed prior to validation and do not continuously learn during standalone testing. These models are coupled with pre- and post-processing functionalities, allowing the device to integrate seamlessly with pre-existing medical imaging workflows, including PACS, DICOM viewers, and imaging worklists. The iCardio.ai CardioVision™ AI is intended to be used as an aid in diagnosing AS, with the final diagnosis always made by an interpreting clinician, who should consider the patient's presentation, medical history, and additional diagnostic tests.
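
    Because the device consumes DICOM studies from existing imaging workflows, a minimal loading sketch is shown below, assuming the pydicom package; the file path, the frame-count check, and the function itself are illustrative assumptions, not part of the cleared software.

```python
import pydicom

def load_cine_loop(path: str):
    """Read a DICOM file and return its pixel data as a (frames, rows, cols[, channels]) array."""
    ds = pydicom.dcmread(path)
    n_frames = int(getattr(ds, "NumberOfFrames", 1))
    if n_frames < 2:
        raise ValueError("expected a multi-frame cine loop covering at least one cardiac cycle")
    return ds.pixel_array

# Hypothetical usage -- the path is a placeholder:
# loop = load_cine_loop("study/plax_view.dcm")
```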

    AI/ML Overview

    Here's a breakdown of the acceptance criteria and the study proving the device meets them, based on the provided FDA 510(k) clearance letter for CardioVision™:

    Acceptance Criteria and Reported Device Performance

    | Metric | Acceptance Criteria | Reported Performance (without indeterminate outputs) | Reported Performance (including indeterminate outputs) |
    |---|---|---|---|
    | AUROC | Exceeds predefined success criteria | 0.945 | Not explicitly stated but inferred to be similar to Sensitivity/Specificity |
    | Sensitivity | Exceeds predefined success criteria and predicate device | 0.896 (95% Wilson score CI: [0.8427, 0.9321]) | 0.876 (95% Wilson score CI: [0.8213, 0.9162]) |
    | Specificity | Exceeds predefined success criteria and predicate device | 0.872 (95% Wilson score CI: [0.8384, 0.8995]) | 0.866 (95% Wilson score CI: [0.8324, 0.8943]) |
    | PPV | Not explicitly stated as acceptance criteria | 0.734 (95% Wilson score CI: [0.673, 0.787]) | Not explicitly stated |
    | NPV | Not explicitly stated as acceptance criteria | 0.955 (95% Wilson score CI: [0.931, 0.971]) | Not explicitly stated |
    | Rejection Rate | Not explicitly stated as acceptance criteria | 1.077% (7 out of 650 studies) | 1.077% |

    Note: The document explicitly states that the levels of sensitivity and specificity exceed the predefined success criteria and those of the predicate device, supporting the claim of substantial equivalence. Although the exact numerical thresholds for the acceptance criteria are not provided, the narrative confirms they were met.
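
    For reference, the Wilson score interval behind the confidence bounds in the table can be computed as follows. The per-metric case counts are not reported in the summary, so the worked example uses the only count that is stated, 7 indeterminate outputs out of 650 studies; this is a sketch of the standard formula, not a reproduction of the sponsor's analysis.

```python
from math import sqrt

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Two-sided 95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z ** 2 / n
    center = (p + z ** 2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return center - half, center + half

if __name__ == "__main__":
    print(f"rejection rate: {7 / 650:.3%}")          # 1.077%, matching the table
    lo, hi = wilson_ci(7, 650)
    print(f"95% Wilson CI for the rejection rate: [{lo:.4f}, {hi:.4f}]")
```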

    Study Details

    1. Sample size used for the test set and data provenance
    Sample Size: 650 echocardiography studies from 608 subjects.
    Data Provenance: Retrospective, multi-center performance study from 12 independent clinical sites across the United States.

    2. Number of experts used to establish the ground truth for the test set and their qualifications
    Number of Experts: Not stated as a specific number; referred to as "experienced Level III echocardiographers."
    Qualifications: Experienced Level III echocardiographers.

    3. Adjudication method for the test set
    Method: A "majority vote approach" was used in cases of disagreement among the experts (a minimal sketch of this adjudication follows the list).

    4. Multi-reader multi-case (MRMC) comparative effectiveness study and effect size of human readers with vs. without AI assistance
    MRMC Study: No. An MRMC comparative effectiveness study is not detailed in this document; the study described is a standalone performance evaluation of the AI. (A "human factors validation study" was conducted to evaluate usability, in which participants completed the critical task of results interpretation without errors, but this is not an MRMC study comparing diagnostic accuracy with and without AI assistance.)

    5. Standalone (algorithm-only, without human-in-the-loop) performance
    Standalone Performance Study: Yes. The document describes a "standalone study" whose primary objective was to "evaluate the software's ability to detect aortic stenosis." The reported metrics (AUROC, sensitivity, specificity, etc.) reflect the algorithm's performance alone.

    6. Type of ground truth used
    Ground Truth Type: Expert consensus based on "echocardiographic assessments performed by experienced Level III echocardiographers," with a majority vote for disagreements.

    7. Sample size for the training set
    Training Set Size: Not specified in the provided document. The document states, "No data from these [test set] sites were used in the training or tuning of the algorithm."

    8. How the ground truth for the training set was established
    Training Set Ground Truth: Not explicitly detailed in the provided document. Similar methods (expert echocardiographic assessments) may have been used for the training data, but the specifics are not provided.
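
    A minimal sketch of the majority-vote adjudication referenced in item 3, assuming an odd number of readers and the device's label vocabulary; the reader labels are hypothetical.

```python
from collections import Counter

def adjudicate(reader_labels: list[str]) -> str:
    """Majority vote over reader labels; assumes an odd number of readers so ties cannot occur."""
    return Counter(reader_labels).most_common(1)[0][0]

# Hypothetical three-reader example using the device's label vocabulary:
print(adjudicate(["moderate/severe", "none/mild", "moderate/severe"]))  # -> moderate/severe
```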

    K Number
    K981807


    Date Cleared
    1998-08-19

    (90 days)

    Product Code
    Regulation Number
    870.2340
    Age Range
    All
    Reference & Predicate Devices
    Predicate For
    N/A
    Tags
    AI/ML, SaMD, IVD (In Vitro Diagnostic), Therapeutic, Pediatric, Diagnostic, PCCP Authorized, Third-party, Expedited review
    Intended Use

    Visualization and measurement of ECG recordings, to be used by trained medical staff in the monitoring of cardiac condition.

    Device Description

    The Cardio Vision software is a program that runs on a personal computer and is used for the display and storage of electrocardiogram (ECG) recordings. The software receives an input signal containing the ECG data from an ECG receiving center. The receiving center is connected to the computer on which the Cardio Vision software is running via a serial port. The ECG receiving center receives the ECG signal from an ECG recording device either via direct connection or via telephone. The software interprets the digital signal and displays it on the screen in real time as a single or multiple lead ECG plot.

    The received ECG signal can be stored on the disk and can be compared to previous ECG recordings for that patient. In addition, measurements can be performed on the ECG plot.
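
    The submission does not describe the serial protocol, so the following is only an illustrative sketch of how a display-and-storage program of this kind might collect ECG samples over a serial port, assuming the pyserial package and a 16-bit little-endian sample format; the port name and baud rate are placeholders.

```python
import struct

import serial  # pyserial

def read_samples(port: str = "/dev/ttyS0", baud: int = 9600, n_samples: int = 500) -> list[int]:
    """Collect raw ECG samples from a serial link, one assumed 16-bit little-endian value at a time."""
    samples: list[int] = []
    with serial.Serial(port, baud, timeout=1.0) as link:
        while len(samples) < n_samples:
            raw = link.read(2)
            if len(raw) == 2:
                samples.append(struct.unpack("<h", raw)[0])
    return samples

# Hypothetical usage (port and baud rate are placeholders):
# ecg = read_samples()
```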

    AI/ML Overview

    The provided text describes the Cardio Vision Software, an ECG display and storage program. Here's a breakdown of the acceptance criteria and study information:

    1. Table of Acceptance Criteria and Reported Device Performance

    | Acceptance Criterion (Implicit) | Reported Device Performance |
    |---|---|
    | Data sampling and processing accuracy | "Comparison of the log file with the original signal verified that the data sampling and data processing modules function properly." |
    | Data display accuracy and scaling | "Using the ECG display module to display data from prepared files verified that the data was displayed properly and in the correct scale." |
    | Consistency and accuracy of ECG data processing and presentation | "The comparison of Cardio Vision Software printouts with simultaneous ECG strip chart output demonstrated that the Cardio Vision Software processes and presents ECG data consistently and accurately." |
    | Substantial equivalence to predicate devices | "The safety and effectiveness of the Cardio Vision software are similar to that of its predicate devices. It is SHL's opinion that the Cardio Vision software is substantially equivalent to its legally marketed predicate devices in terms of safety and effectiveness." (This is a claim of equivalence, supported by the above bench and clinical data.) |

    2. Sample Size Used for the Test Set and Data Provenance

    The document does not explicitly state a specific sample size for a "test set" in the context of clinical trials or an independent validation study. The "Bench Data" section refers to an "artificial signal" and "prepared files," which implies synthetic or pre-recorded data for testing specific functionalities. The "Clinical Data" section refers to "simultaneous ECG strip chart output," suggesting real patient data, but no further details on the number of patients, recordings, or their provenance (country, retrospective/prospective) are provided.

    3. Number of Experts Used to Establish Ground Truth for the Test Set and Their Qualifications

    The document does not detail any expert review process for establishing ground truth for a test set. The validation primarily relies on comparison with "original signals" (for bench data) and "simultaneous ECG strip chart output" (for clinical data), implying these were considered the ground truth references. There is no mention of human experts directly annotating or establishing ground truth for the purpose of evaluating the device.

    4. Adjudication Method for the Test Set

    No adjudication method (e.g., 2+1, 3+1) is mentioned, as there is no indication of multiple independent expert reviewers establishing a ground truth that would require adjudication.

    5. Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study

    No Multi-Reader Multi-Case (MRMC) comparative effectiveness study was done. The document does not describe any study involving human readers with and without AI assistance, nor does it provide an effect size for such a comparison.

    6. Standalone (Algorithm Only Without Human-in-the-Loop Performance) Study

    Yes, the studies described are essentially standalone evaluations of the algorithm's performance.

    • Bench Data: "An artificial signal was fed into the receiving center, and the resulting ECG data was displayed and written to a log file. Comparison of the log file with the original signal verified that the data sampling and data processing modules function properly." This is an algorithm-only test.
    • Clinical Data: "The comparison of Cardio Vision Software printouts with simultaneous ECG strip chart output demonstrated that the Cardio Vision Software processes and presents ECG data consistently and accurately." This also evaluates the software's output directly against a reference standard, without a human in the loop interpreting that output as part of the evaluation.

    7. Type of Ground Truth Used

    • Bench Data: The ground truth for the bench data was the "original artificial signal."
    • Clinical Data: The ground truth for the clinical data was "simultaneous ECG strip chart output." This suggests the output from a traditional, established ECG recording method was used as the reference standard.

    8. Sample Size for the Training Set

    The document does not provide any information about a training set for the Cardio Vision Software. This suggests the software is primarily a display and storage tool, not an AI/ML-based diagnostic algorithm that learns from a training set. The descriptions focus on its functionality in processing and presenting data, rather than making automated interpretations or diagnoses.

    9. How the Ground Truth for the Training Set Was Established

    As no training set is mentioned or implied, the establishment of ground truth for a training set is not applicable to this device description.

