K Number
K201168
Device Name
ECG Monitor App
Date Cleared
2020-08-04 (95 days)

Product Code
Regulation Number
870.2345
Panel
CV
Reference & Predicate Devices
DEN180044 (Apple ECG App)
Intended Use

The Samsung ECG Monitor Application is an over-the-counter (OTC) software-only, mobile medical application operating on a compatible Samsung Galaxy Watch and phone. The app is intended to create, record, store, transfer, and display a single-channel electrocardiogram (ECG), similar to a Lead I ECG, for informational use only in adults 22 years and older. Classifiable traces are labeled by the app as either atrial fibrillation (AFib) or sinus rhythm with the intention of aiding heart rhythm identification; it is not intended to replace traditional methods of diagnosis or treatment. The app is not intended for users with other known arrhythmias, and users should not interpret or take clinical action based on the device output without consulting a qualified healthcare professional.

Device Description

The Samsung ECG Monitor Application consists of a pair of mobile medical apps: one on a compatible Samsung wearable and the other on a compatible Samsung phone. The wearable application captures bioelectrical signals from the user, generates single-lead ECG signals, calculates average heart rate, and classifies the rhythm. It then securely transmits the recorded data to the phone application on the paired phone. The phone application displays the ECG measurement history and generates a PDF report of the received ECG signals, which the user can share.
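
The summary does not describe how the wearable computes average heart rate. As a rough, hypothetical illustration of the arithmetic involved, the sketch below derives average heart rate in beats per minute from R-peak timestamps of a single-lead trace (HR = 60 / mean RR interval in seconds); the function name and sample values are assumptions, not Samsung's implementation.

```python
from statistics import mean


def average_heart_rate_bpm(r_peak_times_s: list[float]) -> float:
    """Estimate average heart rate from R-peak timestamps (in seconds).

    Hypothetical illustration only; the cleared app's actual algorithm
    is not described in the 510(k) summary.
    """
    if len(r_peak_times_s) < 2:
        raise ValueError("need at least two R peaks to form an RR interval")
    rr_intervals = [b - a for a, b in zip(r_peak_times_s, r_peak_times_s[1:])]
    return 60.0 / mean(rr_intervals)


# R peaks roughly 0.8 s apart -> about 75 bpm
print(round(average_heart_rate_bpm([0.0, 0.80, 1.62, 2.40, 3.21]), 1))
```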

AI/ML Overview

In its clinical validation study, the Samsung ECG Monitor Application was demonstrated to be non-inferior to the predicate device (Apple ECG App, DEN180044) in rhythm classification accuracy and ECG signal quality sufficiency.

Here's the breakdown of the acceptance criteria and study details:

1. A table of acceptance criteria and the reported device performance

For every endpoint, the acceptance criterion was performance within a pre-determined non-inferiority margin relative to the predicate (Apple ECG App); the fiducial point endpoint was instead compared against the 12-lead ECG reference. Reported performance (Samsung ECG Monitor App vs. predicate):

  • AFib Sensitivity: 98.1% (95% CI: 96.3%, 99.9%) vs. 99.6% (95% CI: 98.7%, 100%)
  • Sinus Rhythm Specificity: 100% (95% CI: 100%) vs. 99.6% (95% CI: 98.8%, 100%)
  • Inconclusive Rate (AFib or SR truth): 2.9% (95% CI: 1.1%, 4.7%) vs. 2.2% (95% CI: 0.7%, 3.7%)
  • Cardiologist Interpretability of ECG Recordings: 98.5% (95% CI: 97.4%, 99.5%) vs. 99.4% (95% CI: 98.8%, 100%)
  • Concordance between App Strip and 12-lead ECG: 99.4% (95% CI: 98.7%, 100%) vs. 99.8% (95% CI: 99.4%, 100%)
  • Fiducial Point Annotation (QRS amplitude, RR interval, QRS duration, PR interval): all key ECG features met the non-inferiority margin with statistical significance compared to the 12-lead ECG reference (no predicate comparison)
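
The summary reports point estimates with 95% confidence intervals but does not give the numerical non-inferiority margins. As a minimal sketch of how such an endpoint is typically evaluated, the code below computes a proportion with a normal-approximation (Wald) 95% CI and checks a difference-of-proportions non-inferiority criterion against a placeholder margin; the margin value, counts, and function names are illustrative assumptions, not the study's pre-specified statistics.

```python
import math


def wald_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float, float]:
    """Proportion with a normal-approximation (Wald) confidence interval."""
    p = successes / n
    half_width = z * math.sqrt(p * (1.0 - p) / n)
    return p, max(0.0, p - half_width), min(1.0, p + half_width)


def non_inferior(p_test: float, p_ref: float, se_diff: float,
                 margin: float, z: float = 1.96) -> bool:
    """Non-inferiority holds if the lower confidence bound of
    (test - reference) stays above -margin. The margin here is a
    placeholder, not the study's pre-specified value."""
    lower_bound = (p_test - p_ref) - z * se_diff
    return lower_bound > -margin


# Illustrative counts chosen only to land near the reported 98.1% sensitivity.
p, lo, hi = wald_ci(successes=255, n=260)
print(f"sensitivity ~ {p:.3f} (95% CI {lo:.3f}-{hi:.3f})")
```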

2. Sample size used for the test set and the data provenance

  • Sample Size: 544 subjects.
    • 268 AFib patients
    • 261 Sinus Rhythm (SR) patients
    • 15 with other arrhythmias
  • Data Provenance: The document does not explicitly state the country of origin; the manufacturer is Samsung Electronics Co., Ltd., in Korea. The study design implies the data were collected prospectively for this clinical validation.

3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts

  • Number of Experts: Unspecified number of cardiologists and three blinded, independent ECG technicians.
  • Qualifications of Experts:
    • Cardiologists: compared the app algorithm's detection of AFib and SR against 12-lead ECG reference strips and interpreted the ECG Monitor App strips; no specific experience level is provided.
    • ECG Technicians: three blinded, independent ECG technicians performed the fiducial point annotation; no specific experience level is provided.

4. Adjudication method for the test set

  • For rhythm classification, the ground truth was established by cardiologists' read of 12-lead ECG reference strips. This implies a consensus or authoritative read by these experts.
  • For signal quality interpretability and concordance, cardiologists' interpretation served as the reference.
  • For fiducial point annotation, three blinded, independent ECG technicians marked the points; their individual annotations were presumably compared against the reference or reconciled with one another as a form of consensus, though the document does not detail the method (a possible consensus scheme is sketched after this list).
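
Since the adjudication scheme for the technicians' annotations is not described, the following is a purely hypothetical sketch of one common approach: take the median of the three marks as the consensus value and flag points where any annotator deviates beyond a tolerance. The tolerance and naming are assumptions, not the study's method.

```python
from statistics import median


def consensus_fiducial_ms(annotations_ms: dict[str, float],
                          tolerance_ms: float = 10.0) -> tuple[float, bool]:
    """Combine three annotators' marks for one fiducial point (milliseconds).

    Hypothetical scheme only: use the median as the consensus and flag the
    point for re-review if any annotator deviates from it by more than
    tolerance_ms. The 510(k) summary does not describe the actual method.
    """
    values = list(annotations_ms.values())
    consensus = median(values)
    needs_review = any(abs(v - consensus) > tolerance_ms for v in values)
    return consensus, needs_review


# Three technicians marking the onset of the same QRS complex
print(consensus_fiducial_ms({"tech_a": 412.0, "tech_b": 415.5, "tech_c": 409.0}))
```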

5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done, and if so, the effect size of how much human readers improve with AI vs. without AI assistance

The document describes a clinical study where the algorithm's performance (Samsung ECG Monitor App) was compared to a reference standard (cardiologist-read 12-lead ECG), and also against a predicate device (Apple ECG App). It does not describe an MRMC comparative effectiveness study evaluating how human readers improve with AI vs without AI assistance. The focus was on the algorithm's standalone performance compared to expert ground truth and its non-inferiority to an existing cleared device.

6. If a standalone study (i.e., algorithm-only performance without a human in the loop) was done

Yes, a standalone study was done. The clinical study directly evaluated the Samsung ECG Monitor App algorithm's performance in detecting AFib and Sinus Rhythm against a cardiologist-read 12-lead ECG reference strip. The reported sensitivity, specificity, and inconclusive rates are for the algorithm's performance alone.
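
For context on how such standalone figures are tallied, the sketch below computes sensitivity, specificity, and inconclusive rate from per-recording (reference label, app output) pairs, treating sensitivity and specificity as conditional on a classifiable output. The data and function are illustrative only, not the study's analysis code.

```python
def standalone_metrics(results: list[tuple[str, str]]) -> dict[str, float]:
    """results: (reference_label, app_output) pairs, where reference_label is
    'afib' or 'sinus' and app_output is 'afib', 'sinus', or 'inconclusive'.

    Illustrative only; sensitivity and specificity are computed over
    classifiable recordings, with inconclusive outputs reported separately.
    """
    afib = [out for ref, out in results if ref == "afib"]
    sinus = [out for ref, out in results if ref == "sinus"]
    afib_classified = [out for out in afib if out != "inconclusive"]
    sinus_classified = [out for out in sinus if out != "inconclusive"]
    return {
        "afib_sensitivity": afib_classified.count("afib") / len(afib_classified),
        "sinus_specificity": sinus_classified.count("sinus") / len(sinus_classified),
        "inconclusive_rate": sum(out == "inconclusive" for _, out in results) / len(results),
    }


# Tiny made-up example: 9/9 classifiable AFib correct, 1 inconclusive, 10/10 SR correct
demo = [("afib", "afib")] * 9 + [("afib", "inconclusive")] + [("sinus", "sinus")] * 10
print(standalone_metrics(demo))
```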

7. The type of ground truth used (expert consensus, pathology, outcomes data, etc.)

The primary ground truth used was expert consensus / expert interpretation from:

  • Cardiologists (for rhythm classification based on 12-lead ECG reference strips and for interpretability and concordance studies).
  • Blinded, independent ECG technicians (for fiducial point annotation).

8. The sample size for the training set

The document does not specify the sample size for the training set. It only details the clinical validation study (test set).

9. How the ground truth for the training set was established

The document does not provide information on how the ground truth for the training set was established, as it focuses solely on the clinical validation (test set) and device performance evaluation.

§ 870.2345 Electrocardiograph software for over-the-counter use.

(a) Identification. An electrocardiograph software device for over-the-counter use creates, analyzes, and displays electrocardiograph data and can provide information for identifying cardiac arrhythmias. This device is not intended to provide a diagnosis.
(b) Classification. Class II (special controls). The special controls for this device are:
(1) Clinical performance testing under anticipated conditions of use must demonstrate the following:
(i) The ability to obtain an electrocardiograph of sufficient quality for display and analysis; and
(ii) The performance characteristics of the detection algorithm as reported by sensitivity and either specificity or positive predictive value.
(2) Software verification, validation, and hazard analysis must be performed. Documentation must include a characterization of the technical specifications of the software, including the detection algorithm and its inputs and outputs.
(3) Non-clinical performance testing must validate detection algorithm performance using a previously adjudicated data set.
(4) Human factors and usability testing must demonstrate the following:
(i) The user can correctly use the device based solely on reading the device labeling; and
(ii) The user can correctly interpret the device output and understand when to seek medical care.
(5) Labeling must include:
(i) Hardware platform and operating system requirements;
(ii) Situations in which the device may not operate at an expected performance level;
(iii) A summary of the clinical performance testing conducted with the device;
(iv) A description of what the device measures and outputs to the user; and
(v) Guidance on interpretation of any results.