
510(k) Data Aggregation

    K Number
    K070280
    Date Cleared
    2007-02-23

    (25 days)

    Product Code
    Regulation Number
    870.1425
    Reference & Predicate Devices
    Predicate For
    Intended Use

    CardioDay® is Holter software indicated for patients who may benefit from long-term continuous electrocardiographic (ECG) recording, including, but not limited to, those with complaints of palpitations, syncope, chest pain, or shortness of breath, or those who need to be monitored to judge their current cardiac functionality, such as patients who have recently received pacemakers.

    Device Description

    CardioDay® does not perform any diagnosis of data by itself but only displays ECG morphologies and associated graphs, such as heart rate trends, RR variability, and other statistical values in graphical form. The physician is able to review, edit, and print the data collected.
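The statistics the device displays (heart rate trends, RR variability) are standard Holter computations. As a minimal illustration of what such values are, not the vendor's actual implementation, RR intervals, instantaneous heart rate, and SDNN (a common RR-variability measure) can be derived from a list of R-peak times:

```python
# Minimal sketch of standard Holter statistics (not CardioDay's code):
# RR intervals, instantaneous heart rate, and SDNN (RR variability)
# derived from R-peak timestamps in seconds.

def holter_stats(r_peaks_s):
    rr = [b - a for a, b in zip(r_peaks_s, r_peaks_s[1:])]  # RR intervals (s)
    hr = [60.0 / x for x in rr]                             # beats per minute
    mean_rr = sum(rr) / len(rr)
    # SDNN: standard deviation of the RR intervals
    sdnn = (sum((x - mean_rr) ** 2 for x in rr) / len(rr)) ** 0.5
    return rr, hr, sdnn

# Example: five R-peaks with slightly irregular spacing
rr, hr, sdnn = holter_stats([0.0, 0.8, 1.62, 2.40, 3.25])
```

A Holter viewer would plot `hr` over the full recording as the heart rate trend and report `sdnn` among the statistical values.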

    AI/ML Overview

    The provided document describes a Special 510(k) notification for the CardioDay® software, which is a Holter ECG device. The submission focuses on demonstrating substantial equivalence to a legally marketed predicate device (CardioDay® Version 1.9.5, K051471) rather than presenting a performance study with specific acceptance criteria that would typically involve numerical metrics like sensitivity, specificity, or accuracy for an AI/algorithm-driven device.

    The "acceptance criteria" in this context are primarily related to the functional equivalence of the new version (CardioDay® Version 2.0) to the predicate device and compliance with relevant medical device standards. The study proving this effectively involves comparing the technical and functional characteristics of the new device to the predicate.

    The following analysis, based on the provided text, addresses each point in turn:

    1. A table of acceptance criteria and the reported device performance

    Since this is a Special 510(k) for a software update rather than a new device with novel AI algorithms, the "acceptance criteria" are not reported as specific performance metrics (e.g., sensitivity, specificity) but rather as functional and technical equivalence to the predicate device and compliance with standards. The "reported device performance" is the demonstration that these equivalences and compliances are met.
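For context only (this submission reports no such data), the per-beat performance metrics that a clinical evaluation of an arrhythmia-detection algorithm would typically report are computed as follows. The labels and function below are a generic illustration, not anything from the document:

```python
# Generic illustration of the metrics this submission does NOT report:
# sensitivity and specificity from per-beat labels (1 = ectopic, 0 = normal).

def sensitivity_specificity(truth, predicted):
    tp = sum(1 for t, p in zip(truth, predicted) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(truth, predicted) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(truth, predicted) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(truth, predicted) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# Toy example: 6 beats, one missed ectopic and one false alarm
sens, spec = sensitivity_specificity([1, 1, 0, 0, 1, 0],
                                     [1, 0, 0, 1, 1, 0])
```

Because the Special 510(k) pathway relies on functional equivalence rather than such metrics, none of these values appear anywhere in the submission.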

    Below is a summary table based on the document's comparison of the new device (CardioDay® Version 2.0) to the legally marketed predicate (CardioDay® Version 1.9.5). The acceptance criterion implicitly embedded in this type of submission is "maintains all predicate functionalities" and "introduces no new safety or effectiveness concerns."

    Acceptance Criteria (Functional Equivalence / Compliance) and Reported Device Performance (CardioDay® Version 2.0 vs. Predicate):

    Criterion: Hardware Specifications: CPU, RAM, Hard Disk Space, Display, Peripherals (CD-ROM, Operating System, Ports, Printer, Keyboard, Mouse, Installation Media, Further Periphery)
    Result: All specifications are either identical or improved (e.g., increased minimum RAM, updated OS compatibility), maintaining or exceeding predicate capabilities. New features such as Bluetooth connectivity are noted.

    Criterion: Software Features (Patient Screen): Patient ID, Name, Address, Personal Data, Medication, Indication, Physician's Name, Date of Recording
    Result: All features are identical ("Yes" for both new and predicate).

    Criterion: Software Features (Analysis Options): Analysis Duration, Primary Channel Selection, Sensitivity/Signal Quality, Tachycardia/Bradycardia Threshold, Pause Duration, Prematurity, R on T, Pacemaker Type/Thresholds, Superimposition/QuickScan, 12-Lead ECG Module, Holter Data Transfer
    Result: All features are identical ("Yes" for both new and predicate).

    Criterion: Software Features (Events Detected): VES/PVC, SVES/SVE, Couplet, Triplet, VTACH, Bigeminy, R on T, ST-Analysis, SVTACH, Arrhythmia, Bradycardia, Burst, V. STIM, A. STIM, AV. STIM, Undersense, Exitblock, Oversense, Pause, Event Marker, HR Stripes, Artifact, Normal
    Result: All event detection capabilities are identical ("Yes" for both new and predicate).

    Criterion: Software Features (Functionality Available): Start, Read Tape/Digital Recorder, Import, Analyze New, Open, Edit Patient Data, Print Preview, Print, Close Recording/Close, Delete Recording, Archive, Diagnosis, View ECG (Online via OptoLink/USB/Bluetooth), Screen Calibration/Setup, FFT Setup, Report Setup, Various Displays (Classes, PM Events, Events, HR Min./Max., Statistics, Diagnosis, Overview), Help Functions
    Result: Most functionalities are identical. New features include "View ECG Online via USB Cable" and "View ECG Online via Bluetooth Data Transfer" (not present in the predicate).

    Criterion: Software Features (Icons/Buttons Available): Start: Read Digital/Tape Recorder, Open Existing Record, Digital Recorder, Tape Recorder, Open: List of Patients, Print, Rhythm Analysis, Print Preview
    Result: All features are identical ("Yes" for both new and predicate). The document notes, "The label and form of the icons/buttons, however, are different."

    Criterion: Software Features (Options Available): Classes, Events, Heart Rate Min/Max, Average Heart Rate, Statistics (FFT, ST Diagrams), Report, Overview, Heart Rate Variability (RR Delay, RR FFT, 24h RR FFT, RR Histograms)
    Result: All features are identical ("Yes" for both new and predicate). The document notes, "The name of those options may vary."

    Criterion: Software Features (Graphics & Displays Available): Basis Sampling Rate for Graphical Displays, Classified Beats, Zoomed/Context of Selected Beat, Events, Heart Rate Trend, Average RR Interval, Y-T, RR > 50ms Distribution, FFT/ST Diagrams, Overview 2 Channels
    Result: The basis sampling rate for graphical displays was improved from 8 ms to 4 ms, and a new indicator for atrial fibrillation was added. All other features are identical.

    Criterion: Software Features (Printout Options): Full Disclosure (various channels/times), Marked Events (various per page/channels), Event Table/Histogram, HR/ST Diagrams, HR Diagram + Min/Max, RR Intervals/Delay/Histograms/Spectra, Pacemaker Event Histogram/Function Analysis, Report, Print to File (PDF), Save as Default Option
    Result: All features are identical ("Yes" for both new and predicate). The document notes, "The commands to generate a given printout as well as its appearance do vary."

    Criterion: Software Features (Editing & Reviewing Options): Scroll through Beats/Events, Edit Beat Labels/Event Marker, View Patient Event Markers, Jump from Statistics/Overview to ECG, Select Time Interval for RR Parameters, Edit Report
    Result: All features are identical ("Yes" for both new and predicate). The document notes, "The label and appearance of those options may vary."

    Criterion: Supported Recorders: RZ153+ Digital Recorder, CardioMem® CM 3000, CardioMem® CM 3000-12, CardioMem® CM 3000-12BT
    Result: The RZ153+, CM 3000, and CM 3000-12 are supported by both versions. The CM 3000-12BT is newly supported, with Bluetooth functionality.

    Criterion: Compliance with Standards: 21 CFR 820, ISO 9001:2000 / ISO 13485:2003, IEC 60601-1-4, ANSI/AAMI EC38, IEC 60601-2-47, ISO 14971, EN 980, EN 1041, ISO 15223
    Result: The new device continues to comply with all listed standards, as well as several FDA guidance documents ("Guidance for the Content of Premarket Submissions for Software Contained in Medical Devices" and "General Principles of Software Validation").
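One concrete improvement in the comparison above is the basis sampling interval for graphical displays, reduced from 8 ms to 4 ms. Since sampling rate is simply the reciprocal of the sampling interval, this corresponds to a doubling of the effective display rate, from 125 Hz to 250 Hz:

```python
# Sampling interval (ms) to sampling rate (Hz): f = 1000 / T_ms.
def interval_ms_to_hz(t_ms):
    return 1000.0 / t_ms

predicate_hz = interval_ms_to_hz(8)  # CardioDay 1.9.5 displays: 125 Hz
new_hz = interval_ms_to_hz(4)        # CardioDay 2.0 displays: 250 Hz
```

The finer interval improves the temporal resolution of displayed waveforms without changing what the software detects or reports.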

    2. Sample size used for the test set and the data provenance (e.g. country of origin of the data, retrospective or prospective)

    The document explicitly states: "Clinical testing was not required to demonstrate substantial equivalence of safety and effectiveness." This implies there was no formal "test set" of patient data used in the sense of a clinical trial or performance evaluation of the software's diagnostic capabilities. The evaluation was based primarily on a comparison of technical and functional specifications and on compliance with standards. Therefore, information on sample size, data provenance, and retrospective/prospective design is not applicable and not provided in this submission.

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts (e.g. radiologist with 10 years of experience)

    Since clinical testing was not performed and no patient data test set was used for diagnostic performance evaluation, there were no experts used to establish ground truth for a test set. The submission focuses on software verification and validation, ensuring that the software functions as intended and is equivalent to the predicate device.

    4. Adjudication method (e.g. 2+1, 3+1, none) for the test set

    As no clinical test set was utilized for evaluating diagnostic performance, no adjudication method was applied to a test set.

    5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done, and if so, the effect size of how much human readers improve with versus without AI assistance

    No MRMC comparative effectiveness study was done. The CardioDay® software, as described, "does not perform any diagnosis of data by itself but only displays ECG morphologies and associated graphs such as heart rate trends, RR variability, and other statistical values in graphical form. The physician will be able to review, edit, and print the data collected." It is a tool for physicians to review and analyze Holter ECG data, not an AI diagnostic algorithm that assists human readers in making a diagnosis from scratch.

    6. If a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done

    The software is explicitly stated to be a tool for physicians and "does not perform any diagnosis of data by itself." Therefore, a standalone performance evaluation (algorithm only) designed to establish diagnostic accuracy was not performed nor would it be appropriate for a device with this stated functionality. Its performance is tied to its role as a display and analysis tool for human interpretation.

    7. The type of ground truth used (expert consensus, pathology, outcomes data, etc.)

    Given that no clinical performance study was conducted, there was no ground truth established for diagnostic accuracy. The "ground truth" for this submission revolves around the software's functional correctness and compliance with its own specifications and relevant standards.

    8. The sample size for the training set

    The document does not describe any machine learning or AI models requiring a "training set." Therefore, information on the sample size for a training set is not applicable or not provided.

    9. How the ground truth for the training set was established

    As there is no mention of a training set for machine learning/AI, this point is not applicable. The "ground truth" for the software's development likely refers to software requirements specifications, design documents, and medical device standards against which its functionality was verified.
