K Number
K161240
Date Cleared
2016-08-10

(100 days)

Product Code
Regulation Number
870.1425
Panel
CV
Reference & Predicate Devices
Intended Use

The RhythmView Workstation is a computerized system that assists in the diagnosis of complex cardiac arrhythmias. The RhythmView Workstation is used to analyze electrogram and electrocardiogram signals and display them in a visual format.

Device Description

The RhythmView Workstation comprises the following components:

  • Cart
  • Monitor/Display
  • Computer
  • Radio-Frequency Identification (RFID) Reader/Writer
  • Software
  • Keyboard
  • Mouse
  • Two-Port USB Switch
  • Solid State Hard Drive (optional component)

RhythmView takes electrical signals collected from multi-polar electrophysiology catheters and outputs a graphic display that assists in the diagnosis of cardiac arrhythmias.

The RhythmView computes and displays electrical rotors or focal beat sources that may sustain human heart rhythm disorders, including focal atrial tachycardia (AT), atrial flutter (AFL), other supraventricular tachycardia (SVT), atrial fibrillation (AF), ventricular tachycardia (VT), and ventricular fibrillation (VF) in a given patient. The product takes as input electrical signals recorded during the heart rhythm disorder under consideration, typically from multiple specified locations within the heart during an electrophysiological study. The RhythmView then uses proprietary patented algorithms and methods to compute spatial organization during the heart rhythm disorder. These computed elements are displayed graphically in interactive form for review, to aid diagnosis by the physician during an electrophysiology study.

AI/ML Overview

The following is an analysis of the 510(k) summary, extracting information about the device's acceptance criteria and the study demonstrating that the device meets them.

Note: The provided document is a 510(k) summary for a software update to an existing device, the RhythmView Workstation. As such, the "study" described focuses on verification and validation of the new software features, rather than a full clinical trial to establish initial safety and effectiveness, which would have been done for the original device. This means some of the requested information (like MRMC studies for AI performance) might not be directly applicable or explicitly detailed in this type of submission.


Acceptance Criteria and Device Performance Study for RhythmView Workstation SW V6.0.3

The RhythmView Workstation is a computerized system that assists in the diagnosis of complex cardiac arrhythmias by analyzing electrogram and electrocardiogram signals and displaying them in a visual format. The current 510(k) (K161240) pertains to software version 6.0.3, which introduces new features like "Stability Map" and "Epoch Timeline."

The performance testing primarily focuses on validating these new software features and confirming that the device continues to meet its intended use.

1. Table of Acceptance Criteria and Reported Device Performance

The document does not explicitly list quantitative "acceptance criteria" for the new software features (e.g., a specific sensitivity or specificity threshold). Instead, it describes types of validation performed to ensure the new features function as intended and do not negatively impact the existing functionality.

Acceptance criteria (implied) and reported device performance for the new-feature validation are summarized below:

  • Stability Map Accuracy. Acceptance criterion: the Stability Maps generated are correct compilations of RAP profiles from individual segments. Reported performance: verification that the Stability Maps generated by RhythmView are correct compilations of the RAP profiles from individual segments has been conducted. Specific numerical results are not provided, but the conclusion states "The testing has demonstrated that the SW updates for RhythmView V6.0 provide reasonable assurance that the proposed device conforms to the appropriate requirements for its intended use."
  • Stability Map & Default Filter Effectiveness. Acceptance criterion: the Stability Map and its default filter setting are validated using clinical data sets. Reported performance: validation of the Stability Map and the default Stability Filter setting using clinical data sets has been conducted.
  • Consistent Diagnosis (with/without RAP). Acceptance criterion: activation maps, with and without Rotational Activity Profile (RAP) data, lead to a consistent diagnosis by physicians. Reported performance: physician validation to confirm that the activation maps with and without RAP lead to a consistent diagnosis has been performed.
  • User Interface (UI) Usability. Acceptance criterion: the new UI features are usable, as evaluated through simulated user testing. Reported performance: simulated user testing to evaluate new UI features has been performed.
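The "correct compilation" check for the Stability Map can be pictured with a toy sketch. The document does not describe the actual computation; the function name, the 0/1 per-segment RAP indicators, and the 0.5 default filter threshold below are all illustrative assumptions, not the proprietary RhythmView algorithm.

```python
from typing import List

def compile_stability_map(segment_rap_profiles: List[List[float]],
                          stability_threshold: float = 0.5) -> List[float]:
    """Toy compilation of a stability map from per-segment RAP profiles.

    Each profile is a list of 0/1 indicators (rotational activity present
    at that electrode site during that segment). The "stability" of a site
    is the fraction of segments in which activity was seen; a default
    filter then suppresses sites below the threshold. All names and the
    0.5 default are hypothetical.
    """
    if not segment_rap_profiles:
        raise ValueError("need at least one segment")
    n_sites = len(segment_rap_profiles[0])
    n_segments = len(segment_rap_profiles)
    # Fraction of segments in which each site showed rotational activity.
    stability = [
        sum(profile[i] for profile in segment_rap_profiles) / n_segments
        for i in range(n_sites)
    ]
    # Default stability filter: zero out sites seen in too few segments.
    return [s if s >= stability_threshold else 0.0 for s in stability]

# Site 0 active in 3/3 segments, site 1 in 1/3, site 2 in 2/3.
segments = [[1, 0, 1], [1, 0, 1], [1, 1, 0]]
print(compile_stability_map(segments))  # [1.0, 0.0, 0.6666666666666666]
```

A verification test in this spirit would compare such a compiled map against the individual segment profiles it was built from, which matches the kind of check the summary describes.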

2. Sample Size Used for the Test Set and Data Provenance

  • Test Set Sample Size: The document mentions "clinical data sets" for validation of the Stability Map, but does not specify the sample size (i.e., number of patients or cases) used for these tests.
  • Data Provenance: The document states "clinical data sets" were used. No specific country of origin is mentioned. The testing methods described imply a retrospective analysis of existing clinical data, particularly for validating the Stability Map's compilation and consistency of diagnosis.

3. Number of Experts Used to Establish Ground Truth for the Test Set and Qualifications

  • Number of Experts: The document refers to "Physician validation," implying that multiple physicians were involved in confirming the consistency of diagnoses. However, the exact number of experts is not specified.
  • Qualifications of Experts: The experts are identified as "Physician(s)." No specific qualifications (e.g., years of experience, sub-specialty) are provided in this summary.

4. Adjudication Method for the Test Set

The document does not describe a formal adjudication method (like 2+1 or 3+1 consensus) for establishing ground truth or resolving discrepancies among readers. The "Physician validation" suggests that physicians reviewed the outputs, but the process for achieving a single "ground truth" or resolving disagreements is not detailed.
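For context, a formal "2+1" adjudication rule of the kind mentioned above (but not described in this submission) works as follows: two primary readers label each case independently, and a third reader breaks ties. A minimal sketch, with hypothetical label values:

```python
from typing import Optional

def adjudicate_2plus1(reader_a: str, reader_b: str,
                      adjudicator: Optional[str] = None) -> str:
    """2+1 consensus: if the two primary readers agree, their label is
    the ground truth; otherwise a third adjudicator breaks the tie.
    Labels here ("AF", "AFL", ...) are illustrative."""
    if reader_a == reader_b:
        return reader_a
    if adjudicator is None:
        raise ValueError("readers disagree; adjudicator label required")
    return adjudicator

print(adjudicate_2plus1("AF", "AF"))         # AF (readers agree)
print(adjudicate_2plus1("AF", "AFL", "AF"))  # AF (tie broken by third reader)
```

A "3+1" variant is analogous, with three primary readers and majority vote before escalation.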

5. Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study

No. The document does not describe a multi-reader multi-case (MRMC) comparative effectiveness study to assess how human readers improve with AI vs. without AI assistance. This submission is for a software update that provides new display and analysis tools, rather than an AI-driven diagnostic aid that would directly assist human interpretation in a comparative manner. The "Physician validation" described is more akin to a usability/consistency check of the new features.

6. Standalone Performance (Algorithm Only)

The device is a "computerized system that assists in the diagnosis" and displays information for physician interpretation. It's not a standalone AI algorithm designed to provide a diagnosis without human interaction. The validation focuses on the accuracy of the output of the system (e.g., Stability Maps as correct compilations) and its utility for physicians, rather than an independent diagnostic performance of the algorithm itself. Therefore, a standalone (algorithm only without human-in-the-loop performance) study, in the sense of an AI model making diagnostic predictions, was not performed or detailed for this type of device and software update.

7. Type of Ground Truth Used

The ground truth for the "Physician validation" appears to be expert consensus on diagnosis. The validation checks if "activation maps with and without RAP lead to a consistent diagnosis," implying that the physicians' established diagnosis (which forms the clinical ground truth for the case) should remain consistent regardless of the presence of the new RAP display features. For the "Stability Map Accuracy," the ground truth seems to be computational correctness (i.e., checking if the map is a "correct compilation" of other data).
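The consistency check described above reduces to measuring agreement between two sets of per-case diagnoses, one read with the RAP overlay and one without. The sketch below uses simple percent agreement; the metric, function name, and labels are assumptions for illustration, since the submission does not report how consistency was quantified.

```python
from typing import Sequence

def diagnosis_agreement(with_rap: Sequence[str],
                        without_rap: Sequence[str]) -> float:
    """Fraction of cases where the physician's diagnosis is unchanged
    by the presence of the RAP display (simple percent agreement)."""
    if len(with_rap) != len(without_rap) or not with_rap:
        raise ValueError("need two equal-length, non-empty label lists")
    matches = sum(a == b for a, b in zip(with_rap, without_rap))
    return matches / len(with_rap)

# Hypothetical labels: diagnosis changed on 1 of 4 cases.
print(diagnosis_agreement(["AF", "AFL", "AF", "VT"],
                          ["AF", "AFL", "AF", "AF"]))  # 0.75
```

A chance-corrected statistic such as Cohen's kappa could be used instead when diagnoses cluster heavily on one label.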

8. Sample Size for the Training Set

Not applicable/Not provided. This document describes the validation of a software update for an existing medical device, not the development of a new machine learning model requiring a training set. The algorithms are proprietary, patented, and seem to be based on established computational methods for signal processing and display, rather than data-driven learning from a large training dataset.

9. How Ground Truth for the Training Set Was Established

Not applicable. As no training set is mentioned (see point 8), there is no information on how a ground truth for a training set would have been established.

§ 870.1425 Programmable diagnostic computer.

(a) Identification. A programmable diagnostic computer is a device that can be programmed to compute various physiologic or blood flow parameters based on the output from one or more electrodes, transducers, or measuring devices; this device includes any associated commercially supplied programs.

(b) Classification. Class II (performance standards).