Search Results

Found 2 results

510(k) Data Aggregation

    K Number: K161240
    Date Cleared: 2016-08-10 (100 days)
    Product Code:
    Regulation Number: 870.1425
    Reference & Predicate Devices
    Device Name: Rhythm View Workstation (non-streaming)

    Intended Use

    The Rhythm View Workstation is a computerized system that assists in the diagnosis of complex cardiac arrhythmias. The Rhythm View Workstation is used to analyze electrogram and electrocardiogram signals and display them in a visual format.

    Device Description

    The RhythmView Workstation is comprised of the following components: Cart, Monitor/Display, Computer, Radio-Frequency Identification (RFID) Reader/Writer, Software, Keyboard, Mouse, Two Port USB Switch, Solid State Hard Drive (optional component).

    RhythmView takes electrical signals collected from multi-polar electrophysiology catheters and outputs a graphic display that assists in the diagnosis of cardiac arrhythmias.

    The RhythmView computes and displays electrical rotors or focal beat sources that may sustain human heart rhythm disorders including focal AT, AFL, other SVT, AF, VT and VF in a given patient. The product takes as input electrical signals recorded during the heart rhythm disorder under consideration, typically from multiple specified locations within the heart during an electrophysiological study. The RhythmView then uses proprietary patented algorithms and methods to compute spatial organization during the heart rhythm disorder. These computed elements are displayed graphically in interactive form for review to aid diagnosis by the physician during an electrophysiology study.
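    The summary does not disclose the proprietary algorithms. One published family of techniques for locating rotors in multi-electrode recordings is phase mapping: assign each electrode signal an instantaneous phase (via the analytic/Hilbert signal) and scan the electrode grid for phase singularities, points around which the phase winds by ±2π. The sketch below is a generic illustration of that idea only, not the vendor's actual method; all function names are invented.

    ```python
    import numpy as np

    def instantaneous_phase(signals):
        """Instantaneous phase of each electrode signal via the analytic
        (Hilbert) signal, computed with an FFT along the time axis.
        signals: (n_rows, n_cols, n_samples) array of electrograms."""
        n = signals.shape[-1]
        spectrum = np.fft.fft(signals, axis=-1)
        h = np.zeros(n)                 # frequency-domain Hilbert filter
        h[0] = 1.0
        if n % 2 == 0:
            h[n // 2] = 1.0
            h[1:n // 2] = 2.0
        else:
            h[1:(n + 1) // 2] = 2.0
        analytic = np.fft.ifft(spectrum * h, axis=-1)
        return np.angle(analytic)

    def phase_singularities(phase_frame):
        """Grid cells whose 2x2 corner loop has a phase winding near
        +/-2*pi, i.e. candidate rotor cores.
        phase_frame: (n_rows, n_cols) phase snapshot at one sample."""
        def wrap(d):  # wrap phase differences into [-pi, pi)
            return (d + np.pi) % (2 * np.pi) - np.pi
        p = phase_frame
        winding = (wrap(p[1:, :-1] - p[:-1, :-1])     # down the left edge
                   + wrap(p[1:, 1:] - p[1:, :-1])     # across the bottom
                   + wrap(p[:-1, 1:] - p[1:, 1:])     # up the right edge
                   + wrap(p[:-1, :-1] - p[:-1, 1:]))  # back along the top
        return np.argwhere(np.abs(winding) > 1.5 * np.pi)
    ```

    Applied frame by frame to the output of `instantaneous_phase`, persistent singularities trace candidate rotor trajectories over time; the winding sum around a closed loop is always an exact multiple of 2π, so the 1.5π threshold simply separates ±2π from 0.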

    AI/ML Overview

    Here's an analysis of the provided text to extract information about the device's acceptance criteria and the study proving it meets those criteria.

    Note: The provided document is a 510(k) summary for a software update to an existing device, the RhythmView Workstation. As such, the "study" described focuses on verification and validation of the new software features, rather than a full clinical trial to establish initial safety and effectiveness, which would have been done for the original device. This means some of the requested information (like MRMC studies for AI performance) might not be directly applicable or explicitly detailed in this type of submission.


    Acceptance Criteria and Device Performance Study for RhythmView Workstation SW V6.0.3

    The RhythmView Workstation is a computerized system that assists in the diagnosis of complex cardiac arrhythmias by analyzing electrogram and electrocardiogram signals and displaying them in a visual format. The current 510(k) (K161240) pertains to software version 6.0.3, which introduces new features like "Stability Map" and "Epoch Timeline."

    The performance testing primarily focuses on validating these new software features and confirming that the device continues to meet its intended use.

    1. Table of Acceptance Criteria and Reported Device Performance

    The document does not explicitly list quantitative "acceptance criteria" for the new software features (e.g., a specific sensitivity or specificity threshold). Instead, it describes types of validation performed to ensure the new features function as intended and do not negatively impact the existing functionality.

    New Features Validation:

    | Acceptance Criteria (Implied) | Reported Device Performance (Summary) |
    |---|---|
    | Stability Map Accuracy: the Stability Maps generated are correct compilations of RAP profiles from individual segments. | Verification that the Stability Maps generated by RhythmView are correct compilations of the RAP profiles from individual segments has been conducted. Specific numerical results are not provided, but the conclusion states: "The testing has demonstrated that the SW updates for RhythmView V6.0 provide reasonable assurance that the proposed device conforms to the appropriate requirements for its intended use." |
    | Stability Map & Default Filter Effectiveness: the Stability Map and its default filter setting are validated using clinical data sets. | Validation of the Stability Map and the default Stability Filter setting using clinical data sets has been conducted. |
    | Consistent Diagnosis (with/without RAP): activation maps, with and without Rotational Activity Profile (RAP) data, lead to a consistent diagnosis by physicians. | Physician validation to confirm that the activation maps with and without RAP lead to a consistent diagnosis has been performed. |
    | User Interface (UI) Usability: the new UI features are evaluated through simulated user testing. | Simulated user testing to evaluate the new UI features has been performed. |
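    The submission gives no algorithmic detail, but "correct compilations of RAP profiles from individual segments" suggests a per-site aggregation across recording segments, for instance the fraction of segments in which rotational activity was flagged at each electrode location, with the "Stability Filter" acting as a display threshold. The following is a hypothetical sketch of such an aggregation; the function name, data layout, and default threshold are assumptions, not the vendor's method.

    ```python
    import numpy as np

    def stability_map(rap_profiles, threshold=0.5):
        """Compile per-segment RAP profiles into a single stability map.

        rap_profiles : (n_segments, n_rows, n_cols) array of 0/1 flags,
            one per recording segment, marking where rotational activity
            was detected.
        threshold : minimum fraction of segments (a hypothetical default
            'Stability Filter' setting) for a site to be shown as stable.

        Returns (stability, stable_mask): the per-site fraction of
        segments with rotational activity, and the thresholded mask.
        """
        profiles = np.asarray(rap_profiles, dtype=float)
        stability = profiles.mean(axis=0)   # fraction of segments active
        return stability, stability >= threshold
    ```

    Under this reading, "verification of correct compilation" would amount to checking that each site's stability value equals its across-segment average, and "validation of the default filter" to checking the threshold against clinical data sets.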

    2. Sample Size Used for the Test Set and Data Provenance

    • Test Set Sample Size: The document mentions "clinical data sets" for validation of the Stability Map, but does not specify the sample size (i.e., number of patients or cases) used for these tests.
    • Data Provenance: The document states "clinical data sets" were used. No specific country of origin is mentioned. The testing methods described imply a retrospective analysis of existing clinical data, particularly for validating the Stability Map's compilation and consistency of diagnosis.

    3. Number of Experts Used to Establish Ground Truth for the Test Set and Qualifications

    • Number of Experts: The document refers to "Physician validation," implying that multiple physicians were involved in confirming the consistency of diagnoses. However, the exact number of experts is not specified.
    • Qualifications of Experts: The experts are identified as "Physician(s)." No specific qualifications (e.g., years of experience, sub-specialty) are provided in this summary.

    4. Adjudication Method for the Test Set

    The document does not describe a formal adjudication method (like 2+1 or 3+1 consensus) for establishing ground truth or resolving discrepancies among readers. The "Physician validation" suggests that physicians reviewed the outputs, but the process for achieving a single "ground truth" or resolving disagreements is not detailed.

    5. Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study

    No. The document does not describe a multi-reader multi-case (MRMC) comparative effectiveness study to assess how human readers improve with AI vs. without AI assistance. This submission is for a software update that provides new display and analysis tools, rather than an AI-driven diagnostic aid that would directly assist human interpretation in a comparative manner. The "Physician validation" described is more akin to a usability/consistency check of the new features.

    6. Standalone Performance (Algorithm Only)

    The device is a "computerized system that assists in the diagnosis" and displays information for physician interpretation. It's not a standalone AI algorithm designed to provide a diagnosis without human interaction. The validation focuses on the accuracy of the output of the system (e.g., Stability Maps as correct compilations) and its utility for physicians, rather than an independent diagnostic performance of the algorithm itself. Therefore, a standalone (algorithm only without human-in-the-loop performance) study, in the sense of an AI model making diagnostic predictions, was not performed or detailed for this type of device and software update.

    7. Type of Ground Truth Used

    The ground truth for the "Physician validation" appears to be expert consensus on diagnosis. The validation checks if "activation maps with and without RAP lead to a consistent diagnosis," implying that the physicians' established diagnosis (which forms the clinical ground truth for the case) should remain consistent regardless of the presence of the new RAP display features. For the "Stability Map Accuracy," the ground truth seems to be computational correctness (i.e., checking if the map is a "correct compilation" of other data).

    8. Sample Size for the Training Set

    Not applicable/Not provided. This document describes the validation of a software update for an existing medical device, not the development of a new machine learning model requiring a training set. The algorithms are proprietary, patented, and seem to be based on established computational methods for signal processing and display, rather than data-driven learning from a large training dataset.

    9. How Ground Truth for the Training Set Was Established

    Not applicable. As no training set is mentioned (see point 8), there is no information on how a ground truth for a training set would have been established.

    K Number: K151245
    Manufacturer:
    Date Cleared: 2015-09-15 (127 days)
    Product Code:
    Regulation Number: 870.1425
    Reference & Predicate Devices
    Device Name: Rhythm View Workstation (non-streaming, streaming option)

    Intended Use

    The Rhythm View Workstation is a computerized system that assists in the diagnosis of complex cardiac arrhythmias. The Rhythm View Workstation is used to analyze electrogram and electrocardiogram signals and display them in a visual format.

    Device Description

    The RhythmView Workstation is comprised of the following components: Cart, Monitor/Display, Computer, Radio-Frequency Identification (RFID) Reader/Writer, Solid State Hard Drive (optional component), Keyboard, Mouse, Two Port USB Switch, Video Splitter (new component, streaming option only; *Panel box required), Software.

    RhythmView takes electrical signals collected from multi-polar electrophysiology catheters and outputs a graphic display that assists in the diagnosis of cardiac arrhythmias.

    The RhythmView computes and displays electrical rotors or focal beat sources responsible for maintaining human heart rhythm disorders including focal AT, AFL, other SVT, AF, VT and VF in a given patient. The product takes as input electrical signals recorded during the heart rhythm disorder under consideration, typically from multiple specified locations within the heart during an electrophysiological study. The RhythmView then uses proprietary patented algorithms and methods to compute spatial organization during the heart rhythm disorder. These computed elements are displayed graphically in interactive form for review to aid diagnosis by the physician during an electrophysiology study.

    AI/ML Overview

    The provided text describes the RhythmView Workstation SW V5.0, a computerized system that assists in the diagnosis of complex cardiac arrhythmias by analyzing electrogram and electrocardiogram signals.

    Here's an analysis of the acceptance criteria and study information, based on the provided text:

    1. Table of Acceptance Criteria and Reported Device Performance

    The FDA clearance document for the RhythmView Workstation SW V5.0 (K151245) indicates that the device's performance was evaluated through verification and validation testing, including Electrical Safety and EMC testing, and usability testing. The document states that these tests provide "reasonable assurance that the proposed device conforms to the appropriate requirements for its intended use" and that "it has been demonstrated that the RhythmView Workstation is safe and effective for its intended use."

    However, specific numerical acceptance criteria (e.g., sensitivity, specificity, accuracy thresholds) and their corresponding reported performance values are not explicitly detailed in the provided text. The document focuses on demonstrating substantial equivalence to predicate devices and adherence to general safety and performance standards rather than quantitative clinical performance metrics.

    The "Device Characteristic" table in Appendix 1: 510(k) Summary compares the proposed and predicate RhythmView Workstations in terms of features and functionality. This table can be interpreted as a set of functional requirements or acceptance criteria for the new version, demonstrating that it either maintains existing functionality ("Yes" in both columns) or introduces new features ("Yes" in Proposed, "No" in Predicate).

    | Device Characteristic | Expected Performance (Acceptance Criteria, implicit from Predicate) | Reported Device Performance (Proposed RhythmView™ Workstation) |
    |---|---|---|
    | Signal processing | Yes | Yes |
    | Post-processing display | Yes | Yes |
    | Grid display of electrode signals | Yes | Yes |
    | Programming Language | C++ | C++ |
    | Export of processed file into video format | Yes | Yes |
    | Manual tagging by user of electrograms | No | No |
    | OTS Software requirements | Same as Predicate | Same as Predicate |
    | Display options for review of processed signals | Electrical Activity; Contours Only; DContours; Rotational Activity Profile | Electrical Activity; Contours Only; DContours; Rotational Activity Profile |
    | RAP display (optional) | Monochromatic only | Multi-color with monochromatic option available |
    | RFID Reader/Writer Function | Yes | Yes |
    | Data transfer via Two Port Switch | Yes | Yes |
    | Direct data transfer via USB cable to RV Workstation from EP system | Option available | Option available |
    | Atrial Function | Yes | Yes |
    | Ventricular Function | Option available | Option available |
    | Signal Quality Indicator | No (new feature) | Yes |
    | Spotlight Feature | No (new feature) | Yes |
    | Streaming real-time electrograms | No (new feature) | Option available |

    2. Sample Size Used for the Test Set and Data Provenance

    The provided text does not specify the sample size used for any test set or the data provenance (e.g., country of origin, retrospective or prospective) for the studies mentioned. The testing described is primarily focused on software and hardware verification/validation, and usability, rather than a clinical performance study with patient data.

    3. Number of Experts Used to Establish the Ground Truth for the Test Set and Qualifications of Those Experts

    This information is not provided in the text. Given that the testing mentioned is primarily verification, validation, and usability, it's unlikely that a traditional "ground truth" for clinical performance, as established by multiple clinical experts, was a primary component of this regulatory submission.

    4. Adjudication Method for the Test Set

    This information is not provided in the text.

    5. Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study and Effect Size of Human Reader Improvement with AI Assistance

    A Multi-Reader Multi-Case (MRMC) comparative effectiveness study is not mentioned in the provided text. The device is described as "assisting" in diagnosis and displaying information for a physician's review, but there is no data presented on human reader performance improvement with or without the AI.

    6. Standalone Performance (Algorithm Only, Without Human-in-the-Loop)

    A standalone performance study of the algorithm is not explicitly mentioned in the provided text in terms of quantitative clinical metrics (e.g., sensitivity, specificity). The focus is on the workstation as a whole and its functionality in assisting a physician. The statement "The RhythmView then uses proprietary patented algorithms and methods to compute spatial organization during the heart rhythm disorder" confirms the presence of an algorithm, but its standalone performance is not quantified.

    7. The Type of Ground Truth Used

    The text does not explicitly state the type of ground truth used for any clinical performance evaluation. The "ground truth" for the verification and validation (V&V) testing would likely refer to engineering specifications and expected software behavior, and for usability testing, it would relate to user task completion and satisfaction. For the underlying algorithms that "compute spatial organization," the ground truth, if evaluated, would typically be established based on accepted electrophysiological principles or expert consensus in the field.

    8. The Sample Size for the Training Set

    The text does not provide any information regarding a training set sample size. This type of detail is typically associated with the development of the "proprietary patented algorithms and methods" but is not part of this 510(k) summary.

    9. How the Ground Truth for the Training Set Was Established

    The text does not provide any information regarding how the ground truth for a training set was established.

