
510(k) Data Aggregation

    K Number: K970545
    Date Cleared: 1997-05-08 (85 days)
    Product Code:
    Regulation Number: 870.1025
    Reference & Predicate Devices: N/A
    Device Name: DASH 1000 PATIENT MONITOR

    Intended Use

    The Marquette Eagle 1000 (DASH 1000) Patient Monitor is designed to monitor a patient's basic physiological parameters, including electrocardiography (ECG), invasive blood pressure, non-invasive blood pressure, oxygen saturation, and temperature. An optional paper recorder may be added to the basic monitor configuration to print information. The device is intended for adult, pediatric, and/or neonatal patient populations.

    Device Description

    The Marquette Eagle 1000 Patient Monitor is a microprocessor-based, software-driven monitoring system designed to monitor a patient's basic physiological parameters, including electrocardiography (ECG), invasive blood pressure, non-invasive blood pressure, oxygen saturation, and temperature. The signal-acquisition and signal-processing technologies, along with the basic parts of the device software, were reused from earlier devices.

    AI/ML Overview

    The provided 510(k) summary for the Marquette Eagle 1000 Patient Monitor primarily focuses on demonstrating substantial equivalence to a predicate device rather than presenting detailed performance studies with acceptance criteria for specific alarm detection or diagnostic functions. The device is a patient monitoring system, and the submission emphasizes its ability to monitor basic physiological parameters and raise alarms.

    Here's an analysis of the requested information based on the provided text:

    Acceptance Criteria and Device Performance

    The document does not explicitly state numerical acceptance criteria for each monitored parameter (e.g., ECG accuracy, blood pressure accuracy, SpO2 accuracy), nor does it report specific device performance against such criteria. Instead, it makes a general statement:

    General Statement on Performance:
    "Testing was performed on the Eagle 1000 Patient Monitor and its predicate devices. Precision, accuracy, as well as safety testing was performed. Test results indicate that the Eagle 1000 Patient Monitor provides an equivalent level or better in performance, when compared to the legally marketed predicate device(s) when tested to the accuracy requirements as specified in the contents of the premarket notification submission."

    Since no specific acceptance criteria or quantitative performance data are given, a table of acceptance criteria and reported device performance cannot be generated from the provided text.


    Additional Requested Information:

    2. Sample size used for the test set and the data provenance (e.g., country of origin of the data, retrospective or prospective)
    The document does not provide details on the sample size used for the test set or the data provenance.

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts (e.g., radiologist with 10 years of experience)
    This information is not provided in the document. Given that this is a patient monitor for basic physiological parameters, ground truth would typically be established by validated reference devices, not human experts in the diagnostic sense (e.g., a highly accurate ECG machine as a reference for ECG, or an arterial line for invasive blood pressure).

    4. Adjudication method (e.g., 2+1, 3+1, none) for the test set
    Not applicable and not provided. Adjudication methods are typically used when subjective interpretations are involved, which is not the primary function of a physiological monitor measuring objective parameters.

    5. Whether a multi-reader multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI assistance versus without it
    Not applicable. This device is a monitor, not an AI-assisted diagnostic tool that would involve human readers interpreting images or complex data in an MRMC study.

    6. Whether a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done
    This is a standalone device in the sense that it acquires, processes, and displays physiological data without direct human intervention in the signal processing. However, the document does not report "algorithm only" performance metrics separate from the integrated device.

    7. The type of ground truth used (expert consensus, pathology, outcomes data, etc.)
    The document does not explicitly state the type of ground truth used. For a physiological monitor, ground truth is typically established by validated reference devices known for their high accuracy in measuring the specific physiological parameter (e.g., a highly accurate reference thermometer for temperature, an arterial line for invasive blood pressure, or a gold-standard oximeter for SpO2).
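The reference-device approach described above is conventionally quantified with a mean bias and Bland-Altman limits of agreement between paired monitor and reference readings. This sketch is illustrative only and not from the 510(k) submission; the function name and the sample readings are hypothetical.

```python
# Hypothetical sketch of reference-device agreement analysis
# (Bland-Altman style); not taken from the 510(k) document.
import statistics

def agreement(monitor, reference):
    """Return (mean bias, lower LoA, upper LoA) for paired readings."""
    diffs = [m - r for m, r in zip(monitor, reference)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)          # sample standard deviation
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Made-up paired SpO2 readings (monitor vs. gold-standard oximeter):
monitor_spo2 = [97, 95, 99, 92, 96, 94]
reference_spo2 = [96, 95, 98, 93, 96, 95]

bias, lo, hi = agreement(monitor_spo2, reference_spo2)
print(f"bias={bias:.2f}, LoA=({lo:.2f}, {hi:.2f})")
```

A submission of this kind would typically report such bias and agreement figures against stated accuracy requirements for each parameter.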

    8. The sample size for the training set
    The document does not mention a "training set" as this device predates the common application of machine learning with distinct training and test sets in medical device submissions. The device is described as "microprocessor-based, software-driven" and that "signal-acquisition and -processing technologies and the basic parts of the device software were re-used from former devices." This implies that the design and performance were likely based on established engineering principles and validation against known physiological signals, rather than iterative machine learning training.

    9. How the ground truth for the training set was established
    Not applicable, as a distinct "training set" in the modern machine learning sense is not indicated. The software and signal processing were reused from earlier devices, suggesting that their development and validation followed the standard medical device engineering practices of the time, likely involving comparison against established reference measurements.
