Indicated for central monitoring of multiple adult, pediatric, and neonatal patients; and where the clinician decides to monitor cardiac arrhythmia of adult, pediatric, and neonatal patients and/or ST segment of adult patients to gain information for treatment, to monitor adequacy of treatment, or to exclude causes of symptoms.
The Philips IntelliVue Information Center iX Software Revision B.01 is central station software that runs on off-the-shelf Windows PCs and servers which can connect to recorders for waveform printing. It displays physiologic waves and parameters from multiple patient connected monitors and telemetry devices in summary or detailed format, and generates alarm signals. It provides retrospective review applications and a variety of data import and export functions.
This 510(k) premarket notification for the M3290B Philips IntelliVue Information Center iX Software Release B.01 does not contain detailed information about the acceptance criteria or a specific study proving the device meets those criteria. The document primarily focuses on establishing substantial equivalence to a predicate device (M3290B IntelliVue Information Center software, Release A.0, marketed pursuant to K102495) based on shared indications for use and technological characteristics.
Instead, the document states:
"Verification, validation, and testing activities, where required to establish the performance, functionality, and reliability characteristics of the new device with respect to the predicate are performed. Testing involved system level tests, performance tests, and safety testing from hazard analysis. Pass/Fail criteria were based on the specifications cleared for the predicate device and test results showed substantial equivalence."
This indicates that internal testing was conducted against existing specifications (presumably for the predicate device) to verify performance. However, the specific acceptance criteria, the detailed results, and the methodology of these tests are not provided in this summary.
Therefore, most of the requested information cannot be extracted from this document.
Here's what can be gathered, with limitations:
1. A table of acceptance criteria and the reported device performance
- Acceptance Criteria: Not explicitly stated in terms of quantitative metrics (e.g., sensitivity, specificity, accuracy for arrhythmia detection). The document generalizes: "Pass/Fail criteria were based on the specifications cleared for the predicate device."
- Reported Device Performance: Not explicitly provided with specific numbers. The document states: "test results showed substantial equivalence. The M3290B IntelliVue Information Center Software meets all defined reliability requirements and performance claims."
2. Sample size used for the test set and the data provenance (e.g. country of origin of the data, retrospective or prospective)
- Not specified in the document.
3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts (e.g. radiologist with 10 years of experience)
- Not specified in the document. Given the nature of central station software, the ground truth for internal performance testing was most likely based on established medical standards or reference equipment, rather than on direct expert labeling of each data point as would be typical for an AI diagnostic algorithm.
4. Adjudication method (e.g. 2+1, 3+1, none) for the test set
- Not specified in the document.
5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI assistance versus without it
- No, an MRMC study is not mentioned. This device is central station software for displaying physiological data and generating alarms, not an AI-assisted diagnostic tool of the kind that would typically undergo MRMC studies of human reader performance.
6. If a standalone study (i.e., algorithm-only performance, without a human in the loop) was done
- The document implies standalone testing was performed; it refers to "system level tests, performance tests, and safety testing," but details on what constitutes "standalone performance" in this context (e.g., specific event-detection accuracy) are not provided. Given that this is a central monitoring system with alarm functions, its "standalone" performance would most likely concern its ability to correctly process and display data and to trigger alarms according to predefined thresholds (a minimal sketch of this kind of logic follows this list).
7. The type of ground truth used (expert consensus, pathology, outcomes data, etc)
- Not specified. For a physiological monitor, ground truth would typically come from known calibrated inputs, reference measurements, or established medical standards for event detection.
8. The sample size for the training set
- Not applicable as this is a software update to an existing monitoring system, not a new AI algorithm that uses a "training set" in the machine learning sense. The testing likely involved verification and validation against functional specifications.
9. How the ground truth for the training set was established
- Not applicable (see point 8).
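As context for points 6 and 7, the following is a minimal, purely illustrative sketch of the kind of threshold-based alarm logic, and of verification against known inputs, that standalone testing of a central station would exercise. All parameter names, limit values, and functions below are hypothetical assumptions and are not taken from Philips specifications or the 510(k) summary.

```python
from dataclasses import dataclass

# Hypothetical illustration only: parameter names, limits, and structure are
# assumptions, not taken from the IntelliVue Information Center specifications.

@dataclass
class AlarmLimits:
    lower: float
    upper: float

# Example preset limits for a few monitored parameters (illustrative values).
PRESET_LIMITS = {
    "heart_rate_bpm": AlarmLimits(lower=50, upper=120),
    "spo2_percent": AlarmLimits(lower=90, upper=100),
}

def check_alarms(observations: dict[str, float]) -> list[str]:
    """Return a list of alarm messages for values outside their preset limits."""
    alarms = []
    for parameter, value in observations.items():
        limits = PRESET_LIMITS.get(parameter)
        if limits is None:
            continue  # No configured limits for this parameter; nothing to check.
        if value < limits.lower or value > limits.upper:
            alarms.append(f"{parameter}={value} outside [{limits.lower}, {limits.upper}]")
    return alarms

# A verification case feeds known inputs and compares the output against the
# expected alarm condition (the "ground truth" for this kind of functional test).
assert check_alarms({"heart_rate_bpm": 42}) == [
    "heart_rate_bpm=42 outside [50, 120]"
]
assert check_alarms({"heart_rate_bpm": 80, "spo2_percent": 97}) == []
```

In such a verification setting, the pass/fail criterion is simply that the computed alarm state matches the expected state for each known input, consistent with the document's statement that pass/fail criteria were based on previously cleared specifications.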
§ 870.2300 Cardiac monitor (including cardiotachometer and rate alarm).
(a) Identification. A cardiac monitor (including cardiotachometer and rate alarm) is a device used to measure the heart rate from an analog signal produced by an electrocardiograph, vectorcardiograph, or blood pressure monitor. This device may sound an alarm when the heart rate falls outside preset upper and lower limits.
(b) Classification. Class II (performance standards).
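To illustrate the rate-alarm function described in the identification above, here is a hedged sketch of deriving a heart rate from beat-to-beat (R-R) intervals and comparing it against preset upper and lower limits. The interval source, limit values, and function names are assumptions for illustration only; the regulation defines the device function, not an implementation.

```python
# Illustrative only: the interval source, limit values, and function names are
# assumptions; 21 CFR 870.2300 describes the device function, not an implementation.

def heart_rate_bpm(rr_intervals_s: list[float]) -> float:
    """Estimate heart rate in beats per minute from R-R intervals in seconds."""
    mean_rr = sum(rr_intervals_s) / len(rr_intervals_s)
    return 60.0 / mean_rr

def rate_alarm(rate_bpm: float, lower: float = 50.0, upper: float = 120.0) -> bool:
    """Return True when the measured heart rate falls outside the preset limits."""
    return rate_bpm < lower or rate_bpm > upper

# Example: R-R intervals of 0.5 s correspond to 120 bpm, exactly at the upper limit.
rate = heart_rate_bpm([0.5, 0.5, 0.5])
print(rate, rate_alarm(rate))  # 120.0 False (at, but not outside, the limits)
```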