Intended Use
The intended use of the Information Center Software is to display physiologic waves, parameters, and trends; to format data for strip chart recordings; and to provide secondary annunciation of alarms from other networked medical devices at a centralized location. The Information Center Software also provides for the retrospective review of alarms, physiologic waves, and parameters from its database.
An additional intended use of the Information Center Software is to provide primary annunciation of alarms and configuration and control access for networked telemetry monitors.
This product is intended for use in health care facilities by trained healthcare professionals. This product is not intended for home use.
Indications for Use
Indicated for central monitoring of multiple adult and all pediatric subgroups (Newborn (neonate), Infant, Child, Adolescent) patients; and where the clinician decides to monitor cardiac arrhythmia of adult, pediatric, and neonatal patients and/or the ST segment of adult patients to gain information for treatment, to monitor adequacy of treatment, or to exclude causes of symptoms.
The Philips IntelliVue Information Center iX Software Revision C.0 is central station software that runs on off-the-shelf Windows PCs and servers, which can connect to recorders for waveform printing. It displays physiologic waves and parameters from multiple patient-connected monitors and telemetry devices in summary or detailed format, and generates alarm signals. It provides retrospective review applications and a variety of data import and export functions.
The provided FDA 510(k) summary for the Philips IntelliVue Information Center iX (K153702) discusses software changes but does not contain detailed information about specific acceptance criteria, device performance, or a study rigorously proving the device meets new acceptance criteria. Instead, it focuses on demonstrating substantial equivalence to a predicate device (K143057) through non-clinical testing of design, functionality, and reliability, rather than clinical performance for new applications.
The document explicitly states: "Clinical Performance testing for M3290B Philips IntelliVue Information Center iX software Release C.0 was not performed, as there were no new clinical applications that had hazards or risk mitigations that required a clinical performance testing to support equivalence."
Therefore, I cannot populate the requested tables and sections with specific acceptance criteria and performance data for this particular 510(k) application, as such detailed information is not present in the provided text. The submission relies on demonstrating that the software updates do not introduce new safety or effectiveness concerns compared to the already cleared predicate device.
However, based on the non-clinical testing performed and the general approach of a 510(k) summary seeking substantial equivalence, I can describe what would typically be the nature of the acceptance criteria and study in such a scenario, by interpreting the information provided and explicitly noting what is absent.
Description of Acceptance Criteria and Study to Prove Device Meets Acceptance Criteria
The provided 510(k) summary for the Philips IntelliVue Information Center iX (K153702) focuses on demonstrating substantial equivalence to a predicate device (K143057) for software updates. It explicitly states that clinical performance testing was not performed because no new clinical applications or significant new hazards/risks were introduced that would necessitate it. Therefore, the "acceptance criteria" here are primarily tied to verifying that the updated software maintains the safety, effectiveness, functionality, and reliability characteristics of the predicate device, as confirmed through non-clinical testing.
1. Table of Acceptance Criteria and Reported Device Performance
Given that no clinical performance study was conducted for new clinical applications, specific numerical performance metrics (e.g., sensitivity, specificity for arrhythmia detection) are not reported for this particular 510(k) submission. The acceptance criteria and "performance" are framed around maintaining equivalence to the predicate device.
| Acceptance Criterion (Implied/General) | Reported Device Performance (as per document) |
| --- | --- |
| Functional Equivalence | Data acquisition from Philips Efficia monitors (new); transmission of web interface to IntelliVue bedside monitors (new/expanded outbound data services); expanded ability to store complex data sets from various additional sources (PDX Data Warehouse); integration of Early Warning Score (EWS) information from bedside monitors (new/expanded application); auto-assignment of bed labels when configured; display of Philips Efficia monitor integration similar to IntelliVue; management association for "orphan beds" in patient and equipment management; display of previously gathered ST/AR algorithm data (no changes to algorithm). |
| Reliability and Stability | Verification, validation, and testing activities, including system-level tests, performance tests, and safety testing from hazard analysis. Test results showed substantial equivalence, meeting all defined reliability requirements and performance claims based on specifications cleared for the predicate device. |
| Safety | Risk analysis conducted; design reviews conducted; testing included safety testing from hazard analysis. No new safety and/or effectiveness concerns were identified compared to the predicate device. |
| Performance Standards | Compliance with Philips verification and validation processes; pass/fail criteria based on specifications cleared for the predicate device; compliance with requirements specified in international and FDA-recognized consensus standards. |
| Clinical Performance (New Risks) | Clinical performance testing not performed, as no new clinical applications with hazards or risk mitigations requiring it were identified. The device's clinical performance is thereby considered equivalent to the predicate, which established such performance in its own clearance. |
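To make the "integration of Early Warning Score (EWS) information" row more concrete: an EWS is typically an aggregate of banded sub-scores computed from individual vital signs. The sketch below is purely illustrative, assuming NEWS-style banding with made-up thresholds; it is not the cleared device's algorithm, and the function and band names are hypothetical.

```python
# Illustrative sketch of a simplified early-warning-score calculation,
# of the kind a central station might receive from bedside monitors.
# All thresholds are hypothetical and NOT from the Philips submission.

def band_score(value, bands):
    """Return the sub-score of the first (low, high, score) band containing value."""
    for low, high, score in bands:
        if low <= value <= high:
            return score
    raise ValueError(f"value {value} outside all defined bands")

# (low, high, score) triples per vital sign -- purely illustrative numbers.
HEART_RATE_BANDS = [
    (0, 40, 3), (41, 50, 1), (51, 90, 0), (91, 110, 1), (111, 130, 2), (131, 300, 3),
]
RESP_RATE_BANDS = [
    (0, 8, 3), (9, 11, 1), (12, 20, 0), (21, 24, 2), (25, 80, 3),
]
SPO2_BANDS = [
    (0, 91, 3), (92, 93, 2), (94, 95, 1), (96, 100, 0),
]

def simple_ews(heart_rate, resp_rate, spo2):
    """Sum the per-parameter sub-scores into one aggregate score."""
    return (band_score(heart_rate, HEART_RATE_BANDS)
            + band_score(resp_rate, RESP_RATE_BANDS)
            + band_score(spo2, SPO2_BANDS))
```

For example, under these illustrative bands, normal vitals (HR 75, RR 16, SpO2 98) score 0, while mildly deranged vitals accumulate sub-scores that a central station could display or alarm on.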
2. Sample Size for Test Set and Data Provenance
- Test Set Sample Size: Not specified in the provided document. The testing was non-clinical, likely involving various software modules and integration points rather than a "test set" of patient data in the conventional sense for clinical performance.
- Data Provenance: Not applicable in the context of clinical patient data for this submission, as the testing was non-clinical (engineering verification and validation).
3. Number of Experts Used to Establish Ground Truth for the Test Set and Qualifications
- Number of Experts: Not applicable, as the testing was non-clinical. Ground truth for software functionality, reliability, and safety is typically established against design specifications, recognized standards, and hazard analyses, rather than clinical expert consensus on patient data.
- Qualifications of Experts: Not specified. Testing would have been conducted by Philips' internal engineering, quality assurance, and regulatory teams.
4. Adjudication Method for the Test Set
- Adjudication Method: Not applicable, as no clinical ground truth requiring adjudication was established for this submission.
5. Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study
- MRMC Study Done: No. An MRMC study is a clinical study involving multiple human readers interpreting cases to assess diagnostic performance. This submission explicitly states "Clinical Performance testing... was not performed."
6. Standalone Performance Study (Algorithm only)
- Standalone Performance Study Done: No, not in the sense of a new clinical algorithm being evaluated for its standalone diagnostic performance. The document mentions "Release C.0 allows data previously gathered by the algorithm [ST/AR] to be displayed. No changes to the algorithm are present." This indicates that existing algorithms (like ST/AR for arrhythmia/ST segment analysis) were unchanged, and their performance would have been established in previous 510(k) clearances for the predicate device. The focus here is on the information center's ability to process and display that data.
7. Type of Ground Truth Used
- Type of Ground Truth: For the non-clinical testing performed, the ground truth was based on:
- Product Specifications: Meeting defined requirements.
- Design Specifications: Adherence to engineered design.
- Hazard Analysis: Ensuring risks are mitigated and safety maintained.
- Predicate Device Specifications: Ensuring equivalence in performance and functionality to the device previously cleared.
- International and FDA-recognized Consensus Standards: Compliance with industry benchmarks.
8. Sample Size for the Training Set
- Training Set Sample Size: Not applicable. This submission concerns software updates to a central monitoring system, not the development of new machine learning algorithms requiring a training set of data. Existing algorithms (like ST/AR) maintain their previously established performance and would have been "trained" (or validated) in earlier submissions if they involved such methodologies.
9. How Ground Truth for the Training Set Was Established
- How Ground Truth Was Established: Not applicable, as no new training set was used for this submission.
§ 870.1025 Arrhythmia detector and alarm (including ST-segment measurement and alarm).
(a) Identification. The arrhythmia detector and alarm device monitors an electrocardiogram and is designed to produce a visible or audible signal or alarm when atrial or ventricular arrhythmia, such as premature contraction or ventricular fibrillation, occurs.
(b) Classification. Class II (special controls). The guidance document entitled "Class II Special Controls Guidance Document: Arrhythmia Detector and Alarm" will serve as the special control. See § 870.1 for the availability of this guidance document.