
510(k) Data Aggregation

    K Number
    K163584
    Date Cleared
    2017-07-07

    (199 days)

    Product Code
    Regulation Number
    870.1025
    Reference & Predicate Devices
    Device Name:

    M3290B Philips IntelliVue Information Center iX

    Intended Use

    The intended use of the Philips Patient Information Center iX software application is to:

    Receive, aggregate, process, distribute and display physiologic waves, parameters, alarms and events at locations other than at the patient, for multiple patients.

    Determine alarm conditions and generate alarm signals for Philips approved medical devices, that send physiological data and do not have the ability to determine the alarm condition.

    • Algorithms present in the software are limited to the ST/AR ECG (for arrhythmia, ST Segment and QT Segment Monitoring) and SpO2.

    Generate alarm signals for user notification, based on the alarm signal determined and sent by Philips approved medical devices.

    Perform diagnostic 12-Lead analysis and interpretation based on raw ECG data samples provided from Philips approved medical devices. Result may be displayed, printed and/or distributed to Philips approved medical devices.

    Provide review and trend application data designed to contribute to the screening of patient condition. Visual indications provided are intended to support the judgement of a medical professional and are not intended to be the sole source of information for decision making; thus these applications are not intended for diagnosis or active patient monitoring where immediate action is required.

    Provide connection to other systems not associated with active patient monitoring, such as information systems. The software transfers, stores, converts from one format to another, and displays medical device data.

    The Information Center Software is intended for use in professional healthcare facilities by trained healthcare professionals. The Information Center Software is not intended for home use.

    Indicated for use when monitoring adult and/or specified pediatric subgroups (Newborn (neonate), Infant, Child, Adolescent) patients as indicated by labeling of the medical device providing the data. Rx only.
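The intended use above distinguishes two alarm paths: determining alarm conditions centrally for approved devices that send raw physiological data but cannot alarm themselves, and re-annunciating alarm signals already determined by the sending device. A minimal sketch of that split, with hypothetical message fields and an illustrative (non-clinical) SpO2 threshold, not the Philips protocol:

```python
# Illustrative sketch of the two alarm paths described above.
# All field names and thresholds are hypothetical examples, not the
# Philips protocol or its clinical alarm limits.

def annunciate(message: dict) -> list[str]:
    """Return alarm strings to raise for one incoming device message."""
    alarms = []
    if message.get("alarm"):             # device already determined the alarm:
        alarms.append(message["alarm"])  # relay it (secondary annunciation)
    elif "spo2" in message:              # device sends raw values only:
        if message["spo2"] < 90:         # determine the condition centrally
            alarms.append("SpO2 LOW")    # (primary annunciation)
    return alarms

# Usage: a device that cannot alarm vs. a monitor that already did.
print(annunciate({"spo2": 85}))           # -> ['SpO2 LOW']
print(annunciate({"alarm": "ASYSTOLE"}))  # -> ['ASYSTOLE']
```

The key design point is that the central station only computes alarm conditions for sources that cannot do so themselves; alarms already determined at the bedside are relayed unchanged.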

    Device Description

    The Philips Patient Information Center uses off-the-shelf Windows PCs and servers, combined with the Patient Information Center iX M3290B software Release C.01, to provide centralized display of physiologic waves, parameters, and trends, format data for strip chart recordings and printed reports, and provide secondary annunciation of alarms from other networked medical devices. The M3290B Software provides for the retrospective review of alarms, physiologic waves and parameters from its database. Additionally, the M3290B Software provides primary annunciation of alarms and configuration and control access for networked telemetry monitors.

    Compatible Accessories include: Mobile Caregiver, a viewing-only mobile application (a medical device data system) associated with the Enhanced Web Viewing feature cleared in the predicate. This is not a new mobile application, and it has no changes that introduce significant risks for the PIC iX C.01 release.

    AI/ML Overview

    The provided document, a 510(k) premarket notification for the Philips Medical Systems "M3290B Patient Information Center iX Release C.01," primarily focuses on establishing substantial equivalence to a predicate device (M3290B Philips IntelliVue Information Center iX software Revision C.0). The document asserts that the changes in the new device do not introduce significant risks or new clinical applications requiring clinical performance testing. Therefore, it does not describe a clinical study in the traditional sense, particularly one involving an AI algorithm that would have specific acceptance criteria for diagnostic performance metrics (e.g., sensitivity, specificity, AUC) or human-AI reader performance.

    The device described is a patient information center software, which functions to receive, aggregate, process, distribute, and display physiological data, determine alarm conditions, generate alarm signals, and perform diagnostic 12-Lead analysis and interpretation. The algorithms mentioned (ST/AR ECG for arrhythmia, ST Segment, and QT Segment Monitoring) are present in the software, but the performance evaluation described focuses on system-level testing and verification/validation to ensure the device functions as intended and is as safe and effective as its predicate.
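QT monitoring of the kind named above generally reports a heart-rate-corrected interval (QTc). One widely used correction is Bazett's formula, QTc = QT / √RR with intervals in seconds; the sketch below shows only that generic formula, not the proprietary ST/AR implementation:

```python
import math

def qtc_bazett(qt_s: float, rr_s: float) -> float:
    """Bazett rate correction: QTc = QT / sqrt(RR), intervals in seconds."""
    return qt_s / math.sqrt(rr_s)

# At 60 bpm (RR = 1.0 s) the correction is the identity:
print(round(qtc_bazett(0.40, 1.0), 3))  # -> 0.4
```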

    Based on the provided text, here is an analysis of the requested information:

    1. A table of acceptance criteria and the reported device performance

    The document does not provide a table of acceptance criteria in the form of specific performance metrics (like sensitivity, specificity, accuracy) for a new clinical application or AI algorithm, nor does it present reported device performance against such metrics. Instead, the acceptance criteria are implicitly tied to the system-level functional and safety requirements, and the reported performance is that the device "meets all defined reliability requirements and performance claims" and "test results showed substantial equivalence."

    Acceptance criteria (implicit from the document) and reported device performance:

    • Functional Equivalence: Device performs its functions (receive, aggregate, process, distribute, and display data; generate alarms; 12-Lead analysis) as intended and as demonstrated by the predicate device. Reported: "Performance, functionality, and reliability characteristics of the new device with respect to the predicate are performed."
    • Safety: Device operates without introducing new or significant safety risks compared to the predicate. Reported: "Testing involved system level tests, performance tests, and safety testing from hazard analysis." "The M3290B Philips IntelliVue Information Center iX software Release C.01 meets all defined reliability requirements and performance claims."
    • Reliability: Device maintains consistent and dependable operation. Reported: "The M3290B Philips IntelliVue Information Center iX software Release C.01 meets all defined reliability requirements and performance claims."
    • Substantial Equivalence: Device is equivalent to the predicate in design, technology, intended use, safety, and effectiveness. Reported: "M3290B Philips IntelliVue Information Center iX software Release C.01 is substantially equivalent to the predicate device M3290B Philips IntelliVue Information Center iX software Release C.0 (K153702)."
    • Adherence to Specifications/Standards: Device complies with Philips' verification and validation processes and relevant consensus standards. Reported: "Tested in accordance with Philips verification and validation processes." "Complied with the requirements specified in the international and FDA-recognized consensus standards."

    2. Sample size used for the test set and the data provenance (e.g., country of origin of the data, retrospective or prospective)

    The document states: "Clinical Performance testing for M3290B Philips IntelliVue Information Center iX software Release C.01 was not performed, as there were no new clinical applications that had hazards or risk mitigations that required a clinical performance testing to support equivalence."

    Therefore, there is no specific test set or clinical data (sample size, provenance) described for evaluating the performance of the integrated algorithms. The testing was non-clinical, focusing on system-level verification and validation.

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts (e.g., radiologist with 10 years of experience)

    Since no clinical performance study was conducted to evaluate the algorithms' diagnostic performance, there was no ground truth established by experts for a test set. The validation was against the defined specifications and the performance of the predicate device.

    4. Adjudication method (e.g., 2+1, 3+1, none) for the test set

    Not applicable, as no clinical test set requiring expert adjudication was used.

    5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI vs. without AI assistance

    Not applicable. The document explicitly states that clinical performance testing was not performed. This device is an information center, not an AI-powered diagnostic tool intended to directly assist human readers in image interpretation or a similar task where MRMC studies are typically conducted.

    6. If a standalone performance study (i.e., algorithm only, without human-in-the-loop) was done

    The document alludes to "Algorithms present in the software are limited to the ST/AR ECG (for arrhythmia, ST Segment and QT Segment Monitoring) and SpO2," and "Perform diagnostic 12-Lead analysis and interpretation". However, it does not describe a standalone performance study for these algorithms. The testing described is system-level verification and validation. The 12-lead analysis is stated to be "based on raw ECG data samples provided from Philips approved medical devices" and the "Result may be displayed, printed and/or distributed". The output data (e.g., from 12-lead analysis) is explicitly stated to "not be the sole source of information for decision making," indicating a human-in-the-loop context, but no study of this combined performance is presented.

    7. The type of ground truth used (expert consensus, pathology, outcomes data, etc)

    Not applicable, as no clinical performance study requiring ground truth was conducted. The assessment relied on verifying that the system's output aligned with predefined functional specifications and matched the performance of the predicate device in a non-clinical testing environment.

    8. The sample size for the training set

    Not applicable. This document describes a software update for a patient information center, not a machine learning model requiring a training set. The existing algorithms are presumed to be validated from prior predicate device clearances.

    9. How the ground truth for the training set was established

    Not applicable, as no training set for a machine learning model is described.


    K Number
    K153702
    Date Cleared
    2016-06-13

    (173 days)

    Product Code
    Regulation Number
    870.1025
    Reference & Predicate Devices
    Device Name:

    M3290B Philips IntelliVue Information Center iX

    Intended Use

    The intended use of the Information Center Software is to display physiologic waves, parameters, and trends, format data for strip chart recordings and provide the secondary annunciation of alarms from other networked medical devices at a centralized location. The Information Center Software provides for the retrospective review of alarms, physiologic waves and parameters from its database.

    An additional intended use of the Information Center Software is to provide primary annunciation of alarms and configuration and control access for networked telemetry monitors.

    This product is intended for use in health care facilities by trained healthcare professionals. This product is not intended for home use.

    Indications for Use

    Indicated for central monitoring of multiple adult and all pediatric subgroups (Newborn (neonate), Infant, Child, Adolescent) patients; and where the clinician decides to monitor cardiac arrhythmia of adult, pediatric, and neonatal patients and/or ST segment of adult patients to gain information for treatment, to monitor adequacy of treatment, or to exclude causes of symptoms.

    Device Description

    The Philips IntelliVue Information Center iX Software Revision C.0 is central station software that runs on off-the-shelf Windows PCs and servers which can connect to recorders for waveform printing. It displays physiologic waves and parameters from multiple patient connected monitors and telemetry devices in summary or detailed format, and generates alarm signals. It provides retrospective review applications and a variety of data import and export functions.

    AI/ML Overview

    The provided FDA 510(k) summary for the Philips IntelliVue Information Center iX (K153702) discusses software changes but does not contain detailed information about specific acceptance criteria, device performance, or a study rigorously proving the device meets new acceptance criteria. Instead, it focuses on demonstrating substantial equivalence to a predicate device (K143057) through non-clinical testing of design, functionality, and reliability, rather than clinical performance for new applications.

    The document explicitly states: "Clinical Performance testing for M3290B Philips IntelliVue Information Center iX software Release C.0 was not performed, as there were no new clinical applications that had hazards or risk mitigations that required a clinical performance testing to support equivalence."

    Therefore, I cannot populate the requested tables and sections with specific acceptance criteria and performance data for this particular 510(k) application, as such detailed information is not present in the provided text. The submission relies on demonstrating that the software updates do not introduce new safety or effectiveness concerns compared to the already cleared predicate device.

    However, based on the non-clinical testing performed and the general approach of a 510(k) summary seeking substantial equivalence, I can describe what would typically be the nature of the acceptance criteria and study in such a scenario, by interpreting the information given and stating what is not present.


    Description of Acceptance Criteria and Study to Prove Device Meets Acceptance Criteria

    The provided 510(k) summary for the Philips IntelliVue Information Center iX (K153702) focuses on demonstrating substantial equivalence to a predicate device (K143057) for software updates. It explicitly states that clinical performance testing was not performed because no new clinical applications or significant new hazards/risks were introduced that would necessitate it. Therefore, the "acceptance criteria" here are primarily tied to verifying that the updated software maintains the safety, effectiveness, functionality, and reliability characteristics of the predicate device, as confirmed through non-clinical testing.

    1. Table of Acceptance Criteria and Reported Device Performance

    Given that no clinical performance study was conducted for new clinical applications, specific numerical performance metrics (e.g., sensitivity, specificity for arrhythmia detection) are not reported for this particular 510(k) submission. The acceptance criteria and "performance" are framed around maintaining equivalence to the predicate device.

    Acceptance criterion (implied/general) and reported device performance (as per document):

    • Functional Equivalence:
      - Data acquisition from Philips Efficia monitors (new).
      - Transmission of web interface to IntelliVue bedside monitors (new/expanded outbound data services).
      - Expanded ability to store complex data sets from various additional sources (PDX Data Warehouse).
      - Integration of Early Warning Score (EWS) information from bedside monitors (new/expanded application).
      - Auto-assignment of bed labels when configured.
      - Display of Philips Efficia monitor integration similar to IntelliVue.
      - Management association for 'orphan beds' in patient and equipment management.
      - Display of previously gathered ST/AR algorithm data (no changes to the algorithm).
    • Reliability and Stability:
      - Verification, validation, and testing activities, including system level tests, performance tests, and safety testing from hazard analysis.
      - Test results showed substantial equivalence, meeting all defined reliability requirements and performance claims based on specifications cleared for the predicate device.
    • Safety:
      - Risk Analysis conducted.
      - Design Reviews conducted.
      - Testing involved safety testing from hazard analysis.
      - No new safety and/or effectiveness concerns were identified compared to the predicate device.
    • Performance Standards:
      - Compliance with Philips verification and validation processes.
      - Pass/Fail criteria based on specifications cleared for the predicate device.
      - Compliance with requirements specified in international and FDA-recognized consensus standards.
    • Clinical Performance (New Risks):
      - Clinical performance testing was not performed, as no new clinical applications with hazards or risk mitigations requiring it were identified. The device's clinical performance is thereby considered equivalent to the predicate, which established such performance in its own clearance.
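Early Warning Score systems like the EWS data integrated here typically band each vital sign into a sub-score and sum the sub-scores across parameters. A minimal sketch of that aggregation pattern; the bands and weights below are invented for illustration and are not Philips' implementation or any validated clinical scale:

```python
# Illustrative EWS aggregation: band each vital sign, sum the sub-scores.
# The band limits below are made up for illustration only.

def band(value: float, bands: list[tuple[float, int]]) -> int:
    """Return the sub-score of the first band whose upper limit covers value."""
    for upper, score in bands:
        if value <= upper:
            return score
    return 3  # above all listed bands: highest sub-score

HR_BANDS = [(40, 3), (50, 1), (90, 0), (110, 1), (130, 2)]   # beats/min
SPO2_BANDS = [(91, 3), (93, 2), (95, 1), (100, 0)]           # percent

def ews(heart_rate: float, spo2: float) -> int:
    """Total score: sum of per-parameter sub-scores."""
    return band(heart_rate, HR_BANDS) + band(spo2, SPO2_BANDS)

print(ews(75, 98))   # normal vitals -> 0
print(ews(120, 92))  # elevated HR (2) + low SpO2 (2) -> 4
```

A higher total flags a patient for escalation; the central station's role per this submission is to integrate and display such scores from bedside monitors, not to compute them.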

    2. Sample Size for Test Set and Data Provenance

    • Test Set Sample Size: Not specified in the provided document. The testing was non-clinical, likely involving various software modules and integration points rather than a "test set" of patient data in the conventional sense for clinical performance.
    • Data Provenance: Not applicable in the context of clinical patient data for this submission, as the testing was non-clinical (engineering verification and validation).

    3. Number of Experts Used to Establish Ground Truth for the Test Set and Qualifications

    • Number of Experts: Not applicable, as the testing was non-clinical. Ground truth for software functionality, reliability, and safety is typically established against design specifications, recognized standards, and hazard analyses, rather than clinical expert consensus on patient data.
    • Qualifications of Experts: Not specified. Testing would have been conducted by Philips' internal engineering, quality assurance, and regulatory teams.

    4. Adjudication Method for the Test Set

    • Adjudication Method: Not applicable, as no clinical ground truth requiring adjudication was established for this submission.

    5. Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study

    • MRMC Study Done: No. An MRMC study is a clinical study involving multiple human readers interpreting cases to assess diagnostic performance. This submission explicitly states "Clinical Performance testing... was not performed."

    6. Standalone Performance Study (Algorithm only)

    • Standalone Performance Study Done: No, not in the sense of a new clinical algorithm being evaluated for its standalone diagnostic performance. The document mentions "Release C.0 allows data previously gathered by the algorithm [ST/AR] to be displayed. No changes to the algorithm are present." This indicates that existing algorithms (like ST/AR for arrhythmia/ST segment analysis) were unchanged, and their performance would have been established in previous 510(k) clearances for the predicate device. The focus here is on the information center's ability to process and display that data.

    7. Type of Ground Truth Used

    • Type of Ground Truth: For the non-clinical testing performed, the ground truth was based on:
      • Product Specifications: Meeting defined requirements.
      • Design Specifications: Adherence to engineered design.
      • Hazard Analysis: Ensuring risks are mitigated and safety maintained.
      • Predicate Device Specifications: Ensuring equivalence in performance and functionality to the device previously cleared.
      • International and FDA-recognized Consensus Standards: Compliance with industry benchmarks.

    8. Sample Size for the Training Set

    • Training Set Sample Size: Not applicable. This submission concerns software updates to a central monitoring system, not the development of new machine learning algorithms requiring a training set of data. Existing algorithms (like ST/AR) maintain their previously established performance and would have been "trained" (or validated) in earlier submissions if they involved such methodologies.

    9. How Ground Truth for the Training Set Was Established

    • How Ground Truth Was Established: Not applicable, as no new training set was used for this submission.

    K Number
    K143057
    Date Cleared
    2014-12-03

    (40 days)

    Product Code
    Regulation Number
    870.2300
    Reference & Predicate Devices
    Device Name:

    M3290B Philips IntelliVue Information Center iX

    Intended Use

    Indicated for central monitoring of multiple adult, pediatric, and neonatal patients; and where the clinician decides to monitor cardiac arrhythmia of adult, pediatric, and neonatal patients and/or ST segment of adult patients to gain information for treatment, to monitor adequacy of treatment, or to exclude causes of symptoms.

    Device Description

    The Philips IntelliVue Information Center iX Software Revision B.01 is central station software that runs on off-the-shelf Windows PCs and servers which can connect to recorders for waveform printing. It displays physiologic waves and parameters from multiple patient connected monitors and telemetry devices in summary or detailed format, and generates alarm signals. It provides retrospective review applications and a variety of data import and export functions.

    AI/ML Overview

    This 510(k) premarket notification for the M3290B Philips IntelliVue Information Center iX Software Release B.01 does not contain detailed information about the acceptance criteria or a specific study proving the device meets those criteria. The document primarily focuses on establishing substantial equivalence to a predicate device (M3290B IntelliVue Information Center software, Release A.0, marketed pursuant to K102495) based on shared indications for use and technological characteristics.

    Instead, the document states:
    "Verification, validation, and testing activities, where required to establish the performance, functionality, and reliability characteristics of the new device with respect to the predicate are performed. Testing involved system level tests, performance tests, and safety testing from hazard analysis. Pass/Fail criteria were based on the specifications cleared for the predicate device and test results showed substantial equivalence."

    This indicates that internal testing was conducted against existing specifications (presumably for the predicate device) to verify performance. However, the specific acceptance criteria, the detailed results, and the methodology of these tests are not provided in this summary.

    Therefore, most of the requested information cannot be extracted from this document.

    Here's what can be gathered, with limitations:

    1. A table of acceptance criteria and the reported device performance

    • Acceptance Criteria: Not explicitly stated in terms of quantitative metrics (e.g., sensitivity, specificity, accuracy for arrhythmia detection). The document generalizes: "Pass/Fail criteria were based on the specifications cleared for the predicate device."
    • Reported Device Performance: Not explicitly provided with specific numbers. The document states: "test results showed substantial equivalence. The M3290B IntelliVue Information Center Software meets all defined reliability requirements and performance claims."

    2. Sample size used for the test set and the data provenance (e.g. country of origin of the data, retrospective or prospective)

    • Not specified in the document.

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts (e.g. radiologist with 10 years of experience)

    • Not specified in the document. The general nature of a "central station software" suggests the ground truth for internal performance testing might be based on established medical standards or reference equipment, rather than direct expert labeling for each data point in the way an AI diagnostic algorithm might.

    4. Adjudication method (e.g. 2+1, 3+1, none) for the test set

    • Not specified in the document.

    5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI vs. without AI assistance

    • No, an MRMC study is not mentioned. This device is a central station software for displaying physiological data and generating alarms, not an AI-assisted diagnostic tool in the typical sense that would undergo MRMC studies for improved human reader performance.

    6. If a standalone performance study (i.e., algorithm only, without human-in-the-loop) was done

    • The document implies standalone testing was performed to verify "system level tests, performance tests, and safety testing," but details on what constitutes "standalone performance" in this context (e.g., specific event detection accuracy) are not provided. Given it's a central monitoring system with alarm functions, its "standalone" performance would likely relate to its ability to correctly process and display data and trigger alarms according to predefined thresholds.

    7. The type of ground truth used (expert consensus, pathology, outcomes data, etc)

    • Not specified. For a physiological monitor, ground truth would typically come from known calibrated inputs, reference measurements, or established medical standards for event detection.

    8. The sample size for the training set

    • Not applicable as this is a software update to an existing monitoring system, not a new AI algorithm that uses a "training set" in the machine learning sense. The testing likely involved verification and validation against functional specifications.

    9. How the ground truth for the training set was established

    • Not applicable (see point 8).
