K Number: K212777
Date Cleared: 2021-09-24 (23 days)
Product Code: LLZ, IYN, IYO
Regulation Number: 892.2050
Panel: RA (Radiology)
Reference & Predicate Devices

Collaboration Live – Philips Ultrasound (K201665)
Intended Use

Collaboration Live is indicated for remote console access of the Philips ultrasound system for diagnostic image viewing and review, consultation, guidance, support, and education in real time. Access must be granted by the healthcare professionals operating the ultrasound system. Compliance with the technical and operator requirements specified in the User Manual is required.

It is the responsibility of the healthcare professionals at the remote client to ensure image quality, display contrast, and ambient light conditions are consistent with the generally accepted standards of the clinical application.

Device Description

Collaboration Live is a software-based communication feature integrated with Philips Diagnostic Ultrasound Systems. Collaboration Live, together with remote-client Reacts software, enables two-way communication of text, voice, image, and video information between an ultrasound local system operator and a remote healthcare professional on a Windows, iOS, Android, or Chrome platform. Collaboration Live-Reacts facilitates: 1) remote diagnostic viewing and review, 2) remote clinical training and education, 3) remote peer-to-peer collaboration, and 4) remote service support. Collaboration Live functionality includes a remote-control feature in which the ultrasound local system operator may grant a qualified remote user control of the ultrasound system parameters via a virtual control panel and virtual touch screen. By meeting the technical, operator, and environment requirements specified in the User Manual, healthcare professionals using Reacts may provide clinical diagnoses from a remote location as they would directly on the ultrasound system.

AI/ML Overview

The provided text describes a 510(k) premarket notification for Philips Ultrasound, Inc.'s "Collaboration Live" device and its substantial equivalence to a predicate device. However, the document does not contain the specific acceptance criteria, nor a detailed study demonstrating that the device meets them, as would typically be found in a clinical validation report.

The document states:

  • "Validation testing was completed and produced under Philips's design controls procedures that comply with 21 CFR 820.30. Validation Testing, with pre-determined criteria, was conducted to evaluate and demonstrate the equivalency of Collaboration Live used on Windows, iOS, Android, and Chrome platforms."
  • "The results of the design control activities support that the software, which expands the compatible platforms and adds three features, does not raise new questions of safety or effectiveness. In addition to labeling, testing performed demonstrates that the Collaboration Live software meets the defined requirements and performance claims and are substantially equivalent to the predicate software (K201665)."

This implies that some form of validation against pre-determined criteria was performed, but the specifics of those criteria (e.g., quantitative performance metrics like accuracy, sensitivity, specificity, resolution, latency, etc., for diagnostic image viewing and review), the study design, sample sizes, ground truth establishment, or expert involvement are not included in this FDA 510(k) summary document.

The 510(k) summary focuses on demonstrating substantial equivalence, primarily by expanding compatible platforms and adding new features (Network Indicator, Remote Image Quality, Remote User Measurement) while maintaining the original indications for use. It asserts that these changes do not raise new questions of safety or effectiveness.

Therefore, many of the requested details cannot be extracted from this document.

Here's what can be inferred or explicitly stated based on the provided text, and what is missing:


Acceptance Criteria and Device Performance:

  • 1. A table of acceptance criteria and the reported device performance:
    • Acceptance Criteria: Not explicitly stated in quantitative terms within this document. The document refers to "pre-determined criteria" and "defined requirements and performance claims" but does not list them. Given the device's function (remote console access for diagnostic image viewing and review, consultation, guidance, support, and education), the acceptance criteria would likely relate to image quality (resolution, clarity, color representation), latency, reliability of connection, and functionality of remote control features, all assessed for clinical applicability.
    • Reported Device Performance: The document only states that "testing performed demonstrates that the Collaboration Live software meets the defined requirements and performance claims." Specific performance metrics are not provided.
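
The submission does not list its pre-determined criteria, but validation of this kind typically reduces to threshold comparisons on measured metrics. The sketch below is purely illustrative: every metric name and threshold is an assumption made here for demonstration, not a value from the Philips validation.

```python
# Hypothetical sketch of acceptance-criteria checking for a remote-viewing
# validation. All metric names and thresholds are illustrative assumptions,
# not values from the 510(k) submission.

ACCEPTANCE_CRITERIA = {
    "round_trip_latency_ms": ("max", 200.0),  # remote-control responsiveness
    "frame_rate_fps":        ("min", 20.0),   # live image streaming
    "dropped_frame_pct":     ("max", 1.0),    # connection reliability
}

def evaluate(measurements: dict) -> dict:
    """Compare each measured metric against its pre-determined criterion."""
    results = {}
    for metric, (kind, threshold) in ACCEPTANCE_CRITERIA.items():
        value = measurements[metric]
        passed = value <= threshold if kind == "max" else value >= threshold
        results[metric] = passed
    return results

# Example run on one (hypothetical) platform's measurements
measured = {"round_trip_latency_ms": 145.0,
            "frame_rate_fps": 24.0,
            "dropped_frame_pct": 0.4}
print(evaluate(measured))  # each metric maps to True when its criterion is met
```

In practice such a harness would be run once per supported platform (Windows, iOS, Android, Chrome) to demonstrate the cross-platform equivalency the summary describes.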

Study Details:

  • 2. Sample sizes used for the test set and the data provenance: Not provided in the document.
    • Data Provenance: Not provided.
    • Retrospective/Prospective: Not specified.
  • 3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts: Not provided. The document mentions "healthcare professionals at the remote client" being responsible for ensuring image quality, but this refers to the user's responsibility, not the study's ground truth establishment.
  • 4. Adjudication method (e.g. 2+1, 3+1, none) for the test set: Not provided.
  • 5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done, and if so, the effect size of how much human readers improve with AI versus without AI assistance:
    • Not explicitly stated and highly unlikely for this device. This device is a remote access/collaboration tool, not an AI diagnostic algorithm intended to improve human reader performance on a diagnostic task. Its validation would focus on the fidelity and functionality of the remote access.
  • 6. If a standalone (i.e. algorithm only without human-in-the-loop performance) study was done: Not applicable. This device is inherently a human-in-the-loop system for real-time collaboration. The closest analogue would be assessing the technical performance of the remote viewing and control itself, which is implied by "Validation Testing, with pre-determined criteria, was conducted to evaluate and demonstrate the equivalency of Collaboration Live used on Windows, iOS, Android, and Chrome platforms."
  • 7. The type of ground truth used (expert consensus, pathology, outcomes data, etc.): Not specified. For a device like this, ground truth would likely involve technical performance metrics (e.g., objective image quality assessments, latency measurements, functional tests of remote controls) alongside subjective assessments by clinicians regarding its usability and diagnostic adequacy when viewing images remotely.
  • 8. The sample size for the training set: Not applicable and not provided. This is not an AI/ML device that requires a "training set" in the typical sense for learning patterns from data. Its development involves software engineering and functional testing.
  • 9. How the ground truth for the training set was established: Not applicable and not provided.
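
The "objective image quality assessments" suggested in item 7 could, for example, use a full-reference fidelity metric comparing the frame displayed locally with the frame rendered at the remote client. The snippet below sketches peak signal-to-noise ratio (PSNR) over flat pixel lists; the metric choice and the fidelity floor are assumptions for illustration, not anything specified in the submission.

```python
import math

def psnr(reference, received, max_val=255.0):
    """Peak signal-to-noise ratio between a reference (local) frame and the
    frame rendered at the remote client; higher means closer fidelity."""
    if len(reference) != len(received):
        raise ValueError("frames must have the same number of pixels")
    mse = sum((r - t) ** 2 for r, t in zip(reference, received)) / len(reference)
    if mse == 0:
        return math.inf  # identical frames
    return 10.0 * math.log10(max_val ** 2 / mse)

# Hypothetical check: the remote frame must clear an assumed fidelity floor.
local  = [100, 120, 140, 160]
remote = [101, 119, 141, 159]
print(psnr(local, remote) > 40.0)  # → True
```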

Summary of Device and its Purpose:

  • Device Name: Collaboration Live
  • Manufacturer: Philips Ultrasound, Inc.
  • Regulation Name: Medical image management and processing system
  • Regulatory Class: Class II (Product Codes: LLZ, IYN, IYO)
  • Indications for Use: Remote console access of the Philips ultrasound system for diagnostic image viewing and review, consultation, guidance, support, and education in real time. Access must be granted by the healthcare professionals operating the ultrasound system.
  • Key Features (as per this submission): Expansion of compatible platforms (Windows, iOS, Android, Chrome) and addition of Network Indicator, Remote Image Quality, Remote User Measurement.
  • Predicate Device: Collaboration Live – Philips Ultrasound (K201665, cleared September 15, 2020)

In conclusion, while the document confirms that validation testing was performed to demonstrate equivalency and that defined requirements were met, it does not disclose the specific acceptance criteria or the detailed results of that testing. This is typical for a 510(k) summary, which focuses on establishing substantial equivalence rather than providing a full clinical validation report.

§ 892.2050 Medical image management and processing system.

(a) Identification. A medical image management and processing system is a device that provides one or more capabilities relating to the review and digital processing of medical images for the purposes of interpretation by a trained practitioner of disease detection, diagnosis, or patient management. The software components may provide advanced or complex image processing functions for image manipulation, enhancement, or quantification that are intended for use in the interpretation and analysis of medical images. Advanced image manipulation functions may include image segmentation, multimodality image registration, or 3D visualization. Complex quantitative functions may include semi-automated measurements or time-series measurements.

(b) Classification. Class II (special controls; voluntary standards—Digital Imaging and Communications in Medicine (DICOM) Std., Joint Photographic Experts Group (JPEG) Std., Society of Motion Picture and Television Engineers (SMPTE) Test Pattern).