Search Results

Found 4 results

510(k) Data Aggregation

    K Number: K242329
    Date Cleared: 2024-11-18 (104 days)
    Regulation Number: 892.2050
    Device Name: CT Collaboration Live

    Intended Use

    CT Collaboration Live is indicated for remote communication with CT console (via chat, call, video call, screen sharing and remote access) by a qualified remote clinical user for consultation, guidance, support, and training in real time. Remote access must be granted by the CT technologist operating the system. Remote access is only available for Philips CT systems supporting CT Collaboration Live connectivity capabilities. Images reviewed remotely are not for diagnostic use.

    Device Description

    The proposed CT Collaboration Live is a software application integrated in Philips Computed Tomography (CT) X-Ray CT 5300 Systems. CT Collaboration Live enables two-way communication of text, voice, image, and video information between a CT system operator and a remote user on a Windows device. CT Collaboration Live facilitates: 1) peer-to-peer consultation and training, 2) system-level remote sharing & operation, 3) access to live image feed and 4) remote expert user(s) and physician consultation. CT Collaboration Live functionality includes a remote-control feature in which the CT system operator may grant a qualified remote user control of the CT system parameters via a virtual control panel and virtual touch screen.
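
    The submission does not describe how this access control is implemented. Purely as an illustrative sketch of the concept described above (remote control is available only after the local operator grants it, and the operator can take it back), a minimal model might look like the following; all names and the parameter example are hypothetical and not taken from the 510(k).

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class RemoteControlSession:
        """Hypothetical model of an operator-gated remote-control session:
        the local operator must explicitly grant control before a remote
        user can change system parameters, and can revoke it at any time."""
        operator: str
        remote_user: str
        control_granted: bool = False
        parameters: dict = field(default_factory=dict)

        def grant_control(self, granted_by: str) -> None:
            # Only the local operator may grant remote control.
            if granted_by != self.operator:
                raise PermissionError("Only the local operator may grant control")
            self.control_granted = True

        def revoke_control(self) -> None:
            # The local operator can take back control at any time.
            self.control_granted = False

        def set_parameter(self, user: str, name: str, value) -> None:
            # Remote parameter changes are rejected unless control was granted.
            if user == self.remote_user and not self.control_granted:
                raise PermissionError("Remote control has not been granted")
            self.parameters[name] = value

    # Usage: the remote expert can only adjust parameters after the grant.
    session = RemoteControlSession(operator="ct_technologist", remote_user="remote_expert")
    session.grant_control(granted_by="ct_technologist")
    session.set_parameter("remote_expert", "kVp", 120)
    session.revoke_control()
    ```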

    AI/ML Overview

    The provided text does not contain the detailed information necessary to answer your request regarding acceptance criteria and a study proving device performance.

    The document is a 510(k) summary for Philips Healthcare's "CT Collaboration Live" device. It primarily focuses on demonstrating substantial equivalence to a predicate device (Collaboration Live, K200179) rather than detailing specific acceptance criteria and the results of a primary study to prove those criteria were met for the new device.

    Here's a breakdown of what is and is not in the document, relating to your request:

    What is mentioned (but not in enough detail for your request):

    • Acceptance Criteria: The text states, "All tests were used to support substantial equivalence of the proposed CT Collaboration Live and to demonstrate that CT Collaboration Live: ... Meets the acceptance criteria and is adequate for its intended use." However, it does not provide a table of these acceptance criteria or the specific performance metrics achieved against them.
    • Study (Non-Clinical Performance Data): The document mentions "Non-clinical performance software verification testing has been performed" and that "Software verification activities demonstrate that the CT Collaboration Live software application meets the design input requirements." It also states, "The summary and conclusion of results are provided in the System Verification Test Report." However, the actual results of this testing that would prove the device met acceptance criteria are not included in this document.
    • No Clinical Study: It explicitly states, "The proposed CT Collaboration Live did not require a clinical study since substantial equivalence to the predicate device Collaboration Live (K200179) was demonstrated."

    What is NOT mentioned (but would be required for your request):

    • A table of acceptance criteria and reported device performance.
    • Sample size used for the test set and data provenance.
    • Number of experts used to establish ground truth and their qualifications.
    • Adjudication method for the test set.
    • Whether a multi-reader, multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of human reader improvement with versus without AI assistance.
    • Whether a standalone performance study was done.
    • The type of ground truth used.
    • Sample size for the training set.
    • How ground truth for the training set was established.

    This document serves as a regulatory submission demonstrating substantial equivalence, not a detailed technical report on specific performance metrics or clinical study results.


    K Number: K212777
    Date Cleared: 2021-09-24 (23 days)
    Regulation Number: 892.2050
    Device Name: Collaboration Live

    Intended Use

    Collaboration Live is indicated for remote console access of the Philips ultrasound system for diagnostic image viewing and review, consultation, guidance, support, and education in real time. Access must be granted by the healthcare professionals operating the ultrasound system. Compliance with the technical and operator requirements specified in the User Manual is required.

    It is the responsibility of the healthcare professionals at the remote client to ensure image quality, display contrast, and ambient light conditions are consistent with the generally accepted standards of the clinical application.

    Device Description

    Collaboration Live is a software-based communication feature integrated with Philips Diagnostic Ultrasound Systems. Collaboration Live, together with remote-client Reacts software, enables two-way communication of text, voice, image, and video information between an ultrasound local system operator and a remote healthcare professional on a Windows, iOS, Android, or Chrome platform. Collaboration Live-Reacts facilitates: 1) remote diagnostic viewing and review, 2) remote clinical training and education, 3) remote peer-to-peer collaboration, and 4) remote service support. Collaboration Live functionality includes a remote-control feature in which the ultrasound local system operator may grant a qualified remote user control of the ultrasound system parameters via a virtual control panel and virtual touch screen. By meeting the technical, operator, and environment requirements specified in the User Manual, healthcare professionals, using Reacts, may provide clinical diagnoses from a remote location as they would directly on the ultrasound system.

    AI/ML Overview

    The provided text describes a 510(k) premarket notification for Philips Ultrasound, Inc.'s "Collaboration Live" device, and its substantial equivalence to a predicate device. However, the document does not contain the specific details about acceptance criteria or a detailed study proving the device meets those criteria, as typically found in a clinical validation study.

    The document states:

    • "Validation testing was completed and produced under Philips's design controls procedures that comply with 21 CFR 820.30. Validation Testing, with pre-determined criteria, was conducted to evaluate and demonstrate the equivalency of Collaboration Live used on Windows, iOS, Android, and Chrome platforms."
    • "The results of the design control activities support that the software, which expands the compatible platforms and adds three features, does not raise new questions of safety or effectiveness. In addition to labeling, testing performed demonstrates that the Collaboration Live software meets the defined requirements and performance claims and are substantially equivalent to the predicate software (K201665)."

    This implies that some form of validation against pre-determined criteria was performed, but the specifics of those criteria (e.g., quantitative performance metrics like accuracy, sensitivity, specificity, resolution, latency, etc., for diagnostic image viewing and review), the study design, sample sizes, ground truth establishment, or expert involvement are not included in this FDA 510(k) summary document.

    The 510(k) summary focuses on demonstrating substantial equivalence, primarily by expanding compatible platforms and adding new features (Network Indicator, Remote Image Quality, Remote User Measurement) while maintaining the original indications for use. It asserts that these changes do not raise new questions of safety or effectiveness.

    Therefore, many of the requested details cannot be extracted from this document.

    Here's what can be inferred or explicitly stated based on the provided text, and what is missing:


    Acceptance Criteria and Device Performance:

    • 1. A table of acceptance criteria and the reported device performance:
      • Acceptance Criteria: Not explicitly stated in quantitative terms within this document. The document refers to "pre-determined criteria" and "defined requirements and performance claims" but does not list them. Given the device's function (remote console access for diagnostic image viewing and review, consultation, guidance, support, and education), the acceptance criteria would likely relate to image quality (resolution, clarity, color representation), latency, reliability of connection, and functionality of remote control features, all assessed for clinical applicability (an illustrative sketch of such checks appears after this list).
      • Reported Device Performance: The document only states that "testing performed demonstrates that the Collaboration Live software meets the defined requirements and performance claims." Specific performance metrics are not provided.
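
    Purely as an illustration (the thresholds below are assumptions, not figures from the submission), objective, pre-determined acceptance checks of the kind described in the list above might be expressed as follows:

    ```python
    # Illustrative only: hypothetical pre-determined acceptance checks for a
    # remote viewing/control feature. All threshold values are assumptions,
    # not figures from the 510(k) summary.
    MAX_END_TO_END_LATENCY_MS = 200    # assumed latency budget for remote control
    MIN_STREAM_FRAME_RATE_FPS = 24     # assumed minimum frame rate for live viewing
    MIN_SESSION_SUCCESS_RATE = 0.99    # assumed connection-reliability target

    def evaluate_session(measured_latency_ms: float,
                         measured_fps: float,
                         successful_sessions: int,
                         total_sessions: int) -> dict:
        """Compare measured results against the assumed acceptance thresholds."""
        success_rate = successful_sessions / total_sessions if total_sessions else 0.0
        return {
            "latency_ok": measured_latency_ms <= MAX_END_TO_END_LATENCY_MS,
            "frame_rate_ok": measured_fps >= MIN_STREAM_FRAME_RATE_FPS,
            "reliability_ok": success_rate >= MIN_SESSION_SUCCESS_RATE,
        }

    # Example: a session with 150 ms latency, 30 fps, and 198/200 successful
    # connections would pass all three assumed checks.
    print(evaluate_session(150, 30, 198, 200))
    ```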

    Study Details:

    • 2. Sample sizes used for the test set and the data provenance: Not provided in the document.
      • Data Provenance: Not provided.
      • Retrospective/Prospective: Not specified.
    • 3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts: Not provided. The document mentions "healthcare professionals at the remote client" being responsible for ensuring image quality, but this refers to the user's responsibility, not the study's ground truth establishment.
    • 4. Adjudication method (e.g. 2+1, 3+1, none) for the test set: Not provided.
    • 5. Whether a multi-reader, multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of human reader improvement with AI versus without AI assistance:
      • Not explicitly stated and highly unlikely for this device. This device is a remote access/collaboration tool, not an AI diagnostic algorithm intended to improve human reader performance on a diagnostic task. Its validation would focus on the fidelity and functionality of the remote access.
    • 6. Whether a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done: Not applicable. This device is inherently a human-in-the-loop system for real-time collaboration. The "standalone" aspect would be assessing the technical performance of the remote viewing and control, which is implied by "Validation Testing, with pre-determined criteria, was conducted to evaluate and demonstrate the equivalency of Collaboration Live used on Windows, iOS, Android, and Chrome platforms."
    • 7. The type of ground truth used (expert consensus, pathology, outcomes data, etc.): Not specified. For a device like this, ground truth would likely involve technical performance metrics (e.g., objective image quality assessments, latency measurements, functional tests of remote controls) alongside subjective assessments by clinicians regarding its usability and diagnostic adequacy when viewing images remotely.
    • 8. The sample size for the training set: Not applicable and not provided. This is not an AI/ML device that requires a "training set" in the typical sense for learning patterns from data. Its development involves software engineering and functional testing.
    • 9. How the ground truth for the training set was established: Not applicable and not provided.

    Summary of Device and its Purpose:

    • Device Name: Collaboration Live
    • Manufacturer: Philips Ultrasound, Inc.
    • Regulation Name: Medical image management and processing system
    • Regulatory Class: Class II (Product Codes: LLZ, IYN, IYO)
    • Indications for Use: Remote console access of the Philips ultrasound system for diagnostic image viewing and review, consultation, guidance, support, and education in real time. Access must be granted by the healthcare professionals operating the ultrasound system.
    • Key Features (as per this submission): Expansion of compatible platforms (Windows, iOS, Android, Chrome) and addition of Network Indicator, Remote Image Quality, Remote User Measurement.
    • Predicate Device: Collaboration Live – Philips Ultrasound (K201665, cleared September 15, 2020)

    In conclusion, while the document confirms that validation testing demonstrating equivalency and meeting defined requirements was performed, it does not disclose the specific acceptance criteria or the detailed results of that study, which is typical for a 510(k) summary as it focuses on establishing substantial equivalence rather than providing a full clinical validation report.


    K Number: K201665
    Date Cleared: 2020-09-15 (88 days)
    Regulation Number: 892.2050
    Device Name: Collaboration Live

    Intended Use

    Collaboration Live is indicated for remote console access of the Philips ultrasound system for diagnostic image viewing and review, consultation, guidance, support, and education in real time. Access must be granted by the healthcare professionals operating the ultrasound system. Compliance with the technical and operator requirements specified in the User Manual is required.

    It is the responsibility of the healthcare professionals at the remote client to ensure image quality, display contrast, and ambient light conditions are consistent with the generally accepted standards of the clinical application.

    Device Description

    Collaboration Live is a software-based communication feature integrated in Philips Diagnostic Ultrasound Systems. Collaboration Live, together with remote-client Reacts, enables two-way communication of text, voice, image, and video information between an ultrasound local system operator and a remote healthcare professional on a Windows device. Collaboration Live-Reacts facilitates: 1) remote diagnostic viewing and review, 2) remote clinical training and education, 3) remote peer collaboration, and 4) remote service support. Collaboration Live functionality includes a remote-control feature in which the ultrasound local system operator may grant a qualified remote user control of the ultrasound system parameters via a virtual control panel and virtual touch screen. By meeting the technical, operator, and environment requirements specified in the User Manual, healthcare professionals using Reacts may provide clinical diagnoses from a remote location as they would directly on the ultrasound system.

    AI/ML Overview

    Please note: The provided document is a 510(k) summary for the "Collaboration Live" device. It focuses on demonstrating substantial equivalence to a predicate device, particularly regarding the change to allow diagnostic use for remote viewing. While it mentions "validation testing with pre-determined criteria," it does not provide the specific details of the acceptance criteria or the full study report. It only states that such testing was conducted and that the labeling was updated based on the findings.

    Therefore, much of the requested information regarding detailed acceptance criteria, specific reported performance, sample sizes, ground truth establishment, expert qualifications, and MRMC study details cannot be fully extracted from this document alone.

    Here's what can be extracted based on the provided text:


    1. A table of acceptance criteria and the reported device performance

    The document mentions "pre-determined criteria" for validation testing but does not explicitly list them or the quantitative reported device performance against these criteria. It broadly states that "remote display specifications and network bandwidth requirements for equivalent image quality for diagnostic viewing were determined."

    Given the limited information, a hypothetical table based on the statement "equivalent image quality for diagnostic viewing" might look like this, but these are inferred and not explicitly stated criteria or performance metrics in the document:

    Acceptance Criteria (Inferred) | Reported Device Performance (Inferred)
    Equivalent image quality for diagnostic viewing compared to local ultrasound system | Met (stated that "equivalent image quality for diagnostic viewing were determined")
    Adherence to remote display specifications | Met (stated that "remote display specifications... were determined")
    Adherence to network bandwidth requirements | Met (stated that "network bandwidth requirements... were determined")
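
    As a separate, purely illustrative aside (the submission gives no numbers), the kind of "network bandwidth requirement" referenced in the inferred table above can be approximated from assumed stream parameters; every value below is a hypothetical assumption, not data from the 510(k):

    ```python
    # Illustrative back-of-the-envelope estimate of the bandwidth needed for a
    # remote image stream. All parameters are assumed, not from the submission.
    def required_bandwidth_mbps(width_px: int, height_px: int, fps: float,
                                bits_per_pixel: int, compression_ratio: float) -> float:
        """Uncompressed bit rate divided by the codec's compression ratio, in Mbit/s."""
        raw_bits_per_second = width_px * height_px * bits_per_pixel * fps
        return raw_bits_per_second / compression_ratio / 1e6

    # Example: a 1920x1080, 30 fps, 24-bit stream compressed roughly 100:1
    # needs about 15 Mbit/s.
    print(round(required_bandwidth_mbps(1920, 1080, 30, 24, 100), 1))
    ```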

    2. Sample size used for the test set and the data provenance (e.g. country of origin of the data, retrospective or prospective)

    • Sample size for test set: Not specified in the provided document.
    • Data provenance: Not specified in the provided document. The document only mentions "Validation testing... was conducted."

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts (e.g. radiologist with 10 years of experience)

    Not specified in the provided document.

    4. Adjudication method (e.g. 2+1, 3+1, none) for the test set

    Not specified in the provided document.

    5. Whether a multi-reader, multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of human reader improvement with AI versus without AI assistance

    The document does not mention an MRMC comparative effectiveness study, nor does it refer to AI assistance. The device is a "software-based communication feature" for remote viewing and control of ultrasound systems, not an AI-driven image analysis tool.

    6. Whether a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done

    The device "Collaboration Live" is explicitly designed for "remote console access" and "two-way communication... between an ultrasound local system operator and a remote healthcare professional." It's a system for human users to interact remotely with an ultrasound machine. Therefore, a "standalone algorithm only" performance study, without human involvement, would not be applicable to this type of device. The study implicitly involves human users making diagnostic decisions.

    7. The type of ground truth used (expert consensus, pathology, outcomes data, etc.)

    Not explicitly stated in the provided document beyond implying an evaluation of "equivalent image quality for diagnostic viewing." This would typically require expert assessment of image quality and diagnostic accuracy, but the specifics are not detailed.

    8. The sample size for the training set

    Not applicable. The "Collaboration Live" device is a software communication feature for remote access and viewing, not a machine learning or AI algorithm that requires a training set.

    9. How the ground truth for the training set was established

    Not applicable, as there is no training set for this type of device.


    K Number: K200179
    Manufacturer: Philips Healthcare
    Date Cleared: 2020-02-18 (25 days)
    Regulation Number: 892.2050
    Device Name: Collaboration Live

    Intended Use

    Collaboration Live is indicated for remote console access of the Philips ultrasound system for image review, consultation, guidance, support, and education in real time. Access must be granted by the technologist operating the system. Images reviewed remotely are not for diagnostic use.

    Device Description

    Collaboration Live is a new software feature integrated in Philips Epiq and Affiniti Diagnostic Ultrasound Systems (K182857). Collaboration Live enables two-way communication of text, voice, image, and video information between an ultrasound system operator and a remote user on a Windows desktop or laptop computer. Collaboration Live facilitates: 1) remote service support, 2) remote clinical training and education, and 3) remote peer-to-peer collaboration (non-diagnostic). Collaboration Live functionality includes a remote control feature in which the ultrasound system operator may grant a qualified remote user control of all ultrasound system parameters via a virtual control panel and virtual touch screen. The ultrasound system operator maintains the ability to take back system control at any time. The remote user interacts with the ultrasound system using the Collaboration Live remote application, which is called Reacts.

    AI/ML Overview

    The provided text describes the "Collaboration Live" device, a software feature for Philips ultrasound systems, and its substantial equivalence to a predicate device (GE Customer Remote Console). However, the document does not contain specific acceptance criteria, a detailed study proving performance, or the specific data requested in the prompt.

    Instead, it broadly states: "Software verification supported a determination of substantial equivalence with the predicate GE Customer Remote Console (K150193), and demonstrated that Collaboration Live meets the acceptance criteria and is adequate for its intended use."

    Without explicit acceptance criteria and corresponding performance data, it's impossible to fill out the requested table and answer the specific questions about sample size, expert qualifications, adjudication methods, MRMC studies, standalone performance, or training set details.

    Therefore, the answer below reflects the absence of this detailed information in the provided document.


    Acceptance Criteria and Device Performance Study

    The provided 510(k) summary for Philips Healthcare's "Collaboration Live" device states that "Software verification supported a determination of substantial equivalence with the predicate GE Customer Remote Console (K150193), and demonstrated that Collaboration Live meets the acceptance criteria and is adequate for its intended use."

    However, the document does not explicitly define the specific acceptance criteria or present a detailed study report with quantitative performance metrics for Collaboration Live. No tables showing acceptance criteria alongside reported device performance are included. The description focuses on demonstrating substantial equivalence based on technological similarities and software verification, rather than a quantifiable performance study against predefined criteria.

    Therefore, the following information cannot be extracted from the provided text:

    1. A table of acceptance criteria and the reported device performance:

    • Not provided in the document. The document only states that the device "meets the acceptance criteria" without specifying what those criteria are or presenting detailed performance data.

    2. Sample size used for the test set and the data provenance:

    • Not provided in the document. The document mentions "Software verification" but does not detail the test set size, its composition, or its origin (e.g., country, retrospective/prospective).

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts:

    • Not applicable / Not provided. Since no specific performance study with a test set and ground truth establishment is detailed, this information is absent.

    4. Adjudication method (e.g. 2+1, 3+1, none) for the test set:

    • Not applicable / Not provided.

    5. Whether a multi-reader, multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of human reader improvement with AI versus without AI assistance:

    • Not performed / Not stated. The document explicitly states, "Collaboration Live did not require clinical testing to support a determination of substantial equivalence." This implies no clinical comparative effectiveness study, including MRMC studies, was conducted or reported. "Collaboration Live" is described as a remote access and collaboration tool, not an AI diagnostic aid that would typically involve human reader performance improvement studies.

    6. Whether a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done:

    • Not performed / Not stated. The device is a collaboration tool, not an autonomous diagnostic algorithm, so standalone performance in the typical sense is not an applicable characteristic for this type of device.

    7. The type of ground truth used (expert consensus, pathology, outcomes data, etc.):

    • Not applicable / Not provided, as no specific performance study against a ground truth is reported.

    8. The sample size for the training set:

    • Not provided. The document does not describe any machine learning or AI model training, thus no training set information is available.

    9. How the ground truth for the training set was established:

    • Not applicable / Not provided.