
Search Results

Found 2 results
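
Aggregation pages like this can be reproduced against the public openFDA 510(k) endpoint. The sketch below assumes these records mirror openFDA data (this site's own backend is not documented here); the endpoint URL, query syntax, and the k_number, device_name, and decision_date fields are openFDA's documented ones.

```python
# Minimal sketch: run the same device-name search against the public
# openFDA 510(k) endpoint. Assumes this page mirrors openFDA data.
import requests

resp = requests.get(
    "https://api.fda.gov/device/510k.json",
    params={"search": 'device_name:"Q-STATION"', "limit": 10},
    timeout=30,
)
resp.raise_for_status()
for rec in resp.json().get("results", []):
    # k_number, device_name, and decision_date are documented openFDA fields.
    print(rec["k_number"], rec["device_name"], rec["decision_date"])
```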

510(k) Data Aggregation

    K Number
    K140808
    Device Name
    Q-STATION
    Date Cleared
    2014-04-17

    (16 days)

    Product Code
    Regulation Number
    892.2050
    Reference & Predicate Devices
    Why did this record match?
    Device Name: Q-STATION

    Intended Use

    Q-Station is application software intended to manage, view, analyze, and report qualitative and quantitative image data from ultrasound exams. It is designed to host optional advanced analysis applications via QLAB integration and provides integrated tools that allow users to manually assess and score cardiac wall motion and to export images and/or exams and reports. Q-Station can also view DICOM images from non-ultrasound modalities, such as CT, MR, NM, CR, MG, XA, PET, RT, and X-Ray, for reference viewing. It supports connectivity to ultrasound systems, PACS, and other DICOM storage repositories.
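
    To make the modality-based split concrete (full analysis for ultrasound, reference-only viewing for everything else), here is a minimal sketch using the pydicom library. This is not Philips code; the whitelist is transcribed from the intended-use statement above, and the mapping of "PET" and "X-Ray" to DICOM Modality codes PT and DX/CR is our assumption.

```python
# Sketch: classify a DICOM object per the intended-use statement above.
# Ultrasound ("US") gets full analysis; the listed non-ultrasound
# modalities are reference viewing only. Mapping PET -> "PT" and
# X-Ray -> "DX"/"CR" is an assumed reading of the prose list.
import pydicom

REFERENCE_MODALITIES = {"CT", "MR", "NM", "CR", "MG", "XA", "PT", "DX"}

def viewing_mode(path: str) -> str:
    ds = pydicom.dcmread(path, stop_before_pixels=True)
    modality = str(ds.get("Modality", ""))
    if modality == "US":
        return "full analysis"
    # "RT" in the prose stands in for the RT object family
    # (RTIMAGE, RTSTRUCT, RTPLAN, RTDOSE, ...).
    if modality in REFERENCE_MODALITIES or modality.startswith("RT"):
        return "reference viewing only"
    return "unsupported"
```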

    Device Description

    Q-Station is designed to manage post-acquisition ultrasound images and other data for the purposes of diagnosing the patient's condition. This includes using Q-Station on a PC to review images and measurements sent from an ultrasound acquisition device and to analyze 3D and other data with QLAB. Q-Station is used to review various ultrasound exam types, including Adult echo, General Imaging, Stress echo, Vascular, and TEE. In addition, Q-Station can be used for reference viewing of non-ultrasound DICOM images. Q-Station can be used to add interpretive findings, key images, measurements, calculations, and other comments to create reports that can be shared with other clinicians. During this review, users may also use Q-Station to import and export exams, print reports, and anonymize images for export. Q-Station supports QLAB Q-Apps for advanced analysis (K132165).
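
    The description mentions anonymizing images for export. Below is a minimal illustrative scrub using pydicom; it is not Philips' implementation and falls well short of a full de-identification profile (DICOM PS3.15 Annex E lists the complete attribute set).

```python
# Sketch: strip a few identifying attributes before export. Illustrative
# only; real de-identification must follow DICOM PS3.15 Annex E.
import pydicom

def anonymize_for_export(src: str, dst: str) -> None:
    ds = pydicom.dcmread(src)
    for keyword in ("PatientName", "PatientID", "PatientBirthDate",
                    "InstitutionName", "ReferringPhysicianName"):
        if keyword in ds:
            ds.data_element(keyword).value = ""
    ds.remove_private_tags()  # private tags often carry identifying data
    ds.save_as(dst)
```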

    AI/ML Overview

    Here's a breakdown of the acceptance criteria and the study information for the Philips Q-Station (K140808) based on the provided text:

    Important Note: The provided document is a 510(k) summary, which focuses on demonstrating substantial equivalence to predicate devices rather than proving performance against specific quantitative acceptance criteria in a traditional efficacy study. As such, the information you requested regarding numerical performance metrics, sample sizes for test sets, expert involvement for ground truth, and comparative effectiveness studies (MRMC) is not present in this type of regulatory submission. The submission explicitly states "The subject of this premarket submission, Q-Station 3.0 software did not require clinical studies to support substantial equivalence."

    Therefore, many of the requested fields will state "Not Applicable" or "Not Provided" in the table below, as the submission relies on verification and validation activities rather than formal clinical studies with statistical acceptance criteria.


    1. Table of Acceptance Criteria and Reported Device Performance

    Acceptance Criteria Category | Specific Acceptance Criteria (as implied or stated) | Reported Device Performance
    --- | --- | ---
    Functional Equivalence | Functionality for managing post-acquisition ultrasound images and other data. | Q-Station is designed to manage post-acquisition ultrasound images and other data for the purposes of diagnosing the patient's condition.
    Analysis Packages | Inclusion of Adult Echo, Pediatric Echo, and Vascular analysis packages. | Includes Adult Echo, Pediatric Echo, and Vascular analysis packages, stated as "essentially the same as those included with the EPIQ ultrasound system (K132304)".
    Multi-modality Viewing | Ability to view non-ultrasound DICOM images (CT, MR, NM, CR, MG, XA, PET, RT, X-Ray) for reference. | Can view CT, MR, NM, CR, MG, XA, PET, RT, and X-Ray images for reference viewing in 1-up or n-up formats.
    Measurement Tools | Ability to view, copy, and edit system-defined measurement labels/groups/collections; create, edit, and delete customized measurement labels/groups/collections. | Device descriptions indicate these capabilities are present, similar to predicate devices.
    Connectivity | Supports connectivity to ultrasound systems, PACS, and other DICOM storage repositories. | Device description explicitly states this support.
    Reliability Requirements | Meets all defined reliability requirements. | "Testing performed demonstrated that the Q-Station 3.0 meets all defined reliability requirements and performance claims."
    Performance Claims | Meets all defined performance claims. | "Testing performed demonstrated that the Q-Station 3.0 meets all defined reliability requirements and performance claims."
    Safety Testing | Compliance with safety testing from risk analysis. | Included in verification and validation processes.
    System Level Tests | Successful completion of system level tests. | Included in verification and validation processes.
    Performance Tests | Successful completion of performance tests. | Included in verification and validation processes.

    2. Sample Sizes Used for the Test Set and Data Provenance

    • Sample Size for Test Set: Not provided. The submission states that "The subject of this premarket submission, Q-Station 3.0 software did not require clinical studies to support substantial equivalence." Testing involved "system level tests, performance tests, and safety testing from risk analysis," implying internal validation rather than a formal test set of patient data.
    • Data Provenance: Not provided (not applicable as clinical studies were not performed).

    3. Number of Experts Used to Establish the Ground Truth for the Test Set and Qualifications of Those Experts

    • Not applicable/Not provided. Clinical studies with expert-established ground truth were not conducted.

    4. Adjudication Method for the Test Set

    • Not applicable/Not provided. Clinical studies with adjudication were not conducted.

    5. Whether a Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study Was Done and, If So, the Effect Size of Human Reader Improvement with vs. without AI Assistance

    • No. An MRMC comparative effectiveness study was not done. The device is a Picture Archiving and Communications System (PACS) workstation, and this type of study is not relevant to demonstrating its substantial equivalence for its stated functions of viewing, analysis, and reporting.

    6. Whether a Standalone (i.e., Algorithm-Only, Without Human-in-the-Loop) Performance Study Was Done

    • Not explicitly described as a standalone algorithm performance study. The device itself is software for managing, viewing, and analyzing images, implicitly involving human interaction. The validation focused on the software's functionality, reliability, and safety when used by a human operator.

    7. The type of ground truth used (expert consensus, pathology, outcomes data, etc.)

    • Not applicable/Not provided. For the internal verification and validation, ground truth would likely refer to expected software behavior based on product specifications and design requirements, rather than a clinical ground truth like pathology or expert consensus.

    8. The sample size for the training set

    • Not applicable/Not provided. This device is described as software for managing, viewing, and analyzing existing image data, rather than an AI/ML algorithm that requires a "training set" in the conventional sense. Its "analysis packages" are "essentially the same as those included with the EPIQ ultrasound system," suggesting pre-existing modules rather than newly trained AI.

    9. How the ground truth for the training set was established

    • Not applicable/Not provided, as there is no mention of a training set for an AI/ML algorithm.

    K Number
    K103815
    Device Name
    Q-STATION
    Date Cleared
    2011-01-25

    (27 days)

    Product Code
    Regulation Number
    892.2050
    Reference & Predicate Devices
    Why did this record match?
    Device Name: Q-STATION

    Intended Use

    Q-Station is a software application package designed to manage, view, and report image data acquired by ultrasound systems and cardiac waveform data from Philips Stress Vue ECG systems. Q-Station offers support for QLAB plug-ins for analysis, quantification, and reporting of data from ultrasound systems.

    Device Description

    Q-Station is workstation software designed for managing, viewing, and reporting qualitative and quantitative image data from ultrasound exams. It includes advanced analysis via QLAB integration (QLAB 8.0) and provides integrated tools that allow users to manually assess and score cardiac wall motion and to export images and/or exams and reports. It supports connectivity to ultrasound systems, PACS, other DICOM storage repositories, and Philips Stress Vue ECG systems to aid clinicians in diagnostic activity. Q-Station supports QLAB plug-ins.
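
    Connectivity to ultrasound systems and PACS runs over DICOM associations, and the standard smoke test is a C-ECHO. A minimal sketch using the pynetdicom library follows; the host, port, and AE title are placeholders, not Philips defaults.

```python
# Sketch: verify DICOM connectivity to a PACS or ultrasound system with
# a C-ECHO. Host, port, and AE title below are placeholders.
from pynetdicom import AE

VERIFICATION_SOP_CLASS = "1.2.840.10008.1.1"  # DICOM Verification SOP Class UID

ae = AE(ae_title="QSTATION_TEST")
ae.add_requested_context(VERIFICATION_SOP_CLASS)
assoc = ae.associate("pacs.example.org", 104)  # placeholder address/port
if assoc.is_established:
    status = assoc.send_c_echo()
    print(f"C-ECHO status: 0x{status.Status:04X}")
    assoc.release()
else:
    print("Association rejected, aborted, or never connected")
```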

    AI/ML Overview

    The provided 510(k) summary for Q-Station 1.0 focuses on demonstrating substantial equivalence to a predicate device (Xcelera K061995) rather than on specific clinical performance metrics with pre-defined acceptance criteria.

    The submission states that:

    • "No performance standards for PACS systems or components have been issued under the authority of Section 514."
    • "The Q-Station software has been designed to comply with the following voluntary standards: NEMA PS 3.1 - 3.18 (2008), Digital Imaging and Communications in Medicine (DICOM) Set and IEC/ISO 10918-1:1994 Technical Corrigendum 1:2005, Information technology - Digital compression and coding of continuous-tone still images."
    • "Software development for the Q-Station software follows documented processes for software design, verification and validation testing. A risk assessment has been completed to identify potential design hazards that could cause an error or injury based on the use of the quantification results. Appropriate steps have been taken to control all identified risks for this type of image display and quantification product."

    Therefore, the submission does not include a study that defines explicit acceptance criteria for diagnostic performance (e.g., sensitivity, specificity, accuracy) and then provides data to prove the device meets these criteria in the way new AI/CADe devices typically do. Instead, the focus is on compliance with standards and internal software development processes to mitigate risks and achieve substantial equivalence.

    Given the information provided, it is not possible to complete the requested table and details for acceptance criteria and a study proving those criteria were met. The document describes a regulatory submission process based on demonstrating substantial equivalence and compliance with general software/DICOM standards, not a specific clinical performance study with predefined metrics.
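
    The two voluntary standards quoted above are concrete enough to spot-check on stored objects. Here is a minimal sketch with pydicom that reads a file's meta header and tests whether the transfer syntax is the JPEG baseline process defined by ISO/IEC 10918-1; the file path is a placeholder.

```python
# Sketch: check whether a stored DICOM object uses the JPEG baseline
# transfer syntax (ISO/IEC 10918-1, process 1). Path is a placeholder.
import pydicom

JPEG_BASELINE_UID = "1.2.840.10008.1.2.4.50"

ds = pydicom.dcmread("exam.dcm")
ts = ds.file_meta.TransferSyntaxUID
print("Transfer syntax:", ts)
print("JPEG baseline (10918-1)?", str(ts) == JPEG_BASELINE_UID)
```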

    Here's a breakdown of what can be inferred or what is explicitly not available based on the provided text:


    1. A table of acceptance criteria and the reported device performance

    Acceptance Criteria | Reported Device Performance
    --- | ---
    Not explicit (the device is compared to the predicate on functionality and compliance with standards rather than specific performance metrics) | Not provided in a quantitative, performance-based manner for diagnostic accuracy, sensitivity, specificity, etc.
    Compliance with NEMA PS 3.1 - 3.18 (DICOM) | Stated compliance with this standard.
    Compliance with IEC/ISO 10918-1:1994 (JPEG compression) | Stated compliance with this standard.
    Functionality similar to predicate device (Xcelera K061995) for managing, viewing, reporting, and QLAB integration | Device description outlines these functionalities, implying they function similarly to the predicate.
    Risk assessment indicating identified risks are controlled | Stated that "Appropriate steps have been taken to control all identified risks."
    No new issues of safety or effectiveness raised | Stated as a conclusion.

    2. Sample size used for the test set and the data provenance (e.g., country of origin of the data, retrospective or prospective)

    • Not provided. The document does not describe a clinical performance test set. The validation mentioned refers to software verification and validation, not clinical data evaluation for diagnostic performance.

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts

    • Not applicable / Not provided. No specific test set with ground truth established by experts is mentioned for assessing diagnostic performance.

    4. Adjudication method (e.g., 2+1, 3+1, none) for the test set

    • Not applicable / Not provided. No specific test set with adjudication is mentioned.

    5. Whether a multi-reader multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of human reader improvement with vs. without AI assistance

    • No. The document does not mention any MRMC study. Q-Station is described as a workstation software for managing, viewing, and reporting image data, including advanced analysis via QLAB integration and tools for manual assessment and scoring. It's not presented as an AI/CADe assistance tool in the context of improving human reader performance.

    6. Whether a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done

    • No. The document describes a workstation for human interaction with imaging data, including manual assessment. It does not present a standalone algorithm for diagnostic tasks.

    7. The type of ground truth used (expert consensus, pathology, outcomes data, etc.)

    • Not applicable / Not provided. No specific ground truth for diagnostic performance assessment is mentioned.

    8. The sample size for the training set

    • Not applicable / Not provided. The device is described as software for managing, viewing, and reporting image data, not explicitly as a machine learning/AI device requiring a "training set" in the conventional sense of AI model development for diagnostic tasks.

    9. How the ground truth for the training set was established

    • Not applicable / Not provided. (See point 8).

    In summary, this 510(k) submission details a software workstation that functions as a tool for clinicians to view, manage, and analyze ultrasound and ECG data. Its regulatory pathway relies on demonstrating substantial equivalence to existing predicate devices and compliance with relevant industry standards (DICOM, JPEG), alongside internal software verification and validation processes. It does not present data from clinical performance studies against specific diagnostic acceptance criteria.

