
510(k) Data Aggregation

    K Number: K131822
    Date Cleared: 2013-07-23 (33 days)
    Product Code:
    Regulation Number: 892.2050
    Reference & Predicate Devices:
    Device Name: ULTRAEXTEND USWS-900A V2.1 AND V3.1

    Intended Use

    This software is intended for displaying and analyzing ultrasound images for medical diagnosis in cardiac and general examinations.

    Device Description

    UltraExtend USWS-900A v2.1 and v3.1 is a software package that can be installed on a general-purpose personal computer (PC), enabling data acquired from Aplio diagnostic ultrasound systems (Aplio XG, Aplio MX, Aplio Artida, Aplio 300, Aplio 400, and Aplio 500) to be loaded onto the PC for image processing with other application software products. It is post-processing software that implements functionality and operability equivalent to that of the diagnostic ultrasound system from which the data was acquired, providing a seamless image reading environment from examination on the diagnostic ultrasound system to diagnosis on the PC.

    AI/ML Overview

    The provided document is a 510(k) Pre-market Notification for a software product called "UltraExtend USWS-900A v2.1 and v3.1." This submission is for a modification of an already cleared device and does not include a study proving device performance against acceptance criteria in the typical sense of a clinical trial for a novel device.

    Instead, the submission focuses on demonstrating substantial equivalence to predicate devices. This means that the device is shown to function similarly and be intended for the same use as legally marketed devices.

    Therefore, many of the requested categories for a study proving device performance are not directly applicable or are addressed differently in this type of submission.

    Here's a breakdown based on the provided text:

    Acceptance Criteria and Reported Device Performance

    The document states that "Risk Analysis, Verification/Validation testing conducted through bench testing, as well as software validation documentation... demonstrate that the device meets established performance and safety requirements and is therefore deemed safe and effective." However, it does not provide a table of specific acceptance criteria or quantitative performance metrics for those criteria. The "performance" being evaluated is primarily the functional equivalence and safety of the software modifications.

    Acceptance Criteria (Implied) and Reported Device Performance (Implied)

    Functional Equivalence
      Criterion: The software should perform key functionalities (displaying and analyzing ultrasound images, accessing data from specific ultrasound systems, running applications such as CHI-Q and TDI-Q, and 2D wall motion tracking) in a manner equivalent to the predicate devices and the diagnostic ultrasound systems from which the data is acquired.
      Reported performance: "UltraExtend USWS-900A v2.1 and v3.1 is a post-processing software that implements functionality and operability equivalent to that of the diagnostic ultrasound system the data was acquired from, providing a seamless image reading environment..." The modifications make data from the Aplio 300, 400, and 500 systems accessible, and new applications (CHI-Q, TDI-Q) and features (2D wall motion tracking) were added.

    Safety
      Criterion: The modifications should not introduce new safety concerns, and the device should comply with relevant regulations and standards.
      Reported performance: "Risk Analysis, Verification/Validation testing conducted through bench testing... demonstrate that the device meets established performance and safety requirements and is therefore deemed safe and effective." The device was designed and manufactured under the Quality System Regulation (21 CFR §820) and ISO 13485, and IEC 62304 processes were implemented.

    Compatibility
      Criterion: The software should be compatible with the specified operating systems (Windows XP for v2.1, Windows 7 for v3.1) and able to access data from the listed Aplio diagnostic ultrasound systems.
      Reported performance: UltraExtend USWS-900A v2.1 runs under Windows XP and v3.1 runs under Windows 7. Data acquired by the Aplio 300, Aplio 400, and Aplio 500 Diagnostic Ultrasound Systems is accessible.
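The functional-equivalence and compatibility criteria above lend themselves to the kind of requirements-based verification the submission describes as bench testing. A minimal sketch of that pattern follows; the requirement IDs, function names, and specific checks are assumptions for illustration and do not come from the 510(k) filing.

```python
# Hypothetical sketch of requirements-based verification, mirroring the
# bench-testing approach described in the submission. Requirement IDs and
# checks below are illustrative, not taken from the actual test protocol.

# Systems whose data the software must accept (listed in the submission)
SUPPORTED_SOURCES = {"Aplio XG", "Aplio MX", "Aplio Artida",
                     "Aplio 300", "Aplio 400", "Aplio 500"}

def verify_data_access(source: str) -> bool:
    """REQ-001 (illustrative): data from each listed Aplio system is accessible."""
    return source in SUPPORTED_SOURCES

def verify_applications(installed: set) -> bool:
    """REQ-002 (illustrative): the CHI-Q and TDI-Q applications are present."""
    return {"CHI-Q", "TDI-Q"} <= installed

# Each requirement maps to a pass/fail verification result
results = {
    "REQ-001": all(verify_data_access(s)
                   for s in ("Aplio 300", "Aplio 400", "Aplio 500")),
    "REQ-002": verify_applications({"CHI-Q", "TDI-Q", "2D wall motion tracking"}),
}
print(results)  # → {'REQ-001': True, 'REQ-002': True}
```

In this style of validation, "performance" means every requirement verifies as true against the specification, rather than a quantitative clinical metric.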

    Study Details (Based on the document, many are not applicable for a 510(k) modification without clinical studies)

    1. Sample size used for the test set and the data provenance:

      • Test Set Sample Size: Not explicitly stated as a separate "test set" in the context of a clinical study. The validation involved "bench testing" and "software validation documentation." This typically means testing against a variety of use cases and scenarios, but the number of cases or the specific data used for this internal validation is not provided.
      • Data Provenance: Not specified. As no clinical studies were performed, there's no mention of country of origin or retrospective/prospective data for a clinical test set. The data would likely be internally generated or from existing Aplio systems.
    2. Number of experts used to establish the ground truth for the test set and the qualifications of those experts:

      • Not applicable as no clinical study with expert-established ground truth was conducted. The "ground truth" for software validation would be adherence to functional specifications and absence of bugs, verified by software engineers and quality assurance personnel.
    3. Adjudication method (e.g., 2+1, 3+1, none) for the test set:

      • Not applicable as no clinical study with adjudicated results was conducted.
    4. If a multi-reader multi-case (MRMC) comparative effectiveness study was done, and if so, the effect size of how much human readers improve with AI assistance versus without:

      • No MRMC study was done. The document explicitly states: "UltraExtend USWS-900A v2.1 and v3.1 did not require clinical studies to support substantial equivalence." This is a software for displaying and analyzing images, not an AI diagnostic tool requiring MRMC evaluation for reader improvement.
    5. If a standalone performance study (i.e., algorithm only, without a human in the loop) was done:

      • Not directly applicable in the sense of an algorithmic diagnostic performance study. The "standalone" performance here refers to the software's ability to correctly process and display images, and run its embedded applications. This was assessed through "Risk Analysis, Verification/Validation testing conducted through bench testing, as well as software validation documentation."
    6. The type of ground truth used (expert consensus, pathology, outcomes data, etc.):

      • For software validation, the "ground truth" would be the software requirements and specifications. The validation process verifies that the software functions as designed and meets these predefined requirements, rather than clinical ground truth (like pathology or expert consensus).
    7. The sample size for the training set:

      • Not applicable. This is not a machine learning or AI device that requires a separate "training set" in the context of developing a diagnostic algorithm. It's a software package for image post-processing and display.
    8. How the ground truth for the training set was established:

      • Not applicable, as no training set (in the ML context) was used.