
510(k) Data Aggregation

    K Number: K113456
    Device Name: READY VIEW
    Date Cleared: 2012-06-15 (207 days)
    Product Code:
    Regulation Number: 892.2050
    Reference & Predicate Devices:
    Applicant Name (Manufacturer): GE HEALTHCARE (GE MEDICAL SYSTEMS SCS)

    Intended Use

    READY View is an image analysis software that allows the user to process dynamic or functional volumetric data and to generate maps that display changes in image intensity over time, echo time, b-value (Diffusion imaging) and frequency (Spectroscopy). The combination of acquired images, reconstructed images, calculated parametric images, tissue segmentation, annotations and measurements performed by the clinician allows multiparametric analysis and may provide clinically relevant information for diagnosis.
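
    The summary does not describe the algorithms behind these maps. Purely as an illustration of what a parametric map computed over b-value can look like, the sketch below fits the standard mono-exponential diffusion model S(b) = S0*exp(-b*ADC) voxel-wise; the function name, array shapes, and phantom values are invented for this example and are not taken from the submission.

```python
# Illustrative sketch only: a generic mono-exponential ADC fit over b-values.
# This is not READY View's algorithm; it only shows one way a "map over
# b-value" can be derived from a stack of diffusion-weighted images.
import numpy as np

def adc_map(dwi, b_values, eps=1e-6):
    """Fit S(b) = S0 * exp(-b * ADC) voxel-wise by log-linear least squares.

    dwi      : array of shape (n_b, ny, nx), one image per b-value (assumed input)
    b_values : sequence of the n_b b-values in s/mm^2
    Returns an (ny, nx) ADC map in mm^2/s.
    """
    n_b, ny, nx = dwi.shape
    log_s = np.log(np.clip(dwi, eps, None)).reshape(n_b, -1)      # (n_b, n_voxels)
    # Design matrix for ln S = ln S0 - b * ADC; coefficients are [ln S0, ADC]
    A = np.stack([np.ones(n_b), -np.asarray(b_values, dtype=float)], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, log_s, rcond=None)            # (2, n_voxels)
    return coeffs[1].reshape(ny, nx)

# Synthetic check: a uniform phantom with a known ADC of 1.0e-3 mm^2/s
b = np.array([0.0, 500.0, 1000.0])
phantom = 1000.0 * np.exp(-b[:, None, None] * 1.0e-3) * np.ones((3, 4, 4))
print(adc_map(phantom, b)[0, 0])   # ~0.001
```

    A map over echo time or over a time series would follow the same pattern with a different signal model.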

    Device Description

    READY VIEW (K110573) is a suite of applications developed to improve multi-parametric exams by enabling the analysis of MR generated data sets containing multiple images for each scan location. The MR data sets may be any of the following:

    • A time series
    • A diffusion weighted scan
    • A diffusion tensor scan
    • Variable echo imaging
    • Blood oxygen level dependent imaging
    • Spectroscopy (Single voxel and 2D or 3D CSI)
      The READY View platform provides a combination of protocols, applications and tools that enables a fast, easy and quantified analysis of the multiple data sets.
      Brain View is a post processing image analysis software package that provides advanced techniques to aid in the diagnosis of neurological and oncological diseases. Brain View is an option with the READY View platform and offers two advanced protocols:
    • FiberTrak
    • Arterial Spin Labeling (ASL)
      READY View along with Brain View option are available on the Advantage Workstation (AW), Advantage Workstation Server Gen 2 and AW Server PACS, for viewing and processing Magnetic Resonance images.
      The basis for this submission is a modification of a legally marketed device to incorporate additional features. The following additional functional protocols can now be post processed using READY View software:
      Brain View, which is a post processing image analysis software package that provides advanced techniques to aid in the diagnosis of neurological and oncological diseases, now offers two additional advanced protocols:
    • BrainStat
    • BrainStat AIF
      Body View is a post processing image analysis software package that provides advanced techniques to aid in the diagnosis of oncological diseases in the human body. Body View is an option with the READY View platform and offers two advanced protocols:
    • Signal Enhancement Ratio (SER)
    • MR Standard
      READY View along with Brain View and Body View options are available on the Advantage Workstation (AW), Advantage Workstation Server Gen 2 and AW Server PACS, for viewing and processing Magnetic Resonance images.
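
    The Body View option above lists a Signal Enhancement Ratio (SER) protocol. The 510(k) summary does not define SER or describe its implementation; as a generic, hedged illustration, the sketch below uses the definition commonly seen in dynamic contrast-enhanced breast MRI, SER = (S_early - S_pre) / (S_late - S_pre), with invented variable names and example values.

```python
# Illustrative sketch only: a common literature definition of the Signal
# Enhancement Ratio (SER) for dynamic contrast-enhanced MRI. The submission
# does not give READY View's actual formula, masking rules, or thresholds.
import numpy as np

def ser_map(pre, early, late, min_enhancement=0.0):
    """SER = (S_early - S_pre) / (S_late - S_pre), computed voxel-wise.

    pre, early, late : same-shape arrays of pre-contrast, early post-contrast
                       and late post-contrast signal intensities (assumed inputs).
    Voxels whose late-phase enhancement is at or below `min_enhancement`
    are set to 0 to avoid dividing by tiny denominators.
    """
    num = early.astype(float) - pre.astype(float)
    den = late.astype(float) - pre.astype(float)
    ser = np.zeros_like(num)
    mask = den > min_enhancement
    ser[mask] = num[mask] / den[mask]
    return ser

# Example: a washout-type voxel (early enhancement above late) yields SER > 1
pre, early, late = np.array([[100.0]]), np.array([[300.0]]), np.array([[250.0]])
print(ser_map(pre, early, late))   # [[1.333...]]
```
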
    AI/ML Overview

    The provided 510(k) submission for GE Healthcare's READY View states that no clinical studies were required to support substantial equivalence for this device. Therefore, there is no information available in this document regarding acceptance criteria, device performance, sample sizes, expert involvement, or ground truth establishment based on clinical data.

    The submission focuses entirely on non-clinical tests to demonstrate substantial equivalence to its predicate device (READY View K110573).

    Here's a breakdown of what the document does state regarding testing:

    1. A table of acceptance criteria and the reported device performance

    • Not Applicable. The document explicitly states: "The subject of this premarket submission, READY View, did not require clinical studies to support substantial equivalence." Therefore, no clinical performance metrics or acceptance criteria based on patient outcomes are provided.
    • The "Summary of Non-Clinical Tests" lists general quality assurance measures:
      • Risk Analysis
      • Requirements Reviews
      • Design Reviews
      • Performance testing (Verification)
      • Safety testing (Verification)
      • Simulated use testing (Validation)
        However, specific acceptance criteria or detailed results from these non-clinical tests are not disclosed in this summary.

    2. Sample size used for the test set and the data provenance

    • Not Applicable. No clinical test set. The document states: "All clinical images required for verification and validation activities were obtained from legally marketed GE MR Systems." This indicates that existing images were used for internal testing and validation, but not for a formal clinical study to prove substantial equivalence of the new features. The number of such images is not specified, nor is their provenance (e.g., country of origin, retrospective/prospective).

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts

    • Not Applicable. No clinical test set requiring expert ground truth in the context of this submission.

    4. Adjudication method for the test set

    • Not Applicable. No clinical test set.

    5. Whether a multi-reader, multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI assistance versus without it

    • Not Applicable. No MRMC study was conducted or reported. The device is described as "post processing image analysis software" that "allows multi-parametric analysis and may provide clinically relevant information for diagnosis." There is no mention of AI assistance for human readers or comparative effectiveness in this context.

    6. Whether a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done

    • Not Applicable. While the device is "algorithm only" in its function as image analysis software, the submission does not present a standalone performance study in the typical sense of measuring diagnostic accuracy against a ground truth. Its equivalence is based on non-clinical testing and comparison to its predicate device.

    7. The type of ground truth used (expert consensus, pathology, outcomes data, etc.)

    • Not Applicable. For non-clinical verification and validation activities utilizing "clinical images," the type of ground truth used to assess the software's processing capabilities (e.g., whether it correctly generates maps or graphs based on its algorithms) is not specified. It's likely that the "ground truth" for these internal tests was the expected output of the algorithms given the input data, verified by engineers or subject matter experts against predefined specifications, rather than clinical ground truth (like pathology or expert consensus on diagnosis).
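
    To make concrete what verifying expected algorithm output against predefined specifications typically looks like, the hypothetical check below compares a computed parametric map to a precomputed reference within a tolerance; the function name and tolerance values are invented and are not taken from GE's test procedures.

```python
# Hypothetical illustration of spec-based software verification: the "ground
# truth" is a precomputed expected output for a fixed input, not a clinical
# reference standard. Names and tolerances are invented for this sketch.
import numpy as np

def verify_against_expected(computed, expected, abs_tol=1e-6, rel_tol=1e-3):
    """Return (passed, max_abs_error) for one verification test case."""
    computed = np.asarray(computed, dtype=float)
    expected = np.asarray(expected, dtype=float)
    max_abs_error = float(np.max(np.abs(computed - expected)))
    passed = bool(np.allclose(computed, expected, atol=abs_tol, rtol=rel_tol))
    return passed, max_abs_error

# Example: a parametric-map routine's output vs. the value the spec predicts
expected = np.full((4, 4), 1.0e-3)                      # spec-derived expected map
computed = expected + np.random.uniform(-1e-7, 1e-7, (4, 4))
print(verify_against_expected(computed, expected))      # (True, ~1e-7)
```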

    8. The sample size for the training set

    • Not Applicable. There is no mention of machine learning or AI training sets in this submission. The device description focuses on its function as "post processing image analysis software" that applies protocols and tools for analysis, not on learning from data.

    9. How the ground truth for the training set was established

    • Not Applicable. As no training set is mentioned, this question is not relevant to the provided text.

    In summary, the 510(k) submission for READY View (K113456) explicitly states that clinical studies were not required to demonstrate substantial equivalence for the modifications introduced (new protocols like BrainStat, BrainStat AIF, SER, MR Standard). The reliance was on non-clinical software verification and validation activities.


    K Number: K110573
    Device Name: READY VIEW
    Date Cleared: 2011-05-03 (63 days)
    Product Code:
    Regulation Number: 892.2050
    Reference & Predicate Devices:
    Applicant Name (Manufacturer): GE HEALTHCARE (GE MEDICAL SYSTEMS SCS)

    Intended Use

    READY View is an image analysis software that allows the user to process dynamic or functional volumetric data and to generate maps that display changes in image intensity over time, echo time, b-value (Diffusion imaging) and frequency (Spectroscopy). The combination of acquired images, reconstructed images, calculated parametric images, tissue segmentation, annotations and measurements performed by the clinician allows multi-parametric analysis and may provide clinically relevant information for diagnosis.

    Device Description

    READY View is a suite of applications developed to improve multiparametric exams by enabling the analysis of MR generated data sets containing multiple images for each scan location. The MR data sets may be any of the following:

    • A time series
    • A diffusion weighted scan
    • A diffusion tensor scan
    • Variable echo imaging
    • Blood oxygen level dependent imaging
    • Spectroscopy (Single voxel and 2D or 3D CSI)
      The READY View platform provides a combination of protocols, applications and tools that enables a fast, easy and quantified analysis of the multiple data sets.
      Brain View is a post processing image analysis software package that provides advanced techniques to aid in the diagnosis of neurological and oncological diseases. Brain View is an option with the READY View platform and offers two advanced protocols:
    • FiberTrak
    • Arterial Spin Labeling (ASL)
      READY View along with Brain View option is available on the Advantage Workstation (AW) and Advantage Workstation Server Gen 2, for viewing and processing Magnetic Resonance images.
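
    The FiberTrak protocol listed above works on diffusion tensor data. The summary gives no algorithmic detail, so the following is only a textbook-style sketch of how a diffusion tensor and its fractional anisotropy (FA), the kind of quantity tractography protocols build on, can be estimated for a single voxel; the function names, gradient directions, and signal values are invented for this example.

```python
# Illustrative sketch only: a single-voxel diffusion tensor fit and FA value.
# This is not GE's FiberTrak implementation; the inputs below are synthetic.
import numpy as np

def fit_tensor_voxel(s0, signals, bvecs, b):
    """Least-squares fit of ln(S_i/S0) = -b * g_i^T D g_i for one voxel.

    s0      : non-diffusion-weighted signal
    signals : array of n diffusion-weighted signals
    bvecs   : (n, 3) unit gradient directions
    b       : b-value in s/mm^2
    Returns the symmetric 3x3 diffusion tensor D.
    """
    gx, gy, gz = bvecs[:, 0], bvecs[:, 1], bvecs[:, 2]
    # Design matrix for the six unique tensor elements Dxx, Dyy, Dzz, Dxy, Dxz, Dyz
    B = np.stack([gx * gx, gy * gy, gz * gz,
                  2 * gx * gy, 2 * gx * gz, 2 * gy * gz], axis=1)
    y = -np.log(signals / s0) / b
    d, *_ = np.linalg.lstsq(B, y, rcond=None)
    return np.array([[d[0], d[3], d[4]],
                     [d[3], d[1], d[5]],
                     [d[4], d[5], d[2]]])

def fractional_anisotropy(D):
    """FA computed from the tensor's eigenvalues."""
    ev = np.linalg.eigvalsh(D)
    md = ev.mean()
    return np.sqrt(1.5 * np.sum((ev - md) ** 2) / np.sum(ev ** 2))

# Synthetic check: signals generated from a known anisotropic tensor
D_true = np.diag([1.7e-3, 0.3e-3, 0.3e-3])
bvecs = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],
                  [1, 1, 0], [1, 0, 1], [0, 1, 1]], dtype=float)
bvecs /= np.linalg.norm(bvecs, axis=1, keepdims=True)
b, s0 = 1000.0, 1000.0
signals = s0 * np.exp(-b * np.einsum('ij,jk,ik->i', bvecs, D_true, bvecs))
print(round(fractional_anisotropy(fit_tensor_voxel(s0, signals, bvecs, b)), 2))  # ~0.8
```
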
    AI/ML Overview

    Here's an analysis of the acceptance criteria and the study information for the GE Healthcare READY View device, based on the provided text.

    Based on the provided document, the GE Healthcare READY View device did not require clinical studies to support substantial equivalence. This means that specific acceptance criteria typically associated with clinical performance metrics (like sensitivity, specificity, accuracy, etc.) and a study to prove they were met are not detailed.

    Instead, the submission focuses on non-clinical tests to demonstrate its safety, effectiveness, and substantial equivalence to a predicate device.

    Here's the breakdown of the requested information, drawing from what's available in the document:


    1. A table of acceptance criteria and the reported device performance

    Since a clinical study with performance metrics wasn't conducted, the acceptance criteria and performance are based on successful completion of non-clinical tests and demonstration of equivalence to the predicate device.

    | Acceptance Criteria Category | Specific Criteria (Implicitly Met) | Reported Device Performance/Conclusion |
    |---|---|---|
    | Non-Clinical Testing | Risk Analysis conducted | Risk Analysis complete |
    | | Requirements Reviews conducted | Requirements Reviews complete |
    | | Design Reviews conducted | Design Reviews complete |
    | | Performance testing (Verification) completed successfully | Performance testing (Verification) complete and successful |
    | | Safety testing (Verification) completed successfully | Safety testing (Verification) complete and successful |
    | | Simulated use testing (Validation) completed successfully | Simulated use testing (Validation) complete and successful |
    | Substantial Equivalence | Device is as safe, as effective, and performance is substantially equivalent to predicate devices (FuncTool K960265 and Volume Viewer Plus K041521). | "GE Healthcare considers READY View to be as safe, as effective, and performance is substantially equivalent to the predicate devices." The FDA concurred with this determination, stating, "We have reviewed your Section 510(k) premarket notification of intent to market the device referenced above and have determined the device is substantially equivalent... You may, therefore, market the device." This implies all equivalence criteria (e.g., similar indications for use, technological characteristics, and performance based on non-clinical testing) were deemed met. The document also states, "The READY View software employs the same algorithm technology as its predicate device FuncTool (K960265)." |

    2. Sample size used for the test set and the data provenance

    • Sample size for the test set: Not applicable, as no clinical efficacy study with a distinct "test set" of patient data (in the sense of a clinical trial) was performed for this submission. The "test set" for the non-clinical performance and safety testing would have been internal engineering and software validation tests, not patient imaging data reported in the submission in detail.
    • Data provenance: Not applicable for a clinical test set. The document refers to "MR functional data of the human body acquired from a MR Scanner" for processing by the device, covering targeted anatomies like "Brain, Breast, Prostate, and Liver." However, the origin of specific data used for validation is not specified as it was not a clinical efficacy study.

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts

    • Not applicable. There was no clinical study involving expert consensus on patient cases for ground truth, as the submission relies on substantial equivalence and non-clinical testing.

    4. Adjudication method for the test set

    • Not applicable. No clinical study with a test set requiring adjudication was performed.

    5. Whether a multi-reader, multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI assistance versus without it

    • No. The document explicitly states: "The subject of this premarket submission, READY View, did not require clinical studies to support substantial equivalence." Therefore, no MRMC study or AI-assisted reader performance evaluation was conducted or reported.

    6. Whether a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done

    • Yes, in essence. The "Performance testing (Verification)" and "Simulated use testing (Validation)" described are essentially demonstrations of the algorithm's functionality and performance as a software product. While not described as an "AI algorithm," it's a software application performing image analysis without human intervention during its processing steps. The output of the software (calculated parametric images, graphs, etc.) is then used by a clinician. The focus here is on the software's ability to process data and generate accurate representations, mirroring the algorithms of its predicate devices.

    7. The type of ground truth used

    • For the non-clinical verification and validation activities, the "ground truth" would have been established by engineering specifications, software design documents, and expected outputs based on known input data. For example, if a function calculates a specific parameter, the ground truth would be the mathematically correct calculation of that parameter given predefined input values. The comparison against predicate devices also serves as a form of "ground truth" for functionality and expected output.

    8. The sample size for the training set

    • Not applicable directly. READY View is not presented as a machine learning/AI model that requires a distinct "training set" in the modern sense. It's an image analysis software employing established algorithms, explicitly stated to use "the same algorithm technology as its predicate device FuncTool (K960265)." Therefore, there's no mention of a training set size.

    9. How the ground truth for the training set was established

    • Not applicable, as a discrete "training set" for a machine learning model was not used. The algorithms are based on established scientific principles for processing MR functional data, similar to the predicate device.
