K Number
K190362
Device Name
HealthPNX
Date Cleared
2019-05-06

(80 days)

Product Code
Regulation Number
892.2080
Panel
RA
Reference & Predicate Devices
Intended Use

The Zebra Pneumothorax device is a software workflow tool designed to aid the clinical assessment of adult Chest X-Ray cases with features suggestive of Pneumothorax in the medical care environment. HealthPNX analyzes cases using an artificial intelligence algorithm to identify suspected findings. It makes case-level output available to a PACS/workstation for worklist prioritization or triage. HealthPNX is not intended to direct attention to specific portions or anomalies of an image. Its results are not intended to be used on a stand-alone basis for clinical decision-making, nor is the device intended to rule out Pneumothorax or otherwise preclude clinical assessment of X-Ray cases.

Device Description

Zebra's HealthPNX is a radiological computer-assisted triage and notification software system. The software automatically analyzes PA/AP chest x-rays and alerts the PACS/workstation once findings suspicious of pneumothorax are identified.

The following modules compose the HealthPNX software system:

Data input and validation: After a chest x-ray has been performed, a copy of the study is automatically retrieved and processed by the HealthPNX device. Following retrieval of a study, the validation feature assesses the input data (i.e. age, modality, view) to ensure compatibility for processing by the algorithm.

HealthPNX algorithm: Once a study has been validated, the algorithm analyzes the frontal chest x-ray for detection of suspected findings suggestive of pneumothorax.

IMA integration feature: The results of a successful study analysis are provided to IMA, which notifies the PACS/workstation through the worklist interface.

Error codes feature: If a study fails during data validation or algorithm analysis, an error code is provided to the system.
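Taken together, the four modules form a linear triage pipeline: validate the input, run the algorithm, notify via IMA on success, or emit an error code on failure. A minimal sketch of that flow follows; all field names, error codes, and the score threshold are illustrative assumptions, not Zebra's actual implementation:

```python
# Illustrative sketch of a triage pipeline like the one described above.
# Field names, error codes, and the threshold are hypothetical.

def validate_study(meta):
    """Check input compatibility (age, modality, view) before analysis."""
    if meta.get("modality") not in ("CR", "DX"):
        return "ERR_MODALITY"          # not a chest x-ray modality
    if meta.get("view") not in ("PA", "AP"):
        return "ERR_VIEW"              # only frontal views are analyzed
    if meta.get("age", 0) < 18:
        return "ERR_AGE"               # adult cases only
    return None

def triage_study(meta, suspicion_score, threshold=0.5):
    """Return a worklist notification, a routine result, or an error code."""
    err = validate_study(meta)
    if err:
        return {"status": "error", "code": err}
    if suspicion_score >= threshold:   # algorithm flags suspected pneumothorax
        return {"status": "prioritize", "finding": "suspected pneumothorax"}
    return {"status": "routine"}

print(triage_study({"modality": "DX", "view": "PA", "age": 55}, 0.91))
```

Note that, consistent with the device description, the sketch produces only a case-level status: no marked-up image and no localization output.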

The radiologist is then able to review the study earlier than in the standard-of-care workflow.

In summary, the HealthPNX device is intended to provide a passive notification through the PACS/workstation to radiologists, indicating the existence of a case that may potentially benefit from prioritization. It does not output an image and therefore does not mark, highlight, or direct users' attention to a specific location on the original chest X-ray.

The device's aim is to aid in prioritization and triage of radiological medical images only.

AI/ML Overview

Here's a breakdown of the acceptance criteria and study details for the HealthPNX device, based on the provided FDA 510(k) summary:

1. Table of Acceptance Criteria and Reported Device Performance

| Performance Metric | Acceptance Criteria (Goal) | Reported Device Performance |
|---|---|---|
| Detection accuracy (AUC) | Above 80% (versus ground truth) | 98.3% (95% CI: [97.40%, 99.02%]) |
| Overall agreement | No separate goal stated; high agreement demonstrated | 93.03% (95% CI: [90.66%, 94.95%]) |
| Sensitivity | No separate goal stated; met intended performance | 93.15% (95% CI: [87.76%, 96.67%]) |
| Specificity | No separate goal stated; met intended performance | 92.99% (95% CI: [90.19%, 95.19%]) |
| Triage time reduction | No explicit goal; statistically significant reduction demonstrated | Reduced by 60.93 min (from 68.98 min to 8.05 min) |
| Performance time (device analysis to notification) | No explicit goal; compared to predicate (3.35 min) | 22.1 seconds |
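The reported sensitivity, specificity, and overall agreement can be cross-checked against the test-set composition (146 positive, 442 negative cases). A sketch in Python; the implied counts (136 true positives, 411 true negatives) and the Wilson interval are assumptions, since the summary states neither the raw counts nor the CI method:

```python
from math import sqrt

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion.
    (The 510(k) summary does not state which interval method was used,
    so these bounds will not exactly match the reported CIs.)"""
    p = successes / n
    denom = 1 + z ** 2 / n
    center = (p + z ** 2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return center - half, center + half

# Counts implied by the reported rates on 146 positive / 442 negative cases
tp, fn = 136, 10   # sensitivity 136/146 ~ 93.15%
tn, fp = 411, 31   # specificity 411/442 ~ 92.99%

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
agreement = (tp + tn) / (tp + fn + tn + fp)   # 547/588 ~ 93.03%
print(round(sensitivity, 4), round(specificity, 4), round(agreement, 4))
```

The three point estimates reproduce the table's values, which supports the assumed counts; the exact CI bounds depend on the (unstated) interval method.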

2. Sample Size Used for the Test Set and Data Provenance

  • Sample Size for Test Set: 588 anonymized Chest X-Ray cases (146 pneumothorax positive, 442 pneumothorax negative).
  • Data Provenance: Retrospective cohort from the USA and Israel.

3. Number of Experts Used to Establish Ground Truth for the Test Set and Qualifications

  • Number of Experts: Three (3) US Board Certified Radiologists.
  • Qualifications: US Board Certified Radiologists. Years of experience are not specified.

4. Adjudication Method for the Test Set

The provided text states that "The validation data set was truthed (ground truth) by three US Board Certified Radiologists (truthers)." It does not explicitly detail an adjudication method like 2+1 or 3+1. This implies that the consensus of these three radiologists established the ground truth, but the specific rules (e.g., majority vote, unanimous) are not described.
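If, as is common with three truthers, reads were combined by simple majority, the adjudication would look like the following sketch (an assumption for illustration only — the summary does not state the rule):

```python
from collections import Counter

def majority_truth(reads):
    """Ground truth by simple majority vote of an odd number of truthers.
    (Hypothetical rule: the 510(k) summary does not specify how the three
    radiologists' reads were combined.)"""
    label, count = Counter(reads).most_common(1)[0]
    return label if count > len(reads) / 2 else None   # None: no majority

print(majority_truth(["pneumothorax", "pneumothorax", "negative"]))  # → pneumothorax
```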

5. If a Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study Was Done, and Effect Size

  • Yes, an MRMC-like study was done. The text states: "The triage effectiveness was evaluated by three different US Board Certified Radiologists (readers) that read these cases prospectively in real time with the HealthPNX device (HealthPNX prioritized work-list) and without (standard of care, 'First-in-First-out' or 'FIFO' queue) with a washout period separating between the two read periods with and without the HealthPNX device."
  • Effect Size of Human Readers' Improvement:
    • Without AI (Standard of Care): Mean triage time of 68.98 minutes (95% CI: [60.53, 77.43] minutes).
    • With AI (HealthPNX): Mean triage time of 8.05 minutes (95% CI: [5.93, 10.16] minutes).
    • Improvement (Reduction): 60.93 minutes. This represents a statistically significant reduction in triage time for time-sensitive images.
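The reported effect size is simply the difference of the two mean triage times, which can be sanity-checked directly (values taken from the summary):

```python
# Mean time-to-triage reported in the reading study (minutes)
soc_mean = 68.98      # standard of care, FIFO queue
device_mean = 8.05    # HealthPNX-prioritized worklist

reduction = soc_mean - device_mean   # reported as 60.93 minutes
relative = reduction / soc_mean      # fraction of standard-of-care time saved
print(round(reduction, 2), round(relative, 3))
```

The roughly 88% relative reduction, together with non-overlapping 95% CIs for the two means, is what the summary characterizes as statistically significant.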

6. If a Standalone Study (Algorithm Only) Was Done

  • Yes, a standalone study was done. The text explicitly states: "The stand-alone detection accuracy was measured on this cohort respective to ground truth." and "Overall, the HealthPNX was able to demonstrate an area under the curve (AUC) of 98.3% (95% CI: [97.40%, 99.02%])".
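The summary does not describe how the AUC was computed. One standard way to obtain an empirical AUC from case-level scores is the Mann-Whitney formulation — the probability that a randomly chosen positive case outscores a randomly chosen negative one — shown here as a generic sketch (not Zebra's code):

```python
def auc_mann_whitney(pos_scores, neg_scores):
    """Empirical AUC via the Mann-Whitney U statistic: fraction of
    (positive, negative) pairs where the positive case scores higher,
    with ties counted as half."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

print(auc_mann_whitney([0.9, 0.8, 0.7], [0.6, 0.8, 0.2]))
```

On the 146 x 442 test cohort this pairwise form is still cheap to evaluate; rank-based implementations are preferable for much larger cohorts.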

7. The Type of Ground Truth Used

  • Expert Consensus: The ground truth for the test set was established by three (3) US Board Certified Radiologists.

8. The Sample Size for the Training Set

The document does not explicitly state the sample size for the training set. It only discusses the validation/test set.

9. How the Ground Truth for the Training Set Was Established

The document does not provide information on how the ground truth for the training set was established. It only describes the process for the validation/test set.

§ 892.2080 Radiological computer aided triage and notification software.

(a) Identification. Radiological computer aided triage and notification software is an image processing prescription device intended to aid in prioritization and triage of radiological medical images. The device notifies a designated list of clinicians of the availability of time sensitive radiological medical images for review based on computer aided image analysis of those images performed by the device. The device does not mark, highlight, or direct users' attention to a specific location in the original image. The device does not remove cases from a reading queue. The device operates in parallel with the standard of care, which remains the default option for all cases.

(b) Classification. Class II (special controls). The special controls for this device are:

(1) Design verification and validation must include:

(i) A detailed description of the notification and triage algorithms and all underlying image analysis algorithms including, but not limited to, a detailed description of the algorithm inputs and outputs, each major component or block, how the algorithm affects or relates to clinical practice or patient care, and any algorithm limitations.

(ii) A detailed description of pre-specified performance testing protocols and dataset(s) used to assess whether the device will provide effective triage (e.g., improved time to review of prioritized images for pre-specified clinicians).

(iii) Results from performance testing that demonstrate that the device will provide effective triage. The performance assessment must be based on an appropriate measure to estimate the clinical effectiveness. The test dataset must contain sufficient numbers of cases from important cohorts (e.g., subsets defined by clinically relevant confounders, effect modifiers, associated diseases, and subsets defined by image acquisition characteristics) such that the performance estimates and confidence intervals for these individual subsets can be characterized with the device for the intended use population and imaging equipment.

(iv) Stand-alone performance testing protocols and results of the device.

(v) Appropriate software documentation (e.g., device hazard analysis; software requirements specification document; software design specification document; traceability analysis; description of verification and validation activities including system level test protocol, pass/fail criteria, and results).

(2) Labeling must include the following:

(i) A detailed description of the patient population for which the device is indicated for use;

(ii) A detailed description of the intended user and user training that addresses appropriate use protocols for the device;

(iii) Discussion of warnings, precautions, and limitations must include situations in which the device may fail or may not operate at its expected performance level (e.g., poor image quality for certain subpopulations), as applicable;

(iv) A detailed description of compatible imaging hardware, imaging protocols, and requirements for input images;

(v) Device operating instructions; and

(vi) A detailed summary of the performance testing, including: test methods, dataset characteristics, triage effectiveness (e.g., improved time to review of prioritized images for pre-specified clinicians), diagnostic accuracy of algorithms informing triage decision, and results with associated statistical uncertainty (e.g., confidence intervals), including a summary of subanalyses on case distributions stratified by relevant confounders, such as lesion and organ characteristics, disease stages, and imaging equipment.