K Number: K241727
Device Name: BriefCase-Triage
Date Cleared: 2024-07-12 (28 days)
Product Code:
Regulation Number: 892.2080
Panel: RA
Reference & Predicate Devices:

Intended Use

Briefcase-Triage is a radiological computer-aided triage and notification software indicated for use in the analysis of CTPA images, in adults or transitional adolescents aged 18 and older. The device is intended to assist hospital networks and appropriately trained medical specialists in workflow triage by flagging and communicating suspected positive cases of Pulmonary Embolism (PE) pathologies.

Briefcase-Triage uses an artificial intelligence algorithm to analyze images and highlight cases with detected findings in parallel to the ongoing standard of care image interpretation. The user is presented with notifications for cases with suspected PE findings. Notifications include compressed preview images that are meant for informational purposes only and not intended for diagnostic use beyond notification. The device does not alter the original medical image and is not intended to be used as a diagnostic device.

The results of Briefcase-Triage are intended to be used in conjunction with other patient information and, based on the clinician's professional judgment, to assist with triage/prioritization of medical images. Notified clinicians are responsible for viewing full images per the standard of care.

Device Description

Briefcase-Triage is a radiological computer-assisted triage and notification software device. The software is based on an algorithmic component and is intended to run on a Linux-based server in a cloud environment.

Briefcase-Triage receives filtered DICOM images and processes them chronologically by running the algorithms on each series to detect suspected cases. Following the AI processing, the output of the algorithm analysis is transferred to an image review software (desktop application). When a suspected case is detected, the user receives a pop-up notification and is presented with a compressed, low-quality, grayscale preview image captioned "not for diagnostic use, for prioritization only". This preview is meant for informational purposes only, does not contain any marking of the findings, and is not intended for primary diagnosis beyond notification.

Presenting users with worklist prioritization facilitates efficient triage by prompting them to assess the relevant original images in the PACS. Thus, a suspected case receives attention earlier than it would under the standard-of-care practice alone.
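
The description above amounts to a simple pipeline: ingest filtered DICOM series, run the detection algorithm on each series in order, and, when a case is flagged, surface a worklist notification with a non-diagnostic preview while leaving the original images untouched. Below is a minimal Python sketch of that flow; the class names, function names, and threshold are illustrative assumptions, not the vendor's actual interfaces.

# Minimal, illustrative sketch of the triage-and-notify flow described above.
# All names, types, and the threshold are hypothetical assumptions; they are
# not Aidoc's actual interfaces or parameters.
from dataclasses import dataclass
from typing import List


@dataclass
class Series:
    study_uid: str
    voxels: object            # placeholder for the filtered DICOM series data


@dataclass
class Notification:
    study_uid: str
    suspected_pe: bool
    preview: bytes            # compressed, unannotated, non-diagnostic preview


def run_pe_algorithm(series: Series) -> float:
    """Placeholder for the AI analysis; a real system would return a
    per-series suspicion score."""
    return 0.0


def make_preview(series: Series) -> bytes:
    """Placeholder for the low-quality grayscale capture captioned
    'not for diagnostic use, for prioritization only'."""
    return b""


def triage(series_queue: List[Series], threshold: float = 0.5) -> List[Notification]:
    """Process series in chronological order; flag suspected cases for
    worklist prioritization without altering the original images."""
    notifications = []
    for series in series_queue:                    # chronological processing
        score = run_pe_algorithm(series)
        suspected = score >= threshold             # operating-point threshold
        preview = make_preview(series) if suspected else b""
        notifications.append(Notification(series.study_uid, suspected, preview))
    return notifications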

AI/ML Overview

Here's a breakdown of the acceptance criteria and study details for the Aidoc BriefCase-Triage device, based on the provided document:

1. Table of Acceptance Criteria and Reported Device Performance

Device Name: BriefCase-Triage (for Pulmonary Embolism - PE)

Acceptance Criteria | Performance Goal | Reported Device Performance (Default Operating Point)
Sensitivity | ≥ 80% | 94.39% (95% CI: 90.41%, 97.07%)
Specificity | ≥ 80% | 94.39% (95% CI: 91.04%, 96.67%)
Time-to-notification (compared to predicate) | Comparable benefit in time saving | Mean 26.42 seconds (95% CI: 25.3-27.54) vs. the predicate's 78.0 seconds (95% CI: 73.6-82.3)
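
The sensitivity and specificity criteria are proportions reported with two-sided 95% confidence intervals, so a natural way to verify them is an exact (Clopper-Pearson) binomial interval on the observed counts. The summary does not report raw counts, so the numbers in the sketch below (202 true positives out of 214 positive cases) are hypothetical, chosen only to illustrate the calculation against the ≥ 80% goal.

# Exact (Clopper-Pearson) 95% CI for a proportion, e.g. sensitivity, and a
# check against the >= 80% performance goal. The counts are hypothetical
# placeholders; the 510(k) summary does not report raw counts.
from scipy.stats import beta


def clopper_pearson(k: int, n: int, alpha: float = 0.05):
    """Exact two-sided (1 - alpha) confidence interval for k successes in n trials."""
    lower = 0.0 if k == 0 else beta.ppf(alpha / 2, k, n - k + 1)
    upper = 1.0 if k == n else beta.ppf(1 - alpha / 2, k + 1, n - k)
    return lower, upper


tp, n_pos = 202, 214                     # hypothetical example counts
sens = tp / n_pos
lo, hi = clopper_pearson(tp, n_pos)
print(f"sensitivity {sens:.2%}, 95% CI ({lo:.2%}, {hi:.2%})")
print("meets >= 80% goal:", sens >= 0.80 and lo >= 0.80)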

Additional Operating Points (AOPs) Performance:

Operating Point | Sensitivity (95% CI) | Specificity (95% CI)
AOP1 | 99.53% (97.42%-99.99%) | 86.67% (82.16%-90.39%)
AOP2 | 97.66% (94.63%-99.24%) | 91.93% (88.14%-94.82%)
AOP3 | 91.59% (87.03%-94.94%) | 96.49% (93.64%-98.3%)
AOP4 | 85.98% (80.6%-90.34%) | 98.25% (95.95%-99.43%)
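
The additional operating points illustrate the usual trade-off: the same model, evaluated at different decision thresholds on its suspicion score, gains sensitivity at the cost of specificity or vice versa. The short sketch below demonstrates that threshold sweep on entirely synthetic scores and labels; it is not the device's data or thresholds.

# How alternative operating points arise from one model: sweeping the decision
# threshold on the suspicion scores trades sensitivity against specificity.
# Scores and labels here are synthetic, for illustration only.
import numpy as np

rng = np.random.default_rng(0)
labels = np.r_[np.ones(200), np.zeros(300)]                 # synthetic ground truth
scores = np.r_[rng.beta(8, 2, 200), rng.beta(2, 8, 300)]    # synthetic model scores

for threshold in (0.2, 0.35, 0.5, 0.65, 0.8):
    pred = scores >= threshold
    sens = (pred & (labels == 1)).sum() / (labels == 1).sum()
    spec = (~pred & (labels == 0)).sum() / (labels == 0).sum()
    print(f"threshold {threshold:.2f}: sensitivity {sens:.1%}, specificity {spec:.1%}")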

Other reported secondary endpoints for the default operating point:

  • NPV: 98.96% (95% CI: 98.21%-99.4%)
  • PPV: 74.79% (95% CI: 64.80%-82.7%)
  • PLR: 16.81 (95% CI: 10.43-27.09)
  • NLR: 0.059 (95% CI: 0.034-0.103)
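
The likelihood ratios follow directly from sensitivity and specificity, while PPV and NPV also depend on disease prevalence, which this summary does not state. The sketch below shows the standard formulas applied to the reported default operating point; the 15% prevalence is an assumed figure used purely for illustration.

# Deriving likelihood ratios and predictive values from sensitivity and
# specificity. The prevalence is an illustrative assumption; it is not
# reported in this summary.
sens, spec = 0.9439, 0.9439           # default operating point (reported above)
prevalence = 0.15                     # assumed for illustration only

plr = sens / (1 - spec)               # positive likelihood ratio
nlr = (1 - sens) / spec               # negative likelihood ratio
ppv = sens * prevalence / (sens * prevalence + (1 - spec) * (1 - prevalence))
npv = spec * (1 - prevalence) / (spec * (1 - prevalence) + (1 - sens) * prevalence)

print(f"PLR {plr:.2f}, NLR {nlr:.3f}")      # roughly 16.8 and 0.059
print(f"PPV {ppv:.2%}, NPV {npv:.2%}")      # roughly 74.8% and 99.0% at 15% prevalence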

2. Sample Size and Data Provenance for the Test Set

  • Sample Size: 499 cases
  • Data Provenance: Retrospective, multicenter study from 6 US-based clinical sites. The cases were distinct in time or center from the cases used to train the algorithm.

3. Number of Experts and Qualifications for Ground Truth

  • Number of Experts: Three (3)
  • Qualifications: Senior board-certified radiologists.

4. Adjudication Method

  • The document states "the ground truth as determined by three senior board-certified radiologists." It does not explicitly state the adjudication method (e.g., 2+1, consensus, majority vote). However, "determined by" implies that their collective judgment established the ground truth for each case.

5. Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study

  • No, a multi-reader multi-case (MRMC) comparative effectiveness study was not explicitly described. The study focuses on the standalone performance of the AI algorithm against a ground truth established by experts, and a comparison of its notification time with a predicate device. It does not evaluate human reader performance with and without AI assistance.

6. Standalone Performance Study

  • Yes, a standalone study (algorithm only without human-in-the-loop performance) was conducted. The primary endpoints (sensitivity and specificity) evaluated the device's ability to identify PE cases independently.

7. Type of Ground Truth Used

  • Expert Consensus: The ground truth was "determined by three senior board-certified radiologists."

8. Sample Size for the Training Set

  • The document states that the algorithm was "trained during software development on images of the pathology" and that the subject device was trained on a "larger data set" compared to the predicate. However, it does not specify the exact sample size of the training set.

9. How Ground Truth for the Training Set was Established

  • "critical findings were tagged in all CTs in the training data set." This implies manual labeling (annotation) of findings by experts on the training images. While not explicitly stated, it is common practice that these tags are done by medical professionals.

§ 892.2080 Radiological computer aided triage and notification software.

(a) Identification. Radiological computer aided triage and notification software is an image processing prescription device intended to aid in prioritization and triage of radiological medical images. The device notifies a designated list of clinicians of the availability of time sensitive radiological medical images for review based on computer aided image analysis of those images performed by the device. The device does not mark, highlight, or direct users' attention to a specific location in the original image. The device does not remove cases from a reading queue. The device operates in parallel with the standard of care, which remains the default option for all cases.

(b) Classification. Class II (special controls). The special controls for this device are:

(1) Design verification and validation must include:
(i) A detailed description of the notification and triage algorithms and all underlying image analysis algorithms including, but not limited to, a detailed description of the algorithm inputs and outputs, each major component or block, how the algorithm affects or relates to clinical practice or patient care, and any algorithm limitations.
(ii) A detailed description of pre-specified performance testing protocols and dataset(s) used to assess whether the device will provide effective triage (e.g., improved time to review of prioritized images for pre-specified clinicians).
(iii) Results from performance testing that demonstrate that the device will provide effective triage. The performance assessment must be based on an appropriate measure to estimate the clinical effectiveness. The test dataset must contain sufficient numbers of cases from important cohorts (e.g., subsets defined by clinically relevant confounders, effect modifiers, associated diseases, and subsets defined by image acquisition characteristics) such that the performance estimates and confidence intervals for these individual subsets can be characterized with the device for the intended use population and imaging equipment.
(iv) Stand-alone performance testing protocols and results of the device.
(v) Appropriate software documentation (e.g., device hazard analysis; software requirements specification document; software design specification document; traceability analysis; description of verification and validation activities including system level test protocol, pass/fail criteria, and results).

(2) Labeling must include the following:
(i) A detailed description of the patient population for which the device is indicated for use;
(ii) A detailed description of the intended user and user training that addresses appropriate use protocols for the device;
(iii) Discussion of warnings, precautions, and limitations must include situations in which the device may fail or may not operate at its expected performance level (e.g., poor image quality for certain subpopulations), as applicable;
(iv) A detailed description of compatible imaging hardware, imaging protocols, and requirements for input images;
(v) Device operating instructions; and
(vi) A detailed summary of the performance testing, including: test methods, dataset characteristics, triage effectiveness (e.g., improved time to review of prioritized images for pre-specified clinicians), diagnostic accuracy of algorithms informing triage decision, and results with associated statistical uncertainty (e.g., confidence intervals), including a summary of subanalyses on case distributions stratified by relevant confounders, such as lesion and organ characteristics, disease stages, and imaging equipment.