Annalise Enterprise is a device designed to be used in the medical care environment to aid in triage and prioritization of studies with features suggestive of the following findings:
· acute subdural/epidural hematoma*
· acute subarachnoid hemorrhage*
· intra-axial hemorrhage*
· intraventricular hemorrhage*
*These findings are intended to be used together as one device.
The device analyzes studies using an artificial intelligence algorithm to identify findings. It makes study-level output available to an order and imaging management system for worklist prioritization or triage.
The device is not intended to direct attention to specific portions of an image and only provides notification for suspected findings.
Its results are not intended:
· to be used on a standalone basis for clinical decision making
· to rule out specific findings, or otherwise preclude clinical assessment of CTB studies
Intended modality:
Annalise Enterprise identifies suspected findings in non-contrast brain CT studies.
Intended user:
The device is intended to be used by trained clinicians who, as part of their scope of practice, are qualified to interpret brain CT studies.
Intended patient population:
The intended population is patients who are 22 years or older.
Annalise Enterprise CTB Triage Trauma is a software workflow tool which uses an artificial intelligence (AI) algorithm to identify suspected findings on non-contrast brain CT studies in the medical care environment. The findings identified by the device include acute subdural/epidural hematoma, acute subarachnoid hemorrhage, intra-axial hemorrhage, and intraventricular hemorrhage.
Radiological findings are identified by the device using an AI algorithm: a convolutional neural network trained using deep-learning techniques. Images used to train the algorithm were sourced from datasets that included a range of equipment manufacturers, including Toshiba, GE Medical Systems, Siemens, Philips, and Canon Medical Systems. This dataset, which contained over 200,000 CT brain imaging studies, was labelled by trained radiologists regarding the presence of the four findings of interest.
The performance of the device's AI algorithm was validated in a standalone performance evaluation, in which the case-level output from the device was compared with a reference standard ('ground truth'). This was determined by two ground truthers, with a third truther used in the event of disagreement. All truthers were US board-certified neuroradiologists.
The device interfaces with image and order management systems (such as PACS/RIS) to obtain non-contrast brain CT studies for processing by the AI algorithm. Following processing, if any of the clinical findings of interest are identified in a non-contrast brain CT study, the device provides a notification to the image and order management system for prioritization of that study in the worklist. This enables users to review studies containing features suggestive of these clinical findings earlier than in the standard clinical workflow. The device never decreases a study's existing priority in the worklist, so worklist items are never downgraded based on AI results.
The device workflow is performed parallel to and in conjunction with the standard clinical workflow for interpretation of non-contrast brain CTs. The device is intended to aid in prioritization and triage of radiological medical images only.
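The never-downgrade prioritization rule described above can be sketched in a few lines of Python. This is an illustrative sketch only; the priority levels, function name, and default triage level are hypothetical and are not part of the device's actual interface:

```python
from enum import IntEnum


class Priority(IntEnum):
    """Hypothetical worklist priority levels (higher = reviewed sooner)."""
    ROUTINE = 0
    URGENT = 1
    STAT = 2


def apply_triage_result(current: Priority, finding_suspected: bool,
                        triage_priority: Priority = Priority.URGENT) -> Priority:
    """Apply an AI triage result without ever downgrading a study.

    A positive result can only raise the study's priority (or leave it
    unchanged if it is already at or above the triage level); a negative
    result leaves the existing priority untouched.
    """
    if finding_suspected:
        return max(current, triage_priority)
    return current
```

The key design point, mirroring the text above, is the `max()`: the AI result can only add urgency, never remove it, so the standard-of-care workflow remains the floor.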
The following summarizes the acceptance criteria and the study demonstrating that the device meets them, based on the provided text:
Acceptance Criteria and Device Performance
| Finding | Slice Thickness Range | Operating Point | Sensitivity % (95% CI) | Specificity % (95% CI) |
|---|---|---|---|---|
| Acute subdural/epidural hematoma | >1.5mm & ≤5.0mm | 0.060177 | 82.4 (78.6, 86.1) | 89.6 (83.7, 94.8) |
| Acute subarachnoid hemorrhage | >1.5mm & ≤5.0mm | 0.020255 | 90.7 (86.3, 95.1) | 92.4 (86.7, 97.1) |
| | | 0.030010 | 87.4 (82.4, 91.8) | 96.2 (92.4, 99.0) |
| Intra-axial hemorrhage | >1.5mm & ≤5.0mm | 0.203600 | 93.4 (91.3, 95.1) | 85.1 (80.9, 88.9) |
| | | 0.322700 | 90.3 (87.9, 92.5) | 90.3 (86.8, 93.8) |
| Intraventricular hemorrhage | >1.5mm & ≤5.0mm | 0.008430 | 95.6 (91.2, 98.9) | 86.0 (78.5, 92.5) |
| | | 0.015487 | 92.3 (86.8, 96.7) | 89.2 (82.8, 94.6) |
| | | 0.051859 | 87.9 (80.2, 94.5) | 97.8 (94.6, 100.0) |
| Triage Turn-around Time (bench study) | N/A | N/A | Mean 81.6 seconds (95% CI: 80.3 - 82.9) | N/A |
The sensitivity and specificity results above are presented across different operating points and slice thickness ranges, demonstrating the device's performance for each specific finding. The submission states that these results demonstrate effective triage based on high sensitivity and specificity and that they are "substantially equivalent to those of the predicate device."
Specific acceptance criteria are not explicitly defined as pass/fail thresholds in the provided text. Instead, the reported performance metrics (sensitivity, specificity, and turn-around time) are presented as proof of meeting the requirements for 'Radiological computer aided triage and notification software' and supporting substantial equivalence.
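For readers unfamiliar with operating points, the sketch below shows, under assumed names, how a fixed threshold converts a continuous model score into a binary triage flag, and how sensitivity, specificity, and a 95% Wilson score interval are computed from case counts. The Wilson interval is one common choice for binomial confidence intervals; the submission does not state which interval method was used:

```python
import math


def classify(score: float, operating_point: float) -> bool:
    """Flag a study as positive when its model score meets the operating point."""
    return score >= operating_point


def sensitivity(tp: int, fn: int) -> float:
    """Fraction of truly positive cases the algorithm flags (TP / (TP + FN))."""
    return tp / (tp + fn)


def specificity(tn: int, fp: int) -> float:
    """Fraction of truly negative cases the algorithm clears (TN / (TN + FP))."""
    return tn / (tn + fp)


def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return center - half, center + half
```

Lowering the operating point trades specificity for sensitivity, which is why the table reports several operating points per finding.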
Study Information
Sample sizes used for the test set and the data provenance:
- Standalone Performance Evaluation (retrospective):
  - Total cases: 1,485 cases for slice thickness ≤1.5mm (1,003 positive, 482 negative) and 1,878 cases for slice thickness >1.5mm (1,257 positive, 621 negative).
  - Provenance: Collected consecutively from five US hospital network sites. The test dataset was newly acquired and independent from the training dataset.
- Triage Effectiveness Study (internal bench study):
  - Total cases: 277 cases positive for any of the findings eligible for prioritization.
  - Provenance: Collected from multiple data sources spanning a variety of geographical locations, patient demographics, and technical characteristics.
Number of experts used to establish the ground truth for the test set and the qualifications of those experts:
- Number of experts: At least two neuroradiologists, with a third used in case of disagreement.
- Qualifications: US board-certified, ABR-certified, and protocol-trained neuroradiologists.
Adjudication method for the test set:
- 2+1 adjudication: consensus determined by two ground truthers, with a third ground truther in the event of disagreement. Cases were annotated in a blinded fashion.
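The 2+1 adjudication scheme can be expressed as a small function. This is an illustrative sketch of the logic, not code from the submission:

```python
from typing import Optional


def adjudicate(reader1: bool, reader2: bool,
               tiebreaker: Optional[bool] = None) -> bool:
    """2+1 adjudication: two readers label the case; a third resolves disagreement.

    Each boolean is one reader's finding-present/absent label for a case.
    """
    if reader1 == reader2:
        return reader1          # the first two readers agree: consensus reached
    if tiebreaker is None:
        raise ValueError("readers disagree; a third reader's label is required")
    return tiebreaker           # the third reader breaks the tie
```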
Whether a multi-reader multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of human reader improvement with AI versus without AI assistance:
- No MRMC comparative effectiveness study is described in the provided text. The performance assessment focused on standalone performance and triage effectiveness (turn-around time).
Whether a standalone (i.e., algorithm only, without human-in-the-loop) performance evaluation was done:
- Yes. The case-level output from the device's AI algorithm was compared directly with the reference standard (ground truth).
The type of ground truth used (expert consensus, pathology, outcomes data, etc.):
- Expert consensus: the ground truth was determined by the consensus of multiple ABR-certified, protocol-trained neuroradiologists.
The sample size for the training set:
- "Over 200,000 CT brain imaging studies."
How the ground truth for the training set was established:
- The training dataset, containing over 200,000 CT brain imaging studies, "was labelled by trained radiologists regarding the presence of the four findings of interest."
§ 892.2080 Radiological computer aided triage and notification software.

(a) Identification. Radiological computer aided triage and notification software is an image processing prescription device intended to aid in prioritization and triage of radiological medical images. The device notifies a designated list of clinicians of the availability of time sensitive radiological medical images for review based on computer aided image analysis of those images performed by the device. The device does not mark, highlight, or direct users' attention to a specific location in the original image. The device does not remove cases from a reading queue. The device operates in parallel with the standard of care, which remains the default option for all cases.

(b) Classification. Class II (special controls). The special controls for this device are:

(1) Design verification and validation must include:

(i) A detailed description of the notification and triage algorithms and all underlying image analysis algorithms including, but not limited to, a detailed description of the algorithm inputs and outputs, each major component or block, how the algorithm affects or relates to clinical practice or patient care, and any algorithm limitations.

(ii) A detailed description of pre-specified performance testing protocols and dataset(s) used to assess whether the device will provide effective triage (e.g., improved time to review of prioritized images for pre-specified clinicians).

(iii) Results from performance testing that demonstrate that the device will provide effective triage. The performance assessment must be based on an appropriate measure to estimate the clinical effectiveness. The test dataset must contain sufficient numbers of cases from important cohorts (e.g., subsets defined by clinically relevant confounders, effect modifiers, associated diseases, and subsets defined by image acquisition characteristics) such that the performance estimates and confidence intervals for these individual subsets can be characterized with the device for the intended use population and imaging equipment.

(iv) Stand-alone performance testing protocols and results of the device.

(v) Appropriate software documentation (e.g., device hazard analysis; software requirements specification document; software design specification document; traceability analysis; description of verification and validation activities including system level test protocol, pass/fail criteria, and results).

(2) Labeling must include the following:

(i) A detailed description of the patient population for which the device is indicated for use;

(ii) A detailed description of the intended user and user training that addresses appropriate use protocols for the device;

(iii) Discussion of warnings, precautions, and limitations must include situations in which the device may fail or may not operate at its expected performance level (e.g., poor image quality for certain subpopulations), as applicable;

(iv) A detailed description of compatible imaging hardware, imaging protocols, and requirements for input images;

(v) Device operating instructions; and

(vi) A detailed summary of the performance testing, including: test methods, dataset characteristics, triage effectiveness (e.g., improved time to review of prioritized images for pre-specified clinicians), diagnostic accuracy of algorithms informing triage decision, and results with associated statistical uncertainty (e.g., confidence intervals), including a summary of subanalyses on case distributions stratified by relevant confounders, such as lesion and organ characteristics, disease stages, and imaging equipment.