K Number
K232436
Device Name
Rapid SDH
Manufacturer
iSchemaView, Inc.
Date Cleared
2023-10-25

(72 days)

Product Code
Regulation Number
892.2080
Panel
RA
Reference & Predicate Devices
N/A
Intended Use

Rapid SDH is a radiological computer-aided triage and notification software indicated for use in the triage and notification of hemispheric SDH in non-enhanced head images. The device is intended to assist trained radiologists in workflow triage by providing notification of suspected findings of hemispheric Subdural Hemorrhage (SDH) in head CT images. Rapid SDH uses an artificial intelligence algorithm to analyze images and highlight cases with suspected hemispheric SDH on a server or standalone desktop application in parallel to the ongoing standard of care image interpretation. The user is presented with notifications for cases with suspected hemispheric SDH findings; these notifications include compressed preview images, which are meant for informational purposes only and are not intended for diagnostic use beyond notification. The device does not alter the original medical image and is not intended to be used as a diagnostic device.

The results of Rapid SDH are intended to be used in conjunction with other patient information and based on professional judgment, to assist with triage/prioritization of medical images. Notified clinicians are responsible for viewing full images per the standard of care.

Device Description

Rapid SDH is a radiological computer-assisted triage and notification software device. The Rapid SDH module is a Non-Contrast Computed Tomography (NCCT) processing module which operates within the integrated Rapid Platform to provide triage and notification of suspected hemispheric subdural hemorrhage (SDH). The Rapid SDH module is an AI/ML module. The output of the module is a priority notification to clinicians indicating the suspicion of SDH based on positive findings. The Rapid SDH module uses the basic services supplied by the Rapid Platform including DICOM processing, job management, imaging module execution and imaging output including the notification and compressed image.
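For orientation only, the following is a minimal, hypothetical sketch of how a triage-and-notification module of this kind could be wired around a platform's DICOM services. Every name in it (run_triage, TriageResult, the model and notifier interfaces) is an illustrative assumption; it is not iSchemaView's code or API.

```python
# Hypothetical sketch of a computer-aided triage/notification flow.
# All interfaces here are illustrative assumptions, not Rapid Platform APIs.
from dataclasses import dataclass

import pydicom  # real library for reading DICOM files


@dataclass
class TriageResult:
    suspected_sdh: bool
    preview_png: bytes  # compressed, non-diagnostic preview image


def run_triage(dicom_paths: list[str], model, notifier) -> TriageResult:
    """Analyze a non-contrast head CT series and, if SDH is suspected,
    send a priority notification in parallel to the standard of care."""
    # 1. Ingest the DICOM series as received; the original images are never modified.
    series = [pydicom.dcmread(p) for p in dicom_paths]

    # 2. Run the AI/ML module on the series (assumed interface: returns a
    #    positive/negative flag plus a compressed preview image).
    suspected, preview = model.predict(series)
    result = TriageResult(suspected_sdh=suspected, preview_png=preview)

    # 3. Notify pre-specified clinicians for suspected-positive cases only;
    #    the case stays in the normal reading queue either way.
    if result.suspected_sdh:
        notifier.send(study_uid=series[0].StudyInstanceUID,
                      preview=result.preview_png)
    return result
```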

AI/ML Overview

Here's a breakdown of the acceptance criteria and the study demonstrating that the device meets them, based on the provided FDA 510(k) summary for iSchemaView, Inc.'s Rapid SDH:

Executive Summary of Device Purpose:
Rapid SDH is a radiological computer-aided triage and notification software that uses an AI algorithm to identify suspected hemispheric Subdural Hemorrhage (SDH) in non-enhanced head CT images. Its primary function is to assist radiologists in workflow triage by providing rapid notifications, not for diagnostic purposes.


1. Table of Acceptance Criteria and Reported Device Performance

Primary Endpoint
  • Acceptance Criterion (Performance Goal): Exceed the 80% performance goal (presumably for sensitivity, as it is the most critical metric for triage of potentially urgent cases)
  • Reported Device Performance (with 95% Confidence Interval):
    • Sensitivity: 0.924 (0.871 - 0.956)
    • Specificity: 0.987 (0.954 - 0.996)
    • ROC AUC (using Rapid SDH Volume estimate): 0.995 (0.986 - 1.0)

Secondary Endpoint
  • Acceptance Criterion (Performance Goal): Median processing time to clinician notification of 45 seconds
  • Reported Device Performance: Median processing time: 45 seconds (min: 33 seconds, max: 107 seconds)
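The summary reports point estimates and 95% confidence intervals but not the underlying confusion-matrix counts. The sketch below shows how such estimates and Wilson score intervals can be computed; the counts are assumptions chosen only to be roughly consistent with the 147-positive / 163-negative test set, not the study's actual counts.

```python
# Sketch: sensitivity/specificity point estimates with 95% Wilson score
# intervals. The counts are hypothetical (the 510(k) summary does not report
# the confusion matrix), chosen to be consistent with 147 positives / 163 negatives.
from math import sqrt


def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half


tp, fn = 136, 11   # hypothetical split of the 147 positive cases
tn, fp = 161, 2    # hypothetical split of the 163 negative cases

sensitivity = tp / (tp + fn)   # ~0.925 with these assumed counts
specificity = tn / (tn + fp)   # ~0.988 with these assumed counts
print(sensitivity, wilson_ci(tp, tp + fn))
print(specificity, wilson_ci(tn, tn + fp))
```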

2. Sample Size Used for the Test Set and Data Provenance

  • Sample Size: 310 samples (147 positive cases, 163 negative cases).
  • Data Provenance: Retrospective, multinational study. Specific countries are not listed, but various sites are named (e.g., Gradient, Riverside Regional Medical Center, Image Core Lab, Augusta University Medical Center, Ascension, D3, Segmed, Baptist, Hospital de Clinicas de POA, Stanford CA, Ospedale Regionale di Lugano, NYU, Flagler Hospital, MUSC).

3. Number of Experts Used to Establish Ground Truth and Qualifications

  • Number of Experts: Three (3)
  • Qualifications of Experts: Neuro-radiologists. No further details on years of experience or other specific qualifications are provided in this document.

4. Adjudication Method for the Test Set

The adjudication method used to establish ground truth is not explicitly stated in the document beyond "Truth was established using three (3) expert neuro-radiologists." Common schemes such as 2+1 or 3+1 (where dissenting reads go to a tie-breaker or consensus review) are not detailed; the wording implies a consensus approach among the three experts.
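For illustration, one common adjudication rule is a simple 2-of-3 majority among the readers. The sketch below shows that rule purely as an assumption; the study's actual method is not stated.

```python
# Sketch of one common ground-truth adjudication rule (2-of-3 majority among
# three readers). The 510(k) summary does not state which rule was used;
# this is an illustrative assumption only.
from collections import Counter


def adjudicate(reads: list[bool]) -> bool:
    """Return the majority label from an odd number of expert reads."""
    counts = Counter(reads)
    return counts[True] > counts[False]


# Example: readers 1 and 3 call SDH-positive, reader 2 calls negative.
print(adjudicate([True, False, True]))  # True -> case labeled SDH-positive
```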


5. Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study

  • Was an MRMC study done? The document does not indicate that a Multi-Reader Multi-Case (MRMC) comparative effectiveness study was done to evaluate how human readers improve with AI vs. without AI assistance. The study described focuses on the standalone performance of the AI algorithm.
  • Effect size of human reader improvement: Not applicable, as no MRMC study comparing human readers with and without AI assistance was reported.

6. Standalone Performance Study (Algorithm Only)

  • Was a standalone study done? Yes, the performance data presented is for the standalone (algorithm-only) performance of the Rapid SDH software in identifying SDH in CT scans. The primary endpoints (sensitivity, specificity, AUC) and secondary endpoint (processing time) are all metrics of the algorithm's performance without human intervention in the loop for the performance evaluation itself.
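Because the reported AUC was computed using the Rapid SDH volume estimate as the continuous score, a standalone evaluation of this kind reduces to scoring each case with the algorithm and comparing against the expert-consensus labels. The sketch below illustrates that with placeholder arrays and scikit-learn; it is not the study's analysis code, and the numbers are not study data.

```python
# Sketch of a standalone (algorithm-only) ROC evaluation using a continuous
# score such as an estimated SDH volume. Arrays are placeholders only.
import numpy as np
from sklearn.metrics import roc_auc_score

y_true = np.array([1, 1, 0, 0, 1, 0])                    # expert-consensus labels
volume_ml = np.array([42.0, 8.5, 0.0, 1.2, 15.3, 0.4])   # algorithm volume estimates

auc = roc_auc_score(y_true, volume_ml)
print(f"ROC AUC: {auc:.3f}")
```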

7. Type of Ground Truth Used

  • Type of Ground Truth: Expert consensus. The document states: "Truth was established using three (3) expert neuro-radiologists." This indicates that the ground truth labels for the presence or absence of SDH were determined by the agreement of these medical professionals.

8. Sample Size for the Training Set

  • The document does not specify the sample size used for the training set. It only describes the test set used for performance validation.

9. How Ground Truth for the Training Set Was Established

  • The document does not detail how the ground truth for the training set was established. It only describes the ground truth establishment for the test set. Given it's an AI/ML module, it's highly likely that a similar expert review process would have been used for training data, but it's not explicitly stated.

§ 892.2080 Radiological computer aided triage and notification software.

(a) Identification. Radiological computer aided triage and notification software is an image processing prescription device intended to aid in prioritization and triage of radiological medical images. The device notifies a designated list of clinicians of the availability of time sensitive radiological medical images for review based on computer aided image analysis of those images performed by the device. The device does not mark, highlight, or direct users' attention to a specific location in the original image. The device does not remove cases from a reading queue. The device operates in parallel with the standard of care, which remains the default option for all cases.

(b) Classification. Class II (special controls). The special controls for this device are:

(1) Design verification and validation must include:

(i) A detailed description of the notification and triage algorithms and all underlying image analysis algorithms including, but not limited to, a detailed description of the algorithm inputs and outputs, each major component or block, how the algorithm affects or relates to clinical practice or patient care, and any algorithm limitations.

(ii) A detailed description of pre-specified performance testing protocols and dataset(s) used to assess whether the device will provide effective triage (e.g., improved time to review of prioritized images for pre-specified clinicians).

(iii) Results from performance testing that demonstrate that the device will provide effective triage. The performance assessment must be based on an appropriate measure to estimate the clinical effectiveness. The test dataset must contain sufficient numbers of cases from important cohorts (e.g., subsets defined by clinically relevant confounders, effect modifiers, associated diseases, and subsets defined by image acquisition characteristics) such that the performance estimates and confidence intervals for these individual subsets can be characterized with the device for the intended use population and imaging equipment.

(iv) Stand-alone performance testing protocols and results of the device.

(v) Appropriate software documentation (e.g., device hazard analysis; software requirements specification document; software design specification document; traceability analysis; description of verification and validation activities including system level test protocol, pass/fail criteria, and results).

(2) Labeling must include the following:

(i) A detailed description of the patient population for which the device is indicated for use;

(ii) A detailed description of the intended user and user training that addresses appropriate use protocols for the device;

(iii) Discussion of warnings, precautions, and limitations must include situations in which the device may fail or may not operate at its expected performance level (e.g., poor image quality for certain subpopulations), as applicable;

(iv) A detailed description of compatible imaging hardware, imaging protocols, and requirements for input images;

(v) Device operating instructions; and

(vi) A detailed summary of the performance testing, including: test methods, dataset characteristics, triage effectiveness (e.g., improved time to review of prioritized images for pre-specified clinicians), diagnostic accuracy of algorithms informing triage decision, and results with associated statistical uncertainty (e.g., confidence intervals), including a summary of subanalyses on case distributions stratified by relevant confounders, such as lesion and organ characteristics, disease stages, and imaging equipment.