InferRead CT Stroke.AI is a radiological computer aided triage and notification software for use in the analysis of Non-Enhanced Head CT images. The device is intended to assist hospital networks and trained radiologists in workflow triage by flagging suspected positive findings of intracranial hemorrhage (ICH).
InferRead CT Stroke.AI uses an artificial intelligence algorithm to analyze images and highlight cases with detected ICH on a standalone desktop application in parallel to the ongoing standard of care image interpretation. The user is presented with a worklist with marked cases of suspected ICH findings. The device does not alter the original medical image, does not remove cases from queue, and is not intended to be used as a diagnostic device. If the clinician does not view the case, or if a case is not flagged, cases remain to be processed per the standard of care.
The results of InferRead CT Stroke.AI are intended to be used in conjunction with other patient information and based on professional judgment, to assist with triage/prioritization of medical images. Notified clinicians are responsible for viewing full images per the standard of care.
InferRead CT Stroke.AI is a radiological computer-assisted triage and notification software device. The software device is a computer program with a deep learning algorithm running on Ubuntu operating system. The device can be deployed as an onsite server in the hospital and the user interacts with the software from a client workstation. The device can be broken down into 4 modules, the NeoViewer, Docking Toolbox, RePACS, and DLServer.
The Docking Toolbox module receives DICOM series and inspects each series against a list of requirements. Series that pass the requirements are sent into the system for intracranial hemorrhage prediction. Series are processed in first-in, first-out order. When hemorrhage is detected, the system marks the case in the worklist, prompting the user to conduct preemptive triage and prioritization.
When the user refreshes the page, cases with suspected findings are marked with an indicator. Cases are identified by fields such as patient name and patient ID. A preview is available but is not intended for primary diagnosis, and a radiologist must review the case per their standard process. The flagged cases assist in triaging intracranial hemorrhage cases sooner than standard of care practice alone.
Here's a breakdown of the acceptance criteria and the study demonstrating that the device meets them, based on the provided FDA 510(k) summary for InferRead CT Stroke.AI:
1. Table of Acceptance Criteria and Reported Device Performance
The acceptance criteria were implicitly defined by the null hypothesis and target performance goals for sensitivity and specificity: the study aimed to demonstrate statistically significant performance above an 80% threshold for both metrics (an illustrative verification sketch follows the metrics list below).
| Metric | Acceptance Criteria (Lower Bound 95% CI) | Reported Device Performance (Value with 95% CI) |
|---|---|---|
| Sensitivity | > 80% | 0.916 (95% CI: 0.867-0.951) |
| Specificity | > 80% | 0.922 (95% CI: 0.872-0.957) |
Additional Performance Metrics Reported:
- Area Under the Receiver Operating Characteristic Curve (AUC): 0.962
- InferRead Time-to-Notification: 1.07 ± 0.57 minutes (mean ± SD)
- Standard of Care Time-to-Open-Exam: 75.4 ± 192.7 minutes (mean ± SD)
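As a rough illustration of how these point estimates, exact 95% confidence intervals, and the one-sided test against the 80% performance goal could be reproduced, the sketch below uses `scipy.stats.binomtest` (assumed SciPy ≥ 1.7). The true-positive and true-negative counts are back-calculated approximations from the reported 369 studies and 51.5% ICH prevalence; they are not stated in the summary.

```python
# Hedged verification sketch: sensitivity/specificity with exact (Clopper-Pearson)
# 95% CIs and a one-sided exact binomial test against the 80% performance goal.
# The counts below are approximate reconstructions, NOT values from the summary.
from scipy.stats import binomtest  # requires SciPy >= 1.7

def proportion_with_ci(successes: int, total: int, goal: float = 0.80):
    """Point estimate, exact 95% CI, and one-sided p-value for H0: proportion <= goal."""
    test = binomtest(successes, total, p=goal, alternative="greater")
    ci = test.proportion_ci(confidence_level=0.95, method="exact")
    return successes / total, (ci.low, ci.high), test.pvalue

# Assumed illustrative counts (~190 ICH-positive, ~179 ICH-negative studies):
sens, sens_ci, sens_p = proportion_with_ci(successes=174, total=190)
spec, spec_ci, spec_p = proportion_with_ci(successes=165, total=179)

print(f"Sensitivity {sens:.3f}, 95% CI ({sens_ci[0]:.3f}-{sens_ci[1]:.3f}), p={sens_p:.2e}")
print(f"Specificity {spec:.3f}, 95% CI ({spec_ci[0]:.3f}-{spec_ci[1]:.3f}), p={spec_p:.2e}")
```

With these assumed counts, the point estimates round to the reported 0.916 and 0.922, and both one-sided p-values fall well below 0.001, consistent with the stated rejection of the 80% null hypothesis.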
2. Sample Size Used for the Test Set and Data Provenance
- Sample Size: 369 non-contrast brain CT scans (studies).
- Data Provenance: Obtained from three hospitals in the U.S. The study was retrospective.
- Case Distribution: Approximately equal numbers of positive (51.5% with ICH) and negative (48.5% without ICH) cases.
3. Number of Experts Used to Establish the Ground Truth for the Test Set and Their Qualifications
- The document states that the ground truth was established by "trained neuro-radiologists."
- It does not specify the exact number of neuro-radiologists or their specific years of experience.
4. Adjudication Method for the Test Set
- The document does not explicitly describe an adjudication method (e.g., 2+1, 3+1). It only states that the ground truth was "established by trained neuro-radiologists." This implies some form of consensus reading or a single expert's definitive diagnosis, but the process is not detailed.
5. If a Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study was done
- No, an MRMC comparative effectiveness study was not explicitly described in terms of human readers improving with AI vs. without AI assistance.
- The study did compare the "InferRead time-to-notification" with the "standard of care time-to-open-exam," which suggests a comparison of workflow efficiency with the AI system's notification versus traditional worklist review.
- Effect Size (Time-to-Notification): InferRead CT Stroke.AI achieved a notification time of 1.07 ± 0.57 minutes, significantly faster than the standard of care time-to-open-exam of 75.4 ± 192.7 minutes (P < 0.001). This demonstrates a substantial reduction in the time to flag suspected cases (an illustrative comparison sketch follows below).
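The summary does not state which statistical test produced the P < 0.001 result. The sketch below shows one plausible approach, a Mann-Whitney U test (chosen here only because the standard-of-care times are heavily right-skewed, with SD 192.7 min far exceeding the mean of 75.4 min). The arrays are synthetic placeholders shaped to roughly match the reported means and SDs, not study data.

```python
# Hedged sketch: comparing AI notification times against standard-of-care
# time-to-open-exam. The data below are synthetic placeholders; the actual
# test used in the submission is not described in the summary.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
# Placeholder samples shaped to roughly match the reported mean/SD values.
ai_notification_min = rng.normal(loc=1.07, scale=0.57, size=369).clip(min=0.1)
soc_open_exam_min = rng.lognormal(mean=3.0, sigma=1.5, size=369)

stat, p_value = mannwhitneyu(ai_notification_min, soc_open_exam_min, alternative="less")
print(f"mean±SD  AI: {ai_notification_min.mean():.2f} ± {ai_notification_min.std(ddof=1):.2f} min")
print(f"mean±SD SOC: {soc_open_exam_min.mean():.2f} ± {soc_open_exam_min.std(ddof=1):.2f} min")
print(f"Mann-Whitney U p-value (AI < SOC): {p_value:.3g}")
```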
6. If a Standalone (i.e., algorithm only without human-in-the-loop performance) was done
- Yes, a standalone performance evaluation was conducted. The reported sensitivity (0.916) and specificity (0.922) are for the InferRead deep learning algorithm's performance in detecting ICH, independent of human interpretation or intervention at the point of evaluation.
- The study assessed the algorithm's output directly against the established ground truth.
7. The Type of Ground Truth Used
- Expert Consensus: The ground truth was established by "trained neuro-radiologists" in the detection of intracranial hemorrhage (ICH). It appears to be based on expert radiological review of the CT images.
8. The Sample Size for the Training Set
- The document does not provide the sample size for the training set. It only mentions that the device uses an "artificial intelligence algorithm with database of images" for training.
9. How the Ground Truth for the Training Set Was Established
- The document does not describe how the ground truth for the training set was established. It only states that the algorithm was "trained on medical images."
Summary of Missing Information:
The document provides good detail on the test set performance and a general statement about ground truth establishment for the test set ("trained neuro-radiologists"). However, it lacks specific information regarding:
- The number and specific qualifications/experience of the experts establishing ground truth for the test set.
- The adjudication method used for the test set ground truth.
- Any details about the training set, including its size and how its ground truth was established.
[Image: logos of the U.S. Department of Health & Human Services and the U.S. Food & Drug Administration]
August 12, 2021
Infervision Medical Technology Co., Ltd.
% Mr. Matt Deng
Director, Infervision US, Inc.
1900 Market Street
Philadelphia, PA 19103
Re: K211179
Trade/Device Name: InferRead CT Stroke.AI
Regulation Number: 21 CFR 892.2080
Regulation Name: Radiological computer aided triage and notification software
Regulatory Class: Class II
Product Code: QAS
Dated: July 8, 2021
Received: July 12, 2021
Dear Mr. Deng:
We have reviewed your Section 510(k) premarket notification of intent to market the device referenced above and have determined the device is substantially equivalent (for the indications for use stated in the enclosure) to legally marketed predicate devices marketed in interstate commerce prior to May 28, 1976, the enactment date of the Medical Device Amendments, or to devices that have been reclassified in accordance with the provisions of the Federal Food, Drug, and Cosmetic Act (Act) that do not require approval of a premarket approval application (PMA). You may, therefore, market the device, subject to the general controls provisions of the Act. Although this letter refers to your product as a device, please be aware that some cleared products may instead be combination products. The 510(k) Premarket Notification Database located at https://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfpmn/pmn.cfm identifies combination product submissions.

The general controls provisions of the Act include requirements for annual registration, listing of devices, good manufacturing practice, labeling, and prohibitions against misbranding and adulteration. Please note: CDRH does not evaluate information related to contract liability warranties. We remind you, however, that device labeling must be truthful and not misleading.
If your device is classified (see above) into either class II (Special Controls) or class III (PMA), it may be subject to additional controls. Existing major regulations affecting your device can be found in the Code of Federal Regulations, Title 21, Parts 800 to 898. In addition, FDA may publish further announcements concerning your device in the Federal Register.
Please be advised that FDA's issuance of a substantial equivalence determination does not mean that FDA has made a determination that your device complies with other requirements of the Act or any Federal statutes and regulations administered by other Federal agencies. You must comply with all the Act's requirements, including, but not limited to: registration and listing (21 CFR Part 807); labeling (21 CFR Part 801 and Part 809); medical device reporting (reporting of medical device-related adverse events) (21 CFR Part 803) for devices or postmarketing safety reporting (21 CFR 4, Subpart B) for combination products (see https://www.fda.gov/combination-products/guidance-regulatory-information/postmarketing-safety-reporting-combination-products); good manufacturing practice requirements as set forth in the quality systems (QS) regulation (21 CFR Part 820) for devices or current good manufacturing practices (21 CFR 4, Subpart A) for combination products; and, if applicable, the electronic product radiation control provisions (Sections 531-542 of the Act); 21 CFR 1000-1050.
Also, please note the regulation entitled, "Misbranding by reference to premarket notification" (21 CFR Part 807.97). For questions regarding the reporting of adverse events under the MDR regulation (21 CFR Part 803), please go to https://www.fda.gov/medical-device-safety/medical-device-reporting-mdr-how-report-medical-device-problems.
For comprehensive regulatory information about medical devices and radiation-emitting products, including information about labeling regulations, please see Device Advice (https://www.fda.gov/medical-devices/device-advice-comprehensive-regulatory-assistance) and CDRH Learn (https://www.fda.gov/training-and-continuing-education/cdrh-learn). Additionally, you may contact the Division of Industry and Consumer Education (DICE) to ask a question about a specific regulatory topic. See the DICE website (https://www.fda.gov/medical-device-advice-comprehensive-regulatory-assistance/contact-us-division-industry-and-consumer-education-dice) for more information or contact DICE by email (DICE@fda.hhs.gov) or phone (1-800-638-2041 or 301-796-7100).
Sincerely,

For

Thalia T. Mills, Ph.D.
Director
Division of Radiological Health
OHT7: Office of In Vitro Diagnostics and Radiological Health
Office of Product Evaluation and Quality
Center for Devices and Radiological Health
Enclosure
Indications for Use
510(k) Number (if known) K211179
Device Name InferRead CT Stroke.AI
Indications for Use (Describe)
InferRead CT Stroke.AI is a radiological computer aided triage and notification software for use in the analysis of Non-Enhanced Head CT images. The device is intended to assist hospital networks and trained radiologists in workflow triage by flagging suspected positive findings of intracranial hemorrhage (ICH).
InferRead CT Stroke.AI uses an artificial intelligence algorithm to analyze images and highlight cases with detected ICH on a standalone desktop application in parallel to the ongoing standard of care image interpretation. The user is presented with a worklist with marked cases of suspected ICH findings. The device does not alter the original medical image, does not remove cases from queue, and is not intended to be used as a diagnostic device. If the clinician does not view the case, or if a case is not flagged, cases remain to be processed per the standard of care.
The results of InferRead CT Stroke.AI are intended to be used in conjunction with other patient information and based on professional judgment, to assist with triage/prioritization of medical images. Notified clinicians are responsible for viewing full images per the standard of care.
| Type of Use (Select one or both, as applicable) | |
|---|---|
| [X] Prescription Use (Part 21 CFR 801 Subpart D) | [] Over-The-Counter Use (21 CFR 801 Subpart C) |
CONTINUE ON A SEPARATE PAGE IF NEEDED.
This section applies only to requirements of the Paperwork Reduction Act of 1995.
DO NOT SEND YOUR COMPLETED FORM TO THE PRA STAFF EMAIL ADDRESS BELOW.
The burden time for this collection of information is estimated to average 79 hours per response, including the time to review instructions, search existing data sources, gather and maintain the data needed and complete and review the collection of information. Send comments regarding this burden estimate or any other aspect of this information collection, including suggestions for reducing this burden, to:
Department of Health and Human Services
Food and Drug Administration
Office of Chief Information Officer
Paperwork Reduction Act (PRA) Staff
PRAStaff@fda.hhs.gov
"An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB number."
510(k) Summary
Infervision Medical Technology Co., Ltd. K211179
This 510(k) Summary is in conformance with 21 CFR 807.92
| Submitter: | Infervision Medical Technology Co., Ltd., Room B401, 4th Floor, Building 1, No. 12 Shangdi Information Road, Haidian District, Beijing, 100085, China. Phone: +86 10-86462323 |
|---|---|
| Primary Contact: | Matt Deng; Email: matt.deng@infervision.ai; Phone: 919-886-6082 |
| Second Primary Contact: | Frank Wu; Email: frank.wu@infervision.ai; Phone: 857-988-1888 |
| Company Contact: | Xiaoyan Fan; Email: fxiaoyan@infervision.com; Phone: +86 13810508664 |
| Date Prepared: | April 8, 2021 |
| Device Name and Classification | |
| Trade Name: | InferRead CT Stroke.AI |
| Common Name: | Radiological computer aided triage and notification software |
| Classification: | Class II |
| Regulation Number: | 21 CFR 892.2080, Radiological computer aided triage and notification software |
| Classification Panel: | Radiology |
| Product Code: | QAS |
Predicate Device:
| Primary Predicate | |
|---|---|
| Trade Name | BriefCase |
| 510(k) Submitter/Holder | Aidoc |
| Class | Class II |
| Regulation Number | 21 CFR 892.2080 |
| Classification Panel | Radiology |
| Product Code | QAS |
Device Description
InferRead CT Stroke.AI is a radiological computer-assisted triage and notification software device. The software device is a computer program with a deep learning algorithm running on Ubuntu operating system. The device can be deployed as an onsite server in the hospital and the user interacts with the software from a client workstation. The device can be broken down into 4 modules, the NeoViewer, Docking Toolbox, RePACS, and DLServer.
The Docking Toolbox module receives DICOM series and inspects each series against a list of requirements. Series that pass the requirements are sent into the system for intracranial hemorrhage prediction. Series are processed in first-in, first-out order. When hemorrhage is detected, the system marks the case in the worklist, prompting the user to conduct preemptive triage and prioritization.
When the user refreshes the page, cases with suspected findings are marked with an indicator. Cases are identified by fields such as patient name and patient ID, and the user may filter and sort the worklist by suspected ICH to locate flagged cases. A preview is available but is not intended for primary diagnosis, and a radiologist must review the case per their standard process. The flagged cases assist in triaging intracranial hemorrhage cases sooner than standard of care practice alone.
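A minimal sketch of the intake-and-flag flow described above is given below. The requirement checks, data fields, and the `run_ich_model` placeholder are illustrative assumptions for exposition; they are not the vendor's actual module interfaces or requirement list.

```python
# Hedged sketch: a received series is checked against basic intake requirements,
# queued first-in-first-out for inference, and flagged in a worklist when ICH is
# suspected. All names and checks here are illustrative assumptions.
from collections import deque
from dataclasses import dataclass

@dataclass
class Series:
    patient_name: str
    patient_id: str
    modality: str          # e.g. "CT"
    body_part: str         # e.g. "HEAD"
    contrast_used: bool
    suspected_ich: bool = False

def passes_intake_requirements(s: Series) -> bool:
    """Illustrative checks only: a non-contrast CT of the head."""
    return s.modality == "CT" and s.body_part == "HEAD" and not s.contrast_used

def run_ich_model(s: Series) -> bool:
    """Stand-in for the deep-learning prediction; replace with the real model call."""
    return False  # stub: no finding

def triage(incoming: list[Series]) -> list[Series]:
    """Check, queue first-in-first-out, run the model, and flag suspected cases."""
    queue = deque(s for s in incoming if passes_intake_requirements(s))
    worklist: list[Series] = []
    while queue:
        series = queue.popleft()                      # FIFO processing order
        series.suspected_ich = run_ich_model(series)  # flag only; image is never altered
        worklist.append(series)
    return worklist                                   # nothing is removed from the worklist
```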
Intended Use/Indications for Use
InferRead CT Stroke.AI is a radiological computer aided triage and notification software for use in the analysis of non-enhanced head CT images. The device is intended to assist hospital networks and trained radiologists in workflow triage by flagging suspected positive findings of intracranial hemorrhage (ICH).
InferRead CT Stroke.AI uses an artificial intelligence algorithm to analyze images and highlight cases with detected ICH on a standalone desktop application in parallel to the ongoing standard of care image interpretation. The user is presented with a worklist with marked cases of suspected ICH findings. The device does not alter the original medical image, does not remove cases from queue, and is not intended to be used as a diagnostic device. If the clinician does not view the case, or if a case is not flagged, cases remain to be processed per the standard of care.
The results of InferRead CT Stroke.AI are intended to be used in conjunction with other patient information and based on professional judgment, to assist with triage/prioritization of medical images. Notified clinicians are responsible for viewing full images per the standard of care.
Comparison of Technological Characteristics
The subject and predicate devices are radiological computer-assisted triage and notification software. Both are software packages incorporating artificial intelligence algorithms for use with CT scanners, PACS, and workstations. Both devices process images to aid in prioritization and triage of non-enhanced head CT cases with intracranial hemorrhage. Both have the same intended use and indications for use: flagging suspected cases and indicating them to the clinician for review.
The predicate device sends pop-up notifications and compressed previews to the radiologists' workstations. The subject device does not send pop-up notifications as the predicate does; instead, to fulfill the notification function, it visually marks the case in the worklist, indicating to the radiologist the need to review those images for ICH. For both devices, the user must be alert and receptive to the outputs of the device. Like the predicate device, the subject device works in parallel to the standard of care. The indication prompts preemptive triage of the flagged case, where the radiologist may decide to perform an evaluation. Similarly, if the notification is rejected, the case remains in the standard queue to be handled per the standard of care.
The subject device provides a viewer on the workstation allowing the radiologist to preview the DICOM series, similar to the compressed preview provided by the predicate device. This viewer allows the user to scroll through the series. As with the predicate, the preview is for informational purposes only and not for diagnostic use. Notified clinicians are responsible for using the local imaging system to view the original images and for engaging the referring clinician for diagnosis and treatment decisions.
Both the subject and predicate software utilize a deep learning algorithm trained on medical images, and the subject device raises the same type of safety and effectiveness questions as the predicate: accurate detection of intracranial hemorrhage within the study on which a physician can base a clinically useful triage/prioritization assessment considering all available clinical information. Like the predicate, the subject device does not remove cases from the reading queue. Both devices operate in parallel with the standard of care, which remains the default option for all cases.
Detailed Comparison of the Subject and Predicate Devices

| Item | InferRead CT Stroke.AI (Subject Device) | Aidoc BriefCase ICH (K180647) (Predicate Device) | Comparison |
|---|---|---|---|
| Intended Use / Indications for Use | InferRead CT Stroke.AI is a radiological computer aided triage and notification software for use in the analysis of Non-Enhanced Head CT images. The device is intended to assist hospital networks and trained radiologists in workflow triage by flagging suspected positive findings of intracranial hemorrhage (ICH). InferRead CT Stroke.AI uses an artificial intelligence algorithm to analyze images and highlight cases with detected ICH on a standalone desktop application in parallel to the ongoing standard of care image interpretation. The user is presented with a worklist with marked cases of suspected ICH findings. The device does not alter the original medical image, does not remove cases from queue, and is not intended to be used as a diagnostic device. If the clinician does not view the case, or if a case is not flagged, cases remain to be processed per the standard of care. The results of InferRead CT Stroke.AI are intended to be used in conjunction with other patient information and based on professional judgment, to assist with triage/prioritization of medical images. Notified clinicians are responsible for viewing full images per the standard of care. | BriefCase is a radiological computer aided triage and notification software indicated for use in the analysis of non-enhanced head CT images. The device is intended to assist hospital networks and trained radiologists in workflow triage by flagging and communication of suspected positive findings of pathologies in head CT images, namely Intracranial Hemorrhage (ICH). BriefCase uses an artificial intelligence algorithm to analyze images and highlight cases with detected ICH on a standalone desktop application in parallel to the ongoing standard of care image interpretation. The user is presented with notifications for cases with suspected ICH findings. Notifications include compressed preview images that are meant for informational purposes only and not intended for diagnostic use beyond notification. The device does not alter the original medical image and is not intended to be used as a diagnostic device. The results of BriefCase are intended to be used in conjunction with other patient information and based on professional judgment, to assist with triage/prioritization of medical images. Notified clinicians are responsible for viewing full images per the standard of care. | InferRead CT Stroke.AI and the previously cleared BriefCase (K180647) have the same intended use and indications for use in terms of finding suspected intracranial hemorrhage in non-contrast head CT, flagging suspected cases, and indicating the case to the attention of the clinician. Both are designed to be used by the radiologist and prompt the radiologist to start preemptive triage of a flagged case. |
| User Population | Radiologist | Radiologist | Both are indicated for use in analysis of non-enhanced head CT. |
| Anatomical Region of Interest | Head | Head | |
| Data Acquisition Protocol | Non-contrast CT scan of the head | Non-contrast CT scan of the head or neck | |
| View DICOM data | DICOM information about the patient, study and current image | DICOM information about the patient, study and current image | Both display DICOM information for informational purposes only. |
| Segmentation of region of interest | No; device does not mark, highlight, or direct users' attention to a specific location in the original image | No; device does not mark, highlight, or direct users' attention to a specific location in the original image | Neither marks, highlights, or directs attention to a specific location in the original image. |
| Algorithm | Artificial intelligence algorithm with database of images | Artificial intelligence algorithm with database of images | Both use an artificial intelligence algorithm with a database of images. |
| Notification / Prioritization | Yes, case-level indicator | Yes, pop-up notifications, case-level indicator | In both, the suspected cases are indicated to the user. The subject device provides a case-level indicator and allows the user to sort suspected cases to the top. |
| Preview Images | Presentation of a preview of the study for initial assessment, not meant for diagnostic purposes. The device operates in parallel with the standard of care, which remains the default option for all cases. | Presentation of a preview of the study for initial assessment, not meant for diagnostic purposes. The device operates in parallel with the standard of care, which remains the default option for all cases. | Both allow the user to view the image. The device is intended to work in parallel with the standard of care. |
| Alteration of original image | No | No | Neither alters the original image. |
| Removal of cases from worklist queue | No | No | Neither removes cases from the worklist queue. |
Performance Data
Infervision conducted a retrospective study to assess the clinical performance and notification functionality of the InferRead CT Stroke.AI software. The study evaluated the InferRead deep learning algorithm in terms of sensitivity and specificity with respect to a ground truth, as established by trained neuro-radiologists, in the detection of intracranial hemorrhage (ICH) in the brain. In addition, the study reported and compared the InferRead time-to-notification and the time-to-open-exam for the standard of care. The InferRead time-to-notification includes the time to receive the DICOM scan, analyze it, and display the results in the worklist application. The standard of care time-to-open-exam consisted of the time from the initial scan of the patient to when the radiologist first opens the exam for review.
A total of 369 non-contrast brain CT scans (studies) were obtained from three hospitals in the U.S. There were approximately equal numbers of positive and negative cases (51.5% of images with ICH and 48.5% without ICH, respectively) included in the analysis. Comparing the InferRead software output to the ground truth, the sensitivity and specificity of InferRead CT Stroke.AI are 0.916 (95% CI: 0.867-0.951) and 0.922 (95% CI: 0.872-0.957), which are significantly higher than the 80% null hypothesis (p values < 0.001). This study met the pre-specified performance goals of 80% for sensitivity and specificity.
In addition, the area under the receiver operating characteristic curve (AUC) was 0.962, demonstrating the clinical utility and potential benefits of the InferRead software based on the imaging study results.
The InferRead time-to-notification is 1.07 ± 0.57 (mean ± SD) minutes, which is substantially lower than the standard of care time-to-open-exam of 75.4 ± 192.7 minutes (P < 0.001). This validation study shows that InferRead CT Stroke.AI is both safe and effective.
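For illustration only, the sketch below shows how an AUC of this kind could be computed from per-study ground-truth labels and algorithm confidence scores using `sklearn.metrics.roc_auc_score`. The labels and scores are synthetic placeholders; the summary reports only the resulting AUC of 0.962, not the underlying scores.

```python
# Hedged sketch: computing an ROC AUC from per-study labels and scores.
# Counts approximate the reported 51.5%/48.5% split of 369 studies; scores are synthetic.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n_pos, n_neg = 190, 179
y_true = np.concatenate([np.ones(n_pos), np.zeros(n_neg)])
# Placeholder scores drawn so positives generally score higher than negatives.
y_score = np.concatenate([rng.beta(8, 2, n_pos), rng.beta(2, 8, n_neg)])

print(f"AUC = {roc_auc_score(y_true, y_score):.3f}")
```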
Conclusions
Based on the technological comparison, the major difference is that the subject device marks the suspected case in the worklist instead of sending a pop-up notification. In both cases, the user must be receptive to the visual outputs of the device. After comparison, we do not find that this difference raises different questions of safety or effectiveness. In the same way as the predicate, the subject device also uses a deep learning algorithm to process images and predict intracranial hemorrhage. The intended use of the two devices is the same.
The performance testing demonstrated that the subject device performs similarly to the predicate. These results show that the subject device has no significant differences from, and is as safe and effective as, the legally marketed predicate device.
§ 892.2080 Radiological computer aided triage and notification software.

(a) Identification. Radiological computer aided triage and notification software is an image processing prescription device intended to aid in prioritization and triage of radiological medical images. The device notifies a designated list of clinicians of the availability of time sensitive radiological medical images for review based on computer aided image analysis of those images performed by the device. The device does not mark, highlight, or direct users' attention to a specific location in the original image. The device does not remove cases from a reading queue. The device operates in parallel with the standard of care, which remains the default option for all cases.

(b) Classification. Class II (special controls). The special controls for this device are:

(1) Design verification and validation must include:

(i) A detailed description of the notification and triage algorithms and all underlying image analysis algorithms including, but not limited to, a detailed description of the algorithm inputs and outputs, each major component or block, how the algorithm affects or relates to clinical practice or patient care, and any algorithm limitations.

(ii) A detailed description of pre-specified performance testing protocols and dataset(s) used to assess whether the device will provide effective triage (e.g., improved time to review of prioritized images for pre-specified clinicians).

(iii) Results from performance testing that demonstrate that the device will provide effective triage. The performance assessment must be based on an appropriate measure to estimate the clinical effectiveness. The test dataset must contain sufficient numbers of cases from important cohorts (e.g., subsets defined by clinically relevant confounders, effect modifiers, associated diseases, and subsets defined by image acquisition characteristics) such that the performance estimates and confidence intervals for these individual subsets can be characterized with the device for the intended use population and imaging equipment.

(iv) Stand-alone performance testing protocols and results of the device.

(v) Appropriate software documentation (e.g., device hazard analysis; software requirements specification document; software design specification document; traceability analysis; description of verification and validation activities including system level test protocol, pass/fail criteria, and results).

(2) Labeling must include the following:

(i) A detailed description of the patient population for which the device is indicated for use;

(ii) A detailed description of the intended user and user training that addresses appropriate use protocols for the device;

(iii) Discussion of warnings, precautions, and limitations must include situations in which the device may fail or may not operate at its expected performance level (e.g., poor image quality for certain subpopulations), as applicable;

(iv) A detailed description of compatible imaging hardware, imaging protocols, and requirements for input images;

(v) Device operating instructions; and

(vi) A detailed summary of the performance testing, including: test methods, dataset characteristics, triage effectiveness (e.g., improved time to review of prioritized images for pre-specified clinicians), diagnostic accuracy of algorithms informing triage decision, and results with associated statistical uncertainty (e.g., confidence intervals), including a summary of subanalyses on case distributions stratified by relevant confounders, such as lesion and organ characteristics, disease stages, and imaging equipment.