K Number
K210666
Device Name
Chest-CAD
Date Cleared
2021-07-20

(137 days)

Product Code
MYN
Regulation Number
892.2070
Intended Use
Chest-CAD is a computer-assisted detection (CADe) software device that analyzes chest radiograph studies using machine learning techniques to identify, categorize, and highlight suspicious regions of interest (ROI). Any suspicious ROI identified by Chest-CAD is assigned to one of the following categories: Cardiac, Mediastinum/Hila, Lungs, Pleura, Bones, Soft Tissues, Hardware, or Other. The device is intended for use as a concurrent reading aid for physicians. Chest-CAD is indicated for adults only.
Device Description
Chest-CAD is a computer-assisted detection (CADe) software device designed to assist physicians in identifying suspicious regions of interest (ROIs) in adult chest X-rays. Suspicious ROIs identified by Chest-CAD are assigned to one of the following categories: Cardiac, Mediastinum/Hila, Lungs, Pleura, Bones, Soft Tissues, Hardware, or Other. Chest-CAD detects suspicious ROIs by analyzing radiographs using deep learning algorithms for computer vision and provides relevant annotations to assist physicians with their interpretations. For each image within a study, Chest-CAD generates a DICOM Presentation State file (output overlay). If any suspicious ROI is detected by Chest-CAD in the study, the output overlay for all images includes the text "ROI(s) Detected:" followed by a list of the category/categories for which suspicious ROI(s) were found, such as "Lungs, Bones". In addition, if suspicious ROI(s) are detected in the image, bounding boxes surrounding each detected suspicious ROI are included in the output overlay. If no suspicious ROI is detected by Chest-CAD in the study, the output overlay for each image will include the text "No ROI(s) Detected" and no bounding boxes will be included. Regardless of whether a suspicious ROI is detected, the overlay includes text identifying the X-ray study as analyzed by Chest-CAD and a customer configurable message containing a link to or instructions for users to access labeling. The Chest-CAD overlay can be toggled on or off by the physician within their Picture Archiving and Communication System (PACS) viewer, allowing for concurrent review of the X-ray study.
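The overlay logic described above reduces to a small set of rules. The following Python sketch illustrates them under stated assumptions: the category list and the overlay text come from the description, while the ROI data structure, function names, and output dictionary are hypothetical and are not Imagen's implementation (the cleared device encodes the overlay as a DICOM Presentation State object rather than a Python dict).

```python
from dataclasses import dataclass
from typing import List, Tuple

# Category names taken from the device description.
CATEGORIES = ["Cardiac", "Mediastinum/Hila", "Lungs", "Pleura",
              "Bones", "Soft Tissues", "Hardware", "Other"]

@dataclass
class ROI:
    category: str                              # one of CATEGORIES
    bbox: Tuple[float, float, float, float]    # (x_min, y_min, x_max, y_max) in image pixels

def build_overlay_text(study_rois: List[ROI]) -> str:
    """Study-level header text shown on every image in the study."""
    if not study_rois:
        return "No ROI(s) Detected"
    # List each detected category once, in the canonical order, e.g. "Lungs, Bones".
    found = [c for c in CATEGORIES if any(r.category == c for r in study_rois)]
    return "ROI(s) Detected: " + ", ".join(found)

def build_image_overlay(study_rois: List[ROI], image_rois: List[ROI]) -> dict:
    """Per-image overlay: study-level text plus bounding boxes for this image only."""
    return {
        "text": build_overlay_text(study_rois),
        "analyzed_by": "Analyzed by Chest-CAD",          # marks the study as processed
        "labeling_note": "<customer-configurable link or instructions to labeling>",
        "bounding_boxes": [r.bbox for r in image_rois],  # empty if nothing found in this image
    }
```

Because the header text is built from the study-level ROI list while the boxes come from the per-image list, every image in a study carries the same category summary, and boxes appear only on the images where suspicious ROIs were found, which matches the behavior described above.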
More Information

P000041

Not Found

AI/ML

Yes
The document explicitly states that the device uses "machine learning techniques" and "deep learning algorithms".

Therapeutic

No
Chest-CAD is a diagnostic software device that assists physicians in identifying suspicious regions of interest on chest radiographs; it does not provide therapy or treatment.

Diagnostic

No

Chest-CAD is explicitly stated to be a "computer-assisted detection (CADe) software device" intended as a "concurrent reading aid for physicians" to "identify, categorize, and highlight suspicious regions of interest." It does not make a final diagnosis itself, but rather assists the physician in their diagnostic process.

SaMD (Software as a Medical Device)

Yes

The device is described as a "software device" and its function is to analyze existing chest radiograph images and provide an overlay. There is no mention of any hardware component being part of the device itself. While it interacts with PACS viewers and processes images, these are external systems and data, not part of the device's hardware.

IVD (In Vitro Diagnostic)

No

Based on the provided information, this device is not an IVD (In Vitro Diagnostic).

Here's why:

  • IVD Definition: In Vitro Diagnostic devices are used to examine specimens (like blood, urine, tissue) taken from the human body to provide information for diagnosis, monitoring, or screening. This testing is performed outside the body (in vitro).
  • Chest-CAD's Function: Chest-CAD analyzes medical images (chest radiographs) that are acquired from the patient's body. It does not analyze biological specimens.
  • Intended Use: The intended use is as a "concurrent reading aid for physicians" to identify suspicious regions in chest X-rays. This is an aid to image interpretation, not a diagnostic test performed on a biological sample.

Therefore, Chest-CAD falls under the category of medical imaging software or a computer-assisted detection (CADe) device, not an In Vitro Diagnostic device.

PCCP Authorized

No
The letter does not explicitly state that the FDA has reviewed and approved or cleared a PCCP for this specific device.

Intended Use / Indications for Use

Chest-CAD is a computer-assisted detection (CADe) software device that analyzes chest radiograph studies using machine learning techniques to identify, categorize, and highlight suspicious regions of interest (ROI). Any suspicious ROI identified by Chest-CAD is assigned to one of the following categories: Cardiac, Mediastinum/Hila, Lungs, Pleura, Bones, Soft Tissues, Hardware, or Other. The device is intended for use as a concurrent reading aid for physicians. Chest-CAD is indicated for adults only.

Product codes

MYN

Device Description

Chest-CAD is a computer-assisted detection (CADe) software device designed to assist physicians in identifying suspicious regions of interest (ROIs) in adult chest X-rays. Suspicious ROIs identified by Chest-CAD are assigned to one of the following categories: Cardiac, Mediastinum/Hila, Lungs, Pleura, Bones, Soft Tissues, Hardware, or Other. Chest-CAD detects suspicious ROIs by analyzing radiographs using deep learning algorithms for computer vision and provides relevant annotations to assist physicians with their interpretations.

For each image within a study, Chest-CAD generates a DICOM Presentation State file (output overlay). If any suspicious ROI is detected by Chest-CAD in the study, the output overlay for all images includes the text "ROI(s) Detected:" followed by a list of the category/categories for which suspicious ROI(s) were found, such as "Lungs, Bones". In addition, if suspicious ROI(s) are detected in the image, bounding boxes surrounding each detected suspicious ROI are included in the output overlay. If no suspicious ROI is detected by Chest-CAD in the study, the output overlay for each image will include the text "No ROI(s) Detected" and no bounding boxes will be included. Regardless of whether a suspicious ROI is detected, the overlay includes text identifying the X-ray study as analyzed by Chest-CAD and a customer configurable message containing a link to or instructions for users to access labeling. The Chest-CAD overlay can be toggled on or off by the physician within their Picture Archiving and Communication System (PACS) viewer, allowing for concurrent review of the X-ray study.

Mentions image processing

Yes

Mentions AI, DNN, or ML

Yes

Input Imaging Modality

X-ray
Digital X-ray

Anatomical Site

Chest

Indicated Patient Age Range

Adults only.

Intended User / Care Setting

Physician
Care setting is not specified; the data sources (hospitals, outpatient centers, and specialty centers) suggest use in those settings.

Description of the training set, sample size, data source, and annotation protocol

Not Found

Description of the test set, sample size, data source, and annotation protocol

Bench Testing:
Sample size: 20,000 chest radiograph cases
Data source: 12 hospitals, outpatient centers, and specialty centers in the United States representative of the intended use population.
Annotation protocol: Not specified for the bench test set; ground-truth labels were presumably established for the performance evaluation.

Clinical Data (MRMC study):
Sample size: 238 cases
Data source: 9 hospitals, outpatient centers, and specialty centers in the United States.
Annotation protocol: Each case was previously evaluated by a panel of U.S. board-certified radiologists who assigned a ground truth binary label indicating the presence or absence of a suspicious ROI for each Chest-CAD category.

Summary of Performance Studies (study type, sample size, AUC, MRMC, standalone performance, key results)

Standalone Performance Assessment (Bench Testing)

  • Study type: Standalone performance assessment
  • Sample size: 20,000 chest radiograph cases
  • Key results:
    • Overall Sensitivity: 0.908 (95% Wilson's Confidence Interval: 0.905, 0.911; the Wilson score interval is sketched in code after this list)
    • Overall Specificity: 0.887 (95% Wilson's Confidence Interval: 0.885, 0.889)
    • Overall AUC of the Receiver Operating Characteristic (ROC) curve: 0.976 (95% Bootstrap Confidence Interval: 0.975, 0.976)
    • AUC per category: Cardiac (0.961), Mediastinum/Hila (0.921), Lungs (0.967), Pleura (0.973), Bones (0.930), Soft Tissues (0.981), Hardware (0.994), Other (0.953). Highest AUC for Hardware (0.994), lowest for Mediastinum/Hila (0.921).
    • Sensitivity per category: Cardiac (0.889), Mediastinum/Hila (0.856), Lungs (0.888), Pleura (0.919), Bones (0.854), Soft Tissues (0.938), Hardware (0.967), Other (0.906). Highest sensitivity for Hardware (0.967), lowest for Bones (0.854).
    • Specificity per category: Cardiac (0.892), Mediastinum/Hila (0.830), Lungs (0.915), Pleura (0.899), Bones (0.856), Soft Tissues (0.919), Hardware (0.960), Other (0.872). Highest specificity for Hardware (0.960), lowest for Mediastinum/Hila (0.830).
    • Free-Response ROC (FROC) curves were also estimated for each Chest-CAD category.
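The sensitivity and specificity bounds quoted above are 95% Wilson score intervals. Below is a minimal sketch of that calculation; the counts in the example are made up for illustration, since the submission does not report the underlying numbers of positive and negative cases.

```python
import math

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple:
    """95% Wilson score confidence interval for a binomial proportion."""
    if n == 0:
        raise ValueError("n must be positive")
    p_hat = successes / n
    denom = 1 + z**2 / n
    center = (p_hat + z**2 / (2 * n)) / denom
    half_width = (z / denom) * math.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2))
    return center - half_width, center + half_width

# Hypothetical example (the submission does not report the underlying counts):
# 9,080 true positives out of 10,000 positive cases gives a point estimate of
# 0.908 and a tight interval, similar in spirit to the overall sensitivity above.
lo, hi = wilson_interval(9080, 10000)
print(f"sensitivity ~ 0.908, 95% Wilson CI: ({lo:.3f}, {hi:.3f})")
```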

Clinical Data (Multiple Reader, Multiple Case - MRMC Retrospective Reader Study)

  • Study type: Fully-crossed multiple reader, multiple case (MRMC) retrospective reader study
  • Sample size: 24 clinical readers evaluated 238 cases
  • Key results:
    • The accuracy of readers aided by Chest-CAD ("Aided") was superior to the accuracy of readers when unaided by Chest-CAD ("Unaided") as determined by the case-level, across-category aggregate Area Under the Curve (AUC) of the Receiver Operating Characteristic (ROC) curve (Dorfman, Berbaum, and Metz (DBM) modeling approach).
    • Reader AUC estimates improved from 0.836 (95% Bootstrap CI: 0.816, 0.856) (Unaided) to 0.894 (95% Bootstrap CI: 0.879, 0.909) (Aided); a sketch of the rank-based AUC and bootstrap CI follows this list.
    • Reader sensitivity improved from 0.757 (95% Wilson's CI: 0.750, 0.764) (Unaided) to 0.856 (95% Wilson's CI: 0.850, 0.862) (Aided).
    • Reader specificity improved from 0.843 (95% Wilson's CI: 0.839, 0.847) (Unaided) to 0.870 (95% Wilson's CI: 0.866, 0.873) (Aided).
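Per the 510(k) summary, reader AUCs were estimated with Wilcoxon rank-sum scores and bootstrap confidence intervals (Beiden et al. 2000), with the aided-versus-unaided comparison made under the Dorfman-Berbaum-Metz (DBM) model. The sketch below shows only the rank-based (empirical) AUC with a simple case-level percentile bootstrap on made-up confidence scores; it omits the reader-and-case components-of-variance handling that the DBM/Beiden analysis adds.

```python
import random

def empirical_auc(pos_scores, neg_scores):
    """Rank-based (Wilcoxon/Mann-Whitney) estimate of the ROC AUC."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

def bootstrap_auc_ci(pos_scores, neg_scores, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap CI, resampling cases within each truth class."""
    rng = random.Random(seed)
    aucs = []
    for _ in range(n_boot):
        pos = [rng.choice(pos_scores) for _ in pos_scores]
        neg = [rng.choice(neg_scores) for _ in neg_scores]
        aucs.append(empirical_auc(pos, neg))
    aucs.sort()
    lo = aucs[int((alpha / 2) * n_boot)]
    hi = aucs[int((1 - alpha / 2) * n_boot) - 1]
    return empirical_auc(pos_scores, neg_scores), (lo, hi)

# Hypothetical reader confidence scores (0-100) for positive and negative cases.
pos = [80, 72, 91, 65, 88, 70, 95, 60]
neg = [40, 55, 30, 62, 45, 50, 35, 58]
auc, (lo, hi) = bootstrap_auc_ci(pos, neg)
print(f"AUC = {auc:.3f}, 95% bootstrap CI: ({lo:.3f}, {hi:.3f})")
```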

Key Metrics (Sensitivity, Specificity, PPV, NPV, etc.)

Standalone Performance:

  • Overall Sensitivity: 0.908 (95% Wilson's Confidence Interval: 0.905, 0.911)
  • Overall Specificity: 0.887 (95% Wilson's Confidence Interval: 0.885, 0.889)
  • Overall AUC: 0.976 (95% Bootstrap Confidence Interval: 0.975, 0.976)
  • Sensitivity by Category:
    • Cardiac: 0.889 (0.881, 0.897)
    • Mediastinum/Hila: 0.856 (0.844, 0.867)
    • Lungs: 0.888 (0.882, 0.893)
    • Pleura: 0.919 (0.912, 0.925)
    • Bones: 0.854 (0.838, 0.868)
    • Soft Tissues: 0.938 (0.916, 0.955)
    • Hardware: 0.967 (0.963, 0.970)
    • Other: 0.906 (0.889, 0.920)
  • Specificity by Category:
    • Cardiac: 0.892 (0.887, 0.897)
    • Mediastinum/Hila: 0.830 (0.824, 0.835)
    • Lungs: 0.915 (0.908, 0.921)
    • Pleura: 0.899 (0.894, 0.904)
    • Bones: 0.856 (0.850, 0.861)
    • Soft Tissues: 0.919 (0.916, 0.923)
    • Hardware: 0.960 (0.956, 0.964)
    • Other: 0.872 (0.867, 0.877)
  • AUC by Category:
    • Cardiac: 0.961
    • Mediastinum/Hila: 0.921
    • Lungs: 0.967
    • Pleura: 0.973
    • Bones: 0.930
    • Soft Tissues: 0.981
    • Hardware: 0.994
    • Other: 0.953

Clinical Reader Study (Aided vs. Unaided):

  • Reader AUC (Unaided): 0.836 (95% Bootstrap CI: 0.816, 0.856)
  • Reader AUC (Aided): 0.894 (95% Bootstrap CI: 0.879, 0.909)
  • Reader Sensitivity (Unaided): 0.757 (95% Wilson's CI: 0.750, 0.764)
  • Reader Sensitivity (Aided): 0.856 (95% Wilson's CI: 0.850, 0.862)
  • Reader Specificity (Unaided): 0.843 (95% Wilson's CI: 0.839, 0.847)
  • Reader Specificity (Aided): 0.870 (95% Wilson's CI: 0.866, 0.873)

Predicate Device(s)

P000041

Reference Device(s)

Not Found

Predetermined Change Control Plan (PCCP) - All Relevant Information

Not Found

§ 892.2070 Medical image analyzer.

(a) Identification. Medical image analyzers, including computer-assisted/aided detection (CADe) devices for mammography breast cancer, ultrasound breast lesions, radiograph lung nodules, and radiograph dental caries detection, is a prescription device that is intended to identify, mark, highlight, or in any other manner direct the clinicians' attention to portions of a radiology image that may reveal abnormalities during interpretation of patient radiology images by the clinicians. This device incorporates pattern recognition and data analysis capabilities and operates on previously acquired medical images. This device is not intended to replace the review by a qualified radiologist, and is not intended to be used for triage, or to recommend diagnosis.

(b) Classification. Class II (special controls). The special controls for this device are:

(1) Design verification and validation must include:
(i) A detailed description of the image analysis algorithms including a description of the algorithm inputs and outputs, each major component or block, and algorithm limitations.
(ii) A detailed description of pre-specified performance testing methods and dataset(s) used to assess whether the device will improve reader performance as intended and to characterize the standalone device performance. Performance testing includes one or more standalone tests, side-by-side comparisons, or a reader study, as applicable.
(iii) Results from performance testing that demonstrate that the device improves reader performance in the intended use population when used in accordance with the instructions for use. The performance assessment must be based on appropriate diagnostic accuracy measures (e.g., receiver operator characteristic plot, sensitivity, specificity, predictive value, and diagnostic likelihood ratio). The test dataset must contain a sufficient number of cases from important cohorts (e.g., subsets defined by clinically relevant confounders, effect modifiers, concomitant diseases, and subsets defined by image acquisition characteristics) such that the performance estimates and confidence intervals of the device for these individual subsets can be characterized for the intended use population and imaging equipment.
(iv) Appropriate software documentation (e.g., device hazard analysis; software requirements specification document; software design specification document; traceability analysis; description of verification and validation activities including system level test protocol, pass/fail criteria, and results; and cybersecurity).

(2) Labeling must include the following:
(i) A detailed description of the patient population for which the device is indicated for use.
(ii) A detailed description of the intended reading protocol.
(iii) A detailed description of the intended user and user training that addresses appropriate reading protocols for the device.
(iv) A detailed description of the device inputs and outputs.
(v) A detailed description of compatible imaging hardware and imaging protocols.
(vi) Discussion of warnings, precautions, and limitations must include situations in which the device may fail or may not operate at its expected performance level (e.g., poor image quality or for certain subpopulations), as applicable.
(vii) Device operating instructions.
(viii) A detailed summary of the performance testing, including: test methods, dataset characteristics, results, and a summary of sub-analyses on case distributions stratified by relevant confounders, such as lesion and organ characteristics, disease stages, and imaging equipment.


July 20, 2021


Imagen Technologies, Inc.
℅ Robert Lindsey, Ph.D.
Chief Science Officer
151 West 26th Street, 10th Floor
New York, NY 10001

Re: K210666

Trade/Device Name: Chest-CAD
Regulation Number: 21 CFR 892.2070
Regulation Name: Medical image analyzer
Regulatory Class: Class II
Product Code: MYN
Dated: June 9, 2021
Received: June 10, 2021

Dear Dr. Lindsey:

We have reviewed your Section 510(k) premarket notification of intent to market the device referenced above and have determined the device is substantially equivalent (for the indications for use stated in the enclosure) to legally marketed predicate devices marketed in interstate commerce prior to May 28, 1976, the enactment date of the Medical Device Amendments, or to devices that have been reclassified in accordance with the provisions of the Federal Food, Drug, and Cosmetic Act (Act) that do not require approval of a premarket approval application (PMA). You may, therefore, market the device, subject to the general controls provisions of the Act. Although this letter refers to your product as a device, please be aware that some cleared products may instead be combination products. The 510(k) Premarket Notification Database located at https://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfpmp/pmn.cfm identifies combination product submissions. The general controls provisions of the Act include requirements for annual registration, listing of devices, good manufacturing practice, labeling, and prohibitions against misbranding and adulteration. Please note: CDRH does not evaluate information related to contract liability warranties. We remind you, however, that device labeling must be truthful and not misleading.

If your device is classified (see above) into either class II (Special Controls) or class III (PMA), it may be subject to additional controls. Existing major regulations affecting your device can be found in the Code of Federal Regulations, Title 21, Parts 800 to 898. In addition, FDA may publish further announcements concerning your device in the Federal Register.

Please be advised that FDA's issuance of a substantial equivalence determination does not mean that FDA has made a determination that your device complies with other requirements of the Act or any Federal statutes and regulations administered by other Federal agencies. You must comply with all the Act's requirements, including, but not limited to: registration and listing (21 CFR Part 807); labeling (21 CFR Part 801); medical device reporting (reporting of medical device-related adverse events) (21 CFR 803) for devices or postmarketing safety reporting (21 CFR 4, Subpart B) for combination products (see https://www.fda.gov/combination-products/guidance-regulatory-information/postmarketing-safety-reporting-combination-products); good manufacturing practice requirements as set forth in the quality systems (QS) regulation (21 CFR Part 820) for devices or current good manufacturing practices (21 CFR 4, Subpart A) for combination products; and, if applicable, the electronic product radiation control provisions (Sections 531-542 of the Act); 21 CFR 1000-1050.

Also, please note the regulation entitled, "Misbranding by reference to premarket notification" (21 CFR Part 807.97). For questions regarding the reporting of adverse events under the MDR regulation (21 CFR Part 803), please go to https://www.fda.gov/medical-device-safety/medical-device-reporting-mdr-how-report-medical-device-problems.

For comprehensive regulatory information about medical devices and radiation-emitting products, including information about labeling regulations, please see Device Advice (https://www.fda.gov/medical-devices/device-advice-comprehensive-regulatory-assistance) and CDRH Learn (https://www.fda.gov/training-and-continuing-education/cdrh-learn). Additionally, you may contact the Division of Industry and Consumer Education (DICE) to ask a question about a specific regulatory topic. See the DICE website (https://www.fda.gov/medical-device-advice-comprehensive-regulatory-assistance/contact-us-division-industry-and-consumer-education-dice) for more information or contact DICE by email (DICE@fda.hhs.gov) or phone (1-800-638-2041 or 301-796-7100).

Sincerely,

For

Thalia T. Mills, Ph.D.
Director
Division of Radiological Health
OHT7: Office of In Vitro Diagnostics and Radiological Health
Office of Product Evaluation and Quality
Center for Devices and Radiological Health

Enclosure


Indications for Use

510(k) Number (if known) K210666

Device Name Chest-CAD

Indications for Use (Describe)

Chest-CAD is a computer-assisted detection (CADe) software device that analyzes chest radiograph studies using machine learning techniques to identify, categorize, and highlight suspicious regions of interest (ROI). Any suspicious ROI identified by Chest-CAD is assigned to one of the following categories: Cardiac, Mediastinum/Hila, Lungs, Pleura, Bones, Soft Tissues, Hardware, or Other. The device is intended for use as a concurrent reading aid for physicians. Chest-CAD is indicated for adults only.

Type of Use (Select one or both, as applicable)
Prescription Use (Part 21 CFR 801 Subpart D)
Over-The-Counter Use (21 CFR 801 Subpart C)

CONTINUE ON A SEPARATE PAGE IF NEEDED.

This section applies only to requirements of the Paperwork Reduction Act of 1995.

DO NOT SEND YOUR COMPLETED FORM TO THE PRA STAFF EMAIL ADDRESS BELOW.

The burden time for this collection of information is estimated to average 79 hours per response, including the time to review instructions, search existing data sources, gather and maintain the data needed and complete and review the collection of information. Send comments regarding this burden estimate or any other aspect of this information collection, including suggestions for reducing this burden, to:

Department of Health and Human Services
Food and Drug Administration
Office of Chief Information Officer
Paperwork Reduction Act (PRA) Staff
PRAStaff@fda.hhs.gov

"An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB number."


In accordance with 21 CFR 807.87(h) (and 21 CFR 807.92) the 510(k) Summary for Chest-CAD is provided below.

1. SUBMITTER

Applicant:
Imagen Technologies, Inc.
151 West 26th Street, Suite 1001
New York, NY 10001

Contact and Primary Correspondent:
Robert Lindsey, Ph.D.
Chief Science Officer
Imagen Technologies, Inc.
151 West 26th Street, Suite 1001
New York, NY 10001
917-830-4721
rob@imagen.ai

Secondary Correspondent:
Becky Ditty
Consultant
Biologics Consulting
1555 King St., Suite 300
Alexandria, VA 22314
269-888-2516
bditty@biologicsconsulting.com

Date Prepared: July 12th, 2021

2. DEVICE

Device Trade Name: Chest-CAD
Device Common Name or Classification Name: Medical Image Analyzer
Regulation: 21 CFR 892.2070
Regulatory Class: II
Product Code: MYN


3. PREDICATE DEVICE

On January 22, 2020, FDA published the final rule down-classifying medical image analyzers (product code MYN) from Class III to Class II. Therefore, Riverain Technologies' RapidScreen™ RS-2000 (P000041) has been identified as the predicate device for Chest-CAD.

4. DEVICE DESCRIPTION

Chest-CAD is a computer-assisted detection (CADe) software device designed to assist physicians in identifying suspicious regions of interest (ROIs) in adult chest X-rays. Suspicious ROIs identified by Chest-CAD are assigned to one of the following categories: Cardiac, Mediastinum/Hila, Lungs, Pleura, Bones, Soft Tissues, Hardware, or Other. Chest-CAD detects suspicious ROIs by analyzing radiographs using deep learning algorithms for computer vision and provides relevant annotations to assist physicians with their interpretations.

For each image within a study, Chest-CAD generates a DICOM Presentation State file (output overlay). If any suspicious ROI is detected by Chest-CAD in the study, the output overlay for all images includes the text "ROI(s) Detected:" followed by a list of the category/categories for which suspicious ROI(s) were found, such as "Lungs, Bones". In addition, if suspicious ROI(s) are detected in the image, bounding boxes surrounding each detected suspicious ROI are included in the output overlay. If no suspicious ROI is detected by Chest-CAD in the study, the output overlay for each image will include the text "No ROI(s) Detected" and no bounding boxes will be included. Regardless of whether a suspicious ROI is detected, the overlay includes text identifying the X-ray study as analyzed by Chest-CAD and a customer configurable message containing a link to or instructions for users to access labeling. The Chest-CAD overlay can be toggled on or off by the physician within their Picture Archiving and Communication System (PACS) viewer, allowing for concurrent review of the X-ray study.

5. INTENDED USE/INDICATIONS FOR USE

Chest-CAD is a computer-assisted detection (CADe) software device that analyzes chest radiograph studies using machine learning techniques to identify, categorize, and highlight suspicious regions of interest (ROI). Any suspicious ROI identified by Chest-CAD is assigned to one of the following categories: Cardiac, Mediastinum/Hila, Lungs, Pleura, Bones, Soft Tissues, Hardware, or Other. The device is intended for use as a concurrent reading aid for physicians. Chest-CAD is indicated for adults only.

6. SUBSTANTIAL EQUIVALENCE

Comparison of Indications

The predicate device for Chest-CAD (Riverain Technologies' RapidScreen™ RS-2000) has the following FDA-approved Indications for Use:

The RapidScreen™ RS-2000 is a computer-aided detection (CAD) system intended to identify and mark regions of interest (ROIs) on digitized frontal chest radiographs. It identifies features associated with solitary pulmonary nodules from 9 to 30 mm in size, which could represent early-stage lung cancer. The device is intended for use as an aid only after the physician has performed an initial interpretation of the radiograph.

RapidScreen™ RS-2000 and Chest-CAD both analyze chest radiographs, and both identify regions of interest (ROIs) in the chest. Chest-CAD detects ROIs and assigns each ROI to one of eight categories, whereas RapidScreen™ RS-2000 detects ROIs and assigns them to a single category (i.e., features associated with pulmonary nodules). RapidScreen™ RS-2000 is indicated for use as a second read, while Chest-CAD is indicated for use as a concurrent read. However, both devices are intended only as an aid to the physician and are not intended to replace the physician's diagnosis. The differences in Indications for Use do not constitute a new intended use, as both devices are intended to assist physicians by identifying and marking ROIs in chest radiographs.

Technological Comparisons

Table 1 provides a comparison of the Technological Characteristics of Chest-CAD to the predicate RapidScreen™ RS-2000.

| | Proposed Device | Predicate |
|---|---|---|
| Number | K210666 | P000041 |
| Applicant | Imagen Technologies | Riverain Medical Group |
| Device Name | Chest-CAD | RapidScreen™ RS-2000 |
| Classification Regulation | 892.2070 | 892.2070 |
| Product Code | MYN | MYN |
| Image Modality | X-ray | X-ray |
| Study Type | Chest | Chest |
| Clinical Output | Identify and mark regions of interest (ROIs) on chest radiographs | Identify and mark regions of interest (ROIs) on chest radiographs |
| Clinical Finding | Identified ROIs are assigned to one of the following categories: Cardiac, Mediastinum/Hila, Lungs, Pleura, Bones, Soft Tissues, Hardware, or Other | Identified ROIs are assigned to a single category (i.e., features associated with solitary pulmonary nodules from 9 to 30 mm in size) |
| Intended Users | Physician | Physician |
| Intended User Workflow | Device intended for use as a reading aid for physicians interpreting chest radiographs | Device intended for use as a reading aid for physicians interpreting chest radiographs |
| Patient Population | Adults with Chest Radiographs | Adults with Chest Radiographs |
| Algorithm Methodology | Artificial Neural Networks | Artificial Neural Networks |
| Platform | Secure cloud-based processing and delivery of chest radiographs | Secure on-premise processing and delivery of chest radiographs |
| Image Source | Digital X-ray | Film X-ray |
| Image Viewing | Image displayed on PACS system | Image displayed on video monitor |

Table 1: Technological Comparison

Chest-CAD's intended end-users, imaging modality, output display on X-ray studies, and assistive functionality during chest radiograph interpretation workflows are similar to those of RapidScreen™ RS-2000. Chest-CAD differs from RapidScreen™ RS-2000 in that Chest-CAD assigns each detected ROI to one of eight categories, whereas RapidScreen™ RS-2000 assigns ROIs to a single category (i.e., features associated with pulmonary nodules). Chest-CAD operates on digital X-rays from a DICOM node, whereas RapidScreen™ RS-2000 operates on digitized X-ray films. RapidScreen™ RS-2000 was approved when digital X-rays were not standard of care; however, the Riverain device was subsequently approved by FDA to process digital X-rays in P000041/S001. The fundamental purpose of both devices is to identify ROIs on chest X-rays for further consideration by the physician, and these differences in technological characteristics do not raise different concerns of safety and effectiveness.

7. PERFORMANCE DATA

Biocompatibility Testing

There are no direct or indirect patient-contacting components of the subject device. Therefore, biocompatibility information is not needed for this device.

Electrical Safety and Electromagnetic Compatibility (EMC)

The subject device is a software-only device. Therefore, electrical safety and EMC testing was not necessary to establish the substantial equivalence of this device.

Software Verification and Validation Testing

Software verification and validation testing were conducted, and documentation was provided as recommended by FDA's Guidance for Industry and FDA Staff, "Guidance for the Content of Premarket Submissions for Software Contained in Medical Devices." The software level of concern for Chest-CAD is Moderate, since a malfunction of, or a latent design flaw in, the software device may lead to an erroneous diagnosis or a delay in delivery of appropriate medical care that would likely lead to Minor Injury.

Bench Testing

Imagen conducted a standalone performance assessment on 20,000 chest radiograph cases from 12 hospitals, outpatient centers, and specialty centers in the United States representative of the intended use population. The results of the standalone testing demonstrated that Chest-CAD detects suspicious ROIs with high sensitivity (0.908; 95% Wilson's Confidence Interval: 0.905, 0.911), high specificity (0.887; 95% Wilson's Confidence Interval: 0.885, 0.889), and high Area Under the Curve (AUC) of the Receiver Operating Characteristic (ROC) curve (0.976, 95% Bootstrap Confidence Interval: 0.975, 0.976).

The AUC of the ROC curve was also estimated for each Chest-CAD category, and Figure 1 shows that AUCs remained high across the eight categories (further detail is provided in Table 2). The highest AUC of the ROC curve was for Hardware (0.994) and the lowest was for Mediastinum/Hila (0.921). Sensitivity and specificity were calculated for each of the Chest-CAD categories. As shown in Table 3, sensitivity was highest for Hardware (0.967) and lowest for Bones (0.854). Specificity was highest for Hardware (0.960) and lowest for Mediastinum/Hila (0.830). The Free-Response ROC (FROC) curve was also estimated for each Chest-CAD category, and Figure 2 shows the box-level sensitivity versus the false positives per image. The FROC curves terminate at the device's box-level sensitivity for each category due to the cascaded nature of the Chest-CAD predictions.

Figure 1: Chest-CAD ROC Curve by Category (per-category ROC curves of sensitivity versus 1 − specificity; AUC ranges from 0.921 for Mediastinum/Hila to 0.994 for Hardware).


| Category | AUC | 95% Bootstrap CI |
|---|---|---|
| Cardiac | 0.961 | 0.959, 0.963 |
| Mediastinum/Hila | 0.921 | 0.918, 0.924 |
| Lungs | 0.967 | 0.966, 0.969 |
| Pleura | 0.973 | 0.972, 0.975 |
| Bones | 0.930 | 0.926, 0.934 |
| Soft Tissues | 0.981 | 0.977, 0.985 |
| Hardware | 0.994 | 0.994, 0.995 |
| Other | 0.953 | 0.950, 0.957 |

Table 2: AUC of the ROC Curve for Chest-CAD Model Predictions by Category

Abbreviations: AUC = Area Under the Curve; CI = Confidence Interval.

Table 3: Sensitivity and Specificity for Chest-CAD Model Predictions by Category

| Category | Sensitivity (95% Wilson's CI) | Specificity (95% Wilson's CI) |
|---|---|---|
| Cardiac | 0.889 (0.881, 0.897) | 0.892 (0.887, 0.897) |
| Mediastinum/Hila | 0.856 (0.844, 0.867) | 0.830 (0.824, 0.835) |
| Lungs | 0.888 (0.882, 0.893) | 0.915 (0.908, 0.921) |
| Pleura | 0.919 (0.912, 0.925) | 0.899 (0.894, 0.904) |
| Bones | 0.854 (0.838, 0.868) | 0.856 (0.850, 0.861) |
| Soft Tissues | 0.938 (0.916, 0.955) | 0.919 (0.916, 0.923) |
| Hardware | 0.967 (0.963, 0.970) | 0.960 (0.956, 0.964) |
| Other | 0.906 (0.889, 0.920) | 0.872 (0.867, 0.877) |

Abbreviations: CI = Confidence Interval.


Figure 2: Chest-CAD Free-Response ROC (FROC) Curve by Category (box-level sensitivity versus false positives per image for each of the eight categories).
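Figure 2 plots box-level sensitivity against false positives per image. The sketch below shows how a single FROC operating point could be computed from predicted boxes with scores and ground-truth boxes; the IoU-based matching rule, thresholds, and data structures are illustrative assumptions, not the matching protocol used in the submission.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def froc_point(images, score_threshold, iou_threshold=0.5):
    """
    One FROC operating point for a single category.
    `images` is a list of dicts: {"gt": [boxes], "pred": [(box, score), ...]}.
    Returns (box-level sensitivity, false positives per image).
    """
    tp, fp, total_gt = 0, 0, 0
    for img in images:
        total_gt += len(img["gt"])
        matched = set()
        for box, score in img["pred"]:
            if score < score_threshold:
                continue
            # Match the prediction to the best unmatched ground-truth box.
            best_j, best_iou = None, 0.0
            for j, gt_box in enumerate(img["gt"]):
                if j in matched:
                    continue
                overlap = iou(box, gt_box)
                if overlap > best_iou:
                    best_j, best_iou = j, overlap
            if best_j is not None and best_iou >= iou_threshold:
                matched.add(best_j)
                tp += 1
            else:
                fp += 1
    sensitivity = tp / total_gt if total_gt else 0.0
    return sensitivity, fp / len(images)
```

Sweeping `score_threshold` from high to low traces out a curve of operating points like those in Figure 2; the curve ends at the detector's maximum box-level sensitivity, which is consistent with the termination behavior described above.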

Animal Testing

Not applicable. Animal studies are not necessary to establish the substantial equivalence of this device.

Clinical Data

Imagen conducted a fully-crossed multiple reader, multiple case (MRMC) retrospective reader study to determine the impact of Chest-CAD on reader performance in detecting suspicious ROIs in chest radiograph cases. The primary objective of this study was to determine whether the accuracy of readers aided by Chest-CAD ("Aided") was superior to the accuracy of readers when unaided by Chest-CAD ("Unaided") as determined by the case-level, across-category aggregate Area Under the Curve (AUC) of the Receiver Operating Characteristic (ROC) curve.

24 clinical readers each evaluated 238 cases in Chest-CAD's Indications for Use under both Aided and Unaided conditions. The cases were from 9 hospitals, outpatient centers, and specialty centers in the United States. Each case was previously evaluated by a panel of U.S. board-certified radiologists who assigned a ground truth binary label indicating the presence or absence of a suspicious ROI for each Chest-CAD category. The MRMC study consisted of two independent reading sessions separated by a washout period of at least 28 days in order to avoid memory bias. For each case, each reader was required to provide a binary determination of the presence or absence of a suspicious ROI for each category and to provide a confidence score representing their certainty.


The results of the study found that the accuracy of readers in the intended use population was superior when Aided by Chest-CAD than when Unaided by Chest-CAD, as measured by the task of suspicious ROI detection using the AUC of the ROC curve as calculated by the Dorfman, Berbaum, and Metz (DBM) modeling approach.

Figure 3: Clinical Reader Study Results - Aided and Unaided ROC Curves (the Aided ROC curve lies above the Unaided curve).

In particular, the study results demonstrated improvements when Aided versus Unaided:

  • When calculated using Wilcoxon rank-sum scores with bootstrap confidence intervals as outlined in Beiden et al. 2000¹, reader AUC estimates improved from 0.836 (95% Bootstrap CI: 0.816, 0.856) to 0.894 (95% Bootstrap CI: 0.879, 0.909).
  • Reader sensitivity improved from 0.757 (95% Wilson's CI: 0.750, 0.764) to 0.856 (95% Wilson's CI: 0.850, 0.862).
  • Reader specificity improved from 0.843 (95% Wilson's CI: 0.839, 0.847) to 0.870 (95% Wilson's CI: 0.866, 0.873).

¹ Beiden, S.V., Wagner, R.F., & Campbell, G. (2000). Components-of-variance models and multiple-bootstrap experiments: An alternative method for random-effects, receiver operating characteristic analysis. Academic Radiology, 7, 341-349.


8. CONCLUSION

The conclusions drawn from the standalone and clinical studies demonstrate that Chest-CAD is as safe, as effective, and performs as well as RapidScreen™ RS-2000. The special controls for the Medical Image Analyzer (CADe) 21 CFR 892.2070 regulation are satisfied by demonstrating effectiveness of the device in both the standalone testing and the clinical testing, showing superiority of Aided versus Unaided reads in the clinical testing, and communicating testing results in the labeling. Chest-CAD's intended end-users, imaging modality, output display on X-ray studies, and assistive functionality during chest radiograph interpretation workflows are similar to those of RapidScreen™ RS-2000. The technological differences identified and discussed in Section 6 do not raise different concerns of safety and effectiveness. Thus, Chest-CAD is substantially equivalent to RapidScreen™ RS-2000 for the intended use of computer-assisted detection.