K Number
K242729
Device Name
AutoContour (Model RADAC V4)
Manufacturer
Radformation, Inc.
Date Cleared
2024-12-09

(90 days)

Product Code
QKB
Regulation Number
892.2050
AI/ML | SaMD | IVD (In Vitro Diagnostic) | Therapeutic | Diagnostic | is PCCP Authorized
Intended Use
AutoContour is intended to assist radiation treatment planners in contouring and reviewing structures within medical images in preparation for radiation therapy treatment planning.
Device Description
As with AutoContour Model RADAC V3, the AutoContour Model RADAC V4 device is software that uses DICOM-compliant image data (CT or MR) as input to: (1) automatically contour various structures of interest for radiation therapy treatment planning using machine learning based contouring (the deep-learning based structure models are trained using imaging datasets consisting of anatomical organs of the head and neck, thorax, abdomen and pelvis for adult male and female patients); (2) allow the user to review and modify the resulting contours; and (3) generate DICOM-compliant structure set data that can be imported into a radiation therapy treatment planning system. AutoContour Model RADAC V4 consists of 3 main components: (1) a .NET client application designed to run on the Windows Operating System, allowing the user to load image and structure sets for upload to the cloud-based server for automatic contouring, perform registration with other image sets, and review, edit, and export the structure set; (2) a local "agent" service designed to run on the Windows Operating System that the user configures to monitor a network storage location for new CT and MR datasets to be automatically contoured; and (3) a cloud-based automatic contouring service that produces initial contours based on image sets sent by the user from the .NET client application.
More Information

Not Found

Yes
The device description explicitly states that it uses "machine learning based contouring" and "deep-learning based structure models".

No.
The device is intended to assist in contouring and reviewing structures for radiation therapy treatment planning, not to directly treat or diagnose a disease.

No

This device is intended to assist radiation treatment planners in contouring and reviewing structures within medical images for radiation therapy treatment planning, not to diagnose medical conditions.

Yes

The device description explicitly states that the device is "software" and details its components as a .NET client application, a local "agent" service, and a cloud-based automatic contouring service, all of which are software-based. There is no mention of any accompanying hardware components being part of the device itself.

Based on the provided information, this device is not an In Vitro Diagnostic (IVD).

Here's why:

  • Intended Use: The intended use is to "assist radiation treatment planners in contouring and reviewing structures within medical images in preparation for radiation therapy treatment planning." This is a clinical decision support tool for image analysis, not a test performed on biological samples to diagnose or monitor a disease.
  • Device Description: The device processes DICOM-compliant image data (CT or MR) to automatically contour anatomical structures. It does not analyze biological specimens like blood, urine, or tissue.
  • Input Data: The input is medical imaging data, not biological samples.
  • Output: The output is DICOM-compliant structure set data, which is used in radiation therapy planning, not diagnostic results from a biological test.

IVDs are defined as reagents, instruments, and systems intended for use in the diagnosis of disease or other conditions, including a determination of the state of health, in order to cure, mitigate, treat, or prevent disease or its sequelae. This device does not fit that definition. It is a software tool for medical image processing and analysis to aid in treatment planning.

No
The letter does not state that the FDA has reviewed and approved or cleared a Predetermined Change Control Plan (PCCP) for this specific device.

Intended Use / Indications for Use

AutoContour is intended to assist radiation treatment planners in contouring and reviewing structures within medical images in preparation for radiation therapy treatment planning.

Product codes

QKB

Device Description

As with AutoContour Model RADAC V3, the AutoContour Model RADAC V4 device is software that uses DICOM-compliant image data (CT or MR) as input to: (1) automatically contour various structures of interest for radiation therapy treatment planning using machine learning based contouring (the deep-learning based structure models are trained using imaging datasets consisting of anatomical organs of the head and neck, thorax, abdomen and pelvis for adult male and female patients); (2) allow the user to review and modify the resulting contours; and (3) generate DICOM-compliant structure set data that can be imported into a radiation therapy treatment planning system.

AutoContour Model RADAC V4 consists of 3 main components:

    1. A .NET client application designed to run on the Windows Operating System allowing the user to load image and structure sets for upload to the cloud-based server for automatic contouring, perform registration with other image sets, as well as review, edit, and export the structure set.
    2. A local "agent" service designed to run on the Windows Operating System that is configured by the user to monitor a network storage location for new CT and MR datasets that are to be automatically contoured.
    3. A cloud-based automatic contouring service that produces initial contours based on image sets sent by the user from the .NET client application.
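The "agent" component described in item 2 is, at its core, a watch-folder loop. The sketch below is a minimal illustration of that pattern, not Radformation's implementation: the function names, the polling structure, and the `.dcm`-suffix filter are all assumptions, and a real agent would additionally handle multi-file DICOM series, error recovery, and the actual upload to the cloud contouring service.

```python
import os
import time


def find_new_datasets(watch_dir: str, seen: set) -> list:
    """Return paths of DICOM files that have appeared since the last scan."""
    new = []
    for name in sorted(os.listdir(watch_dir)):
        path = os.path.join(watch_dir, name)
        # Hypothetical filter: treat any *.dcm file as a dataset to contour.
        if name.lower().endswith(".dcm") and path not in seen:
            seen.add(path)
            new.append(path)
    return new


def run_agent(watch_dir: str, submit, poll_seconds: float = 30.0, max_polls: int = 1) -> None:
    """Poll the watched network folder and hand each new dataset to `submit`,
    which stands in for the upload to the cloud-based contouring service."""
    seen: set = set()
    for i in range(max_polls):
        for path in find_new_datasets(watch_dir, seen):
            submit(path)
        if i < max_polls - 1:
            time.sleep(poll_seconds)
```

In practice `submit` would package the DICOM series and send it to the contouring service; here it is deliberately left as a callback so the monitoring logic stands alone.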

Mentions image processing

Yes

Mentions AI, DNN, or ML

automatically contour various structures of interest for radiation therapy treatment planning using machine learning based contouring. The deep-learning based structure models are trained using imaging datasets

The updated submission expands the use of machine-learning based contouring to include additional organs and volumes of interest found in MR and CT image types.

A very similar CNN architecture was used to train these new CT models.
A very similar CNN architecture was used to train these new MR models.
Further tests were performed on datasets independent from those included in the training and validation sets in order to validate the generalizability of the machine learning models.

Input Imaging Modality

DICOM-compliant image data (CT or MR)
PET/CT input for registration/fusion only.

Anatomical Site

head and neck, thorax, abdomen and pelvis

Indicated Patient Age Range

adult

Intended User / Care Setting

radiation treatment planners

Description of the training set, sample size, data source, and annotation protocol

For CT structure models there were an average of 341 training image sets. CT training images were gathered from 4 institutions in 2 countries, the United States and Switzerland. Ground truth for each data set was generated manually, using consensus (NRG/RTOG) guidelines as appropriate, by three clinically experienced experts consisting of 2 radiation therapy physicists and 1 radiation dosimetrist.

The MR training data set used for initial testing of the Brain models (SpinalCord_Cerv, Brain, and Lens_L/R) had an average of 149 training image sets and was acquired from the Cancer Imaging Archive GLIS-RT dataset. These data sets consisted primarily of glioblastoma and astrocytoma patients. Images were acquired on either a GE Signa HDxT (3T) or Siemens Skyra (3T) scanner and had an average slice thickness of 1 mm, in-plane resolution between 0.5 and 1.0 mm, and acquisition parameters of TR=2.3-8.9 ms, TE=3.0-3.2 s.

The MR training data used for initial testing of the MR Pelvis models (A Pud Int L/R, Bladder, Bladder Trigone, Colon Sigmoid, External Pelvis, Femur L/R, NVB L/R, PenileBulb, Rectal Spacer, Rectum, and Urethra) had an average of 306 training image sets and were taken from 2 open source datasets, and one institution within the United States.

Description of the test set, sample size, data source, and annotation protocol

The test datasets were independent from those used for training and consisted of approximately 10% of the number of training image sets used as input for the model. For CT structure models there were an average of 54 testing image sets. Ground truth for each test data set was generated manually, using consensus (NRG/RTOG) guidelines as appropriate, by three clinically experienced experts consisting of 2 radiation therapy physicists and 1 radiation dosimetrist.

Additional external clinical testing was performed to validate the accuracy of the models on image sets distinct from the training datasets. Both AutoContour contours and manually added ground truth contours, following the same structure guidelines used for structure model training, were added to the image sets.
External Clinical CT Data Sources:

  • CT Pelvis: TCIA - Pelvic-Ref
  • CT Head and Neck: TCIA - Head-Neck-PET-CT
  • CT Abdomen: TCIA - Pancreas-CT-CB
  • CT Thorax: TCIA - NSCLC; TCIA - LCTSC; TCIA - QIN-BREAST and Prone Thorax (N/A - Testing data was shared from several institutions)
  • CT HDR Female: Female HDR Pelvis (N/A- Testing data was shared from 2 different institutions based in the United States.)
  • CT Prostatectomy: Pelvis ProstateBed (N/A- Testing data was shared from 1 institution based in the United States)
    Ground truthing of each test data set was generated manually using consensus (NRG/RTOG) guidelines as appropriate by three clinically experienced experts consisting of 2 radiation therapy physicists and 1 radiation dosimetrist.

The MR test data set used for initial testing of the Brain models had 45 testing image sets. Ground truth for each test data set was generated manually, using consensus (NRG/RTOG) guidelines as appropriate, by three clinically experienced experts consisting of 2 radiation therapy physicists and 1 radiation dosimetrist.
Additional external clinical testing was performed in order to validate the accuracy of the models on image sets acquired that were unique to the training datasets.
External Clinical MR Data Sources:

  • MR Brain: MR - Renown (N/A)
  • MR Pelvis: Gold Atlas Pelvis; SynthRad; MRLinac Pelvis (N/A - Testing data was shared by 2 institutions utilizing MR Linacs for image acquisitions.)
    For the Brain models, a dataset acquired via data-use agreement from a clinical partner contained 20 MR T1 Ax post (BRAVO) image scans acquired with a GE MR750w scanner. Images had an average slice thickness of 1.6 mm, in-plane resolution of 0.94 mm, and acquisition parameters of TR=5.98 ms, TE=96.8 s. Data for testing of the MR Pelvis structure models were acquired from 2 publicly available datasets, which contained images of patients with prostate or rectal cancer, as well as 1 dataset shared from 2 institutions utilizing an MR Linac. Various scanner models and acquisition settings were used.

Summary of Performance Studies

Non-clinical tests were performed according to Radformation's AutoContour Complete Test Protocol and Report, which demonstrates that AutoContour Model RADAC V4 performs as intended per its indications for use. Further tests were performed on datasets independent from those included in the training and validation sets in order to validate the generalizability of the machine learning models. There were no changes to the testing protocol between AutoContour RADAC V3 and RADAC V4.

Mean Dice Similarity Coefficient (DSC) was used to validate the accuracy of structure model outputs when tested on image data sequestered from the original training data population.
For CT structure models, large, medium, and small structures resulted in a mean DSC of 0.92+/-0.06, 0.85+/-0.09, and 0.81+/-0.12, respectively.
In external clinical CT testing, all structures passed the minimum DSC criteria for small, medium, and large structures, with a mean DSC of 0.76+/-0.09, 0.84+/-0.09, and 0.94+/-0.02, respectively. The qualitative clinical appropriateness of AutoContour structures generated on these scans was graded by clinical experts on a scale from 1 to 5. An average rating of 4.57 was found across all CT structure models, demonstrating that only minor edits would be required for clinical use.

For MR structure models, a mean training DSC of 0.96+/-0.03 was found for large models, 0.84+/-0.07 for medium models, and 0.74+/-0.09 for small models.
In external clinical MR testing, all structures passed the minimum DSC criteria for small, medium, and large structures, with a mean DSC of 0.61+/-0.14, 0.84+/-0.09, and 0.80+/-0.09, respectively. The qualitative clinical appropriateness of AutoContour structures generated on these scans was graded by clinical experts on a scale from 1 to 5. An average rating of 4.6 was found across all MR structure models, demonstrating that only minor edits would be required for clinical use.

Key Metrics

Mean Dice Similarity Coefficient (DSC) (Avg), DSC Std Dev, Lower Bound 95% Confidence Interval, External Reviewer Average Rating (1-5).
For CT models, Pass criteria for mean DSC: Large (>= 0.8), Medium (>= 0.65), Small (>= 0.5).
For MR models, Pass criteria for mean DSC: Large (>= 0.8), Medium (>= 0.65), Small (>= 0.5).
Clinical appropriateness was rated on a scale from 1 to 5, where 5 means no edits are required and 1 means a full manual re-contour is required. An average score >= 3 was used to determine clinical benefit.
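The Dice Similarity Coefficient, pass criteria, and 95% confidence-interval lower bound described above can be sketched in a few lines. This is a generic textbook DSC on binary masks plus a normal-approximation confidence bound, not Radformation's test code; the helper names and threshold table are illustrative restatements of the criteria listed in this section.

```python
import math

import numpy as np

# Mean-DSC pass criteria stated in the submission (same for CT and MR models).
THRESHOLDS = {"large": 0.80, "medium": 0.65, "small": 0.50}


def dice(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """DSC = 2|A ∩ B| / (|A| + |B|) for two binary contour masks."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    denom = int(a.sum()) + int(b.sum())
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * int(np.logical_and(a, b).sum()) / denom


def passes(mean_dsc: float, size_class: str) -> bool:
    """Check a structure model's mean DSC against its size-class criterion."""
    return mean_dsc >= THRESHOLDS[size_class]


def ci_lower_95(mean: float, sd: float, n: int) -> float:
    """Lower bound of a normal-approximation 95% CI on the mean DSC."""
    return mean - 1.96 * sd / math.sqrt(n)
```

As a worked check under these assumptions, the reported CT large-structure result (mean DSC 0.92+/-0.06 over an average of 54 test sets) gives a lower 95% bound of roughly 0.90, comfortably above the 0.8 criterion.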

Predicate Device(s)

K230685

Reference Device(s)

Not Found

Predetermined Change Control Plan (PCCP) - All Relevant Information

Not Found

§ 892.2050 Medical image management and processing system.

(a) Identification. A medical image management and processing system is a device that provides one or more capabilities relating to the review and digital processing of medical images for the purposes of interpretation by a trained practitioner of disease detection, diagnosis, or patient management. The software components may provide advanced or complex image processing functions for image manipulation, enhancement, or quantification that are intended for use in the interpretation and analysis of medical images. Advanced image manipulation functions may include image segmentation, multimodality image registration, or 3D visualization. Complex quantitative functions may include semi-automated measurements or time-series measurements.

(b) Classification. Class II (special controls; voluntary standards—Digital Imaging and Communications in Medicine (DICOM) Std., Joint Photographic Experts Group (JPEG) Std., Society of Motion Picture and Television Engineers (SMPTE) Test Pattern).


December 9, 2024


Radformation, Inc.
Jennifer Wampler
Regulatory Affairs Specialist
261 Madison Avenue, 9th Floor
New York, New York 10016

Re: K242729

Trade/Device Name: AutoContour (Model RADAC V4)
Regulation Number: 21 CFR 892.2050
Regulation Name: Medical Image Management And Processing System
Regulatory Class: Class II
Product Code: QKB
Dated: September 5, 2024
Received: September 10, 2024

Dear Jennifer Wampler:

We have reviewed your section 510(k) premarket notification of intent to market the device referenced above and have determined the device is substantially equivalent (for the indications for use stated in the enclosure) to legally marketed predicate devices marketed in interstate commerce prior to May 28, 1976, the enactment date of the Medical Device Amendments, or to devices that have been reclassified in accordance with the provisions of the Federal Food, Drug, and Cosmetic Act (the Act) that do not require approval of a premarket approval application (PMA). You may, therefore, market the device, subject to the general controls provisions of the Act. Although this letter refers to your product as a device, please be aware that some cleared products may instead be combination products. The 510(k) Premarket Notification Database available at https://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfpmn/pmn.cfm identifies combination product submissions.

The general controls provisions of the Act include requirements for annual registration, listing of devices, good manufacturing practice, labeling, and prohibitions against misbranding and adulteration. Please note: CDRH does not evaluate information related to contract liability warranties. We remind you, however, that device labeling must be truthful and not misleading.

If your device is classified (see above) into either class II (Special Controls) or class III (PMA), it may be subject to additional controls. Existing major regulations affecting your device can be found in the Code of Federal Regulations, Title 21, Parts 800 to 898. In addition, FDA may publish further announcements concerning your device in the Federal Register.


Additional information about changes that may require a new premarket notification is provided in the FDA guidance documents entitled "Deciding When to Submit a 510(k) for a Change to an Existing Device" (https://www.fda.gov/media/99812/download) and "Deciding When to Submit a 510(k) for a Software Change to an Existing Device" (https://www.fda.gov/media/99785/download).

Your device is also subject to, among other requirements, the Quality System (QS) regulation (21 CFR Part 820), which includes, but is not limited to, 21 CFR 820.30, Design controls; 21 CFR 820.90, Nonconforming product; and 21 CFR 820.100, Corrective and preventive action. Please note that regardless of whether a change requires premarket review, the QS regulation requires device manufacturers to review and approve changes to device design and production (21 CFR 820.30 and 21 CFR 820.70) and document changes and approvals in the device master record (21 CFR 820.181).

Please be advised that FDA's issuance of a substantial equivalence determination does not mean that FDA has made a determination that your device complies with other requirements of the Act or any Federal statutes and regulations administered by other Federal agencies. You must comply with all the Act's requirements, including, but not limited to: registration and listing (21 CFR Part 807); labeling (21 CFR Part 801); medical device reporting of medical device-related adverse events (21 CFR Part 803) for devices or postmarketing safety reporting (21 CFR Part 4, Subpart B) for combination products (see https://www.fda.gov/combination-products/guidance-regulatory-information/postmarketing-safety-reporting-combination-products); good manufacturing practice requirements as set forth in the quality systems (QS) regulation (21 CFR Part 820) for devices or current good manufacturing practices (21 CFR Part 4, Subpart A) for combination products; and, if applicable, the electronic product radiation control provisions (Sections 531-542 of the Act); 21 CFR Parts 1000-1050.

All medical devices, including Class I and unclassified devices and combination product device constituent parts, are required to be in compliance with the final Unique Device Identification System rule ("UDI Rule"). The UDI Rule requires, among other things, that a device bear a unique device identifier (UDI) on its label and package (21 CFR 801.20(a)) unless an exception or alternative applies (21 CFR 801.20(b)) and that the dates on the device label be formatted in accordance with 21 CFR 801.18. The UDI Rule (21 CFR 830.300(a) and 830.320(b)) also requires that certain information be submitted to the Global Unique Device Identification Database (GUDID) (21 CFR Part 830 Subpart E). For additional information on these requirements, please see the UDI System webpage at https://www.fda.gov/medical-devices/device-advice-comprehensive-regulatory-assistance/unique-device-identification-system-udi-system.

Also, please note the regulation entitled, "Misbranding by reference to premarket notification" (21 CFR 807.97). For questions regarding the reporting of adverse events under the MDR regulation (21 CFR Part 803), please go to https://www.fda.gov/medical-device-safety/medical-device-reporting-mdr-how-report-medical-device-problems.

For comprehensive regulatory information about radiation-emitting products, including information about labeling regulations, please see Device Advice (https://www.fda.gov/medical-devices/device-advice-comprehensive-regulatory-assistance) and CDRH Learn (https://www.fda.gov/training-and-continuing-education/cdrh-learn). Additionally, you may contact the Division of Industry and Consumer Education (DICE) to ask a question about a specific regulatory topic. See


the DICE website (https://www.fda.gov/medical-devices/device-advice-comprehensive-regulatory-assistance/contact-us-division-industry-and-consumer-education-dice) for more information or contact DICE by email (DICE@fda.hhs.gov) or phone (1-800-638-2041 or 301-796-7100).

Sincerely,

(Signature)

Lora D. Weidner, Ph.D. Assistant Director Radiation Therapy Team DHT8C: Division of Radiological Imaging and Radiation Therapy Devices OHT8: Office of Radiological Health Office of Product Evaluation and Quality Center for Devices and Radiological Health

Enclosure


Indications for Use

Submission Number (if known)

K242729

Device Name

AutoContour (Model RADAC V4)

Indications for Use (Describe)

AutoContour is intended to assist radiation treatment planners in contouring and reviewing structures within medical images in preparation for radiation therapy treatment planning.

Type of Use (Select one or both, as applicable)

Prescription Use (Part 21 CFR 801 Subpart D)

Over-The-Counter Use (21 CFR 801 Subpart C)

CONTINUE ON A SEPARATE PAGE IF NEEDED.

This section applies only to requirements of the Paperwork Reduction Act of 1995.

*DO NOT SEND YOUR COMPLETED FORM TO THE PRA STAFF EMAIL ADDRESS BELOW.*

The burden time for this collection of information is estimated to average 79 hours per response, including the time to review instructions, search existing data sources, gather and maintain the data needed, and complete and review the collection of information. Send comments regarding this burden estimate or any other aspect of this information collection, including suggestions for reducing this burden, to:

Department of Health and Human Services
Food and Drug Administration
Office of Chief Information Officer
Paperwork Reduction Act (PRA) Staff
PRAStaff@fda.hhs.gov

"An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB number."


This 510(k) Summary has been created per the requirements of the Safe Medical Device Act (SMDA) of 1990, and the content is provided in conformance with 21 CFR Part 807.92.

5.1. Submitter's Information

Table 1: Submitter's Information

Submitter's Name: Kurt Sysock
Company: Radformation, Inc.
Address: 261 Madison Avenue, 9th Floor, New York, NY 10016
Contact Person: Alan Nelson, Chief Technology Officer, Radformation
Phone: 518-888-5727
Fax: ----------
Email: anelson@radformation.com
Date of Summary Preparation: 09/05/2024

5.2. Device Information

Table 2: Device Information

Trade Name: AutoContour Model RADAC V4
Common Name: AutoContour, AutoContouring, AutoContour Agent, AutoContour Cloud Server
Regulatory Class: Class II
Classification Name: Medical image management and processing system
Regulation Number: 892.2050
Product Code: QKB
Classification Panel: Radiology


5.3. Predicate Device Information

AutoContour Model RADAC V4 (Subject Device) makes use of its prior submission, AutoContour Model RADAC V3 (K230685), as the Predicate Device.

5.4. Device Description

As with AutoContour Model RADAC V3, the AutoContour Model RADAC V4 device is software that uses DICOM-compliant image data (CT or MR) as input to: (1) automatically contour various structures of interest for radiation therapy treatment planning using machine learning based contouring (the deep-learning based structure models are trained using imaging datasets consisting of anatomical organs of the head and neck, thorax, abdomen and pelvis for adult male and female patients); (2) allow the user to review and modify the resulting contours; and (3) generate DICOM-compliant structure set data that can be imported into a radiation therapy treatment planning system.

AutoContour Model RADAC V4 consists of 3 main components:

    1. A .NET client application designed to run on the Windows Operating System allowing the user to load image and structure sets for upload to the cloud-based server for automatic contouring, perform registration with other image sets, as well as review, edit, and export the structure set.
    2. A local "agent" service designed to run on the Windows Operating System that is configured by the user to monitor a network storage location for new CT and MR datasets that are to be automatically contoured.
    3. A cloud-based automatic contouring service that produces initial contours based on image sets sent by the user from the .NET client application.

5.5. Indications for Use

AutoContour is intended to assist radiation treatment planners in contouring and reviewing structures within medical images in preparation for radiation therapy treatment planning.

5.6. Technological Characteristics

The Subject Device, AutoContour Model RADAC V4, makes use of AutoContour Model RADAC V3 (K230685) as the Predicate Device for substantial equivalence comparison. The functionality and technical components of this prior submission remain unchanged in AutoContour Model RADAC V4. This submission is intended to build on the technological characteristics of the 510(k)-cleared AutoContour Model RADAC V3, pertaining to new structure models for both CT and MRI.


5.6.1. Updates vs. AutoContour (K230685)

The updated submission expands the use of machine-learning based contouring to include additional organs and volumes of interest found in MR and CT image types.

Table 3: Technological Characteristics, AutoContour Model RADAC V4 vs. AutoContour Model RADAC V3 (K230685)

| Characteristic | Subject Device: AutoContour Model RADAC V4 | Predicate Device: AutoContour Model RADAC V3 (K230685) |
|---|---|---|
| Indications for Use | AutoContour is intended to assist radiation treatment planners in contouring and reviewing structures within medical images in preparation for radiation therapy treatment planning. | AutoContour is intended to assist radiation treatment planners in contouring and reviewing structures within medical images in preparation for radiation therapy treatment planning. |
| Design: Image registration | Manual and Automatic Rigid registration. Automatic Deformable Registration. | Manual and Automatic Rigid registration. Automatic Deformable Registration. |
| Design: Supported modalities | CT or MR input for contouring or registration/fusion. PET/CT input for registration/fusion only. DICOM RTSTRUCT and REGISTRATION for input. (Minor differences) | CT or MR input for contouring or registration/fusion. PET/CT input for registration/fusion only. DICOM RTSTRUCT and REGISTRATION for input. |
| Design: Reporting and data routing | No built-in reporting; supports exporting DICOM RTSTRUCT, REGISTRATION and DOSE files. (Minor differences) | No built-in reporting; supports exporting DICOM RTSTRUCT file. (Substantially Equivalent) |


Table 3 (continued): Regions and Volumes of Interest (ROI)

Both devices accept CT or MR input for contouring of anatomical regions: Head and Neck, Thorax, Abdomen and Pelvis.

Subject Device (RADAC V4) CT models: A_Aorta, A_Aorta_Asc, A_Aorta_Dsc, A_Brachiocephis, A_Carotid_L, A_Carotid_R, A_Coronary, A_LAD, A_Pulmonary, A_Subclavian_L, A_Subclavian_R, Atrium_L, Atrium_R, Bladder, Bladder_F, Bone_Hyoid, Bone_Illium_L, Bone_Illium_R, Bone_Mandible, Bone_Pelvic, Bone_Skull, Bone_Sternum, Bone_Teeth, Bowel, Bowel_Bag, Bowel_Large, Bowel_Small, BrachialPlex_L, BrachialPlex_R, Brain, Brainstem, Breast_L, Breast_R, Breast_Prone, Bronchus, BuccalMucosa, Carina, CaudaEquina, Cavity_Oral, Cavity_Oral_Ext, Chestwall_L, Chestwall_OAR, Chestwall_R, Chestwall_RC_L, Chestwall_RC_R, Clavicle_R, Cochlea_L, Cochlea_R, Colon_Sigmoid, Cornea_L, Cornea_R, Dental_Artifact, Duodenum, Ear_Internal_L, Ear_Internal_R, Esophagus, External, Eye_L, Eye_R, Femur_Head_L, Femur_Head_R, Femur_L, Femur_R, Femur_RTOG_L, Femur_RTOG_R, Foley_Balloon, GallBladder, Genitals_F, Genitals_M, Glnd_Lacrimal_L, Glnd_Lacrimal_R, Glnd_Submand_L, Glnd_Submand_R, Glnd_Thyroid, HDR_Bladder, HDR_Bowel, HDR_Cylinder, HDR_Rectum, Heart, Heart_Prone, Hippocampus_L, Hippocampus_R, Humerus_L, Humerus_R, Iliac_Int_L, Iliac_Int_R, Iliac_L, Iliac_R, Kidney_L, Kidney_R, Kidney_Outer_L, Kidney_Outer_R, Larynx, Larynx_Glottic, Larynx_NRG, Larynx_SG, Lens_L, Lens_R, Lips, Liver, LN_Ax_L, LN_Ax_L1_ESTRO_L, LN_Ax_L1_ESTRO_R, LN_Ax_L1_L, LN_Ax_L1_R, LN_Ax_L2_ESTRO_L, LN_Ax_L2_ESTRO_R, LN_Ax_L2_L, LN_Ax_L2_L3_L, LN_Ax_L2_L3_R, LN_Ax_L2_R, LN_Ax_L3_ESTRO_L, LN_Ax_L3_ESTRO_R, LN_Ax_L3_L, LN_Ax_L3_R, LN_Ax_R, LN_IMN_L, LN_IMN_R, LN_IMN_RC_L, LN_IMN_RC_R, LN_Inguinofem_L, LN_Inguinofem_R, LN_InPec_ESTRO_L, LN_InPec_ESTRO_R, LN_Neck_IA, LN_Neck_IB_L, LN_Neck_IB_R, LN_Neck_IB-V_L, LN_Neck_IB-V_R, LN_Neck_II_L, LN_Neck_II_R, LN_Neck_II-IV_L, LN_Neck_II-IV_R, LN_Neck_II-V_L, LN_Neck_II-V_R, LN_Neck_III_L, LN_Neck_III_R, LN_Neck_IV_L, LN_Neck_IV_R, LN_Neck_V_L, LN_Neck_V_R, LN_Neck_VIA, LN_Neck_VIIA_L, LN_Neck_VIIA_R, LN_Neck_VIIB_L, LN_Neck_VIIB_R, LN_Pelvics_F, LN_Pelvics, LN_Pelvic_NRG, LN_Post_Neck_L, LN_Post_Neck_R, LN_Presacral, LN_Sclav_ESTRO_L, LN_Sclav_ESTRO_R, LN_Sclav_L, LN_Sclav_R, LN_Sclav_RADCOMP_L, LN_Sclav_RADCOMP_R, Lobe_Temporal_L, Lobe_Temporal_R, Lung_L, Lung_R, Macula_L, Macula_R, Marrow_Ilium_L, Marrow_Ilium_R, Musc_Constrict, Musc_Iliopsoas_L, Musc_Iliopsoas_R, Myocardium, Nipple_L, Nipple_Prone, Nipple_R, OpticChiasm, OpticNrv_L, OpticNrv_R, Pancreas, Parotid_L, Parotid_R, PenileBulb, Pericardium, Pharynx, Pituitary, Prostate, ProstateBed, Rectum, Rectum_F, Retina_L, Retina_R, Rib, Rib01_L, Rib01_R, Rib02_L, Rib02_R, Rib03_L, Rib03_R, …

Predicate Device (RADAC V3, K230685) CT models: A_Aorta, A_Aorta_Asc, A_Aorta_Dsc, A_LAD, A_Pulmonary, Bladder, Bladder_F, Bone_Illium_L, Bone_Illium_R, Bone_Mandible, Bone_Pelvic, Bone_Skull, Bone_Sternum, Bowel, Bowel_Bag, Bowel_Large, Bowel_Small, BrachialPlex_L, BrachialPlex_R, Brain, Brainstem, Breast_L, Breast_R, Bronchus, BuccalMucosa, Carina, CaudaEquina, Cavity_Oral, Cavity_Oral_Ext, Chestwall_L, Chestwall_OAR, Chestwall_R, Chestwall_RC_L, Chestwall_RC_R, Cochlea_L, Cochlea_R, Colon_Sigmoid, Cornea_L, Cornea_R, Duodenum, Ear_Internal_L, Ear_Internal_R, Esophagus, External, Eye_L, Femur_Head_L, Femur_Head_R, Femur_L, Femur_R, Femur_RTOG_L, Femur_RTOG_R, GallBladder, Genitals_F, Genitals_M, Glnd_Lacrimal_L, Glnd_Lacrimal_R, Glnd_Submand_L, Glnd_Submand_R, Glnd_Thyroid, HDR_Cylinder, Heart, Hippocampus_L, Hippocampus_R, Humerus_L, Humerus_R, Kidney_L, Kidney_R, Kidney_Outer_L, Kidney_Outer_R, Larynx, Larynx_Glottic, Larynx_NRG, Larynx_SG, Lens_L, Lens_R, Lips, Liver, LN_Ax_L, LN_Ax_L1_L, LN_Ax_L1_R, LN_Ax_L2_L, LN_Ax_L2_L3_L, LN_Ax_L2_L3_R, LN_Ax_L2_R, LN_Ax_L3_L, LN_Ax_L3_R, LN_Ax_R, LN_IMN_L, LN_IMN_R, LN_IMN_RC_L, LN_IMN_RC_R, LN_Inguinofem_L, LN_Inguinofem_R, LN_Neck_IA, LN_Neck_IB-V_L, LN_Neck_IB-V_R, LN_Neck_II_L, LN_Neck_II_R, LN_Neck_II-IV_L, LN_Neck_II-IV_R, LN_Neck_II-V_L, LN_Neck_II-V_R, LN_Neck_III_L, LN_Neck_III_R, LN_Neck_IV_L, LN_Neck_IV_R, LN_Neck_V_L, LN_Neck_V_R, LN_Neck_VIA, LN_Neck_VIIA_L, LN_Neck_VIIA_R, LN_Neck_VIIB_L, LN_Neck_VIIB_R, LN_Paraaortic, LN_Pelvics, LN_Pelvic_NRG, LN_Sclav_L, LN_Sclav_R, LN_Sclav_RADCOMP_L, LN_Sclav_RADCOMP_R, Lobe_Temporal_L, Lobe_Temporal_R, Lung_L, Lung_R, Macula_L, Macula_R, Marrow_Ilium_L, Marrow_Ilium_R, Musc_Constrict, Nipple_L, Nipple_R, OpticChiasm, OpticNrv_L, OpticNrv_R, Pancreas, Parotid_L, Parotid_R, PenileBulb, Pericardium, Pituitary, Prostate, Rectum, Rectum_F, Retina_L, Retina_R, Rib, Rib_R, SeminalVes, SpinalCanal, SpinalCord, Spleen, Stomach, Trachea, UteroCervix, V_Venacava_I, V_Venacava_S, VB, VB_C1, VB_C2, VB_C3, VB_C4, VB_C5, VB_C6, VB_C7, VB_L1, VB_L2, VB_L3, VB_L4, VB_L5, VB_T01, VB_T02, VB_T03, VB_T04, VB_T05, VB_T06, VB_T07, VB_T08, VB_T09, VB_T10, VB_T11, VB_T12

MR Models: Brainstem Cerebellum Eye_L Eye_R Glnd_Prostate Hippocampus_L Hippocampus_R Hypo_True Hypothalamus OpticChiasm OpticNrv_L OpticNrv_R OpticTract_L OpticTract_R | |
| | | |
| Rib04_R
● | Pituitary
● | |
| Rib05_L
● | Prostate
● | |
| Rib05_R
● | SeminalVes
● | |
| Rib06_L
● | | |
| Rib06_R
● | | |
| Rib07_L
● | | |
| Rib07_R
● | | |
| Rib08_L
● | | |
| Rib08_R
● | | |
| Rib09_L
● | | |
| Rib09_R
● | | |
| Rib10_L
● | | |
| Rib10_R
● | | |
| Rib11_L
● | | |
| Rib11_R
● | | |
| Rib12_L
● | | |
| Rib12_R
● | | |
| Rib_L
● | | |
| Rib_R
● | | |
| SacralPlex_L
● | | |
| SacralPlex_R
● | | |
| SeminalVes
● | | |
| SpinalCanal
● | | |
| SpinalCord
● | | |
| Spleen
● | | |
| Stomach
● | | |
| Trachea
● | | |
| UteroCervix
● | | |
| V_Brachioceph_L
● | | |
| V_Brachioceph_R
● | | |
| V_Jugular_L
● | | |
| V_Jugular_R
● | | |
| V_Venacava_I
● | | |
| V_Venacava_S
● | | |
| VB
● | | |
| VB_C1
● | | |
| VB_C2
● | | |
| VB_C3
● | | |
| VB_C4
● | | |
| VB_C5
● | | |
| VB_C6
● | | |
| VB_C7
● | | |
| VB_L1
● | | |
| VB_L2
● | | |
| VB_L3
● | | |
| VB_L4
● | | |
| VB_L5
● | | |
| VB_T01
● | | |
| VB_T02
● | | |
| VB_T03
● | | |
| | VB_T05 VB_T06 VB_T07 VB_T08 VB_T09 VB_T10 VB_T11 VB_T12 Ventricle_L Ventricle_R MR Models: A_Pud_Int_L A_Pud_Int_R Bladder Bladder_Trigone Brain Brainstem Cerebellum Colon_Sigmoid External_Pelvis Eye_L Eye_R Femur_L Femur_R Gind_Prostate Hippocampus_L Hippocampus_R Hypo_True Hypothalamus Lens_L Lens_R NVB_L NVB_R OpticChiasm OpticNrv_L OpticNrv_R OpticTract_L OpticTract_R PenileBulb Pituitary Prostate Rectal_Spacer Rectum SeminalVes SpinalCord_Cerv Urethra | |
| Computer platform & Operating | Windows based .NET front-end application that also serves as agent Uploader supporting Microsoft Windows | Windows based .NET front-end application that also serves as agent Uploader |

8

9

10

11

12

13

System
10 (64-bit) and Microsoft Windows Server 2016.
Cloud-based Server based automatic contouring application compatible with Linux.
Windows python-based automatic contouring application supporting Microsoft Windows 10 (64-bit) and Microsoft Windows Server 2016.supporting Microsoft Windows 10 (64-bit) and Microsoft Windows Server 2016.
Cloud-based Server based automatic contouring application compatible with Linux.
Windows python-based automatic contouring application supporting Microsoft Windows 10 (64-bit) and Microsoft Windows Server 2016.

As shown in Table 3, almost all technological characteristics are either substantially equivalent or a subset of the Predicate Device's technological characteristics.

5.7. Discussion of differences

Minor differences

The following minor differences exist, but do not represent any significant additional risks or decreased effectiveness for the device for its intended use:

  • New CT Models:
    Compared with the Predicate Device, AutoContour Model RADAC V4 supports contouring 77 new models on CT images (the new models are listed below). The addition of these models does not represent a significant deviation from the intended use and operation of AutoContour, nor does it introduce a new significant unmitigated risk, because:

    (a) a very similar CNN architecture was used to train these new CT models;
    (b) all new models passed the same DSC test protocol criteria that was applied to the predicate device models for similar structure sizes; and
    (c) the same risk mitigations that have been applied to the predicate device models have also been applied to all new models.

  • A_Brachiocephis
  • A_Carotid_L
  • A_Carotid_R
  • A_Coronary_R
  • A_Subclavian_L
  • A_Subclavian_R
  • Atrium_L
  • Atrium_R
  • Bone_Hyoid
  • Bone_Teeth
  • Breast_Prone
  • Clavicle_L
  • Clavicle_R
  • Dental_Artifact
  • Foley_Balloon
  • HDR_Bladder
  • HDR_Bowel
  • HDR_Rectum
  • Heart_Prone
  • Iliac_Int_L
  • Iliac_Int_R
  • Iliac_L
  • Iliac_R
  • LN_Ax_L1_ESTRO_L
  • LN_Ax_L1_ESTRO_R
  • LN_Ax_L2_ESTRO_L
  • LN_Ax_L2_ESTRO_R
  • LN_Ax_L3_ESTRO_L
  • LN_Ax_L3_ESTRO_R
  • LN_InPec_ESTRO_L
  • LN_InPec_ESTRO_R
  • LN_Neck_IB_L
  • LN_Neck_IB_R
  • LN_Pelvics_F
  • LN_Post_Neck_L
  • LN_Post_Neck_R
  • LN_Presacral
  • LN_Sclav_ESTRO_L
  • LN_Sclav_ESTRO_R
  • Musc_Iliopsoas_L
  • Musc_Iliopsoas_R
  • Myocardium
  • Nipple_Prone
  • Pharynx
  • ProstateBed
  • Rib_R
  • Rib01_L
  • Rib01_R
  • Rib02_L
  • Rib02_R
  • Rib03_L
  • Rib03_R
  • Rib04_L
  • Rib04_R
  • Rib05_L
  • Rib05_R
  • Rib06_L
  • Rib06_R
  • Rib07_L
  • Rib07_R
  • Rib08_L
  • Rib08_R
  • Rib09_L
  • Rib09_R
  • Rib10_L
  • Rib10_R
  • Rib11_L
  • Rib11_R
  • Rib12_L
  • Rib12_R
  • SacralPlex_L
  • SacralPlex_R
  • V_Brachioceph_L
  • V_Brachioceph_R
  • V_Jugular_L
  • V_Jugular_R
  • Ventricle_L
  • Ventricle_R

  • New MR Models:
    Compared with the Predicate Device, AutoContour Model RADAC V4 supports contouring 18 new models on MR images (the new models are listed below). The addition of these models does not represent a significant deviation from the intended use and operation of AutoContour, nor does it introduce a new significant unmitigated risk, because:

    (a) a very similar CNN architecture was used to train these new MR models;
    (b) all new models passed the same DSC test protocol criteria that was applied to the predicate device models for similar structure sizes; and
    (c) the same risk mitigations that have been applied to the predicate device models have also been applied to all new models.

  • A_Pud_Int_L
  • A_Pud_Int_R
  • Bladder
  • Bladder_Trigone
  • Brain
  • Colon_Sigmoid
  • External_Pelvis
  • Femur_L
  • Femur_R
  • Lens_L
  • Lens_R
  • NVB_L
  • NVB_R
  • PenileBulb
  • Rectal_Spacer
  • Rectum
  • SpinalCord_Cerv
  • Urethra
  • New DICOM Outputs:

AutoContour Model RADAC V4 now supports the export of the Deformable Registration and Deformed Dose to DICOM, so that these registrations and dose grids can be reviewed as needed in outside Treatment Planning Systems and evaluated for accuracy within independent Registration QA Systems. This minor difference does not represent a decrease in safety or effectiveness relative to the Predicate Device because:

  • The ability to generate Deformable Registration and Dose files was previously supported within the predicate device (AutoContour RADAC V2 and V3) for the purposes of structure transfer and dose-summation evaluation of previously treated plans in ClearCheck.
  • The same risk mitigations present in the predicate device (i.e., registration approval and review tools) are present in AutoContour RADAC V4.
  • Users are provided with the option to validate the appropriateness of the AutoContour Deformable Registration algorithm within independent review platforms (e.g., a Treatment Planning System).

5.8. Performance Data

The following performance data were provided in support of the substantial equivalence determination.

Sterilization & Shelf-life Testing

AutoContour is a pure software device that does not come into contact with the patient, so it is not supplied sterile. As a pure software device, AutoContour also has no shelf life.

Biocompatibility

AutoContour is a pure software device and does not come in contact with the patient.

Electrical safety and electromagnetic compatibility (EMC)


AutoContour is a pure software device, hence no Electromagnetic Compatibility and Electrical Safety testing was conducted for the Subject Device.

Software Verification and Validation Testing

Summary

As with the Predicate Device, no clinical trials were performed for AutoContour Model RADAC V4. Non-clinical tests were performed according to Radformation's AutoContour Complete Test Protocol and Report, which demonstrates that AutoContour Model RADAC V4 performs as intended per its indications for use. Further tests were performed on datasets independent of the training and validation sets in order to validate the generalizability of the machine learning models.

Description of Changes to Test Protocol

There were no changes to the testing protocol between AutoContour RADAC V3 and RADAC V4.

Testing Summary

Mean Dice Similarity Coefficient (DSC) was used to validate the accuracy of structure model outputs when tested on image data sequestered from the original training data population. The test datasets were independent of those used for training and consisted of approximately 10% of the number of training image sets used as input for the model. For CT structure models there were an average of 341 training and 54 testing image sets. CT training images were gathered from 4 institutions in 2 countries, the United States and Switzerland.

Ground truth for each test data set was generated manually, using consensus (NRG/RTOG) guidelines as appropriate, by three clinically experienced experts: 2 radiation therapy physicists and 1 radiation dosimetrist.

Structure models were categorized into three size categories, as DSC metrics can be sensitive to structure volume. A structure passed initial validation if the mean DSC exceeded 0.8 for Large volume structures (e.g., Bladder, Spleen), 0.65 for Medium volume structures (e.g., Gallbladder, Duodenum), and 0.5 for Small structures (e.g., Cornea, Retina). For CT structure models, the Large, Medium, and Small categories resulted in mean DSCs of 0.92+/-0.06, 0.85+/-0.09, and 0.81+/-0.12, respectively. A full summary of the CT structure DSC results is available below:
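The size-stratified pass criteria described above can be sketched as follows. This is a minimal illustration, not the device's implementation: the voxel-set Dice formulation and the `dice`/`passes` helper names are assumptions for demonstration only.

```python
# Sketch of the size-stratified DSC pass criteria (illustrative only).
# Thresholds come from the text: 0.8 Large, 0.65 Medium, 0.5 Small.
PASS_CRITERIA = {"Large": 0.80, "Medium": 0.65, "Small": 0.50}

def dice(a, b):
    """Dice Similarity Coefficient between two sets of voxel indices."""
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0  # two empty contours agree trivially
    return 2 * len(a & b) / (len(a) + len(b))

def passes(mean_dsc, size):
    """Apply the size-dependent threshold to a model's mean DSC."""
    return mean_dsc >= PASS_CRITERIA[size]
```

For example, two three-voxel contours sharing two voxels give DSC = 2*2/(3+3) ≈ 0.667, which would pass the Medium but not the Large criterion.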

Table 4: CT Training Data Results for AutoContour Model RADAC V4
| CT Structure | Size | Pass Criteria | # of Training Sets | # of Testing Sets | DSC (Avg) | DSC Std Dev | Lower Bound 95% Confidence Interval |
|---|---|---|---|---|---|---|---|
| A_Brachiocephis | Small | 0.50 | 388 | 97 | 0.88 | 0.16 | 0.6168 |
| A_Carotid_L | Medium | 0.65 | 328 | 83 | 0.79 | 0.13 | 0.57615 |
| A_Carotid_R | Medium | 0.65 | 328 | 83 | 0.79 | 0.13 | 0.57615 |
| A_Coronary_R | Small | 0.50 | 408 | 116 | 0.56 | 0.09 | 0.41195 |
| A_Subclavian_L | Small | 0.50 | 388 | 97 | 0.86 | 0.17 | 0.58035 |
| A_Subclavian_R | Small | 0.50 | 388 | 97 | 0.89 | 0.14 | 0.6597 |
| Atrium_L | Medium | 0.65 | 1082 | 65 | 0.92 | 0.1 | 0.7555 |
| Atrium_R | Medium | 0.65 | 1082 | 65 | 0.89 | 0.13 | 0.67615 |
| Bone_Hyoid | Small | 0.50 | 305 | 77 | 0.82 | 0.03 | 0.77065 |
| Bone_Teeth | Medium | 0.65 | 340 | 76 | 0.88 | 0.02 | 0.8471 |
| Breast_Prone | Large | 0.80 | 245 | 63 | 0.93 | 0.04 | 0.8642 |
| Clavicle_L | Medium | 0.65 | 1082 | 65 | 0.95 | 0.02 | 0.9171 |
| Clavicle_R | Medium | 0.65 | 1082 | 65 | 0.95 | 0.01 | 0.93355 |
| Dental_Artifact | Medium | 0.65 | 342 | 86 | 0.76 | 0.1 | 0.5955 |
| Foley_Balloon | Small | 0.50 | 36 | 10 | 0.78 | 0.16 | 0.5168 |
| HDR_Bladder | Medium | 0.65 | 383 | 96 | 0.93 | 0.08 | 0.7984 |
| HDR_Bowel | Medium | 0.65 | 56 | 15 | 0.77 | 0.26 | 0.3423 |
| HDR_Rectum | Medium | 0.65 | 131 | 33 | 0.86 | 0.04 | 0.7942 |
| Heart_Prone | Large | 0.80 | 308 | 78 | 0.97 | 0.01 | 0.95355 |
| Iliac_Int_L | Medium | 0.65 | 160 | 40 | 0.66 | 0.28 | 0.1994 |
| Iliac_Int_R | Medium | 0.65 | 160 | 40 | 0.66 | 0.28 | 0.1994 |
| Iliac_L | Medium | 0.65 | 1082 | 65 | 0.83 | 0.06 | 0.7313 |
| Iliac_R | Medium | 0.65 | 1082 | 65 | 0.81 | 0.08 | 0.6784 |
| LN_Ax_L1_ESTRO_L | Medium | 0.65 | 82 | 21 | 0.8 | 0.15 | 0.55325 |
| LN_Ax_L1_ESTRO_R | Medium | 0.65 | 82 | 21 | 0.8 | 0.15 | 0.55325 |
| LN_Ax_L2_ESTRO_L | Medium | 0.65 | 82 | 21 | 0.8 | 0.15 | 0.55325 |
| LN_Ax_L2_ESTRO_R | Medium | 0.65 | 82 | 21 | 0.8 | 0.15 | 0.55325 |
| LN_Ax_L3_ESTRO_L | Medium | 0.65 | 82 | 21 | 0.8 | 0.15 | 0.55325 |
| LN_Ax_L3_ESTRO_R | Medium | 0.65 | 82 | 21 | 0.8 | 0.15 | 0.55325 |
| LN_InPec_ESTRO_L | Medium | 0.65 | 82 | 21 | 0.8 | 0.15 | 0.55325 |
| LN_InPec_ESTRO_R | Medium | 0.65 | 82 | 21 | 0.8 | 0.15 | 0.55325 |
| LN_Neck_IB_L | Medium | 0.65 | 252 | 64 | 0.88 | 0.05 | 0.79775 |
| LN_Neck_IB_R | Medium | 0.65 | 252 | 64 | 0.88 | 0.05 | 0.79775 |
| LN_Pelvics_F | Large | 0.80 | 82 | 21 | 0.89 | 0.02 | 0.8571 |
| LN_Post_Neck_L | Medium | 0.65 | 240 | 60 | 0.83 | 0.05 | 0.74775 |
| LN_Post_Neck_R | Medium | 0.65 | 240 | 60 | 0.83 | 0.05 | 0.74775 |
| LN_Presacral | Medium | 0.65 | 191 | 48 | 0.78 | 0.16 | 0.5168 |
| LN_Sclav_ESTRO_L | Medium | 0.65 | 82 | 21 | 0.8 | 0.15 | 0.55325 |
| LN_Sclav_ESTRO_R | Medium | 0.65 | 82 | 21 | 0.8 | 0.15 | 0.55325 |
| Musc_Iliopsoas_L | Large | 0.80 | 1082 | 65 | 0.94 | 0.08 | 0.8084 |
| Musc_Iliopsoas_R | Large | 0.80 | 1082 | 65 | 0.94 | 0.14 | 0.7097 |
| Myocardium | Medium | 0.65 | 1082 | 65 | 0.9 | 0.05 | 0.81775 |
| Nipple_Prone | Small | 0.50 | 247 | 62 | 0.72 | 0.1 | 0.5555 |
| Pharynx | Medium | 0.65 | 57 | 15 | 0.9 | 0.02 | 0.8671 |
| ProstateBed | Medium | 0.65 | 133 | 34 | 0.87 | 0.04 | 0.8042 |
| Rib01_L | Medium | 0.65 | 148 | 37 | 0.74 | 0.3 | 0.2465 |
| Rib01_R | Medium | 0.65 | 148 | 37 | 0.89 | 0.07 | 0.77485 |
| Rib02_L | Medium | 0.65 | 148 | 37 | 0.9 | 0.06 | 0.8013 |
| Rib02_R | Medium | 0.65 | 148 | 37 | 0.9 | 0.06 | 0.8013 |
| Rib03_L | Medium | 0.65 | 148 | 37 | 0.9 | 0.07 | 0.78485 |
| Rib03_R | Medium | 0.65 | 148 | 37 | 0.89 | 0.07 | 0.77485 |
| Rib04_L | Medium | 0.65 | 148 | 37 | 0.87 | 0.16 | 0.6068 |
| Rib04_R | Medium | 0.65 | 148 | 37 | 0.91 | 0.07 | 0.79485 |
| Rib05_L | Medium | 0.65 | 148 | 37 | 0.9 | 0.07 | 0.78485 |
| Rib05_R | Medium | 0.65 | 148 | 37 | 0.88 | 0.1 | 0.7155 |
| Rib06_L | Medium | 0.65 | 148 | 37 | 0.92 | 0.06 | 0.8213 |
| Rib06_R | Medium | 0.65 | 148 | 37 | 0.92 | 0.06 | 0.8213 |
| Rib07_L | Medium | 0.65 | 148 | 37 | 0.92 | 0.06 | 0.8213 |
| Rib07_R | Medium | 0.65 | 148 | 37 | 0.92 | 0.07 | 0.80485 |
| Rib08_L | Medium | 0.65 | 148 | 37 | 0.91 | 0.06 | 0.8113 |
| Rib08_R | Medium | 0.65 | 148 | 37 | 0.89 | 0.17 | 0.61035 |
| Rib09_L | Medium | 0.65 | 148 | 37 | 0.92 | 0.05 | 0.83775 |
| Rib09_R | Medium | 0.65 | 148 | 37 | 0.91 | 0.06 | 0.8113 |
| Rib10_L | Medium | 0.65 | 148 | 37 | 0.91 | 0.06 | 0.8113 |
| Rib10_R | Medium | 0.65 | 148 | 37 | 0.91 | 0.06 | 0.8113 |
| Rib11_L | Medium | 0.65 | 148 | 37 | 0.9 | 0.06 | 0.8013 |
| Rib11_R | Medium | 0.65 | 148 | 37 | 0.9 | 0.08 | 0.7684 |
| Rib12_L | Small | 0.50 | 148 | 37 | 0.89 | 0.08 | 0.7584 |
| Rib12_R | Small | 0.50 | 148 | 37 | 0.9 | 0.07 | 0.78485 |
| SacralPlex_L | Medium | 0.65 | 326 | 83 | 0.75 | 0.06 | 0.6513 |
| SacralPlex_R | Medium | 0.65 | 326 | 83 | 0.75 | 0.06 | 0.6513 |
| V_Brachioceph_L | Medium | 0.65 | 388 | 97 | 0.91 | 0.1 | 0.7455 |
| V_Brachioceph_R | Small | 0.50 | 388 | 97 | 0.86 | 0.19 | 0.54745 |
| V_Jugular_L | Medium | 0.65 | 165 | 42 | 0.76 | 0.08 | 0.6284 |
| V_Jugular_R | Medium | 0.65 | 165 | 42 | 0.76 | 0.08 | 0.6284 |
| Ventricle_L | Medium | 0.65 | 1082 | 65 | 0.95 | 0.04 | 0.8842 |
| Ventricle_R | Medium | 0.65 | 1082 | 65 | 0.97 | 0.07 | 0.85485 |
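The tabulated lower bounds are numerically consistent with a one-sided 95% lower confidence bound under a normal approximation, LB = mean - 1.645 * std. This z-based formula is an inference from the tabulated values, not a method stated in the submission; the sketch below simply reproduces two rows of Table 4 under that assumption.

```python
# Assumed relationship between the tabulated columns (inferred, not stated):
# lower bound = mean DSC - 1.645 * std dev (one-sided 95%, normal approx.)
def lower_bound_95(mean_dsc, std_dsc):
    """One-sided 95% lower bound under a normal approximation (z = 1.645)."""
    return mean_dsc - 1.645 * std_dsc

# e.g., the A_Brachiocephis row: 0.88 - 1.645 * 0.16 = 0.6168,
# and the Atrium_L row: 0.92 - 1.645 * 0.1 = 0.7555.
```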


Additional external clinical testing was performed in order to validate the accuracy of the models on image sets independent of the training datasets. Both AutoContour contours and manually added ground truth contours, following the same structure guidelines used for structure model training, were added to the image sets.

Table 5: CT External Clinical Dataset References
| Model Group | Data Source ID | Data Citation |
|---|---|---|
| CT Pelvis | TCIA - Pelvic-Ref | Afua A. Yorke, Gary C. McDonald, David Solis Jr., Thomas Guerrero. (2019). Pelvic Reference Data. The Cancer Imaging Archive. DOI: 10.7937/TCIA.2019.woskq500 |
| CT Head and Neck | TCIA - Head-Neck-PET-CT | Martin Vallières, Emily Kay-Rivest, Léo Jean Perrin, Xavier Liem, Christophe Furstoss, Nader Khaouam, Phuc Félix Nguyen-Tan, Chang-Shu Wang, Khalil Sultanem. (2017). Data from Head-Neck-PET-CT. The Cancer Imaging Archive. doi: 10.7937/K9/TCIA.2017.8oje5q00 |
| CT Abdomen | TCIA - Pancreas-CT-CB | Hong, J., Reyngold, M., Crane, C., Cuaron, J., Hajj, C., Mann, J., Zinovoy, M., Yorke, E., LoCastro, E., Apte, A. P., & Mageras, G. (2021). Breath-hold CT and cone-beam CT images with expert manual organ-at-risk segmentations from radiation treatments of locally advanced pancreatic cancer [Data set]. The Cancer Imaging Archive. https://doi.org/10.7937/TCIA.ESHQ-4D90 |
| CT Thorax | TCIA - NSCLC | Aerts, H. J. W. L., Wee, L., Rios Velazquez, E., Leijenaar, R. T. H., Parmar, C., Grossmann, P., Carvalho, S., Bussink, J., Monshouwer, R., Haibe-Kains, B., Rietveld, D., Hoebers, F., Rietbergen, M. M., Leemans, C. R., Dekker, A., Quackenbush, J., Gillies, R. J., Lambin, P. (2019). Data From NSCLC-Radiomics [Data set]. The Cancer Imaging Archive. https://doi.org/10.7937/K9/TCIA.2015.PF0M9REI |
| CT Thorax | TCIA - LCTSC | Yang, J., Sharp, G., Veeraraghavan, H., Van Elmpt, W., Dekker, A., Lustberg, T., & Gooding, M. (2017). Data from Lung CT Segmentation Challenge (Version 3) [Data set]. The Cancer Imaging Archive. https://doi.org/10.7937/K9/TCIA.2017.3R3FVZ08 |
| CT Thorax Prone Female | TCIA - QIN-BREAST and Prone Thorax | Li, X., Abramson, R. G., Arlinghaus, L. R., Chakravarthy, A. B., Abramson, V. G., Sanders, M., & Yankeelov, T. E. (2016). Data From QIN-BREAST (Version 2) [Data set]. The Cancer Imaging Archive. https://doi.org/10.7937/K9/TCIA.2016.21JUEBH0; N/A - Testing data was shared from several institutions |
| CT HDR Female | Female HDR Pelvis | N/A - Testing data was shared from 2 different institutions based in the United States |
| CT Prostatectomy | Pelvis ProstateBed | N/A - Testing data was shared from 1 institution based in the United States |

DSC values were calculated between ground truth contour data and AutoContour structures and rated on the same DSC passing criteria used for the training DSC validation. All structures passed the minimum DSC criteria, with mean DSCs of 0.76+/-0.09 for small, 0.84+/-0.09 for medium, and 0.94+/-0.02 for large structures. Additionally, the qualitative clinical appropriateness of AutoContour structures generated on these scans was graded by clinical experts. AutoContour structures were graded on a scale from 1 to 5, where 5 refers to a contour requiring no additional edits and 1 refers to a score for which full manual re-contouring of the structure would be required. An average score >= 3 was used to determine whether a structure model would ultimately be beneficial clinically. An average rating of 4.57 was found across all CT structure models, demonstrating that only minor edits would be required in order to make the structure models acceptable for clinical use.

Table 6: CT External Reviewer Results for AutoContour Model RADAC V4
| CT Structure | Size | Pass Criteria | # of Testing Sets | Average DSC | Average DSC Std. Dev | Lower Bound 95% Confidence Interval | External Reviewer Average Rating (1-5) |
|---|---|---|---|---|---|---|---|
| A_Aorta (Update) | Large | 0.8 | 39 | 0.95 | 0.03 | 0.89582266 | 4.6 |
| A_Aorta_Asc (Update) | Medium | 0.65 | 39 | 0.94 | 0.06 | 0.842291865 | 4.6 |
| A_Brachiocephis | Small | 0.5 | 40 | 0.83 | 0.09 | 0.68945428 | 4.6 |
| A_Carotid_L | Medium | 0.65 | 82 | 0.79 | 0.09 | 0.64324782 | 4.6 |
| A_Carotid_R | Medium | 0.65 | 82 | 0.77 | 0.12 | 0.563546295 | 4.55 |
| A_LAD (Update) | Small | 0.5 | 41 | 0.62 | 0.099 | 0.453008335 | 4.8 |
| A_Coronary_R | Small | 0.5 | 40 | 0.55 | 0.13 | 0.32397622 | 4.4 |
| A_Subclavian_L | Small | 0.5 | 40 | 0.86 | 0.05 | 0.76936995 | 4.7 |
| A_Subclavian_R | Small | 0.5 | 40 | 0.83 | 0.08 | 0.70220291 | 4.9 |
| Atrium_L | Medium | 0.65 | 20 | 0.96 | 0.01 | 0.93505483 | 5 |
| Atrium_R | Medium | 0.65 | 20 | 0.95 | 0.02 | 0.92325817 | 4.3 |
| Bone_Hyoid | Small | 0.5 | 23 | 0.80 | 0.04 | 0.728027755 | 4.3 |
| Bone_Teeth | Medium | 0.65 | 22 | 0.85 | 0.03 | 0.801330455 | 4.3 |
| Breast_Prone | Large | 0.8 | 20 | 0.93 | 0.02 | 0.89735428 | 4.4 |
| Clavicle_L | Medium | 0.65 | 40 | 0.92 | 0.02 | 0.88753772 | 4.5 |
| Clavicle_R | Medium | 0.65 | 40 | 0.92 | 0.01 | 0.8942826 | 4.5 |
| Dental_Artifact | Medium | 0.65 | 20 | 0.65 | 0.17 | 0.375672405 | 4.3 |
| Esophagus (Update) | Medium | 0.65 | 62 | 0.85 | 0.04 | 0.777605135 | 4.7 |
| Foley_Balloon | Small | 0.5 | 13 | 0.91 | 0.06 | 0.82063068 | 4.7 |
| Gallbladder (Update) | Medium | 0.65 | 22 | 0.87 | 0.05 | 0.785888125 | 4.8 |
| HDR_Bladder | Medium | 0.65 | 21 | 0.91 | 0.13 | 0.698258845 | 4.8 |
| HDR_Bowel | Medium | 0.65 | 21 | 0.93 | 0.04 | 0.863923615 | 4.4 |
| HDR_Rectum | Medium | 0.65 | 21 | 0.90 | 0.05 | 0.81869892 | 4.6 |
| Heart_Prone | Large | 0.8 | 21 | 0.95 | 0.02 | 0.90959547 | 4.9 |
| Iliac_Int_L | Medium | 0.65 | 41 | 0.80 | 0.06 | 0.699809155 | 4.75 |
| Iliac_Int_R | Medium | 0.65 | 41 | 0.80 | 0.09 | 0.651761885 | 4.7 |
| Iliac_L | Medium | 0.65 | 41 | 0.89 | 0.07 | 0.7751705 | 4.5 |
| Iliac_R | Medium | 0.65 | 41 | 0.85 | 0.08 | 0.726148045 | 4.45 |
| Liver (Update) | Large | 0.8 | 36 | 0.97 | 0.01 | 0.955394055 | 4.9 |
| LN_Ax_L1_ESTRO_L | Medium | 0.65 | 37 | 0.91 | 0.06 | 0.80772404 | 4.5 |
| LN_Ax_L1_ESTRO_R | Medium | 0.65 | 34 | 0.90 | 0.06 | 0.79726031 | 4.5 |
| LN_Ax_L2_ESTRO_L | Medium | 0.65 | 37 | 0.94 | 0.05 | 0.853433065 | 4.5 |
| LN_Ax_L2_ESTRO_R | Medium | 0.65 | 34 | 0.93 | 0.04 | 0.86185477 | 4.5 |
| LN_Ax_L3_ESTRO_L | Medium | 0.65 | 37 | 0.94 | 0.05 | 0.84602869 | 4.3 |
| LN_Ax_L3_ESTRO_R | Medium | 0.65 | 34 | 0.92 | 0.05 | 0.848240015 | 4.3 |
| LN_InPec_ESTRO_L | Medium | 0.65 | 37 | 0.90 | 0.06 | 0.791700585 | 4.5 |
| LN_InPec_ESTRO_R | Medium | 0.65 | 34 | 0.90 | 0.06 | 0.798294905 | 4.5 |
| LN_Neck_IB_L | Medium | 0.65 | 23 | 0.85 | 0.03 | 0.80514656 | 4.1 |
| LN_Neck_IB_R | Medium | 0.65 | 23 | 0.86 | 0.02 | 0.81966185 | 4.1 |
| LN_Pelvics_F | Large | 0.8 | 20 | 0.82 | 0.04 | 0.759259815 | 3.8 |
| LN_Post_Neck_L | Medium | 0.65 | 20 | 0.81 | 0.11 | 0.625135715 | 4.1 |
| LN_Post_Neck_R | Medium | 0.65 | 20 | 0.80 | 0.11 | 0.627514085 | 4 |
| LN_Presacral | Medium | 0.65 | 40 | 0.82 | 0.07 | 0.695967485 | 4.45 |
| LN_Sclav_ESTRO_L | Medium | 0.65 | 37 | 0.88 | 0.07 | 0.75887216 | 4.5 |
| LN_Sclav_ESTRO_R | Medium | 0.65 | 34 | 0.86 | 0.07 | 0.73523028 | 4.5 |
| Lung_L (Update) | Large | 0.8 | 48 | 0.98 | 0.01 | 0.97306078 | 4.95 |
| Lung_R (Update) | Large | 0.8 | 48 | 0.99 | 0.01 | 0.980571135 | 4.9 |
| Musc_Iliopsoas_L | Large | 0.8 | 41 | 0.92 | 0.03 | 0.88255485 | 4.7 |
| Musc_Iliopsoas_R | Large | 0.8 | 41 | 0.92 | 0.02 | 0.879852935 | 4.65 |
| Myocardium | Medium | 0.65 | 20 | 0.98 | 0.01 | 0.96066337 | 4.7 |
| Nipple_Prone | Small | 0.5 | 20 | 0.57 | 0.18 | 0.28356733 | 4.8 |
| Pharynx | Medium | 0.65 | 23 | 0.83 | 0.04 | 0.76067133 | 4.6 |
| ProstateBed | Medium | 0.65 | 20 | 0.90 | 0.06 | 0.802275555 | 4.4 |
| Rib01_L | Medium | 0.65 | 40 | 0.81 | 0.09 | 0.66151844 | 4.6 |
| Rib01_R | Medium | 0.65 | 40 | 0.81 | 0.11 | 0.633076305 | 4.7 |
| Rib02_L | Medium | 0.65 | 40 | 0.83 | 0.09 | 0.685388515 | 4.7 |
| Rib02_R | Medium | 0.65 | 39 | 0.83 | 0.11 | 0.64919281 | 4.5 |
| Rib03_L | Medium | 0.65 | 40 | 0.83 | 0.12 | 0.62952629 | 4.5 |
| Rib03_R | Medium | 0.65 | 39 | 0.84 | 0.14 | 0.606939835 | 4.7 |
| Rib04_L | Medium | 0.65 | 38 | 0.85 | 0.07 | 0.73693755 | 4.7 |
| Rib04_R | Medium | 0.65 | 38 | 0.83 | 0.12 | 0.623241235 | 4.7 |
| Rib05_L | Medium | 0.65 | 37 | 0.85 | 0.14 | 0.610838085 | 4.5 |
| Rib05_R | Medium | 0.65 | 33 | 0.88 | 0.02 | 0.83649353 | 4.6 |
| Rib06_L | Medium | 0.65 | 40 | 0.86 | 0.08 | 0.724382375 | 4.5 |
| Rib06_R | Medium | 0.65 | 40 | 0.86 | 0.12 | 0.662757125 | 4.6 |
| Rib07_L | Medium | 0.65 | 40 | 0.85 | 0.14 | 0.625797495 | 4.6 |
| Rib07_R | Medium | 0.65 | 40 | 0.86 | 0.14 | 0.624501425 | 4.7 |
| Rib08_L | Medium | 0.65 | 40 | 0.85 | 0.14 | 0.62039746 | 4.5 |
| Rib08_R | Medium | 0.65 | 40 | 0.85 | 0.14 | 0.619059365 | 4.6 |
| Rib09_L | Medium | 0.65 | 40 | 0.84 | 0.14 | 0.608365215 | 4.7 |
| Rib09_R | Medium | 0.65 | 40 | 0.82 | 0.20 | 0.496094935 | 4.5 |
| Rib10_L | Medium | 0.65 | 40 | 0.82 | 0.19 | 0.505475645 | 4.7 |
| Rib10_R | Medium | 0.65 | 40 | 0.795 | 0.23 | 0.41650505 | 4.8 |
| Rib11_L | Medium | 0.65 | 39 | 0.79 | 0.19 | 0.473883335 | 4.8 |
| Rib11_R | Medium | 0.65 | 39 | 0.79 | 0.23 | 0.405756435 | 4.7 |
| Rib12_L | Small | 0.5 | 29 | 0.77 | 0.14 | 0.539397945 | 4.7 |
| Rib12_R | Small | 0.5 | 28 | 0.81 | 0.10 | 0.6521935 | 4.9 |
| SacralPlex_L | Medium | 0.65 | 61 | 0.71 | 0.07 | 0.59617898 | 4.75 |
| SacralPlex_R | Medium | 0.65 | 41 | 0.73 | 0.05 | 0.6399795 | 4.75 |
| V_Brachioceph_L | Medium | 0.65 | 40 | 0.81 | 0.15 | 0.56550252 | 4.8 |
| V_Brachioceph_R | Small | 0.5 | 40 | 0.86 | 0.06 | 0.754842185 | 4.7 |
| V_Jugular_L | Medium | 0.65 | 82 | 0.77 | 0.12 | 0.565053955 | 4.35 |
| V_Jugular_R | Medium | 0.65 | 82 | 0.78 | 0.20 | 0.62129336 | 4.35 |
| V_Venacava_S (Update) | Medium | 0.65 | 39 | 0.91 | 0.04 | 0.84602356 | 4.4 |
| Ventricle_L | Medium | 0.65 | 20 | 0.99 | 0.004 | 0.98438629 | 4.6 |
| Ventricle_R | Medium | 0.65 | 20 | 0.97 | 0.01 | 0.953399715 | 4.7 |


The MR training data set used for initial testing of the Brain models (SpinalCord_Cerv, Brain, and Lens_L/R) had an average of 149 training image sets and 45 testing image sets, acquired from The Cancer Imaging Archive GLIS-RT dataset. These data sets consisted primarily of glioblastoma and astrocytoma patients. Images were acquired on either a GE Signa HDxT (3T) or Siemens Skyra (3T) scanner and had an average slice thickness of 1 mm, in-plane resolution between 0.5-1.0 mm, and acquisition parameters of TR=2.3-8.9 ms and TE=3.0-3.2 s.

The MR training data used for initial testing of the MR Pelvis models (A_Pud_Int_L/R, Bladder, Bladder_Trigone, Colon_Sigmoid, External_Pelvis, Femur_L/R, NVB_L/R, PenileBulb, Rectal_Spacer, Rectum, and Urethra) had an average of 306 training image sets and 77 testing image sets, taken from 2 open-source datasets and one institution within the United States.

Table 7: MR Initial Testing Dataset References
| Model Group | Data Source ID | Data Citation |
|---|---|---|
| MR Brain | MR - Renown | Shusharina, N., & Bortfeld, T. (2021). Glioma Image Segmentation for Radiotherapy: RT targets, barriers to cancer spread, and organs at risk [Data set]. The Cancer Imaging Archive. https://doi.org/10.7937/TCIA.T905-ZQ20 |
| MR Pelvis | Prostate-MRI-US-Biopsy | Natarajan, S., Priester, A., Margolis, D., Huang, J., & Marks, L. (2020). Prostate MRI and Ultrasound With Pathology and Coordinates of Tracked Biopsy (Prostate-MRI-US-Biopsy) [Data set]. The Cancer Imaging Archive. DOI: 10.7937/TCIA.2020.A61IOC1A |
| MR Pelvis_2 | NYP | N/A - Testing data was shared by 1 institution |
| MR Pelvis_3 | ProstateX | Litjens, G., Debats, O., Barentsz, J., Karssemeijer, N., & Huisman, H. (2017). SPIE-AAPM PROSTATEx Challenge Data (Version 2) [Data set]. The Cancer Imaging Archive. https://doi.org/10.7937/K9TCIA.2017.MURS5CL |

Datasets used for testing were removed from the training dataset pool before model training began, and used exclusively for testing.
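The sequestration described above (test sets removed from the pool before any training) can be sketched as follows. The function name, fraction, and seeding are illustrative assumptions, not details from the submission:

```python
import random

def sequester_split(dataset_ids, test_fraction=0.1, seed=0):
    """Remove a held-out test set from the dataset pool before model
    training begins; the held-out sets are used exclusively for testing.
    Illustrative sketch only (names and defaults are assumptions)."""
    rng = random.Random(seed)
    ids = list(dataset_ids)
    rng.shuffle(ids)
    n_test = max(1, round(test_fraction * len(ids)))
    # (training pool, sequestered test sets) -- disjoint by construction
    return ids[n_test:], ids[:n_test]
```

Because the split happens on dataset identifiers before training, no test image can leak into the training pool.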

Ground truth for each test data set was generated manually, using consensus (NRG/RTOG) guidelines as appropriate, by three clinically experienced experts: 2 radiation therapy physicists and 1 radiation dosimetrist. For MR structure models, mean training DSCs of 0.96+/-0.03 for large models, 0.84+/-0.07 for medium models, and 0.74+/-0.09 for small models were found.

Table 8: MR Training Data Results for AutoContour Model RADAC V4
| MR Model | Size | Pass Criteria | DSC (Avg) | DSC Std Dev (Avg) | Lower Bound 95% Confidence Interval |
|---|---|---|---|---|---|
| A_Pud_Int_L | Small | 0.5 | 0.69 | 0.08 | 0.5584 |
| A_Pud_Int_R | Small | 0.5 | 0.69 | 0.08 | 0.5584 |
| Bladder | Large | 0.8 | 0.92 | 0.07 | 0.80485 |
| Bladder_Trigone | Small | 0.5 | 0.72 | 0.05 | 0.63775 |
| Brain | Large | 0.8 | 0.97 | 0 | 0.97 |
| Colon_Sigmoid | Medium | 0.65 | 0.72 | 0.18 | 0.4239 |
| External_Pelvis | Large | 0.8 | 0.99 | 0.01 | 0.97355 |
| Femur_L | Medium | 0.65 | 0.93 | 0.03 | 0.88065 |
| Femur_R | Medium | 0.65 | 0.93 | 0.03 | 0.88065 |
| Lens_L | Small | 0.5 | 0.82 | 0.09 | 0.67195 |
| Lens_R | Small | 0.5 | 0.82 | 0.09 | 0.67195 |
| NVB_L | Small | 0.5 | 0.61 | 0.08 | 0.4784 |
| NVB_R | Small | 0.5 | 0.61 | 0.08 | 0.4784 |
| PenileBulb | Small | 0.5 | 0.79 | 0.12 | 0.5926 |
| Rectal_Spacer | Medium | 0.65 | 0.84 | 0.12 | 0.6426 |
| Rectum | Medium | 0.65 | 0.88 | 0.1 | 0.7155 |
| SpinalCord_Cerv | Small | 0.5 | 0.82 | 0.06 | 0.7213 |
| Urethra | Medium | 0.65 | 0.68 | 0.09 | 0.53195 |

Additional external clinical testing was performed in order to validate the accuracy of the models on image sets independent of the training datasets.

Table 9: MR External Clinical Dataset References
| Model Group | Data Source ID | Data Citation |
|---|---|---|
| MR Brain | MR - Renown | N/A |
| MR Pelvis | Gold Atlas Pelvis | Nyholm, Tufve, Stina Svensson, Sebastian Andersson, Joakim Jonsson, Maja Sohlin, Christian Gustafsson, Elisabeth Kjellén, et al. 2018. "MR and CT Data with Multi Observer Delineations of Organs in the Pelvic Area - Part of the Gold Atlas Project." Medical Physics 12 (10): 3218-21. doi:10.1002/mp.12748 |
| MR Pelvis_2 | SynthRad | Thummerer A, van der Bijl E, Galapon Jr A, Verhoeff JJ, Langendijk JA, Both S, van den Berg CAT, Maspero M. 2023. SynthRAD2023 Grand Challenge dataset: Generating synthetic CT for radiotherapy. Medical Physics, 50(7), 4664-4674. https://doi.org/10.1002/mp.16529 |
| MRLinac Pelvis | MR Linac | N/A - Testing data was shared by 2 institutions utilizing MR Linacs for image acquisitions |

For the Brain models, datasets containing 20 MR T1 Ax post (BRAVO) image scans acquired with a GE MR750w scanner were obtained via data-use agreement from a clinical partner. Images had an average slice thickness of 1.6 mm, in-plane resolution of 0.94 mm, and acquisition parameters of TR=5.98 ms and TE=96.8 s. Data for testing of the MR Pelvis structure models were acquired from 2 publicly available datasets, which contained images of patients with prostate or rectal cancer, as well as 1 dataset shared from 2 institutions utilizing an MR Linac. Various scanner models and acquisition settings were used.

DSC values were calculated between ground truth contour data and AutoContour structures and rated on the same DSC passing criteria as was used for the training DSC validation. All structures passed the minimum DSC criteria, with mean DSCs of 0.61+/-0.14 for small, 0.84+/-0.09 for medium, and 0.80+/-0.09 for large structures. Additionally, the qualitative clinical appropriateness of AutoContour structures generated on these scans was graded by clinical experts. AutoContour structures were graded on a scale from 1 to 5, where 5 refers to a contour requiring no additional edits and 1 refers to a score for which full manual re-contouring of the structure would be required. An average score >= 3 was used to determine whether a structure model would ultimately be beneficial clinically. An average rating of 4.6 was found across all MR structure models, demonstrating that only minor edits would be required in order to make the structure models acceptable for clinical use.

Table 10: MR External Reviewer Results for AutoContour Model RADAC V4
| MR Model | Size | Pass Criteria | # of External Test Data Sets | Average DSC | Average DSC Std. Dev | Lower Bound 95% Confidence Interval | External Reviewer Average Rating (1-5) |
|---|---|---|---|---|---|---|---|
| A_Pud_Int_L | Small | 0.5 | 45 | 0.57 | 0.11 | 0.39913191 | 4.9 |
| A_Pud_Int_R | Small | 0.5 | 45 | 0.58 | 0.10 | 0.47111123 | 4.9 |
| Bladder | Large | 0.8 | 45 | 0.93 | 0.06 | 0.824646915 | 4.8 |
| Bladder_Trigone | Medium | 0.5 | 45 | 0.59 | 0.13 | 0.379483905 | 4.6 |
| Brain | Large | 0.8 | 20 | 0.97 | 0.01 | 0.959101725 | 4.6 |
| Colon_Sigmoid | Medium | 0.65 | 45 | 0.74 | 0.21 | 0.406063225 | 4.5 |
| External_Pelvis | Large | 0.8 | 6 | 0.99 | 0.001 | 0.98960849 | 5 |
| Femur_L | Medium | 0.65 | 44 | 0.94 | 0.02 | 0.90950333 | 4.6 |
| Femur_R | Medium | 0.65 | 45 | 0.95 | 0.01 | 0.921129565 | 4.5 |
| Glnd_Prostate (Update) | Medium | 0.65 | 45 | 0.83 | 0.07 | 0.71662536 | 4.8 |
| Lens_L | Small | 0.5 | 18 | 0.72 | 0.14 | 0.487633895 | 4.6 |
| Lens_R | Small | 0.5 | 19 | 0.63 | 0.21 | 0.268851055 | 4.6 |
| NVB_L | Small | 0.5 | 45 | 0.54 | 0.12 | 0.34080495 | 4.2 |
| NVB_R | Small | 0.5 | 45 | 0.50 | 0.12 | 0.305008105 | 4.2 |
| PenileBulb | Small | 0.5 | 45 | 0.71 | 0.18 | 0.40456992 | 4.8 |
| Prostate (Update) | Medium | 0.65 | 45 | 0.86 | 0.04 | 0.7887511 | 4.8 |
| Rectal_Spacer | Small | 0.5 | 5 | 0.51 | 0.22 | 0.1481 | 3.9 |
| Rectum | Medium | 0.65 | 45 | 0.84 | 0.07 | 0.721510645 | 4.5 |
| SeminalVes (Update) | Medium | 0.65 | 45 | 0.69 | 0.16 | 0.43922276 | 4.6 |
| SpinalCord_Cerv | Small | 0.5 | 18 | 0.837 | 0.08 | 0.704194545 | 4.6 |
| Urethra | Small | 0.5 | 26 | 0.56 | 0.13 | 0.35090269 | 4.9 |

Validation testing of the AutoContour application demonstrated that the software meets user needs and intended uses of the application.

Mechanical and Acoustic Testing

Not Applicable (Standalone Software)

Animal Study

No animal studies were conducted using the Subject Device, AutoContour.

Clinical Studies

No clinical studies were conducted using the Subject Device, AutoContour.

5.9. Conclusion

AutoContour Model RADAC V4 is deemed substantially equivalent to the Predicate Device, AutoContour Model RADAC V3 (K220598). Verification and validation testing and the Risk Management Report demonstrate that AutoContour Model RADAC V4 is as safe and effective as the Predicate Device. The technological characteristics table demonstrates the similarity between AutoContour Model RADAC V4 and the Predicate Device and does not raise any questions regarding the safety and effectiveness of the Subject Device.