(251 days)
Yes
The document explicitly states that "The RPA uses deep learning algorithms" and mentions "spinal canal CNN," which are types of AI/ML.
No.
The device is a software tool used to plan radiotherapy treatments, not to deliver the treatment itself. It generates contours and treatment plans that must be imported into another system for review, editing, and final dose calculation.
No
This device is used to plan radiotherapy treatments and create contours and treatment plans. It is explicitly stated that "The RPA is not intended to be used as a primary treatment planning system" and that all plans must be reviewed and re-calculated. This indicates a role in treatment delivery planning rather than diagnosis.
Yes
The device is described as a "web-based contouring and radiotherapy treatment planning software tool" and its function is to create contours and treatment plans for import into a separate Treatment Planning System (TPS). While it integrates with the Eclipse TPS software, the core functionality described is purely software-based, processing CT images to generate data for another system. There is no mention of accompanying hardware or hardware components being part of the device itself.
Based on the provided information, the Radiation Planning Assistant (RPA) is not an In Vitro Diagnostic (IVD) device.
Here's why:
- IVD Definition: In Vitro Diagnostics are devices intended for use in the examination of specimens derived from the human body in order to provide information for diagnostic, monitoring, or compatibility purposes. This typically involves analyzing biological samples like blood, urine, or tissue.
- RPA's Intended Use: The RPA is used for planning radiotherapy treatments for cancer. It processes medical images (CT scans) to assist in creating contours and treatment plans. This is a therapeutic planning function, not a diagnostic one based on analyzing biological specimens.
- Device Description: The description clearly states it's a "web-based contouring and radiotherapy treatment planning software tool."
- Input: The input is CT images, not biological specimens.
The RPA is a medical device used in the field of radiation oncology for treatment planning, which falls under a different regulatory category than IVDs.
No
The letter does not state that the FDA has reviewed and approved or cleared a PCCP for this specific device.
Intended Use / Indications for Use
The Radiation Planning Assistant (RPA) is used to plan radiotherapy treatments for patients with cancers of the head and neck, cervix, breast, and metastases to the brain. The RPA is used to plan external beam irradiation with photon beams using CT images. The RPA is used to create contours and treatment plans that the user imports into their own Treatment Planning System (TPS) for review, editing, and re-calculation of the dose.
Some functions of the RPA use Eclipse 15.6. The RPA is not intended to be used as a primary treatment planning system. All automatically generated contours and plans must be imported into the user's own treatment planning system for review, edit, and final dose calculation.
Product codes
MUJ
Device Description
The Radiation Planning Assistant (RPA) is a web-based contouring and radiotherapy treatment planning software tool that incorporates the basic radiation planning functions from automated contouring, automated planning with dose optimization, and quality control checks. The system is intended for use for patients with cancer of the head and neck, cervix, breast, and metastases to the brain. The RPA system is integrated with the Eclipse Treatment Planning System v15.6 software cleared under K181145. The RPA radiation treatment planning software tool was trained against hundreds / thousands of CT Scans of normal and diseased tissues from patients receiving radiation for head and neck, cervical, breast, and whole brain at MD Anderson Cancer Center.
Mentions image processing
Not Found
Mentions AI, DNN, or ML
The RPA uses deep learning algorithms which Eclipse does not.
Devices are Substantially Equivalent. Both devices create plans that the users then edit and review for clinical acceptability prior to use through the use of AI software.
Input Imaging Modality
CT images
computerized tomography (CT) images.
Anatomical Site
head and neck, cervix, breast, and metastases to the brain.
Head / neck region, Breast region, Pelvis, GYN Liver, Chest Wall, Whole Brain.
Indicated Patient Age Range
| Treatment Site | Mean | Min | Max |
|---|---|---|---|
| Cervix VMAT | 51 | 26 | 94 |
| Cervix 3D | 50 | 26 | 84 |
| Chest Wall | 51 | 31 | 80 |
| Head and Neck | 62 | 27 | 87 |
| Whole Brain | 60 | 14 | 88 |
Intended User / Care Setting
Not Found
Description of the training set, sample size, data source, and annotation protocol
Head and Neck (Normal Tissue primary): 3,288 patients (3,495 CT scans) who received radiation therapy at MD Anderson Cancer Center between September 2004 and June 2018. Any patient who received a simulation CT scan of the head/neck region in a head-first supine position was eligible.
Head and Neck (Normal Tissue secondary): 160 patients who received radiation therapy at MD Anderson Cancer Center from 2018 to 2020. Any patient who received a simulation CT scan of the head/neck region in a head-first supine position was eligible.
Head and Neck (Lymph Node CTVs): 61 patients who received radiation therapy at MD Anderson Cancer Center between 2010 and 2019. Any patient who received a simulation CT scan of the head/neck region in a head-first supine position was eligible.
Whole Brain: The whole brain primary segmentation models used the same models as used for head and neck segmentation, as well as an additional vertebral body localization and segmentation model (Vertebral Bodies model: spinal canal CNN: 1,966, VB labeling: 803, VB segmentation: 107, from 930 MDACC patients and 355 external patients). Patients who received spinal radiotherapy for spinal metastases (3DCRT and VMAT) at MD Anderson, or for whom data was publicly available (MICCAI challenge data).
GYN (Normal Tissue primary): 1,999 patients (2,254 CT scans) who received radiation therapy at MD Anderson between September 2004 and June 2018. Any patient who received a simulation CT scan of the pelvic region in a head-first supine position was eligible.
GYN (Normal Tissue secondary): 192 patients (316 CT scans) who were treated for locally advanced cervical cancer between 2006 and 2020.
GYN (CTVs primary): 406 CT scans from 308 patients (UteroCervix), 250 CT scans from 201 patients (Nodal CTV), 146 CT scans from 131 patients (PAN), 490 CT scans from 388 patients (Vagina), 487 CT scans from 388 patients (Parametria) who received radiation therapy at MD Anderson Cancer Center between 2006 and 2020.
GYN (Liver): Training data for GYN Liver (normal) comprised 119 patients (169 CT scans) who had received contrast-enhanced and non-contrast CT imaging of the liver at MD Anderson Cancer Center.
Chest Wall (Whole Body secondary for chest wall): Training data for whole body (secondary for chest wall) comprised 250 patients who were treated at MD Anderson between August 2016 and June 2021, with CT imaging in the thoracic region.
The ground truth treatment plans were generated by the primary 4-field box automation technique for cervical cancer by Kisling et al. (Kisling 2019) with beam apertures based on a patient's bony anatomy. Only the clinically acceptable plans were used for training (rated by physicians); their DRRs and corresponding beam apertures were the inputs for training (and just the DRRs for testing/prediction). No additional criteria were applied.
Description of the test set, sample size, data source, and annotation protocol
Head and Neck: 174 CT scans were randomly selected from the primary normal-tissue training group (and excluded from training), plus a qualitative evaluation of 24 CT scans from an external dataset. For the secondary normal-tissue data, test patients were randomly selected and excluded from the training set. For the lymph node CTV data, the 71 cases were randomly placed in 3 groups: training (51 pts.), cross-validation (10 pts.), and final test (10 pts.). For the whole brain data, test patients were randomly selected from the group and excluded from training.
GYN: 140 CT scans were randomly selected from the primary normal-tissue training group (and excluded from training), plus a qualitative evaluation with 30 cervical cancer patients from 3 centers in S. Africa. For the secondary normal-tissue, CTV, and liver data, test patients were randomly selected from the respective groups and excluded from training.
Chest Wall: Test patients were randomly selected from the training group and excluded from training.
The test set was generated in the same manner as the ground truth, but on previously unseen patients.
Test datasets were chosen going forward in time until sufficient data were collected, starting with CT scans collected on January 1, 2022. If insufficient patient scans were found, data collection was restarted with January 1, 2021 (for patients treated in 2021) and so forth, until sufficient data was collected. Testing datasets were unique, with no overlap with data used for model creation or in previous validation studies. The manufacturer and model of the scanner used to obtain each CT image was included in the scan or recorded separately.
Cervix Inclusion Criteria: CT scan of the female pelvic anatomy. Clear CT image of the pelvic region without distortions. Test datasets consisted of CT images of patients previously treated for cervical cancer using radiotherapy following one of the following treatment schemes: 4-field box (based on bony landmarks or soft tissue) or VMAT. Scan was obtained with patient head-first, supine. The datasets included CT images, original clinical contours of anatomic structures and treatment targets, and the dose distributions used for patient treatment.
Cervix Exclusion Criteria: Poor Image Quality.
Chest Wall Inclusion Criteria: CT scan of the breast (thorax) region. Clear CT image of the breast (thorax) region without distortions. Test datasets must consist of CT images of patients previously treated for postmastectomy breast radiotherapy following one of the following treatment schemes: Tangent fields with supraclavicular fields. Similar approaches, including those that also treat the intramammary lymph nodes are also acceptable. Scan was obtained with patient head-first, supine. The datasets must include CT images, original clinical contours of anatomic structures and treatment targets, and the dose distributions used for patient treatment.
Chest Wall Exclusion Criteria: Poor Image Quality.
Head and Neck Inclusion Criteria: CT scan of head and neck anatomy. Clear CT image of head and/or neck without distortions. Test datasets consisted of CT images of patients previously treated for head and neck cancer using radiotherapy following this treatment scheme: VMAT or IMRT treatments, 1-3 dose levels in the prescription. Scan was obtained with patient head-first, supine. The datasets included CT images, original clinical contours of anatomic structures and treatment targets, and the dose distributions used for patient treatment.
Head and Neck Exclusion Criteria: Poor Image Quality.
Whole Brain Inclusion Criteria: CT scan of the head/neck region. Clear CT image of the head/neck region without distortions. Test datasets consisted of CT images of patients previously treated for whole brain radiotherapy following one of the following treatment schemes: Opposed laterals or slight obliques. Scan was obtained with patient head-first, supine. The datasets included CT images, original clinical contours of anatomic structures and treatment targets, and the dose distributions used for patient treatment.
Whole Brain Exclusion Criteria: Poor Image Quality.
Summary of Performance Studies (study type, sample size, AUC, MRMC, standalone performance, key results)
Study Type: Multicenter clinical data.
Sample Size:
- Cervix VMAT: 50 unique patients
- Cervix 3D (4-field box soft tissue plan): 47 unique patients
- Cervix 3D (4-field box bony landmark plan): 45 unique patients
- Chest Wall: 46 unique patients
- Head and Neck: 86 unique patients
- Whole Brain: 46 unique patients
Key Results (Cervix):
- Safety (Dosimetric metrics on RPA contour vs. Clinical contour for normal structures): 0.7.
- Quality of Body Contouring: Surface DSC > 0.8 for 95% of CT scans.
- Accuracy of Marked Isocenter Identification: 0.8 for 95% of CT scans.
Key Results (Head and Neck):
- Safety (Dosimetric metrics on RPA contour vs. Clinical contour for normal structures): 0.7.
- Quality of Body Contouring: Surface DSC > 0.8 for >95% of CT scans.
- Accuracy of Marked Isocenter Identification: 0.8 difference between RPA Plan and Clinical Plan for all assessments.
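The dosimetric safety criterion above compares the fraction of plans that pass accepted dosimetric metrics when assessed on the RPA contour versus the clinical contour, with an allowed difference of 5% or less. As an illustration only (not taken from the submission), a minimal sketch of that pass-rate comparison could look like the following; the function name and inputs are assumptions.

```python
def pass_rate_difference_ok(n_pass_rpa_contour, n_pass_clinical_contour, n_plans, tol=0.05):
    """Compare the fraction of plans passing accepted dosimetric metrics when assessed
    on the RPA contour vs. the clinical contour; the difference should be 5% or less."""
    rpa_rate = n_pass_rpa_contour / n_plans
    clinical_rate = n_pass_clinical_contour / n_plans
    return abs(rpa_rate - clinical_rate) <= tol
```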
§ 892.5050 Medical charged-particle radiation therapy system.
(a) Identification. A medical charged-particle radiation therapy system is a device that produces by acceleration high energy charged particles (e.g., electrons and protons) intended for use in radiation therapy. This generic type of device may include signal analysis and display equipment, patient and equipment supports, treatment planning computer programs, component parts, and accessories.
(b) Classification. Class II. When intended for use as a quality control system, the film dosimetry system (film scanning system) included as an accessory to the device described in paragraph (a) of this section, is exempt from the premarket notification procedures in subpart E of part 807 of this chapter subject to the limitations in § 892.9.
May 17, 2023
[Department of Health & Human Services and U.S. Food & Drug Administration logos]
University of Texas, MD Anderson Cancer Center
% Ms. Stella Tsai
Sr. Project Manager
1515 Holcombe Blvd.
Houston, TX 77030
Re: K222728
Trade/Device Name: Radiation Planning Assistant (RPA)
Regulation Number: 21 CFR 892.5050
Regulation Name: Medical charged-particle radiation therapy system
Regulatory Class: Class II
Product Code: MUJ
Dated: April 17, 2023
Received: April 17, 2023
Dear Ms. Stella Tsai:
We have reviewed your Section 510(k) premarket notification of intent to market the device referenced above and have determined the device is substantially equivalent (for the indications for use stated in the enclosure) to legally marketed predicate devices marketed in interstate commerce prior to May 28, 1976, the enactment date of the Medical Device Amendments, or to devices that have been reclassified in accordance with the provisions of the Federal Food, Drug, and Cosmetic Act (Act) that do not require approval of a premarket approval application (PMA). You may, therefore, market the device, subject to the general controls provisions of the Act. Although this letter refers to your product as a device, please be aware that some cleared products may instead be combination products. The 510(k) Premarket Notification Database located at https://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfpmn/pmn.cfm identifies combination product submissions. The general controls provisions of the Act include requirements for annual registration, listing of devices, good manufacturing practice, labeling, and prohibitions against misbranding and adulteration. Please note: CDRH does not evaluate information related to contract liability warranties. We remind you, however, that device labeling must be truthful and not misleading.
If your device is classified (see above) into either class II (Special Controls) or class III (PMA), it may be subject to additional controls. Existing major regulations affecting your device can be found in the Code of Federal Regulations, Title 21, Parts 800 to 898. In addition, FDA may publish further announcements concerning your device in the Federal Register.
Please be advised that FDA's issuance of a substantial equivalence determination does not mean that FDA has made a determination that your device complies with other requirements of the Act or any Federal statutes and regulations administered by other Federal agencies. You must comply with all the Act's requirements, including, but not limited to: registration and listing (21 CFR Part 807); labeling (21 CFR Part 801); medical device reporting (reporting of medical device-related adverse events) (21 CFR 803) for devices or postmarketing safety reporting (21 CFR 4, Subpart B) for combination products (see https://www.fda.gov/combination-products/guidance-regulatory-information/postmarketing-safety-reporting-combination-products); good manufacturing practice requirements as set forth in the quality systems (QS) regulation (21 CFR Part 820) for devices or current good manufacturing practices (21 CFR 4, Subpart A) for combination products; and, if applicable, the electronic product radiation control provisions (Sections 531-542 of the Act); 21 CFR 1000-1050.
Also, please note the regulation entitled, "Misbranding by reference to premarket notification" (21 CFR Part 807.97). For questions regarding the reporting of adverse events under the MDR regulation (21 CFR Part 803), please go to https://www.fda.gov/medical-device-safety/medical-device-reporting-mdr-how-report-medical-device-problems.
For comprehensive regulatory information about medical devices and radiation-emitting products, including information about labeling regulations, please see Device Advice (https://www.fda.gov/medical-devices/device-advice-comprehensive-regulatory-assistance) and CDRH Learn (https://www.fda.gov/training-and-continuing-education/cdrh-learn). Additionally, you may contact the Division of Industry and Consumer Education (DICE) to ask a question about a specific regulatory topic. See the DICE website (https://www.fda.gov/medical-devices/device-advice-comprehensive-regulatory-assistance/contact-us-division-industry-and-consumer-education-dice) for more information or contact DICE by email (DICE@fda.hhs.gov) or phone (1-800-638-2041 or 301-796-7100).
Sincerely,
[Digitally signed by Lora D. Weidner -S, 2023.05.17 07:14:45 -04'00']
Lora D. Weidner, Ph.D.
Assistant Director
Radiation Therapy Team
DHT8C: Division of Radiological Imaging and Radiation Therapy Devices
OHT8: Office of Radiological Health
Office of Product Evaluation and Quality
Center for Devices and Radiological Health
Enclosure
Indications for Use
510(k) Number (if known) K222728
Device Name Radiation Planning Assistant (RPA)
Indications for Use (Describe)
The Radiation Planning Assistant (RPA) is used to plan radiotherapy treatments for patients with cancers of the head and neck, cervix, breast, and metastases to the brain. The RPA is used to plan external beam irradiation with photon beams using CT images. The RPA is used to create contours and treatment plans that the user imports into their own Treatment Planning System (TPS) for review, editing, and re-calculation of the dose.
Some functions of the RPA use Eclipse 15.6. The RPA is not intended to be used as a primary treatment planning system. All automatically generated contours and plans must be imported into the user's own treatment planning system for review, edit, and final dose calculation.
Type of Use (Select one or both, as applicable)
X Prescription Use (Part 21 CFR 801 Subpart D)
Over-The-Counter Use (21 CFR 801 Subpart C)
CONTINUE ON A SEPARATE PAGE IF NEEDED.
This section applies only to requirements of the Paperwork Reduction Act of 1995.
DO NOT SEND YOUR COMPLETED FORM TO THE PRA STAFF EMAIL ADDRESS BELOW.
The burden time for this collection of information is estimated to average 79 hours per response, including the time to review instructions, search existing data sources, gather and maintain the data needed and complete and review the collection of information. Send comments regarding this burden estimate or any other aspect of this information collection, including suggestions for reducing this burden, to:
Department of Health and Human Services
Food and Drug Administration
Office of Chief Information Officer
Paperwork Reduction Act (PRA) Staff
PRAStaff@fda.hhs.gov
"An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB number."
K222728
DATE PREPARED: 16 May 2023
1. SUBMITTER
Manufacturer Name:
The University of Texas MD Anderson Cancer Center
Department of Radiation Physics
Division of Radiation Oncology
1515 Holcombe Blvd.
Houston, TX 77030

Official Contact:
Stella Tsai, MHA, CCRA
Sr. Project Director, IND Office
The University of Texas MD Anderson Cancer Center
1515 Holcombe Blvd. Unit 1634
Houston, TX 77030
Telephone (713) 563-5464
swtsai@mdanderson.org
2. DEVICE
Name of Device: Radiation Planning Assistant (RPA)
Common or Usual Name: System, Planning, Radiation Therapy Treatment
Classification Name: 21 CFR 892.5050 - Medical charged-particle radiation therapy system
Regulatory Class: II
Product Code: MUJ
3. PREDICATE DEVICE
Eclipse Treatment Planning System v15.6 (K181145)
4. DEVICE DESCRIPTION
Design Characteristics
The Radiation Planning Assistant (RPA) is a web-based contouring and radiotherapy treatment planning software tool that incorporates the basic radiation planning functions from automated contouring, automated planning with dose optimization, and quality control checks. The system is intended for use for patients with cancer of the head and neck, cervix, breast, and metastases to the brain. The RPA system is integrated with the Eclipse Treatment Planning System v15.6 software cleared under K181145. The RPA radiation treatment planning software tool was trained against hundreds / thousands of CT Scans of normal and diseased tissues from patients receiving radiation for head and neck, cervical, breast, and whole brain at MD Anderson Cancer Center.
5. INDICATIONS FOR USE
The Radiation Planning Assistant (RPA) is used to plan radiotherapy treatments for patients with cancers of the head and neck, cervix, breast, and metastases to the brain.
The RPA is used to plan external beam irradiation with photon beams using computerized tomography (CT) images. The RPA is used to create contours and treatment plans that the user imports into their own Treatment Planning System (TPS) for review, editing, and re-calculation of the dose.
Some functions of the RPA use Eclipse 15.6. The RPA is not intended to be used as a primary treatment planning system. All automatically generated contours and plans must be imported into the user's own treatment planning system for review, edit, and final dose calculation.
6. COMPARISON OF TECHNOLOGICAL CHARACTERISTICS WITH THE PREDICATE DEVICE
The Radiation Planning Assistant (RPA) is substantially equivalent to the Eclipse Treatment Planning System v15.6 (K181145) predicate device in the following respects:
| | Subject Device: Radiation Planning Assistant (RPA) | Predicate Device: Eclipse Treatment Planning System Version 15.6 (K181145) |
|---|---|---|
| CFR Citation | 892.5050 | 892.5050 |
| Product Code | MUJ | MUJ |
| Indications for Use | The Radiation Planning Assistant (RPA) is used to plan radiotherapy treatments for patients with cancers of the head and neck, cervix, breast, and metastases to the brain. The RPA is used to plan external beam irradiation with photon beams using computerized tomography (CT) images. The RPA is used to create contours and treatment plans that the user imports into their own Treatment Planning System (TPS) for review, editing, and re-calculation of the dose. Some functions of the RPA use Eclipse v.15.6. The RPA is not intended to be used as a primary treatment planning system. All automatically generated contours and plans must be imported into the user's own treatment planning system for review, edit, and final dose calculation. | The Eclipse Treatment Planning System (Eclipse TPS) is used to plan radiotherapy treatments for patients with malignant or benign diseases. Eclipse TPS is used to plan external beam irradiation with photon, electron, and proton beams, as well as for internal irradiation (brachytherapy) treatments. |
| Device Description | The Radiation Planning Assistant (RPA) is a web-based contouring and radiotherapy treatment planning software tool that incorporates the basic radiation planning functions from automated contouring, automated planning with dose optimization, and quality control checks. The system is intended for use for patients with cancer of the head and neck, cervix, breast, and metastases to the brain. The RPA system is integrated with the Eclipse Treatment Planning System v.15.6 software cleared under K181145. | The Varian Eclipse™ Treatment Planning System (Eclipse TPS) provides software tools for planning the treatment of malignant or benign diseases with radiation. Eclipse TPS is a computer-based software device used by trained medical professionals to design and simulate radiation therapy treatments. Eclipse TPS is capable of planning treatments for external beam irradiation with photon, electron, and proton beams, as well as for internal irradiation (brachytherapy) treatments. |
Table 1: Comparison of the Technological Characteristics of the RPA with Predicate Device
Comparison of the RPA System's Software Functions with the Eclipse Treatment Planning System Version 15.6 (K181145)

| Software Function | Description of Functions Available in Eclipse | Differences | Similarities | Rationale for Substantial Equivalence |
|---|---|---|---|---|
| Software Contouring Functions | 1. Organ-specific autocontouring algorithms. Eclipse v.15.6 includes organ-specific autocontouring algorithms for the following organs: spine, lung, brain, eye, bone. 2. Expert Segmentation. Eclipse v.15.6 includes an atlas-based contouring approach ("expert segmentation") for autocontouring of many structures, including: Head / neck region: Body, bones, brainstem, cochlea, esophagus, eyes, mandible, oral cavity, various lymph nodes; Breast region: Body, heart, trachea, various lymph nodes; Pelvis: Body, bladder, femoral heads, pelvic bones, rectum, spinal canal. The system calculates the anatomical points, image features, and similarity scores of the patient image and compares them with pre-stored expert cases. Rigid registration is used both for initializing the deformable registration algorithm and for displaying the expert case and patient image in aligned preview. The autocontouring approach depends on the selected structures. Either the structures are heuristically segmented from the patient image, or the structures are generated via deformable registration and structure propagation from the expert cases. If multiple expert cases are used, the propagated structures from the different atlases are fused by means of the simultaneous truth and performance level estimation (STAPLE) algorithm. | The RPA uses deep learning algorithms which Eclipse does not. Use and function: In Eclipse, the user edits the contours prior to planning. This is the same for complex planning in the RPA (VMAT planning for head / neck and cervix). It is different for simple plans, where the plan is generated before the user reviews the contours. If the user edits the contours in the RPA, they will have to delete the plan as well. | Use and function: The RPA provides autocontouring for a range of structures, including most of those listed here for Eclipse. Performance data: The algorithm for Expert Segmentation in Eclipse is very similar to the Multi-Atlas Contouring System (MACS) that is used to contour structures for the chest wall planning in the RPA. Safety and effectiveness: Both Eclipse and the RPA are designed to provide contours that the users review and edit. | Devices are Substantially Equivalent. Both devices provide autocontouring functions for the same anatomical regions. Both devices require user edits of contours with 'complex plans' prior to planning. |
| Eclipse, Other Plan Preparation | Automatic marker detection - Eclipse v.15.6 includes a function ('Calypso Beacon Detection') to automatically detect a specific type of marker (Calypso transponders) on CT images. | Use and function: The Eclipse function is for a specific type of marker that is different from the generic markers that the RPA is designed for. | Use and function: Both Eclipse and the RPA can automatically detect markers. | Devices are Substantially Equivalent. Both devices can automatically detect markers. |
| Eclipse Automated Planning, VMAT | 1. Photon Optimizer (PO) algorithm. This algorithm is used to optimize IMRT or VMAT plans based on DVH constraints / objectives. 2. Automated Optimization Workflow. Enabling this can automate the optimization workflow for IMRT planning so that, after optimization, the leaf motion calculation and final dose calculation are automatically initiated, and the results are then automatically saved. A similar feature exists for VMAT plans. 3. DVH Estimation Models for RapidPlan. DVH estimation models are created from information extracted from a set of previous treatment plans (called 'treatment plans'). The estimation models predict the DVH that is achievable from the current treatment plan (based on the geometry in the current plan), and also creates a set of optimization objects that can be based on the DVH estimates or fixed (i.e., not based on the DVH estimates). | Use and function: The main difference for VMAT planning is that Eclipse generally creates a plan that the user reviews, makes edits to the optimization constraints, and repeats the process to improve the plan quality. The RPA uses the same optimization tools (i.e., the tools in Eclipse), but the optimization objectives and constraints have been pre-set to give optimal plans for the majority of patients. The user is not able to easily edit the RPA VMAT plans so, if they do not approve the plan for clinical use, they must delete it and create one using their own routine processes (i.e., in their own treatment planning system). | Use and function: The RPA uses some Eclipse features, including DVH Estimates for RapidPlan and the Photon Optimizer for optimizing VMAT plans. The plans look very similar. Safety and effectiveness: Both Eclipse and the RPA are designed to create plans that the users then edit and review for clinical acceptability prior to use. | Devices are Substantially Equivalent. Both devices provide autoplanning features and create plans that the users then edit and review for clinical acceptability prior to use. Both devices provide Photon Optimizer, automated optimization workflow and DVH estimation models. |
| Autoplanning, Other | 1. Beam Angle Optimization (BAO). This tool optimizes the number and angle of treatment beams. It optimizes the objective function, which is determined by DVH goals / constraints and a normal tissue objective (which falls off with distance from the PTV). BAO can be used for IMRT plans or as a starting point for conformal treatment plans. 2. Collimator Angle Optimization (CAO). This function optimizes collimator angle for each arc of a HyperArc plan such that, whenever possible, a given pair of MLC leaves delineates only one target in the beam's-eye-view. 3. Optimize collimator jaws. Adjusts the collimator jaws to best fit the MLC leaves to the structure. 4. Use recommended jaw positions. Adjusts the collimator jaw positions with an additional margin. 5. Optimize collimator rotation. Optimizes the collimator rotation around a structure. | Use and function: Autoplanning in Eclipse is mostly automation of individual tasks that are controlled by the user. The user does not control these tasks with the RPA. Use and function: Review and editing of 3D plans (cervix 4-field box, post-mastectomy breast plans, whole brain plans) for the RPA happens in the users' own treatment planning system. | Use and function: Many of the treatment plan details in Eclipse and RPA use functions with similar algorithms, such as optimizing the jaw positions. Safety and effectiveness: Both Eclipse and the RPA are designed to create plans that the users then edit and review for clinical acceptability prior to use. | Devices are Substantially Equivalent. Both devices create plans that the users then edit and review for clinical acceptability prior to use through the use of AI software. Both devices provide Beam Angle Optimization, Collimator Angle Optimization and collimator jaw optimization. |
Table 2: Comparison of Functions of the Subject Device with Functions of the Predicate Device
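Several of the Eclipse features compared in Table 2 (the Photon Optimizer and RapidPlan DVH estimation models) work from dose-volume histogram (DVH) constraints and objectives. As background only, a minimal sketch of how a cumulative DVH can be computed from a dose grid and a binary structure mask is shown below; the NumPy-based helper and its names are illustrative assumptions, not part of either device.

```python
import numpy as np

def cumulative_dvh(dose_gy, structure_mask, bin_width_gy=0.1):
    """Cumulative DVH: for each dose level, the fraction of the structure volume
    receiving at least that dose (assumes uniform voxel volume)."""
    voxel_doses = dose_gy[structure_mask.astype(bool)]
    dose_levels = np.arange(0.0, voxel_doses.max() + bin_width_gy, bin_width_gy)
    volume_fraction = np.array([(voxel_doses >= d).mean() for d in dose_levels])
    return dose_levels, volume_fraction

# Example of a DVH-style objective check (arrays and the 26 Gy level are illustrative):
# dose_levels, volume_fraction = cumulative_dvh(dose_grid, parotid_mask)
# v26 = volume_fraction[np.searchsorted(dose_levels, 26.0)]  # fraction receiving >= 26 Gy
```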
7. PERFORMANCE DATA
7.1 Non-Clinical Data
7.1.1 Software Verification and Validation Testing
Software verification and validation was conducted, and documentation was provided as recommended by the FDA's Guidance for Industry and FDA Staff, "Guidance for the Content of Premarket Submissions for Software Contained in Medical Devices." The software for this device was considered as a "major" level of concern. Test results demonstrate conformance to applicable requirements and specifications.
No animal studies or clinical tests have been included in this pre-market submission.
The ground truth treatment plans were generated by the primary 4-field box automation technique for cervical cancer by Kisling et al. (Kisling 2019) with beam apertures based on a patient's bony anatomy. Only the clinically acceptable plans were used for training (rated by physicians); their DRRs and corresponding beam apertures were the inputs for training (and just the DRRs for testing/prediction). No additional criteria were applied. The test set was generated in the same manner as the ground truth, but on previously unseen patients.
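As described above, the 4-field box model learns from DRRs paired with physician-approved beam apertures, and receives only the DRR at prediction time. The submission does not provide code; the snippet below is only a hedged sketch of how such training pairs could be assembled, with the data structure, names, and acceptability flag assumed for illustration.

```python
import numpy as np

def build_drr_aperture_pairs(cases):
    """Pair each digitally reconstructed radiograph (DRR) with its beam aperture mask.
    `cases` is assumed to be an iterable of (drr, aperture_mask, clinically_acceptable)
    tuples of equally sized 2-D arrays; only clinically acceptable plans are kept."""
    drrs, apertures = [], []
    for drr, aperture_mask, clinically_acceptable in cases:
        if not clinically_acceptable:   # physician rating filters the training data
            continue
        drrs.append(np.asarray(drr, dtype=np.float32))
        apertures.append(np.asarray(aperture_mask, dtype=np.float32))
    return np.stack(drrs), np.stack(apertures)   # model inputs and targets
```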
Initial software training for each anatomical location was successfully accomplished and is described in brief in Table 3 below.
Table 3 presents the initial testing performed for software training and testing. Multicenter performance testing is presented in Section 7.2.
| Anatomical Location | Tissue Type(s) | Training Data Set | Test Data Independence |
|---|---|---|---|
| Head and Neck | Normal Tissue (primary) | 3,288 patients (3,495 CT scans) who received radiation therapy at MD Anderson Cancer Center between September 2004 and June 2018. Any patient who received a simulation CT scan of the head/neck region in a head-first supine position was eligible. | 174 CT scans were randomly selected from this group (and excluded from training) plus qualitative evaluation of 24 CT scans from an external dataset. |
| Head and Neck | Normal Tissue (secondary) | 160 patients who received radiation therapy at MD Anderson Cancer Center from 2018 to 2020. Any patient who received a simulation CT scan of the head/neck region in a head-first supine position was eligible. | Test patients were randomly selected and excluded from the training set. |
| Head and Neck | Lymph Node CTVs | 61 patients who received radiation therapy at MD Anderson Cancer Center between 2010 and 2019. Any patient who received a simulation CT scan of the head/neck region in a head-first supine position was eligible. | These 71 cases were randomly placed in 3 groups: training (51 pts.), cross-validation (10 pts.) and final test (10 pts.). |
| Whole Brain | Whole Brain | The whole brain primary segmentation models used the same models as used for head and neck segmentation, described above, as well as an additional vertebral body localization and segmentation model (Vertebral Bodies model: spinal canal CNN: 1,966, VB labeling: 803, VB segmentation: 107, from 930 MDACC patients and 355 external patients). Patients who received spinal radiotherapy for spinal metastases (3DCRT and VMAT) at MD Anderson, or for whom data was publicly available (MICCAI challenge data). | Test patients were randomly selected from this group (and excluded from training). |
| GYN | Normal Tissue (primary) | 1,999 patients (2,254 CT scans) who received radiation therapy at MD Anderson between September 2004 and June 2018. Any patient who received a simulation CT scan of the pelvic region in a head-first supine position was eligible. | 140 CT scans were randomly selected from this group (and excluded from training) plus qualitative evaluation with 30 cervical cancer patients from 3 centers in S. Africa. |
| GYN | Normal Tissue (secondary) | 192 patients (316 CT scans) who were treated for locally advanced cervical cancer between 2006 and 2020. | Test patients were randomly selected from this group (and excluded from training). |
| GYN | CTVs (primary) | 406 CT scans from 308 patients (UteroCervix), 250 CT scans from 201 patients (Nodal CTV), 146 CT scans from 131 patients (PAN), 490 CT scans from 388 patients (Vagina), 487 CT scans from 388 patients (Parametria) who received radiation therapy at MD Anderson Cancer Center between 2006 and 2020. | Test patients were randomly selected from this group (and excluded from training). |
| GYN | Liver | Training data for GYN Liver (normal) comprised 119 patients (169 CT scans) who had received contrast-enhanced and non-contrast CT imaging of the liver at MD Anderson Cancer Center. | Test patients were randomly selected from this group (and excluded from training). |
| Chest Wall | Whole Body (secondary for chest wall) | Training data for whole body (secondary for chest wall) comprised 250 patients who were treated at MD Anderson between August 2016 and June 2021, with CT imaging in the thoracic region. | Test patients were randomly selected from this group (and excluded from training). |

Table 3: Software Training for Anatomical Locations
7.1.2 Standards Conformance
The subject device conforms in whole or in part with the following standards:
- IEC 62304, Medical device software - Software life cycle processes
- IEC 62083, Requirements for the safety of radiotherapy treatment planning systems
7.2 Clinical Data
A summary of the multicenter clinical data is presented in the tables below.
| Characteristic | All Cervix VMAT | All Cervix 3D | All Chest Wall | All Head and Neck | All Whole Brain |
|---|---|---|---|---|---|
| No. of Unique Patients with RPA Plan(s) | 50 | 47 (a); 45 (b) | 46 | 86 | 46 |
| CT Scan Equipment | | | | | |
| Philips | x | x | x | x | x |
| Siemens | x | x | x | x | x |
| GE | x | x | x | x | x |
| No. of Clinical Sites | 5 | 5 | 5 | 5 | 5 |
| No. of Participating Physicians / Study Site | | | | | |
| Site 1 | 5 | 5 | 8 | 12 | 12 |
| Site 2 | 1 | 1 | 2 | 3 | 2 |
| Site 3 | 3 | 3 | 1 | 1 | 3 |
| Site 4 | 3 | 3 | 8 | 1 | 6 |
| Site 5 | 2 | 2 | 4 | 5 | 2 |
| Clinical Subgroups and Confounding Factors | | | | | |
| By Study Site | None | None | None | None | None |
| By Equipment | None | None | None | None | None |
| Age | | | | | |
| Mean | 51 | 50 | 51 | 62 | 60 |
| Min, Max | 26, 94 | 26, 84 | 31, 80 | 27, 87 | 14, 88 |
| Sex | | | | | |
| Male | 0.0% | 0.0% | 2.2% | 79.3% | 39.1% |
| Female | 100.0% | 100.0% | 97.8% | 29.7% | 34.8% |
| Not Reported | 0.0% | 0.0% | 0.0% | 0.0% | 26.1% |
| Race | | | | | |
| Asian | 9.8% | 2.1% | 26.1% | 5.4% | 6.5% |
| Black/African American | 13.7% | 14.9% | 13.0% | 12.0% | 8.7% |
| White | 39.2% | 78.7% | 32.6% | 73.9% | 54.3% |
| Native Hawaiian or Pacific Islander | 0.0% | 0.0% | 0.0% | 1.1% | 0.0% |
| British | 7.8% | 0.0% | 0.0% | 0.0% | 0.0% |
| American Indian or Alaskan Native | 2.0% | 0.0% | 4.3% | 1.1% | 0.0% |
| Other / not available | 27.5% | 4.3% | 23.9% | 6.5% | 28.3% |
| Ethnicity | | | | | |
| Hispanic or Latino | 25.5% | 10.6% | 8.7% | 7.6% | 6.5% |
| Not Hispanic or Latino | 41.2% | 46.8% | 43.5% | 50.0% | 58.7% |
| Other / not available | 33.3% | 42.6% | 47.8% | 42.4% | 34.8% |

Table 4: Demographics, Number of Patients, Number of Samples, and Clinical Sites
(a) 4-field box soft tissue plan
(b) 4-field box bony landmark plan
| Criteria Number | Criteria | Results |
|---|---|---|
| 1 | Assess the safety of using the RPA plan for normal structures for treatment planning by comparing the number of patient plans that pass accepted dosimetric metrics when assessed on the RPA contour with the number that pass when assessed on the clinical contour. The difference should be 5% or less. When there are multiple metrics for a single structure at least one should pass this criterion. | 0.7 |
| 5 | Assess the quality of body contouring generated by the RPA by comparing primary and secondary body contours generated by the RPA with manual body contours. Surface DSC (2mm) should be greater than 0.8 for 95% of the CT scans. | Surface DSC > 0.8 for 95% of CT scans |
| 6 | Assess the ability of the RPA to accurately identify the marked isocenter. This is achieved by comparing the automatically generated isocenters with manually generated ones. 95% of automatically generated marked isocenters (primary and verification approaches) should agree with manually generated marked isocenters within 3mm in all orthogonal directions (AP, lateral, cranial-caudal). | 0.8 for 95% of CT scans |
Table 7: Chest Wall Summary of Statistical Results
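Criterion 5 relies on a surface DSC with a 2 mm tolerance. The submission does not spell out the computation; the following is a minimal, voxel-based sketch of one common way a surface Dice at a distance tolerance could be implemented, assuming binary masks and voxel spacing in millimetres (SciPy-based, not the RPA's actual implementation).

```python
import numpy as np
from scipy import ndimage

def surface_dice(mask_a, mask_b, spacing_mm, tol_mm=2.0):
    """Fraction of surface voxels of each mask lying within tol_mm of the other mask's surface."""
    def surface(mask):
        return mask & ~ndimage.binary_erosion(mask)   # boundary voxels of the mask

    sa, sb = surface(mask_a.astype(bool)), surface(mask_b.astype(bool))
    # Distance (mm) from every voxel to the nearest surface voxel of the other structure.
    dist_to_b = ndimage.distance_transform_edt(~sb, sampling=spacing_mm)
    dist_to_a = ndimage.distance_transform_edt(~sa, sampling=spacing_mm)
    close_a = (dist_to_b[sa] <= tol_mm).sum()
    close_b = (dist_to_a[sb] <= tol_mm).sum()
    return (close_a + close_b) / (sa.sum() + sb.sum())
```

Under the criterion, this value would need to exceed 0.8 on at least 95% of the CT scans.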
| Criteria Number | Inclusion Criteria | Exclusion Criteria | Sampling Method |
|---|---|---|---|
| 1 | CT scan of the breast (thorax) region. | Poor Image Quality | Test datasets were chosen going forward in time until sufficient data was collected, starting with CT scans collected on January 1, 2022. If insufficient patient scans were found, data collection was restarted with January 1, 2021 (for patients treated in 2021) and so forth, until sufficient data was collected. |
| 2 | Clear CT image of the breast (thorax) region without distortions. | - | |
| 3 | Test datasets must consist of CT images of patients previously treated for postmastectomy breast radiotherapy following one of the following treatment schemes: i. Tangent fields with supraclavicular fields. Similar approaches, including those that also treat the intramammary lymph nodes, are also acceptable. | - | |
| 4 | Scan was obtained with patient head-first, supine. | - | |
| 5 | The datasets must include CT images, original clinical contours of anatomic structures and treatment targets, and the dose distributions used for patient treatment. | - | |
| 6 | Test datasets were chosen going forward in time until sufficient data was collected, starting with CT scans collected on January 1, 2022. If insufficient patient scans were found, data collection can be restarted with January 1, 2021 (for patients treated in 2021) and so forth, until sufficient data was collected. | - | |
| 7 | Testing datasets must be unique, with no overlap with data used for model creation or in previous validation studies. | - | |
| 8 | CT scan must include the manufacturer and model of the scanner used to obtain the CT image or recorded separately. | - | |
Table 8: Chest Wall Protocol Summary
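The sampling method in Table 8 (walk forward in time from January 1, 2022, restarting from January 1, 2021 if too few scans are found, and so forth) can be expressed procedurally. The sketch below is purely illustrative; the data structure and helper name are assumptions.

```python
from datetime import date

def select_test_scans(scans, n_required, start_years=(2022, 2021, 2020)):
    """Walk forward in time from Jan 1 of each start year (most recent first) and stop
    at the first year that yields enough scans. `scans` is assumed to be a list of
    (scan_date, scan_id) tuples with datetime.date values."""
    eligible = []
    for year in start_years:
        cutoff = date(year, 1, 1)
        eligible = sorted((s for s in scans if s[0] >= cutoff), key=lambda s: s[0])
        if len(eligible) >= n_required:
            return eligible[:n_required]
    return eligible   # fewer than requested even after widening the window
```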
| Criteria Number | Criteria | Results |
|---|---|---|
| 1 | Assess the safety of use of RPA normal structures for treatment planning by comparing the number of patient plans that passed accepted dosimetric metrics (e.g., mean dose to the parotid) when assessed on the RPA contour with the number that passed when assessed on the clinical contour. The difference should be 5% or less. When multiple metrics were used to assess a single structure at least one should pass this criterion. | 0.7 |
| 5 | Assess the quality of body contouring generated by the RPA by comparing body contours generated by the RPA with manual body contours. Surface DSC (2mm) should be greater than 0.8 for 95% of the CT scans. | Surface DSC > 0.8 for 95% of CT scans |
| 6 | Assess the ability of the RPA to accurately identify the marked isocenter. This is achieved by comparing the automatically generated isocenters with manually generated ones. 95% of automatically generated marked isocenters should agree with manually generated marked isocenters within 3mm in all orthogonal directions (AP, lateral, cranial-caudal). | 0.8 difference between RPA Plan and Clinical Plan for all assessments. |
| 5 | Assess the ability of the RPA to accurately identify the marked isocenter. This is achieved by comparing the automatically generated isocenters with manually generated ones. 95% of automatically generated marked isocenters (primary and verification approaches) should agree with manually generated marked isocenters within 3mm in all orthogonal directions. | |
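The marked-isocenter criteria above ask whether the automatically generated isocenter agrees with the manually generated one within 3 mm along each orthogonal direction (AP, lateral, cranial-caudal), for at least 95% of cases. A minimal per-case check, written as an illustration only (the coordinate convention and names are assumed), might look like this:

```python
import numpy as np

def isocenter_agrees(auto_iso_mm, manual_iso_mm, tol_mm=3.0):
    """True if the automatic isocenter is within tol_mm of the manual isocenter
    along every orthogonal axis (AP, lateral, cranial-caudal)."""
    diff = np.abs(np.asarray(auto_iso_mm, dtype=float) - np.asarray(manual_iso_mm, dtype=float))
    return bool(np.all(diff <= tol_mm))

# Study-level acceptance: isocenter_agrees(...) should be True for at least 95% of cases.
```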