Rayvolve is a computer-assisted detection and diagnosis (CAD) software device to assist radiologists and emergency physicians in detecting fractures during the review of radiographs of the musculoskeletal system. Rayvolve is indicated for the adult and pediatric population (≥ 2 years).
Rayvolve is indicated for radiographs of the following industry-standard radiographic views and study types.
Study type (Anatomic Area of interest) / Radiographic Views* supported: Ankle / AP, Lateral, Oblique; Clavicle / AP, AP Angulated View; Elbow / AP, Lateral; Forearm / AP, Lateral; Hip / AP, Frog-leg lateral; Humerus / AP, Lateral; Knee / AP, Lateral; Pelvis / AP; Shoulder / AP, Lateral, Axillary; Tibia/fibula / AP, Lateral; Wrist / PA, Lateral, Oblique; Hand / PA, Lateral, Oblique; Foot / AP, Lateral, Oblique.
*Definitions of anatomic area of interest and radiographic views are consistent with the ACR-SPR-SSR Practice Parameter for the Performance of Radiography of the Extremities guideline.
The medical device is called Rayvolve. It is a standalone software that uses deep learning techniques to detect and localize fractures on osteoarticular X-rays. Rayvolve is intended to be used as an aided-diagnosis device and does not operate autonomously.
Rayvolve has been developed to use the current edition of the DICOM image standard. DICOM is the international standard for transmitting, storing, printing, processing, and displaying medical imaging.
Using the DICOM standard allows Rayvolve to interact with existing DICOM Node servers (e.g., PACS) and clinical-grade image viewers. The device is designed to run on-premise or on a cloud platform, connected to the radiology center's local network, and can interact with the DICOM Node server.
When remotely connected to a medical center's DICOM Node server, Rayvolve interacts directly with the DICOM files to output its prediction (potential presence or absence of fracture): the initial image appears first, followed by the image processed by Rayvolve.
Rayvolve is not intended to replace medical doctors. The instructions for use are strictly and systematically transmitted to each user and used to train them on Rayvolve's use.
Here's a breakdown of the acceptance criteria and the study proving the device meets them, based on the provided FDA 510(k) summary for Rayvolve:
1. Table of Acceptance Criteria and Reported Device Performance
The acceptance criteria are not explicitly listed in a single table with defined thresholds. However, based on the performance data presented, the implicit acceptance criteria for standalone performance appear to be:
- High Sensitivity, Specificity, and AUC for fracture detection.
- Non-inferiority of the retrained algorithm (including pediatric population) compared to the predicate device, specifically by ensuring the lower bound of the difference in AUCs (Retrained - Predicate) for each anatomical area is greater than -0.05.
- Superior diagnostic accuracy of readers when aided by Rayvolve compared to unaided readers, as measured by AUC in an MRMC study.
- Improved sensitivity and specificity for readers when aided by Rayvolve.
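The non-inferiority criterion above reduces to a one-sided check on the lower confidence bound of the AUC difference. A minimal sketch (the per-area bounds shown are hypothetical; the summary states only that all bounds exceeded -0.05, without reporting them):

```python
def non_inferior(auc_diff_lower_bound: float, margin: float = -0.05) -> bool:
    """Non-inferiority holds when the lower CI bound of
    (AUC_retrained - AUC_predicate) exceeds the margin."""
    return auc_diff_lower_bound > margin

# Hypothetical per-area lower bounds (not reported in the summary):
bounds = {"Wrist": -0.012, "Ankle": -0.031, "Hip": 0.004}
all_pass = all(non_inferior(b) for b in bounds.values())
```

Note that a bound of exactly -0.05 would fail, since the criterion is strictly greater than the margin.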
Table: Acceptance Criteria (Implicit) and Reported Device Performance
| Acceptance Criterion (Implicit) | Reported Device Performance (Standalone & MRMC Studies) |
|---|---|
| Standalone Performance (Pediatric Population Inclusion) | |
| High Sensitivity for fracture detection in pediatric population (implicitly > 0.90 based on predicate). | 0.9611 (95% CI: 0.9480; 0.9710) |
| High Specificity for fracture detection in pediatric population (implicitly > 0.80 based on predicate). | 0.8597 (95% CI: 0.8434; 0.8745) |
| High AUC for fracture detection in pediatric population (implicitly > 0.90 based on predicate). | 0.9399 (95% Bootstrap CI: 0.9330; 0.9470) |
| Non-inferiority of Retrained Algorithm (compared to Predicate for adult & pediatric) | |
| Lower bound of difference in AUCs (Retrained - Predicate) > -0.05 for all anatomical areas. | "The lower bounds of the differences in AUCs for the Retrained model compared to the Predicate model are all greater than -0.05, indicating that the Retrained model's performance is not inferior to the Predicate model across all organs." (Specific values for each organ are not provided, only the conclusion that they meet the criterion.) The Total AUC for Retrained is 0.98781 (0.98247; 0.99048) compared to Predicate 0.98607 (0.98104; 0.99058). Overlapping CIs and the non-inferiority statement support this. This suggests the inclusion of pediatric data did not degrade performance in adult data. |
| MRMC Clinical Reader Study | |
| Diagnostic accuracy (AUC) of readers aided by Rayvolve is superior to unaided readers. | Reader AUC improved from 0.84602 to 0.89327, a difference of 0.04725 (95% CI: 0.03376; 0.061542) (p=0.0041). This demonstrates statistically significant superiority. |
| Reader sensitivity is improved with Rayvolve assistance. | Reader sensitivity improved from 0.86561 (95% Wilson's CI: 0.84859, 0.88099) to 0.9554 (95% Wilson's CI: 0.94453, 0.96422). |
| Reader specificity is improved with Rayvolve assistance. | Reader specificity improved from 0.82645 (95% Wilson's CI: 0.81187, 0.84012) to 0.83116 (95% Wilson's CI: 0.81673, 0.84467). |
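The Wilson score intervals quoted above can be reproduced for any success count and sample size. A sketch in Python (the 8-of-10 example is illustrative only, not taken from the study):

```python
import math

def wilson_ci(successes: int, n: int, z: float = 1.959964) -> tuple[float, float]:
    """95% Wilson score interval for a binomial proportion
    (e.g., sensitivity = detected fractures / true fractures)."""
    p = successes / n
    denom = 1 + z ** 2 / n
    center = (p + z ** 2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return center - half, center + half

# Illustrative: 8 detections out of 10 positives -> roughly (0.490, 0.943)
lo, hi = wilson_ci(8, 10)
```

Unlike the Wald interval, the Wilson interval stays within [0, 1] and behaves well for proportions near 1, which is why it is commonly reported for high sensitivities like those here.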
2. Sample Sizes and Data Provenance
- Test Set (Pediatric Standalone Study):
  - Sample Size: 3016 radiographs.
  - Data Provenance: Not explicitly stated regarding country of origin. The study was retrospective.
- Test Set (Adult Predicate Standalone Study - for comparison):
  - Sample Size: 2626 radiographs.
  - Data Provenance: Not explicitly stated regarding country of origin.
- Test Set (MRMC Clinical Reader Study):
  - Sample Size: 186 cases.
  - Data Provenance: Not explicitly stated regarding country of origin. The study was retrospective.
- Training Set:
  - Sample Size: 150,000 osteoarticular radiographs (expanded from 115,000 for the predicate device).
  - Data Provenance: Not explicitly stated regarding country of origin.
3. Number of Experts and Qualifications for Ground Truth (Test Set)
- Number of Experts: A panel of three (3) US board-certified MSK radiologists.
- Qualifications of Experts: US board-certified MSK (Musculoskeletal) radiologists. Years of experience are not specified, but board certification implies a certain level of expertise.
4. Adjudication Method for the Test Set (Ground Truth Establishment)
- Method: "Each case had been previously evaluated by a panel of three US board-certified MSK radiologists to provide ground truth binary labeling the presence or absence of fracture and the localization information for fractures." This implies a consensus-based ground truth, likely achieved through discussion and agreement among the three radiologists. The term "panel" suggests a collaborative review. No specific "2+1" or "3+1" rule is mentioned, but "panel of three" indicates a rigorous approach to consensus.
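The summary names a three-reader panel but no explicit adjudication rule. Purely as a hypothetical illustration (this is not AZmed's documented method), a 2-of-3 majority vote over binary labels could look like this:

```python
def majority_label(reader_labels: list[bool]) -> bool:
    """Hypothetical majority vote over an odd number of binary
    fracture labels (True = fracture present). The 510(k) summary
    states only that a panel of three radiologists provided ground
    truth; the actual adjudication rule is not disclosed."""
    assert len(reader_labels) % 2 == 1, "use an odd panel to avoid ties"
    return sum(reader_labels) > len(reader_labels) // 2

case_votes = [True, True, False]       # two of three readers saw a fracture
ground_truth = majority_label(case_votes)
```

An odd panel size guarantees a decision for binary labels; consensus discussion, as the word "panel" suggests, may instead have been used.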
5. Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study
- Was it done?: Yes, a fully crossed multi-reader, multi-case (MRMC) retrospective reader study was done.
- Effect Size of Improvement:
- AUC Improvement: Reader AUC was significantly improved from 0.84602 (unaided) to 0.89327 (aided), resulting in a difference (effect size) of 0.04725 (95% CI: 0.03376; 0.061542) (p=0.0041).
- Sensitivity Improvement: Reader sensitivity improved from 0.86561 (unaided) to 0.9554 (aided).
- Specificity Improvement: Reader specificity improved from 0.82645 (unaided) to 0.83116 (aided).
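The reported effect sizes are simply the paired differences between aided and unaided reader performance; the arithmetic can be checked directly (values copied from the summary):

```python
unaided_auc = 0.84602
aided_auc = 0.89327

# The reported AUC effect size is 0.04725; verify within float tolerance.
effect_size = aided_auc - unaided_auc
assert abs(effect_size - 0.04725) < 1e-9

sens_gain = 0.9554 - 0.86561    # sensitivity improvement, about 0.0898
spec_gain = 0.83116 - 0.82645   # specificity improvement, about 0.0047
```

The sensitivity gain is large while the specificity gain is marginal, consistent with a CAD aid that surfaces fractures readers would otherwise miss without materially changing false-positive behavior.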
6. Standalone (Algorithm Only) Performance Study
- Was it done?: Yes, standalone performance assessments were conducted for both the pediatric population inclusion and the retrained algorithm.
- Pediatric Standalone Study: Sensitivity (0.9611), Specificity (0.8597), and AUC (0.9399) were reported.
- Retrained Algorithm Standalone Study: Non-inferiority was assessed by comparing AUCs against the predicate device's standalone performance, showing improvements or non-inferiority across body parts (e.g., Total AUC for retrained was 0.98781 vs. predicate 0.98607).
7. Type of Ground Truth Used
- For Test Sets (Standalone & MRMC): Expert consensus by a panel of three US board-certified MSK radiologists. They provided binary labeling (presence/absence of fracture) and localization information (bounding boxes) for fractures. This is a form of expert consensus.
8. Sample Size for the Training Set
- Sample Size: 150,000 osteoarticular radiographs.
9. How Ground Truth for the Training Set was Established
The document states that the "training dataset for the subject device was expanded to include 150,000 osteoarticular radiographs". While it confirms the size and composition (mixed adult/pediatric, osteoarticular radiographs), it does not explicitly describe how the ground truth for this training set was established. It mentions that the "previous truthed predicate test dataset was strictly walled off and not included in the new training dataset," implying that the training data was "truthed," but the method (e.g., expert review, automated labeling, etc.) is not detailed. Given the large training set size, it is common for such datasets to be curated through a combination of established clinical reports, expert review, or semi-automated processes, but the specific methodology is not provided in this summary.
July 17, 2024
Image: letterhead with the logos of the U.S. Department of Health & Human Services and the U.S. Food & Drug Administration.
AZmed SAS
Christelle Baille
Head of QARA
6 rue Léonard de Vinci
Laval, France, 53000
Re: K240845
Trade/Device Name: Rayvolve
Regulation Number: 21 CFR 892.2090
Regulation Name: Radiological computer assisted detection and diagnosis software
Regulatory Class: Class II
Product Code: QBS
Dated: June 18, 2024
Received: June 18, 2024
Dear Christelle Baille:
We have reviewed your section 510(k) premarket notification of intent to market the device referenced above and have determined the device is substantially equivalent (for the indications for use stated in the enclosure) to legally marketed predicate devices marketed in interstate commerce prior to May 28, 1976, the enactment date of the Medical Device Amendments, or to devices that have been reclassified in accordance with the provisions of the Federal Food, Drug, and Cosmetic Act (the Act) that do not require approval of a premarket approval application (PMA). You may, therefore, market the device, subject to the general controls provisions of the Act. Although this letter refers to your product as a device, please be aware that some cleared products may instead be combination products. The 510(k) Premarket Notification Database available at https://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfpmn/pmn.cfm identifies combination product submissions. The general controls provisions of the Act include requirements for annual registration, listing of devices, good manufacturing practice, labeling, and prohibitions against misbranding and adulteration. Please note: CDRH does not evaluate information related to contract liability warranties. We remind you, however, that device labeling must be truthful and not misleading.
If your device is classified (see above) into either class II (Special Controls) or class III (PMA), it may be subject to additional controls. Existing major regulations affecting your device can be found in the Code of Federal Regulations, Title 21, Parts 800 to 898. In addition, FDA may publish further announcements concerning your device in the Federal Register.
Additional information about changes that may require a new premarket notification are provided in the FDA guidance documents entitled "Deciding When to Submit a 510(k) for a Change to an Existing Device" (https://www.fda.gov/media/99812/download) and "Deciding When to Submit a 510(k) for a Software Change to an Existing Device" (https://www.fda.gov/media/99785/download).
Your device is also subject to, among other requirements, the Quality System (QS) regulation (21 CFR Part 820), which includes, but is not limited to, 21 CFR 820.30, Design controls; 21 CFR 820.90, Nonconforming product; and 21 CFR 820.100, Corrective and preventive action. Please note that regardless of whether a change requires premarket review, the QS regulation requires device manufacturers to review and approve changes to device design and production (21 CFR 820.30 and 21 CFR 820.70) and document changes and approvals in the device master record (21 CFR 820.181).
Please be advised that FDA's issuance of a substantial equivalence determination does not mean that FDA has made a determination that your device complies with other requirements of the Act or any Federal statutes and regulations administered by other Federal agencies. You must comply with all the Act's requirements, including, but not limited to: registration and listing (21 CFR Part 807); labeling (21 CFR Part 801); medical device reporting (reporting of medical device-related adverse events) (21 CFR Part 803) for devices or postmarketing safety reporting (21 CFR Part 4, Subpart B) for combination products (see https://www.fda.gov/combination-products/guidance-regulatory-information/postmarketing-safety-reporting-combination-products); good manufacturing practice requirements as set forth in the quality systems (QS) regulation (21 CFR Part 820) for devices or current good manufacturing practices (21 CFR Part 4, Subpart A) for combination products; and, if applicable, the electronic product radiation control provisions (Sections 531-542 of the Act; 21 CFR Parts 1000-1050).
Also, please note the regulation entitled, "Misbranding by reference to premarket notification" (21 CFR 807.97). For questions regarding the reporting of adverse events under the MDR regulation (21 CFR Part 803), please go to https://www.fda.gov/medical-device-safety/medical-device-reportingmdr-how-report-medical-device-problems.
For comprehensive regulatory information about medical devices and radiation-emitting products, including information about labeling regulations, please see Device Advice (https://www.fda.gov/medical-devices/device-advice-comprehensive-regulatory-assistance) and CDRH Learn (https://www.fda.gov/training-and-continuing-education/cdrh-learn). Additionally, you may contact the Division of Industry and Consumer Education (DICE) to ask a question about a specific regulatory topic. See the DICE website (https://www.fda.gov/medical-devices/device-advice-comprehensive-regulatory-assistance/contact-us-division-industry-and-consumer-education-dice) for more information or contact DICE by email (DICE@fda.hhs.gov) or phone (1-800-638-2041 or 301-796-7100).
Sincerely,
Samuel for
Jessica Lamb, Ph.D. Assistant Director Imaging Software Team DHT8B: Division of Radiological Imaging Devices and Electronic Products OHT8: Office of Radiological Health Office of Product Evaluation and Quality Center for Devices and Radiological Health
Indications for Use
510(k) Number (if known) K240845
Device Name Rayvolve
Indications for Use (Describe)
Rayvolve is a computer-assisted detection and diagnosis (CAD) software device to assist radiologists and emergency physicians in detecting fractures during the review of radiographs of the musculoskeletal system. Rayvolve is indicated for the adult and pediatric population (≥ 2 years).
Rayvolve is indicated for radiographs of the following industry-standard radiographic views and study types.
Study type (Anatomic Area of interest) / Radiographic Views* supported: Ankle / AP, Lateral, Oblique; Clavicle / AP, AP Angulated View; Elbow / AP, Lateral; Forearm / AP, Lateral; Hip / AP, Frog-leg lateral; Humerus / AP, Lateral; Knee / AP, Lateral; Pelvis / AP; Shoulder / AP, Lateral, Axillary; Tibia/fibula / AP, Lateral; Wrist / PA, Lateral, Oblique; Hand / PA, Lateral, Oblique; Foot / AP, Lateral, Oblique.
*Definitions of anatomic area of interest and radiographic views are consistent with the ACR-SPR-SSR Practice Parameter for the Performance of Radiography of the Extremities guideline.
Type of Use (Select one or both, as applicable)
| ☑ Prescription Use (Part 21 CFR 801 Subpart D) | ☐ Over-The-Counter Use (21 CFR 801 Subpart C) |
|---|---|
CONTINUE ON A SEPARATE PAGE IF NEEDED.
This section applies only to requirements of the Paperwork Reduction Act of 1995.
DO NOT SEND YOUR COMPLETED FORM TO THE PRA STAFF EMAIL ADDRESS BELOW.
The burden time for this collection of information is estimated to average 79 hours per response, including the time to review instructions, search existing data sources, gather and maintain the data needed and complete and review the collection of information. Send comments regarding this burden estimate or any other aspect of this information collection, including suggestions for reducing this burden, to:
Department of Health and Human Services
Food and Drug Administration
Office of Chief Information Officer
Paperwork Reduction Act (PRA) Staff
PRAStaff@fda.hhs.gov
"An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB number."
RAYVOLVE 510K Summary
Content
1. Submitter
2. Device identification
3. Predicate device
4. Device description
5. Intended use/Indication for use
6. Substantial equivalence Discussion
7. Performance data
   a. Software verification and validation testing
   b. Bench Testing
   c. Clinical data
8. CONCLUSION
1. Submitter
Submitted date: 2024-07-15
| Submitter | AZmed SAS, 6 rue Léonard de Vinci, 53000 Laval, France. Phone: +33 6 43 31 51 38 |
|---|---|
| Contact person | Christelle BAILLE, Head of QARA, 6 rue Léonard de Vinci, 53000 Laval, France. Phone: +33 6 43 31 51 38. Mail: christelle@azmed.co |
2. Device identification
| Name of the Device | Common or Usual Name | Regulatory section | Classification | Product Code | Panel |
|---|---|---|---|---|---|
| Rayvolve | Rayvolve | 21 CFR 892.2090 | Class II | QBS | 90 (Radiology) |
3. Predicate device
The legally marketed device for which AZmed is claiming equivalence is identified as follows:
| Manufacturer | Product Name | 510(k) Number |
|---|---|---|
| AZmed | Rayvolve | K220164 |
4. Device description
The medical device is called Rayvolve. It is a standalone software that uses deep learning techniques to detect and localize fractures on osteoarticular X-rays. Rayvolve is intended to be used as an aided-diagnosis device and does not operate autonomously.
Rayvolve has been developed to use the current edition of the DICOM image standard. DICOM is the international standard for transmitting, storing, printing, processing, and displaying medical imaging.
Using the DICOM standard allows Rayvolve to interact with existing DICOM Node servers (e.g., PACS) and clinical-grade image viewers. The device is designed to run on-premise or on a cloud platform, connected to the radiology center's local network, and can interact with the DICOM Node server.
When remotely connected to a medical center's DICOM Node server, Rayvolve interacts directly with the DICOM files to output its prediction (potential presence or absence of fracture): the initial image appears first, followed by the image processed by Rayvolve.
Rayvolve is not intended to replace medical doctors. The instructions for use are strictly and systematically transmitted to each user and used to train them on Rayvolve's use.
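As a minimal illustration of the DICOM interoperability described above (a sketch only; Rayvolve's actual implementation is not disclosed in the summary), a service receiving files from a DICOM node can recognize a DICOM Part 10 file by its 128-byte preamble followed by the magic bytes "DICM", per the DICOM PS3.10 media storage specification:

```python
def is_dicom_part10(data: bytes) -> bool:
    """Check for the DICOM Part 10 file signature: a 128-byte
    preamble followed by the 4-byte prefix b'DICM' (PS3.10)."""
    return len(data) >= 132 and data[128:132] == b"DICM"

# Minimal in-memory example: a blank preamble plus the magic prefix.
sample = bytes(128) + b"DICM"
assert is_dicom_part10(sample)
```

In practice a production service would go on to parse the File Meta Information group and data set elements (typically via a DICOM toolkit) rather than stop at the signature check.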
5. Intended use/Indication for use
Rayvolve is a computer-assisted detection and diagnosis (CAD) software device to assist radiologists and emergency physicians in detecting fractures during the review of radiographs of the musculoskeletal system. Rayvolve is indicated for the adult and pediatric population (≥ 2 years).
Rayvolve is indicated for radiographs of the following industry-standard radiographic views and study types.
Study type (Anatomic Area of interest) / Radiographic Views* supported:
- Ankle / AP, Lateral, Oblique
- Clavicle / AP, AP Angulated View
- Elbow / AP, Lateral
- Forearm / AP, Lateral
- Hip / AP, Frog-leg lateral
- Humerus / AP, Lateral
- Knee / AP, Lateral
- Pelvis / AP
- Shoulder / AP, Lateral, Axillary
- Tibia/fibula / AP, Lateral
- Wrist / PA, Lateral, Oblique
- Hand / PA, Lateral, Oblique
- Foot / AP, Lateral, Oblique
*Definitions of anatomic area of interest and radiographic views are consistent with the ACR-SPR-SSR Practice Parameter for the Performance of Radiography of the Extremities guideline.
6. Substantial equivalence Discussion
The comparison chart below provides evidence to facilitate the substantial equivalence determination between Rayvolve and the predicate device (K220164) concerning the intended use, technological characteristics, and principle of operation of the subject device and the cited predicate device.
| Comparison to predicate device | Rayvolve - Predicate (K220164) | Rayvolve - Subject device 510(k) file | Comparison to the predicate |
|---|---|---|---|
| Device Name | Rayvolve | Rayvolve | Same |
| Manufacturer | AZmed SAS | AZmed SAS | Same |
| 510(k) # | K220164 | K240845 | N/A |
| Regulation Number | 21 CFR 892.2090 | 21 CFR 892.2090 | Same |
| Class | II | II | Same |
| Product Code | QBS | QBS | Same |
| Device Panel | Radiology | Radiology | Same |
| Level of Concern | Moderate | Moderate | Same |
| Intended use / Indications for use | Rayvolve is a computer-assisted detection and diagnosis (CAD) software device to assist radiologists and emergency physicians in detecting fractures during the review of radiographs of the musculoskeletal system. | Rayvolve is a computer-assisted detection and diagnosis (CAD) software device to assist radiologists and emergency physicians in detecting fractures during the review of radiographs of the musculoskeletal system. | Same |
| Intended user | Radiologists and emergency physicians | Radiologists and emergency physicians | Same |
| Intended patient population | Adult ≥ 22 years old | Adult and pediatric population | The subject device is indicated for pediatric (≥ 2 years) as well as adult patients. |
| Image modality | X-Ray | X-Ray | Same |
| Anatomic Areas of Interest | Ankle, Clavicle, Elbow, Forearm, Hip, Humerus, Knee, Pelvis, Shoulder, Tibia/fibula, Wrist, Hand, Foot | Ankle, Clavicle, Elbow, Forearm, Hip, Humerus, Knee, Pelvis, Shoulder, Tibia/fibula, Wrist, Hand, Foot | Same |
| Clinical findings | Fractures | Fractures | Same |
| Machine learning technology | Supervised deep learning | Supervised deep learning | Same |
| Image source | DICOM node (e.g., imaging device, intermediate DICOM node, PACS system, etc.) | DICOM node (e.g., imaging device, intermediate DICOM node, PACS system, etc.) | Same |
| Image viewing | PACS system; image annotations toggled on or off | PACS system; image annotations toggled on or off | Same |
| Privacy | HIPAA Compliant | HIPAA Compliant | Same |
| Platform | On-premise or on cloud; secure local processing and delivery of DICOM images (e.g., PACS) | On-premise or on cloud; secure local processing and delivery of DICOM images (e.g., PACS) | Same |
| Electromagnetic compatibility and electrical safety | N/A; Rayvolve is a standalone software and is not subject to electromagnetic testing. Therefore no electromagnetic compatibility and electrical safety testing is required. | N/A; Rayvolve is a standalone software and is not subject to electromagnetic testing. Therefore no electromagnetic compatibility and electrical safety testing is required. | Same |
| Magnetic resonance | N/A; Rayvolve is a standalone software and is not subject to magnetic resonance. Therefore no magnetic testing is required. | N/A; Rayvolve is a standalone software and is not subject to magnetic resonance. Therefore no magnetic testing is required. | Same |
| Animal and/or Cadaver Testing | N/A; Rayvolve is a standalone software | N/A; Rayvolve is a standalone software | Same |
| Biocompatibility | N/A; Rayvolve is a standalone software with no direct or indirect patient or user contacting components. Therefore no biocompatibility testing is required. | N/A; Rayvolve is a standalone software with no direct or indirect patient or user contacting components. Therefore no biocompatibility testing is required. | Same |
Table 2: Comparison between the predicate and subject devices
AZmed claims the substantial equivalence of Rayvolve with the predicate Rayvolve (K220164) based on the functional principle of the software algorithms, the same technological characteristics, and the intended purpose of the software algorithm.
7. Performance data
a. Software verification and validation testing
The device's software development, verification, and validation have been carried out in accordance with FDA guidelines. The software was tested against the established software design specifications under each test plan to ensure the device performs as intended. The device hazard analysis was completed, and risk controls were implemented to mitigate identified hazards. The testing results support that all the software specifications have met the acceptance criteria of each module and interaction of processes. Rayvolve passes all the testing, supporting the claims of substantial equivalence with the predicate.
Validation activities included a usability study of Rayvolve under normal conditions for use. The study demonstrated:
- Non-invasive usability, because users' habits are unchanged;
- Comprehension of the instructions for use provided with the device.
b. Bench Testing
i. Study on Pediatric population (≥ 2 years)
To include the pediatric population (≥ 2 years) in the indications for use of Rayvolve, AZmed conducted a standalone performance assessment on 3016 radiographs for all the study types (anatomic areas of interest) and views in the indication for Use. Within this standalone performance study, all the sensitivity, specificity, and AUC metrics have been computed per radiograph.
The results of standalone testing demonstrated that Rayvolve detects fractures of the musculoskeletal system radiographs with high sensitivity (0.9611, 95% Wilson's Confidence Interval (CI): 0.9480; 0.9710), high specificity (0.8597, 95% Wilson's CI: 0.8434; 0.8745), and high Area Under the Curve (AUC) of the Receiver Operating Characteristic (ROC) (0.9399, 95% Bootstrap CI: 0.9330; 0.9470).
Note: this study was performed on radiographs acquired on the following machines: Konica Minolta, IRay, AGFA, and GEHC.
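The per-radiograph sensitivity and specificity reported above follow from the standard 2x2 confusion matrix. A sketch with hypothetical counts (the summary reports only the resulting rates, not the underlying counts):

```python
def sens_spec(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    """Sensitivity = TP / (TP + FN); Specificity = TN / (TN + FP),
    computed per radiograph."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts chosen only to illustrate the calculation:
sensitivity, specificity = sens_spec(tp=480, fn=20, tn=430, fp=70)
# sensitivity = 0.96, specificity = 0.86
```

The reported Wilson confidence intervals would then be computed from these counts and the corresponding class sizes.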
The overall goal of the conducted study was to compare the diagnostic performances of Rayvolve on the pediatric (≥ 2 years) clinical performance study dataset to the diagnostic performances of Rayvolve on the adult clinical performance study dataset (included in the submission of the predicate device).
AZmed compares the AUC confidence intervals with those of the adult algorithm. Overlapping confidence intervals indicate that the pediatric software's performance is statistically non-inferior to the adult software, confirming similar efficacy in these regions.
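The bootstrap AUC confidence intervals used in this comparison can be sketched with a rank-based (Mann-Whitney) AUC and percentile resampling; the synthetic scores below are illustrative only, not study data:

```python
import random

def auc(pos_scores, neg_scores):
    """Empirical AUC as P(pos > neg) + 0.5 * P(tie) over all pairs."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

def bootstrap_auc_ci(pos_scores, neg_scores, n_boot=1000, alpha=0.05, seed=0):
    """Percentile bootstrap CI for the AUC, resampling each class
    with replacement and taking the alpha/2 and 1-alpha/2 quantiles."""
    rng = random.Random(seed)
    stats = sorted(
        auc([rng.choice(pos_scores) for _ in pos_scores],
            [rng.choice(neg_scores) for _ in neg_scores])
        for _ in range(n_boot)
    )
    return stats[int(alpha / 2 * n_boot)], stats[int((1 - alpha / 2) * n_boot) - 1]

# Illustrative synthetic model scores for fracture (pos) and normal (neg) cases:
pos = [0.9, 0.8, 0.85, 0.7, 0.95]
neg = [0.2, 0.4, 0.35, 0.5, 0.1]
lo, hi = bootstrap_auc_ci(pos, neg, n_boot=500)
```

Overlap between two such intervals (pediatric vs. adult) is the informal comparison AZmed describes; the formal non-inferiority claim rests on the bound of the AUC difference.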
The results of the study demonstrated that Rayvolve detects fractures in radiographs with high performance:
| | AUC (Bootstrapped CI) |
|---|---|
| All | 0.9399 (0.9330; 0.9470) |
Rayvolve performance on all radiographs
| Anatomic Area | AUC (Bootstrapped CI) |
|---|---|
| Ankle | 0.9489 (0.9257; 0.9694) |
| Clavicle | 0.9263 (0.8846; 0.9645) |
| Elbow | 0.9308 (0.8932; 0.9629) |
| Forearm | 0.936 (0.9012; 0.9666) |
| Humerus | 0.9568 (0.9268; 0.9818) |
| Hip | 0.947 (0.922; 0.9681) |
| Knee | 0.9624 (0.9472; 0.9756) |
| Pelvis | 0.9263 (0.8947; 0.9559) |
| Shoulder | 0.9372 (0.9037; 0.9664) |
| Tibia/Fibula | 0.9616 (0.9362; 0.9824) |
| Wrist | 0.9484 (0.9258; 0.9688) |
| Hand | 0.9485 (0.9306; 0.9654) |
| Foot | 0.9404 (0.9211; 0.9581) |
Rayvolve performance by anatomic area of interest
| Ethnicity | AUC (Bootstrapped CI) |
|---|---|
| Caucasian | 0.944 (0.9331; 0.9548) |
| Hispanic | 0.948 (0.9362; 0.9589) |
| African-American | 0.9542 (0.9335; 0.9724) |
| Asian | 0.9272 (0.8932; 0.9588) |
| Others | 0.9308 (0.9087; 0.9503) |
Rayvolve performance by ethnicity
For the pediatric anatomical regions, Rayvolve was not found to perform as well as it did on the adult performance test dataset. However, the results demonstrated that Rayvolve performs with high accuracy across study types and potential confounders (anatomic areas of interest, views, patient age and sex, image acquisition, fracture types, weight-bearing and non-weight-bearing bone fractures, and different X-ray system manufacturers).
ii. Study on the Adult population
1. Rayvolve predicate (K220164)
For the Rayvolve predicate (K220164), AZmed conducted a standalone performance assessment on 2626 radiographs covering all the study types (anatomic areas of interest) and views in the Indications for Use. The results of standalone testing demonstrated that Rayvolve detects fractures on musculoskeletal system radiographs with high sensitivity (0.98763; 95% Wilson's Confidence Interval (CI): 0.97559; 0.99421), high specificity (0.88558; 95% Wilson's CI: 0.87119; 0.89882), and a high Area Under the Curve (AUC) of the Receiver Operating Characteristic (ROC) (0.98607; 95% Bootstrap CI: 0.98104; 0.99058).
Additionally, the results demonstrated that Rayvolve performs with high accuracy across study types (anatomic areas of interest, views, patient age, sex, and machine) and across potential confounders such as different X-ray manufacturers.
Note: this study was conducted on images from the following machines: Philips DigitalDiagnost, Carestream Health DRX-1, and GEHC.
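The sensitivity and specificity intervals quoted above are Wilson score intervals. For reference, a minimal implementation of the Wilson interval (an illustrative sketch, not AZmed's code) is:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score interval for a binomial proportion (default ~95%)."""
    p = successes / n
    denom = 1 + z ** 2 / n
    center = (p + z ** 2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return center - half, center + half
```

Unlike the normal-approximation (Wald) interval, the Wilson interval stays within [0, 1] and behaves well for proportions near 1, which matters here because the reported sensitivities approach 0.99.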
The results of the standalone testing demonstrated that Rayvolve detects fractures of the musculoskeletal system radiographs with high AUC across the following subgroups:
| Subgroup | AUC (Bootstrapped CI) |
|---|---|
| All | 0.98607 (0.98104; 0.99058) |
| Anatomic Area | AUC (Bootstrapped CI) |
|---|---|
| Ankle | 0.99137 (0.98374; 0.99727) |
| Clavicle | 0.97806 (0.94626; 0.99761) |
| Elbow | 0.9964 (0.99059; 1.0) |
| Forearm | 0.9953 (0.98909; 0.99937) |
| Humerus | 0.9955 (0.98960; 0.99943) |
| Hip | 0.95821 (0.93239; 0.98014) |
| Knee | 0.97742 (0.95084; 0.99592) |
| Pelvis | 0.97676 (0.95241; 0.99638) |
| Shoulder | 0.97814 (0.94147; 0.99958) |
| Tibia/Fibula | 0.98285 (0.95925; 0.9978) |
| Wrist | 0.99567 (0.99126; 0.99897) |
| Hand | 0.99552 (0.99074; 0.99898) |
| Foot | 0.99162 (0.98238; 0.99823) |
Rayvolve performance on all radiographs
Rayvolve performance by anatomic area of the study
2. Rayvolve retrained algorithm
Study performed for the retrained algorithm
The core design of the Rayvolve algorithm, including the object detection model, remains unchanged from the predicate device (Rayvolve K220164). The architecture and key components are consistent with those previously described.
The training dataset for the subject device was expanded to include 150,000 osteoarticular radiographs, compared to 115,000 in the predicate device. This expansion was undertaken to enhance the algorithm's robustness by including a more comprehensive representation of pediatric cases alongside adult cases. The algorithm was retrained using the expanded dataset, which involved adjusting the model weights to optimize performance across the broader dataset that includes both adult and pediatric populations. The previous truthed predicate test dataset was strictly walled off and not included in the new training dataset.
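The walling-off of the predicate test set described above amounts to filtering the training pool against a frozen list of test-set identifiers. The snippet below is a hypothetical sketch of that principle; the `study_id` field and function name are illustrative, not AZmed's data pipeline.

```python
def wall_off(training_pool, frozen_test_ids):
    """Return only the studies whose IDs are not in the frozen test set.

    training_pool: list of dicts, each with a unique "study_id" key.
    frozen_test_ids: iterable of IDs reserved for the held-out test set.
    """
    frozen = set(frozen_test_ids)
    return [study for study in training_pool if study["study_id"] not in frozen]
```

Filtering on a study-level identifier (rather than an image-level one) is the usual way to prevent leakage, since multiple views of the same study are highly correlated.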
Following the retraining, AZmed conducted a performance study using the same testing methodologies applied to the predicate device. The results of this study show that the retrained algorithm performs non-inferiorly across all body parts compared to the predicate device.
| Anatomic Area | Predicate AUC (Bootstrapped CI) | Re-trained AUC (Bootstrapped CI) |
|---|---|---|
| Ankle | 0.99137 (0.98374; 0.99727) | 0.99732 (0.98969; 1) |
| Clavicle | 0.97806 (0.94626; 0.99761) | 0.98393 (0.95118; 0.99949) |
| Elbow | 0.9964 (0.99059; 1.0) | 0.99441 (0.98761; 0.99701) |
| Forearm | 0.9953 (0.98909; 0.99937) | 0.9943 (0.98809; 0.99737) |
| Humerus | 0.9955 (0.98960; 0.99943) | 0.99351 (0.98662; 0.99644) |
| Hip | 0.95821 (0.93239; 0.98014) | 0.95725 (0.93236; 0.97918) |
| Anatomic Area | Predicate AUC (Bootstrapped CI) | Re-trained AUC (Bootstrapped CI) |
|---|---|---|
| Knee | 0.97742 (0.95084; 0.99592) | 0.9784 (0.95182; 0.9969) |
| Pelvis | 0.97676 (0.95241; 0.99638) | 0.97774 (0.95339; 0.99836) |
| Shoulder | 0.97814 (0.94147; 0.99958) | 0.98303 (0.94542; 1) |
| Tibia/Fibula | 0.98285 (0.95925; 0.9978) | 0.98776 (0.96512; 0.99872) |
| Wrist | 0.99567 (0.99126; 0.99897) | 0.99567 (0.99126; 0.99797) |
| Hand | 0.99552 (0.99074; 0.99898) | 0.99452 (0.98875; 0.99698) |
| Foot | 0.99162 (0.98238; 0.99823) | 0.99757 (0.98735; 1) |
| Total | 0.98607 (0.98104; 0.99058) | 0.98781 (0.98247; 0.99048) |
Rayvolve performance of the retrained algorithm compared with the predicate
Non-inferiority testing
AZmed's acceptance criterion for non-inferiority in adult testing uses a margin of 0.05. The bootstrap method, which makes minimal distributional assumptions, is used to compare the AUCs (Area Under the Curve) and to assess the variability and confidence intervals of the AUC values. If the lower bound of the difference in AUCs (Retrained − Predicate) exceeds -0.05, non-inferiority is confirmed.
Based on the calculated differences and their confidence intervals, non-inferiority can be concluded for all anatomic areas using a non-inferiority margin of -0.05. The lower bounds of the differences in AUCs for the Retrained model compared to the Predicate model are all greater than -0.05, indicating that the Retrained model's performance is not inferior to the Predicate model's across all anatomic areas.
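A paired percentile-bootstrap version of this non-inferiority test can be sketched as follows. This is an illustrative reading of the procedure described above, not AZmed's actual code: both models are scored on the same resampled cases, and non-inferiority is declared when the lower bound of the AUC difference distribution exceeds the negative margin.

```python
import numpy as np

def _auc(y, s):
    """ROC AUC via the Mann-Whitney statistic (ties count one half)."""
    pos, neg = s[y == 1], s[y == 0]
    wins = (pos[:, None] > neg[None, :]).sum() + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

def noninferior(y_true, score_new, score_ref, margin=0.05, n_boot=2000, seed=0):
    """Paired percentile-bootstrap test of AUC non-inferiority.

    Resamples cases with replacement, computes the AUC difference
    (new - reference) on each resample, and declares non-inferiority
    when the lower 2.5th-percentile bound exceeds -margin.
    """
    rng = np.random.default_rng(seed)
    y = np.asarray(y_true)
    s_new = np.asarray(score_new, dtype=float)
    s_ref = np.asarray(score_ref, dtype=float)
    n = len(y)
    diffs = []
    while len(diffs) < n_boot:
        idx = rng.integers(0, n, n)
        if 0 < y[idx].sum() < n:  # both classes must be present
            diffs.append(_auc(y[idx], s_new[idx]) - _auc(y[idx], s_ref[idx]))
    lower = np.percentile(diffs, 2.5)
    return lower, lower > -margin
```

Pairing the resamples (the same bootstrap indices for both models) is what makes the test sensitive: case-to-case difficulty variation cancels out of the difference.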
C. Clinical data
No clinical studies were conducted in support of the 510(k) submission of Rayvolve.
Rayvolve is based on the same AI algorithm as the predicate device: Rayvolve (K220164).
AZmed conducted a fully crossed multi-reader, multi-case (MRMC) retrospective reader study to determine the impact of Rayvolve on reader performance in diagnosing fractures.
The primary objective of the study was to determine whether the diagnostic accuracy of readers aided by Rayvolve ("Rayvolve-aided") is superior to the diagnostic accuracy of readers unaided by Rayvolve ("Rayvolve-unaided"), as determined by the AUC of the Receiver Operating Characteristic (ROC) curve. The secondary objective was to report the sensitivity and specificity of the Rayvolve-aided and unaided reads.
24 clinical readers each evaluated 186 cases within Rayvolve's indications for use under both Rayvolve-aided and Rayvolve-unaided conditions. Each case had been previously evaluated by a panel of three US board-certified MSK radiologists to provide the ground truth: a binary label for the presence or absence of fracture and localization information for each fracture. The MRMC study consisted of two independent reading sessions separated by a washout period of at least one month to avoid memory bias.
For each case, each reader was required to provide a binary determination of the presence or absence of a fracture and also to draw a bounding box around each fracture on the image to determine the localization of each fracture.
In addition to the binary decision regarding the presence or absence of fracture, each reader also provided a report score on an ordinal scale. This report score was collected for every case and every reader, with and without the aid of the Rayvolve device, and was used as the ROC data.
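An ordinal report score generates ROC data by treating each score level as a decision threshold. The sketch below shows that construction under illustrative assumptions (binary truth labels, integer score levels); it is a generic description of the technique, not the study's analysis code.

```python
import numpy as np

def roc_points_from_ordinal(truth, report_scores):
    """Empirical ROC operating points from an ordinal report score.

    Each distinct score level defines a threshold 'call fracture at or
    above this level', giving one (FPR, TPR) point per level.
    """
    truth = np.asarray(truth)
    scores = np.asarray(report_scores)
    n_pos = (truth == 1).sum()
    n_neg = (truth == 0).sum()
    points = [(1.0, 1.0)]                       # threshold below every score
    for level in sorted(set(scores.tolist())):
        called = scores >= level
        tpr = (called & (truth == 1)).sum() / n_pos
        fpr = (called & (truth == 0)).sum() / n_neg
        points.append((float(fpr), float(tpr)))
    points.append((0.0, 0.0))                   # threshold above every score
    return points
```

Computing these points per reader and per condition (aided vs. unaided) yields the per-reader ROC curves that an MRMC analysis such as DBM then models.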
The results of the study found that the diagnostic accuracy of readers in the intended use population is superior when aided by Rayvolve compared with when unaided by Rayvolve, as measured at the task of fracture detection using the AUC of the ROC curve, calculated with the Dorfman-Berbaum-Metz (DBM) modeling approach.
Clinical Reader Study Results: Rayvolve-Aided vs Rayvolve-Unaided ROC Curves
[Figure: ROC curves (sensitivity vs. 1-specificity) for unaided and aided readers; the aided-reader curve lies above the unaided-reader curve, indicating better performance.]
In particular, the study results demonstrated:
- Reader AUC was significantly improved from 0.84602 to 0.89327, a difference of 0.04725 (95% CI: 0.03376; 0.061542), across the 186 cases within Rayvolve's Indications for Use, spanning 13 study types (anatomic areas of interest) (p=0.0041).
- Reader sensitivity was significantly improved from 0.86561 (95% Wilson's CI: 0.84859, 0.88099) to 0.9554 (95% Wilson's CI: 0.94453, 0.96422)
- Reader specificity was improved from 0.82645 (95% Wilson's CI: 0.81187, 0.84012) to 0.83116 (95% Wilson's CI: 0.81673, 0.84467)
8. CONCLUSION
Both the proposed device (Rayvolve) and the predicate device are computer-assisted detection and diagnosis devices that accept as input radiographs in DICOM format and use machine learning techniques to identify and highlight fractures in the adult and pediatric population (≥ 2 years). The overall design and development of the software show that the device performs as intended, and the differences in indications for use, including the new patient population (pediatric ≥ 2 years), do not raise different questions of safety and effectiveness. The results of standalone and clinical studies demonstrate that Rayvolve performs according to its specifications and meets user needs and the intended use.
Therefore, the Rayvolve subject device and the Rayvolve predicate device (K220164) are substantially equivalent.
§ 892.2090 Radiological computer-assisted detection and diagnosis software.
(a) Identification. A radiological computer-assisted detection and diagnostic software is an image processing device intended to aid in the detection, localization, and characterization of fracture, lesions, or other disease-specific findings on acquired medical images (e.g., radiography, magnetic resonance, computed tomography). The device detects, identifies, and characterizes findings based on features or information extracted from images, and provides information about the presence, location, and characteristics of the findings to the user. The analysis is intended to inform the primary diagnostic and patient management decisions that are made by the clinical user. The device is not intended as a replacement for a complete clinician's review or their clinical judgment that takes into account other relevant information from the image or patient history.
(b) Classification. Class II (special controls). The special controls for this device are:
(1) Design verification and validation must include:
(i) A detailed description of the image analysis algorithm, including a description of the algorithm inputs and outputs, each major component or block, how the algorithm and output affects or relates to clinical practice or patient care, and any algorithm limitations.
(ii) A detailed description of pre-specified performance testing protocols and dataset(s) used to assess whether the device will provide improved assisted-read detection and diagnostic performance as intended in the indicated user population(s), and to characterize the standalone device performance for labeling. Performance testing includes standalone test(s), side-by-side comparison(s), and/or a reader study, as applicable.
(iii) Results from standalone performance testing used to characterize the independent performance of the device separate from aided user performance. The performance assessment must be based on appropriate diagnostic accuracy measures (e.g., receiver operator characteristic plot, sensitivity, specificity, positive and negative predictive values, and diagnostic likelihood ratio). Devices with localization output must include localization accuracy testing as a component of standalone testing. The test dataset must be representative of the typical patient population with enrichment made only to ensure that the test dataset contains a sufficient number of cases from important cohorts (e.g., subsets defined by clinically relevant confounders, effect modifiers, concomitant disease, and subsets defined by image acquisition characteristics) such that the performance estimates and confidence intervals of the device for these individual subsets can be characterized for the intended use population and imaging equipment.
(iv) Results from performance testing that demonstrate that the device provides improved assisted-read detection and/or diagnostic performance as intended in the indicated user population(s) when used in accordance with the instructions for use. The reader population must be comprised of the intended user population in terms of clinical training, certification, and years of experience. The performance assessment must be based on appropriate diagnostic accuracy measures (e.g., receiver operator characteristic plot, sensitivity, specificity, positive and negative predictive values, and diagnostic likelihood ratio). Test datasets must meet the requirements described in paragraph (b)(1)(iii) of this section.
(v) Appropriate software documentation, including device hazard analysis, software requirements specification document, software design specification document, traceability analysis, system level test protocol, pass/fail criteria, testing results, and cybersecurity measures.
(2) Labeling must include the following:
(i) A detailed description of the patient population for which the device is indicated for use.
(ii) A detailed description of the device instructions for use, including the intended reading protocol and how the user should interpret the device output.
(iii) A detailed description of the intended user, and any user training materials or programs that address appropriate reading protocols for the device, to ensure that the end user is fully aware of how to interpret and apply the device output.
(iv) A detailed description of the device inputs and outputs.
(v) A detailed description of compatible imaging hardware and imaging protocols.
(vi) Warnings, precautions, and limitations must include situations in which the device may fail or may not operate at its expected performance level (e.g., poor image quality or for certain subpopulations), as applicable.
(vii) A detailed summary of the performance testing, including test methods, dataset characteristics, results, and a summary of sub-analyses on case distributions stratified by relevant confounders, such as anatomical characteristics, patient demographics and medical history, user experience, and imaging equipment.