K Number
K251873
Device Name
Saige-Dx
Manufacturer
DeepHealth, Inc.
Date Cleared
2025-08-11

(54 days)

Product Code
QDQ
Regulation Number
892.2090
Reference & Predicate Devices
K243688 (predicate: Saige-Dx v.3.1.0); no reference devices
Predicate For
N/A
Intended Use

Saige-Dx analyzes digital breast tomosynthesis (DBT) mammograms to identify the presence or absence of soft tissue lesions and calcifications that may be indicative of cancer. For a given DBT mammogram, Saige-Dx analyzes the DBT image stacks and the accompanying 2D images, including full field digital mammography and/or synthetic images. The system assigns a Suspicion Level, indicating the strength of suspicion that cancer may be present, for each detected finding and for the entire case. The outputs of Saige-Dx are intended to be used as a concurrent reading aid for interpreting physicians on screening mammograms with compatible DBT hardware.

Device Description

Saige-Dx is a software device that processes screening mammograms using artificial intelligence to aid interpreting radiologists. By automatically detecting the presence or absence of soft tissue lesions and calcifications in mammography images, Saige-Dx can help improve reader performance, while also reducing reading time. The software takes as input a set of x-ray mammogram DICOM files from a single digital breast tomosynthesis (DBT) study and generates finding-level outputs for each image analyzed, as well as an aggregate case-level assessment. Saige-Dx processes both the DBT image stacks and the associated 2D images (full-field digital mammography (FFDM) and/or synthetic 2D images) in a DBT study. For each image, Saige-Dx outputs bounding boxes circumscribing any detected findings and assigns a Finding Suspicion Level to each finding, indicating the degree of suspicion that the finding is malignant. Saige-Dx uses the results of the finding-level analysis to generate a Case Suspicion Level, indicating the degree of suspicion for malignancy across the case. Saige-Dx encapsulates the finding and case-level results into a DICOM Structured Report (SR) object containing markings that can be overlaid on the original mammogram images using a viewing workstation and a DICOM Secondary Capture (SC) object containing a summary report of the Saige-Dx results.
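As a rough illustration of the finding-to-case flow described above, the sketch below models finding-level outputs (a bounding box plus a Finding Suspicion Level) and derives a case-level value by taking the maximum over findings. All field names, the ordinal scale, and the max-aggregation rule are assumptions for illustration only; Saige-Dx's actual data model and aggregation logic are not disclosed in this summary.

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Finding:
    """One detected finding on a single image (illustrative fields only)."""
    image_uid: str                        # identifier of the analyzed image
    bounding_box: Tuple[int, int, int, int]  # (x, y, width, height) in pixels
    suspicion_level: int                  # ordinal scale assumed for illustration


def case_suspicion_level(findings: List[Finding]) -> int:
    """Aggregate finding-level suspicion into one case-level value.

    The real aggregation rule is proprietary; taking the maximum over
    findings (0 when none are detected) is only a plausible stand-in.
    """
    return max((f.suspicion_level for f in findings), default=0)


findings = [
    Finding("img-rcc", (120, 340, 60, 60), 2),
    Finding("img-lmlo", (410, 220, 80, 45), 4),
]
print(case_suspicion_level(findings))  # -> 4
```

The same pattern (per-image results rolled up to a study-level summary) is what the DICOM SR and SC outputs described above would carry back to the viewing workstation.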

AI/ML Overview

The following is a breakdown of the acceptance criteria and the study demonstrating that the device meets them, based on the provided FDA 510(k) clearance letter for Saige-Dx:

1. Table of Acceptance Criteria and Reported Device Performance

The provided document indicates that the primary endpoint of the standalone performance testing was to demonstrate non-inferiority of the subject device (new Saige-Dx version) to the predicate device (previous Saige-Dx version). Specific quantitative acceptance criteria (e.g., AUC, sensitivity, specificity thresholds) are not explicitly stated in the provided text. However, the document states:

"The test met the pre-specified performance criteria, and the results support the safety and effectiveness of Saige-Dx updated AI model on Hologic and GE exams."

  • Acceptance criterion: Non-inferiority of subject device performance to predicate device performance (not explicitly quantified in the source).
    Reported performance: "The test met the pre-specified performance criteria, and the results support the safety and effectiveness of Saige-Dx updated AI model on Hologic and GE exams."
  • Acceptance criterion: Consistent performance across breast densities, ages, race/ethnicities, and lesion types and sizes.
    Reported performance: Subgroup analyses "demonstrated similar standalone performance trends across breast densities, ages, race/ethnicities, and lesion types and sizes."
  • Acceptance criterion: Software design and implementation meeting requirements.
    Reported performance: Verification testing, including unit, integration, system, and regression testing, confirmed that "the software, as designed and implemented, satisfied the software requirements and has no unintentional differences from the predicate device."

2. Sample Size for the Test Set and Data Provenance

  • Sample Size for Test Set: 2,002 DBT screening mammograms from unique women.
    • 259 cancer cases
    • 1,743 non-cancer cases
  • Data Provenance:
    • Country of Origin: United States (cases collected from 12 diverse clinical sites).
    • Retrospective or Prospective: Retrospective.
    • Acquisition Equipment: Hologic (standard definition and high definition) and GE images.

3. Number of Experts Used to Establish Ground Truth for the Test Set and Qualifications

The document mentions: "The case collection and ground truth lesion localization processes of the newly collected cases were the same processes used for the previously collected test dataset (details provided in K220105)."

  • While the specific number and qualifications of experts for the ground truth of the current test set are not explicitly detailed in this document, it refers back to K220105 for those details. It implies that a standardized process involving experts was used.

4. Adjudication Method for the Test Set

The document does not explicitly describe the adjudication method (e.g., 2+1, 3+1) used for establishing ground truth for the test set. It states: "The case collection and ground truth lesion localization processes of the newly collected cases were the same processes used for the previously collected test dataset (details provided in K220105)." This indicates a pre-defined ground truth process, with the specifics documented in the earlier submission.

5. Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study

  • Was it done? Yes.
  • Effect Size: The document states: "a multi-reader multi-case (MRMC) study was previously conducted for the predicate device and remains applicable to the subject device." It does not provide details on the effect size (how much human readers improve with AI vs. without AI assistance) within this document. Readers would need to refer to the K220105 submission for that information if it was presented there.

6. Standalone (Algorithm Only) Performance Study

  • Was it done? Yes.
  • Description: "Validation of the software was conducted using a retrospective and blinded multicenter standalone performance testing under an IRB approved protocol..."
  • Primary Endpoint: "to demonstrate that the performance of the subject device was non-inferior to the performance of the predicate device."

7. Type of Ground Truth Used

  • The ground truth involved the presence or absence of cancer, with cases categorized as 259 cancer and 1,743 non-cancer. The mention of "ground truth lesion localization processes" implies a detailed assessment of findings, likely involving expert consensus and/or pathology/biopsy results to confirm malignancy. For a cancer-detection aid, pathology is the gold standard for confirmation.

8. Sample Size for the Training Set

  • Training Dataset: 161,323 patients and 300,439 studies.

9. How the Ground Truth for the Training Set Was Established

  • The document states: "The Saige-Dx algorithm was trained on a robust and diverse dataset of mammography exams acquired from multiple vendors including GE and Hologic equipment."
  • The document does not explicitly detail how ground truth was established for the training set (e.g., expert consensus, pathology reports). For a cancer-detection AI, however, training labels are typically derived from rigorous clinical assessments, including follow-up, biopsy results, and/or expert interpretations, to accurately label cancer and non-cancer cases for the algorithm to learn from. The described "robust and diverse" nature of the training data suggests a comprehensive ground truth approach.

FDA 510(k) Clearance Letter - Saige-Dx

Page 1

U.S. Food & Drug Administration
10903 New Hampshire Avenue
Silver Spring, MD 20993
www.fda.gov

Doc ID # 04017.08.00

August 11, 2025

DeepHealth, Inc.
B. Nathan Hunt
Head of Quality, Regulatory and Compliance
212 Elm Street
Somerville, Massachusetts 02144

Re: K251873
Trade/Device Name: Saige-Dx
Regulation Number: 21 CFR 892.2090
Regulation Name: Radiological Computer Assisted Detection And Diagnosis Software
Regulatory Class: Class II
Product Code: QDQ
Dated: June 17, 2025
Received: June 18, 2025

Dear B. Nathan Hunt:

We have reviewed your section 510(k) premarket notification of intent to market the device referenced above and have determined the device is substantially equivalent (for the indications for use stated in the enclosure) to legally marketed predicate devices marketed in interstate commerce prior to May 28, 1976, the enactment date of the Medical Device Amendments, or to devices that have been reclassified in accordance with the provisions of the Federal Food, Drug, and Cosmetic Act (the Act) that do not require approval of a premarket approval application (PMA). You may, therefore, market the device, subject to the general controls provisions of the Act. Although this letter refers to your product as a device, please be aware that some cleared products may instead be combination products. The 510(k) Premarket Notification Database available at https://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfpmn/pmn.cfm identifies combination product submissions.

The general controls provisions of the Act include requirements for annual registration, listing of devices, good manufacturing practice, labeling, and prohibitions against misbranding and adulteration. Please note: CDRH does not evaluate information related to contract liability warranties. We remind you, however, that device labeling must be truthful and not misleading.

If your device is classified (see above) into either class II (Special Controls) or class III (PMA), it may be subject to additional controls. Existing major regulations affecting your device can be found in the Code of Federal Regulations, Title 21, Parts 800 to 898. In addition, FDA may publish further announcements concerning your device in the Federal Register.

Additional information about changes that may require a new premarket notification are provided in the FDA guidance documents entitled "Deciding When to Submit a 510(k) for a Change to an Existing Device" (https://www.fda.gov/media/99812/download) and "Deciding When to Submit a 510(k) for a Software Change to an Existing Device" (https://www.fda.gov/media/99785/download).

Page 2

K251873 - B. Nathan Hunt
Page 2

Your device is also subject to, among other requirements, the Quality System (QS) regulation (21 CFR Part 820), which includes, but is not limited to, 21 CFR 820.30, Design controls; 21 CFR 820.90, Nonconforming product; and 21 CFR 820.100, Corrective and preventive action. Please note that regardless of whether a change requires premarket review, the QS regulation requires device manufacturers to review and approve changes to device design and production (21 CFR 820.30 and 21 CFR 820.70) and document changes and approvals in the device master record (21 CFR 820.181).

Please be advised that FDA's issuance of a substantial equivalence determination does not mean that FDA has made a determination that your device complies with other requirements of the Act or any Federal statutes and regulations administered by other Federal agencies. You must comply with all the Act's requirements, including, but not limited to: registration and listing (21 CFR Part 807); labeling (21 CFR Part 801); medical device reporting (reporting of medical device-related adverse events) (21 CFR Part 803) for devices or postmarketing safety reporting (21 CFR Part 4, Subpart B) for combination products (see https://www.fda.gov/combination-products/guidance-regulatory-information/postmarketing-safety-reporting-combination-products); good manufacturing practice requirements as set forth in the quality systems (QS) regulation (21 CFR Part 820) for devices or current good manufacturing practices (21 CFR Part 4, Subpart A) for combination products; and, if applicable, the electronic product radiation control provisions (Sections 531-542 of the Act); 21 CFR Parts 1000-1050.

All medical devices, including Class I and unclassified devices and combination product device constituent parts are required to be in compliance with the final Unique Device Identification System rule ("UDI Rule"). The UDI Rule requires, among other things, that a device bear a unique device identifier (UDI) on its label and package (21 CFR 801.20(a)) unless an exception or alternative applies (21 CFR 801.20(b)) and that the dates on the device label be formatted in accordance with 21 CFR 801.18. The UDI Rule (21 CFR 830.300(a) and 830.320(b)) also requires that certain information be submitted to the Global Unique Device Identification Database (GUDID) (21 CFR Part 830 Subpart E). For additional information on these requirements, please see the UDI System webpage at https://www.fda.gov/medical-devices/device-advice-comprehensive-regulatory-assistance/unique-device-identification-system-udi-system.

Also, please note the regulation entitled, "Misbranding by reference to premarket notification" (21 CFR 807.97). For questions regarding the reporting of adverse events under the MDR regulation (21 CFR Part 803), please go to https://www.fda.gov/medical-devices/medical-device-safety/medical-device-reporting-mdr-how-report-medical-device-problems.

For comprehensive regulatory information about medical devices and radiation-emitting products, including information about labeling regulations, please see Device Advice (https://www.fda.gov/medical-devices/device-advice-comprehensive-regulatory-assistance) and CDRH Learn (https://www.fda.gov/training-and-continuing-education/cdrh-learn). Additionally, you may contact the Division of Industry and Consumer Education (DICE) to ask a question about a specific regulatory topic. See the DICE website (https://www.fda.gov/medical-devices/device-advice-comprehensive-regulatory-assistance/contact-us-division-industry-and-consumer-education-dice) for more information or contact DICE by email (DICE@fda.hhs.gov) or phone (1-800-638-2041 or 301-796-7100).

Page 3


Sincerely,

Paramita Sengupta -S

For
Yanna Kang
Assistant Director
Mammography and Ultrasound Team
DHT8C: Division of Radiological
Imaging and Radiation Therapy Devices
OHT8: Office of Radiological Health
Office of Product Evaluation and Quality
Center for Devices and Radiological Health

Enclosure

Page 4

Indications for Use

510(k) Number (if known)
K251873
Device Trade Name(s)
Saige-Dx
Indications for Use
Saige-Dx analyzes digital breast tomosynthesis (DBT) mammograms to identify the presence or absence of soft tissue lesions and calcifications that may be indicative of cancer. For a given DBT mammogram, Saige-Dx analyzes the DBT image stacks and the accompanying 2D images, including full field digital mammography and/or synthetic images. The system assigns a Suspicion Level, indicating the strength of suspicion that cancer may be present, for each detected finding and for the entire case. The outputs of Saige-Dx are intended to be used as a concurrent reading aid for interpreting physicians on screening mammograms with compatible DBT hardware.
Type of Use (select one or both, as applicable)
☒ Prescription Use (Part 21 CFR 801 Subpart D)
☐ Over-The-Counter Use (21 CFR 801 Subpart C)


Page 5

510(k) Summary - Saige-Dx - DeepHealth, Inc

212 Elm St
Somerville, MA 02144
www.deephealth.com

K251873

510(k) Summary

DeepHealth, Inc

Saige-Dx

In accordance with 21 CFR 807.92, the following summary of information is provided on August 1, 2025:

1. 510(k) SUBMITTER

DeepHealth, Inc
212 Elm St.
Somerville, MA 02144
Tel: 443-506-8911

Contact Person:

B. Nathan Hunt
Head of Quality, Regulatory, and Compliance
DeepHealth, Inc
212 Elm St
Somerville, MA 02144
Tel: 443-506-8911

Date Prepared:

August 1, 2025

2. DEVICE

Trade Name of Device:
Saige-Dx

Common or Usual Name:
Medical Image Software

Classification Names:
Radiological Computer Assisted Detection/Diagnosis Software for Lesions Suspicious for Cancer (21 CFR 892.2090)

Regulation Class: II

Product Code: QDQ

3. PREDICATE DEVICE

Predicate Device:

Trade Name: Saige-Dx
Device Model: v.3.1.0

Page 6


Common or Usual Name: Medical Image Software

Classification Names:
Radiological Computer Assisted Detection/Diagnosis Software for Lesions Suspicious for Cancer (21 CFR 892.2090)

Regulation Class: II

Product Code: QDQ

510(k) No.: K243688

This predicate has not been subject to a design-related recall.

No reference devices were used in this submission.

4. DEVICE DESCRIPTION

Saige-Dx is a software device that processes screening mammograms using artificial intelligence to aid interpreting radiologists. By automatically detecting the presence or absence of soft tissue lesions and calcifications in mammography images, Saige-Dx can help improve reader performance, while also reducing reading time. The software takes as input a set of x-ray mammogram DICOM files from a single digital breast tomosynthesis (DBT) study and generates finding-level outputs for each image analyzed, as well as an aggregate case-level assessment. Saige-Dx processes both the DBT image stacks and the associated 2D images (full-field digital mammography (FFDM) and/or synthetic 2D images) in a DBT study. For each image, Saige-Dx outputs bounding boxes circumscribing any detected findings and assigns a Finding Suspicion Level to each finding, indicating the degree of suspicion that the finding is malignant. Saige-Dx uses the results of the finding-level analysis to generate a Case Suspicion Level, indicating the degree of suspicion for malignancy across the case. Saige-Dx encapsulates the finding and case-level results into a DICOM Structured Report (SR) object containing markings that can be overlaid on the original mammogram images using a viewing workstation and a DICOM Secondary Capture (SC) object containing a summary report of the Saige-Dx results.

5. INDICATIONS FOR USE

Saige-Dx analyzes digital breast tomosynthesis (DBT) mammograms to identify the presence or absence of soft tissue lesions and calcifications that may be indicative of cancer. For a given DBT mammogram, Saige-Dx analyzes the DBT image stacks and the accompanying 2D images, including full field digital mammography and/or synthetic images. The system assigns a Suspicion Level, indicating the strength of suspicion that cancer may be present, for each detected finding and for the entire case. The outputs of Saige-Dx are intended to be used as a concurrent reading aid for interpreting physicians on screening mammograms with compatible DBT hardware.

Intended User Population

The intended users of Saige-Dx are interpreting physicians qualified to read screening mammography exams.

Intended Patient Populations

The device is intended to be used on women from a screening population undergoing screening mammography.

Warnings and Precautions

Saige-Dx is an adjunct tool and is not intended to replace a physician's own review of a mammogram. Decisions should not be made solely based on analysis by Saige-Dx.


Page 7


6. PREDICATE DEVICE COMPARISON

Saige-Dx and the predicate device have the same indications for use and patient population, and similar technical characteristics and principles of operation. Compared to the predicate device, the subject device includes an updated AI algorithm, an added output configuration, improved acceptance checks, and refinements to improve usability. The device also includes improved memory and GPU management. These differences do not alter the safety or effectiveness of the subject device for its intended use.

Both the subject and predicate devices are intended to be used by physicians to aid in the interpretation of screening mammograms. The devices are not intended to be used as a replacement for a full physician review or their own clinical judgment. Both the subject and predicate devices are software systems that use artificial intelligence (AI)/machine learning algorithms that analyze mammography images to detect and characterize findings and provide information regarding the presence and location of the findings to the user.

Both devices are designed to fit in parallel to the standard-of-care workflow: mammography imaging studies are routed from the healthcare facility to the software device for processing, and after the analysis is completed, the results are sent back to the calling system to be displayed in the PACS or other worklist software.
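The parallel-to-workflow routing described above can be sketched as a simple receive-analyze-return loop. The queue transport, function names, and placeholder analysis result below are all hypothetical; in the real deployment the results travel back to the calling system as DICOM SR/SC objects displayed in PACS or worklist software.

```python
from queue import Queue


def process_study(dicom_files):
    """Stand-in for the AI analysis step; the real algorithm is proprietary."""
    return {"findings": [], "case_suspicion_level": 0, "n_images": len(dicom_files)}


def handle_study(study_id, dicom_files, results_out: Queue):
    """Receive a routed study, analyze it, and return results to the caller.

    `results_out` models whatever transport carries the results back to the
    facility's PACS or worklist software.
    """
    results = process_study(dicom_files)
    results_out.put((study_id, results))


outbox = Queue()
handle_study("STUDY-001", ["rcc.dcm", "lcc.dcm", "rmlo.dcm", "lmlo.dcm"], outbox)
sid, res = outbox.get()
print(sid, res["n_images"])  # STUDY-001 4
```

The key design point is that the device sits beside, not inside, the reading workflow: if processing fails, the standard-of-care interpretation proceeds unchanged.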

The design of the current version of Saige-Dx is similar to that of the predicate device. Verification and Validation testing has been completed ensuring that the differences do not affect the safety and effectiveness of the proposed subject device.

7. PERFORMANCE DATA

The design and development of Saige-Dx followed these FDA-recognized standards and guidance documents:

  • ISO 14971:2019 – Medical Devices – Application of Risk Management to Medical Devices (#5-125)
  • IEC 62304:2015 – Medical Device Software – Software Life Cycle Processes (#13-79)
  • NEMA PS3 – Digital Imaging and Communications in Medicine (DICOM) Set (#12-300)
  • Guidance for Industry and FDA Staff: Guidance for the Content of Premarket Submissions for Software Contained in Medical Devices (May 2005)
  • Guidance for Industry and FDA Staff: Software as a Medical Device (SaMD): Clinical Evaluation (December 2017)

Saige-Dx is a software-only device. Verification testing included software unit testing, software integration testing, system testing, and regression testing. Testing confirmed that the software, as designed and implemented, satisfied the software requirements and has no unintentional differences from the predicate device.

Training Dataset

The Saige-Dx algorithm was trained on a robust and diverse dataset of mammography exams acquired from multiple vendors including GE and Hologic equipment. A total of ten datasets comprising 161,323 patients and 300,439 studies were collected from diverse practices, with the majority from geographically diverse areas within the United States, including New York and California. The training dataset included age-appropriate and racially, ethnically, and socio-economically diverse populations. Aligned with good machine learning practices, a validation data usage plan was implemented ensuring no exam overlap between the training and testing datasets.
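The "no exam overlap" property of a data usage plan like the one described is commonly enforced by splitting at the patient level, so that every exam from a given patient lands on exactly one side of the split. This is a generic sketch under that assumption, not DeepHealth's actual plan:

```python
import random


def patient_level_split(exams, test_fraction=0.2, seed=0):
    """Split (patient_id, study_uid) pairs by patient.

    Because assignment happens per patient, no patient (and hence no exam)
    can appear in both the training and testing sets. The policy here is
    illustrative only.
    """
    patients = sorted({pid for pid, _ in exams})
    rng = random.Random(seed)
    rng.shuffle(patients)
    n_test = max(1, int(len(patients) * test_fraction))
    test_patients = set(patients[:n_test])
    train = [e for e in exams if e[0] not in test_patients]
    test = [e for e in exams if e[0] in test_patients]
    return train, test


exams = [("p1", "s1"), ("p1", "s2"), ("p2", "s3"), ("p3", "s4"), ("p4", "s5")]
train, test = patient_level_split(exams)
# Patient sets on the two sides never intersect.
assert {p for p, _ in train}.isdisjoint({p for p, _ in test})
```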

Performance Testing

Validation of the software was conducted using retrospective, blinded, multicenter standalone performance testing under an IRB-approved protocol; additionally, a multi-reader multi-case (MRMC) study was previously conducted for the predicate device and remains applicable to the subject device. The standalone performance testing of the subject device included a total of 2,002 DBT screening mammograms acquired from unique women and consisted of data used in standalone clinical testing of a previously cleared version of Saige-Dx, as well as newly collected data. The case collection and ground truth lesion localization processes for the newly collected cases were the same processes used for the previously collected test dataset (details provided in K220105). Cases were collected from 12 diverse clinical sites in the United States, consisted of 259 cancer and 1,743 non-cancer cases, and included exams with Hologic (standard definition and high definition) and GE images. Table 1 shows the descriptive statistics of the dataset used in this evaluation.

All testing datasets were independent and did not overlap with any data used for model development, training, or internal bench testing. The primary endpoint was to demonstrate that the performance of the subject device was non-inferior to the performance of the predicate device. The test met the pre-specified performance criteria, and the results support the safety and effectiveness of Saige-Dx updated AI model on Hologic and GE exams. Additionally, subgroup analyses were performed as secondary assessments to demonstrate performance of the subject device across the intended use population and demonstrated similar standalone performance trends across breast densities, ages, race/ethnicities, and lesion types and sizes.
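A non-inferiority comparison of the kind described for the primary endpoint can be sketched as follows. The margin, the one-sided z-based normal approximation, the treatment of the two devices as independent samples, and the example numbers are all assumptions for illustration; the pre-specified criteria and statistical method actually used for Saige-Dx are not given in this summary.

```python
import math


def noninferior(p_subject, p_predicate, n_subject, n_predicate, margin, z=1.645):
    """Check non-inferiority of a sensitivity-like proportion.

    Non-inferiority holds when the one-sided 95% lower confidence bound on
    (subject - predicate) exceeds -margin. Uses a simple normal approximation
    with independent samples; a real paired design would use a paired method.
    """
    diff = p_subject - p_predicate
    se = math.sqrt(
        p_subject * (1 - p_subject) / n_subject
        + p_predicate * (1 - p_predicate) / n_predicate
    )
    lower = diff - z * se
    return lower > -margin, lower


# Illustrative numbers only: 259 cancer cases per arm, assumed 5% margin.
ok, lower = noninferior(0.90, 0.89, 259, 259, margin=0.05)
print(ok)  # True for these illustrative numbers
```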

Table 1. Descriptive statistics of cases included in standalone performance study (n=2002). Numbers in parentheses for Breast Density, Patient Race, and Patient Ethnicity are percentages. Numbers in parentheses for Patient Age are standard deviation (SD).


Page 9


8. CONCLUSION

Verification and Validation testing conducted to support this submission confirm that Saige-Dx is safe and effective. The differences between the subject and predicate device do not alter the intended use of the device and do not affect its safety and effectiveness when used as labeled. Therefore, the information presented in this 510(k) submission demonstrates that Saige-Dx is substantially equivalent to the predicate device.

Table 1 row labels (cell values were not preserved in the source extraction):

  • All Cases (%)
  • Breast Density: A, B, C, D
  • Patient Age: Min, Mean (SD), Max
  • Patient Ethnicity: Hispanic/Latino, Not Hispanic/Latino, Unknown
  • Patient Race: American Indian/Alaska Native, Asian, Black/African American, Multiple, Native Hawaiian/Other Pacific Islander, Other, Unknown, White


§ 892.2090 Radiological computer-assisted detection and diagnosis software.

(a) Identification. A radiological computer-assisted detection and diagnostic software is an image processing device intended to aid in the detection, localization, and characterization of fracture, lesions, or other disease-specific findings on acquired medical images (e.g., radiography, magnetic resonance, computed tomography). The device detects, identifies, and characterizes findings based on features or information extracted from images, and provides information about the presence, location, and characteristics of the findings to the user. The analysis is intended to inform the primary diagnostic and patient management decisions that are made by the clinical user. The device is not intended as a replacement for a complete clinician's review or their clinical judgment that takes into account other relevant information from the image or patient history.

(b) Classification. Class II (special controls). The special controls for this device are:

(1) Design verification and validation must include:

(i) A detailed description of the image analysis algorithm, including a description of the algorithm inputs and outputs, each major component or block, how the algorithm and output affects or relates to clinical practice or patient care, and any algorithm limitations.

(ii) A detailed description of pre-specified performance testing protocols and dataset(s) used to assess whether the device will provide improved assisted-read detection and diagnostic performance as intended in the indicated user population(s), and to characterize the standalone device performance for labeling. Performance testing includes standalone test(s), side-by-side comparison(s), and/or a reader study, as applicable.

(iii) Results from standalone performance testing used to characterize the independent performance of the device separate from aided user performance. The performance assessment must be based on appropriate diagnostic accuracy measures (e.g., receiver operator characteristic plot, sensitivity, specificity, positive and negative predictive values, and diagnostic likelihood ratio). Devices with localization output must include localization accuracy testing as a component of standalone testing. The test dataset must be representative of the typical patient population with enrichment made only to ensure that the test dataset contains a sufficient number of cases from important cohorts (e.g., subsets defined by clinically relevant confounders, effect modifiers, concomitant disease, and subsets defined by image acquisition characteristics) such that the performance estimates and confidence intervals of the device for these individual subsets can be characterized for the intended use population and imaging equipment.

(iv) Results from performance testing that demonstrate that the device provides improved assisted-read detection and/or diagnostic performance as intended in the indicated user population(s) when used in accordance with the instructions for use. The reader population must be comprised of the intended user population in terms of clinical training, certification, and years of experience. The performance assessment must be based on appropriate diagnostic accuracy measures (e.g., receiver operator characteristic plot, sensitivity, specificity, positive and negative predictive values, and diagnostic likelihood ratio). Test datasets must meet the requirements described in paragraph (b)(1)(iii) of this section.

(v) Appropriate software documentation, including device hazard analysis, software requirements specification document, software design specification document, traceability analysis, system level test protocol, pass/fail criteria, testing results, and cybersecurity measures.

(2) Labeling must include the following:

(i) A detailed description of the patient population for which the device is indicated for use.

(ii) A detailed description of the device instructions for use, including the intended reading protocol and how the user should interpret the device output.

(iii) A detailed description of the intended user, and any user training materials or programs that address appropriate reading protocols for the device, to ensure that the end user is fully aware of how to interpret and apply the device output.

(iv) A detailed description of the device inputs and outputs.

(v) A detailed description of compatible imaging hardware and imaging protocols.

(vi) Warnings, precautions, and limitations must include situations in which the device may fail or may not operate at its expected performance level (e.g., poor image quality or for certain subpopulations), as applicable.

(vii) A detailed summary of the performance testing, including test methods, dataset characteristics, results, and a summary of sub-analyses on case distributions stratified by relevant confounders, such as anatomical characteristics, patient demographics and medical history, user experience, and imaging equipment.
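The diagnostic accuracy measures named in paragraph (b)(1)(iii) can be computed from a 2x2 confusion matrix as follows. The counts used here are illustrative only, chosen to sum to the 259 cancer and 1,743 non-cancer case totals of the standalone test set; they are not actual Saige-Dx results.

```python
def diagnostic_measures(tp, fp, fn, tn):
    """Accuracy measures listed in 21 CFR 892.2090(b)(1)(iii)."""
    sensitivity = tp / (tp + fn)  # true positive rate
    specificity = tn / (tn + fp)  # true negative rate
    return {
        "sensitivity": sensitivity,
        "specificity": specificity,
        "ppv": tp / (tp + fp),    # positive predictive value
        "npv": tn / (tn + fn),    # negative predictive value
        # diagnostic likelihood ratios
        "lr_positive": sensitivity / (1 - specificity),
        "lr_negative": (1 - sensitivity) / specificity,
    }


# Invented outcomes over 259 cancer (tp + fn) and 1,743 non-cancer (fp + tn) cases.
m = diagnostic_measures(tp=230, fp=170, fn=29, tn=1573)
print(round(m["sensitivity"], 3), round(m["specificity"], 3))  # 0.888 0.902
```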