K Number
K232879
Device Name
Roche Digital Pathology Dx (VENTANA DP 200)
Date Cleared
2024-06-14

(270 days)

Product Code
PSY
Regulation Number
864.3700
AI/ML · SaMD · IVD (In Vitro Diagnostic) · Therapeutic · Diagnostic · PCCP Authorized
Intended Use
Roche Digital Pathology Dx (VENTANA DP 200) is an automated digital slide creation, viewing and management system. Roche Digital Pathology Dx (VENTANA DP 200) is intended for in vitro diagnostic use as an aid to the pathologist to review and interpret digital images of scanned pathology slides prepared from formalin-fixed paraffin-embedded (FFPE) tissue. Roche Digital Pathology Dx (VENTANA DP 200) is not intended for use with frozen section, cytology, or non-FFPE hematopathology specimens. Roche Digital Pathology Dx (VENTANA DP 200) is for creation and viewing of digital images of scanned glass slides that would otherwise be appropriate for manual visualization by conventional light microscopy.
Device Description
Roche Digital Pathology Dx (VENTANA DP 200), hereinafter referred to as Roche Digital Pathology Dx, is a whole slide imaging (WSI) system. It is an automated digital slide creation, viewing, and management system intended to aid pathologists in generating, reviewing, and interpreting digital images of surgical pathology slides that would otherwise be appropriate for manual visualization by conventional light microscopy. Roche Digital Pathology Dx system is composed of the following components:

  • VENTANA DP 200 slide scanner
  • Roche uPath enterprise software 1.1.1 (hereinafter, "uPath")
  • ASUS PA248QV display

VENTANA DP 200 slide scanner is a bright-field digital pathology scanner that accommodates loading and scanning of up to 6 standard slides. The scanner comprises a high-resolution 20x objective with the ability to scan at both 20x and 40x. With its uniquely designed optics and scanning methods, VENTANA DP 200 scanner enables users to capture sharp, high-resolution digital images of stained tissue specimens on glass slides. The scanner features automatic detection of the tissue specimen on the slide, automated 1D and 2D barcode reading, and selectable volume scanning (3 to 15 focus layers). It also integrates color profiling to ensure that images produced from scanned slides are generated with a color-managed International Color Consortium (ICC) profile. VENTANA DP 200 image files are generated in a proprietary format (BIF) and can be uploaded to an Image Management System (IMS), such as the one provided with Roche uPath enterprise software.

Roche uPath enterprise software (uPath), a component of Roche Digital Pathology system, is a web-based image management and workflow software application. uPath enterprise software can be accessed on a Windows workstation using Google Chrome or Microsoft Edge. The interface of uPath software enables laboratories to manage their workflow from the time the digital slide image is produced and acquired by a VENTANA slide scanner through the subsequent processes including, but not limited to, review of the digital image on the monitor screen, analysis, and reporting of results. The software incorporates specific functions for pathologists, laboratory histology staff, workflow coordinators, and laboratory administrators.
More Information

Not Found

AI/ML: No
The document describes a digital pathology system for creating, viewing, and managing digital slides. While it mentions image processing and automated features like tissue detection and barcode reading, there is no mention of AI, ML, deep learning, or any algorithms that would typically fall under these categories. The performance studies focus on agreement rates and precision compared to manual microscopy, not on the performance of any AI/ML-driven analysis.

Therapeutic: No.
This device is an automated digital slide creation, viewing, and management system intended for in vitro diagnostic use, aiding pathologists in reviewing and interpreting digital images of pathology slides. It does not directly provide therapy or treatment.

Diagnostic: Yes

Explanation: The "Intended Use / Indications for Use" section explicitly states, "Roche Digital Pathology Dx (VENTANA DP 200) is intended for in vitro diagnostic use as an aid to the pathologist to review and interpret digital images of scanned pathology slides..." This clearly indicates its role in the diagnostic process.

SaMD (Software as a Medical Device): No

The device description explicitly states that the system is composed of a slide scanner, software, and a display, indicating it includes hardware components in addition to the software.

Yes, this device is an IVD (In Vitro Diagnostic).

Here's why:

  • Explicit Statement: The "Intended Use / Indications for Use" section explicitly states: "Roche Digital Pathology Dx (VENTANA DP 200) is intended for in vitro diagnostic use as an aid to the pathologist to review and interpret digital images of scanned pathology slides..."
  • Purpose: The device is intended to be used in a laboratory setting to analyze biological specimens (FFPE tissue on slides) to aid in a medical diagnosis by a pathologist. This aligns directly with the definition of an in vitro diagnostic device.
  • Components: The system includes a scanner, software for image management and review, and a display, all designed to facilitate the diagnostic process using in vitro samples.
  • Clinical Studies: The provided information includes details of clinical accuracy and precision studies, which are typical requirements for demonstrating the performance and safety of IVD devices.
  • Predicate Device: The mention of a predicate device (K190332; Aperio AT2 DX System) further indicates that this device is being compared to other legally marketed IVD devices.

N/A

Intended Use / Indications for Use

Roche Digital Pathology Dx (VENTANA DP 200) is an automated digital slide creation, viewing and management system. Roche Digital Pathology Dx (VENTANA DP 200) is intended for in vitro diagnostic use as an aid to the pathologist to review and interpret digital images of scanned pathology slides prepared from formalin-fixed paraffin-embedded (FFPE) tissue. Roche Digital Pathology Dx (VENTANA DP 200) is not intended for use with frozen section, cytology, or non-FFPE hematopathology specimens. Roche Digital Pathology Dx (VENTANA DP 200) is for creation and viewing of digital images of scanned glass slides that would otherwise be appropriate for manual visualization by conventional light microscopy.

Roche Digital Pathology Dx (VENTANA DP 200) is composed of VENTANA DP 200 slide scanner, Roche uPath enterprise software, and the ASUS PA248QV display. It is the responsibility of a qualified pathologist to employ appropriate procedures and safeguards to assure the validity of the interpretation of images obtained using Roche Digital Pathology Dx (VENTANA DP 200).

Product codes (comma separated list FDA assigned to the subject device)

PSY

Device Description

Roche Digital Pathology Dx (VENTANA DP 200), hereinafter referred to as Roche Digital Pathology Dx, is a whole slide imaging (WSI) system. It is an automated digital slide creation, viewing, and management system intended to aid pathologists in generating, reviewing, and interpreting digital images of surgical pathology slides that would otherwise be appropriate for manual visualization by conventional light microscopy. Roche Digital Pathology Dx system is composed of the following components:

  • VENTANA DP 200 slide scanner
  • Roche uPath enterprise software 1.1.1 (hereinafter, "uPath")
  • ASUS PA248QV display

VENTANA DP 200 slide scanner is a bright-field digital pathology scanner that accommodates loading and scanning of up to 6 standard slides. The scanner comprises a high-resolution 20x objective with the ability to scan at both 20x and 40x. With its uniquely designed optics and scanning methods, VENTANA DP 200 scanner enables users to capture sharp, high-resolution digital images of stained tissue specimens on glass slides. The scanner features automatic detection of the tissue specimen on the slide, automated 1D and 2D barcode reading, and selectable volume scanning (3 to 15 focus layers). It also integrates color profiling to ensure that images produced from scanned slides are generated with a color-managed International Color Consortium (ICC) profile. VENTANA DP 200 image files are generated in a proprietary format (BIF) and can be uploaded to an Image Management System (IMS), such as the one provided with Roche uPath enterprise software.

Roche uPath enterprise software (uPath), a component of Roche Digital Pathology system, is a web-based image management and workflow software application. uPath enterprise software can be accessed on a Windows workstation using Google Chrome or Microsoft Edge. The interface of uPath software enables laboratories to manage their workflow from the time the digital slide image is produced and acquired by a VENTANA slide scanner through the subsequent processes including, but not limited to, review of the digital image on the monitor screen, analysis, and reporting of results. The software incorporates specific functions for pathologists, laboratory histology staff, workflow coordinators, and laboratory administrators.

Mentions image processing

Yes

Mentions AI, DNN, or ML

Not Found

Input Imaging Modality

Bright-field digital pathology system; whole slide imaging (WSI); digital images of scanned pathology slides

Anatomical Site

Formalin-fixed paraffin-embedded (FFPE) tissue of multiple organ and tissue types. Specific organ and tissue types mentioned include: Anus/Perianal, Appendix, Bladder, Brain/Neurological, Breast, Colorectal, Endocrine, GE Junction, Gallbladder, Gynecological, Hernial/Peritoneal, Kidney (Neoplastic), Liver/Bile duct (Neoplastic), Lung/Bronchus/Larynx/Oral Cavity/Nasopharynx, Lymph Node, Prostate, Salivary Gland, Skin, Soft Tissue Tumors, Stomach, Bone, Omentum, Esophagus, Thyroid.

Indicated Patient Age Range

Not Found

Intended User / Care Setting

Pathologist, anatomic pathology lab technicians, laboratory histology staff, workflow coordinators, and laboratory administrators.
Intended for in vitro diagnostic use.

Description of the training set, sample size, data source, and annotation protocol

Not Found

Description of the test set, sample size, data source, and annotation protocol

Clinical Accuracy Study (Primary Endpoint)

  • Sample Size: Across 4 sites, a total of 2047 cases (3259 slides) consisting of multiple organ and tissue types were enrolled. 16 Reading Pathologists read all cases enrolled at their site using both MR and DR modalities, resulting in an expected total of 8188 DR diagnoses and 8188 MR diagnoses (16376 total diagnoses). For statistical analyses, 7562 DR diagnoses paired with 7562 MR diagnoses, adjudicated by an adjudication panel, were included.
  • Data Source: Cases pre-screened by two Screening Pathologists at each study site from their clinical database of archived formalin-fixed paraffin-embedded (FFPE) tissue specimens. Cases were identified sequentially with a minimum of one year between the date of sign-out diagnosis and beginning of the study.
  • Annotation Protocol: For each case, the sign-out diagnosis rendered at the study sites using an optical (light) microscope served as the reference diagnosis. Up to 3 Adjudication Pathologists reviewed the Reading Pathologists' diagnoses (MR and DR) against the corresponding sign-out diagnoses. They determined agreement, minor disagreement, or major disagreement. Adjudication was performed blinded to site, Reading Pathologist information, and reading modality. In case of disagreement between 2 adjudicators, a third reviewed. For cases where all 3 adjudicators differed, a consensus was reached in an adjudication panel meeting.
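The adjudication flow described above (two adjudicators, a third on disagreement, panel consensus when all three differ) can be sketched as a small decision function. This is a hypothetical illustration of the escalation logic only, not part of the device or study software:

```python
def adjudicate(votes):
    """Resolve an adjudicated category from up to 3 adjudicator votes.

    votes: categories in review order, drawn from
    {"agreement", "minor_disagreement", "major_disagreement"}.
    Returns the resolved category, or None when escalation is needed
    (a third adjudicator, or a panel meeting if all three differ).
    """
    first, second = votes[0], votes[1]
    if first == second:           # the two initial adjudicators agree
        return first
    if len(votes) < 3:            # disagreement: a third adjudicator reviews
        return None
    third = votes[2]
    if third in (first, second):  # third adjudicator breaks the tie
        return third
    return None                   # all three differ: panel consensus required
```

For example, `adjudicate(["agreement", "minor_disagreement", "minor_disagreement"])` resolves to `"minor_disagreement"`, while three distinct votes escalate to the panel meeting.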

Precision Study (Analytical Performance)

  • Sample Size: 69 total study cases (slides) for 23 primary feature types (3 unique study cases per feature type). Additionally, 12 "wild card" slides were included to reduce recall bias but excluded from statistical analyses. This resulted in 207 "study" ROIs and 36 wild card ROIs.
  • Data Source: Hematoxylin and eosin (H&E)-stained archival slides containing sections of a variety of FFPE human tissue and organ types.
  • Annotation Protocol: The study involved multiple Reading Pathologists (2 readers at each of 3 external pathology laboratories/study sites) independently identifying specific histological primary features in multiple Regions of Interest (ROIs) pre-selected on WSI scans generated by multiple VENTANA DP 200 scanners across multiple scanning days. The readers evaluated designated ROI images using a checklist of 23 protocol-specified primary features and their designated magnification levels. Readers were provided with scanning magnification level and organ system/tissue type for the ROI image and could vary viewing magnification, but were blinded to case ID, patient clinical information, sign-out diagnoses, and all previous screening or study results. Each primary feature assessment was compared to the reference primary feature for that case, and agreement was evaluated.

Summary of Performance Studies (study type, sample size, AUC, MRMC, standalone performance, key results)

1. Clinical Accuracy Study

  • Study Type: Multi-center non-inferiority study comparing digital WSI review (DR) to manual microscopy slide review (MR).
  • Sample Size: 7562 DR diagnoses paired with 7562 MR diagnoses included in statistical analyses.
  • Standalone Performance:
    • Observed overall agreement rate (with reference sign-out diagnosis): DR = 92.00%, MR = 92.61%.
    • Model estimated agreement rate: DR = 91.54%, MR = 92.16%.
  • Key Results:
    • DR-MR difference in observed agreement rate: -0.61% (95% CI: -1.59%, 0.35%).
    • DR-MR difference in model estimated agreement rate: -0.62% (95% CI: -1.50%, 0.26%).
    • The lower limit of the 95% confidence interval of DR-MR was greater than the pre-specified non-inferiority margin of -4%, demonstrating non-inferiority of DR modality using Roche Digital Pathology Dx to MR modality using light microscopy.
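The non-inferiority conclusion follows from comparing the reported confidence-interval lower bounds against the pre-specified margin; a minimal sketch using only the figures reported above:

```python
# Sanity check of the reported non-inferiority conclusion. All figures
# are copied from the summary above; -4% is the pre-specified margin.
NI_MARGIN = -4.0  # percentage points

observed = {"diff": -0.61, "ci": (-1.59, 0.35)}
model    = {"diff": -0.62, "ci": (-1.50, 0.26)}

for name, res in (("observed", observed), ("model estimated", model)):
    lower, upper = res["ci"]
    # Non-inferiority holds when the CI lower bound exceeds the margin.
    print(f"{name}: DR-MR = {res['diff']}% "
          f"(95% CI {lower}%, {upper}%); non-inferior: {lower > NI_MARGIN}")
```

Both lower bounds (-1.59% and -1.50%) sit well above the -4% margin, which is the basis for the non-inferiority claim.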

2. Precision Study (Analytical Performance)

  • Study Type: Reproducibility/Precision study evaluating between-site, between-day, between-reader, and within-reader precision.
  • Sample Size: 207 study ROIs (69 study cases x 3 ROIs/study case) evaluated by 6 readers across 3 sites and multiple reading sessions.
  • Key Results (Overall Percent Agreement - OPA):
    • Between-Site/System: OPA = 89.3% (19510/21839), 95% CI: (85.8, 92.4).
    • Between-Days/Within-System: OPA = 90.3% (3302/3656), 95% CI: (87.1, 93.2).
    • Between-Readers: OPA = 90.1% (1650/1832), 95% CI: (86.6, 93.0).
    • Within-Reader: OPA = 88.1% (1078/1223), 95% CI: (84.8, 91.3).
  • Overall Conclusion: For each of the co-primary analyses, the lower bound of the 95% CI for the 3 OPA point estimates was >85%, demonstrating that Roche Digital Pathology Dx has acceptable precision.
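The OPA point estimates can be re-derived from the reported agreement counts. Note that the published 95% CIs account for the correlated study design (repeated reads of the same ROIs) and are not reproduced by a simple binomial formula, so only the point estimates are recomputed in this sketch:

```python
# Re-derive the OPA point estimates from the reported agreement counts.
# (agreeing assessments, total assessments) per analysis, copied from
# the summary above.
opa_counts = {
    "Between-Site/System":        (19510, 21839),
    "Between-Days/Within-System": (3302, 3656),
    "Between-Readers":            (1650, 1832),
    "Within-Reader":              (1078, 1223),
}

for analysis, (agree, total) in opa_counts.items():
    opa = 100.0 * agree / total
    print(f"{analysis}: OPA = {opa:.1f}% ({agree}/{total})")
```

Each ratio reproduces the reported percentage (89.3%, 90.3%, 90.1%, and 88.1%, respectively).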

Key Metrics (Sensitivity, Specificity, PPV, NPV, etc.)

Clinical Accuracy Study (Agreement with Sign-Out Diagnosis Rates)

  • Overall Observed Agreement Rate:
    • DR: 92.00%
    • MR: 92.61%
  • Overall Model Estimated Agreement Rate:
    • DR: 91.54%
    • MR: 92.16%
  • Difference in Agreement (DR - MR):
    • Observed: -0.61% (95% CI: -1.59%, 0.35%)
    • Model: -0.62% (95% CI: -1.50%, 0.26%)
  • Overall Major Discrepancy Rates:
    • Observed: DR 8.00% (95% CI: 6.73, 9.27), MR 7.39% (95% CI: 6.11, 8.78)
    • Model: DR 8.46% (95% CI: 7.35, 9.71), MR 7.84% (95% CI: 6.80, 9.12)
    • Difference (DR - MR): Observed 0.61% (95% CI: -0.35, 1.59), Model 0.62% (95% CI: -0.26, 1.50)
  • Between-reader agreement rate (across all sites):
    • MR: 91.4% (95% CI: 90.8, 91.9)
    • DR: 90.6% (95% CI: 90.0, 91.1)

Precision Study (Overall Percent Agreement - OPA)

  • Between-Site/System: 89.3% (95% CI: 85.8, 92.4)
  • Between-Days/Within-System: 90.3% (95% CI: 87.1, 93.2)
  • Between-Readers: 90.1% (95% CI: 86.6, 93.0)
  • Within-Reader: 88.1% (95% CI: 84.8, 91.3)

Predicate Device(s): If the device was cleared using the 510(k) pathway, identify the Predicate Device(s) K/DEN number used to claim substantial equivalence and list them here in a comma separated list exactly as they appear in the text. List the primary predicate first in the list.

K190332

Reference Device(s): Identify the Reference Device(s) K/DEN number and list them here in a comma separated list exactly as they appear in the text.

Not Found

Predetermined Change Control Plan (PCCP) - All Relevant Information for the subject device only (e.g. presence / absence, what scope was granted / cleared under the PCCP, any restrictions, etc).

Not Found

§ 864.3700 Whole slide imaging system.

(a)
Identification. The whole slide imaging system is an automated digital slide creation, viewing, and management system intended as an aid to the pathologist to review and interpret digital images of surgical pathology slides. The system generates digital images that would otherwise be appropriate for manual visualization by conventional light microscopy.
(b)
Classification. Class II (special controls). The special controls for this device are:
(1) Premarket notification submissions must include the following information:
(i) The indications for use must specify the tissue specimen that is intended to be used with the whole slide imaging system and the components of the system.
(ii) A detailed description of the device and bench testing results at the component level, including for the following, as appropriate:
(A) Slide feeder;
(B) Light source;
(C) Imaging optics;
(D) Mechanical scanner movement;
(E) Digital imaging sensor;
(F) Image processing software;
(G) Image composition techniques;
(H) Image file formats;
(I) Image review manipulation software;
(J) Computer environment; and
(K) Display system.
(iii) Detailed bench testing and results at the system level, including for the following, as appropriate:
(A) Color reproducibility;
(B) Spatial resolution;
(C) Focusing test;
(D) Whole slide tissue coverage;
(E) Stitching error; and
(F) Turnaround time.
(iv) Detailed information demonstrating the performance characteristics of the device, including, as appropriate:
(A) Precision to evaluate intra-system and inter-system precision using a comprehensive set of clinical specimens with defined, clinically relevant histologic features from various organ systems and diseases. Multiple whole slide imaging systems, multiple sites, and multiple readers must be included.
(B) Reproducibility data to evaluate inter-site variability using a comprehensive set of clinical specimens with defined, clinically relevant histologic features from various organ systems and diseases. Multiple whole slide imaging systems, multiple sites, and multiple readers must be included.
(C) Data from a clinical study to demonstrate that viewing, reviewing, and diagnosing digital images of surgical pathology slides prepared from tissue slides using the whole slide imaging system is non-inferior to using an optical microscope. The study should evaluate the difference in major discordance rates between manual digital (MD) and manual optical (MO) modalities when compared to the reference (e.g., main sign-out diagnosis).
(D) A detailed human factor engineering process must be used to evaluate the whole slide imaging system user interface(s).
(2) Labeling compliant with 21 CFR 809.10(b) must include the following:
(i) The intended use statement must include the information described in paragraph (b)(1)(i) of this section, as applicable, and a statement that reads, “It is the responsibility of a qualified pathologist to employ appropriate procedures and safeguards to assure the validity of the interpretation of images obtained using this device.”
(ii) A description of the technical studies and the summary of results, including those that relate to paragraphs (b)(1)(ii) and (iii) of this section, as appropriate.
(iii) A description of the performance studies and the summary of results, including those that relate to paragraph (b)(1)(iv) of this section, as appropriate.
(iv) A limiting statement that specifies that pathologists should exercise professional judgment in each clinical situation and examine the glass slides by conventional microscopy if there is doubt about the ability to accurately render an interpretation using this device alone.



Ventana Medical Systems, Inc.
Cameron Smith
Regulatory Affairs Manager (Ventana, VMSI, Roche Tissue Diagnostics, RTD)
1910 E. Innovation Park Drive
Tucson, Arizona 85755

Re: K232879

Trade/Device Name: Roche Digital Pathology Dx (VENTANA DP 200)
Regulation Number: 21 CFR 864.3700
Regulation Name: Whole slide imaging system
Regulatory Class: Class II
Product Code: PSY
Dated: September 15, 2023
Received: September 18, 2023

Dear Cameron Smith:

We have reviewed your section 510(k) premarket notification of intent to market the device referenced above and have determined the device is substantially equivalent (for the indications for use stated in the enclosure) to legally marketed predicate devices marketed in interstate commerce prior to May 28, 1976, the enactment date of the Medical Device Amendments, or to devices that have been reclassified in accordance with the provisions of the Federal Food, Drug, and Cosmetic Act (the Act) that do not require approval of a premarket approval application (PMA). You may, therefore, market the device, subject to the general controls provisions of the Act.

Although this letter refers to your product as a device, please be aware that some cleared products may instead be combination products. The 510(k) Premarket Notification Database available at https://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfpmn/pmn.cfm identifies combination product submissions.

The general controls provisions of the Act include requirements for annual registration, listing of devices, good manufacturing practice, labeling, and prohibitions against misbranding and adulteration. Please note: CDRH does not evaluate information related to contract liability warranties. We remind you, however, that device labeling must be truthful and not misleading.

June 14, 2024

If your device is classified (see above) into either class II (Special Controls) or class III (PMA), it may be subject to additional controls. Existing major regulations affecting your device can be found in the Code of Federal Regulations, Title 21, Parts 800 to 898. In addition, FDA may publish further announcements concerning your device in the Federal Register.

Additional information about changes that may require a new premarket notification is provided in the FDA guidance documents entitled "Deciding When to Submit a 510(k) for a Change to an Existing Device" (https://www.fda.gov/media/99812/download) and "Deciding When to Submit a 510(k) for a Software Change to an Existing Device" (https://www.fda.gov/media/99785/download).

Your device is also subject to, among other requirements, the Quality System (QS) regulation (21 CFR Part 820), which includes, but is not limited to, 21 CFR 820.30, Design controls; 21 CFR 820.90, Nonconforming product; and 21 CFR 820.100, Corrective and preventive action. Please note that regardless of whether a change requires premarket review, the QS regulation requires device manufacturers to review and approve changes to device design and production (21 CFR 820.30 and 21 CFR 820.70) and document changes and approvals in the device master record (21 CFR 820.181).

Please be advised that FDA's issuance of a substantial equivalence determination does not mean that FDA has made a determination that your device complies with other requirements of the Act or any Federal statutes and regulations administered by other Federal agencies. You must comply with all the Act's requirements, including, but not limited to: registration and listing (21 CFR Part 807); labeling (21 CFR Part 801 and Part 809); medical device reporting of medical device-related adverse events (21 CFR Part 803) for devices or postmarketing safety reporting (21 CFR Part 4, Subpart B) for combination products (see https://www.fda.gov/combination-products/guidance-regulatory-information/postmarketing-safety-reporting-combination-products); good manufacturing practice requirements as set forth in the quality systems (QS) regulation (21 CFR Part 820) for devices or current good manufacturing practices (21 CFR Part 4, Subpart A) for combination products; and, if applicable, the electronic product radiation control provisions (Sections 531-542 of the Act); 21 CFR Parts 1000-1050.

Also, please note the regulation entitled, "Misbranding by reference to premarket notification" (21 CFR 807.97). For questions regarding the reporting of adverse events under the MDR regulation (21 CFR Part 803), please go to https://www.fda.gov/medical-device-safety/medical-device-reporting-mdr-how-report-medical-device-problems.

For comprehensive regulatory information about medical devices and radiation-emitting products, including information about labeling regulations, please see Device Advice (https://www.fda.gov/medical-devices/device-advice-comprehensive-regulatory-assistance) and CDRH Learn (https://www.fda.gov/training-and-continuing-education/cdrh-learn). Additionally, you may contact the Division of Industry and Consumer Education (DICE) to ask a question about a specific regulatory topic. See the DICE website (https://www.fda.gov/medical-device-advice-comprehensive-regulatory-assistance/contact-us-division-industry-and-consumer-education-dice) for more information or contact DICE by email (DICE@fda.hhs.gov) or phone (1-800-638-2041 or 301-796-7100).

Sincerely,

Shyam Kalavar -S

Shyam Kalavar
Deputy Branch Chief
Division of Molecular Genetics and Pathology 2
OHT7: Office of In Vitro Diagnostics
Office of Product Evaluation and Quality
Center for Devices and Radiological Health


Indications for Use

510(k) Number (if known) K232879

Device Name Roche Digital Pathology Dx (VENTANA DP 200)

Indications for Use (Describe)

Roche Digital Pathology Dx (VENTANA DP 200) is an automated digital slide creation, viewing and management system. Roche Digital Pathology Dx (VENTANA DP 200) is intended for in vitro diagnostic use as an aid to the pathologist to review and interpret digital images of scanned pathology slides prepared from formalin-fixed paraffin-embedded (FFPE) tissue. Roche Digital Pathology Dx (VENTANA DP 200) is not intended for use with frozen section, cytology, or non-FFPE hematopathology specimens. Roche Digital Pathology Dx (VENTANA DP 200) is for creation and viewing of digital images of scanned glass slides that would otherwise be appropriate for manual visualization by conventional light microscopy.

Roche Digital Pathology Dx (VENTANA DP 200) is composed of VENTANA DP 200 slide scanner, Roche uPath enterprise software, and the ASUS PA248QV display. It is the responsibility of a qualified pathologist to employ appropriate procedures and safeguards to assure the validity of the interpretation of images obtained using Roche Digital Pathology Dx (VENTANA DP 200).

Type of Use (Select one or both, as applicable)
X Prescription Use (Part 21 CFR 801 Subpart D)
  Over-The-Counter Use (21 CFR 801 Subpart C)

CONTINUE ON A SEPARATE PAGE IF NEEDED.

This section applies only to requirements of the Paperwork Reduction Act of 1995.

DO NOT SEND YOUR COMPLETED FORM TO THE PRA STAFF EMAIL ADDRESS BELOW.

The burden time for this collection of information is estimated to average 79 hours per response, including the time to review instructions, search existing data sources, gather and maintain the data needed and complete and review the collection of information. Send comments regarding this burden estimate or any other aspect of this information collection, including suggestions for reducing this burden, to:

Department of Health and Human Services Food and Drug Administration Office of Chief Information Officer Paperwork Reduction Act (PRA) Staff PRAStaff@fda.hhs.gov

"An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB number."


510(k) Summary Roche Digital Pathology Dx (VENTANA DP 200)

Date Prepared: June 14, 2024

Submitter

Ventana Medical Systems, Inc.

Contact Person

Ventana Medical Systems, Inc. Cameron Smith Regulatory Affairs Manager (Ventana, VMSI, Roche Tissue Diagnostics, RTD) 1910 E. Innovation Park Drive Tucson, Arizona 85755

Device Information

Subject Device

Proprietary Name: Roche Digital Pathology Dx (VENTANA DP 200)
Common Name: VENTANA DP 200
Classification Name: Whole Slide Imaging System
Regulation Section: 21 CFR 864.3700
Regulatory Classification: Class II
Product Code: PSY
Review Panel: 88 – Pathology
510(k) Number: K232879

Predicate Device

Proprietary Name: Aperio AT2 DX System
Submission Number: K190332


I. Intended Use

Roche Digital Pathology Dx (VENTANA DP 200) is an automated digital slide creation, viewing and management system. Roche Digital Pathology Dx (VENTANA DP 200) is intended for in vitro diagnostic use as an aid to the pathologist to review and interpret digital images of scanned pathology slides prepared from formalin-fixed paraffin-embedded (FFPE) tissue. Roche Digital Pathology Dx (VENTANA DP 200) is not intended for use with frozen section, cytology, or non-FFPE hematopathology specimens. Roche Digital Pathology Dx (VENTANA DP 200) is for creation and viewing of digital images of scanned glass slides that would otherwise be appropriate for manual visualization by conventional light microscopy.

Roche Digital Pathology Dx (VENTANA DP 200) is composed of VENTANA DP 200 slide scanner, Roche uPath enterprise software, and ASUS PA248QV display. It is the responsibility of a qualified pathologist to employ appropriate procedures and safeguards to assure the validity of the interpretation of images obtained using Roche Digital Pathology Dx (VENTANA DP 200).

II. Device Description

Roche Digital Pathology Dx (VENTANA DP 200), hereinafter referred to as Roche Digital Pathology Dx, is a whole slide imaging (WSI) system. It is an automated digital slide creation, viewing, and management system intended to aid pathologists in generating, reviewing, and interpreting digital images of surgical pathology slides that would otherwise be appropriate for manual visualization by conventional light microscopy. Roche Digital Pathology Dx system is composed of the following components:

  • VENTANA DP 200 slide scanner
  • Roche uPath enterprise software 1.1.1 (hereinafter, "uPath")
  • ASUS PA248QV display

VENTANA DP 200 slide scanner is a bright-field digital pathology scanner that accommodates loading and scanning of up to 6 standard slides. The scanner comprises a high-resolution 20x objective with the ability to scan at both 20x and 40x. With its uniquely designed optics and scanning methods, VENTANA DP 200 scanner enables users to capture


sharp, high-resolution digital images of stained tissue specimens on glass slides. The scanner features automatic detection of the tissue specimen on the slide, automated 1D and 2D barcode reading, and selectable volume scanning (3 to 15 focus layers). It also integrates color profiling to ensure that images produced from scanned slides are generated with a color-managed International Color Consortium (ICC) profile. VENTANA DP 200 image files are generated in a proprietary format (BIF) and can be uploaded to an Image Management System (IMS), such as the one provided with Roche uPath enterprise software.

Roche uPath enterprise software (uPath), a component of Roche Digital Pathology system, is a web-based image management and workflow software application. uPath enterprise software can be accessed on a Windows workstation using Google Chrome or Microsoft Edge. The interface of uPath software enables laboratories to manage their workflow from the time the digital slide image is produced and acquired by a VENTANA slide scanner through the subsequent processes including, but not limited to, review of the digital image on the monitor screen, analysis, and reporting of results. The software incorporates specific functions for pathologists, laboratory histology staff, workflow coordinators, and laboratory administrators.

III. Comparison of technological characteristics with the predicate device

The candidate device, Roche Digital Pathology Dx, is substantially equivalent to the predicate device, the Leica Aperio AT2 DX, which was cleared on February 13, 2019 under K190332. The table below provides a comparison between the two devices:

Table 1: Similarities between the Subject Device and the Predicate Device

| Item | Subject Device | Predicate Device |
|------|----------------|------------------|
| Product Name | Roche Digital Pathology Dx (VENTANA DP 200) | Aperio AT2 DX System |
| 510(k) No. | K232879 | K190332 |
| Manufacturer | Ventana Medical Systems, Inc. | Leica Biosystems Imaging, Inc. |
| Intended Use | Roche Digital Pathology Dx (VENTANA DP 200) is an automated digital slide creation, viewing and management system. Roche Digital Pathology Dx (VENTANA DP 200) is intended for in vitro diagnostic use as an aid to the pathologist to review and interpret digital images of scanned pathology slides prepared from formalin-fixed paraffin-embedded (FFPE) tissue. Roche Digital Pathology Dx (VENTANA DP 200) is not intended for use with frozen section, cytology, or non-FFPE hematopathology specimens. Roche Digital Pathology Dx (VENTANA DP 200) is for creation and viewing of digital images of scanned glass slides that would otherwise be appropriate for manual visualization by conventional light microscopy. Roche Digital Pathology Dx (VENTANA DP 200) is composed of VENTANA DP 200 slide scanner, Roche uPath enterprise software, and ASUS PA248QV display. It is the responsibility of a qualified pathologist to employ appropriate procedures and safeguards to assure the validity of the interpretation of images obtained using Roche Digital Pathology Dx (VENTANA DP 200). | The Aperio AT2 DX System is an automated digital slide creation and viewing system. The Aperio AT2 DX System is intended for in vitro diagnostic use as an aid to the pathologist to review and interpret digital images of surgical pathology slides prepared from formalin-fixed paraffin embedded (FFPE) tissue. The Aperio AT2 DX System is not intended for use with frozen section, cytology, or non-FFPE hematopathology specimens. The Aperio AT2 DX System is composed of the Aperio AT2 DX scanner, the ImageScope DX review application and Display. The Aperio AT2 DX System is for creation and viewing of digital images of scanned glass slides that would otherwise be appropriate for manual visualization by conventional light microscopy. It is the responsibility of a qualified pathologist to employ appropriate procedures and safeguards to assure the validity of the interpretation of images obtained using the Aperio AT2 DX System. |
| Classification Regulation | 21 CFR 864.3700 | 21 CFR 864.3700 |
| Product Code | PSY - Whole Slide Imaging System | PSY - Whole Slide Imaging System |
| Classification Panel | (88) Pathology | (88) Pathology |
| Principle of Operation | After conducting Quality Control (QC) on the glass slides per laboratory standards (e.g., staining, coverslipping, barcode placement, etc.), the technician loads the slides into VENTANA DP 200 slide scanner. The scanner scans the slides and generates a whole slide image for each slide. The technician performs QC on scanned WSI images by checking image data and image quality. When QC is failed, the slide will be re-scanned. The acquired WSI images are stored in an end user provided image storage attached to the local network. During review, the pathologist opens WSI images from the image storage in Roche uPath enterprise software, performs further QC to ensure image quality, and reads the WSI images of the slides to make a diagnosis. | After conducting Quality Control (QC) on the glass slides per laboratory standards (e.g., staining, coverslipping, barcode placement, etc.), the technician loads the slides into the Aperio AT2 DX scanner. The scanner scans the slides and generates WSI images for each slide. The technician performs QC on scanned WSI images by checking image data and image quality. When QC is failed, the slide will be re-scanned. The acquired WSI images are stored in an end user provided image storage attached to the local network. During review, the pathologist opens WSI images acquired with the Aperio AT2 DX scanner from the image storage, performs further QC to ensure image quality, and reads WSI images of the slides to make a diagnosis. |
| Device Components | WSI scanner (VENTANA DP 200 slide scanner), Image Management System (Roche uPath enterprise software), and color monitor display | WSI scanner (Aperio AT2 DX scanner), Image Management System (ImageScope DX application), and color monitor display |
Table 2: Differences between the Subject Device and the Predicate Device

| Item | Subject Device | Predicate Device |
|------|----------------|------------------|
| Product Name | Roche Digital Pathology Dx (VENTANA DP 200) | Aperio AT2 DX System |
| 510(k) No. | K232879 | K190332 |
| Whole Slide Imaging Scanner | VENTANA DP 200; 6 slides | Aperio AT2 DX; 400 slides |
| Review Software | Roche uPath | ImageScope DX |
| Monitor Display | ASUS PA248QV | Dell MR2416 |

There are no differences between the candidate and predicate devices that impact safety or effectiveness or that raise new questions of safety or effectiveness; all candidate device components have been qualified and validated for use as a whole slide imaging system.

IV. PERFORMANCE DATA

1. Technical Studies

Multiple studies were conducted to evaluate the performance assessment data associated with the technical evaluation of Roche Digital Pathology Dx.

A. Slide Feeder

Information was provided on the configuration of the slide feed mechanism, including a physical description of the slide, the number of slides in queue (carrier), and the class of automation. Information was provided on the user interaction with the slide feeder, including hardware, software, feedback mechanisms, and Failure Mode and Effects Analysis (FMEA).

B. Light Source

Descriptive information associated with the lamp and the condenser was provided. Testing information was provided to verify the spectral distribution of the light source as part of the color reproduction capability of VENTANA DP 200 scanner.

C. Imaging Optics

An optical schematic with all optical elements identified from slide (object plane) to digital image sensor (image plane) was provided. Descriptive information regarding the microscope objective, the auxiliary lenses, and the magnification of imaging optics was provided. Testing information regarding the relative irradiance, optical distortions, and lateral chromatic aberrations was provided.

D. Focus System

Schematic diagrams and a description of operation of the focus system, which includes the optical system, 2 imaging cameras, and a focus tracking algorithm, was provided.

E. Mechanical Scanner Movement

Information and specifications on the configuration of the stage, the method and control of stage movement, and FMEA were provided. Test data were provided to verify that the repeatability of the stage movement mechanism stays within limits during operation.

F. Digital Imaging Sensor

Information and specifications on the sensor type, pixel information, responsivity specifications, noise specifications, readout rate, and digital output format were provided. Test data were provided to verify correct functioning of the digital image sensor, which converts the optical image of the slide into digital signals consisting of numerical values corresponding to the brightness and color at each point in the optical image.

G. Image Processing Software

Information and specifications on exposure control, white balance, color correction, subsampling, pixel-offset correction, pixel-gain or flat-field correction, and pixel-defect correction were provided.

H. Image Composition

Information and specifications on the scanning method, the scanning speed, and the number of planes at the Z-axis to be digitized were provided. Test data to analyze the image composition performance was provided.

I. Image File Format

Information and specifications on the compression method, compression ratio, file format, and file organization were provided.

J. Image Review Manipulation Software

Information and specifications on continuous panning and pre-fetching, continuous zooming, discrete Z-axis displacement, the ability to compare multiple slides simultaneously on multiple windows, image enhancement and sharpening functions, color manipulation, annotation tools, and digital bookmarks were provided.

K. Computer Environment

Information and specifications on the computer hardware, operating system, graphics card, graphics card driver, color management settings, color profile, and display interface were provided.

L. Display

Information and specifications on the technological characteristics of the display, such as pixel density, aspect ratio, display viewing area, display surface, backlight type, panel type, viewing angle, pixel pitch, resolution, color space, maximum brightness, contrast ratio, response time, maximum refresh rate, color accuracy, color adjustment, gamma adjustment, other adjustments, display interface, and certification, were provided. Test data were provided to verify the performance of the display for user controls, spatial resolution, pixel defects (count and map), artifacts, temporal response, maximum and minimum luminance (achievable and recommended), grayscale, luminance uniformity, bidirectional reflection distribution function, gray tracking, color scale, and color gamut volume.

M. Color Reproducibility

Test data to evaluate the color reproducibility of the system was provided.

N. Spatial Resolution

Test data to evaluate the composite optical performance of all components in the image acquisition phase was provided.

O. Focusing Test

Test data to evaluate the technical focus quality of the system was provided.

P. Whole Slide Tissue Coverage

Test data to demonstrate that the entire tissue specimen on the glass slide is detected by the tissue detection algorithms and that all the tissue specimens are included in the digital image file was provided.

Q. Stitching Error

Test data to evaluate the stitching errors and artifacts in the reconstructed image was provided.

R. Turnaround Time

Test data to evaluate the turnaround time of the system was provided.

S. User Interface

Information was provided on the parts of the system that users interact with, along with human factors/usability validation testing performed to demonstrate that representative users of the WSI system can perform essential tasks, and tasks critical to safety, under simulated use conditions.

2. User Interface/Human Factors Validation

Human factors studies designed to assess performance of critical user tasks and use scenarios by representative users, including anatomic pathology lab technicians and pathologists, were conducted. Information provided included a list of all critical user tasks and a description of the process that was followed. A systematic evaluation of simulated use by representative participants (17 technicians and 18 pathologists) performing all tasks (including critical tasks) required for operation of the system, and subjective assessment of potential failure modes was provided. All participants were able to perform all tasks (including the critical tasks), and no critical task failures were observed. There were some occasional difficulties that are generally expected with any new system with software and instrument(s), but the system's learnability and ease of use appeared to be sufficient. All user difficulties observed in the studies had minimal impact on the perception of the usability, and no difficulties or failures were observed performing tasks that could lead to permanent or serious patient harm. In all instances, both pathologists and histopathology technicians were able to identify cases and ensure that all information needed to perform primary diagnosis was available and accessible.


3. Electromagnetic Compatibility (EMC) Testing

VENTANA DP 200 EMC testing was performed by the external vendor Intertek Testing Services using the following test standards: EN 61326-1:2013, EN 61326-2-6:2013, EN 61000-3-2:2014, and EN 61000-3-3:2013. According to the Intertek test report, emission testing was performed in accordance with EN 55011:2009 +A1:2010, and VENTANA DP 200 met the testing requirements for Group 1, Class A equipment. The VENTANA DP 200 device also passed the immunity testing categories based on the corresponding test standards.

4. Clinical Testing

A. Clinical Accuracy Study

A multi-center study was conducted to demonstrate that viewing, reviewing, and diagnosing digital images of surgical pathology FFPE tissue slides using Roche Digital Pathology Dx (VENTANA DP 200) system is non-inferior to using traditional optical (light) microscopy. The primary endpoint was the difference in agreement rates between diagnoses rendered using Roche's digital WSI review modality (digital read [DR]) and the manual microscopy slide review modality (manual read [MR]) when each was compared to the reference diagnosis, which was based on the original sign-out pathologic diagnosis rendered at the study sites using an optical (light) microscope.

Four sites were used in the study. Two Screening Pathologists at each study site pre-screened cases from that site for possible inclusion in the study by reviewing their clinical database of archived specimens. Cases were identified sequentially (chronologically or reverse chronologically) with a minimum of one year between the date of sign-out diagnosis and beginning of the study. The first Screening Pathologist reviewed all available H&E and ancillary stained slides (i.e., immunohistochemistry and special stains) for each case using manual microscopy to determine whether the case met the study inclusion/exclusion criteria. By reviewing the microscopic slides used to make the sign-out diagnosis, the slide(s) that were representative of the sign-out diagnosis for the case were identified. The Screening Pathologist confirmed the diagnosis related to that particular case by microscopically evaluating the H&E and any ancillary stained slides along with the relevant clinical information as extracted from the sign-out report.


Once the case slides were reviewed and selected by the first Screening Pathologist, the signout diagnosis report data captured was verified by a second Screening Pathologist to confirm the diagnostic accuracy of the case and whether the case met the study inclusion/exclusion criteria.

Across 4 sites, a total of 2047 cases (a total of 3259 slides) consisting of multiple organ and tissue types were enrolled. At each site, all 4 Reading Pathologists read all the cases enrolled at that site using both MR and DR modalities in an alternating fashion and randomized order and with a washout period of at least 30 days between the MR and DR diagnoses, resulting in an expected total of 8188 DR diagnoses and 8188 MR diagnoses, or 16376 total diagnoses.

The 16 Reading Pathologists were provided with all representative slide(s) for each case at the same time, mimicking a practice setting. An electronic case report form (eCRF) was completed to document each Reading Pathologist's diagnosis. For a given case, up to 3 Adjudication Pathologists were assigned to review the Reading Pathologists' diagnoses that were rendered using MR and DR compared against the corresponding sign-out diagnoses and determined whether the reader diagnoses agreed with the reference sign-out diagnoses, disagreed with minor differences, or disagreed with major differences. A disagreement with major differences was defined as a difference in diagnosis that would be associated with a clinically important difference in patient management. A disagreement with minor differences was defined as a difference in diagnosis that would not be associated with a clinically important difference in patient management. First, 2 Adjudication Pathologists (adjudicators) separately assessed each Reading Pathologist's primary diagnosis evaluation for a case against the case's original sign-out diagnosis while blinded to site, Reading Pathologist's information, and reading modality (DR or MR). In the event that there was a disagreement between the 2 adjudicators, a third Adjudication Pathologist reviewed the case to achieve majority consensus. In cases where all 3 adjudicators had a different opinion, consensus was arrived at in an adjudication panel meeting consisting of the same 3 Adjudication Pathologists.
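The adjudication flow described above (2 adjudicators first, a third on disagreement, and a panel meeting for a 3-way split) can be sketched as follows. The function and category names are hypothetical illustrations, not the study's actual tooling:

```python
from collections import Counter

def adjudicate(scores):
    """Majority consensus over sequential adjudicator scores.

    `scores` holds each adjudicator's rating of a reader diagnosis:
    'agree', 'minor', or 'major'. Two adjudicators score first; a third
    is consulted only on disagreement; a 3-way split goes to a panel.
    (Hypothetical names and flow, sketched from the study description.)
    """
    first, second = scores[0], scores[1]
    if first == second:
        return first                       # 2-adjudicator agreement
    third = scores[2]                      # third adjudicator breaks ties
    winner, count = Counter([first, second, third]).most_common(1)[0]
    if count >= 2:
        return winner                      # majority consensus (2 of 3)
    return "panel"                         # 3-way split -> panel meeting
```

For example, `adjudicate(["agree", "agree"])` resolves immediately, while `adjudicate(["agree", "minor", "major"])` escalates to the panel.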

The primary objective was to demonstrate the non-inferiority of DR accuracy compared to MR accuracy. Diagnostic accuracy was defined as the agreement between the Reading Pathologist primary diagnoses in each reading mode as compared to the case's original reference sign-out diagnosis. The primary endpoint was the difference in diagnostic accuracy between DR and MR. The acceptance criterion for this endpoint was based on a hypothesis of non-inferiority. The lower bound of a 2-sided 95% confidence interval for the difference in accuracy (DR - MR) had to be greater than or equal to -4% to declare the DR method to be non-inferior to the MR method. A total of 7562 DR diagnoses paired with 7562 MR diagnoses adjudicated by the adjudication panel had consensus scores and were included in the statistical analyses. The observed overall agreement rate over all sites, Reading Pathologists, and organs was 92.00% for the DR modality and 92.61% for the MR modality. The DR-MR difference in agreement rate was -0.61% (95% CI: -1.59%, 0.35%).

In addition to the observed analysis, a Generalized Linear Mixed Model (GLIMMIX) logistic regression was conducted on the Intent to Adjudicate population to demonstrate the non-inferiority of the DR agreement rate as compared to the MR agreement rate. For each reading result and reading mode, the dependent variable was agreement with the sign-out diagnosis. The model accounted for fixed study effects (reading modality and organ type) and random study effects (site and reader nested within site). The agreement rates estimated by the GLIMMIX logistic model ("Model") were similar to the study point estimates, i.e., 91.54% for the DR modality and 92.16% for the MR modality. The DR-MR difference in agreement rate was -0.62%, with a 2-sided 95% CI of [-1.50%, 0.26%]. These model results showed no statistically significant difference between the 2 reading modalities.

The lower limit of the 95% confidence interval of DR-MR was greater than the pre-specified non-inferiority margin of -4%, and therefore, the DR modality using Roche Digital Pathology Dx was demonstrated to be non-inferior to the MR modality using light microscopy. Thus, the study met the primary objective.
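As a rough illustration of the acceptance criterion, the sketch below computes the DR-MR difference and a naive unpaired Wald confidence interval from the observed agreement rates. The study's reported CI (-1.59%, 0.35%) came from a paired analysis, so this simplified calculation only approximates it:

```python
import math

def wald_ci_diff(p1, n1, p2, n2, z=1.96):
    """Naive 2-sided 95% Wald CI for a difference in proportions (p1 - p2)."""
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Observed agreement rates over the 7562 adjudicated diagnosis pairs.
diff, lower, upper = wald_ci_diff(0.9200, 7562, 0.9261, 7562)

# Non-inferiority is declared when the lower CI bound is >= -4%.
non_inferior = lower >= -0.04
```

With these inputs, the lower bound lands well above the -4% margin, matching the study's conclusion.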

Table 3: Overall Major Discrepancy Rates

| Analysis | Modality | Total Reads | % Discordant | 95% CI |
|----------|----------|-------------|--------------|--------|
| Observed | Whole Slide Imaging Review (DR) | 7562 | 8.00 | 6.73, 9.27 |
| Observed | Light Microscope Slide Review (MR) | 7562 | 7.39 | 6.11, 8.78 |
| Observed | Difference (DR - MR) | | 0.61 | -0.35, 1.59 |
| Model | Whole Slide Imaging Review (DR) | 7725 | 8.46 | 7.35, 9.71 |
| Model | Light Microscope Slide Review (MR) | 7744 | 7.84 | 6.80, 9.12 |
| Model | Difference (DR - MR) | | 0.62 | -0.26, 1.50 |

Table 4: Agreement with Sign-Out Diagnosis Rates by Organ

| Organ Type | Digital Read (DR) | Manual Read (MR) | Difference in Agreement (DR-MR) |
|------------|-------------------|------------------|---------------------------------|
| Anus/Perianal | 93.0% | 95.7% | -2.7% |
| Appendix | 98.4% | 100.0% | -1.6% |
| Bladder | 85.9% | 87.8% | -1.8% |
| Brain/Neurological | 94.3% | 92.4% | 1.9% |
| Breast | 91.0% | 93.2% | -2.1% |
| Colorectal | 93.2% | 93.0% | 0.2% |
| Endocrine | 91.3% | 92.1% | -0.8% |
| GE Junction | 90.7% | 91.5% | -0.9% |
| Gallbladder | 100.0% | 100.0% | 0.0% |
| Gynecological | 89.6% | 89.6% | 0.0% |
| Hernial/Peritoneal | 100.0% | 100.0% | 0.0% |
| Kidney, Neoplastic | 96.2% | 94.9% | 1.3% |
| Liver/Bile duct, Neoplastic | 97.0% | 98.5% | -1.5% |
| Lung/Bronchus/Larynx/Oral Cavity/Nasopharynx | 89.4% | 92.3% | -2.9% |
| Lymph Node | 97.1% | 97.8% | -0.7% |
| Prostate | 93.4% | 92.9% | 0.5% |
| Salivary Gland | 95.3% | 94.8% | 0.5% |
| Skin | 89.6% | 89.4% | 0.2% |
| Soft Tissue Tumors | 96.6% | 93.1% | 3.4% |
| Stomach | 92.4% | 93.6% | -1.2% |
| Overall | 92.0% | 92.6% | -0.6% |

The difference in modality agreement, DR-MR, ranged from -2.9% for lung to 3.4% for soft tissue tumors. Three organ types, gallbladder, gynecological, and hernial/peritoneal had no differences (difference between reading modalities of 0.0%). For all organ types, the overall agreement rate was 92.0% for DR and 92.6% for MR, with a difference in agreement (DR-MR) of -0.6%. The lowest agreement in both modalities was observed with bladder cases, with an agreement rate of 85.9% for DR and 87.8% for MR.

When considering the 1592 cases where all 4 Reading Pathologists at the site provided successfully adjudicated diagnoses for both DR and MR, there were a total of 9552 MR/MR comparisons and 9552 DR/DR comparisons between each possible pair of readers for agreement rate calculation. Overall, when considering all reader comparisons across all sites, the between-reader agreement rate was 91.4% (95% CI: 90.8, 91.9) for MR, and 90.6% (95% CI: 90.0, 91.1) for DR. The between-reader agreement rate ranged from 86.9% to 95.2% for MR, and 85.1% to 94.5% for DR.
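The read and comparison counts quoted above follow from simple combinatorics; the check below is a sketch, not study code:

```python
from math import comb

cases, readers_per_site, modalities = 2047, 4, 2
# Every enrolled case is read by all 4 site readers in both DR and MR.
expected_diagnoses = cases * readers_per_site * modalities

# Between-reader agreement used the 1592 cases with successfully
# adjudicated diagnoses from all 4 readers in both modalities; each
# case contributes C(4, 2) = 6 reader pairs per modality.
pairs_per_modality = 1592 * comb(readers_per_site, 2)
```

This reproduces the 16376 total diagnoses and the 9552 DR/DR (and MR/MR) reader-pair comparisons.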

B. Precision Study

1. Analytical Performance

a. Reproducibility/Precision

The objective of the reproducibility/precision study was to evaluate the between-site, between-day, between-reader, and within-reader precision of Roche Digital Pathology Dx. The study was designed to examine the full scope of device variability, with multiple Reading Pathologists (readers) independently identifying specific histological primary features in multiple Regions of Interest (ROIs; equivalent to the predicate's "Fields of View" [FOVs]) pre-selected on WSI scans generated by multiple VENTANA DP 200 scanners (at multiple external pathology laboratories) across multiple scanning days.

The precision of the device was evaluated based on the ability of 2 readers at each of 3 external pathology laboratories (study sites) to consistently detect 23 protocol-specified histopathologic primary feature types in VENTANA DP 200 WSI scans of hematoxylin and eosin (H&E)-stained archival slides containing sections of a variety of FFPE human tissue and organ types. The list of 23 primary feature types examined (shown in Table 5) was identical to those examined in the predicate device precision study, although the cases tested were unique to the Roche study. Scanning was performed at either 20x (for 12 primary feature types) or at 40x (for 11 primary feature types) as designated by the study protocol.

For each of the 23 feature types, 3 unique study cases, generally from different organ systems or tissue types (see Table 1 for distribution), were enrolled and included in the study analyses, for a total analysis cohort of 69 cases (slides). H&E-stained slides from 12 additional unique cases were included in the study as "wild card" slides to reduce recall bias but were excluded from the statistical analyses, as in the predicate studies. Each of the wild card slides also contained 3 ROIs selected by the Screening Pathologist, but the 3 ROIs for a given wild card slide did not have to contain the same type of primary feature. Since each slide enrolled in the study contained 3 ROIs, a total of 207 "study" ROIs (69 study cases x 3 ROIs/study case) were included in the analyses, and an additional 36 wild card ROIs (12 wild card cases x 3 ROIs/wild card case) were included in the study.
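The case and ROI counts above reduce to straightforward arithmetic (a quick sketch, not study code):

```python
feature_types, cases_per_feature, rois_per_slide = 23, 3, 3

study_cases = feature_types * cases_per_feature   # 69 analyzed cases/slides
study_rois = study_cases * rois_per_slide         # 207 study ROIs
wild_card_rois = 12 * rois_per_slide              # 36 recall-bias ROIs
```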

Table 5: Primary Histologic Study Features Used in the Precision Study

| Primary Feature (Scanning Magnification Level) | Organ Systems/Tissue Types of the 3 Enrolled Study Cases | | |
|---|---|---|---|
| Chondrocytes (20x) | Bone (left proximal humerus) | Bone (right scapula) | Soft tissue (chest, xiphoid) |
| Fat cells (adipocytes) (20x) | Lymph node (anterior prostatic) | Lymph node (pelvis, right) | Omentum |
| Foreign body giant cells (20x) | Breast (lower outer) | Liver #1 | Soft tissue (sixth intercostal muscle) |
| Goblet cells (20x) | Colon (ascending) | Duodenum (second portion) | Lung (left upper lobe) |
| Granulomas (20x) | Lung (left lower lobe) | Lymph node (inguinal) | Lymph node (right axillary) |
| Infiltrating or metastatic lobular carcinoma (20x) | Breast (left) | Breast (right) | Chest wall (right) |
| Intraglandular necrosis (20x) | Breast (left) | Breast (right, lower outer quadrant) | Oral cavity (left buccal mucosa) |
| Osteoclasts (20x) | Bone (fibula, proximal, lesion, right) | Bone (left temporal bone) | Bone (tibia, left) |
| Osteocytes (20x) | Bone (frontal, right) | Bone (right tibia) | Pelvis (left, acetabular lesion) |
| Pleomorphic nucleus of malignant cell (20x) | Brain (right temporal mass) | Soft tissue (left thigh) | Soft tissue (right thigh) |
| Serrated intestinal epithelium (eg, sessile serrated polyp) (20x) | Colon (ascending, polyp) | Colon (ascending, polyp) | Colon (ascending, polyp) |
| Skeletal muscle fibers (20x) | Breast (left) | Left thigh | Thyroid gland |
| Asteroid bodies (40x) | Knee, right, synovium #3 | Lung (upper lobe) | Lymph node |
| Clear cells (40x) | Aorta, inter aorta caval lymph node | Left kidney | Ovary and fallopian tube |
| Foreign bodies (eg, plant material or foreign debris) (40x) | Aorta, ascending, pseudoaneurysm wall | Small intestine and colon | Soft tissue (abdomen) |
| Hemosiderin (pigment) (40x) | Breast (left chest wall nodule) | Breast (right) | Breast (right) |
| Megakaryocytes (40x) | Bone (distal sternum and right ribs) | Bone (rib, right) | Buttock (left, lesion) |
| Necrosis (40x) | Lung (right lower lobe) | Lung (right upper lobe) | Right great toe |
| Nerve cell bodies (eg, ganglion cells) (40x) | Colon (cecum, polyp x2) | Esophagus | Soft tissue (left paraspinal) |
| Nuclear grooves (40x) | Bone (left acetabulum) | Left fallopian tube and left ovary | Thyroid (right) |
| Osteoid matrix (40x) | Bone (left acetabulum) | Bone (right tibia) | Bone (right ulna) |
| Psammoma bodies (40x) | Brain (posterior fossa tumor) | Brain (right frontal tumor) | Thyroid (lobe, left) |
| Reed-Sternberg cell (40x) | Lymph node (cervical right, level IV) | Neck mass (right) | Thymus |

The ROIs from the 3 scanning sessions at each site were independently evaluated by the 2 readers at that site in 3 different reading sessions (1 reading session per scanning day), with at least a 2-week washout period between reading sessions. In each reading session, each reader evaluated, in randomized fashion, all 207 study ROIs from each of the 3 scanning sessions (across 3 scanning systems) at their site, plus the randomly interspersed, unique "wild card" ROIs to reduce recall bias between reading sessions.

Thus, all 6 readers evaluated 207 study ROIs (and 36 wild card ROIs) in each of 3 different reading sessions, and their study ROI assessments were used for the co-primary analyses of between-system/site, between-day/within-system, and between-reader precision. Each reader also participated in a fourth reading session in which they re-evaluated their site's day 1 ROI images in a different random order; their 2 assessments for each day 1 ROI image were then compared to each other to determine within-reader precision as an additional analysis.

In each of their reading sessions, each reader accessed the designated ROI images from their site in uPath and evaluated them to identify any primary features that were present, using a checklist of the 23 protocol-specified primary features and their designated magnification levels as a reference. During their evaluations, readers were provided with the scanning magnification level and organ system/tissue type (conveyed as shown in Table 5) for the ROI image and were able to move about freely on the ROI image, varying the viewing magnification as desired, but they were blinded to case ID, patient clinical information and sign-out diagnoses, and all previous screening or study results. Each primary feature assessment for a study case ROI image was then compared to the reference primary feature for that case, and agreement that the reference feature was present was evaluated between readers, between sites, and between scanning days. As with the predicate device's precision study, if a reader identified a primary feature other than the reference feature as present in a given ROI image (in addition to or instead of the reference feature), that non-reference identification had no effect on the study endpoints. Only the results from the study ROIs were used in the statistical analyses; those from wild card ROIs were excluded. The precision of the system was to be considered acceptable if the lower bounds of the 2-sided 95% CIs for all co-primary endpoints (i.e., the overall percent agreement [OPA] point estimates for between-site/system, between-day/within-system, and between-reader agreement) were at least 85%. No acceptance criterion was defined for within-reader agreement.

Results of the Precision Analyses

The results of the co-primary analyses of precision (between-site/system, between-day/within-system, and between-reader OPA point estimates) are summarized below in Table 6. More details of these analyses are provided in the following sections, followed by a discussion of the within-reader analysis and results. For each of the co-primary analyses, the lower bound of the 95% CI for the OPA point estimate was >85%, demonstrating that Roche Digital Pathology Dx has acceptable precision. For these analyses, OPA was aggregated across the other sources of variation (eg, between-site OPA was aggregated across readers and days).

Table 6: Overall Percent Agreement Rates for Co-Primary Endpoints

| Co-Primary Endpoint | OPA, % (n/N)* | 95% CI** |
|---|---|---|
| Between-Site/System | 89.3 (19510/21839) | (85.8, 92.4) |
| Between-Days/Within-System | 90.3 (3302/3656) | (87.1, 93.2) |
| Between-Readers | 90.1 (1650/1832) | (86.6, 93.0) |

*n = number of pairwise comparisons for which the reference primary feature was identified in both assessments; N = total number of pairwise comparisons.

**Two-sided 95% confidence intervals were constructed using the percentile bootstrap method from 2000 replicates.
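The percentile bootstrap named in the footnote can be sketched as below. The resampling unit (individual pairwise comparisons) and the seed are simplifying assumptions for illustration, not the study's actual statistical code:

```python
import random

def bootstrap_percentile_ci(matches, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap 95% CI for an overall percent agreement.

    `matches` is a list of 0/1 pairwise-comparison outcomes (1 = the
    reference feature was identified in both assessments). A minimal
    sketch of the percentile method with 2000 replicates.
    """
    rng = random.Random(seed)
    n = len(matches)
    stats = sorted(
        100.0 * sum(rng.choices(matches, k=n)) / n for _ in range(n_boot)
    )
    lo = stats[int(n_boot * alpha / 2)]
    hi = stats[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi

# Example: 1650 agreements out of 1832 between-reader comparisons (~90.1%).
lo, hi = bootstrap_percentile_ci([1] * 1650 + [0] * 182)
```

The resulting interval brackets the ~90.1% point estimate, analogous to the (86.6, 93.0) interval reported in Table 6.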

Table 7: Between-Site/Between-System Precision Study Results

| System | Number of Pairwise Agreements | Number of Comparison Pairs | % Agreement | 95% CI |
|---|---|---|---|---|
| Site A vs. Site B | 6277 | 7210 | 87.1 | (83.2, 90.7) |
| Site A vs. Site C | 6572 | 7284 | 90.2 | (86.8, 93.3) |
| Site B vs. Site C | 6661 | 7345 | 90.7 | (87.2, 93.8) |
| Overall | 19510 | 21839 | 89.3 | (85.8, 92.4) |

Between-Day / Within-System Precision

Between-day (within-system) precision was analyzed by first performing all possible pairwise comparisons between days for each reader separately (ie, for each reader, their Day 1 results were compared to their Day 2 results, their Day 2 results were compared to their Day 3 results, and their Day 1 results were compared to their Day 3 results) and then pooling the 3 day-pair results together. These individual reader results were then aggregated across all readers at all sites to determine overall between-day/within-system precision.
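A minimal sketch of this pooling scheme, under the simplifying assumption that each ROI read is recorded as whether the reference feature was identified (a hypothetical data layout, not the study software):

```python
from itertools import combinations

def between_day_agreement(reads_by_day):
    """Pool all pairwise day-to-day comparisons for one reader.

    `reads_by_day` maps day -> {roi_id: reference feature identified}.
    All day pairs (1 vs 2, 2 vs 3, 1 vs 3) are compared ROI-by-ROI; a
    pair counts as an agreement when the reference feature was
    identified on both days, mirroring the n/N definition in Table 6.
    """
    agree = total = 0
    for d1, d2 in combinations(sorted(reads_by_day), 2):
        for roi, found in reads_by_day[d1].items():
            if roi in reads_by_day[d2]:
                total += 1
                agree += bool(found and reads_by_day[d2][roi])
    return agree, total

# Toy example: 2 ROIs read on 3 days; the reader missed ROI "r2" on day 2.
reads = {
    1: {"r1": True, "r2": True},
    2: {"r1": True, "r2": False},
    3: {"r1": True, "r2": True},
}
agreements, comparisons = between_day_agreement(reads)
```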

Table 8: Between-Day / Within-System Precision Study Results

| Site, Reader | Number of Day-to-Day Pairwise Agreements | Number of Comparison Pairs | % Agreement | 95% CI |
|---|---|---|---|---|
| Site A, Reader 1 | 495 | 592 | 83.6 | (78.7, 88.2) |
| Site A, Reader 2 | 549 | 607 | 90.4 | (86.0, 94.6) |
| Site B, Reader 1 | 540 | 613 | 88.1 | (83.7, 92.3) |
| Site B, Reader 2 | 541 | 604 | 89.6 | (84.9, 93.7) |
| Site C, Reader 1 | 607 | 621 | 97.7 | (94.8, 100.0) |
| Site C, Reader 2 | 570 | 619 | 92.1 | (88.4, 95.5) |
| Overall | 3302 | 3656 | 90.3 | (87.1, 93.2) |

Between-Reader Precision

In the between-reader precision analysis, the pairwise agreement between the 2 readers within a site (i.e., between the 2 readers at Site A, between the 2 readers at Site B, and between the 2 readers at Site C) was analyzed separately for each site, and these pairwise results were then pooled across all sites.


Table 9: Between-Reader Precision Study Results

| Pathologist (Reader) | Number of Pairwise Agreements | Number of Comparison Pairs | % Agreement | 95% CI |
|------------------|------|------|------|--------------|
| Reader A1 vs A2 | 528 | 603 | 87.6 | (83.3, 91.4) |
| Reader B1 vs B2 | 536 | 609 | 88.0 | (83.6, 92.2) |
| Reader C1 vs C2 | 586 | 620 | 94.5 | (91.5, 97.3) |
| Overall | 1650 | 1832 | 90.1 | (86.6, 93.0) |

Within-Reader Precision

The within-reader precision analysis compared each reader's first assessment of their site's Day 1 ROI images (performed in their first reading session) with the same reader's second assessment of the same ROI images (performed in their fourth reading session), with the images presented in a different random order in each session. Because of the minimum 2-week washout period between reading sessions 1, 2, 3, and 4, the assessments used in the within-reader precision analysis were performed at least 6 weeks apart. Pairwise agreement between reads was assessed for each reader separately, and the results were aggregated across all readers at all sites to determine the overall OPA for within-reader precision. Within-reader precision did not have a predefined acceptance criterion.

Table 10: Within-Reader Precision Study Results

| Pathologist (Reader) | Number of Pairwise Agreements | Number of Comparison Pairs | % Agreement | 95% CI |
|------------------|------|------|------|---------------|
| Site A, Reader 1 | 160 | 200 | 80.0 | (73.4, 86.1) |
| Site A, Reader 2 | 183 | 203 | 90.1 | (85.7, 94.5) |
| Site B, Reader 1 | 172 | 206 | 83.5 | (78.0, 88.9) |
| Site B, Reader 2 | 175 | 201 | 87.1 | (81.4, 92.0) |
| Site C, Reader 1 | 202 | 207 | 97.6 | (94.7, 100.0) |
| Site C, Reader 2 | 186 | 206 | 90.3 | (86.0, 94.2) |
| Overall | 1078 | 1223 | 88.1 | (84.8, 91.3) |

CONCLUSION

The submitted information in this premarket notification is complete and supports a substantial equivalence decision.