K Number
DEN200080
Device Name
Paige Prostate
Manufacturer
Paige.AI, Inc.
Date Cleared
2021-09-21

(264 days)

Product Code
QPN
Regulation Number
864.3750
Intended Use
Paige Prostate is a software only device intended to assist pathologists in the detection of foci that are suspicious for cancer during the review of scanned whole slide images (WSI) from prostate needle biopsies prepared from hematoxylin & eosin (H&E) stained formalin-fixed paraffin embedded (FFPE) tissue. After initial diagnostic review of the WSI by the pathologist, if Paige Prostate detects tissue morphology suspicious for cancer, it provides coordinates (X,Y) on a single location on the image with the highest likelihood of having cancer for further review by the pathologist. Paige Prostate is intended to be used with slide images digitized with Philips Ultra Fast Scanner and visualized with Paige FullFocus WSI viewing software. Paige Prostate is an adjunctive computer-assisted methodology and its output should not be used as the primary diagnosis. Pathologists should only use Paige Prostate in conjunction with their complete standard of care evaluation of the slide image.
Device Description
Paige Prostate is an in vitro diagnostic medical device software, derived from a deterministic deep learning system that has been developed with digitized WSIs of H&E stained prostate needle biopsy slides. Paige Prostate utilizes several accessory devices, as shown in Figure 1 below, for automated ingestion of the input. The device identifies areas suspicious for cancer on the input WSIs. For each input WSI, Paige Prostate automatically analyzes the WSI and outputs the following: a binary classification of suspicious or not suspicious for cancer based on a pre-defined threshold on the neural network output, and, if the slide is classified as suspicious for cancer, a single coordinate (X,Y) of the location with the highest probability of cancer on the image.
More Information

Reference Device(s)
Not Found

The provided text does not contain any K/DEN numbers for reference devices. The "Reference Device(s)" section explicitly states "Not Found".

AI/ML
Yes
The device description explicitly states it is "derived from a deterministic deep learning system" and "utilizes several accessory devices... for automated ingestion of the input. The device identifies areas suspicious for cancer on the input WSIs... Paige Prostate automatically analyzes the WSI and outputs... Binary classification... based on a pre-defined threshold on the neural network output." Deep learning and neural networks are forms of machine learning/AI.

Therapeutic
No.
The device assists pathologists in detecting suspicious foci for cancer; it does not directly treat or prevent a disease or condition.

Diagnostic
Yes

The device is intended to assist pathologists in the detection of foci suspicious for cancer during the review of scanned whole slide images (WSI) from prostate needle biopsies. Its output includes a binary classification of suspicious or not suspicious for cancer and coordinates of the location with the highest probability of cancer, explicitly designed for diagnostic purposes. It is also described as an "in vitro diagnostic medical device software."

SaMD (Software as a Medical Device)
Yes

The device description explicitly states "Paige Prostate is an in vitro diagnostic medical device software". While it utilizes accessory devices for input ingestion, the core medical device under review is the software itself, which performs the analysis and provides the output.

IVD (In Vitro Diagnostic)
Yes, this device is an IVD (In Vitro Diagnostic).

Here's why:

  • Explicitly Stated: The "Device Description" section explicitly states: "Paige Prostate is an in vitro diagnostic medical device software..."
  • Intended Use: The intended use is to assist pathologists in the detection of foci suspicious for cancer during the review of scanned whole slide images from prostate needle biopsies. This involves analyzing biological samples (tissue from biopsies) to provide information for a medical diagnosis.
  • Input: The input is "scanned whole slide images (WSI) from prostate needle biopsies prepared from hematoxylin & eosin (H&E) stained formalin-fixed paraffin embedded (FFPE) tissue." This is a biological sample that has been processed for microscopic examination.
  • Output: The output is a binary classification of suspicious or not suspicious for cancer and, if suspicious, a coordinate location. This output is directly related to the analysis of the biological sample and is intended to aid in a medical diagnosis.
  • Adjunctive Tool: While it's an adjunctive tool and not for primary diagnosis, its purpose is to provide information derived from the in vitro analysis of a biological sample to assist in a diagnostic process.

All these factors align with the definition of an In Vitro Diagnostic device, which is used to examine specimens derived from the human body to provide information for the diagnosis, prevention, or treatment of a disease or condition.

PCCP Authorized
No
The letter mentions identified risks and mitigation measures but does not state that the FDA has reviewed and approved or cleared a Predetermined Change Control Plan (PCCP) for this specific device.

Intended Use / Indications for Use

Paige Prostate is a software only device intended to assist pathologists in the detection of foci that are suspicious for cancer during the review of scanned whole slide images (WSI) from prostate needle biopsies prepared from hematoxylin & eosin (H&E) stained formalin-fixed paraffin embedded (FFPE) tissue. After initial diagnostic review of the WSI by the pathologist, if Paige Prostate detects tissue morphology suspicious for cancer, it provides coordinates (X,Y) on a single location on the image with the highest likelihood of having cancer for further review by the pathologist.

Paige Prostate is intended to be used with slide images digitized with Philips Ultra Fast Scanner and visualized with Paige FullFocus WSI viewing software.

Paige Prostate is an adjunctive computer-assisted methodology and its output should not be used as the primary diagnosis. Pathologists should only use Paige Prostate in conjunction with their complete standard of care evaluation of the slide image.

Product codes

QPN

Device Description

Paige Prostate is an in vitro diagnostic medical device software, derived from a deterministic deep learning system that has been developed with digitized WSIs of H&E stained prostate needle biopsy slides.

Paige Prostate utilizes several accessory devices, as shown in Figure 1 below, for automated ingestion of the input. The device identifies areas suspicious for cancer on the input WSIs. For each input WSI, Paige Prostate automatically analyzes the WSI and outputs the following (a schematic sketch in code follows the list):

  • Binary classification of suspicious or not suspicious for cancer, based on a pre-defined threshold on the neural network output.
  • If the slide is classified as suspicious for cancer, a single coordinate (X,Y) of the location with the highest probability of cancer on the image.
  • If the slide is classified as not suspicious for cancer, no additional output is provided by Paige Prostate. The Paige FullFocus WSI viewer will display "Not Suspicious for Cancer - Area of Interest Not Available".
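
Schematically, this two-part output reduces to a threshold on the network's suspicion scores plus an argmax for the coordinate. A minimal sketch follows; the names (`heatmap`, `classify_slide`) and the threshold value are invented for illustration, since the device's internals are not public:

```python
import numpy as np

# Illustrative stand-in for the device's pre-defined operating point; the
# actual threshold used by Paige Prostate is not disclosed in this summary.
THRESHOLD = 0.5

def classify_slide(heatmap: np.ndarray):
    """Return a binary slide label and, if suspicious, one (X, Y) coordinate.

    `heatmap` stands in for per-region neural network suspicion scores laid
    out on the WSI grid.
    """
    if heatmap.max() < THRESHOLD:
        return "Not Suspicious for Cancer", None  # no coordinate is reported
    # Single location with the highest suspicion score on the slide.
    row, col = np.unravel_index(int(np.argmax(heatmap)), heatmap.shape)
    return "Suspicious for Cancer", (int(col), int(row))  # (X, Y)
```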

Mentions image processing

Yes

Mentions AI, DNN, or ML

Paige Prostate is an in vitro diagnostic medical device software, derived from a deterministic deep learning system that has been developed with digitized WSIs of H&E stained prostate needle biopsy slides.

Input Imaging Modality

Scanned whole slide images (WSI) from prostate needle biopsies prepared from hematoxylin & eosin (H&E) stained formalin-fixed paraffin embedded (FFPE) tissue.

Anatomical Site

Prostate

Indicated Patient Age Range

Not Found

Intended User / Care Setting

Pathologists / Prescription use only, in vitro diagnostic (IVD) use only, in conjunction with the Philips Ultra Fast Scanner and FullFocus image viewing software, in remote or on-site settings.

Description of the training set, sample size, data source, and annotation protocol

Algorithm development: Paige Prostate algorithm development was performed on training, tuning, and test datasets. Each dataset contained slides from unique patients ensuring that training, tuning, and test datasets do not have any slides, cases, or patients in common. De-identified slides were labeled as benign or cancer based on the synoptic diagnostic pathology report. These datasets were completely independent from the validation dataset.
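
Keeping slides, cases, and patients disjoint across splits is a standard guard against data leakage. A generic sketch of a patient-level split is shown below, using scikit-learn's `GroupShuffleSplit`; this illustrates the principle only and is not Paige's actual pipeline:

```python
from sklearn.model_selection import GroupShuffleSplit

def patient_level_split(slides, labels, patients, test_size=0.2, seed=0):
    """Split slide indices so no patient appears in more than one partition.

    `slides`, `labels`, and `patients` are parallel sequences (hypothetical
    names): slide IDs, benign/cancer labels from the synoptic report, and a
    patient ID per slide used as the grouping key.
    """
    splitter = GroupShuffleSplit(n_splits=1, test_size=test_size, random_state=seed)
    train_idx, test_idx = next(splitter.split(slides, labels, groups=patients))
    # Sanity check: the two partitions share no patients.
    assert not {patients[i] for i in train_idx} & {patients[i] for i in test_idx}
    return train_idx, test_idx
```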

Training Dataset:

  • De-identified slides from cases prepared and diagnosed at an internal site located in the US, from 2013-2017, scanned with an Aperio Leica AT2 scanner
  • Number of slide images: 33,543

Tuning Dataset:

  • Slides prepared and diagnosed at internal site*, scanned with an Aperio Leica AT2 scanner
  • Number of slide images: 5,598

Description of the test set, sample size, data source, and annotation protocol

Test Datasets:

  • Same as tuning dataset, but scanned on Philips PIPS scanner. Number of slide images: 5,598
  • Slides prepared at external sites but diagnosed at the internal site located in the US, scanned with an Aperio Leica AT2 scanner. Number of slide images: 10,605

Summary of Performance Studies (study type, sample size, AUC, MRMC, standalone performance, key results)

Analytical Performance

a. Algorithm Localization (X,Y Coordinate) and Accuracy Study:

  • Study Type: Evaluation of Paige Prostate's performance in identifying suspicious digital histopathology images of prostate needle biopsies and localizing a specific focus (X,Y coordinate) with the highest suspicion for cancer.
  • Sample Size: The final sample set consisted of 728 WSIs (311 from cancer slides and 417 from benign slides), drawn from an original set of 847 scanned digital WSIs of prostate needle biopsy slides (353 cancer and 494 benign). Only unique patient-level cases were used in the data analysis.
  • Data Source: Scanned images obtained using the FDA-cleared Philips UFS scanner.
  • Annotation Protocol: (X,Y) coordinates identified by Paige Prostate were evaluated against manual annotations of regions drawn by 3 study pathologists (blinded to Paige Prostate results). Localization ground truths were determined by the union of annotations shared by at least 2 of the 3 annotating pathologists (see the consensus sketch after this list). Pathologists were instructed to draw reasonably tight boundaries for annotations enclosing cancerous regions, allowing benign cells to be mixed in.
  • Standalone Performance / Key Results:
    • Sensitivity: 94.5% (294/311) [95% CI: 91.4%; 96.6%]
    • Specificity: 94.0% (392/417) [95% CI: 91.3%; 95.9%]
    • False Negative (FN): 17 out of 311 (9 FN from slide identified as benign, 8 FN from slide identified as cancer but incorrect localization).
    • False Positive (FP): 25 out of 417.
    • Sensitivity stratified by source: Internal site 94.1% (128/136), External sites 94.9% (166/175).
    • Specificity stratified by source: Internal site 96.7% (177/183), External sites 91.9% (215/234).
    • Differences in specificity between internal and external sites attributed to diversity in patient population and slide preparation techniques.
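
A minimal sketch of the "at least 2 of 3" consensus rule and the resulting hit test; the boolean masks and function names are hypothetical stand-ins for the pathologists' drawn regions:

```python
import numpy as np

def consensus_region(mask_a: np.ndarray, mask_b: np.ndarray, mask_c: np.ndarray):
    """Pixels annotated as cancerous by at least 2 of the 3 pathologists."""
    votes = mask_a.astype(int) + mask_b.astype(int) + mask_c.astype(int)
    return votes >= 2

def localization_hit(xy, ground_truth: np.ndarray) -> bool:
    """True if the device's (X, Y) output falls inside the consensus region."""
    x, y = xy
    return bool(ground_truth[y, x])
```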

b. Precision Study:

  • Study Type: Evaluation of the precision of Paige Prostate in identifying suspicious digital histopathology images of prostate needle biopsies and localizing a focus within a prespecified distance threshold with the highest suspicion for cancer.
  • Sample Size: 35 WSIs of prostate cancer slides and 36 WSIs of benign prostate slides from unique patients (for the within-scanner and reproducibility studies). The localization precision study used 19 cancer and 4 benign WSIs (95 crops from the 19 cancer slides).
  • Data Source: Not explicitly stated beyond "Slides used in the precision studies were not slides used during development".
  • Ground Truth: Synoptic diagnostic reports from the internal site.
  • Key Results (see the agreement sketch after this list):
    • Within-scanner precision study:
      • Overall Average Positive Percent Agreement (PPA) for cancer images: 99.0% (95% CI: 97.1%, 100.0%)
      • Overall Average Negative Percent Agreement (NPA) for benign images: 94.4% (95% CI: 88.9%, 99.1%)
      • Percent of cancer slide images with correct results with all 3 scan repetitions: 97.1% (34/35)
      • Percent of benign slide images with correct results with all 3 scan repetitions: 88.9% (32/36)
    • Reproducibility (between-scanner and between-operator variability) study:
      • Overall Average PPA for cancer images: 100%
      • Overall Average NPA for benign images: 93.5% (95% CI: 88.0%, 98.1%)
      • Percent of cancer slide images with correct results with all 3 scans: 100% (35/35)
      • Percent of benign slide images with correct results with all 3 scans: 83.3% (30/36)
    • Localization precision study:
      • 3 scans of Operator 1/Scanner 1: 98.2% (56/57) correct location (95% CI: 90.7%; 99.7%)
      • 3 scanners (1 scan per scanner) Operator 1/Scanner 1, Operator 2/Scanner 2, Operator 3/Scanner 3: 96.4% (53/55) correct location (95% CI: 87.7%; 99.0%)
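
As context for these figures, average positive/negative percent agreement over scan replicates can be computed as in the sketch below (toy data and invented names, not the sponsor's analysis code):

```python
def average_agreement(replicate_calls, expected: bool) -> float:
    """Fraction of replicate calls matching `expected` (PPA when expected=True
    on cancer slides, NPA when expected=False on benign slides).

    `replicate_calls` has one entry per slide, each entry holding the binary
    "suspicious" calls from that slide's scan replicates.
    """
    flat = [call for slide in replicate_calls for call in slide]
    return sum(call == expected for call in flat) / len(flat)

# Toy example: 3 scan replicates per slide; values are illustrative only.
ppa = average_agreement([[True, True, True], [True, True, False]], expected=True)
npa = average_agreement([[False, False, False], [False, True, False]], expected=False)
print(f"PPA={ppa:.1%}, NPA={npa:.1%}")  # PPA=83.3%, NPA=83.3%
```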

2. Clinical Study:

  • Study Type: Retrospective clinical study to evaluate the effectiveness of Paige Prostate in improving the diagnostic accuracy of pathologists.
  • Sample Size: 527 WSIs (171 prostate cancer slides and 356 benign slides) from unique patients. 16 pathologist readers (2 GU subspecialists and 14 general specialists).
  • Data Source: Original diagnostic synoptic reports provided slide-level cancer/benign ground truths. Whole slide images came from cases prepared, reviewed, diagnosed, and digitized at an internal site (44.15%) and from cases prepared at 156 different external sites (55.85%).
  • Annotation Protocol: Pathologists performed an unassisted read followed by an assisted read for every WSI. They classified each slide as cancer, no cancer, or defer for more information.
  • Key Results:
    • The clinical study demonstrated improvements in sensitivity and small differences in specificities between assisted and unassisted reads.
    • Combined Data (Average of 16 pathologists):
      • Average improvement in sensitivity: 7.3% (95% CI: 3.9%; 11.4%) (statistically significant). Median: 5.6%, Range: 0.6% to 24.0%.
      • Average difference in specificity: 1.1% (95% CI: -0.7%; 3.4%) (not statistically significant). Median: 0.4%, Range: -1.7% to 13.8%.
    • Reduction in False Negatives: Overall reduction in false negative slides was 12.56 slides, which is 7.3% (=12.56/171) (statistically significant).
    • Difference in False Positives: Overall difference in false positive slide images was 3.75 slides, which is 1.05% (=3.75/356) (not statistically significant).
    • Specialist vs. Generalist:
      • Generalist (Remote): Average improvement in sensitivity 8.3%, Difference in specificity 1.5%.
      • Generalist (On-site): Average improvement in sensitivity 6.6%, Difference in specificity 1.0%.
      • Specialist (Remote): Average improvement in sensitivity 2.9%, Difference in specificity -1.3%.

Key Metrics (Sensitivity, Specificity, PPV, NPV, etc.)

Algorithm Localization and Accuracy Study:

  • Sensitivity: 94.5% (294/311)
  • Specificity: 94.0% (392/417)

Clinical Study (Average of 16 pathologists):

  • Improvement in Sensitivity (Assisted vs. Unassisted): 7.3%
  • Difference in Specificity (Assisted vs. Unassisted): 1.1%

Predicate Device(s)

Not Found

Reference Device(s)

Not Found

Predetermined Change Control Plan (PCCP) - All Relevant Information

Not Found

§ 864.3750 Software algorithm device to assist users in digital pathology.

(a) Identification. A software algorithm device to assist users in digital pathology is an in vitro diagnostic device intended to evaluate acquired scanned pathology whole slide images. The device uses software algorithms to provide information to the user about presence, location, and characteristics of areas of the image with clinical implications. Information from this device is intended to assist the user in determining a pathology diagnosis.

(b) Classification. Class II (special controls). The special controls for this device are:

(1) The intended use on the device's label and labeling required under § 809.10 of this chapter must include:
(i) Specimen type;
(ii) Information on the device input(s) (e.g., scanned whole slide images (WSI), etc.);
(iii) Information on the device output(s) (e.g., format of the information provided by the device to the user that can be used to evaluate the WSI, etc.);
(iv) Intended users;
(v) Necessary input/output devices (e.g., WSI scanners, viewing software, etc.);
(vi) A limiting statement that addresses use of the device as an adjunct; and
(vii) A limiting statement that users should use the device in conjunction with complete standard of care evaluation of the WSI.
(2) The labeling required under § 809.10(b) of this chapter must include:
(i) A detailed description of the device, including the following:
(A) Detailed descriptions of the software device, including the detection/analysis algorithm, software design architecture, interaction with input/output devices, and necessary third-party software;
(B) Detailed descriptions of the intended user(s) and recommended training for safe use of the device; and
(C) Clear instructions about how to resolve device-related issues (e.g., cybersecurity or device malfunction issues).
(ii) A detailed summary of the performance testing, including test methods, dataset characteristics, results, and a summary of sub-analyses on case distributions stratified by relevant confounders, such as anatomical characteristics, patient demographics, medical history, user experience, and scanning equipment, as applicable.
(iii) Limiting statements that indicate:
(A) A description of situations in which the device may fail or may not operate at its expected performance level (e.g., poor image quality or for certain subpopulations), including any limitations in the dataset used to train, test, and tune the algorithm during device development;
(B) The data acquired using the device should only be interpreted by the types of users indicated in the intended use statement; and
(C) Qualified users should employ appropriate procedures and safeguards (e.g., quality control measures, etc.) to assure the validity of the interpretation of images obtained using this device.
(3) Design verification and validation must include:
(i) A detailed description of the device software, including its algorithm and its development, that includes a description of any datasets used to train, tune, or test the software algorithm. This detailed description of the device software must include:
(A) A detailed description of the technical performance assessment study protocols (e.g., regions of interest (ROI) localization study) and results used to assess the device output(s) (e.g., image overlays, image heatmaps, etc.);
(B) The training dataset must include cases representing different pre-analytical variables representative of the conditions likely to be encountered when used as intended (e.g., fixation type and time, histology slide processing techniques, challenging diagnostic cases, multiple sites, patient demographics, etc.);
(C) The number of WSI in an independent validation dataset must be appropriate to demonstrate device accuracy in detecting and localizing ROIs on scanned WSI, and must include subsets clinically relevant to the intended use of the device;
(D) Emergency recovery/backup functions, which must be included in the device design;
(E) System level architecture diagram with a matrix to depict the communication endpoints, communication protocols, and security protections for the device and its supportive systems, including any products or services that are included in the communication pathway; and
(F) A risk management plan, including a justification of how the cybersecurity vulnerabilities of third-party software and services are reduced by the device's risk management mitigations in order to address cybersecurity risks associated with key device functionality (such as loss of image, altered metadata, corrupted image data, degraded image quality, etc.). The risk management plan must also include how the device will be maintained on its intended platform (e.g., a general purpose computing platform, virtual machine, middleware, cloud-based computing services, medical device hardware, etc.), which includes how the software integrity will be maintained, how the software will be authenticated on the platform, how any reliance on the platform will be managed in order to facilitate implementation of cybersecurity controls (such as user authentication, communication encryption and authentication, etc.), and how the device will be protected when the underlying platform is not updated, such that the specific risks of the device are addressed (such as loss of image, altered metadata, corrupted image data, degraded image quality, etc.).
(ii) Data demonstrating acceptable, as determined by FDA, analytical device performance, by conducting analytical studies. For each analytical study, relevant details must be documented (e.g., the origin of the study slides and images, reader/annotator qualifications, method of annotation, location of the study site(s), challenging diagnoses, etc.). The analytical studies must include:
(A) Bench testing or technical testing to assess device output, such as localization of ROIs within a pre-specified threshold. Samples must be representative of the entire spectrum of challenging cases likely to be encountered when the device is used as intended; and
(B) Data from a precision study that demonstrates device performance when used with multiple input devices (e.g., WSI scanners) to assess total variability across operators, within-scanner, between-scanner and between-site, using clinical specimens with defined, clinically relevant, and challenging characteristics likely to be encountered when the device is used as intended. Samples must be representative of the entire spectrum of challenging cases likely to be encountered when the device is used as intended. Precision, including performance of the device and reproducibility, must be assessed by agreement between replicates.
(iii) Data demonstrating acceptable, as determined by FDA, clinical validation must be demonstrated by conducting studies with clinical specimens. For each clinical study, relevant details must be documented (e.g., the origin of the study slides and images, reader/annotator qualifications, method of annotation, location of the study site(s) (on-site/remote), challenging diagnoses, etc.). The studies must include:
(A) A study demonstrating the performance by the intended users with and without the software device (e.g., unassisted and device-assisted reading of scanned WSI of pathology slides). The study dataset must contain sufficient numbers of cases from relevant cohorts that are representative of the scope of patients likely to be encountered given the intended use of the device (e.g., subsets defined by clinically relevant confounders, challenging diagnoses, subsets with potential biopsy appearance modifiers, concomitant diseases, and subsets defined by image scanning characteristics, etc.) such that the performance estimates and confidence intervals for these individual subsets can be characterized. The performance assessment must be based on appropriate diagnostic accuracy measures (e.g., sensitivity, specificity, predictive value, diagnostic likelihood ratio, etc.).
(B) [Reserved]


EVALUATION OF AUTOMATIC CLASS III DESIGNATION FOR Paige Prostate

DECISION SUMMARY

A. DEN Number:

DEN200080

B. Purpose for Submission:

De Novo request for evaluation of automatic class III designation for the Paige Prostate

C. Measurands:

Not applicable

D. Type of Test:

Software device to identify digital histopathology images of prostate needle biopsies that are suspicious for cancer and to localize a focus with the highest probability for cancer

E. Applicant:

Paige.AI, Inc.

F. Proprietary and Established Names:

Paige Prostate

G. Regulatory Information:

    1. Regulation section:
      21 CFR 864.3750
    2. Classification:
      Class II
    3. Product code:
      QPN
    4. Panel:
      88 - PATHOLOGY


H. Indications for use:

1. Indications for use:

Paige Prostate is a software only device intended to assist pathologists in the detection of foci that are suspicious for cancer during the review of scanned whole slide images (WSI) from prostate needle biopsies prepared from hematoxylin & eosin (H&E) stained formalin-fixed paraffin embedded (FFPE) tissue. After initial diagnostic review of the WSI by the pathologist, if Paige Prostate detects tissue morphology suspicious for cancer, it provides coordinates (X,Y) on a single location on the image with the highest likelihood of having cancer for further review by the pathologist.

Paige Prostate is intended to be used with slide images digitized with Philips Ultra Fast Scanner and visualized with Paige FullFocus WSI viewing software.

Paige Prostate is an adjunctive computer-assisted methodology and its output should not be used as the primary diagnosis. Pathologists should only use Paige Prostate in conjunction with their complete standard of care evaluation of the slide image.

    2. Special conditions for use statement(s):
      For prescription use only

For in vitro diagnostic (IVD) use only

    3. Special instrument requirements:
      Philips IntelliSite Ultra Fast Scanner

FullFocus image viewing software

I. Device Description:

Paige Prostate is an in vitro diagnostic medical device software, derived from a deterministic deep learning system that has been developed with digitized WSIs of H&E stained prostate needle biopsy slides.

Paige Prostate utilizes several accessory devices as shown in Figure 1 below, for automated ingestion of the input. The device identifies areas suspicious for cancer on the input WSIs. For each input WSI, Paige Prostate automatically analyzes the WSI and outputs the following:

  • Binary classification of suspicious or not suspicious for cancer, based on a pre-defined threshold on the neural network output.
  • If the slide is classified as suspicious for cancer, a single coordinate (X,Y) of the location with the highest probability of cancer on the image.


  • If the slide is classified as not suspicious for cancer, no additional output is provided by Paige Prostate. The Paige FullFocus WSI viewer will display "Not Suspicious for Cancer - Area of Interest Not Available".

[Figure 1 image: diagram of the Paige ecosystem showing the digital pathology scanner, data storage, Paige Prostate, and pathology viewing software, with TLS-encrypted connections between the data storage, the scanner, and Paige Prostate.]

Figure 1: Dataflow and Input/Output Devices for Paige Prostate: (Lock icon refers to the transport layer security (TLS) encryption used for all data transfer between services within the Paige Ecosystem. Data storage is encrypted at rest as indicated by the locked green storage icon).

[Figure 2 image: diagram of the pathologist's review process for digital prostate needle biopsy images for one patient: the pathologist reviews the images and renders a malignant/benign assessment, re-reviews the slide with Paige Prostate if needed and determines next steps, then renders a report per the current standard of care.]

Figure 2: Paige Prostate Pathologist Workflow

Algorithm development: Paige Prostate algorithm development was performed on training, tuning, and test datasets. Each dataset contained slides from unique patients, ensuring that training, tuning, and test datasets do not have any slides, cases, or patients in common. De-identified slides were labeled as benign or cancer based on the synoptic diagnostic pathology report. These datasets were completely independent from the validation dataset.

Algorithm Development

| Training Dataset | Tuning Dataset | Test Datasets |
|---|---|---|
| De-identified slides from cases prepared and diagnosed at internal site located in US, from 2013-2017, scanned with an Aperio Leica AT2 scanner | Slides prepared and diagnosed at internal site*, scanned with an Aperio Leica AT2 scanner | Same as tuning dataset, but scanned on Philips PIPS scanner |
| Number of slide images: 33,543 | Number of slide images: 5,598 | Number of slide images: 5,598 |
| | | Slides prepared at external sites but diagnosed at internal site located in US, scanned with an Aperio Leica AT2 scanner |
| | | Number of slide images: 10,605 |

Table 1: Dataset Split for Training, Tuning and Test Sets

Table 2: Distribution of slide images by race in algorithm development

| Race | Training Dataset | Tuning Dataset | Test Dataset |
|---|---|---|---|
| White | 27576 (82.21%) | 7394 (82.33%) | 8313 (78.45%) |
| Black or African American | 2704 (8.06%) | 669 (7.45%) | 957 (9.03%) |
| Native American or American Indian | 14 (0.04%) | 14 (0.16%) | 18 (0.17%) |
| Native Hawaiian or Pacific Islander | 0 (0.00%) | 0 (0.00%) | 2 (0.02%) |
| Asian-Far East/Indian Subcontinent | 1027 (3.06%) | 289 (3.22%) | 383 (3.61%) |
| Other | 511 (1.52%) | 171 (1.90%) | 213 (2.01%) |
| Unknown race | 1711 (5.10%) | 444 (4.94%) | 719 (6.78%) |

J. Standard/Guidance Document Referenced:

  • Guidance for the Content of Premarket Submissions for Software Contained in Medical Devices; May 11, 2005
  • CLSI document EP12-A2: User Protocol for Evaluation of Qualitative Test Performance; Approved Guideline - Second Edition, 2008
  • Content of Premarket Submissions for Management of Cybersecurity in Medical Devices; October 2, 2014
  • The 510(k) Program: Evaluating Substantial Equivalence in Premarket Notifications [510(k)]; July 2014
  • Guidance for Industry and FDA Staff: De Novo Classification Process (Evaluation of Automatic Class III Designation); October 30, 2017
  • Acceptance of Clinical Data to Support Medical Device Applications and Submissions: Frequently Asked Questions; February 2018
  • Guidance for Industry and Food and Drug Administration Staff: Factors to Consider When Making Benefit-Risk Determinations in Medical Device Premarket Approval and De Novo Classifications; August 30, 2019
  • Guidance for Off-the-Shelf Software Use in Medical Devices; September 2019

K. Test Principle:

Paige Prostate is operated as follows:

    1. Scanned digital images of prostate needle biopsies are acquired using the designated digital pathology scanner. Image and other related quality control steps are performed per the scanner instructions for use and any additional user site specifications. The scanned digital images are immediately processed by Paige Prostate in the background.
    2. The pathologist selects a patient case and opens the whole slide image for review in the designated digital pathology viewing software.
    3. After the pathologist has fully reviewed all areas on the digital image of a prostate core biopsy slide and has decided upon a diagnosis of "cancer", "no cancer", or "defer", the pathologist must activate Paige Prostate to view its output.
    4. If Paige Prostate detects a region on the digital slide suggestive of carcinoma, it identifies the region with the greatest likelihood of being cancer and overlays a mark on that region indicated by its coordinate (X,Y). This is a statistical determination and is not linked to other clinical assessments, such as Gleason score.
    5. The pathologist can toggle Paige Prostate outputs on and off to allow unobstructed re-examination of any suspicious regions.
    6. If the pathologist has already recognized cancer on the slide, no additional action is required. If the pathologist has indicated a diagnosis of "no cancer" or "defer" and the algorithm indicates a region suspicious for cancer, the pathologist is prompted to re-examine that slide image, focusing initially on the region indicated by the algorithm (this rule is sketched in code after the list).
    7. If the pathologist determines that the histologic findings warrant a change in diagnosis from "no cancer" to "cancer" or "defer", or from "defer" to "cancer", they then modify the original diagnosis to reflect the additional findings.
    8. The final diagnosis of cancer is made by the pathologist based upon the histologic findings and should not be based solely on the algorithm's output.
    9. Pathologists should follow standard of care to obtain any additional stains, other pathologists' opinions, and/or additional information, if needed, to render a final diagnosis.
    10. The Paige Prostate device does not provide assistance with measuring or grading foci of cancer, whether detected initially by the pathologist or recognized after deployment of the algorithm.
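
The prompt rule in steps 6 and 7 can be encoded in a few lines; the sketch below is a hypothetical rendering with invented names, not the device's code:

```python
def next_step(pathologist_call: str, algorithm_suspicious: bool) -> str:
    """Map the pathologist's initial call plus the algorithm's flag to an action."""
    if pathologist_call == "cancer":
        return "no additional action required"           # step 6, first case
    if algorithm_suspicious:                             # call was "no cancer" or "defer"
        return "re-examine the slide, starting at the flagged (X, Y) region"
    return "proceed per standard of care"                # steps 8-9
```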

The clinical workflow per prostate biopsy slide (WSI) is shown in Figure 3 below.


[Figure 3 image: workflow diagram: the pathologist reviews an image; Paige Prostate identifies (or does not identify) a focus of interest; the pathologist reviews the slide and determines next steps, then characterizes the findings and renders a report.]

Figure 3: Clinical Workflow per Slide

L. Software:

The Paige Prostate device was identified to have a moderate level of concern as described in the FDA guidance document "Guidance for the Content of Premarket Submissions for Software Contained in Medical Devices." (May 11, 2005).

  a. Software Description: Paige.AI provided a general description of the features in the software documentation and in the device description. The description of the software is consistent with the device functionality described in the device description.
  b. Device Hazard Analysis: Paige.AI provided separate analyses of the device and cybersecurity concerns. The content of the hazard analysis is sufficient and assesses pre- and post-mitigation risks. The device hazard analysis includes:
    • identification of the hazard
    • cause of the hazard (hazardous situation)
    • probability of the hazard
    • severity of the hazard
    • method of control or mitigation
    • corrective measures taken, including an explanation of the aspects of the device design/requirements that eliminate, reduce, or warn of a hazardous event, with verification of the control implementation traceable through the enumerated traceability matrix.
  c. Software Requirement Specifications (SRS): The SRS includes user, engineering, algorithmic, cybersecurity, and various other types of requirements that give a full description of the functionality of the device. The SRS is consistent with the device description and software description.


  d. Architecture Design Chart: Paige.AI provided the software overview and included flow diagrams representative of process flow for various features of the Paige Prostate device.
  e. Software Design Specification (SDS): The SDS is traceable to the SRS, demonstrates how individual requirements are implemented in the software design, and includes appropriate linkages to predefined verification testing.
  f. Traceability Analysis/Matrix: Paige.AI provided traceability between all documents, including the SRS, SDS, and subsequent verification and validation. Hazard mitigations are traceable throughout all documents.
  g. Software Development Environment: Paige outlined the software development environment and the processes/procedures used for medical device software development. The content is consistent with expected quality system norms.
  h. Verification and Validation Testing: The validation and system-level verification procedures are based upon the requirements, with clearly defined test procedures and pass/fail criteria. All tests passed. Unit-level test procedures, actual results, and expected results are included for all design specifications.
  i. Revision Level History: Version (v) 2.1.501 was released prior to its use in all the performance studies, including the analytical (standalone and precision) studies and the clinical reader study. Software v2.1.501 will remain locked for use with the authorized device and will not be continually trained and improved with each cohort analyzed in clinical practice after marketing authorization.
  j. Unresolved Anomalies: All identified anomalies were resolved prior to verification and validation of the software. There are no unresolved anomalies.
  k. Cybersecurity: The cybersecurity documentation is consistent with the recommendations for information that should be included in premarket submissions outlined in the FDA guidance document "Content of Premarket Submissions for Management of Cybersecurity in Medical Devices: Guidance for Industry and Food and Drug Administration Staff" (issued October 2, 2014). Information related to cybersecurity reviewed included:
    • hazard analysis related to cybersecurity risks,
    • traceability documentation linking cybersecurity controls to risks considered,
    • a summary plan for validating software updates and patches throughout the lifecycle of the medical device,
    • a summary describing controls in place to ensure that the medical device will maintain its integrity, and
    • device instructions for use and product specifications related to recommended cybersecurity controls appropriate for the intended use of the device.


M. Performance Characteristics

1. Analytical Performance

The sponsor provided data from the following two studies to support the analytical performance of the device:

  a. Algorithm Localization (X,Y Coordinate) and Accuracy Study
  b. Precision Study

a. Algorithm Localization (X,Y Coordinate) and Accuracy Study:

The performance of Paige Prostate in identifying digital histopathology images of prostate needle biopsies that are suspicious for cancer and localizing one specific focus ((X,Y) coordinate) with the highest suspicion for cancer was evaluated. The (X,Y) coordinates identified by Paige Prostate were evaluated against manual annotations of regions drawn by 3 study pathologists who were blinded to the Paige Prostate results. These study pathologists did not participate in the clinical reader study.

The study sample set originally consisted of 847 scanned digital WSIs of prostate needle biopsy slides (353 cancer and 494 benign) stained with hematoxylin and eosin (H&E). The scanned images were obtained using the previously FDA-cleared Philips UFS scanner. Out of the 847 WSIs, 42 WSIs with cancer and 77 benign WSIs did not represent unique patients, i.e., they were WSIs from multiple different cases but from the same patient. In order to avoid any bias due to case-level overlap in slides, only unique patient-level cases were used in the data analysis, i.e., all slides were unique at the patient level compared to the development dataset. Therefore, the final sample set consisted of 728 WSIs (311 WSIs from cancer slides and 417 WSIs from benign slides). Three study pathologists annotated the image crops as described below in the localization assessment procedure section.

The distribution of the slide images by diagnosis, source of slides and race is provided in Table 3 below.


| Characteristic | Cancer (N=311) | Benign (N=417) |
|---|---|---|
| Case Category | | |
| ASAP^a | 11 (3.5%) | NA |
| Atrophy Present | 0 (0.0%) | 17 (4.1%) |
| High-Grade Prostate Intraepithelial Neoplasia (PIN) Present | See Cancer Category below | 20 (4.8%) |
| Treated: tissue with treatment related changes | 2 (0.6%) | 14 (3.4%) |
| Cancer: tumor size larger than 0.5mm^b | 153 (49.2%) | NA |
| PIN Present | 8 (2.6%) | NA |
| Cancer: tumor size equal or less than 0.5mm^c | 147 (47.3%) | NA |
| PIN Present | 6 (1.9%) | NA |
| Benign (without Atrophy, PIN and Treated) | NA | 366 (87.8%) |
| Source of Slides | | |
| Internal Site* | 136 (43.7%) | 183 (43.9%) |
| External Sites** | 175 (56.3%) | 234 (56.1%) |
| Race | | |
| Asian-Far East/Indian Subcontinent | 11 (3.5%) | 11 (2.6%) |
| Black or African American | 26 (8.4%) | 32 (7.7%) |
| Native Hawaiian or Pacific Islander | 1 (0.3%) | 0 (0.0%) |
| White | 251 (80.7%) | 347 (83.2%) |
| Other | 13 (4.2%) | 8 (1.9%) |
| Unknown race | 9 (2.9%) | 19 (4.6%) |

Table 3: Distribution of case categories in algorithm localization and accuracy study

^a Atypical small acinar proliferation (ASAP) represents suspicious glands without adequate histologic atypia for a definitive diagnosis of prostate adenocarcinoma. However, they were included in the "cancer" category.

^b Consecutive tumors

^c Challenging tumors with minimal tumor burden

*Internal site located in US

** External sites include 217 different sites located throughout the world (including US)

NA: Not Applicable

The study set consisted of deidentified WSIs from:

  • Consecutive prostate cancer slides from the internal site
  • Challenging cancer slides (slides with ≤0.5mm tumor) from the internal site
  • Consecutive cancer slides submitted from external sites
  • Challenging cancer slides submitted from external sites
  • Benign slides from consecutive prostate biopsy cases from the internal site
  • Consecutive benign slides submitted from external sites

Slides submitted from external sites refer to a prostate biopsy case (slides) prepared by an external site and submitted to the internal site for expert consultation purposes; these were subsequently read by the internal site pathologists.

For consecutive cancer cases, one slide with minimal tumor volume was selected per case per patient. For challenging cancer cases, slides with ≤0.5mm tumor were selected.

4 https://www.cms.gov/medicareprovider-enrollment-and-certificationseninfopolicy-and-memosstates-and/clinical-laboratory-improvement-amendments-clia-laboratory-guidance-during-covid-19-public-health


Results were displayed and pathologists conducted their assessments with an FDA-cleared whole slide image viewer (FullFocus™).

Each pathologist performed the following procedural steps on an individual basis.

  a. Pathologists were trained to use the FDA-cleared digital pathology image review system and the Paige Prostate device.
  b. WSIs of scanned prostate biopsy slides were displayed on an FDA-cleared monitor to each pathologist one at a time in a randomized order.
  c. The pathologists completed an unassisted read directly followed by an assisted read for every WSI.
    • Unassisted Read: The pathologists reviewed the image, without Paige Prostate assistance, with the FDA-cleared pathology viewer.
    • Assisted Read: The pathologists reviewed the image with the Paige Prostate result coordinate (X,Y) overlaid on the same image. The result included:
      • The Paige Prostate slide-level binary classification: suspicious for cancer or not suspicious for cancer.
      • Coordinate (X,Y): If the slide was predicted to be suspicious for cancer, the algorithm identifies a coordinate (X,Y) of the region on the slide as having the highest likelihood of harboring cancer.
  d. Pathologists were instructed that they could choose to "defer for more information" during the study if they were unable to render a definitive diagnosis as either "cancer" or "no cancer."
  e. Classifications for each image were made without information from immunohistochemistry (IHC) stains. The pathologists performed a complete review of each WSI and recorded their classifications directly into the electronic database case report form (CRF):
    (i) The pathologists classified each slide as:
      • cancer,
      • no cancer, or
      • defer for more information.
    (ii) For the deferral classification, the pathologists selected why they would defer from the following options, all of which are methods currently used in clinical practice when a pathologist is not able to determine a diagnosis from an H&E slide:
      • Additional stains
      • Additional levels
      • Seek another opinion
      • Other: If the pathologists selected "Other" they would elaborate via a free text box in the CRF.

Sixteen pathologists completed an unassisted read directly followed by an assisted read with Paige Prostate for every image.

Study Sample Characteristics:

  a. The sample set originally consisted of 610 whole slide images of prostate needle biopsy slides (190 cancer and 420 benign) stained with hematoxylin and eosin (H&E) that were scanned using a single unit of the FDA-cleared Philips Ultra Fast Scanner, with the Philips Image Management System (IMS) used to upload the scanned images. Out of the 610 WSIs, 19 WSIs from prostate cancer slides and 64 WSIs from benign slides did not represent unique patients, i.e., the WSIs were from multiple different cases but from the same patient. To avoid any bias due to case-level overlap in slides, only unique patient-level cases were considered for the final data analysis. Thus, the final sample set consisted of 527 WSIs: 171 prostate cancer slides and 356 benign slides from prostate biopsies.
  b. Out of the 527 WSIs, 44.15% of the images were from cases prepared, reviewed, diagnosed, and digitized at the internal site, and 55.85% of the images were from cases prepared at 156 different external sites but reviewed, diagnosed, and digitized at the internal site.
  c. No slides used during development of Paige Prostate were used for this study.
  d. The dataset was enriched with 50% challenging cancer slides, defined as slides with minimal tumor burden. Challenging cancer cases contained at least one slide with less than or equal to 0.5mm tumor; one slide with minimal tumor was selected.
  e. Benign parts could come from a case that includes cancer parts, as long as the selected part represented a unique patient in the dataset. Benign parts could also come from a case without any cancer parts.
  f. Slide-level cancer/benign ground truths were determined by reviewing the original diagnostic synoptic reports.

| Characteristic | Cancer (N=171) | Benign (N=356) |
|---|---|---|
| Case Category | | |
| ASAP^a | 8 (4.7%) | NA |
| Atrophy Present | 0 (0.0%) | 3 (0.8%) |
| High-Grade PIN Present | See Cancer Category below | 18 (5.1%) |
| Cancer: tumor size larger than 0.5mm^b | 84 (49.1%) | NA |
| PIN Present | 4 (2.3%) | NA |
| Cancer: tumor size equal or less than 0.5mm^c | 79 (45.6%) | NA |
| PIN Present | 4 (2.3%) | NA |
| Treated: tissue with treatment related changes | 1 (0.6%) | 12 (3.4%) |
| Benign (without Atrophy, PIN and Treated) | NA | 323 (90.7%) |
| Source of Slides | | |
| Internal site* | 77 (45.0%) | 154 (43.3%) |
| External sites** | 94 (55.0%) | 202 (56.7%) |
| Race | | |
| Asian-Far East/Indian Subcontinent | 8 (4.7%) | 11 (3.1%) |
| Black or African American | 14 (8.2%) | 28 (7.9%) |
| Native Hawaiian or Pacific Islander | 1 (0.6%) | 0 (0.0%) |
| White | 135 (78.9%) | 302 (84.8%) |
| Other | 10 (5.8%) | 7 (2.0%) |
| Unknown race | 3 (1.8%) | 8 (2.2%) |

Table 14: Distribution of case categories in clinical study

^a Atypical small acinar proliferation (ASAP) represents suspicious glands without adequate histologic atypia for a definitive diagnosis of prostate adenocarcinoma

^b Consecutive tumors

^c Challenging tumors with minimal tumor burden

*Internal site located in US
** External sites include 156 different sites

NA: Not Applicable

Exclusion Criteria:

Slides from the following categories were excluded:

  a. Any slide used during development (algorithm training, tuning, and testing) of the Paige Prostate device.
  b. Any slide with scanning quality control issues, as determined by a pathologist, such as blur/out-of-focus areas, folded tissue, scanning artifacts, or other artifacts that compromised the ability to interpret the findings on the tissue.
  c. Any slide that was not H&E-stained.

Pathologist Qualifications:

US board-certified anatomic pathologists from six sites outside of the internal site, including community, academic, and private practices, were included in the study. The study included 14 general pathologists with greater than one year of experience and 2 subspecialized genitourinary pathologists. Study pathologists underwent training for the FDA-cleared digital pathology system.

Clinical Performance Measures:

Diagnoses of 'deferred' or 'cancer' were considered 'positive' and a diagnosis of 'benign' was considered 'negative'. Sensitivity and specificity, along with 95% confidence intervals, were provided.
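
The reported per-pathologist confidence intervals use a score method. As an illustration (a minimal sketch; the specific score method is an assumption), a Wilson score interval reproduces the standalone study's reported intervals from Section M.1.a:

```python
from math import sqrt

def wilson_ci(successes: int, n: int, z: float = 1.96):
    """Wilson score interval, one common 'score method' for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

# Standalone study counts: 294/311 cancer WSIs detected, 392/417 benign correct.
print(wilson_ci(294, 311))  # ≈ (0.914, 0.966), matching the reported 91.4%; 96.6%
print(wilson_ci(392, 417))  # ≈ (0.913, 0.959), matching the reported 91.3%; 95.9%
```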

Study Results:

The clinical study demonstrated improvements in sensitivity and small differences in specificities between assisted and unassisted reads.


| Pathologist | Specialty | Setting | Sensitivity (N=171 Cancer): Assisted % (n) | Sensitivity: Unassisted % (n) | Sensitivity: Difference (95% CI) | Specificity (N=356 Benign): Assisted % (n) | Specificity: Unassisted % (n) | Specificity: Difference (95% CI) |
|---|---|---|---|---|---|---|---|---|
| 1 | Generalist | Remote | 95.9% (164) | 86.5% (148) | 9.4% (4.7%; 14.8%) | 94.1% (335) | 93.8% (334) | 0.3% (-2.3%; 2.9%) |
| 2 | Generalist | Remote | 98.8% (169) | 89.5% (153) | 9.4% (5.1%; 14.7%) | 91.9% (327) | 92.4% (329) | -0.6% (-2.9%; 1.7%) |
| 3 | Generalist | Remote | 98.2% (168) | 95.3% (163) | 2.9% (-0.5%; 7.0%) | 92.7% (330) | 90.4% (322) | 2.2% (-0.2%; 4.9%) |
| 4 | Generalist | Remote | 93.6% (160) | 87.7% (150) | 5.8% (2.2%; 10.3%) | 77.2% (275) | 78.4% (279) | -1.1% (-2.5%; 0.2%) |
| 5 | Generalist | On-Site | 97.1% (166) | 93.6% (160) | 3.5% (-0.8%; 8.2%) | 96.1% (342) | 94.9% (338) | 1.1% (-0.7%; 3.1%) |
| 6 | Generalist | On-Site | 98.2% (168) | 93.6% (160) | 4.7% (1.4%; 9.0%) | 88.8% (316) | 88.2% (314) | 0.6% (-1.6%; 2.7%) |
| 7 | Generalist | On-Site | 97.7% (167) | 86.0% (147) | 11.7% (6.7%; 17.6%) | 91.0% (324) | 89.6% (319) | 1.4% (-1.3%; 4.2%) |
| 8 | Generalist | Remote | 97.1% (166) | 73.1% (125) | 24.0% (17.4%; 31.0%) | 94.9% (338) | 81.2% (289) | 13.8% (9.5%; 18.2%) |
| 9 | Generalist | Remote | 94.2% (161) | 74.9% (128) | 19.3% (13.5%; 25.7%) | 82.3% (293) | 81.2% (289) | 1.1% (-1.2%; 3.5%) |
| 10 | Generalist | Remote | 99.4% (170) | 96.5% (165) | 2.9% (0.0%; 6.8%) | 83.1% (296) | 83.4% (297) | -0.3% (-2.0%; 1.4%) |
| 11 | Generalist | Remote | 96.5% (165) | 90.6% (155) | 5.8% (1.9%; 10.7%) | 91.6% (326) | 89.3% (318) | 2.2% (-0.2%; 4.9%) |
| 12 | Generalist | Remote | 98.2% (168) | 95.9% (164) | 2.3% (-1.0%; 6.3%) | 87.4% (311) | 88.5% (315) | -1.1% (-2.7%; 0.3%) |
| 13 | Generalist | Remote | 98.2% (168) | 92.4% (158) | 5.8% (2.3%; 10.5%) | 81.5% (290) | 82.9% (295) | -1.4% (-3.3%; 0.4%) |
| 14 | Generalist | Remote | 95.9% (164) | 91.8% (157) | 4.1% (0.4%; 8.4%) | 90.4% (322) | 89.3% (318) | 1.1% (-1.6%; 3.9%) |
| 15 | Specialist | Remote | 94.2% (161) | 93.6% (160) | 0.6% (-2.3%; 3.6%) | 94.9% (338) | 95.8% (341) | -0.8% (-2.4%; 0.5%) |
| 16 | Specialist | Remote | 95.9% (164) | 90.6% (155) | 5.3% (1.4%; 9.9%) | 94.1% (335) | 95.8% (341) | -1.7% (-3.9%; 0.2%) |
| Combined | Generalist or Specialist | On-site or Remote | 96.8% (165) | 89.5% (153.0) | 7.3% (3.9%; 11.4%) | 89.5% (318) | 88.4% (314) | 1.1% (-0.7%; 3.4%) |

Table 15: Summary of sensitivity and specificity by pathologist (specialist and generalist) and location (on-site and remote) with positive=cancer or defer and negative=benign


*Confidence intervals for differences in sensitivities are calculated by a score method for an individual pathologist and by bootstrap for combined data (averaged over all pathologists).

Details about the reduction in false negative slides and the difference in false positive slides on average (16 pathologists) are presented in Tables 16 and 17, respectively.

Table 16: Assessment of cancer slides (171 slides) on average (16 pathologists)

| Classification for assisted read | Unassisted: Cancer | Unassisted: Deferred | Unassisted: No Cancer | Total |
|---|---|---|---|---|
| Cancer | 128.56 (75.2%) | 5.75 (3.4%) | 2.75 (1.6%) | 137.06 (80.2%) |
| Deferred | 0.44 (0.3%) | 17.50 (10.2%) | 10.56 (6.2%) | 28.50 (16.67%) |
| No cancer | 0.25 (0.1%) | 0.50 (0.3%) | 4.69 (2.7%) | 5.44 (3.18%) |
| Total | 129.25 (75.58%) | 23.75 (13.89%) | 18.00 (10.53%) | 171 (100%) |

The diagonal cells are the numbers of slide images with the same classification in assisted and unassisted reads. The values 2.75 (1.6%) and 10.56 (6.2%) represent a reduction in the number of false negative results for the cancer slide images because of use of the Paige Prostate device. The values 0.25 (0.1%) and 0.50 (0.3%) represent an increase in the number of false negative results, because these cancer slide images were classified "No cancer" in assisted reads but "Cancer" or "Deferred" in unassisted reads. The overall reduction in the number of false negative slides was 12.56 slides [=(2.75+10.56)-(0.25+0.50)], which is 7.34% (=12.56/171). This reduction of 7.3% (95% CI: 3.9%; 11.4%) was statistically significant.


Table 17: Assessment of benign slides (356 slides) on average (16 pathologists)

| Classification for assisted read | Unassisted: Cancer | Unassisted: Deferred | Unassisted: No Cancer | Total |
|---|---|---|---|---|
| Cancer | 4.31 (1.2%) | 1.12 (0.3%) | 0.81 (0.2%) | 6.25 (1.76%) |
| Deferred | 3.19 (0.9%) | 22.75 (6.4%) | 5.19 (1.5%) | 31.12 (8.74%) |
| No cancer | 0.69 (0.2%) | 9.06 (2.5%) | 308.87 (86.8%) | 318.62 (89.5%) |
| Total | 8.19 (2.3%) | 32.94 (9.2%) | 314.87 (88.45%) | 356 (100%) |

The diagonal cells are the numbers of slide images with the same classification in assisted and unassisted reads. The values 0.69 (0.2%) and 9.06 (2.5%) represent a reduction in the number of false positive results for the benign slide images because of use of the Paige Prostate device. The values 0.81 (0.2%) and 5.19 (1.5%) represent an increase in the number of false positive results, because these benign slide images were classified "Cancer" (0.2%) or "Deferred" (1.5%) in assisted reads but "No Cancer" in unassisted reads. The overall difference in the number of false positive slide images was 3.75 slides [=(0.69+9.06)-(0.81+5.19)], which is 1.05% (=3.75/356). This difference of 1.1% (95% CI: -0.7%; 3.4%) was not statistically significant.
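
The net-change bookkeeping in Tables 16 and 17 can be made explicit in a few lines (counts copied from the tables above; illustrative arithmetic only, not the study's analysis code):

```python
# Table 16 (cancer slides, N=171), average counts per pathologist.
fn_fixed  = 2.75 + 10.56  # unassisted "No cancer" -> assisted "Cancer"/"Deferred"
fn_gained = 0.25 + 0.50   # unassisted "Cancer"/"Deferred" -> assisted "No cancer"
net_fn_reduction = fn_fixed - fn_gained
print(net_fn_reduction, net_fn_reduction / 171)  # 12.56 slides, ~7.34%

# Table 17 (benign slides, N=356), average counts per pathologist.
fp_fixed  = 0.69 + 9.06   # unassisted "Cancer"/"Deferred" -> assisted "No cancer"
fp_gained = 0.81 + 5.19   # unassisted "No cancer" -> assisted "Cancer"/"Deferred"
net_fp_change = fp_fixed - fp_gained
print(net_fp_change, net_fp_change / 356)        # 3.75 slides, ~1.05%
```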

Analysis of sensitivity and specificity by pathologist specialty and location is presented in Table 18 and Figure 4 depicts the improvement in sensitivities between assisted and unassisted reads.

| Pathologist Specialty | Pathologist Setting | Number of Pathologists | Improvement in Sensitivity: Average | Improvement in Sensitivity: Median | Improvement in Sensitivity: Range | Difference in Specificity: Average | Difference in Specificity: Median | Difference in Specificity: Range |
|---|---|---|---|---|---|---|---|---|
| Generalist | Remote | 11 | 8.3% | 5.8% | (2.3%; 24.0%) | 1.5% | 0.3% | (-1.4%; 13.8%) |
| Generalist | On-site | 3 | 6.6% | 4.7% | (3.5%; 11.7%) | 1.0% | 1.1% | (0.6%; 1.4%) |
| Specialist | Remote | 2 | 2.9% | 2.9% | (0.6%; 5.3%) | -1.3% | -1.3% | (-1.7%; -0.8%) |
| Generalist | Remote or On-site | 14 | 8.0% | 5.8% | (2.3%; 24.0%) | 1.4% | 1.1% | (-1.4%; 13.8%) |
| Specialist | Remote | 2 | 2.9% | 2.9% | (0.6%; 5.3%) | -1.3% | -1.3% | (-1.7%; -0.8%) |
| Combined | Generalist or Specialist | 16 | 7.3% | 5.6% | (0.6%; 24.0%) | 1.1% | 0.4% | (-1.7%; 13.8%) |

Table 18: Analysis of sensitivity and specificity by pathologist (specialist and generalist) and location (on-site and remote)

For the combined data, the average improvement in sensitivity was 7.3% (95% CI: 3.9%; 11.4%) (statistically significant), with a median of 5.6% and a range from 0.6% to 24.0%. The average difference in specificity was 1.1% (95% CI: -0.7%; 3.4%) (not statistically significant), with a median of 0.4% and a range from -1.7% to 13.8%.

[Figure 4 image: scatter plot of the difference between assisted and unassisted sensitivities (y-axis, 0 to 30 percentage points) versus unassisted sensitivity (x-axis, 70.0% to 100.0%); the improvement generally decreases as unassisted sensitivity increases.]

Figure 4: Improvement in assisted sensitivities vs Unassisted Sensitivity: Points in orange represent the improvement in sensitivities among specialists and points in blue represent the improvement in sensitivities among generalists.

It should be noted that:

· The pathologists' reviews in the clinical study were based on an initial interpretation of the slide images. In clinical practice, additional special studies are performed when there is any doubt about the diagnosis. In the clinical study, the initial interpretations were used and special studies were not permitted, because an objective of the clinical study was to evaluate the improvement in accuracy of reading the prostate slide images using Paige Prostate.

· The study analysis was on a per-biopsy basis, not on a per-patient basis. In a typical patient undergoing prostatic needle biopsy for evaluation of possible cancer, multiple core biopsies are obtained (often 12-14 biopsies), and many patients with prostate cancer have cancer in multiple biopsies.

Based on these two constraints of the clinical study, the expected benefit of the use of the Paige device on the final diagnosis in practice would likely be substantially lower than 7.3% when evaluated on a per-patient basis.

N. Labeling

The labeling supports the decision to grant the De Novo request for this device.


O. Patient perspectives

This submission did not include specific information on patient perspectives for this device.

P. Identified Risks to Health and Identified Mitigations

| Identified Risks to Health | Mitigation Measures |
|---|---|
| False negative classification (loss of accuracy) | Certain design verification and validation, including certain device descriptions, certain analytical studies, and clinical studies. Certain labeling information, including certain device descriptions, certain performance information, and certain limitations. |
| False positive classification (loss of accuracy) | Certain design verification and validation, including certain device descriptions, certain analytical studies, and clinical studies. Certain labeling information, including certain device descriptions, certain performance information, and certain limitations. |

Q. Benefit-Risk Determination

Summary of Benefits

The use of this device for the proposed intended use (IU) population according to the proposed instructions for use is expected to benefit a small proportion of men who have undergone prostate biopsy in receiving a correct pathologic diagnosis of that biopsy. Although standard of care is expected to yield the correct diagnosis in the vast majority of such biopsies, there appears to be a small proportion of cases for which a small focus of carcinoma may be overlooked and subsequently identified with the use of the device. In the pivotal clinical study, for 171 slides with cancer, the change from "unassisted benign" to "assisted defer" was 6.2%, and the change from "unassisted benign" to "assisted cancer" was 1.6%. Also, for 171 slides with cancer, the change from "unassisted defer" to "assisted benign" was 0.3% and the change from "unassisted cancer" to "assisted benign" was 0.1%. Therefore, in 7.3% of individual biopsy specimens with cancer, there is expected patient benefit in terms of an improvement in sensitivity ((6.2% + 1.6%) - (0.3% + 0.1%)). On average, the improvement in specificity was 1.1% (specificity assisted = 89.50% minus specificity unassisted = 88.45%). It should be noted that this analysis is on a per-biopsy basis, not on a per-patient basis. Since in a typical patient undergoing prostatic needle biopsy for evaluation of possible cancer multiple core biopsies are obtained, and many patients with prostate cancer have cancer in multiple biopsies, this expected benefit in practice would likely be substantially lower than 7.3% when evaluated on a per-patient basis. There is also some limited expected benefit in terms of time savings for the pathologist reviewing these biopsies.


Summary of Risks

The risk of use of this device for the proposed IU population according to the proposed instructions for use is the loss of accuracy leading to an incorrect diagnosis (false positive or false negative). Incorrect diagnosis is clearly harmful. This could be in the form of an incorrect diagnosis of cancer for which the patient may receive unnecessary treatment and psychologically harmful misinformation. An incorrect rendering of a benign diagnosis would likely cause a delay in the treatment of a cancer and would likely in some cases lead to increased morbidity and mortality.

Benefit/Risk Conclusion

Paige Prostate appears to provide a reasonable assurance of safety and effectiveness for diagnostic use by its intended users after taking into consideration the special controls. The clinical and analytical studies have shown that the risk of accuracy loss resulting in a false positive or false negative diagnosis is minimal relative to the patient safety benefits, including new findings that would contribute to the correct diagnosis. This is contingent on the device being used according to the approved labeling, particularly that the end user must be fully aware of how to interpret and apply the device output.

The potential for false negative and false positive results is mitigated by special controls. Labeling requirements, which include certain device description information as well as certain limitations, ensure that users will employ all appropriate procedures and safeguards as specified, including use of the device as an adjunct rather than as the sole basis for making the diagnosis. In addition, design verification and validation includes data on software performance as supported by the underlying software design, as well as software algorithm training and validation within the limits of the specified intended use. This also includes analytical validation (including precision studies) and clinical validation (including user validation and performance studies).

The probable clinical benefits outweigh the potential risks when the standard of care is followed by qualified users, and appropriate mitigation of the risks is provided for through implementation of and adherence to the special controls. The combination of the general controls and established special controls support the assertion that the probable benefits outweigh the probable risks.

R. Conclusion

The De Novo request is granted, and the device is classified under the following and subject to the special controls identified in the letter granting the De Novo request:

Product Code: QPN
Device type: Software algorithm device to assist users in digital pathology
Class: II
Regulation: 21 CFR 864.3750