Synapse 3D Base Tools is medical imaging software intended to provide trained medical professionals with tools to aid them in reading, interpreting, reporting, and treatment planning. Synapse 3D Base Tools accepts DICOM compliant medical images acquired from a variety of imaging devices, including CT, MR, CR, US, NM, PT, and XA.
This product is not intended for use with or for the primary diagnostic interpretation of Mammography images. Synapse 3D Base Tools provides several levels of tools to the user:
• Basic imaging tools for general images, including 2D viewing, volume rendering and 3D volume viewing, orthogonal / oblique / curved Multi-Planar Reconstructions (MPR), Maximum (MIP), Average (RaySum), and Minimum (MinIP) Intensity Projection (illustrated in the sketch after this list), 4D volume viewing, image fusion, image subtraction, surface rendering, sector and rectangular shape MPR image viewing, MPR for dental images, creating and displaying multiple MPR images along an object, time-density distribution, basic image processing, noise reduction, CINE, measurements, annotations, reporting, printing, storing, distribution, and general image management and administration tools.
• Tools for regional segmentation of anatomical structures within the image data, path definition through vascular and other tubular structures, and boundary detection.
• Image viewing tools for modality specific images, including CT PET fusion and ADC image viewing for MR studies.
• Imaging tools for CT images, including virtual endoscopic viewing and dual energy image viewing.
• Imaging tools for MR images, including delayed enhancement image viewing and diffusion-weighted image viewing.
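The basic tools listed above include Maximum (MIP), Average (RaySum), and Minimum (MinIP) Intensity Projection. As a purely illustrative sketch (not part of the submission), these projections amount to simple reductions along a viewing axis; the synthetic NumPy volume below stands in for a volume that the software would assemble from a DICOM series.

```python
import numpy as np

# Synthetic CT-like volume (depth, height, width) with Hounsfield-unit-like values;
# a real volume would be reconstructed from DICOM slices.
rng = np.random.default_rng(0)
volume = rng.normal(loc=-500, scale=300, size=(64, 128, 128)).astype(np.float32)

# Project along the depth axis (axis=0), mimicking a projection through an axial stack.
mip = volume.max(axis=0)      # Maximum Intensity Projection (MIP)
minip = volume.min(axis=0)    # Minimum Intensity Projection (MinIP)
raysum = volume.mean(axis=0)  # Average intensity projection (RaySum-style)

print(mip.shape, minip.shape, raysum.shape)  # each (128, 128)
```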
The intended patient population for all applications implemented as base tools is limited to the adult population (over 22 years old).
The 3D image analysis software Synapse 3D Base Tools (V7.0) is a medical application running in a Windows server/client configuration on commercial, general-purpose Windows-compatible computers. It offers software tools that trained professionals can use to interpret medical images obtained from various medical devices, create reports, and develop treatment plans.
The provided text details the FDA 510(k) clearance for Synapse 3D Base Tools (V7.0). It primarily focuses on demonstrating substantial equivalence to a predicate device and includes information on nonclinical and certain clinical performance testing for newly added deep-learning-based organ segmentation features.
Here's an analysis, based on the provided text, of the acceptance criteria and the study that demonstrates the device meets them:
Acceptance Criteria and Reported Device Performance
The core acceptance criterion for this 510(k) submission appears to be demonstrating substantial equivalence to a predicate device (Synapse 3D Base Tools V6.6, K221677) and showing the safety and effectiveness of new features, particularly those using deep learning for automatic or semi-automatic organ extraction.
While the document provides no explicit "acceptance criteria" table with specific thresholds for overall device functionality, the performance section for the deep learning models serves that role for those features. The acceptance criterion for these features is, implicitly, a high Dice Similarity Coefficient (DICE) score, indicating strong agreement between the automated segmentation and the ground truth.
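For reference, DICE compares a predicted mask A with a reference mask B as DICE = 2|A ∩ B| / (|A| + |B|), ranging from 0 (no overlap) to 1 (perfect overlap). The following is a minimal sketch of a per-case DICE computation for binary masks using NumPy; it is a generic illustration, not the manufacturer's implementation, and the toy masks are invented for the example.

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """DICE = 2*|A ∩ B| / (|A| + |B|) for binary masks of the same shape."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    denom = pred.sum() + truth.sum()
    if denom == 0:  # both masks empty: treat as perfect agreement
        return 1.0
    return 2.0 * intersection / denom

# Toy example: two overlapping 2D masks (25 voxels each, 9 voxels overlap).
a = np.zeros((10, 10), dtype=bool); a[2:7, 2:7] = True
b = np.zeros((10, 10), dtype=bool); b[4:9, 4:9] = True
print(round(dice_coefficient(a, b), 3))  # 2*9 / (25 + 25) = 0.36
```

A per-structure figure such as the 0.85 reported below for the duodenum would then be the average of such per-case scores over that structure's test cases.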
Table of Acceptance Criteria and Reported Device Performance (for Deep Learning Segmentation)
Segmented Structure (Modality) | Number of Cases | Acceptance Criteria (Implicit) - High DICE Score | Reported Device Performance (Average DICE) |
---|---|---|---|
Duodenum (CT) | 30 | High DICE | 0.85 |
Stomach (CT) | 30 | High DICE | 0.96 |
Lung section (Left S1S2) (CT) | 30 | High DICE | 0.92 |
Lung section (Left S3) (CT) | 30 | High DICE | 0.88 |
Lung section (Left S4) (CT) | 30 | High DICE | 0.75 |
Lung section (Left S5) (CT) | 30 | High DICE | 0.81 |
Lung section (Left S6) (CT) | 30 | High DICE | 0.9 |
Lung section (Left S8) (CT) | 30 | High DICE | 0.85 |
Lung section (Left S9) (CT) | 30 | High DICE | 0.73 |
Lung section (Left S10) (CT) | 30 | High DICE | 0.87 |
Lung section (Right S1) (CT) | 30 | High DICE | 0.89 |
Lung section (Right S2) (CT) | 30 | High DICE | 0.89 |
Lung section (Right S3) (CT) | 30 | High DICE | 0.91 |
Lung section (Right S4) (CT) | 30 | High DICE | 0.88 |
Lung section (Right S5) (CT) | 30 | High DICE | 0.85 |
Lung section (Right S6) (CT) | 30 | High DICE | 0.9 |
Lung section (Right S7) (CT) | 30 | High DICE | 0.8 |
Lung section (Right S8) (CT) | 30 | High DICE | 0.84 |
Lung section (Right S9) (CT) | 30 | High DICE | 0.71 |
Lung section (Right S10) (CT) | 30 | High DICE | 0.83 |
Pancreas section (Body) (CT) | 29 | High DICE | 0.91 |
Pancreas section (Head) (CT) | 29 | High DICE | 0.95 |
Pancreas section (Tail) (CT) | 29 | High DICE | 0.99 |
Spleen (CT) | 35 | High DICE | 0.95 |
Pancreas duct (CT) | 29 | High DICE | 0.74 |
Pancreas (CT) | 30 | High DICE | 0.86 |
ROI (CT)* | 29 | High DICE | 0.85 |
Liver section (S1) (CT) | 31 | High DICE | 0.99 |
Liver section (S2) (CT) | 31 | High DICE | 0.99 |
Liver section (S3) (CT) | 31 | High DICE | 0.97 |
Liver section (S4) (CT) | 31 | High DICE | 0.97 |
Liver section (S5) (CT) | 31 | High DICE | 0.92 |
Liver section (S6) (CT) | 31 | High DICE | 0.94 |
Liver section (S7) (CT) | 31 | High DICE | 0.98 |
Liver section (S8) (CT) | 31 | High DICE | 0.97 |
Gall bladder (CT) | 37 | High DICE | 0.92 |
Bronchus (CT) | 30 | High DICE | 0.87 |
Lung lobe (Left Lower) (CT) | 30 | High DICE | 0.99 |
Lung lobe (Left Upper) (CT) | 30 | High DICE | 0.99 |
Lung lobe (Right Lower) (CT) | 30 | High DICE | 0.99 |
Lung lobe (Right Middle) (CT) | 30 | High DICE | 0.97 |
Lung lobe (Right Upper) (CT) | 30 | High DICE | 0.99 |
Pulmonary Arteries (CT) | 30 | High DICE | 0.83 |
Pulmonary Veins (CT) | 30 | High DICE | 0.85 |
Pancreas vessel (CT) | 30 | High DICE | 0.9 |
Prostate (MRI) | 30 | High DICE | 0.9 |
Rectal ROI (tumor) (MRI)* | 27 | High DICE | 0.75 |
Ureter (T2) (MRI) | 33 | High DICE | 0.63 |
Bladder (MRI) | 35 | High DICE | 0.93 |
Pelvis (MRI) | 34 | High DICE | 0.94 |
Seminal vesicle (MRI) | 32 | High DICE | 0.7 |
Ureter (T1Dynamic) (MRI) | 33 | High DICE | 0.76 |
Prostate tumor (DWI) (MRI)* | 36 | High DICE | 0.65 |
Prostate tumor (T2) (MRI)* | 39 | High DICE | 0.6 |
Kidney tumor (MRI)* | 31 | High DICE | 0.88 |
Left Kidney (MRI) | 31 | High DICE | 0.97 |
Right Kidney (MRI) | 31 | High DICE | 0.98 |
ROI (MRI)* | 133 | High DICE | 0.72 |
Rectal muscularis propria (MRI) | 32 | High DICE | 0.91 |
Mesorectum (MRI) | 32 | High DICE | 0.9 |
Pelvic vessel (Artery) (MRI) | 30 | High DICE | 0.81 |
Pelvic vessel (Vein) (MRI) | 30 | High DICE | 0.8 |
Kidney vessel (Artery) (MRI) | 32 | High DICE | 0.92 |
Kidney vessel (Vein) (MRI) | 32 | High DICE | 0.86 |
Pelvic nerve (MRI) | 30 | High DICE | 0.7 |
Levator ani muscle (MRI) | 30 | High DICE | 0.77 |
Overall (Total cases) | 1086 | Consistent and Acceptable Performance | Range of 0.60 to 0.99 (Average DICE) |
Note: For items marked with an asterisk (*), the extraction is performed semi-automatically; all others are executed automatically. The acceptance criterion is "High DICE," as no specific quantitative threshold is given, but the reported values generally indicate good agreement. Per the submission, "Additional distance based metrics 95% Hausdorff Distance and Mean Surface Distance were also reported along with the subgroup analysis. Detailed results are reported in the labeling."
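The note above also cites the 95% Hausdorff Distance (HD95) and the Mean Surface Distance (MSD). As a hedged illustration only (not the method used in the submission), one common way to compute these surface-distance metrics for binary masks uses SciPy's Euclidean distance transform; the sketch below assumes isotropic voxel spacing and pools the two directed distance sets, which is one widely used variant (some definitions instead take the maximum of the two directed 95th percentiles).

```python
import numpy as np
from scipy import ndimage

def surface_distances(a: np.ndarray, b: np.ndarray, spacing=1.0) -> np.ndarray:
    """Distances from the boundary voxels of mask `a` to the boundary of mask `b`."""
    a, b = a.astype(bool), b.astype(bool)
    surf_a = a ^ ndimage.binary_erosion(a)  # boundary voxels of a
    surf_b = b ^ ndimage.binary_erosion(b)  # boundary voxels of b
    # Distance from every voxel to the nearest boundary voxel of b (in `spacing` units).
    dist_to_b = ndimage.distance_transform_edt(~surf_b, sampling=spacing)
    return dist_to_b[surf_a]

def hd95_and_msd(pred: np.ndarray, truth: np.ndarray, spacing=1.0):
    d_pt = surface_distances(pred, truth, spacing)
    d_tp = surface_distances(truth, pred, spacing)
    all_d = np.concatenate([d_pt, d_tp])
    hd95 = np.percentile(all_d, 95)  # 95th-percentile (robust) Hausdorff distance
    msd = all_d.mean()               # mean symmetric surface distance
    return hd95, msd

# Toy usage with two overlapping square masks.
a = np.zeros((10, 10), dtype=bool); a[2:7, 2:7] = True
b = np.zeros((10, 10), dtype=bool); b[4:9, 4:9] = True
print(hd95_and_msd(a, b))
```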
Study that Proves the Device Meets Acceptance Criteria
The study described is a performance test of the new deep-learning-based automatic and semi-automatic organ extraction functions.
- Sample size used for the test set and the data provenance:
- Sample Size: 1086 cases were collected for performance testing.
- Data Provenance: The data were newly collected from US patient populations across several regions: US_East (295 cases), US_Midwest (175 cases), US_Southeast (185 cases), US_Southwest (73 cases), and US_Northwest (4 cases). This indicates prospective data collection specifically for this testing, originating from the US. The text also states that the test data are independent of the training data.
- Demographics: The test set included 672 men, 414 women, and a range of ages from 22 to 120+ years old. Modalities covered CT and MRI from various major manufacturers (SIEMENS, GE, PHILIPS, CANON, FUJIFILM).
- Number of experts used to establish the ground truth for the test set and the qualifications of those experts:
- The document does not specify the number of experts or their qualifications used to establish the ground truth. It only states that the performance testing used an "average DICE" score, implying a comparison against some form of expertly derived ground truth.
- Adjudication method (e.g., 2+1, 3+1, none) for the test set:
- The document does not specify any adjudication method for establishing the ground truth.
- Whether a multi-reader, multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI vs. without AI assistance:
- No, an MRMC comparative effectiveness study was not explicitly mentioned or performed as part of this submission for demonstrating substantial equivalence. The clinical testing mentioned focused on the standalone performance of the new deep learning features (i.e., automatic or semi-automatic segmentation accuracy) rather than human reader performance with or without AI assistance. The submission states, "The subject of this 510(k) notification, Synapse 3D Base Tools does not require clinical studies to support safety and effectiveness of the software."
- Whether a standalone (i.e., algorithm-only, without human-in-the-loop) performance evaluation was done:
- Yes, a standalone performance evaluation was done for the automatic (and semi-automatic) deep learning segmentation functions. The Dice Similarity Coefficient (DICE) scores provided measure the algorithm's performance in segmenting anatomical structures against a ground truth, without human intervention in the segmentation process itself. Some extractions are noted as "semi-automatic," meaning human interaction would refine the output.
- The type of ground truth used (expert consensus, pathology, outcomes data, etc.):
- The text implies the ground truth for the segmentation tasks was established by expert consensus/manual annotation (as DICE is a metric comparing algorithmic output to a reference segmentation, typically derived from expert outlines). However, the specific method (e.g., single expert, multi-expert consensus) is not detailed. It mentions "Additional distance based metrics 95% Hausdorff Distance and Mean Surface Distance were also reported," which are also used for comparing segmentation masks to a ground truth.
- The sample size for the training set:
- The document does not explicitly provide the sample size for the training set. It only states that the test data are independent of the training data, implying a separate training dataset was used.
- How the ground truth for the training set was established:
- The document does not provide details on how the ground truth for the training set was established. However, for deep learning segmentation, it is typically established through manual annotation by qualified experts.
§ 892.2050 Medical image management and processing system.
(a) Identification. A medical image management and processing system is a device that provides one or more capabilities relating to the review and digital processing of medical images for the purposes of interpretation by a trained practitioner of disease detection, diagnosis, or patient management. The software components may provide advanced or complex image processing functions for image manipulation, enhancement, or quantification that are intended for use in the interpretation and analysis of medical images. Advanced image manipulation functions may include image segmentation, multimodality image registration, or 3D visualization. Complex quantitative functions may include semi-automated measurements or time-series measurements.
(b) Classification. Class II (special controls; voluntary standards—Digital Imaging and Communications in Medicine (DICOM) Std., Joint Photographic Experts Group (JPEG) Std., Society of Motion Picture and Television Engineers (SMPTE) Test Pattern).