510(k) Data Aggregation
(105 days)
AYCAN WORKSTATION OSIRIX PRO
aycan workstation OsiriX PRO is a software device intended for viewing of images acquired from CT, MR, CR, DR, US and other DICOM compliant medical imaging systems when installed on suitable commercial standard hardware.
Images and data can be captured, stored, communicated, processed, and displayed within the system and/or across computer networks at distributed locations.
Lossy compressed mammographic images and digitized film screen images must not be reviewed for primary diagnosis or image interpretation. For primary diagnosis, post process DICOM "for presentation" images must be used. Mammographic images should only be viewed with a monitor approved by FDA for viewing mammographic images.
It is the User's responsibility to ensure monitor quality, ambient light conditions, and image compression ratios are consistent with clinical application.
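The labeling constraints above (no lossy-compressed images for primary diagnosis; "for presentation" objects required) can be expressed as a simple metadata check. The sketch below is illustrative only and not part of the device: the function name and dict-based input are assumptions, but the attributes used are the standard DICOM elements Lossy Image Compression (0028,2110) and Presentation Intent Type (0008,0068).

```python
def safe_for_primary_diagnosis(meta: dict) -> bool:
    """Hypothetical check of the labeling constraints described above.

    `meta` maps DICOM attribute keywords to their string values, e.g. as
    read from an image header by any DICOM toolkit.
    """
    # Lossy Image Compression (0028,2110): "01" means the pixel data has
    # undergone lossy compression, so it must not be used for primary diagnosis.
    if meta.get("LossyImageCompression") == "01":
        return False
    # Presentation Intent Type (0008,0068): primary diagnosis requires
    # post-processed "FOR PRESENTATION" objects (vs. "FOR PROCESSING").
    return meta.get("PresentationIntentType") == "FOR PRESENTATION"
```

In practice a viewer would read these values from the DICOM header rather than from a dict; the point is only that both conditions are machine-checkable.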
The aycan workstation OsiriX PRO provides services for review and post-processing of diagnostic medical images and information. It conforms to the DICOM 3.0 standard to allow the sharing of medical information with other digital imaging systems. aycan workstation OsiriX PRO runs on Apple Mac OS X systems and provides high-performance review, navigation, and post-processing functionality for multidimensional and multimodality images.
aycan workstation OsiriX PRO is a software device that handles and manipulates digital medical images.
The provided text is a 510(k) Summary for the aycan workstation OsiriX PRO. It outlines the device's characteristics and its substantial equivalence to a predicate device, but it does not contain a detailed study report with specific acceptance criteria and performance data for the device itself.
Here's a breakdown of what can and cannot be extracted from the provided document based on your request:
1. A table of acceptance criteria and the reported device performance
- Acceptance Criteria: Not explicitly stated in terms of measurable performance metrics. The document broadly states "the predetermined acceptance criteria were met" and "The system passed all testing criteria," but the criteria themselves are not defined or quantified.
- Reported Device Performance: Not provided. There are no performance metrics or results (e.g., accuracy, sensitivity, specificity) for the aycan workstation OsiriX PRO.
2. Sample size used for the test set and the data provenance (e.g., country of origin of the data, retrospective or prospective)
- Sample Size: Not mentioned.
- Data Provenance: Not mentioned.
3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts (e.g., radiologist with 10 years of experience)
- Not mentioned. The document primarily focuses on regulatory approval and device description, not on detailed clinical validation studies.
4. Adjudication method (e.g., 2+1, 3+1, none) for the test set
- Not mentioned.
5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI vs. without AI assistance
- Not mentioned. The document describes the device as a "software device intended for viewing of images acquired from CT, MR, CR, DR, US and other DICOM compliant medical imaging systems." It is a Picture Archiving Communications System (PACS) and an image processing system. There is no indication of AI assistance or a comparative effectiveness study involving human readers.
6. If a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done
- The device described is a PACS workstation, not an AI algorithm. Its function is to display and process images for human interpretation, explicitly stating, "A physician, providing ample opportunity for competent human intervention interprets images and information being displayed and printed." Therefore, a standalone algorithm-only performance study would not be applicable or relevant to this device's stated function.
7. The type of ground truth used (expert consensus, pathology, outcomes data, etc)
- Not mentioned.
8. The sample size for the training set
- Not mentioned. As this is not an AI/ML algorithm that typically requires a training set, this information is not expected.
9. How the ground truth for the training set was established
- Not mentioned. (See point 8)
Summary of available information regarding acceptance criteria and study:
The document states:
- "As required by the risk analysis, designated individuals performed all verification and validation activities and results demonstrated that the predetermined acceptance criteria were met. The system passed all testing criteria." (K103546, P. 2)
This indicates that internal verification and validation were conducted, and the device met its internal acceptance criteria. However, the specific details of these criteria, the study design, sample sizes, ground truth establishment, or any performance metrics are not included in this 510(k) summary. The summary focuses on demonstrating substantial equivalence to a predicate device based on its functional characteristics and intended use, rather than a detailed performance study like what would be expected for a diagnostic AI device.