510(k) Data Aggregation
(10 days)
MODIFICATION TO: VIEWSTATION
ViewStation supports image and information flow among health care facility personnel. ViewStation can be used whenever digital images and associated data are the means for communicating information. ViewStation is not intended for use in diagnosis.
The intended use of ViewStation is to provide health care facility personnel with an effective means to utilize patient images during the course of therapy or treatment. ViewStation allows users to import, view, annotate, manipulate, enhance, manage, and archive patient images. The images and associated information are stored in a database, providing users access to the information necessary to perform their functions.
The primary function of ViewStation is to provide a means to more effectively manage image information in a therapy or treatment environment. ViewStation provides the ability to import, view, annotate, manipulate, enhance, and archive patient images during the course of therapy, treatment, and follow-up.
ViewStation imports existing digital images acquired or generated by other products. ViewStation retains the original image, which was acquired or generated by a third party product. With these facts in mind, the goal of ViewStation is to make electronic patient image information more accessible throughout the department. IMPAC is providing a tool to increase department productivity since digital images, unlike films, do not have to be physically transferred from one station to another.
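The import-and-retain pattern described above can be pictured with a minimal sketch; the module layout, table schema, and function name below are hypothetical illustrations, not details from the submission. The two points it captures are that the third-party image is archived unchanged and that its associated information goes into a database record.

```python
# Hypothetical sketch of an import step that keeps the original image file
# untouched and records its associated information in a database.
# All names (ARCHIVE_DIR, the "images" table, import_image) are illustrative.
import hashlib
import shutil
import sqlite3
from pathlib import Path

ARCHIVE_DIR = Path("archive")           # originals are copied here, unmodified
DB_PATH = Path("viewstation_demo.db")   # demo metadata store

def import_image(source: Path, patient_id: str, modality: str) -> int:
    """Archive the original image and insert a metadata row; return its id."""
    ARCHIVE_DIR.mkdir(exist_ok=True)
    digest = hashlib.sha256(source.read_bytes()).hexdigest()
    archived = ARCHIVE_DIR / f"{digest}{source.suffix}"
    if not archived.exists():
        shutil.copy2(source, archived)  # pixel data is never altered

    with sqlite3.connect(DB_PATH) as conn:
        conn.execute(
            """CREATE TABLE IF NOT EXISTS images (
                   id INTEGER PRIMARY KEY,
                   patient_id TEXT, modality TEXT,
                   original_path TEXT, archived_path TEXT, sha256 TEXT)"""
        )
        cur = conn.execute(
            "INSERT INTO images (patient_id, modality, original_path, "
            "archived_path, sha256) VALUES (?, ?, ?, ?, ?)",
            (patient_id, modality, str(source), str(archived), digest),
        )
        return cur.lastrowid
```

In this arrangement, annotations or enhancements would be stored alongside, or referenced from, such records rather than written back into the archived original.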
The ViewStation is an Image Management System which is explicitly not intended for diagnostic use but rather for managing images and information flow in a healthcare facility. Given this, the submission does not contain a study involving clinical efficacy or diagnostic performance. Instead, the "acceptance criteria" and "study" are focused on demonstrating that the updated software maintains the safety and effectiveness of the predicate device for its intended non-diagnostic use.
Here's a breakdown:
1. A table of acceptance criteria and the reported device performance
The submission does not present a table of acceptance criteria in the traditional sense of diagnostic performance metrics (e.g., sensitivity, specificity, AUC). Instead, the "acceptance criteria" are implied by the requirements for regulatory compliance, internal quality standards, and successful software development and testing. The "reported device performance" is the successful completion of these processes, affirming that the updated ViewStation maintains its intended non-diagnostic functionality and safety.
| Aspect of Acceptance/Performance | Reported Performance/Method of Meeting |
|---|---|
| Safety and Effectiveness | Product change does not diminish safety or effectiveness. System Hazard Analysis (SHA2102) performed, documented, reviewed, and implemented. Hazard identification traced through evaluation, design, specification, implementation, and testing. Design Review Team confirmed no increased health or safety risk. |
| Intended Use | Identical indications for use to predicate device. "The total sum of all feature enhancements does not affect the intended use of ViewStation." |
| Technological Characteristics | "Technological characteristics remain principally the same." Evolutionary product changes do not raise any new questions of safety and effectiveness, nor do they require novel methods of verification or validation. |
| Basic Functionality | The sum of the changes does not affect the basic functionality of ViewStation, which remains dedicated to providing healthcare personnel with a means to import, view, annotate, manage, and archive patient images. |
| Software Quality | Developed according to IMPAC Software Design Control Procedure (SDCP). IMPAC Quality System complies with ISO 9001:2000, ISO 13485:2003, ISO 14971:2000, EN 60601-1-4:1996, ISO/IEC 9003:2004, and 93/42/EEC. |
| Verification and Validation | Traceability Matrix created. System Test Plan for full application, integration, and system testing. Test Procedures capture detailed parameters, results, and certification. Test certification statement confirms planned testing completed successfully. Design Reviews performed at each phase. |
| Algorithm/Technical Changes | Engineering performed to ensure algorithms and all other technical changes function exactly as intended. Testing demonstrated successful implementation. |
| Regulatory Compliance | Submitted under 510(k) Premarket Notification as substantially equivalent to predicate devices (K011694 and K942346). |
2. Sample size used for the test set and the data provenance (e.g. country of origin of the data, retrospective or prospective)
This information is not applicable and therefore not provided in the submission. Since the device is explicitly not intended for diagnosis and the changes are evolutionary software updates to an existing image management system, no clinical "test set" of patient data (images) was used to evaluate diagnostic performance. The testing performed was related to software verification and validation, hazard analysis, and functional integrity.
3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts (e.g. radiologist with 10 years of experience)
This information is not applicable and therefore not provided in the submission. As no clinical "test set" using patient data for diagnostic evaluation was involved, no experts were required to establish ground truth for such a purpose. The "experts" involved would be software engineers, quality assurance personnel, and potentially medical professionals (users) providing feedback on the system's usability and functionality, but not establishing diagnostic ground truth.
4. Adjudication method (e.g. 2+1, 3+1, none) for the test set
This information is not applicable and therefore not provided in the submission. Adjudication methods are typically used in studies involving expert review of diagnostic performance. The testing described focuses on software functionality, safety, and compliance with quality systems, not diagnostic accuracy.
5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI vs. without AI assistance
A Multi-Reader, Multi-Case (MRMC) comparative effectiveness study was not done. The ViewStation is an image management system and explicitly states it is "not intended for use in diagnosis." Therefore, there is no AI component for diagnostic assistance, and no study to evaluate reader improvement with or without AI.
6. If a standalone study (i.e. algorithm-only performance, without a human in the loop) was done
This information is not applicable. The ViewStation is a software system with human-in-the-loop functionality, and not a standalone diagnostic algorithm. Its purpose is to manage images for human users.
7. The type of ground truth used (expert consensus, pathology, outcomes data, etc)
This information is not applicable. The device is not for diagnosis, so there is no "ground truth" related to disease presence or absence established from pathology, expert consensus, or outcomes data. The "ground truth" for the software testing would be the expected functional behavior and safety requirements defined in the design specifications.
8. The sample size for the training set
This information is not applicable and therefore not provided in the submission. The ViewStation is a conventional image management software, not a machine learning or AI-driven diagnostic algorithm that requires a "training set" of data in the typical sense. The software's development is guided by established engineering principles and quality systems rather than data-driven machine learning.
9. How the ground truth for the training set was established
This information is not applicable. As there is no "training set" in the context of machine learning, there is no ground truth establishment for such a set. The "ground truth" for software development is based on user requirements, regulatory standards, and design specifications.
(90 days)
VIEWSTATION
ViewStation supports image and information flow among health care facility personnel. ViewStation can be used whenever digital images and associated data are the means for communicating information. ViewStation is not intended for use in diagnosis. The images and associated information are stored in a database, providing users access to the information necessary to perform their functions.
The intended use of ViewStation is to provide health care facility personnel with an efficient and effective means to utilize patient images during the course of therapy or treatment. ViewStation allows users to import, view, annotate, manipulate, enhance, manage, and archive patient images.
The primary function of ViewStation is to provide a means to move image information in a therapy or treatment environment. ViewStation provides the ability to import, view, annotate, manage, and archive patient images during the course of therapy, treatment, and follow-up.
ViewStation imports existing digital images acquired or generated by other products. ViewStation retains the original image, which was acquired or generated by a third party product.
The provided text describes a Premarket Notification (510(k)) Summary of Safety and Effectiveness for "ViewStation, Image Processing System" by IMPAC Medical Systems, Inc. This document focuses on establishing substantial equivalence to a predicate device, rather than presenting a detailed study with acceptance criteria and performance metrics typically seen for standalone diagnostic AI/ML devices.
Here's an analysis based on the provided text, highlighting what is present and what is absent in relation to your request:
1. A table of acceptance criteria and the reported device performance
The document does not provide a table of acceptance criteria or specific reported device performance metrics in the way one would expect for a diagnostic or AI/ML algorithm that generates specific output values (e.g., accuracy, sensitivity, specificity, AUC).
Instead, the "acceptance criteria" can be inferred from the document's claims about equivalency and safety:
| Acceptance Criterion (Inferred) | Reported Device Performance (Inferred) |
|---|---|
| Safety: No increase in health or safety risk to patients, users, or third parties. | System Hazard Analysis performed, reviewed, and implemented (SHA2101). Design Review Team determined no increase in risk. |
| Effectiveness: Algorithms function exactly as intended. | Engineering testing performed and demonstrated successful implementation of algorithms and functionality. |
| Intended Use: Remains the same as the predicate device. | IMPAC determined and certified that "The intended use of ViewStation remains the same." |
| Substantial Equivalence: Equivalent in intended use, safety, and effectiveness to the predicate device. | ViewStation is "substantially equivalent to the original ViewStation product" and "the new ViewStation and previous ViewStation products are equivalent in intended use and safety and effectiveness." |
| Quality System Compliance: Developed under established quality standards. | Developed according to IMPAC Software Design Control Procedure (SDCP) and in compliance with 21 CFR 820, ISO 9001:1994, ISO 13485:1996, 93/42/EEC, EN 46001:1997, and EN 60601-1-4:1996. |
2. Sample size used for the test set and the data provenance
- Test Set Sample Size: Not specified. The document states "Engineering testing was also performed to ensure that the algorithms and all other technical changes function exactly as intended." This implies internal testing, but no details on the size or characteristics of the test data are provided.
- Data Provenance: Not specified.
3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts
- Not applicable as this is not a diagnostic device undergoing a typical clinical validation study with ground truth established by experts. The "effectiveness" is primarily about the algorithms functioning as intended, not about diagnostic accuracy against expert consensus.
4. Adjudication method for the test set
- Not specified.
5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI vs. without AI assistance
- No MRMC or comparative effectiveness study is mentioned. The device's purpose is to automate a previously manual process (identifying a treatment field edge and ordering images), not to provide a diagnostic AI/ML assistant to human readers. The document explicitly states: "No additional or changed diagnostic or therapeutic claims arise as the result of the ViewStation product. Therefore, demonstration of clinical efficacy is not a required element of this Premarket Notification. Further, clinical performance data is not required for determination of substantial equivalence for this type and class of device."
6. If a standalone study (i.e. algorithm-only performance, without a human in the loop) was done
- Yes, in a sense. The algorithms perform automated functions without human intervention in the specific tasks they are designed for (determining treatment field images, identifying edges, ordering images). The document focuses on the standalone functioning of these algorithms as part of the overall ViewStation system. "Engineering testing was also performed to ensure that the algorithms and all other technical changes function exactly as intended."
7. The type of ground truth used
- The concept of "ground truth" for diagnostic accuracy (e.g., pathology, outcomes data, expert consensus) is not directly applicable here. The "ground truth" for the engineering testing would relate to whether the algorithms correctly identify the treatment field image, correctly identify the field edge (based on pre-defined criteria or reference images), and correctly order the images. This would be a technical ground truth related to the algorithm's intended function, not clinical ground truth for diagnosis.
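As an aside, that kind of technical ground truth can be made concrete as a functional test against a synthetic image whose correct answer is known by construction; the function, threshold, and expected values below are illustrative assumptions, not the submission's actual test procedure.

```python
# Illustrative only: the "ground truth" here is the geometry of a synthetic
# image, fixed by the design spec, rather than any clinical truth.
import numpy as np
import cv2

def field_bounding_box(portal: np.ndarray) -> tuple:
    """Toy stand-in for a field-edge routine: threshold, then bound the field."""
    _, mask = cv2.threshold(portal, 127, 255, cv2.THRESH_BINARY)
    return cv2.boundingRect(mask)  # (x, y, width, height)

def test_field_bounding_box_matches_spec():
    # The expected box is known because we constructed the image ourselves.
    portal = np.zeros((128, 128), dtype=np.uint8)
    cv2.rectangle(portal, (20, 30), (100, 90), 255, thickness=-1)
    assert field_bounding_box(portal) == (20, 30, 81, 61)
```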
8. The sample size for the training set
- Not specified.
9. How the ground truth for the training set was established
- Not specified.
Summary of Device and its Purpose:
The ViewStation is an image processing system primarily for radiation therapy, designed to import, view, annotate, manipulate, enhance, manage, and archive patient images during therapy. The reported changes involve new automated image processing algorithms (see the illustrative sketch after this list) to:
* Determine which of two portal images is the treatment field portal image.
* Employ an edge detection algorithm to identify the treatment field edge in a portal image.
* Modify an existing histogram optimization algorithm to accept dynamic inputs from the new edge detection algorithm.
* Superimpose a polygon of the field edge onto an open field image and automatically order images.
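As an informal illustration of what these steps can involve (a generic OpenCV sketch assuming Canny edge detection and contour approximation, not IMPAC's actual algorithms; the helper names and parameter values are invented for the example), the following detects a field edge in a portal image, reduces it to a polygon, and superimposes that polygon on an open field image:

```python
# Hypothetical sketch: detect a treatment field edge in a portal image and
# overlay it as a polygon on an open field image. Not IMPAC's implementation.
import numpy as np
import cv2

def detect_field_edge(portal: np.ndarray) -> np.ndarray:
    """Return the largest detected contour, approximated as an (N, 2) polygon."""
    blurred = cv2.GaussianBlur(portal, (5, 5), 0)   # suppress noise
    edges = cv2.Canny(blurred, 50, 150)             # edge map
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        raise ValueError("no field edge found")
    largest = max(contours, key=cv2.contourArea)    # assume it is the field
    poly = cv2.approxPolyDP(largest, 2.0, True)     # simplify to a polygon
    return poly.reshape(-1, 2)

def overlay_field_edge(open_field: np.ndarray, poly: np.ndarray) -> np.ndarray:
    """Draw the field-edge polygon onto a copy of the open field image."""
    canvas = cv2.cvtColor(open_field, cv2.COLOR_GRAY2BGR)
    pts = poly.astype(np.int32).reshape(-1, 1, 2)
    cv2.polylines(canvas, [pts], True, (0, 255, 0), 2)
    return canvas

if __name__ == "__main__":
    # Synthetic 8-bit images stand in for a real portal / open field pair.
    portal = np.zeros((256, 256), dtype=np.uint8)
    cv2.rectangle(portal, (60, 60), (200, 180), 180, thickness=-1)  # fake field
    open_field = np.full((256, 256), 90, dtype=np.uint8)
    polygon = detect_field_edge(portal)
    cv2.imwrite("field_overlay.png", overlay_field_edge(open_field, polygon))
```

A histogram or window/level adjustment taking "dynamic inputs" could, in the same spirit, be restricted to the pixels inside this polygon rather than applied to the whole image.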
Crucially, the document states that "ViewStation is not intended for use in diagnosis" and explicitly notes that "No additional or changed diagnostic or therapeutic claims arise as the result of the ViewStation product." This means it is a tool to enhance workflow and image management in a therapy setting, not a device that provides diagnostic outputs requiring traditional clinical performance metrics. The "study" referenced is the internal verification and validation testing to ensure the algorithms function as intended and that the changes do not introduce new safety concerns or alter the intended use, maintaining substantial equivalence to the predicate device.