Search Results
Found 3 results
510(k) Data Aggregation
(139 days)
The Cios Spin is a mobile X-ray system designed to provide X-ray imaging of the anatomical structures of the patient during clinical applications. Clinical applications may include but are not limited to: interventional fluoroscopic, gastrointestinal, endoscopic, urologic, pain management, orthopedic, neurologic, vascular, cardiac, critical care, and emergency room procedures. The patient population may include pediatric patients.
The Cios Spin is a mobile fluoroscopic C-arm X-ray system designed for the surgical environment. The Cios Spin provides comprehensive image acquisition modes to support orthopedic and vascular procedures. The system consists of two major components:
a) The C-arm, with the X-ray source on one side and the flat panel detector on the opposite side. The C-arm can be angulated in both planes and can be lifted vertically, shifted to the side, and moved forward/backward by an operator.
b) The second unit is the image display station with a moveable trolley for the image processing and storage system, image display and documentation. Both units are connected to each other with a cable.
This document describes the premarket notification (510(k)) for the Siemens Cios Spin X-ray system. The information provided is primarily focused on demonstrating substantial equivalence to predicate devices, rather than a standalone clinical study with detailed acceptance criteria for a new AI feature.
However, based on the provided text, we can infer acceptance criteria and the studies performed for specific features, particularly those listed under "New Software VA30 due to new functionality."
Summary of Device and Context:
The Cios Spin is a mobile fluoroscopic C-arm X-ray system for imaging anatomical structures during various clinical applications, including interventional, orthopedic, and neurological procedures. The 510(k) submission highlights several modifications and new software functionalities compared to its predicate device, the Cios Alpha.
1. Table of Acceptance Criteria and Reported Device Performance
The document does not explicitly state quantitative acceptance criteria in a dedicated section for the new software features. Instead, it relies on comparative equivalence and verification/validation testing against established guidance and predicate device performance. For the detector, quantitative metrics are provided.
Feature/Metric: General Device Safety & Effectiveness
- Acceptance Criteria (Implied/Direct):
  - Compliance with 21 CFR Federal Performance Standards (1020.30, 1020.32, 1040.10).
  - Conformance to FDA Recognized Consensus Standards and Guidance Documents.
  - Software specifications meet acceptance criteria.
  - Risk analysis completed and controls implemented for identified hazards.
  - Safe and effective for intended users, uses, and environments (through design control V&V).
- Reported Device Performance:
  - Complies with 21 CFR 1020.30 and 1020.32.
  - Certified to comply with AAMI ANSI ES60601-1:2005/(R)2012, IEC 60601-1-2:2014, IEC 60601-1-3:2013, IEC 60601-1-6:2010/A1:2013, IEC 60825-1:2014, IEC 62304:2015, IEC 60601-2-28:2010, IEC 60601-2-43:2017, IEC 60601-2-54:2009/A1:2015, ISO 14971:2007, and IEC 62366-1:2015/Cor1:2016.
  - Verification and validation testing found acceptable, supporting claims of substantial equivalence.
  - All new software functions validated; worked as intended.
  - Human Factors Usability Validation showed that human factors were addressed, with adequate training for employees.
  - Cybersecurity statement provided, considering IEC 80001-1:2010.

Feature/Metric: New Software (e.g., Metal Artifact Reduction, Retina 3D, Screw Scout, Target Pointer, High Power 3D, Easy 3D)
- Acceptance Criteria (Implied/Direct):
  - Does not raise any new issues of safety or effectiveness.
  - Works as intended (for new software functions).
- Reported Device Performance:
  - Metal Artifact Reduction: the algorithm is unchanged from a previously cleared device (syngo Application Software VD20B, K170747) and improves image quality by reducing artifacts.
  - Retina 3D, Screw Scout, Target Pointer, Cios Open Apps: non-clinical testing and software verification/validation testing were conducted and found acceptable per the Software Guidance document. Retina 3D uses the same reconstruction algorithm as the predicate ARTIS Pheno.
  - High Power 3D and Easy 3D: do not raise any new issues of safety or effectiveness per the Software Guidance.

Feature/Metric: CMOS Flat Panel Detector
- Acceptance Criteria (Implied/Direct):
  - Equivalent image quality to the a-Si technology detector.
  - Does not raise any new issues of safety or effectiveness.
  - Compliance with "Guidance for the Submission of 510(k)'s for Solid State X-ray Imaging Devices" for performance metrics.
- Reported Device Performance:
  - DQE: 72% (vs. predicate Cios Alpha 76%; reference Ziehm Solo FD 70%)
  - Dynamic Range: 96 dB (vs. predicate Cios Alpha 94 dB; reference Ziehm Solo FD: equivalent)
  - MTF: 58% at 1 lp/mm (large) (vs. predicate Cios Alpha 55% at 1 lp/mm; reference Ziehm Solo FD: 4 lp/mm)
  - Digitization Depth: 16 bit (same as predicate/reference devices)
  - Pixel Pitch: 152 μm (vs. predicate Cios Alpha 194 μm; reference Ziehm Solo FD 100 μm)
  - Field of View: 30 cm x 30 cm; 20 cm x 20 cm
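For context, a minimal sketch (not part of the submission) relating two of the tabulated detector figures to first principles: the Nyquist-limited spatial frequency implied by each pixel pitch, and the theoretical dynamic range implied by the 16-bit digitization depth. The numeric inputs are taken from the table above; the sampling-theory interpretation is an assumption, not a claim from the 510(k).

```python
import math

def nyquist_lp_per_mm(pixel_pitch_um: float) -> float:
    """Maximum resolvable spatial frequency for a given pixel pitch,
    in line pairs per millimeter (Nyquist: one line pair spans two pixels)."""
    pitch_mm = pixel_pitch_um / 1000.0
    return 1.0 / (2.0 * pitch_mm)

def dynamic_range_db(bit_depth: int) -> float:
    """Theoretical dynamic range of an ideal N-bit digitizer, in dB."""
    return 20.0 * math.log10(2 ** bit_depth)

# Pixel pitches from the comparison table (subject vs. predicate/reference).
for name, pitch_um in [("Cios Spin (152 um)", 152.0),
                       ("Cios Alpha (194 um)", 194.0),
                       ("Ziehm Solo FD (100 um)", 100.0)]:
    print(f"{name}: Nyquist limit ~ {nyquist_lp_per_mm(pitch_um):.2f} lp/mm")

# 16-bit digitization depth corresponds to ~96.3 dB, which is consistent
# with the 94-96 dB dynamic range figures reported in the table.
print(f"16-bit digitizer: {dynamic_range_db(16):.1f} dB")
```

This is only a back-of-the-envelope check: a real detector's measured dynamic range and limiting resolution also depend on noise, scintillator blur, and readout electronics, which is why the MTF values in the table are quoted at 1 lp/mm rather than at the Nyquist limit.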
2. Sample Size Used for the Test Set and Data Provenance
The document does not specify a "test set" in terms of patient data for evaluating the new software features. The testing mentioned is primarily non-clinical performance testing, software verification and validation (V&V), human factors usability validation, and engineering bench testing.
- Sample Size: Not applicable in the context of patient data for the new software features, as the testing described is primarily technical and comparative against existing standards and predicate devices. For the detector, the metrics (DQE, MTF, etc.) are derived from laboratory measurements, not patient data sets.
- Data Provenance: Not specified as patient data is not the primary focus for the equivalence argument. The testing was conducted by Siemens Healthcare GmbH Corporate Testing Laboratory (for conformance standards) and internally for software V&V. This implies internal company testing, likely in a controlled laboratory environment.
3. Number of Experts Used to Establish Ground Truth for the Test Set and Their Qualifications
Given that the testing is primarily non-clinical and focused on technical performance and software functionality, the concept of "ground truth established by experts" for a patient-based test set is not directly applicable in the way it would be for an AI diagnostic algorithm.
- Experts: The "experts" involved are implied to be the engineers and technical specialists responsible for conducting the non-clinical tests, software verification/validation, and human factors evaluations. The approval by the FDA also involves review by regulatory experts.
- Qualifications: While not explicitly stated, these would be Siemens' internal development and quality assurance teams, as well as external certification bodies for standards compliance.
4. Adjudication Method for the Test Set
Not applicable, as there is no mention of a patient-based test set requiring expert adjudication for ground truth.
5. Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study
No, a multi-reader multi-case (MRMC) comparative effectiveness study was not done for the Cios Spin in this 510(k) submission. The submission focuses on demonstrating substantial equivalence through technological characteristics, non-clinical performance data, and software validation. It does not include an assessment of how human readers improve with or without AI assistance.
6. Standalone (Algorithm Only Without Human-In-The-Loop Performance) Study
The document does not describe a standalone performance study for the software features (e.g., Metal Artifact Reduction, Retina 3D, Screw Scout, Target Pointer) similar to what would be done for a diagnostic AI algorithm. Instead, it states that all new software functions present in the Subject Device "have been validated through detailed software testing" and were found to work as intended. This implies internal functional and performance testing, but not a standalone clinical performance study typically associated with AI algorithms.
7. Type of Ground Truth Used
The "ground truth" for the various new features is established through:
- Engineering specifications and design requirements: For software functionality and hardware performance.
- Compliance with recognized industry standards: (e.g., IEC standards, FDA performance standards)
- Comparison to predicate devices and reference devices: For performance metrics (e.g., DQE, MTF for the detector), where "equivalent" or "comparable" performance serves as the ground truth.
- Expected "working as intended" functionality: For the new software features validated through detailed software testing.
There is no mention of pathology, expert consensus on patient cases, or outcomes data used to establish ground truth for the specific performance of these new features in a clinical setting.
8. Sample Size for the Training Set
Not applicable. The document describes a medical device (X-ray system) with new software features, not a machine learning model that requires a "training set" in the context of AI/ML development. The software validation is based on internal testing against specifications.
9. How the Ground Truth for the Training Set Was Established
Not applicable, as there is no "training set" as understood in machine learning. The "ground truth" for the software validation mentioned in the document is based on meeting pre-defined software specifications and functional requirements through verification and validation testing.
(132 days)
The Cios Alpha is a mobile X-ray system designed to provide X-ray imaging of the anatomical structures of the patient during clinical applications. Clinical applications may include but are not limited to: interventional fluoroscopic, gastrointestinal, endoscopic, urologic, pain management, orthopedic, neurologic, vascular, cardiac, critical care, and emergency room procedures. The patient population may include pediatric patients.
The Cios Alpha mobile fluoroscopic C-arm X-ray System is designed for the surgical environment. The Cios Alpha provides comprehensive image acquisition modes. The system consists of two major components:
a) The C-arm with an X-ray source on one side and the flat panel detector on the opposite side. The C-arm can be angulated in both planes and lifted vertically, shifted to the side, and moved forward/backward by an operator.
b) The second component is the image display station with a moveable trolley that holds the image processing and storage system, and the image display. Both components are connected to each other with a cable.
Here's an analysis of the acceptance criteria and study information provided in the document for the Cios Alpha (VA30) device:
1. Table of Acceptance Criteria and Reported Device Performance
The document does not explicitly state "acceptance criteria" for quantitative performance metrics in a pass/fail format. Instead, it presents a comparison of the Subject Device's (Cios Alpha (VA30)) performance to its Predicate (Cios Alpha (VA10)) and Reference Devices for Solid State X-Ray Imaging (SSXI) specifications. The implication is that comparable or better performance is the acceptance criterion for the SSXI metrics.
SSXI Metric | Acceptance Criteria (Implied - Comparable or Better) | Reported Cios Alpha (VA30) Performance | Predicate Cios Alpha (VA10) Performance | Reference Ziehm Vision RFD Performance | Reference Ziehm Solo FD Performance |
---|---|---|---|---|---|
Imaging Modes | Pulsed fluoroscopy | Pulsed fluoroscopy | Pulsed fluoroscopy | Pulsed fluoroscopy | Pulsed Fluoroscopy, Digital Spot |
DQE | Comparable or better than Predicate (76%) and Reference (70%) | 75% (small), 72% (large) | 76% | Information Not Available | 70% |
Dynamic Range | Comparable or better than Predicate (94 dB) and Reference (Equivalent) | 96 dB | 94 dB | Information Not Available | Equivalent |
Modulation Transfer Function (MTF) | Comparable or better than Predicate (55% at 1 lp/mm) and Reference (4 lp/mm) | 60% at 1 lp/mm (small), 58% at 1 lp/mm (large) | 55% at 1 lp/mm | Information Not Available | 4 lp/mm |
Digitization Depth | 16 bit | 16 bit | 16 bit | 16 bit | 16 bit |
Pixel Pitch | Not explicitly stated as a target, but the change from 194 μm (Predicate) to 152 μm (Subject) is a technological characteristic shown as an improvement. | 152 μm | 194 μm | 194 μm | 100 μm |
Field of View | Matching the predicate and reference devices. | Small FD: 20x20, 15x15, 10x10; Large FD: 30x30, 20x20 | Small FD: 20x20, 15x15, 10x10; Large FD: 30x30, 20x20 | FPD 20 cm: 20, 15, 10 | FPD 20 cm: 20, 15, 10 |
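The implied acceptance rule ("comparable or better than the predicate") can be sketched as a simple check. The values below are copied from the table; the treatment of "comparable" as a fixed relative tolerance is an illustrative assumption, not a criterion stated in the submission.

```python
# Subject (Cios Alpha VA30, large detector) vs. predicate (VA10) values
# taken from the SSXI comparison table. "Higher is better" is assumed
# for all three metrics shown here.
METRICS = {
    "DQE (%)":            {"subject": 72.0, "predicate": 76.0},
    "Dynamic range (dB)": {"subject": 96.0, "predicate": 94.0},
    "MTF @ 1 lp/mm (%)":  {"subject": 58.0, "predicate": 55.0},
}

def comparable_or_better(subject: float, predicate: float,
                         rel_tolerance: float = 0.10) -> bool:
    """Accept if the subject meets or exceeds the predicate, or falls
    within rel_tolerance below it (hypothetical 10% 'comparable' band)."""
    return subject >= predicate * (1.0 - rel_tolerance)

for name, v in METRICS.items():
    verdict = "PASS" if comparable_or_better(v["subject"], v["predicate"]) else "FAIL"
    print(f"{name}: {verdict}")
```

Under this hypothetical tolerance, all three metrics pass: dynamic range and MTF exceed the predicate outright, and the slightly lower DQE falls within the "comparable" band, mirroring the submission's equivalence argument.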
Additional Acceptance Criteria (General):
- Compliance with voluntary standards (Table 3), FDA Guidance Documents (Table 4).
- Software verification and validation meeting acceptance criteria.
- Risk analysis completed and hazards mitigated.
- Human Factors Usability Validation showing that human factors were addressed and clinical use tests were successful.
- Cybersecurity requirements met.
Study Proving Device Meets Acceptance Criteria:
The document describes several non-clinical performance tests and analyses to demonstrate that the Cios Alpha (VA30) meets the acceptance criteria, primarily for substantial equivalence to its predicate devices.
2. Sample Size Used for the Test Set and Data Provenance
- Test Set Sample Size: The document does not specify a distinct "test set" sample size in terms of patient data or number of images for evaluating the SSXI metrics. The performance evaluation seems to be based on engineering bench testing of device capabilities rather than a separate clinical image set.
- Data Provenance: The data provenance for the SSXI metrics and other performance tests is non-clinical bench testing. The document states: "Performance tests were conduct[ed] to test the functionality of the Cios Alpha (VA30)." It also mentions "Additional engineering bench testing was performed including: the non-clinical testing identified in the guidance for submission of 510(k) s for Solid State X-Ray Imaging Devices (SSXI); demonstration of system performance; and an imaging performance evaluation."
- The "clinical images are not required" statement further confirms the non-clinical nature of the specific SSXI evaluation.
- The Human Factor Usability Validation mentions "clinical use tests with customer report and feedback form," which implies some level of prospective, real-world (or simulated real-world) interaction, but specific sample sizes are not provided.
- The origin of the data is Siemens Healthcare GmbH Corporate Testing Laboratory and internal verification/validation processes.
3. Number of Experts Used to Establish the Ground Truth for the Test Set and Qualifications of Those Experts
- The document does not describe the use of experts to establish ground truth for a test set in the traditional sense of image interpretation for diagnostic accuracy. The testing primarily focuses on technical specifications of the imaging system itself.
- For the Human Factors Usability Validation, "customer report and feedback form" are mentioned, implying input from users (healthcare professionals), but no specific number or detailed qualifications are provided.
4. Adjudication Method for the Test Set
- Given that the primary performance evaluation described is non-clinical bench testing of engineering specifications (SSXI metrics), an adjudication method for a test set based on expert consensus would not be applicable or mentioned. The "ground truth" for these metrics is objectively measured device performance.
5. If a Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study was done, and if so, the effect size of human reader improvement with AI vs. without AI assistance
- No MRMC comparative effectiveness study is mentioned. This submission is for an imaging system (C-arm X-ray system), not an AI-powered diagnostic algorithm that assists human readers. While it includes "new software functions" like "Target Pointer," which "enables the automatic detection of K-wires and displays the trajectory," the document does not present a study evaluating the impact of this feature on human reader performance or diagnostic accuracy.
6. If a Standalone (i.e., algorithm only without human-in-the-loop performance) study was done
- The document evaluates the Cios Alpha as an imaging system, not a standalone AI algorithm. While it contains new software features, the performance metrics discussed (e.g., DQE, MTF) are system-level imaging characteristics. The "Target Pointer" feature performs automatic detection, but its standalone performance (e.g., accuracy of K-wire detection) is not detailed in the provided text. The overall context is regulatory clearance for hardware and software modifications of an existing medical device, not a new AI-enabled diagnostic device undergoing standalone performance evaluation.
7. The Type of Ground Truth Used (expert consensus, pathology, outcomes data, etc.)
- For the SSXI performance metrics (DQE, Dynamic Range, MTF, etc.), the "ground truth" is based on objective physical measurements and technical standards. These are inherent properties of the imaging system's detector and processing.
- For software functions, "ground truth" is established through detailed software testing to confirm they "worked as intended" according to specifications and requirements.
- For Human Factors, ground truth would relate to usability and safety observations and feedback during "clinical use tests."
8. The Sample Size for the Training Set
- The document does not mention a training set sample size. This is expected as the submission primarily concerns an imaging system rather than a machine learning algorithm requiring a distinct training phase with annotated data. Although new software features are present, the submission focuses on their validation as part of the overall device.
9. How the Ground Truth for the Training Set Was Established
- Since no training set is discussed, the method for establishing its ground truth is also not applicable in this document.
(67 days)
The OEC Elite mobile fluoroscopy system is designed to provide fluoroscopic and digital spot images of adult and pediatric populations during diagnostic, interventional and surgical procedures. Examples of a clinical application may include: orthopedic, gastrointestinal, endoscopic, urologic, critical care and emergency procedures.
The OEC Elite is a Mobile Fluoroscopic C-arm Imaging system used to assist trained surgeons and other qualified physicians. The system is used to provide fluoroscopic X-ray images during diagnostic, interventional, and surgical procedures. These images help the physician visualize the patient's anatomy and interventional tools. This visualization helps to localize clinical regions of interest and pathology. The images provide real-time visualization and records of pre-procedure anatomy, in vivo clinical activity, and post-procedure outcomes. The system is composed of two primary physical components. The first is referred to as the "C-Arm" because of its "C" shaped image gantry; the second is referred to as the "Workstation", which is the primary interface for the user to interact with the system.
The C-arm is a stable mobile platform capable of performing linear motions (vertical, horizontal) and rotational motions (orbital, lateral, wig-wag) that allow the user to position the X-Ray image chain at various angles and distances with respect to the patient anatomy to be imaged. The C - arm is mechanically balanced allowing for ease of movement and capable of being "locked" in place using a manually activated lock. The C-Arm is comprised of the high voltage generator, software, X-ray control, and a "C" shaped image gantry, which supports an X-ray tube and a Flat Panel Detector or Image Intensifier, depending on the choice of detector configuration desired.
The workstation is a stable mobile platform with an articulating arm supporting a color image, high resolution, LCD display monitor. It also includes image processing equipment/software, recording devices, data input/output devices and power control systems.
The primary purpose of the mobile fluoroscopy system is to provide fluoroscopic images of the patient during diagnostic, interventional, and surgical procedures such as orthopedic, gastrointestinal, endoscopic, urologic, neurologic, critical care and emergency procedures.
The OEC Elite comes with four image receptor (detector) options: a choice of 21x21 cm or 31x31 cm (new) Thallium-doped Cesium Iodide (CsI) solid state flat panel X-ray detector with a Complementary Metal Oxide Semiconductor (CMOS) light imager; or a choice of a 9-inch or 12-inch version of the same existing image intensifier as in the OEC 9900 Elite.
This document describes the OEC Elite, a mobile fluoroscopy system. It does not contain information on human performance studies or the establishment of ground truth by expert consensus for evaluating clinical tasks. Instead, it focuses on non-clinical engineering and imaging performance testing against defined metrics.
Here is an analysis based on the provided text, focusing on the acceptance criteria and the study proving the device meets them:
1. A table of acceptance criteria and the reported device performance:
The document outlines acceptance criteria based on performance metrics for Solid State X-ray Imaging Devices (SSXI) and compares the OEC Elite to its predicate device, OEC 9900 Elite.
SSXI Metrics | OEC Elite Performance Compared to Predicate OEC 9900 Elite |
---|---|
DQE (Detective Quantum Efficiency) | Improved |
Dynamic Range | Improved |
Spatial Resolution (MTF, Limiting Resolution) | Equivalent |
Temporal Resolution | Equivalent |
Contrast Resolution | Equivalent |
Beam Alignment | Equivalent |
Dose Rate | Equivalent |
Stability of the device characteristics over time | Equivalent |
Brightness uniformity | Improved |
Fluoroscopy Frame Rate | Equivalent |
Reuse Rate | Equivalent |
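As a reminder of what the DQE figures compared throughout these submissions actually measure, here is a minimal sketch using the standard zero-frequency definition DQE(0) = SNR_out² / SNR_in², where SNR_in = √N for an ideal photon-counting detector under Poisson statistics. The photon count and output SNR below are purely illustrative and do not come from any of the submissions.

```python
import math

def dqe_zero_freq(snr_out: float, photons_per_pixel: float) -> float:
    """DQE(0) = SNR_out^2 / SNR_in^2, with SNR_in = sqrt(N) for an
    ideal detector limited only by Poisson (quantum) noise."""
    snr_in = math.sqrt(photons_per_pixel)
    return (snr_out / snr_in) ** 2

# Illustrative values only: 10,000 incident photons per pixel give an
# ideal input SNR of 100; a measured output SNR of 87 would then
# correspond to DQE(0) of about 0.76.
print(f"DQE(0) = {dqe_zero_freq(87.0, 10_000.0):.2f}")
```

In practice DQE is measured as a function of spatial frequency per IEC 62220-1-style methods, combining the detector's MTF and noise power spectrum; the single percentages quoted in these 510(k) comparisons are summary figures of that fuller characterization.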
2. Sample size used for the test set and the data provenance:
The document refers to "additional engineering bench testing" and "imaging performance evaluation using anthropomorphic phantoms." It does not specify a distinct "test set" in terms of clinical images or patient data, nor does it provide a sample size for such a set. It appears the performance evaluations were conducted on the device itself and phantoms.
The data provenance is through non-clinical testing using anthropomorphic phantoms in a laboratory setting at GE OEC Medical Systems, Inc. (GE Healthcare).
3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts:
This information is not applicable as the evaluation was based on non-clinical engineering and imaging performance metrics, primarily using anthropomorphic phantoms and objective measurements. There was no mention of human experts establishing ground truth for a clinical test set in this context.
4. Adjudication method for the test set:
This information is not applicable as there was no clinical test set requiring human adjudication to establish ground truth. The evaluation focused on technical performance metrics against a predicate device.
5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done, and if so, the effect size of human reader improvement with AI vs. without AI assistance:
No MRMC comparative effectiveness study was done. The device described (OEC Elite) is a mobile fluoroscopy system, a medical imaging hardware device, not an AI-powered diagnostic tool that assists human readers. Therefore, the concept of "human readers improve with AI vs without AI assistance" does not apply to this submission.
6. If a standalone (i.e., algorithm only without human-in-the-loop performance) study was done:
This is not applicable in the context of an x-ray imaging system. The OEC Elite is a hardware device that produces images, not an algorithm, and the performance criteria relate to image quality and system functionality, not algorithmic output without human intervention.
7. The type of ground truth used:
The "ground truth" for the non-clinical testing was established by objective measurements of physical performance metrics (e.g., DQE, spatial resolution) on the OEC Elite and compared to the established performance of the predicate device (OEC 9900 Elite) and reference devices. Additionally, compliance with recognized standards (e.g., IEC 60601-1 Ed. 3 series, 21CFR Subchapter J) served as a form of "ground truth" for safety and efficacy.
8. The sample size for the training set:
This information is not applicable. The OEC Elite is a medical imaging hardware system, not a machine learning or AI algorithm that requires a "training set."
9. How the ground truth for the training set was established:
This information is not applicable as there was no training set.