510(k) Data Aggregation
(164 days)
4D-VIEW 9.1 (4D VIEW PC SOFTWARE) MODEL: H48651SZ
Image Display of GE Ultrasound 3D/4D data sets for diagnostic purposes including measurements on displayed image.
4D View 9.1 is a standalone software product that can be installed only on a PC running the Microsoft Windows Vista operating system. Its primary operating functions are:
- Display and editing of GE Ultrasound 3D/4D data sets
- Measurements on the displayed image, including derived calculations, all based on medical literature, in the following applications: Abdominal, Obstetrics, Gynecology, Cardiology, Urology, Vascular, Neurology, Small Parts, Pediatrics, Musculo-Skeletal (Orthopedics)
- Data storage (image, measurement and patient data)
- Data transfer to and from remote systems (e.g. via DICOM)
- Adding annotations to the acquired image
The same measurements and calculations are available on the predicate devices.
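The summary does not describe file formats or APIs, but the display-and-measurement workflow above can be illustrated in general terms. Below is a minimal, hypothetical sketch, assuming a 3D/4D data set exported as a multi-frame DICOM object with top-level PixelSpacing and SliceThickness attributes (GE's native 3D/4D volumes may use a proprietary format readable only by the vendor's software); it loads the volume with pydicom and takes a point-to-point distance measurement in millimetres. None of the names here come from the 4D View product.

```python
# Hedged sketch: load an exported DICOM volume and compute a point-to-point
# distance measurement in millimetres. Assumes pydicom is installed and that
# the export carries PixelSpacing / SliceThickness at the top level; this is
# not the vendor's code or data format.
import numpy as np
import pydicom

ds = pydicom.dcmread("exported_volume.dcm")   # hypothetical file name
volume = ds.pixel_array                        # (frames, rows, cols) for a multi-frame object

# Voxel size in mm: in-plane spacing from PixelSpacing, slice spacing from SliceThickness.
row_mm, col_mm = (float(v) for v in ds.PixelSpacing)
slice_mm = float(ds.SliceThickness)
spacing = np.array([slice_mm, row_mm, col_mm])  # ordered as (frame, row, col)

def distance_mm(p0, p1):
    """Euclidean distance between two voxel indices (frame, row, col), scaled to millimetres."""
    return float(np.linalg.norm((np.asarray(p1) - np.asarray(p0)) * spacing))

print(distance_mm((10, 120, 96), (10, 180, 140)))
```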
Here's a summary of the acceptance criteria and study information for the GE Healthcare 4D-VIEW 9.1 device, based on the provided 510(k) summary:
The 4D-VIEW 9.1 device did not require clinical studies to support substantial equivalence. Therefore, there are no specific "acceptance criteria" or a "study that proves the device meets the acceptance criteria" in the traditional sense of a clinical performance study.
Instead, the submission focused on demonstrating substantial equivalence to predicate devices through non-clinical tests and technological similarity. The "acceptance criteria" are effectively that the device functions as intended, complies with voluntary standards, and has undergone appropriate quality assurance measures.
Here's the breakdown based on the provided documents:
1. Table of Acceptance Criteria and Reported Device Performance
| Acceptance Criteria (Implied) | Reported Device Performance |
|---|---|
| Functional Capabilities: | |
| - Display and editing of GE Ultrasound 3D/4D data sets | Primary operating function: display and editing of GE Ultrasound 3D/4D data sets. |
| - Measurements on the displayed image (incl. derived calculations) | Primary operating function: measurements on the displayed image, including derived calculations, all based on medical literature, in the following applications: Abdominal, Obstetrics, Gynecology, Cardiology, Urology, Vascular, Neurology, Small Parts, Pediatrics, Musculo-Skeletal (Orthopedics). The same measurements and calculations are available on the predicate devices. |
| - Data storage (image, measurement, patient data) | Primary operating function: data storage (image, measurement and patient data). |
| - Data transfer to/from remote systems (e.g. via DICOM) | Primary operating function: data transfer to and from remote systems (e.g. via DICOM). |
| - Adding annotations to acquired images | Primary operating function: adding annotations to the acquired image. |
| Compliance with Standards: | The 4D VIEW 9.1 and its applications comply with voluntary standards as detailed in Sections 9, 11 and 17 of the premarket submission. |
| Quality System Processes: | Quality assurance measures applied: risk analysis, requirements reviews, design reviews, performance testing (verification), and simulated use testing (validation). |
| Substantial Equivalence to Predicate Device(s): | The device employs the same fundamental scientific technology as its predicate devices (Viewpoint 5.0, K050943, and Voluson E8, K061682). GE Healthcare considers the 4D-VIEW 9.1 to be as safe and as effective as the predicate device(s), with substantially equivalent performance; the FDA's 510(k) clearance confirms this finding. |
| Operating System Compatibility: | Can be installed only on a PC running the Microsoft Windows Vista operating system. |
| Intended Use (per the Indications for Use statement): | Image display of GE Ultrasound 3D/4D data sets for diagnostic purposes, including measurements on the displayed image. The device's primary operating functions directly support this intended use by enabling display, editing, measurement, and data management of such data sets. |
2. Sample size used for the test set and the data provenance (e.g. country of origin of the data, retrospective or prospective)
- No specific clinical test set was used for a performance study. The submission states: "The subject of this premarket submission, 4D-View 9.1, did not require clinical studies to support substantial equivalence."
- "Performance testing (Verification)" and "Simulated use testing (Validation)" were conducted as part of quality assurance measures, but details regarding sample size, data provenance, or specific methodologies for these internal tests are not provided in the 510(k) summary. These are typically internal engineering and V&V activities rather than formal clinical studies.
3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts (e.g. radiologist with 10 years of experience)
- Not applicable. As no clinical studies were required or conducted, there was no independent test set requiring expert ground truth establishment for a performance evaluation.
4. Adjudication method (e.g. 2+1, 3+1, none) for the test set
- Not applicable. No clinical test set requiring adjudication was used.
5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done, what was the effect size of how much human readers improve with AI vs. without AI assistance?
- No MRMC comparative effectiveness study was done. The device 4D-VIEW 9.1 is a software product for viewing and measuring 3D/4D ultrasound data, not an AI-powered diagnostic tool for interpretation assistance in the context of comparative effectiveness studies against human readers.
6. If a standalone study (i.e. algorithm-only performance, without a human in the loop) was done
- No standalone performance study in the sense usually meant for AI algorithms is described. Although the device is described as a "standalone Software product," that refers to its installation model (software installed on a PC, independent of the ultrasound scanner), not to algorithm-only diagnostic performance. Its performance rests on accurately displaying, and calculating measurements from, GE Ultrasound 3D/4D data, and these functions were verified through internal performance testing and simulated use, as with conventional software. No "algorithm only" performance is reported in the context of a diagnostic accuracy study.
7. The type of ground truth used (expert consensus, pathology, outcomes data, etc.)
- No clinical ground truth (expert consensus, pathology, outcomes data) was used. The "ground truth" for the device's functions (e.g., measurement calculations, data display fidelity) would have been established internally through engineering specifications, known mathematical formulas, and comparison to outputs from the predicate devices or validated internal references during verification and validation testing.
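As an illustration of what validation against "known mathematical formulas" can mean for derived calculations: circumference measurements in ultrasound software are commonly reported from a fitted ellipse, whose perimeter can be checked against a standard closed-form approximation (Ramanujan's). The sketch below is purely illustrative; the semi-axis values are invented and nothing here is taken from the 4D View implementation.

```python
# Hedged illustration: checking a circumference calculation against a standard
# mathematical reference (Ramanujan's ellipse-perimeter approximation). The
# fitted semi-axes below are made-up values, not data from the device.
import math

def ellipse_circumference_mm(a_mm, b_mm):
    """Ramanujan's approximation for the perimeter of an ellipse with semi-axes a, b."""
    return math.pi * (3 * (a_mm + b_mm) - math.sqrt((3 * a_mm + b_mm) * (a_mm + 3 * b_mm)))

# Degenerate case a == b reduces to a circle, so the result should equal 2*pi*r.
r = 25.0
assert abs(ellipse_circumference_mm(r, r) - 2 * math.pi * r) < 1e-9

print(ellipse_circumference_mm(45.0, 35.0))  # e.g. a fitted circumference ellipse
```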
8. The sample size for the training set
- Not applicable. This device is a viewing and measurement software. It is not an AI/ML algorithm that requires a "training set" in the conventional sense. Its "knowledge" is embedded in its programming logic and algorithms derived from medical literature and engineering principles, rather than learned from a dataset.
9. How the ground truth for the training set was established
- Not applicable. As there is no training set, there is no ground truth establishment for a training set. The accuracy of its core functions (measurements, display) would be validated against established mathematical principles and reference standards during development and testing.