510(k) Data Aggregation
VALORY (21 days)
The VALORY system is a General Radiography X-ray imaging system used in hospitals, clinics and medical practices by radiographers, radiologists and physicists to make, process and view static X-ray radiographic images of the skeleton (including skull, spinal column and extremities), chest, abdomen and other body parts on adults and pediatric patients.
Applications can be performed with the patient in sitting, standing or lying position.
The system is not intended for use in Mammography applications.
VALORY is a solid-state X-ray system, a direct radiography (DR) system (product code MQB), intended to capture general radiographic images of the human body. VALORY is a ceiling-mounted stationary X-ray system with digital image capture that consists of a tube and operator console with a patient table and/or wall stand. VALORY uses Agfa's NX workstation with MUSICA2™ image processing and flat-panel detectors of the scintillator-photodetector type (Cesium Iodide, CsI) to capture and process the digital image.
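MUSICA's internal algorithm is proprietary and not described in the submission, but the general idea behind this kind of multiscale contrast processing can be sketched. The example below is an assumption-laden stand-in, not Agfa's implementation: it uses a difference-of-Gaussians band decomposition with a nonlinear per-band gain, and the function name, band scales (`sigmas`), and gain parameters are all illustrative.

```python
import numpy as np
from scipy import ndimage

def musica_like_enhance(img, sigmas=(1, 2, 4, 8), gain=1.6, p=0.7):
    """Difference-of-Gaussians band decomposition with a nonlinear per-band gain.
    With gain=1 and p=1 the bands recombine to the original image."""
    img = img.astype(float)
    blurred = [img] + [ndimage.gaussian_filter(img, s) for s in sigmas]
    bands = [blurred[i] - blurred[i + 1] for i in range(len(sigmas))]  # detail bands
    out = blurred[-1].copy()                                           # coarse residual
    for band in bands:
        m = np.abs(band).max() or 1.0
        # compress strong edges, boost subtle low-contrast detail
        out += gain * m * np.sign(band) * (np.abs(band) / m) ** p
    return out
```

Setting `gain=1` and `p=1` reproduces the input unchanged, which is a useful sanity check; values of `p` below 1 boost subtle detail relative to strong edges.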
The provided text describes the VALORY system, a general radiography X-ray imaging system, and its substantial equivalence to a predicate device (DR 600, K152639). However, it does not detail specific acceptance criteria for performance metrics (e.g., sensitivity, specificity, accuracy) or a study proving the device meets these criteria in the way a clinical performance study would.
Instead, the submission focuses on demonstrating substantial equivalence through:
- Comparison of technological characteristics.
- Bench testing for image quality and usability.
- Software verification and validation.
- Compliance with electrical safety, EMC, and radiation protection standards.
- Adherence to quality management systems and guidance documents.
Here's an attempt to extract and organize the requested information, noting where specific details are not provided in the text:
1. Table of Acceptance Criteria and Reported Device Performance
The submission does not specify quantitative acceptance criteria for image quality or clinical performance metrics (like sensitivity, specificity, or accuracy) in a traditional sense. The performance is assessed relative to the predicate device.
| Acceptance Criteria (Implicit) | Reported Device Performance |
|---|---|
| Image Quality: Equivalent to or better than the predicate device. | "Image quality validation testing was conducted using anthropomorphic adult and pediatric phantoms and evaluated by qualified internal experts and external radiographers. The radiographers evaluated the VALORY X-ray system with the DR 600 (predicate device, K152639) using XD 14 (K211790, pending 510(k) clearance) and DR 14s (K161368) flat-panel detectors comparing overall image quality. The test results indicated that the VALORY X-ray system has at least the same if not better image quality than the predicate device (DR 600 - K152639) and other flat-panel detectors currently on the market." "Additional image quality validation testing for NX 23 was completed in scope of the DX-D Imaging Package with XD Detectors and included a full range of GenRad image processing applications compared to MUSICA 2 image processing using anonymized adult and pediatric phantoms and read by eight internal experts." |
| Usability: Meets safety and workflow requirements. | "Usability evaluations for VALORY were conducted with external radiographers. The usability studies evaluated overall product safety, including workflow functionality for adults and pediatric patients, system movements, information and support for components. The results of the usability tests, fulfillment of the validation acceptance criteria, and assessment of remaining defects support VALORY passing usability validation testing." |
| Software Safety: Acceptable risk profile. | "Software verification testing for NX 23 was completed... For the NX 23 (NX Orion) software there are a total of 535 risks in the broadly acceptable region and 37 risks in the ALARP region with only four of these risks identified. Zero risks were identified in the Not Acceptable Region. Therefore, the device is assumed to be safe, the benefits of the device are assumed to outweigh the residual risk." |
| Compliance: Meets relevant electrical, EMC, and radiation standards. | VALORY is compliant with FDA Subchapter J mandated performance standards 21 CFR 1020.30 and 1020.31. It is also compliant with the IEC 60601 series, ISO 13485, ISO 14971, DICOM, IEC 62366-1, and IEC 62304. |
| Functional Equivalence: Same intended use as predicate. | The VALORY system has indications for use that are consistent with and substantially equivalent to those of the legally marketed predicate device (K152639). |
2. Sample Size Used for the Test Set and Data Provenance
The "test set" consisted of:
- Anthropomorphic adult and pediatric phantoms.
- The exact number of phantoms or images used in these tests is not stated.
- The country of origin of the data is not specified; it is implied to be laboratory-generated phantom data, as "No clinical trials were performed in the development of the device. No animal or clinical studies were performed in the development of the new device."
- The studies were prospective in the sense that they were conducted specifically for this submission, rather than drawing on existing clinical patient data.
3. Number of Experts Used to Establish the Ground Truth for the Test Set and Their Qualifications
- Image Quality Evaluation: "qualified internal experts and external radiographers" evaluated images from phantoms compared to the predicate device. For NX 23 testing, "eight internal experts" read images. Specific qualifications (e.g., years of experience) for these experts are not provided.
- Usability Evaluation: "external radiographers." Specific numbers and qualifications are not provided.
4. Adjudication Method for the Test Set
The document does not explicitly describe an adjudication method for the expert evaluations, such as 2+1 or 3+1. It states experts "evaluated" and "read" the images, implying their findings were used as the basis for comparison.
5. If a Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study Was Done, and the Effect Size of How Much Human Readers Improve with AI vs. Without AI Assistance
- No MRMC comparative effectiveness study was done.
- The device is a general radiography X-ray imaging system, not explicitly an AI-assisted diagnostic device for specific disease detection. Its image processing (MUSICA/MUSICA2/MUSICA3) is an established technology from the predicate device and other Agfa systems. The study focuses on demonstrating image quality equivalence, not on reader performance improvement with AI assistance.
6. If a Standalone (i.e., Algorithm Only Without Human-in-the-Loop Performance) Was Done
- The VALORY system itself is an imaging system, not a standalone diagnostic algorithm.
- The software risk assessment for NX 23 (the workstation software) involved evaluating risks without explicit human interaction for every risk listed, focusing on software failure modes. However, this is not a diagnostic "algorithm only without human-in-the-loop performance" in the sense of an AI CAD system. The image processing algorithms within MUSICA are an integral part of the system's image generation, not a separate diagnostic interpretation.
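The risk counts quoted in the table above (535 broadly acceptable, 37 ALARP, zero not acceptable) refer to the acceptability regions of an ISO 14971-style risk analysis. The submission does not disclose Agfa's actual risk matrix or region boundaries, so the following is only a hypothetical sketch of how residual risks might be binned into those regions:

```python
from collections import Counter

# Hypothetical 5x5 probability x severity scoring; the real boundaries are not disclosed.
def region(probability, severity):
    score = probability + severity            # both rated 1 (low) to 5 (high)
    if score >= 8:
        return "not acceptable"
    if score >= 5:
        return "ALARP"                        # as low as reasonably practicable
    return "broadly acceptable"

def summarize(risks):
    """risks: iterable of (probability, severity) pairs for residual risks."""
    return Counter(region(p, s) for p, s in risks)

print(summarize([(1, 2), (2, 4), (1, 1)]))    # Counter({'broadly acceptable': 2, 'ALARP': 1})
```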
7. The Type of Ground Truth Used
- Image Quality: The ground truth for image quality evaluation was based on the performance of the predicate device (DR 600) and other marketed flat-panel detectors, combined with expert evaluation of images from anthropomorphic phantoms. It's a comparative assessment rather than a definitive "ground truth" of a disease state.
- Usability: Ground truth for usability was established through the feedback and performance of external radiographers during usability studies, ensuring the system met workflow and safety requirements.
- Software/Safety: Ground truth for software safety and compliance was against established standards (IEC, ISO, FDA regulations) and internal risk analyses.
8. The Sample Size for the Training Set
The document makes no mention of a "training set" in the context of machine learning or AI models. The device is being cleared based on substantial equivalence to an existing X-ray system, not as a novel AI diagnostic algorithm that requires a distinct training dataset. The "image processing algorithms" mentioned are presented as existing, cleared technologies from previous Agfa devices.
9. How the Ground Truth for the Training Set Was Established
Since there is no mention of a distinct training set for an AI model, this information is not applicable and not provided in the document.
DR 800 with DSA (53 days)
The DR 800 with DSA system is indicated for performing dynamic imaging examinations (fluoroscopy and/or rapid sequence) of the following anatomies/procedures:
- Positioning fluoroscopy procedures
- Gastro-intestinal examinations
- Urogenital tract examinations
- Angiography
- Digital Subtraction Angiography
It is intended to replace fluoroscopic images obtained through image intensifier technology. In addition, the system is intended for projection radiography of all body parts.
In addition, the system provides the Agfa Tomosynthesis option, which is intended to acquire tomographic slices of human anatomy and to be used with Agfa DR X-ray systems. Digital Tomosynthesis is used to synthesize tomographic slices from a single tomographic sweep.
Not intended for cardiovascular and cerebrovascular contrast studies. Not intended for mammography applications.
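The submission does not describe Agfa's tomosynthesis reconstruction, but the basic idea of synthesizing slices from a single sweep can be illustrated with a simple shift-and-add scheme. The geometry, parameter names, and 1 mm pixel assumption below are illustrative only, not the actual algorithm:

```python
import numpy as np

def shift_and_add(projections, source_offsets_mm, slice_height_mm, sid_mm, pixel_mm=1.0):
    """projections: (N, H, W) array of sweep exposures; source_offsets_mm: lateral tube
    positions; slice_height_mm: height of the plane to bring into focus above the detector;
    sid_mm: source-to-detector distance."""
    recon = np.zeros(projections[0].shape, dtype=float)
    for proj, x in zip(projections, source_offsets_mm):
        # lateral shift that registers the chosen plane across all exposures
        shift_px = int(round(x * slice_height_mm / (sid_mm - slice_height_mm) / pixel_mm))
        recon += np.roll(proj.astype(float), shift_px, axis=1)
    # structures in the chosen plane reinforce; everything else blurs out
    return recon / len(projections)
```

Varying `slice_height_mm` brings different planes into focus from the same set of projections, which is what is meant by synthesizing tomographic slices from a single sweep.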
Agfa's DR 800 with DSA medical device is a fluoroscopic X-ray system that includes digital angiography (product code JAA), intended to capture tomographic, static and dynamic images of the human body. The DR 800 is a floor-mounted radiographic, fluoroscopic and tomographic system that consists of a tube and operator console with a motorized tilting patient table, FLFS overlay and bucky, and an optional wall stand and ceiling suspension. The new device uses Agfa's NX workstation with MUSICA image processing and flat-panel detectors for digital, wide dynamic range and angiographic image capture. It is capable of replacing other direct radiography, tomography, image intensifier tubes and TV cameras, including computed radiography systems with conventional or phosphor film cassettes.
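The core DSA step itself can be sketched schematically (this is not Agfa's implementation): a pre-contrast "mask" frame is subtracted from each contrast-filled frame in the log domain, so static anatomy cancels and the iodine-filled vessels remain.

```python
import numpy as np

def dsa_series(mask, contrast_frames, eps=1e-6):
    """mask: (H, W) pre-contrast frame; contrast_frames: (N, H, W) frames with iodine."""
    log_mask = np.log(mask.astype(float) + eps)
    # in the log domain, static anatomy cancels; iodine attenuation appears as positive signal
    return np.stack([log_mask - np.log(f.astype(float) + eps) for f in contrast_frames])
```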
This submission is to add the newest version of the DR 800 with Digital Subtraction Angiography (DSA) to Agfa's radiography portfolio.
Here's an analysis of the acceptance criteria and study information for the Agfa DR 800 with DSA, based on the provided text:
1. Table of Acceptance Criteria and Reported Device Performance
The provided document does not explicitly list quantitative acceptance criteria in a table format for performance metrics. Instead, it describes a more qualitative approach, focusing on equivalence to predicate devices and confirmation through expert evaluation.
| Acceptance Criteria (Inferred from text) | Reported Device Performance |
|---|---|
| Bench Testing (General Performance) | "Technical and acceptance testing was completed on the DR 800 with DSA in order to confirm the medical device functions and performs as intended. All deviations or variances are documented in a defect database and addressed in the CRD documentation and verified. All mitigations have been tested and passed. All design input requirements have been tested and passed. All planned verification activities have been successfully completed." |
| Functionality and Usability | "Performance functionality and usability evaluations were conducted with qualified experts. The results of these tests fell within the acceptance criteria for the DR 800 with DSA; therefore, the DR 800 supports GenRad, Full Leg/Full Spine (FLFS), roadmapping and Digital Subtraction Angiography (DSA) workflow." |
| Clinical Image Quality (DSA) | "Clinical image validation was conducted using anthropomorphic phantoms and evaluated by qualified experts. The radiographers evaluated the DSA image quality on the DR 800 by comparing overall image quality with the primary predicate A device (K190373). Diagnostic confidence for DSA image quality and roadmapping on the DR 800 was between good and excellent." The document also states, "Clinical image quality evaluation is not essential in establishing substantial equivalence for the DR 800 with DSA. Adequate Bench Testing results should be sufficient to determine device safety and effectiveness." This indicates that while performed, it wasn't a strict acceptance criterion in the same vein as quantitative safety/effectiveness thresholds. |
| Software Verification & Validation (Safety/Risk) | "The complete device has been certified and validated. During the final risk analysis meeting, the risk management team concluded that the medical risk is no greater than with conventional x-ray film previously released to the field." "For the NX 23 (NX Orion) software there are a total of 535 risks in the broadly acceptable region and 37 risks in the ALARP region with only four of these risks identified. Zero risks were identified in the Not Acceptable Region. Therefore, the device is assumed to be safe, the benefits of the device are assumed to outweigh the residual risk." |
| Electrical Safety and Electromagnetic Compatibility (EMC) Testing | The device is compliant with IEC 60601-1, IEC 60601-1-2, IEC 60601-1-3, and IEC 60601-2-54. The DR 800 is also compliant with FDA Subchapter J mandated performance standards 21 CFR 1020.30 - 1020.32. |
| Quality Management, Risk Management, DICOM, Usability Engineering | The company's in-house procedures conform to ISO 13485, ISO 14971, ACR/NEMA PS3.1-3.20 (DICOM), and IEC 62366-1. (This implies compliance with these standards as part of overall acceptance.) |
2. Sample Size Used for the Test Set and Data Provenance
- Test Set Sample Size: Not explicitly stated in terms of number of images or cases. The document mentions "anthropomorphic phantoms" for clinical image validation.
- Data Provenance: The study used "anthropomorphic phantoms," which are physical models designed to simulate human anatomy for imaging purposes. This indicates a laboratory/phantom study rather than real patient data. The country of origin for the phantom data is not specified, but the submission is from Agfa N.V. (Belgium). It is a prospective study in the sense that the new device was evaluated with these phantoms.
3. Number of Experts Used to Establish the Ground Truth for the Test Set and Qualifications of Those Experts
- Number of Experts: Not explicitly stated. The document mentions "qualified experts" and "radiographers."
- Qualifications of Experts: Described as "qualified experts" and "radiographers." No specific experience levels (e.g., "10 years of experience") are provided.
4. Adjudication Method for the Test Set
Not specified. The document states "evaluated by qualified experts" and "radiographers evaluated...by comparing overall image quality with the primary predicate A device," implying a comparative evaluation rather than a strict adjudication process for ground truth establishment.
5. If a Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study Was Done, and the Effect Size of How Much Human Readers Improve with AI vs. Without AI Assistance
No, a Multi Reader Multi Case (MRMC) comparative effectiveness study was not conducted. This is not an AI-assisted diagnostic device; it's a conventional X-ray system with digital image processing and DSA capabilities. The study compared the device's image quality to a predicate device, focusing on equivalence, not human reader improvement with AI.
6. If a Standalone (i.e., algorithm only without human-in-the-loop performance) Was Done
Yes, in essence: the "Bench Testing" and "Software Verification and Validation Testing" sections describe standalone performance evaluations of the device's functions and image processing algorithms. The "Clinical image validation" with phantoms likewise assesses the device's output (image quality) rather than human reader performance in a diagnostic workflow in which the human would be the ultimate decision-maker.
7. The Type of Ground Truth Used
The "ground truth" for the image quality evaluation was based on expert comparison and qualitative assessment of images produced by the device, specifically assessing "diagnostic confidence for DSA image quality and roadmapping" as "between good and excellent" when compared to a predicate device. This is primarily an expert consensus on image quality rather than pathology, clinical outcomes, or a gold standard.
8. The Sample Size for the Training Set
Not applicable. This device is an X-ray imaging system, not a machine learning or AI algorithm that requires a training set of data. The image processing algorithms are described as being "similar to those previously cleared" or "similar to the primary predicate device."
9. How the Ground Truth for the Training Set Was Established
Not applicable, as this device does not utilize a machine learning model that would require a ground truth for a training set.