
510(k) Data Aggregation

    K Number: K221803
    Manufacturer:
    Date Cleared: 2022-07-18 (26 days)
    Product Code:
    Regulation Number: 892.1720
    Reference Devices: K151465, K172793, K210619

    Intended Use

    This is a digital mobile diagnostic x-ray system intended for use by a qualified/trained doctor or technician on both adult and pediatric subjects for taking diagnostic radiographic exposures of the skull, spinal column, chest, abdomen, extremities, and other body parts. Applications can be performed with the patient sitting, standing, or lying in the prone or supine position. Not for mammography.

    Device Description

    This is a modified version of our previous predicate mobile PHOENIX. The predicate PHOENIX mobile is interfaced with Konica-Minolta Digital X-ray panels and CS-7 or Ultra image acquisition software. PHOENIX mobile systems will be marketed in the USA by KONICA MINOLTA. Models with the CS-7 software will be marketed as AeroDR TX m01; models with the Ultra software will be marketed as mKDR Xpress. The modification adds two new models of compatible Konica-Minolta digital panels, the AeroDR P-65 and AeroDR P-75, cleared in K210619. These newly compatible models are capable of a mode called DDR (Dynamic Digital Radiography), wherein a series of radiographic exposures can be rapidly acquired at up to 15 frames per second, to a maximum of 300 frames.
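The stated limits imply a hard cap on series length: at the maximum 15 frames per second, the 300-frame ceiling is reached after 20 seconds. A minimal sketch of that arithmetic (the constant and function names are hypothetical illustrations, not from the submission):

```python
# DDR acquisition limits as stated above: up to 15 frames/s, 300 frames/series.
MAX_FPS = 15
MAX_FRAMES = 300

def ddr_frame_count(frame_rate, duration_s):
    """Frames captured in one DDR series, clamped to the 300-frame cap."""
    if not 1 <= frame_rate <= MAX_FPS:
        raise ValueError("frame rate outside supported range")
    return min(round(frame_rate * duration_s), MAX_FRAMES)
```

At 15 fps a 20-second acquisition hits the cap exactly: `ddr_frame_count(15, 20)` returns 300, and any longer duration is clamped to the same value.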

    AI/ML Overview

    The provided text describes a 510(k) premarket notification for a mobile x-ray system. The document focuses on demonstrating substantial equivalence to a legally marketed predicate device rather than presenting a study to prove the device meets specific performance-based acceptance criteria for an AI/algorithm.

    Therefore, many of the requested details, such as specific acceptance criteria for algorithm performance, sample sizes for test sets, expert ground truth establishment, MRMC studies, or standalone algorithm performance, are not applicable or not present in this type of submission.

    The essence of this submission is that the entire mobile x-ray system, including its components (generator, panels, software), is deemed safe and effective because it is substantially equivalent to a previously cleared device, with only minor modifications (adding two new compatible digital panels and enabling a DDR function in the software, which is stated to be "unchanged firmware" and "moderate level of concern").

    Here's an attempt to address your questions based on the provided text, while acknowledging that many of them pertain to AI/algorithm performance studies, which are not the focus of this 510(k):

    1. A table of acceptance criteria and the reported device performance

    The document does not specify performance-based acceptance criteria for an AI/algorithm. Instead, it demonstrates substantial equivalence to a predicate device by comparing technical specifications. The "acceptance criteria" in this context are implicitly met if the new device's specifications (kW rating, kV range, mA range, collimator, power source, panel interface, image area sizes, pixel sizes, resolutions, MTF, DQE) are equivalent to or improve upon the predicate, and it remains compliant with relevant international standards.
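The implicit pass/fail logic described here can be sketched as a simple specification comparison: a characteristic passes if it is identical to the predicate's, unless it is an expressly permitted change (such as the software gaining the DDR mode). The function and field names below are hypothetical illustrations, not part of the submission:

```python
# Hedged sketch of the implicit substantial-equivalence check: compare each
# predicate characteristic against the proposed device, allowing only the
# expressly permitted changes to differ.
def substantially_equivalent(predicate, proposed, permitted_changes=()):
    failures = [name for name, value in predicate.items()
                if proposed.get(name) != value and name not in permitted_changes]
    return len(failures) == 0, failures
```

For example, a device identical to the predicate except for a permitted software change would pass, while an unpermitted change to the generator rating would be reported as a failure.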

    | Characteristic | Predicate: K212291 PHOENIX | PHOENIX/AeroDR TX m01 and PHOENIX/mKDR Xpress | Acceptance Criterion (Implicit) | Reported Performance |
    | --- | --- | --- | --- | --- |
    | Indications for Use | Digital mobile diagnostic x-ray for adults/pediatrics; skull, spine, chest, abdomen, extremities. Not for mammography. | SAME | Must be identical to predicate. | SAME (Identical) |
    | Configuration | Mobile system with digital x-ray panel and image acquisition computer | SAME | Must be identical to predicate. | SAME (Identical) |
    | X-ray Generator(s) | kW: 20, 32, 40, 50 kW; kV: 40-150 kV (1 kV steps); mA: 10-650 mA | SAME | Must be identical to predicate. | SAME (Identical) |
    | Collimator | Ralco R108F | SAME | Must be identical to predicate. | SAME (Identical) |
    | Meets US Performance Standard | YES, 21 CFR 1020.30 | SAME | Must meet this standard. | YES (Identical) |
    | Power Source | Universal, 100-240 V~, 1 phase, 1.2 kVA | SAME | Must be identical to predicate. | SAME (Identical) |
    | Software | Konica-Minolta CS-7 or Ultra | CS-7 and Ultra modified for DDR mode | Functions must be equivalent/improved; DDR enabled. | CS-7 and Ultra modified for DDR mode |
    | Panel Interface | Ethernet or Wi-Fi wireless | SAME | Must be identical to predicate. | SAME (Identical) |
    | Image Area Sizes (Panels) | Listed AeroDR P-series | Listed AeroDR P-series + P-65, P-75 | Expanded range must be compatible and cleared. | Expanded range compatible, previously cleared. |
    | Pixel Sizes (Panels) | Listed AeroDR P-series | Listed AeroDR P-series + P-65, P-75 | Expanded range must be compatible and cleared. | Expanded range compatible, previously cleared. |
    | Resolutions (Panels) | Listed AeroDR P-series | Listed AeroDR P-series + P-65, P-75 | Expanded range must be compatible and cleared. | Expanded range compatible, previously cleared. |
    | MTF (Panels) | Listed AeroDR P-series | Listed AeroDR P-series + P-65, P-75 | Performance must be equivalent or better. | P-65: 0.62 (non-binning), 0.58 (2x2 binning); P-75: 0.62 (non-binning), 0.58 (2x2 binning) |
    | DQE (Panels) | Listed AeroDR P-series | Listed AeroDR P-series + P-65, P-75 | Performance must be equivalent or better. | P-65: 0.56 @ 1 lp/mm; P-75: 0.56 @ 1 lp/mm |
    | Compliance Standards | N/A | IEC 60601-1, -1-2, -1-3, -2-54, -2-28, -1-6; IEC 62304 | Must meet relevant international safety standards. | Meets all listed IEC standards. |
    | Diagnostic Quality Images | N/A | Produced diagnostic quality images as good as predicate | Must produce images of equivalent diagnostic quality. | Verified |
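The MTF figures reported above (e.g., 0.62 non-binning) are detector measurements supplied by the manufacturer; the document does not describe how they were computed. As background, MTF is conventionally derived from an edge image: differentiate the edge spread function (ESF) to obtain the line spread function (LSF), then take the normalized magnitude of its Fourier transform. A self-contained sketch of that pipeline on synthetic data (an illustration only, not the full IEC 62220-1 slanted-edge procedure):

```python
import math

def mtf_from_esf(esf, pixel_pitch_mm):
    """Modulation transfer function from a sampled edge spread function.

    esf: 1-D samples across an edge; pixel_pitch_mm: sample spacing in mm.
    Returns (frequencies in cycles/mm, normalized MTF values).
    """
    # Line spread function: discrete derivative of the ESF.
    lsf = [esf[i + 1] - esf[i] for i in range(len(esf) - 1)]
    n = len(lsf)

    def dft_mag(k):  # magnitude of the k-th DFT coefficient of the LSF
        re = sum(lsf[i] * math.cos(2 * math.pi * k * i / n) for i in range(n))
        im = sum(lsf[i] * math.sin(2 * math.pi * k * i / n) for i in range(n))
        return math.hypot(re, im)

    dc = dft_mag(0)  # zero-frequency value used for normalization
    freqs = [k / (n * pixel_pitch_mm) for k in range(n // 2)]
    mtf = [dft_mag(k) / dc for k in range(n // 2)]
    return freqs, mtf

# A perfectly sharp edge has a delta-function LSF, so its MTF stays at 1.0;
# any blur (a wider LSF) pulls the MTF below 1 at higher frequencies.
freqs, mtf = mtf_from_esf([0.0] * 8 + [1.0] * 8, 0.175)
```

With a blurred edge in place of the ideal step, the returned MTF falls off with frequency, which is the behavior the tabulated 0.62/0.58 values summarize at specific frequencies.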

    2. Sample size used for the test set and the data provenance

    No specific test set or data provenance (country, retrospective/prospective) is mentioned for AI/algorithm performance. The "testing" involved "bench and non-clinical tests" to verify proper system operation and safety, and that the modified combination of components produced diagnostic quality images "as good as our predicate generator/panel combination." This implies physical testing of the device rather than a dataset for algorithm evaluation.

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts

    Not applicable. There was no specific test set requiring expert-established ground truth for an AI/algorithm evaluation. The determination of "diagnostic quality images" likely involved internal assessment by qualified personnel within the manufacturer's testing process.

    4. Adjudication method (e.g. 2+1, 3+1, none) for the test set

    Not applicable. No adjudication method is described as there was no formal expert-read test set for algorithm performance.

    5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done, and if so, the effect size of human reader improvement with vs. without AI assistance

    No. An MRMC study was not conducted as this submission is not about an AI-assisted diagnostic tool designed to improve human reader performance. It is for a mobile x-ray system.

    6. If a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done

    No. This submission is for a medical device (mobile x-ray system), not a standalone AI algorithm. The software components (CS-7 and Ultra) are part of the image acquisition process, and the only software "modification" mentioned is enabling the DDR function, which is a feature of the new panels, not an AI for diagnosis.

    7. The type of ground truth used (expert consensus, pathology, outcomes data, etc)

    Not applicable. The substantial equivalence argument relies on comparing technical specifications and demonstrating that the physical device produces images of "diagnostic quality" equivalent to the predicate, rather than an AI producing diagnostic outputs against a specific ground truth.

    8. The sample size for the training set

    Not applicable. This is not an AI/ML algorithm submission requiring a training set. The software components are for image acquisition and processing, not for AI model training.

    9. How the ground truth for the training set was established

    Not applicable, as no training set for an AI/ML algorithm was used or mentioned.


    K Number: K212291
    Device Name: PHOENIX
    Manufacturer:
    Date Cleared: 2021-09-14 (54 days)
    Product Code:
    Regulation Number: 892.1720
    Reference Devices: K151465, K172793

    Intended Use

    This is a digital mobile diagnostic x-ray system intended for use by a qualified/trained doctor or technician on both adult and pediatric subjects for taking diagnostic radiographic exposures of the skull, spinal column, chest, abdomen, extremities, and other body parts. Applications can be performed with the patient sitting, standing, or lying in the prone or supine position. Not for mammography.

    Device Description

    This is a new type of our previous predicate mobile PhoeniX. The predicate PhoeniX mobile is interfaced with Canon Digital X-ray panels and Canon control software CXDI-NE. The new PhoeniX mobile is interfaced with Konica-Minolta Digital X-ray panels and CS-7 or Ultra image acquisition software. PhoeniX mobile systems will be marketed in the USA by KONICA MINOLTA. Models with the CS-7 software will be marketed as AeroDR Tran-X; models with the Ultra software will be marketed as mKDR II. The compatible digital receptor panels are the same for either model. The CS-7 software was cleared under K151465/K172793, while the Ultra software is new. The CS-7 is a DIRECT DIGITIZER used with an image diagnosis device, medical imaging device, and image storage device connected via the network; it digitally processes patient images collected by the medical imaging device to provide image and patient information. By contrast, the Ultra-DR software is designed as an exam-based modality image acquisition tool. Ultra-DR software and its accompanying Universal Acquisition Interface (UAI) were developed to be acquisition-device independent. Basic features of the software include Modality Worklist Management (MWM)/Modality Worklist (MWL) support, DICOM Send, CD Burn, DICOM Print, and Exam Procedure Mapping. Ultra software is designed to increase patient throughput while minimizing data-input errors. Ultra is made up of multiple components that increase efficiency while minimizing errors; the main components are the Worklist, the Acquisition Interface, and the Configuration Utility. These components combine to create a stable, powerful, and customizable image capture system. The intuitive graphical user interface is designed to improve radiologic technologist accuracy and image quality. The Worklist and Exam screens were developed to allow site-specific customizations that integrate seamlessly into existing practice workflows.
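The Modality Worklist support mentioned above follows the standard DICOM MWL C-FIND pattern: the acquisition console sends an identifier in which requested-return keys are empty and matching keys are filled, and the RIS answers with the scheduled procedures. A hedged sketch of such an identifier, using standard DICOM tag numbers (this is an illustration of the protocol pattern, not Konica Minolta's implementation; the helper name and values are hypothetical):

```python
# Illustrative DICOM Modality Worklist (MWL) C-FIND identifier. Tag numbers
# are standard DICOM; empty string values are requested return keys, filled
# values are matching keys.
def build_mwl_query(station_ae_title, date_yyyymmdd):
    return {
        (0x0008, 0x0050): "",            # Accession Number: requested return key
        (0x0010, 0x0010): "",            # Patient Name: requested return key
        (0x0010, 0x0020): "",            # Patient ID: requested return key
        (0x0040, 0x0100): [{             # Scheduled Procedure Step Sequence
            (0x0008, 0x0060): "DX",              # Modality: digital radiography
            (0x0040, 0x0001): station_ae_title,  # Scheduled Station AE Title
            (0x0040, 0x0002): date_yyyymmdd,     # Scheduled Proc. Step Start Date
        }],
    }
```

In a real implementation the identifier would be a DICOM dataset sent over a network association (e.g., with a library such as pynetdicom), but the shape of the matching and return keys is the same.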

    AI/ML Overview

    Here's an analysis of the acceptance criteria and study information for the PHOENIX Digital Mobile Diagnostic X-Ray System, based on the provided text.

    Based on the provided document, the PHOENIX device is a digital mobile diagnostic x-ray system, and the submission is for a modification to an existing cleared device (K192011 PHOENIX). The "study" described is primarily non-clinical bench testing to demonstrate that the modified system, with new digital flat-panel detectors (AeroDR series) and new acquisition software (Ultra), is as safe and effective as the predicate device. No clinical study information is provided in this document.

    1. Table of Acceptance Criteria and Reported Device Performance

    The document does not explicitly state "acceptance criteria" in a quantitative, measurable sense for the overall device performance. Instead, it focuses on demonstrating substantial equivalence to a predicate device. The comparison is primarily in the form of feature similarity and compliance with international standards for safety and electrical performance.

    | Characteristic | Predicate (K192011 PHOENIX) | PHOENIX (Proposed) | Comparison of Performance |
    | --- | --- | --- | --- |
    | Indications for Use | Intended for use by a qualified/trained doctor or technician on both adult and pediatric subjects for taking diagnostic radiographic exposures of the skull, spinal column, chest, abdomen, extremities, and other body parts. Applications can be performed with the patient sitting, standing, or lying in the prone or supine position. Not for mammography. | SAME (includes device description as requested by FDA) | Met: Indications for use are identical, signifying no change in intended clinical application. |
    | Configuration | Mobile system with digital x-ray panel and image acquisition computer | SAME | Met: Basic physical configuration remains unchanged. |
    | X-ray Generator(s) | kW rating: 20, 32, 40, and 50 kW; kV range: 40-150 kV in 1 kV steps; mA range: 10 mA to 630/640/650 mA | SAME | Met: The X-ray generator specifications are identical, ensuring consistent radiation output characteristics. |
    | Collimator | Ralco R108F | Ralco R108F | Met: The collimator model is identical, ensuring consistent radiation field shaping. |
    | Meets US Performance Standard | YES, 21 CFR 1020.30 | SAME | Met: Compliance with the US performance standard for diagnostic X-ray systems is maintained. |
    | Power Source | Universal power supply, 100 V~ to 240 V~, 1 phase, 1.2 kVA | SAME | Met: Power supply specifications are identical. |
    | Software | Canon control software CXDI-NE | Konica-Minolta control software CS-7 (K151465 or K172793) OR Konica-Minolta control software Ultra | Met (by validation): New software (Ultra) validated according to FDA guidance; CS-7 was previously reviewed. This is a key change, and compliance is asserted through specific software validation. |
    | Panel Interface | Ethernet or Wi-Fi wireless | SAME | Met: Interface method is unchanged. |
    | Image Area Sizes (Detectors) | Canon CXDI-401C 16" x 17", CXDI-701C 14" x 17", CXDI-801C 11" x 14", CXDI-710C 14" x 17", CXDI-810C 14" x 11", CXDI-410C 17" x 17" | AeroDR P-51 14" x 17", P-52 14" x 17", P-61 14" x 17", P-71 17" x 17", P-81 10" x 12" (similar range of sizes, all previously cleared) | Met (by equivalence): The new detectors offer a similar range of sizes and are all previously cleared by FDA, implying their performance characteristics within those sizes are acceptable. |
    | Pixel Sizes (Detectors) | Canon CXDI (all 125 µm) | AeroDR P-51 175 µm, P-52 175 µm, P-61 100/200 µm, P-71 100/200 µm | Met (by equivalence): The new pixel sizes differ but belong to previously cleared detectors, implying their diagnostic utility is acceptable; the comparison asserts detector equivalence rather than a direct pixel-size match. |
    | Resolutions (Detectors) | Canon CXDI (various, e.g., CXDI-401C 3320 x 3408 pixels) | AeroDR P-51 1994 x 2430, P-52 1994 x 2430, P-61 3488 x 4256, P-71 4248 x 4248, P-81 2456 x 2968 pixels | Met (by equivalence): Resolutions differ but belong to previously cleared detectors; diagnostic equivalence is asserted via the detectors' prior clearance. |
    | MTF (Detectors) | Canon CXDI (all 0.35 @ 2 cy/mm) | AeroDR P-51, P-52, P-61, P-71, P-81: all 0.30 @ 2 cy/mm | Met (by equivalence): The new detectors have slightly lower MTF at 2 cy/mm, but they are previously cleared, implying acceptable image quality for diagnostic use. |
    | DQE (Detectors) | Canon CXDI (all 0.6 @ 0 lp/mm) | AeroDR P-51 0.62 @ 0 lp/mm, P-52 0.62 @ 0 lp/mm, P-61 0.56 @ 1 lp/mm, P-71 0.56 @ 1 lp/mm, P-81 0.56 @ 1 lp/mm | Met (by equivalence): DQE values differ but belong to previously cleared detectors; some are higher, some slightly lower (and quoted at different frequencies). The key is the previously cleared status. |
    | Compliance with Standards | N/A (implied by predicate clearance) | IEC 60601-1:2005+A1:2012, IEC 60601-1-2:2014, IEC 60601-1-3:2008+A1:2013, IEC 60601-2-54:2009+A1:2015, IEC 60601-2-28:2010, IEC 60601-1-6:2010+A1:2013, IEC 62304:2006+A1:2016 | Met: Device tested and found compliant with these international standards for safety and essential performance. |
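One way to relate the differing pixel sizes in the comparison to resolution is the Nyquist limit: a detector with pixel pitch p mm cannot represent spatial frequencies above 1/(2p) line pairs/mm. A small sketch of that arithmetic (illustrative background only; the submission itself compares MTF/DQE figures rather than Nyquist limits):

```python
def nyquist_lp_per_mm(pixel_pitch_um):
    """Nyquist limit (line pairs/mm) for a detector with the given pixel pitch."""
    pitch_mm = pixel_pitch_um / 1000.0
    return 1.0 / (2.0 * pitch_mm)

# Predicate Canon panels (125 um) vs. two AeroDR pitches from the comparison.
limits = {pitch: nyquist_lp_per_mm(pitch) for pitch in (100, 125, 175)}
```

This gives 5.0 lp/mm at 100 µm, 4.0 lp/mm at 125 µm, and about 2.9 lp/mm at 175 µm, which is why the MTF comparison at 2 cy/mm remains meaningful across all of these pitches.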

    Summary of the "Study" Proving Acceptance Criteria

    The study described is a non-clinical, bench testing-based assessment for demonstrating substantial equivalence rather than a clinical study measuring diagnostic performance outcomes.

    The core argument for substantial equivalence is based on:

    1. Identical Indications for Use.
    2. Identical platform (mobile system, generator, collimator, power source).
    3. Replacement of components (detectors and acquisition software) with components that are either:
      • Previously FDA cleared (AeroDR detectors, CS-7 software).
      • Validated according to FDA guidance (Ultra software).
    4. Compliance with recognized international standards for medical electrical equipment.

    Here are the specific details requested:

    2. Sample Size Used for the Test Set and Data Provenance

    • Sample Size for Test Set: Not applicable. No patient-level test set data is mentioned for testing diagnostic performance. The "test set" consisted of physical devices (systems covering all generator/panel combinations) for bench testing and software for validation.
    • Data Provenance: Not applicable for a clinical test set. The testing was non-clinical bench testing. The detectors themselves (AeroDR) are stated to have been "previously cleared" by the FDA, implying their performance was established via other submissions, likely including data from various countries consistent with regulatory submissions.

    3. Number of Experts Used to Establish the Ground Truth for the Test Set and Qualifications of those Experts

    • Not applicable. There was no clinical test set requiring expert ground truth establishment for diagnostic accuracy.

    4. Adjudication Method for the Test Set

    • Not applicable. There was no clinical test set requiring adjudication.

    5. If a Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study was done

    • No, an MRMC comparative effectiveness study was not done. The document explicitly states: "Clinical testing was not required to establish substantial equivalence because all digital x-ray receptor panels have had previous FDA clearance."
    • Effect size of human readers improvement: Not applicable, as no such study was performed.

    6. If a Standalone (i.e., Algorithm-Only, Without Human-in-the-Loop) Performance Study Was Done

    • Yes, in spirit, for the software component. The new image acquisition software (Ultra) was validated according to the "FDA Guidance: Guidance for the Content of Premarket Submissions for Software Contained in Medical Devices." This validation assesses the software's functionality and performance as a standalone component within the system, ensuring it correctly manages workflow, acquires images, and processes them. However, this is software validation, not a standalone diagnostic performance study in the context of an AI algorithm producing diagnostic outputs.

    7. The Type of Ground Truth Used

    • For the overall device: Substantial equivalence to a legally marketed predicate device (K192011 PHOENIX), which itself would have demonstrated safety and effectiveness.
    • For the components (detectors): Prior FDA clearance of the Konica-Minolta AeroDR panels served as the "ground truth" for their imaging characteristics (MTF, DQE, pixel size, etc.) being diagnostically acceptable.
    • For the software (Ultra): Validation against specified functional and performance requirements outlined in the FDA software guidance, which serves as the ground truth for software quality and safety.
    • For the PHOENIX system itself: Compliance with international safety and performance standards (IEC series) served as the ground truth for its electrical, mechanical, and radiation safety.

    8. The Sample Size for the Training Set

    • Not applicable. This device is not an AI/ML algorithm that requires a training set in the conventional sense of image analysis. It is an imaging acquisition device. The software validation is based on engineering principles and testing, not statistical training on a dataset.

    9. How the Ground Truth for the Training Set Was Established

    • Not applicable. As above, no training set for an AI/ML algorithm was used.

    Intended Use

    This software is intended to generate digital radiographic images of the skull, spinal column, extremities, and other body parts in patients of all ages. Applications can be performed with the patient sitting or lying in the prone or supine position, and the software is intended for use in all routine radiography exams. The product is not intended for mammographic applications.

    This software is not meant for mammography, fluoroscopy, or angiography.

    Device Description

    The I-Q View is a software package to be used with FDA cleared solid-state imaging receptors. It functions as a diagnostic x-ray image acquisition platform and allows these images to be transferred to hard copy, softcopy, and archive devices via DICOM protocol. The flat panel detector is not part of this submission. In the I-Q View software, the Digital Radiography Operator Console (DROC) software allows the following functions:

      1. Add new patients to the system; enter information about the patient and physician that will be associated with the digital radiographic images.
      2. Edit existing patient information.
      3. Emergency registration and editing of Emergency settings.
      4. Pick from a selection of procedures, which defines the series of images to be acquired.
      5. Adjust technique settings before capturing the x-ray image.
      6. Preview the image; accept or reject it, entering comments or rejection reasons. Accepted images will be sent to the selected output destinations.
      7. Save an incomplete procedure, for which the rest of the exposures will be made at a later time.
      8. Close a procedure when all images have been captured.
      9. Review history images; resend and reprint images.
      10. Re-exam a completed patient.
      11. Protect patient records from being deleted by the system.
      12. Delete an examined study with all captured images.
      13. Edit user accounts.
      14. Check statistical information.
      15. Image QC.
      16. Image stitching.
      17. Electronic transfer of medical image data between medical devices.
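The image stitching function listed above combines multiple exposures into a single long-format image (e.g., a full-spine view). The document gives no algorithmic detail, so the following is only a naive sketch that concatenates exposures top-to-bottom, ignoring the overlap registration and blending a real implementation would need (function name hypothetical):

```python
# Naive sketch of vertical image stitching: concatenate exposures into one
# long-format image. Real stitching registers and blends the overlap between
# adjacent exposures; this illustration assumes no overlap at all.
def stitch_vertical(panels):
    """panels: list of images, each a list of rows of equal width."""
    if len({len(panel[0]) for panel in panels}) != 1:
        raise ValueError("all panels must share the same width")
    return [row for panel in panels for row in panel]
```

For example, stitching a one-row exposure onto a two-row exposure yields a single three-row image.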
    AI/ML Overview

    The provided document is a 510(k) summary for the I-Q View software. It focuses on demonstrating substantial equivalence to a predicate device through bench testing and comparison of technical characteristics. It explicitly states that clinical testing was not required or performed.

    Therefore, I cannot provide details on clinical acceptance criteria or a study proving the device meets them, as such a study was not conducted for this submission. The document relies on bench testing and comparison to a predicate device to establish substantial equivalence.

    Here's a breakdown of what can be extracted from the provided text regarding acceptance criteria and the "study" (bench testing) that supports the device:

    1. Table of Acceptance Criteria and Reported Device Performance

    Since no clinical acceptance criteria or performance metrics are provided, this table will reflect the general statements made about the device performing to specifications.

    | Acceptance Criteria (Implied) | Reported Device Performance |
    | --- | --- |
    | Device functions as intended for image acquisition. | Demonstrated intended functions. |
    | Device performs to specification. | Performed to specification. |
    | Integration with compatible solid-state detectors performs within specification. | Verified integration performance within specification. |
    | Software is as safe and functionally effective as the predicate. | Bench testing confirmed the software is as safe and functionally effective as the predicate. |

    2. Sample size used for the test set and the data provenance

    • Test Set Sample Size: Not applicable/not reported. The document describes bench testing, not a test set of patient data.
    • Data Provenance: Not applicable. Bench testing generally involves internal testing environments rather than patient data from specific countries or retrospective/prospective studies.

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts

    • Not applicable. As no clinical test set was used, no experts were needed to establish ground truth for patient data. Bench testing typically relies on engineering specifications and verification.

    4. Adjudication method for the test set

    • Not applicable. No clinical test set or human interpretation was involved.

    5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done, and if so, the effect size of human reader improvement with vs. without AI assistance

    • No, an MRMC comparative effectiveness study was not done. The document explicitly states: "Clinical Testing: The bench testing is significant enough to demonstrate that the I-Q View software is as good as the predicate software. All features and functionality have been tested and all specifications have been met. Therefore, it is our conclusion that clinical testing is not required to show substantial equivalence." The device is software for image acquisition, not an AI-assisted diagnostic tool.

    6. If a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done

    • Yes, in a sense. The "study" described is bench testing of the software's functionality and its integration with solid-state detectors. This is an evaluation of the algorithm/software itself.

    7. The type of ground truth used (expert consensus, pathology, outcomes data, etc.)

    • For bench testing, the "ground truth" would be the engineering specifications and expected functional behavior of the software and its interaction with hardware components. It's about verifying that the software performs according to its design requirements.

    8. The sample size for the training set

    • Not applicable. The I-Q View is described as an image acquisition and processing software, not an AI/machine learning model that typically requires a training set of data.

    9. How the ground truth for the training set was established

    • Not applicable, as there is no mention of a training set or AI/machine learning component.

    Summary of the "Study" (Bench Testing) for K203703:

    The "study" conducted for the I-Q View software was bench testing. This involved:

    • Verification and validation of the software.
    • Demonstrating the intended functions and relative performance of the software.
    • Integration testing to verify that compatible solid-state detectors performed within specification as intended when used with the I-Q View software.

    The conclusion drawn from this bench testing was that the software performs to specification and is "as safe and as functionally effective as the predicate software." This was deemed sufficient to demonstrate substantial equivalence, and clinical testing was explicitly stated as not required.

    Ask a Question

    Ask a specific question about this device

    Intended Use

    The KDR™ AU-DDR System Advanced U-Arm with Dynamic Digital Radiography and the KDR™ AU System Advanced U-Arm with Static Digital Radiography are indicated for use by a qualified/trained doctor or technician on both adult and pediatric subjects for taking diagnostic static and serial radiographic exposures of the skull, spinal column, chest, abdomen, extremities, and other body parts. Applications can be performed with the patient sitting, standing, or lying in the prone or supine position (not for mammography).

    Device Description

    The proposed System is a digital radiography diagnostic system that can obtain radiographic exposures of the skull, spinal column, chest, abdomen, extremities, and other body parts in two modes (static and dynamic). Images may be obtained with the patient sitting, standing, or lying in the prone or supine position. It is not intended for mammographic use. The system is available in two configurations, which are identical except for the choice of flat panel detector. One configuration, referred to as the KDR™ AU-DDR System Advanced U-Arm with Dynamic Digital Radiography, contains a HD/FNB flat panel; the other, referred to as the KDR™ AU System Advanced U-Arm with Static Digital Radiography, contains a HQ/KDR panel. The technological feature of each flat panel detector is described below.

    The proposed system is a compact, floor and wall mounted radiographic system with proprietary ULTRA software and DICOM 3 connectivity.

    The system consists of several components. The System's hardware consists of 3 key components:

    1. A floor and wall-mounted Positioner (also referred to as a stand)
    2. A generator
    3. An off-the-shelf computer with proprietary software (also referred to as an acquisition workstation)

    The positioner has a swivel arm that has several rotating and linear movements, and movement controls including an information screen. Mounted on the positioner are:
    a) A collimator,
    b) An X-ray tube
    c) An Automatic Exposure Control (AEC)
    d) A flat panel detector (2 configurations are available for the end user to select: the KDR™ AU-DDR System Advanced U-Arm with Dynamic Digital Radiography contains a HD/FNB flat panel detector capable of obtaining both static and dynamic images, and the KDR™ AU System Advanced U-Arm with Static Digital Radiography contains the HQ/KDR flat panel detector capable of obtaining static images only.)

    Hardware accessories include:

    1. A mobile patient table
    2. Stitching stand
    3. Weight bearing stand

    Optional Hardware accessories include:

    1. Motorized height adjustable table
    2. 3 knob collimator
    3. Dose area product meter
    4. Advanced weight bearing stand

    The proposed system has a proprietary ULTRA software as the central interface of the system. The software for the proposed system enables users to acquire static and dynamic images.

    There are two modes within the software package of the proposed system: "static mode," which generates a single frame of radiographic images captured at a single time, and "dynamic mode" (or "Dynamic Digital Radiography," abbreviated "DDR"), which generates multiple frames in a single series, presenting the physician with a diagnostic view of dynamic density and anatomic motion without using fluoroscopy or cine. The number of images acquired with the proposed system is limited to 300, compared with fluoroscopy or cine, which do not limit the number of images. (Note: only the configuration with the HD/FNB flat panel detector is capable of obtaining both static and dynamic images; the other configuration may only obtain static images.)

    The system is also capable of quickly assuming a preprogrammed position when a new exam is selected, saving time when positioning the equipment. This capability, referred to as "auto positioning," is made possible by the positioner and the image-processing software working together.
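    Conceptually, auto positioning amounts to a lookup from an exam selection to a stored positioner pose. The sketch below models that idea only; the data model, field names, and preset values are all invented for illustration and do not reflect the vendor's actual implementation.

```python
# Hypothetical sketch of the "auto positioning" concept: each exam type
# maps to a stored positioner pose that the system assumes when the exam
# is selected. All names and values here are illustrative.
from dataclasses import dataclass

@dataclass(frozen=True)
class PositionerPose:
    swivel_deg: float  # swivel-arm rotation
    height_cm: float   # vertical travel
    sid_cm: float      # source-to-image distance

# Illustrative preset table keyed by exam name (values are made up).
PRESETS = {
    "chest_pa_standing": PositionerPose(swivel_deg=0.0, height_cm=140.0, sid_cm=180.0),
    "abdomen_supine":    PositionerPose(swivel_deg=90.0, height_cm=80.0, sid_cm=100.0),
}

def pose_for_exam(exam: str) -> PositionerPose:
    """Return the stored pose for an exam, as auto positioning would."""
    try:
        return PRESETS[exam]
    except KeyError:
        raise ValueError(f"no preset stored for exam {exam!r}") from None

print(pose_for_exam("chest_pa_standing").sid_cm)  # 180.0
```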

    AI/ML Overview

    The provided document is a 510(k) summary for the Konica Minolta Healthcare Americas KDR™ AU-DDR System Advanced U-Arm with Dynamic Digital Radiography and KDR™ AU System Advanced U-Arm with Static Digital Radiography.

    This document describes the device and its substantial equivalence to predicate devices, focusing on regulatory compliance and technical specifications rather than specific clinical performance data for AI/software components. The primary performance data discussed refers to compliance with safety and performance standards for X-ray systems, not an AI-driven diagnostic or assistive feature.

    Therefore, many of the requested points regarding acceptance criteria and study details (like sample size for test/training sets, expert ground truth, adjudication methods, MRMC studies, or standalone performance for an AI component) are not present in this document. The document primarily addresses the safety and effectiveness of the X-ray system hardware and its software for image acquisition, not an AI-based diagnostic tool.

    Based on the provided text, here's what can be extracted:


    1. Table of Acceptance Criteria and Reported Device Performance

    The document does not provide a table of acceptance criteria with specific performance metrics for an AI component. Instead, it refers to compliance with various electrical, safety, and imaging standards for the overall X-ray system.

    | Acceptance Criteria (Compliance with Standards) | Reported Device Performance |
    | --- | --- |
    | IEC 60601-1, version 3.1 (general requirements for basic safety and essential performance) | The system complies with the requirements. |
    | IEC 60601-1-2, 4th edition (electromagnetic compatibility) | The system complies with the requirements; surrounding equipment also follows the standard. Electrical testing was performed by TUV Rheinland of North America, which certified compliance with each standard tested. |
    | IEC 60601-1-3, revision 2.1 (radiation protection in diagnostic X-ray equipment) | The system complies with the requirements. |
    | 21 CFR 1020.30 and 21 CFR 1020.31 (performance standards for ionizing-radiation-emitting products) | The system was tested against and complies with these standards. |
    | IEC 60601-2-54, edition 1.2 (particular requirements for basic safety and essential performance of X-ray equipment for radiography and radioscopy) | The system complies with the requirements. |
    | DICOM standard | The system was also tested and complies with the DICOM standard. |
    | User requirement software specifications; device requirements for performance, packaging, and design; human/ergonomic factors; interfacing with other devices; and compatibility with the environment of the intended use | The system successfully passed all verification and validation testing, functioning as intended and expected. |

    2. Sample size used for the test set and the data provenance:

    • Not explicitly stated in the provided document. The document discusses compliance with technical standards for an X-ray system, not the performance of an AI algorithm on a specific medical image dataset.

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts:

    • Not applicable / not stated. This document focuses on the technical and safety performance of an X-ray imaging system, not on a machine learning model requiring expert-annotated ground truth for diagnostic accuracy.

    4. Adjudication method (e.g. 2+1, 3+1, none) for the test set:

    • Not applicable / not stated. The context of this document does not involve diagnostic interpretations requiring adjudication.

    5. Whether a multi-reader, multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI assistance versus without:

    • Not done / not stated. The device described is an X-ray acquisition system; it does not present itself as an AI-assistive diagnostic tool for human readers in the context of this 510(k) summary.

    6. Whether a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done:

    • Not applicable / not stated. The device is an X-ray system, which includes software for image acquisition ("proprietary ULTRA software"), but the document does not describe a standalone AI algorithm for diagnostic interpretation.

    7. The type of ground truth used (expert consensus, pathology, outcomes data, etc.):

    • Not applicable / not stated. Ground truth, in the context of diagnostic AI models, is not relevant to the compliance testing of an X-ray imaging system described here.

    8. The sample size for the training set:

    • Not applicable / not stated. The document describes an X-ray system, and there's no mention of a machine learning component requiring a training set in this context.

    9. How the ground truth for the training set was established:

    • Not applicable / not stated. As no training set is mentioned for an AI algorithm, ground truth establishment is not relevant to the information provided.