
510(k) Data Aggregation

    K Number
    K250665
    Device Name
    SKR 3000
    Date Cleared
    2025-06-17

    (104 days)

    Product Code
    Regulation Number
    892.1680
    Reference & Predicate Devices
    Why did this record match? Device Name: SKR 3000

    Intended Use

    This device is indicated for use in generating radiographic images of human anatomy. It is intended to replace a radiographic film/screen system in general-purpose diagnostic procedures. This device is not indicated for use in mammography, fluoroscopy, and angiography applications.

    Device Description

    The SKR 3000 digital radiography system performs X-ray imaging of the human body using an X-ray planar detector that outputs a digital signal to an image processing device; the acquired image is then transmitted to a filing system, printer, and image display device as diagnostic image data.

    • This device is not intended for use in mammography.
    • This device is also used for carrying out exposures on children.

    The Console CS-7, which controls the receiving, processing, and output of image data, is required for operation. The CS-7 is software with a Basic Documentation Level. CS-7 implements the following image processing: gradation processing, frequency processing, dynamic range compression, smoothing, rotation, reversing, zooming, and grid removal process/scattered radiation correction (Intelligent-Grid). The Intelligent-Grid was cleared under K151465. (A generic sketch of this kind of processing chain appears at the end of this description.)
    The FPDs used in the SKR 3000 can communicate with the image processing device through wired Ethernet and/or Wireless LAN (IEEE 802.11a/n and FCC compliant). WPA2-PSK (AES) encryption is adopted to secure the wireless connection.
    The SKR 3000 is distributed under the commercial name AeroDR 3.
    The purpose of the current premarket submission is to add pediatric use indications for the SKR 3000 imaging system.
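    The image processing steps listed for the CS-7 (gradation, frequency processing, dynamic range compression, and so on) are standard digital radiography post-processing stages. The sketch below is only a generic illustration of such a chain; it assumes nothing about Konica Minolta's actual algorithms, and every parameter and function name here is invented.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def process_raw_frame(raw: np.ndarray) -> np.ndarray:
    """Toy DR post-processing chain: gradation, frequency enhancement, dynamic range compression."""
    img = raw.astype(np.float32)
    img = img / max(float(img.max()), 1e-6)      # scale to [0, 1] for the tone curve

    # Gradation processing: map detector values through a tone curve (here a simple gamma LUT).
    img = img ** 0.5

    # Frequency processing: unsharp masking to emphasize mid/high spatial frequencies.
    low_pass = gaussian_filter(img, sigma=3.0)
    img = img + 0.7 * (img - low_pass)

    # Dynamic range compression: reduce the swing of the low-frequency background so that
    # dense and lucent regions both fit the display range while local detail is preserved.
    background = gaussian_filter(img, sigma=25.0)
    img = img - 0.4 * (background - background.mean())

    return np.clip(img, 0.0, 1.0)

# Example on a synthetic frame (not real detector data):
frame = np.random.default_rng(0).random((512, 512)).astype(np.float32)
print(process_raw_frame(frame).shape)            # (512, 512)
```

    Real consoles implement these stages with vendor-specific lookup tables and multi-scale decompositions; the point here is only the order and intent of the stages named in the description.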
    AI/ML Overview

    The provided FDA 510(k) clearance letter and summary for the SKR 3000 device focus on adding a pediatric use indication but do not contain the detailed performance data, acceptance criteria, or study specifics typically found in a clinical study report. The document states that "image quality evaluation was conducted in accordance with the 'Guidance for the Submission of 510(k)s for Solid State X-ray Imaging Devices'" and that "pediatric image evaluation using small-size phantoms was performed on the P-53." It also mentions that "The comparative image evaluation demonstrated that the SKR 3000 with P-53 provides substantially equivalent image performance to the comparative device, AeroDR System 2 with P-52, for pediatric use."

    Based on the information provided, it's not possible to fully detail the acceptance criteria and the study that proves the device meets them according to your requested format. The document implies that the "acceptance criteria" likely revolved around demonstrating "substantially equivalent image performance" to a predicate device (AeroDR System 2 with P-52) for pediatric use, primarily through phantom studies, rather than a clinical study with human patients and detailed diagnostic performance metrics.

    Therefore, many of the requested fields cannot be filled directly from the provided text. I will provide the information that can be inferred or directly stated from the document and explicitly state when information is not available.

    Disclaimer: The information below is based solely on the provided 510(k) clearance letter and summary. For a comprehensive understanding, one would typically need access to the full 510(k) submission, which includes the detailed performance data and study reports.


    Acceptance Criteria and Device Performance Study for SKR 3000 (Pediatric Use Indication)

    The primary objective of the study mentioned in the 510(k) summary was to demonstrate substantial equivalence for the SKR 3000 (specifically with detector P-53) for pediatric use, compared to a predicate device (AeroDR System 2 with P-52).

    1. Table of Acceptance Criteria and Reported Device Performance

    Given the nature of the submission (adding a pediatric indication based on substantial equivalence), the acceptance criteria are not explicitly quantifiable metrics like sensitivity/specificity for a specific condition. Instead, the focus was on demonstrating "substantially equivalent image performance" through phantom studies.

    Acceptance Criteria (Inferred from Document) | Reported Device Performance (Inferred/Stated)
    Image quality of SKR 3000 with P-53 for pediatric applications to be "substantially equivalent" to the predicate device (AeroDR System 2 with P-52). | "The comparative image evaluation demonstrated that the SKR 3000 with P-53 provides substantially equivalent image performance to the comparative device, AeroDR System 2 with P-52, for pediatric use."
    Compliance with the "Guidance for the Submission of 510(k)s for Solid State X-ray Imaging Devices" for pediatric image evaluation using small-size phantoms. | "image quality evaluation was conducted in accordance with the 'Guidance for the Submission of 510(k)s for Solid State X-ray Imaging Devices'. Pediatric image evaluation using small-size phantoms was performed on the P-53."

    2. Sample Size Used for the Test Set and Data Provenance

    • Sample Size (Test Set): Not specified. The document indicates "small-size phantoms" were used, implying a phantom study, not a human clinical trial. The number of phantom images or specific phantom configurations is not detailed.
    • Data Provenance: Not specified. Given it's a phantom study, geographical origin is less relevant than for patient data. It's an internal study conducted to support the 510(k) submission. Retrospective or prospective status is not applicable as it's a phantom study.

    3. Number of Experts Used to Establish Ground Truth for the Test Set and Qualifications

    • Number of Experts: Not specified. Given this was a phantom study, ground truth would likely be based on physical measurements of the phantoms and expected image quality metrics, rather than expert interpretation of pathology or disease. If human evaluation was part of the "comparative image evaluation," the number and qualifications of evaluators are not provided.
    • Qualifications: Not specified.

    4. Adjudication Method for the Test Set

    • Adjudication Method: Not specified. For a phantom study demonstrating "substantially equivalent image performance," adjudication methods like 2+1 or 3+1 (common in clinical reader studies) are generally not applicable. The comparison would likely involve quantitative metrics from the generated images.

    5. If a Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study was Done

    • MRMC Study: No. The document states "comparative image evaluation" and "pediatric image evaluation using small-size phantoms." This strongly implies a technical performance assessment using phantoms, rather than a clinical MRMC study with human readers interpreting patient cases. Therefore, no effect size of human readers improving with AI vs. without AI assistance can be reported, as AI assistance in image interpretation (e.g., CAD) is not the focus of this submission; it's about the imaging system's ability to produce quality images for diagnosis.

    6. If a Standalone (i.e., algorithm only without human-in-the-loop performance) was Done

    • Standalone Performance: Not applicable in the traditional sense of an AI algorithm's diagnostic performance. The device is an X-ray imaging system. The "performance" being evaluated is its ability to generate images, not to provide an automated diagnosis. The "Intelligent-Grid" feature mentioned is an image processing algorithm (scattered radiation correction), but its standalone diagnostic performance is not the subject of this specific submission; its prior clearance (K151465) is referenced.

    7. The Type of Ground Truth Used

    • Ground Truth Type: For the pediatric image evaluation, the ground truth was based on phantom characteristics and expected image quality metrics. This is inferred from the statement "pediatric image evaluation using small-size phantoms was performed."

    8. The Sample Size for the Training Set

    • Training Set Sample Size: Not applicable. The SKR 3000 is an X-ray imaging system, not an AI model that requires a "training set" in the machine learning sense for its primary function of image acquisition. While image processing algorithms (like Intelligent-Grid) integrated into the system might have been developed using training data, the submission focuses on the imaging system's performance for pediatric use.

    9. How the Ground Truth for the Training Set Was Established

    • Ground Truth for Training Set: Not applicable, as no training set (in the context of an AI model's image interpretation learning) is explicitly mentioned or relevant for the scope of this 510(k) submission for an X-ray system.

    K Number
    K241319
    Device Name
    SKR 3000
    Date Cleared
    2024-11-21

    (195 days)

    Product Code
    Regulation Number
    892.1680
    Reference & Predicate Devices
    Why did this record match? Device Name: SKR 3000

    Intended Use

    The SKR 3000 is indicated for use in generating radiographic images of human anatomy. It is intended to replace a radiographic film/screen system in general-purpose diagnostic procedures.

    The SKR 3000 is not indicated for use in mammography, fluoroscopy and angiography applications. The P-53 is for adult use only.

    Device Description

    The SKR 3000 digital radiography system performs X-ray imaging of the human body using an X-ray planar detector that outputs a digital signal to an image processing device; the acquired image is then transmitted to a filing system, printer, and image display device as diagnostic image data.

    • This device is not intended for use in mammography.
    • This device is also used for carrying out exposures on children.

    The Console CS-7, which controls the receiving, processing, and output of image data, is required for operation. The CS-7 is software with a Basic Documentation Level. CS-7 implements the following image processing: gradation processing, frequency processing, dynamic range compression, smoothing, rotation, reversing, zooming, and grid removal process/scattered radiation correction (Intelligent-Grid). The Intelligent-Grid was cleared under K151465.

    This submission is to add the new flat-panel x-ray detector (FPD) P-53 to the SKR 3000. The new P-53 panel shows improved performance compared to the predicate device. The P-53 employs the same surface material infused with Silver ions (antibacterial properties) as the reference device.

    The FPDs used in the SKR 3000 can communicate with the image processing device through wired Ethernet and/or Wireless LAN (IEEE 802.11a/n and FCC compliant). WPA2-PSK (AES) encryption is adopted to secure the wireless connection.

    The SKR 3000 is distributed under the commercial name AeroDR 3.

    AI/ML Overview

    The provided text does not contain detailed information about specific acceptance criteria or the study used to prove the device meets those criteria in the typical format of a clinical trial or performance study report. Instead, it is an FDA 510(k) clearance letter and a 510(k) Summary for the Konica Minolta SKR 3000 (K241319).

    This document focuses on demonstrating substantial equivalence to a predicate device (K151465 - AeroDR System2) rather than providing a detailed report of a performance study with specific acceptance criteria, sample sizes, expert involvement, and ground truth establishment, as one might find for a novel AI/software medical device.

    The "Performance Data" section mentions "comparative image testing was conducted to demonstrate substantially equivalent image performance for the subject device" and "the predetermined acceptance criteria were met." However, it does not specify what those acceptance criteria were, what the reported performance was against those criteria, or the methodology of the "comparative image testing."

    Therefore, I cannot populate the table or answer most of your specific questions based on the provided text.

    Here's what can be extracted and inferred from the text:

    1. A table of acceptance criteria and the reported device performance:

    This information is not explicitly provided in the document. The document states: "The performance testing was conducted according to the 'Guidance for the Submission of 510(k)s for Solid State X-ray Imaging Devices.' The comparative image testing was conducted to demonstrate substantially equivalent image performance for the subject device. The other verification and validation including the items required by the risk analysis for the SKR 3000 were performed and the results showed that the predetermined acceptance criteria were met. The results of risk management did not require clinical studies to demonstrate the substantial equivalence of the proposed device."

    The comparison table on page 6 provides a comparison of specifications between the subject device (SKR 3000 with P-53) and the predicate device (AeroDR System2 with P-52), which might imply performance improvements that were part of the "acceptance criteria" for demonstrating substantial equivalence:

    Feature | Subject Device (SKR 3000 / P-53) | Predicate Device (AeroDR System2 / P-52) | Implication (Potential "Performance")
    Pixel size | 150 µm | 175 µm | Improved spatial resolution
    Max. Resolution | 2.5 lp/mm | 2.0 lp/mm | Higher resolution
    DQE (1.0 lp/mm) | 40% @ 1 mR | 35% @ 1 mR | Improved detective quantum efficiency
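    For context on the figures quoted above, the standard definitions are sketched below; the symbols q̄ (incident photon fluence) and NNPS (normalized noise power spectrum) are not taken from the submission, which does not describe how DQE was measured. The Nyquist arithmetic simply follows from the listed pixel pitches, and the quoted "Max. Resolution" values (2.5 and 2.0 lp/mm) sit below these sampling limits.

```latex
% Detective quantum efficiency: fraction of the input SNR^2 preserved at spatial frequency f
\mathrm{DQE}(f) \;=\; \frac{\mathrm{SNR}^2_{\mathrm{out}}(f)}{\mathrm{SNR}^2_{\mathrm{in}}(f)}
            \;=\; \frac{\mathrm{MTF}^2(f)}{\bar{q}\cdot\mathrm{NNPS}(f)}

% Nyquist (sampling) limit set by the pixel pitch p -- an upper bound on resolvable frequency
f_{\mathrm{N}} = \frac{1}{2p}:\qquad
\text{P-53: } \frac{1}{2 \times 0.150\,\mathrm{mm}} \approx 3.3\ \mathrm{lp/mm},\qquad
\text{P-52: } \frac{1}{2 \times 0.175\,\mathrm{mm}} \approx 2.9\ \mathrm{lp/mm}
```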

    2. Sample size used for the test set and the data provenance:

    • Sample Size for Test Set: Not specified. The document mentions "comparative image testing" but does not detail the number of images or patients in the test set.
    • Data Provenance: Not specified (e.g., country of origin, retrospective or prospective).

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts:

    • Not specified. Given that it's a 510(k) for an X-ray system rather than an AI diagnostic algorithm, the "ground truth" for image quality assessment would likely be based on physical phantom measurements and potentially visual assessment by qualified individuals, but the details are not provided.

    4. Adjudication method (e.g. 2+1, 3+1, none) for the test set:

    • Not specified.

    5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done and, if so, what was the effect size of how much human readers improve with AI vs. without AI assistance:

    • Not done/Not specified. This is not an AI-assisted device subject to typical MRMC studies. The device is a digital radiography system itself. The document states, "The results of risk management did not require clinical studies to demonstrate the substantial equivalence of the proposed device."

    6. If a standalone (i.e. algorithm only without human-in-the-loop performance) was done:

    • Not applicable. This is a hardware device (X-ray detector and system), not a standalone algorithm.

    7. The type of ground truth used:

    • Inferred based on context: Likely objective physical measurements (e.g., resolution, DQE) and potentially qualitative image assessment against a known reference (predicate device or established norms). The phrase "comparative image testing" suggests direct comparison of images produced by the subject device vs. predicate device. Not explicitly stated to be expert consensus, pathology, or outcomes data.

    8. The sample size for the training set:

    • Not applicable / Not specified. This is a hardware device; typical "training sets" are associated with machine learning algorithms. Its design and manufacturing would be based on engineering principles and quality control, not a data-driven training set in the AI sense.

    9. How the ground truth for the training set was established:

    • Not applicable / Not specified. (See point 8)

    Summary of what is known/inferred:

    • Acceptance Criteria: "Predetermined acceptance criteria were met" for performance parameters related to image quality and safety. Specific numerical criteria are not detailed, but improved resolution and DQE over the predicate are highlighted.
    • Study Design: "Comparative image testing" and general "performance testing" were conducted according to FDA guidance for solid-state X-ray imaging devices.
    • Sample Size/Provenance/Experts/Adjudication/MRMC: Not specified; this is expected for a hardware 510(k) demonstrating substantial equivalence through non-inferior or improved physical specifications rather than a diagnostic AI/CADe study.
    • Ground Truth: Likely objective physical performance metrics and visual comparison with a predicate, not clinical diagnoses or outcomes.
    • Training Set: Not applicable for a hardware device in the context of AI.

    K Number
    K223267
    Device Name
    SKR 3000
    Date Cleared
    2022-11-17

    (24 days)

    Product Code
    Regulation Number
    892.1680
    Reference & Predicate Devices
    Why did this record match? Device Name: SKR 3000

    Intended Use

    The SKR 3000 is indicated for use in generating radiographic images of human anatomy. It is intended to replace a radiographic film/screen system in general-purpose diagnostic procedures.

    The SKR 3000 is not indicated for use in mammography, fluoroscopy and angiography applications.

    Device Description

    The SKR 3000 digital radiography system performs X-ray imaging of the human body using an X-ray planar detector that outputs a digital signal to an image processing device; the acquired image is then transmitted to a filing system, printer, and image display device as diagnostic image data.

    • This device is not intended for use in mammography.
    • This device is also used for carrying out exposures on children.

    The Console CS-7, which controls the receiving, processing, and output of image data, is required for operation. The CS-7 is software with a Moderate level of concern. CS-7 implements the following image processing: gradation processing, frequency processing, dynamic range compression, smoothing, rotation, reversing, zooming, and grid removal process/scattered radiation correction (Intelligent-Grid). The Intelligent-Grid was cleared under K151465. The CS-7 has been modified to support wireless serial radiography.

    The SKR 3000 is distributed under the commercial name AeroDR 3.

    This submission introduces wireless serial radiography into the SKR 3000 system. The wireless serial radiography function of the P-65 / P-75 used with Phoenix was cleared under K221803. These detectors are wireless, and their serial radiography functions are not controlled by the x-ray generator; hence, no detector integration testing is necessary.

    AI/ML Overview

    The provided text is a 510(k) Summary for the Konica Minolta SKR 3000 device, which is a digital radiography system. This document focuses on demonstrating substantial equivalence to a predicate device (K213908), rather than presenting a detailed study proving the device meets specific acceptance criteria with performance metrics, sample sizes, expert involvement, or statistical analysis.

    The document states that the changes made to the SKR 3000 (specifically the addition of wireless serial radiography for P-65 and P-75 detectors) did not require clinical studies. Therefore, the information requested about a study demonstrating the device meets acceptance criteria regarding clinical performance is not available in this filing. The "Performance Data" section primarily addresses compliance with electrical safety and EMC standards.

    However, based on the information provided, here's what can be extracted and inferred regarding "acceptance criteria" in the context of this 510(k) submission:

    1. Table of Acceptance Criteria and Reported Device Performance

    Given that no clinical study specific to this submission's modifications is presented, the "acceptance criteria" here relate to general regulatory and technical compliance rather than clinical performance metrics (e.g., sensitivity, specificity for a particular pathology). The "reported device performance" is essentially a statement of compliance.

    Acceptance Criteria Category | Reported Device Performance
    Safety and Effectiveness | "The technological differences raised no new issues of safety or effectiveness as compared to its predicate device (K213908)."
    Performance to Specifications | "Performance tests demonstrate that the SKR 3000 performs according to specifications and functions as intended."
    Compliance with Standards | "The SKR 3000 is designed to comply with the following standard; AAMI/ANSI ES 60601-1 (Ed.3.1) and IEC 60601-1-2 (Ed.4.0)." (General electrical safety and electromagnetic compatibility standards are met.)
    Risk Analysis | "The verification and validation including the items required by the risk analysis for the SKR 3000 were performed and the results demonstrated that the predetermined acceptance criteria were met. The results of risk management did not require clinical studies to demonstrate the substantial equivalency of the subject device modifications."
    Functional Equivalence (Wireless Radiography) | The submission implies that the newly added wireless serial radiography functions (P-65 / P-75) are functionally equivalent to the wired serial radiography functions of the predicate device, especially since "no detector integration testing is necessary" because "their serial radiography functions are not being controlled by the x-ray generator."

    Note: This table reflects the nature of a 510(k) submission focused on substantial equivalence rather than a clinical performance study.


    Here's the breakdown for the other requested information, based on the limitations of the provided document:

    2. Sample size used for the test set and the data provenance (e.g. country of origin of the data, retrospective or prospective)

    • Not provided. The document states that "The results of risk management did not require clinical studies to demonstrate the substantial equivalency of the subject device modifications." This indicates that no clinical "test set" with patient data was used for this specific submission. The performance assessment was based on non-clinical testing (e.g., engineering verification, validation testing to internal specifications and regulatory standards).

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts (e.g. radiologist with 10 years of experience)

    • Not applicable. As no clinical study or test set with patient data was conducted or analyzed, there were no experts establishing ground truth for performance metrics like diagnostic accuracy.

    4. Adjudication method (e.g. 2+1, 3+1, none) for the test set

    • Not applicable. No clinical test set.

    5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done and, if so, what was the effect size of how much human readers improve with AI vs. without AI assistance

    • Not applicable. This submission is for a digital radiography system, not an AI-powered diagnostic aide. No MRMC study was performed or is relevant to this submission.

    6. If a standalone (i.e. algorithm only without human-in-the-loop performance) was done

    • Not applicable. This device is an imaging system, not an algorithm for standalone diagnosis. The "performance tests" mentioned are related to the hardware and software functionality of the imaging system itself (e.g., image quality specifications, electrical safety, EMC), not an algorithm's diagnostic performance.

    7. The type of ground truth used (expert consensus, pathology, outcomes data, etc)

    • Not applicable. For the purpose of this 510(k) filing for device modifications, the "ground truth" for performance was implicitly defined by the compliance with engineering specifications, safety standards (AAMI/ANSI ES 60601-1, IEC 60601-1-2), and the functional equivalence to the predicate device. No clinical ground truth (e.g., pathology, outcomes) was established for this submission.

    8. The sample size for the training set

    • Not applicable. This device is not an AI/ML algorithm that requires a "training set" in the sense of patient data for learning.

    9. How the ground truth for the training set was established

    • Not applicable. No training set for an AI/ML algorithm.

    K Number
    K213908
    Device Name
    SKR 3000
    Date Cleared
    2022-01-31

    (48 days)

    Product Code
    Regulation Number
    892.1680
    Reference & Predicate Devices
    Why did this record match? Device Name: SKR 3000

    Intended Use

    The SKR 3000 is indicated for use in generating radiographic images of human anatomy. It is intended to replace a radiographic film/screen system in general-purpose diagnostic procedures.

    The SKR 3000 is not indicated for use in mammography, fluoroscopy and angiography applications.

    Device Description

    The SKR 3000 digital radiography system performs X-ray imaging of the human body using an X-ray planar detector that outputs a digital signal to an image processing device; the acquired image is then transmitted to a filing system, printer, and image display device as diagnostic image data.

    • This device is not intended for use in mammography.
    • This device is also used for carrying out exposures on children.

    The Console CS-7, which controls the receiving, processing, and output of image data, is required for operation. CS-7 implements the following image processing: gradation processing, frequency processing, dynamic range compression, smoothing, rotation, reversing, zooming, and grid removal process/scattered radiation correction (Intelligent-Grid). The Intelligent-Grid was cleared under K151465.

    This submission adds new flat-panel x-ray detectors (FPDs), the P-82 and P-85, to the SKR 3000. The P-82 and P-85 employ the same surface material infused with Silver ions (antibacterial properties) as the predicate device. The only difference between the P-82 and P-85 is the number of Li-ion capacitors: the P-85 has two and the P-82 has one. The new P-82 and P-85 do not support serial radiography, which acquires multiple radiographic frames in sequence.

    The FPDs used in the SKR 3000 can communicate with the image processing device through wired Ethernet and/or Wireless LAN (IEEE 802.11a/n and FCC compliant). WPA2-PSK (AES) encryption is adopted to secure the wireless connection.

    The SKR 3000 is distributed under the commercial name AeroDR 3.

    AI/ML Overview

    The provided text describes the Konica Minolta SKR 3000, a digital radiography system, and seeks 510(k) clearance by demonstrating substantial equivalence to a predicate device (K210619), which is also an SKR 3000 model. The submission focuses on adding new flat-panel x-ray detectors (FPDs), P-82 and P-85, to the existing system.

    Here's an analysis of the acceptance criteria and study information:

    1. Table of Acceptance Criteria and Reported Device Performance:

    The document implicitly defines "acceptance criteria" by comparing the specifications and performance of the subject device (SKR 3000 with P-82/P-85 FPDs) against its predicate device (SKR 3000 with P-65 FPD). The acceptance criteria are essentially the performance levels of the predicate device, which the new FPDs must meet or exceed.

    Feature / Performance Metric | Acceptance Criteria (Predicate P-65) | Reported Device Performance (Subject P-82/P-85) | Meets Criteria?
    Indications for Use | Same as Subject | Generates radiographic images of human anatomy, replaces film/screen in general diagnostic procedures, not for mammography, fluoroscopy, angiography. | Yes
    Detection method | Indirect conversion method | Indirect conversion method | Yes
    Scintillator | CsI (Cesium Iodide) | CsI (Cesium Iodide) | Yes
    TFT sensor substrate | Glass-based TFT substrate | Film-based TFT substrate | N/A (difference accepted, no new safety/effectiveness issues)
    Image area size | P-65: 348.8×425.6 mm (3,488×4,256 pixels) | P-82/P-85: 348.8×425.6 mm (3,488×4,256 pixels) | Yes
    Pixel size | 100 µm / 200 µm / 400 µm | 100 µm / 200 µm | Yes (smaller range still includes acceptable sizes)
    A/D conversion | 16 bit (65,536 gradients) | 16 bit (65,536 gradients) | Yes
    Max. Resolution | P-65: 4.0 lp/mm | P-82/P-85: 4.0 lp/mm | Yes
    MTF (1.0 lp/mm) | (Non-binning) 0.62, (2x2 binning) 0.58 | (Non-binning) 0.62, (2x2 binning) 0.58 | Yes
    DQE (1.0 lp/mm) | 56% @ 1 mR | 59% @ 1 mR | Yes (exceeds)
    External dimensions | P-65: 384(W)×460(D)×15(H) mm | P-82/P-85: 384(W)×460(D)×15(H) mm | Yes
    IP Code (IEC 60529) | IPX6 | IP56 | N/A (minor difference, presumed acceptable)
    Battery Type | Lithium-ion capacitor | Lithium-ion capacitor | Yes
    Number of batteries | P-65: Two | P-82: One, P-85: Two | N/A (difference in configuration, performance evaluated)
    Battery duration in standby | P-65: Approx. 13.2 hours | P-82: Approx. 6.0 hours, P-85: Approx. 13.2 hours | Yes (P-85 matches; P-82 differs but is acceptable for its configuration)
    Surface Material | Surface infused with Silver ions (antibacterial properties) | Surface infused with Silver ions (antibacterial properties) | Yes
    Communication I/F | Wired and Wireless | Wired and Wireless | Yes
    Operator console (Software) | CS-7, AeroDR3 interface for P-65 (CTDS) | CS-7, AeroDR3 interface for P-82 and P-85 (CTDS) | Yes
    Image Processing | Same complex image processing algorithms | Same complex image processing algorithms | Yes
    Serial radiography | Applicable | Not applicable | N/A (feature difference; the new FPDs do not support it, so it is not an "acceptance criterion" in this context)

    Note: The acceptance criteria are largely implied by the claim of substantial equivalence. The document primarily focuses on demonstrating that new FPDs (P-82 and P-85) either match or improve upon the predicate's performance for critical imaging parameters. Differences in the TFT substrate material, pixel size options, number of batteries, and serial radiography capability are noted but explained as not raising new safety or effectiveness concerns.
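    The MTF figures in the table are conventionally derived from edge or slit measurements; the filing does not describe the procedure used. As a generic illustration only, the following sketch computes an MTF curve from a sampled line-spread function, using a synthetic Gaussian LSF with invented parameters rather than measured data.

```python
import numpy as np

def mtf_from_lsf(lsf: np.ndarray, pixel_pitch_mm: float):
    """Compute a normalized MTF curve from a sampled line-spread function (LSF).

    lsf            -- 1-D LSF samples (e.g. the derivative of a measured edge profile)
    pixel_pitch_mm -- sampling pitch in millimetres
    Returns (spatial frequencies in lp/mm, MTF values normalized so MTF(0) = 1).
    """
    lsf = lsf / lsf.sum()                                 # unit area so MTF(0) == 1
    mtf = np.abs(np.fft.rfft(lsf))                        # magnitude of the Fourier transform
    mtf /= mtf[0]
    freqs = np.fft.rfftfreq(lsf.size, d=pixel_pitch_mm)   # cycles/mm == lp/mm
    return freqs, mtf

# Synthetic example: a Gaussian blur of sigma 0.15 mm sampled at a 0.1 mm pitch.
x = np.arange(-64, 64) * 0.1
lsf = np.exp(-x**2 / (2 * 0.15**2))
freqs, mtf = mtf_from_lsf(lsf, pixel_pitch_mm=0.1)
print(f"MTF at 1.0 lp/mm ~ {np.interp(1.0, freqs, mtf):.2f}")
```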

    2. Sample size used for the test set and the data provenance:

    The document states: "The performance tests according to the 'Guidance for the Submission of 510(k)s for Solid State X-ray Imaging Devices' and the other verification and validation including the items required by the risk analysis for the SKR 3000 were performed and the results demonstrated that the predetermined acceptance criteria were met."

    This indicates that specific performance tests were conducted. However, the document does not explicitly state the sample size used for the test sets (e.g., number of images, number of phantom studies, number of human subjects, if any) nor the data provenance (e.g., country of origin, retrospective or prospective nature of clinical data if used). Given the type of device (X-ray system component) and the nature of the submission (adding new FPDs to an existing cleared system), the "performance data" presented is primarily technical specifications and phantom-based measurements, not typically large-scale clinical trials with human subjects.

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts:

    The document does not mention the use of experts to establish ground truth. As this is a technical performance comparison of imaging hardware (FPDs), ground truth would likely be established through objective physical measurements and established technical standards (e.g., imaging phantoms, dosimeters) rather than expert human interpretation of medical images for diagnostic accuracy.

    4. Adjudication method for the test set:

    Since there is no mention of human experts or clinical image interpretation studies, there is no adjudication method described.

    5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done and, if so, what was the effect size of how much human readers improve with AI vs. without AI assistance:

    The document does not mention an MRMC study, nor does it refer to AI or AI-assisted improvements for human readers. This device is a digital radiography system (hardware), and the submission focuses on its technical performance compared to a predicate, not on AI algorithms or their impact on reader performance.

    6. If a standalone (i.e. algorithm only without human-in-the-loop performance) was done:

    The device described is an X-ray imaging system, not an algorithm. Therefore, a standalone algorithm-only performance study is not applicable in this context. The performance evaluated is that of the hardware components (FPDs) within the system.

    7. The type of ground truth used:

    The ground truth for the performance parameters (e.g., Max. Resolution, MTF, DQE) would be established through objective physical measurements using standardized phantoms and test procedures as per industry standards (e.g., "Guidance for the Submission of 510(k)s for Solid State X-ray Imaging Devices"). For other specifications like battery life or dimensions, ground truth is based on engineering measurements and design specifications.

    8. The sample size for the training set:

    The document does not refer to a training set. This is because the submission is for hardware components (FPDs) for an X-ray system, not for a machine learning or AI-based diagnostic algorithm that would require training data.

    9. How the ground truth for the training set was established:

    As there is no mention of a training set, there is no information on how its ground truth would be established.


    K Number
    K210619
    Device Name
    SKR 3000
    Date Cleared
    2021-08-24

    (176 days)

    Product Code
    Regulation Number
    892.1680
    Reference & Predicate Devices
    Why did this record match? Device Name: SKR 3000

    Intended Use

    This device is indicated for use in generating radiographic images of human anatomy. It is intended to replace a radiographic film/screen system in general-purpose diagnostic procedures.

    This device is not indicated for use in mammography, fluoroscopy, and angiography applications.

    Device Description

    The SKR 3000 digital radiography system performs X-ray imaging of the human body using an X-ray planar detector that outputs a digital signal to an image processing device; the acquired image is then transmitted to a filing system, printer, and image display device as diagnostic image data.

    • The subject device SKR 3000 is not intended for use in mammography.
    • This device is also used for carrying out exposures on children.

    The Console CS-7, which controls the receiving, processing, and output of image data, is required for operation. CS-7 implements the following image processing: gradation processing, frequency processing, dynamic range compression, smoothing, rotation, reversing, zooming, and grid removal process/scattered radiation correction (Intelligent-Grid). The Intelligent-Grid was cleared under K151465.

    The proposed SKR 3000 is modified to include the new FPDs P-65 and P-75 in addition to the previously cleared P-61, P-71, and P-81, the Console CS-7, and other peripherals. The DR detector uses the exposure signal or exposure from the X-ray device to generate X-ray digital image data for diagnosis, including serial exposure images, and sends it to the image processing controller.

    The operator console software, Console CS-7, is a software program for installation on an OTC PC. Software module modifications have been made to support the new FPDs (P-65 and P-75) (Cassette Type Detection Software (CTDS)) and to support 40-second serial radiography (SIC).

    The FPDs used in the SKR 3000 can communicate with the image processing device through wired Ethernet and/or Wireless LAN (IEEE 802.11a/n and FCC compliant). WPA2-PSK (AES) encryption is adopted to secure the wireless connection.

    The new DR panels, P-65 and P-75, employ a surface material containing an antibacterial agent on both the radiation and irradiation sides. In the serial radiography settings, the maximum acquisition time has been extended from 20 seconds to 40 seconds to observe a variety of dynamic objects. Other control parameters of serial radiography are unchanged from the predicate device.

    The SKR 3000 is distributed under the commercial name AeroDR 3.

    AI/ML Overview

    This document describes the Konica Minolta SKR 3000, a digital radiography system, and its substantial equivalence to a predicate device. The information provided focuses on the device's design, specifications, and performance testing to demonstrate compliance with standards, but does not include a detailed study proving the device meets specific acceptance criteria related to diagnostic accuracy or clinical outcomes through a prospective trial involving human readers. The provided text primarily focuses on engineering and regulatory compliance, not clinical performance metrics in the context of AI assistance or human reader improvement.

    However, based on the provided text, here's a breakdown of the acceptance criteria met through performance testing as described, and the absence of certain study types:

    1. Table of Acceptance Criteria and Reported Device Performance

    The document broadly states that "the performance tests according to the 'Guidance for the Submission of 510(k)s for Solid State X-ray Imaging Devices' and the other verification and validation including the items required by the risk analysis for the SKR3000 were performed and the results demonstrated that the predetermined acceptance criteria were met."

    While specific numerical acceptance criteria and their corresponding reported device performance values are not explicitly detailed in the text, the comparison table implicitly highlights characteristics where performance is expected to be equivalent or improved. For instance, the Signal-to-Noise Ratio (SNR) and Detective Quantum Efficiency (DQE) are critical performance metrics for X-ray detectors, and based on the equivalence asserted, one can infer that these metrics met predefined acceptance thresholds.

    Given the information in the "Comparison Table", the following can be inferred as performance aspects that were evaluated and met criteria for substantial equivalence:

    Acceptance Criteria (Implied from comparison) | Reported Device Performance (Implied from comparison)
    Image Quality Metrics:
    MTF (1.0 cycle/mm) | Non-binning: 0.62
    MTF (1.0 cycle/mm) | 2x2 binning: 0.58
    DQE (1.0 cycle/mm) | 56% @ 1 mR
    DQE (0 cycle/mm) | 65% @ 0.02 mR
    Exposure Acquisition Time | Max. acquisition time: 40 seconds (for serial radiography)
    Battery Duration in Standby | P-65: Approx. 13.2 hours; P-75: Approx. 12.2 hours
    Antibacterial Properties | Surface infused with Silver ions (antibacterial properties)
    Environmental Protection (IPX) | IPX6
    Regulatory Compliance | AAMI/ANSI ES 60601-1 (Ed.3.1), IEC 60601-1-2 (Ed.4.0), and ISO 10993-1 (2018) met.
    Software Functionality | New FPD support (CTDS) and 40-second serial radiography support (SIC) operating as intended.
    Absence of New Safety/Effectiveness Issues | Performance tests demonstrated no new issues compared to the predicate device.
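    The "2x2 binning" entries above refer to combining each non-overlapping 2x2 block of detector pixels into a single output pixel, trading spatial resolution for noise. The numpy sketch below is only an illustration of the operation; averaging is assumed here, and the detector firmware may instead sum or weight pixels differently.

```python
import numpy as np

def bin2x2(image: np.ndarray) -> np.ndarray:
    """Average non-overlapping 2x2 blocks of detector pixels into single output pixels."""
    h, w = image.shape
    h2, w2 = h - h % 2, w - w % 2                 # drop a trailing row/column if dimensions are odd
    blocks = image[:h2, :w2].reshape(h2 // 2, 2, w2 // 2, 2)
    return blocks.mean(axis=(1, 3))

# A 3,488 x 4,256-pixel frame (the pixel matrix listed earlier for the P-65) bins to 1,744 x 2,128.
frame = np.zeros((3488, 4256), dtype=np.float32)
print(bin2x2(frame).shape)                        # (1744, 2128)
```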

    2. Sample Size Used for the Test Set and Data Provenance

    The provided document does not detail any clinical test set or data provenance in terms of patient images or specific study populations. The performance data mentioned refers to engineering and quality assurance tests, not clinical performance studies with patient data.

    3. Number of Experts Used to Establish Ground Truth and Qualifications

    This information is not applicable or disclosed in the provided text. The document refers to engineering performance tests and compliance with regulatory standards, not expert-adjudicated clinical ground truth.

    4. Adjudication Method for the Test Set

    This information is not applicable or disclosed as there is no mention of a human-reviewed test set or adjudication process for diagnostic performance.

    5. If a Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study Was Done

    No, a Multi-Reader Multi-Case (MRMC) comparative effectiveness study is not mentioned in the provided text. The document indicates that "the results of risk management did not require clinical studies to demonstrate the substantial equivalency of the proposed device," which suggests that comparative effectiveness with human readers or AI assistance was not a component of this 510(k) submission.

    6. If a Standalone (i.e., algorithm only without human-in-the-loop performance) Study Was Done

    The device itself is a digital radiography system, which generates images. While there are software components (like Console CS-7 for image processing), the submission focuses on the hardware (FPDs) and overall system performance in generating X-ray images, not an AI algorithm's standalone diagnostic performance. Therefore, such a standalone diagnostic algorithm study is not mentioned. The "performance tests" refer to technical specifications and safety, not diagnostic accuracy.

    7. The Type of Ground Truth Used

    Based on the document, the "ground truth" for the acceptance criteria was primarily based on technical specifications, regulatory standards, and engineering performance requirements. These include metrics like MTF, DQE, mechanical dimensions, battery life, IPX ratings, and compliance with electrical safety and electromagnetic compatibility standards. No clinical ground truth (e.g., pathology, outcomes data, or expert consensus on disease presence) is mentioned as being used for performance evaluation in this submission.

    8. The Sample Size for the Training Set

    This is not applicable or disclosed. The document does not describe the development or training of an AI algorithm in the context of machine learning, so there is no mention of a training set of images.

    9. How the Ground Truth for the Training Set Was Established

    This is not applicable or disclosed as there is no mention of an AI training set.


    K Number
    K182688
    Device Name
    SKR 3000
    Date Cleared
    2018-12-18

    (83 days)

    Product Code
    Regulation Number
    892.1680
    Reference & Predicate Devices
    Why did this record match? Device Name: SKR 3000

    Intended Use

    The SKR 3000 is indicated for use in generating radiographic images of human anatomy. It is intended to replace a radiographic film/screen system in general-purpose diagnostic procedures.

    The SKR 3000 is not indicated for use in mammography, fluoroscopy and angiography applications.

    Device Description

    The modified SKR 3000 employs additional peripheral units, the Detector Interface Unit 2 (DIU2) and the Generator Interface Unit 3 (GIU3), to incorporate serial radiography, an additional radiography acquisition sequence.

    The system is intended for use replacing a radiographic film/screen system in general-purpose diagnostic procedures of human anatomy. The system can be used in conjunction with currently cleared AeroDR FPDs. The P-61, P-71, P-81, and the other compatible FPDs usable in the SKR 3000 are lightweight, mobile FPDs formed in a size compatible with ISO standard cassettes. The FPDs usable for serial radiography are the P-61 and P-71.

    The SKR 3000 performs radiographic imaging of the human body using an X-ray planar detector (FPD) that outputs a digital signal, which is then input into an image processing device. Serial radiography is a multi-frame function that simply repeats a single still radiography acquisition multiple times. The acquired image is transmitted to a filing system, printer, and image display device as a diagnostic image after the image processing device, Console CS-7, applies image processing to the raw image data.
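    Since serial radiography is described as simply repeating single still acquisitions, a minimal sketch of such a loop is shown below; the frame interval and duration cap are hypothetical placeholders, not values taken from this filing.

```python
import time

def serial_radiography(capture_still, frame_interval_s: float = 0.5, max_duration_s: float = 20.0):
    """Repeat single still radiography acquisitions for up to max_duration_s seconds.

    capture_still    -- callable that triggers one still exposure and returns raw image data
    frame_interval_s -- hypothetical spacing between exposures (not stated in the filing)
    """
    frames = []
    start = time.monotonic()
    while time.monotonic() - start < max_duration_s:
        frames.append(capture_still())        # one ordinary still radiograph per iteration
        time.sleep(frame_interval_s)          # wait for the next exposure window
    return frames                             # raw frames are then processed by Console CS-7
```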

    AI/ML Overview

    This document describes the SKR 3000 device, specifically modified with additional peripheral units (Detector Interface Unit 2 (DIU2) and Generator Interface Units 3 (GIU3)) to incorporate a serial radiography operation. The purpose of this submission is to demonstrate substantial equivalence to a predicate device (K172793).

    Since the primary claim is substantial equivalence based on technological characteristics and performance tests, not a new clinical claim or improvement over human readers, much of the requested information regarding AI performance and comparative studies with human readers is not applicable or provided in this document.

    Here's a breakdown of the available information:

    1. Table of Acceptance Criteria and Reported Device Performance

    The document does not explicitly provide a table of acceptance criteria with specific numerical performance metrics. Instead, it states that:

    • "All of the verification activities required by the specification and the risk analysis for the SKR 3000 were performed and the results demonstrated that the predetermined acceptance criteria were met."
    • "The performance tests according to the 'Guidance for the Submission of 510(k)s for Solid State X-ray Imaging Devices' and the other verification and validation including the items required by the risk analysis for the SKR 3000 were performed and the results demonstrated that the predetermined acceptance criteria were met."

    The acceptance criteria are implicitly tied to compliance with relevant standards and the risk analysis. Performance is reported as meeting these predetermined criteria.

    Acceptance Criteria Category | Reported Device Performance
    Operational Principles | Same scientific technologies and operational principle as the predicate device (K172793). Additional peripheral components (DIU2, GIU3) employ equivalent function and comply with the same EMC and electrical safety standards. The modified serial radiography operation does not raise new safety or effectiveness issues.
    Verification Activities | Performed according to specifications and risk analysis; predetermined acceptance criteria met.
    Performance Tests | Performed according to the "Guidance for the Submission of 510(k)s for Solid State X-ray Imaging Devices" and other verification/validation, including risk analysis items; predetermined acceptance criteria met.
    Safety | Conforms with AAMI/ANSI ES 60601-1 (Ed.3.1), IEC 60601-1-2, and ISO 10993-1. Risks, including serial radiography, reduced to acceptable levels per ISO 14971.
    Biocompatibility | No material change in patient contact materials.

    2. Sample size used for the test set and the data provenance

    The document does not specify a separate "test set" in the context of image analysis performance as typically seen with AI devices. The assessment relies on engineering and performance tests of the device's hardware and software functionalities for image acquisition and processing.

    • No specific sample size of images or patients is provided for a "test set" in the context of diagnostic accuracy.
    • The provenance of data (e.g., country of origin, retrospective/prospective) is not mentioned because the evaluation is based on technical verification and validation, rather than clinical data.

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts

    Not applicable. The ground truth for this device's performance demonstration appears to be defined by technical specifications, compliance with standards, and risk analysis, rather than expert consensus on diagnostic interpretations of images.

    4. Adjudication method (e.g., 2+1, 3+1, none) for the test set

    Not applicable, as no external expert adjudication of images is described.

    5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done and, if so, what was the effect size of how much human readers improve with AI vs. without AI assistance

    Not applicable. This device is a digital radiography system, not an AI-powered image analysis tool intended to assist human readers or improve their performance. The submission focuses on the safety and effectiveness of the imaging system itself.

    6. If a standalone (i.e. algorithm only without human-in-the-loop performance) was done

    Not applicable. The SKR 3000 is an X-ray imaging system, not a standalone algorithm.

    7. The type of ground truth used (expert consensus, pathology, outcomes data, etc.)

    The "ground truth" for the device's performance is established through:

    • Technical Specifications: The device's ability to produce radiographic images of human anatomy according to defined parameters (e.g., pixel size, A/D conversion, frame rate for serial radiography).
    • Compliance with Standards: Meeting established engineering and safety standards (AAMI/ANSI ES 60601-1, IEC 60601-1-2, ISO 10993-1).
    • Risk Analysis: Demonstrating that identified risks, including those related to the new serial radiography function, are reduced to acceptable levels.
    • Comparison to Predicate Device: The modified device maintains similar technological characteristics and operational principles to its legally marketed predicate, which implicitly serves as a benchmark for "ground truth" regarding general-purpose diagnostic imaging.

    8. The sample size for the training set

    Not applicable. This is not an AI/machine learning device that requires a training set of data.

    9. How the ground truth for the training set was established

    Not applicable, as there is no training set for this device.


    K Number
    K172793
    Device Name
    SKR 3000
    Date Cleared
    2017-10-12

    (27 days)

    Product Code
    Regulation Number
    892.1680
    Reference & Predicate Devices
    Why did this record match? Device Name: SKR 3000

    Intended Use

    The SKR 3000 is indicated for use in generating radiographic images of human anatomy. It is intended to replace a radiographic film/screen system in general-purpose diagnostic procedures.

    The SKR 3000 is not indicated for use in mammography, tomography and angiography applications.

    Device Description

    The proposed SKR 3000 is modified to include the new FPDs P-71 and P-81 in addition to the previously cleared P-61, the Console CS-7, and other peripherals. Additionally, the AeroSync radiography sequence, a non-tethered operation mode, is modified to allow a user to operate an FPD without the Console CS-7: a user can capture images and store them in the FPD memory until the readout images are transferred to the Console CS-7 by the user.

    The system is intended for use replacing a radiographic film/screen system in general-purpose diagnostic procedures of human anatomy. The system can be used in conjunction with currently cleared AeroDR FPDs. The P-61, P-71, P-81, and the other compatible FPDs usable in the SKR 3000 are lightweight, mobile FPDs formed in a size compatible with ISO standard cassettes.

    The SKR 3000 performs radiographic imaging of the human body using an X-ray planar detector (FPD) that outputs a digital signal, which is then input into an image processing device. The acquired image is transmitted to a filing system, printer, and image display device as a diagnostic image after the image processing device, Console CS-7, applies image processing to the raw image data.

    The radiography sequences synchronized to the trigger timing of X-ray irradiation are the same as in the predicate device, which requires an SRM / S-SRM connection between the SKR 3000 and the X-ray generator. AeroSync allows an FPD to be used without a wired SRM / S-SRM connection; in this mode the FPD itself begins acquiring an image when it detects X-ray irradiation. The compatible X-ray systems are the same as those of the previously cleared SKR 3000.

    The FPDs used in the SKR 3000 can communicate with the image processing device through wired Ethernet and/or Wireless LAN (IEEE 802.11a/n and FCC compliant). WPA2-PSK (AES) encryption is adopted to secure the wireless connection.

    The SKR 3000 is designed to comply with the following standards: AAMI/ANSI ES 60601-1 (Ed.3.1), IEC 60601-1-2, and ISO 10993-1.
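    The AeroSync description above says the detector itself starts acquisition when it senses incident X-rays, rather than being triggered over the SRM / S-SRM connection. The sketch below is a toy illustration of that kind of threshold-based triggering; the function names and threshold are hypothetical and not drawn from the submission.

```python
import time

def wait_for_exposure(read_detector_signal, threshold: float, poll_interval_s: float = 0.001):
    """Poll the detector's exposure-detection signal and return once X-ray onset is sensed,
    mimicking sync-cable-free (AeroSync-style) triggering.

    read_detector_signal -- callable returning the current dark-corrected detector signal
    threshold            -- hypothetical trigger level separating dark noise from X-ray onset
    """
    while read_detector_signal() < threshold:   # idle until irradiation is detected
        time.sleep(poll_interval_s)
    return time.monotonic()                     # timestamp at which image integration begins
```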

    AI/ML Overview

    Based on the provided text, the device in question, the KONICA MINOLTA SKR 3000, is an X-ray imaging system, and the submission covers a modification to an already cleared predicate device (K162504, the original SKR 3000) by adding new Flat Panel Detectors (FPDs) (P-71 and P-81) and a modified radiography sequence (AeroSync).

    The crucial point from the document is:
    "The concurrence study is not necessary according to the new SSXI Guidance because the proposed P-71 / P-81 have same imaging performance but overall change in the dimensions only. Besides, the results of risk management did not require clinical studies to demonstrate the substantial equivalency of the proposed device."

    This statement indicates that the regulatory body determined that a full-fledged clinical study or comparative effectiveness study was NOT required because the new FPDs primarily change in physical dimensions while maintaining the same imaging performance as the already cleared predicate device. Therefore, many of the typical acceptance criteria and study aspects you'd expect for a novel AI or diagnostic device are not detailed in this 510(k) summary because they were not deemed necessary for this specific submission.

    However, I can extract information relevant to the general performance testing and the implicit criteria used for this type of device based on what is stated:


    Acceptance Criteria and Device Performance (Implicit/Inferred from the document's context)

    For this specific 510(k) submission regarding the modification of the SKR 3000, the primary acceptance criteria revolved around demonstrating that the new components (P-71 and P-81 FPDs and the AeroSync modification) do not degrade the established performance of the predicate device and comply with relevant safety standards. The document explicitly states that the new FPDs have the "same imaging performance" as the predicate.

    Acceptance Criteria (Inferred from regulatory context for X-ray systems) and Reported Device Performance (as stated or implied):

    • Imaging Performance Equivalence (e.g., spatial resolution, contrast resolution, dose efficiency, DQE, MTF): "P-71 / P-81 have same imaging performance" as predicate P-61 (specific metrics not provided, but equivalence asserted).
    • Safety and Electrical Compliance: "The system is in conformance with the standards described above [AAMI/ANSI ES 60601-1 (Ed.3.1), IEC 60601-1-2]" and "same standards to those of predicate device."
    • Biocompatibility: "All patient contact materials... are identical to those of predicate device, and are evaluated under ISO 10993-1 and determined as acceptable."
    • Mechanical Integrity/Durability: Explicitly stated "overall change in the dimensions" for P-71 and P-81, implying new mechanical properties were evaluated (no specific performance data given, but assumed within the "performance tests").
    • Functional Equivalence (AeroSync): "The system including a modified radiography sequence has same operational principles and designing as those of the predicate device." "Allows user to use a FPD without console CS-7" (new feature with stated functionality).
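
    For context on the metrics named in the first bullet above (DQE, MTF), a commonly used working relation from general imaging physics, not quoted from the 510(k), ties the detective quantum efficiency to the presampled MTF, the normalized noise power spectrum (NNPS), and the incident photon fluence:

        \mathrm{DQE}(f) = \frac{\mathrm{MTF}^{2}(f)}{\bar{q}\,\cdot\,\mathrm{NNPS}(f)}

    Equivalence claims such as "same imaging performance" are generally supported by showing that curves like these for a new detector track those of the predicate within stated tolerances; no such curves are reported in this summary.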

    Study Details (as per the provided text for this specific 510(k) modification)

    1. Sample sizes used for the test set and the data provenance:

      • The document mentions "Performance tests" were performed "according to the 'Guidance for the Submission of 510(k)s for Solid State X-ray Imaging Devices' and the other verification and validation including the items required by the risk analysis."
      • No specific sample sizes (e.g., number of images, patients) for the test set are mentioned. This is typical for submissions focused on hardware modifications retaining "imaging performance" rather than proving a new diagnostic capability. The tests would likely involve phantom images and bench testing rather than patient data in large quantities.
      • Data provenance is not specified. Given the nature of the tests (bench/phantom), patient data provenance (country, retrospective/prospective) is likely irrelevant for this specific submission, as it's not a clinical performance study.
    2. Number of experts used to establish the ground truth for the test set and the qualifications of those experts:

      • Not applicable / Not specified. Since this is primarily a hardware/engineering performance and safety validation (demonstrating equivalence to a predicate system rather than a new diagnostic claim), there is no mention of human experts establishing "ground truth" for diagnostic purposes. The ground truth for engineering tests typically comes from calibrated measurement equipment and reference standards.
    3. Adjudication method (e.g., 2+1, 3+1, none) for the test set:

      • Not applicable / None specified. No adjudication method is mentioned as there's no diagnostic ground truth established by human readers for comparative or standalone performance in this context.
    4. If a multi-reader multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI vs. without AI assistance:

      • No, an MRMC comparative effectiveness study was explicitly NOT done. The document states: "The concurrence study is not necessary according to the new SSXI Guidance because the proposed P-71 / P-81 have same imaging performance but overall change in the dimensions only." This device is a digital radiography system, not an AI-based diagnostic tool.
    5. If a standalone (i.e., algorithm-only, without human-in-the-loop) performance assessment was done:

      • Not applicable in the context of an "algorithm." This device is an X-ray imaging system. The performance tests would be "standalone" in the sense that the device itself is tested for its physical and imaging characteristics (e.g., MTF, DQE, spatial resolution) using phantoms and test procedures, without a human in a diagnostic loop.
    6. The type of ground truth used (expert consensus, pathology, outcomes data, etc.):

      • Engineering/Physical Ground Truth: The "ground truth" for the performance tests would be based on calibrated measurements against established standards for X-ray imaging performance (e.g., physical properties of test phantoms, known radiation doses, and measurements of image quality metrics such as spatial resolution and contrast-to-noise ratio; a brief worked example of one such metric follows this list). It is not expert consensus, pathology, or outcomes data related to disease diagnosis.
    7. The sample size for the training set:

      • Not applicable / Not mentioned. This device is a traditional X-ray imaging system, not an AI/machine learning device that requires a training set.
    8. How the ground truth for the training set was established:

      • Not applicable. As no training set is mentioned, no ground truth establishment for it is relevant.
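
    To make the phantom-based notion of ground truth referenced in item 6 concrete, the sketch below computes a contrast-to-noise ratio (CNR) from two regions of interest in a phantom image. The function name, ROI coordinates, and synthetic phantom values are illustrative assumptions, not anything specified in the 510(k).

        import numpy as np

        def contrast_to_noise_ratio(image: np.ndarray,
                                    signal_roi: tuple,
                                    background_roi: tuple) -> float:
            """CNR = |mean(signal ROI) - mean(background ROI)| / std(background ROI).

            `image` is a linearized phantom exposure; the ROI slices would be
            placed by the test engineer over a contrast insert and a uniform area.
            """
            signal = image[signal_roi]
            background = image[background_roi]
            return float(abs(signal.mean() - background.mean()) / background.std())

        # Illustrative use on a synthetic phantom: uniform noisy background plus
        # a brighter square insert (all numbers are made up for the example).
        rng = np.random.default_rng(0)
        phantom = rng.normal(loc=100.0, scale=5.0, size=(256, 256))
        phantom[100:140, 100:140] += 20.0  # simulated contrast insert
        cnr = contrast_to_noise_ratio(
            phantom,
            signal_roi=(slice(100, 140), slice(100, 140)),
            background_roi=(slice(0, 60), slice(0, 60)),
        )
        print(f"CNR = {cnr:.1f}")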

    Summary of Key Takeaways from the Document:

    • This 510(k) is for a modification to an existing X-ray system (SKR 3000), primarily introducing new FPDs with different physical dimensions but "same imaging performance."
    • Due to the nature of the modification (maintaining imaging performance and primarily changing dimensions), full clinical studies or comparative effectiveness studies (like MRMC) were explicitly deemed unnecessary by the FDA for this submission.
    • The "performance tests" focused on ensuring the new components met engineering and safety standards and maintained equivalence to the predicate device's established imaging performance.
    • "Ground truth" in this context refers to physical measurements and adherence to technical specifications rather than medical diagnostic outcomes or expert consensus on clinical cases.

    K Number
    K171716
    Device Name
    SKR 3000
    Date Cleared
    2017-08-25

    (77 days)

    Product Code
    Regulation Number
    892.1680
    Reference & Predicate Devices
    N/A
    Why did this record match?
    Device Name :

    SKR 3000

    AI/ML, SaMD, IVD (In Vitro Diagnostic), Therapeutic, Diagnostic, Is PCCP Authorized, Third-party, Expedited review
    Intended Use

    The SKR 3000 is indicated for use in generating radiographic images of human anatomy. It is intended to replace a radiographic film/screen system in general-purpose diagnostic procedures.

    The SKR 3000 is not indicated for use in mammography, tomography and angiography applications.

    Device Description

    Not Found

    AI/ML Overview

    It looks like you've provided an FDA 510(k) clearance letter for the Konica Minolta SKR 3000, which is a stationary x-ray system. However, this document does not contain information about acceptance criteria or a study proving the device meets those criteria, particularly in the context of an AI-powered medical device.

    The provided text only states:

    • The device name (SKR 3000)
    • Its regulation number and name (21 CFR 892.1680, Stationary x-ray system)
    • Its regulatory class (II)
    • Its intended use: "generating radiographic images of human anatomy" and "intended to replace radiographic film/screen system in general-purpose diagnostic procedures."
    • Its contraindications: "not indicated for use in mammography, tomography and angiography applications."
    • That it is a Prescription Use device.

    There is no information within this document about:

    • Specific image quality metrics or performance targets (acceptance criteria).
    • Any studies conducted to validate its performance beyond simply replacing film/screen systems.
    • Any AI component, ground truth, expert readers, or sample sizes for training/testing.

    Therefore, I cannot fulfill your request using the provided text. The document is a clearance letter for a conventional X-ray system, not an AI/ML-enabled device, and does not detail performance studies or acceptance criteria for an AI algorithm.


    K Number
    K162504
    Device Name
    SKR 3000
    Date Cleared
    2016-10-03

    (26 days)

    Product Code
    Regulation Number
    892.1680
    Reference & Predicate Devices
    Why did this record match?
    Device Name :

    SKR 3000

    AI/ML, SaMD, IVD (In Vitro Diagnostic), Therapeutic, Diagnostic, Is PCCP Authorized, Third-party, Expedited review
    Intended Use

    The SKR 3000 is indicated for use in generating radiographic images of human anatomy. It is intended to replace a radiographic film/screen system in general-purpose diagnostic procedures.

    The SKR 3000 is not indicated for use in mammography, tomography and angiography applications.

    Device Description

    The SKR 3000, consisting of the new FPD P-61, the Console CS-7, and other peripherals, is intended to replace a radiographic film/screen system in general-purpose diagnostic procedures of human anatomy. The system can be used in conjunction with currently cleared AeroDR FPDs. The P-61 and the other compatible FPDs used in the SKR 3000 are lightweight, mobile FPDs formed in sizes compatible with ISO standard-size cassettes.

    The SKR 3000 performs radiography imaging of the human body using an X-ray planar detector (FPD) that outputs a digital signal, which is then input into an image processing device. After the image processing device, the Console CS-7, applies image processing to the raw image data, the acquired image is transmitted to a filing system, printer, and image display device as a diagnostic image.

    AI/ML Overview

    The provided text describes the Konica Minolta SKR 3000, a digital radiography system, and its substantial equivalence to a predicate device (K141271 - AeroDR SYSTEM 2). The document suggests that a formal clinical study was not required to demonstrate substantial equivalence.

    Here's an analysis of the requested information based on the provided text:

    1. A table of acceptance criteria and the reported device performance:

    The document states: "The performance tests according to the 'Guidance for the Submission of 510(k)s for Solid State X-ray Imaging Devices' and the other verification and validation including the items required by the risk analysis for the SKR 3000 were performed and the results demonstrated that the predetermined acceptance criteria were met."

    It also mentions a concurrence study: "The concurrence study in a way of comparing images between the proposed device and the predicate device was conducted, and the qualified persons have affirmed and have concluded that both images of proposed P-61 and predicate AeroDR P-51 are equivalent and have sufficient capabilities for the intended purpose of the device."

    However, the document does not explicitly list specific quantitative acceptance criteria (e.g., minimum DQE, specific MTF values, contrast-to-noise ratio requirements) or specific performance metrics from the SKR 3000. It merely states that "predetermined acceptance criteria were met" and that images were "equivalent."

    Acceptance Criteria (General) and Reported Device Performance:

    • Meet "Guidance for the Submission of 510(k)s for Solid State X-ray Imaging Devices" performance tests: Performed; results "demonstrated that the predetermined acceptance criteria were met."
    • Meet other verification and validation tests (including risk analysis requirements): Performed; results "demonstrated that the predetermined acceptance criteria were met."
    • Image equivalence to predicate device (AeroDR SYSTEM 2): Concurrence study concluded images from the SKR 3000 and the predicate are "equivalent and have sufficient capabilities for the intended purpose."

    2. Sample size used for the test set and the data provenance (e.g. country of origin of the data, retrospective or prospective):

    The document does not specify the sample size for the test set used in the "concurrence study" or the other "performance tests." It also does not provide information on the country of origin of the data or whether it was retrospective or prospective.

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts (e.g. radiologist with 10 years of experience):

    The document states "qualified persons have affirmed and have concluded," but it does not specify the number of experts or their qualifications.

    4. Adjudication method (e.g. 2+1, 3+1, none) for the test set:

    The document does not specify any adjudication method used in the concurrence study or other performance tests.

    5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI vs. without AI assistance:

    The document does not mention a multi-reader multi-case (MRMC) comparative effectiveness study. The device is a digital radiography system, not an AI-powered diagnostic tool, so an MRMC study comparing human readers with and without AI assistance would not be applicable in this context. The study performed was a "concurrence study" comparing images from the new device to the predicate device.

    6. If a standalone (i.e., algorithm-only, without human-in-the-loop) performance assessment was done:

    The device described is a digital X-ray system (hardware and associated image processing), not an AI algorithm. Therefore, the concept of "standalone (algorithm only)" performance without human-in-the-loop is not directly applicable in the way it would be for an AI diagnostic algorithm. The performance tests would be evaluating the image acquisition and processing capabilities of the system itself. The "concurrence study" did evaluate the images produced by the device.

    7. The type of ground truth used (expert consensus, pathology, outcomes data, etc):

    For the "concurrence study," the ground truth appears to be established by expert judgment that images from the proposed device were "equivalent" and had "sufficient capabilities" compared to the predicate device's images. There is no mention of pathology, outcomes data, or other objective ground truth methods.

    8. The sample size for the training set:

    The document does not mention a training set, as the SKR 3000 is an imaging acquisition and processing system, not a machine learning model that requires a labeled training set in the conventional sense. The "training" described would be the development and calibration of the image processing algorithms based on engineering principles and potentially internal testing, not a separate labeled dataset.

    9. How the ground truth for the training set was established:

    As no training set (in the context of machine learning) is explicitly mentioned, there is no information on how its ground truth was established. The "ground truth" for the device's technical specifications and image quality would have been established through engineering design, optical/X-ray physics principles, and quality assurance testing against established imaging standards (a sketch of one such image quality measurement follows this list).
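
    As an illustration of the kind of objective image quality measurement such testing relies on, the sketch below estimates a one-dimensional MTF from an edge profile (the edge spread function is differentiated into a line spread function, Fourier-transformed, and normalized). The pixel pitch and synthetic edge are assumptions for the example; a real presampled MTF measurement would follow a slanted-edge, oversampled procedure per the relevant IEC methods.

        import numpy as np

        def mtf_from_edge_profile(edge_profile: np.ndarray, pixel_pitch_mm: float):
            """Estimate a 1-D MTF from an edge spread function (ESF).

            Differentiate the ESF to obtain the line spread function (LSF),
            take the FFT, and normalize its magnitude to 1 at zero frequency.
            Returns (spatial frequencies in cycles/mm, MTF values).
            """
            lsf = np.gradient(edge_profile.astype(float))
            spectrum = np.abs(np.fft.rfft(lsf))
            mtf = spectrum / spectrum[0]
            freqs = np.fft.rfftfreq(lsf.size, d=pixel_pitch_mm)
            return freqs, mtf

        # Illustrative use on a synthetic blurred edge (values are made up).
        x = np.arange(256)
        esf = 1.0 / (1.0 + np.exp(-(x - 128) / 2.0))  # smooth step, i.e. a blurred edge
        freqs, mtf = mtf_from_edge_profile(esf, pixel_pitch_mm=0.175)
        print(f"MTF at {freqs[10]:.2f} cycles/mm: {mtf[10]:.2f}")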

