
510(k) Data Aggregation

    K Number
    K052938
    Device Name
    CR-PRO
    Manufacturer
    Date Cleared
    2005-11-17

    (28 days)

    Product Code
    Regulation Number
    892.1680
    Reference & Predicate Devices
    Predicate For
    Intended Use

    The CR-Pro is a free standing, laser driven, image digitizer intended to produce digital copies of phosphor plate recorded images in 16 levels of gray scale. The digital copies are transmitted to an internal Intel Pentium 4 based personal computer (PC) where they may be displayed, processed, or compressed for archiving or transmission via computer networks to other medical facility sites.

    This device is not to be used for primary imaging diagnosis in mammography.

    Device Description

    The eRadlink CR-Pro is a digitizing scanner that converts radiographic film transparency images to digital format. This is accomplished by using a laser-beam light source and a proprietary sealed path of fiber optics. The new technology provides superior image quality, requires no internal optics cleaning or optical alignment, and is inherently highly accurate and reliable.

    Phosphor plates, from a minimum of 10 inches to a maximum of 14 inches in width, are driven past the digitizing laser beam by a clocked stepping motor. Scanned data is electronically converted from analog to 16-bit digital gray scale and transmitted to the internal computer for processing.
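The analog-to-16-bit conversion step can be sketched as follows. This is a minimal illustration only; the voltage range is a made-up assumption, since the summary states only the bit depth of the output.

```python
def quantize_16bit(voltage, v_min=0.0, v_max=5.0):
    """Map an analog detector voltage onto a 16-bit gray value (0-65535).

    The 0-5 V range is a hypothetical example; the 510(k) summary only
    states that scanned data is converted to 16-bit digital gray scale.
    """
    clamped = min(max(voltage, v_min), v_max)
    return round((clamped - v_min) / (v_max - v_min) * 65535)

# 16-bit conversion yields 65,536 distinct gray levels.
assert quantize_16bit(0.0) == 0
assert quantize_16bit(5.0) == 65535
assert quantize_16bit(6.0) == 65535  # out-of-range input is clamped
```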

    AI/ML Overview

    Here's an analysis of the provided text regarding the eRadlink CR-Pro's acceptance criteria and studies:

    Summary of Device Performance and Acceptance Criteria

    The provided 510(k) summary for the eRadlink CR-Pro does not explicitly define "acceptance criteria" in terms of specific performance thresholds for image quality metrics (e.g., SNR, spatial resolution, contrast resolution) that the device must meet. Instead, the "Effectiveness" section states:

    "Program testing and calibration using Beryllium gray-scale wedge, body part phantoms and typical x-ray plate samples has demonstrated the CR-Pro’s conformance to its defined specifications."

    This implies that the acceptance criteria are tied to the device's defined specifications and successful demonstration of conformance through specific tests. While the specifications for various hardware components (like dynamic range, pixel/mm, gray scale) are listed in comparison to a predicate device, specific acceptance thresholds for image quality when tested with phantoms are not detailed.

    Table of Acceptance Criteria and Reported Device Performance

    As specific numerical acceptance criteria for image quality from phantom studies are not explicitly stated as performance thresholds, I will present the relevant specifications from the comparison tables which implicitly represent the device's targeted performance based on the predicate device.

    | Acceptance Criterion (Implicit from Predicate & Device Spec) | Reported Device Performance (eRadlink CR-Pro) |
    |---|---|
    | Dynamic Range | 0.0 - 3.5 OD (similar to predicate's 0.5 - 3.8 OD) |
    | Gray Scale Depth | 16 bits transmitted (superior to predicate's 8 or 12 bits) |
    | Spot Size | 100 µm (matches predicate) |
    | Pixel/mm | 8.0 (comparable to predicate's 10.09) |
    | Digitizing Rate | 100 lines/sec (comparable to predicate's 115 lines/sec) |
    | Image Quality Conformance | Demonstrated conformance to defined specifications through program testing and calibration using Beryllium gray-scale wedge, body part phantoms, and typical x-ray plate samples |
    | Compliance with Safety Standards | IEC 60601-1, -2; 21 CFR 1040.10; DICOM 3:2004; EN 55022:1998; EN 55024:1998; EN 61000-3-2:2000; EN 61000-3-3:1995; SMPTE RP 215-2001; SMPTE 349M-2001 |
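The dynamic-range and gray-scale rows above can be related numerically: optical density is logarithmic (transmittance T = 10^-OD), so a 0.0-3.5 OD span corresponds to an intensity ratio of about 3162:1. A quick sketch of the minimum bit depth needed to cover such a ratio:

```python
import math

def min_bits_for_od_range(od_span):
    """Smallest bit depth whose code range covers the intensity ratio
    implied by an optical-density span (T = 10**-OD, so the end-to-end
    intensity ratio is 10**od_span)."""
    intensity_ratio = 10 ** od_span
    return math.ceil(math.log2(intensity_ratio))

# A 0.0-3.5 OD span is an intensity ratio of about 3162:1, which needs
# at least 12 bits; the CR-Pro's 16-bit output covers it comfortably.
assert min_bits_for_od_range(3.5) == 12
```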

    Study Information

    1. Sample size used for the test set and the data provenance (e.g. country of origin of the data, retrospective or prospective)

      • The document does not specify a numerical sample size for a clinical test set.
      • The effectiveness was demonstrated through "program testing and calibration using Beryllium gray-scale wedge, body part phantoms and typical x-ray plate samples." These are laboratory-based tests using inanimate objects, not a clinical trial with patient data.
      • Data provenance: Not applicable as it's not clinical data. The tests would have been performed by the manufacturer, eRadlink Inc., presumably in Torrance, California, USA, where they are based. The study is prospective in the sense that the device was tested to demonstrate current functionality.
    2. Number of experts used to establish the ground truth for the test set and the qualifications of those experts (e.g. radiologist with 10 years of experience)

      • Not applicable. As the testing involved phantoms and calibration, there was no need for expert clinicians to establish "ground truth" in the diagnostic sense. The ground truth would be based on the known physical properties and measurements from the test objects and the expected output of the device as per its design specifications.
    3. Adjudication method (e.g. 2+1, 3+1, none) for the test set

      • Not applicable, as there was no clinical test set requiring adjudication.
    4. If a multi-reader multi-case (MRMC) comparative effectiveness study was done, what was the effect size of how much human readers improve with AI versus without AI assistance?

      • No MRMC comparative effectiveness study was done. This device is a digitizing scanner, not an AI-powered diagnostic tool. The submission focuses on demonstrating substantial equivalence to a predicate device for its core function of converting radiographic film to digital format.
    5. If a standalone study (i.e., algorithm-only performance without a human in the loop) was done

      • The "effectiveness" testing described is inherently standalone in the sense that the device performs its digitizing function, and its output is assessed against its specifications using physical test objects. There is no "human-in-the-loop" performance being evaluated in this context, nor is there an "algorithm only" performance reported in the way it might for an AI diagnostic device.
    6. The type of ground truth used (expert consensus, pathology, outcomes data, etc.)

      • The ground truth for the "effectiveness" testing consisted of the known physical properties of the Beryllium gray-scale wedge and body part phantoms, and the defined specifications of the CR-Pro device. It's a technical ground truth related to image capture and conversion accuracy, not a clinical ground truth.
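A conformance check of this kind, comparing digitized wedge readings against the known step densities, might be sketched as follows. The step values and tolerance are hypothetical; the summary states neither.

```python
def check_wedge_conformance(known_od, measured_od, tolerance=0.05):
    """Compare digitizer-measured optical densities against the known
    step values of a gray-scale wedge. The 0.05 OD tolerance is a
    hypothetical figure; the 510(k) summary does not state one."""
    return all(abs(m - k) <= tolerance for k, m in zip(known_od, measured_od))

# Hypothetical wedge steps and two example measurement runs.
steps = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
assert check_wedge_conformance(steps, [0.52, 0.99, 1.51, 2.03, 2.48, 3.01])
assert not check_wedge_conformance(steps, [0.50, 1.00, 1.50, 2.00, 2.50, 3.20])
```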
    7. The sample size for the training set

      • Not applicable. This device is a hardware digitizer with associated software for processing and transmission. It does not appear to involve machine learning or AI models that would require a "training set" in the conventional sense. The software functions (e.g., image rotation, DICOM send/receive) are standard functionalities, not learned from data.
    8. How the ground truth for the training set was established

      • Not applicable, as there was no training set.

    K Number
    K020243
    Device Name
    LASERPRO 16
    Manufacturer
    Date Cleared
    2002-03-05

    (41 days)

    Product Code
    Regulation Number
    892.2030
    Reference & Predicate Devices
    Predicate For
    Intended Use

    The LaserPro 16 is a desktop laser image digitizer intended to produce digital copies of radiological film in 10-bit gray scale. The digital copies are transmitted to a conventional personal computer (PC), where they may be transmitted to a PACS (Picture Archiving and Communication System) or via other networks to other medical facility sites.

    Device Description

    The eRadLink LaserPro 16 is a digitizing scanner that converts radiographic film transparency images to digital format. This is accomplished by using a laser-beam light source and a proprietary sealed optic path. There are no internal lenses, mirrors, or electro-optic devices. The new technology provides superior image quality, requires no internal optics cleaning or optical alignment, and is inherently highly accurate and reliable. Film, from a minimum of 2 inches to a maximum of 14 inches in width and from a minimum of 2 inches to a maximum of over 52 inches in length, is driven past the scanning laser beam by a clocked stepping motor. Scanned data is electronically converted from analog to 16-bit digital gray scale and transmitted to the host computer in DICOM format.
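As an illustrative back-of-the-envelope calculation (not a stated specification), the sampling density and digitizing rate listed later in this record imply roughly the following scan times:

```python
def scan_time_seconds(length_inches, lines_per_mm=8.6, lines_per_sec=100):
    """Rough scan time for a film of the given length, using the
    LaserPro 16's stated 8.6 pixels/mm sampling and 100 lines/sec rate
    at 16-bit gray scale. Illustrative arithmetic, not a device spec."""
    length_mm = length_inches * 25.4
    return length_mm * lines_per_mm / lines_per_sec

# A standard 14" x 17" film scanned along its 17" length:
# 431.8 mm * 8.6 lines/mm ~ 3713 lines, about 37 seconds at 100 lines/sec.
assert round(scan_time_seconds(17)) == 37
```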

    AI/ML Overview

    Here's an analysis of the provided text regarding the Laser Film Digitizer (eRadLink LaserPro 16), focusing on acceptance criteria and the study that proves device performance:

    1. Table of Acceptance Criteria and Reported Device Performance

    The provided document (K020243) is a 510(k) Summary for a medical device. It focuses on demonstrating substantial equivalence to a predicate device rather than explicitly defining and meeting specific, quantitative acceptance criteria for safety and effectiveness in the way a clinical trial might.

    The "Effectiveness" section [4] states:
    "Program testing and calibration using Stoeffer T4110 gray-scale strip, linearity test patterns and typical x-ray film samples has demonstrated the LaserPro 16's conformance to its defined specifications."

    This sentence implies that the "defined specifications" are the acceptance criteria, and the testing performed confirms conformance. However, the specific numerical values of these "defined specifications" are not explicitly listed in the document as measurable acceptance criteria with corresponding performance results.

    Instead, the document primarily compares the device's features and characteristics to a predicate device. If we were to infer "acceptance criteria" from the comparison, they would generally relate to matching or exceeding the capabilities of the predicate device.

    Inferred "Acceptance Criteria" from Predicate Comparison and Reported Performance:

    | Feature (Inferred Acceptance Criterion: comparison to predicate) | Predicate (Lumiscan 75) Performance | eRadLink LaserPro 16 Reported Performance | Comments / Rationale for Inferred Criterion |
    |---|---|---|---|
    | Scan Size (min) | 7" x 7" | 2" x 2" | Meets/exceeds the predicate's minimum scan size; inferred criterion of "at least as good as predicate." |
    | Scan Size (max) | 14" x 36" | 14" x 15' (14" x 180") | Significantly exceeds the predicate's maximum scan size; inferred criterion of "at least as good as predicate." |
    | Spot Size | 100 µm | 116 µm | Slightly larger spot size (lower resolution capability) than the predicate. Without explicit justification this might concern some reviewers, but in a 510(k) it was deemed "substantially equivalent"; minor differences are implicitly accepted if overall safety and effectiveness are maintained. |
    | Dynamic Range | 0.5 - 3.8 OD | 0.0 - 4.1 OD | Meets/exceeds the predicate's dynamic range, offering a wider range. |
    | Gray Scale | 12 bits | 8, 12, or 16 bits | Meets/exceeds the predicate's gray-scale capability, offering more options. |
    | Digitizing Rate | 115 lines/sec | 100 lines/sec (16-bit gray scale) | Slightly slower at 16-bit gray scale, but the predicate's 115 lines/sec may be at a lower bit depth (e.g., 12-bit). If the predicate's rate was also at 16-bit, the LaserPro is slightly below it; the FDA nonetheless found it substantially equivalent. |
    | Pixel/mm | 10.09 | 8.6 | Fewer pixels/mm (lower resolution) than the predicate (8.6 vs. 10.09); a minor difference accepted as part of substantial equivalence. |
    | Conformance to "Defined Specifications" | N/A (predicate has its own specs) | Demonstrated by program testing & calibration | The most direct statement of meeting criteria, but the specifications themselves are not detailed. |
    | Safety Compliance | N/A (predicate has its own compliance) | UL 260, CSA 22.2, TUV, IEC 601-1 approved | The device must meet recognized safety standards. |
    | EMC Requirements | N/A (predicate has its own compliance) | CISPR 11 Class B | The device must meet recognized electromagnetic compatibility standards. |
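The spot-size and pixel/mm rows are two views of the same sampling geometry. A short sketch of the Nyquist-limited resolution and pixel pitch implied by the table's figures:

```python
def nyquist_lp_per_mm(pixels_per_mm):
    """Highest spatial frequency (line pairs/mm) resolvable at a given
    sampling density, per the Nyquist criterion of two samples per line pair."""
    return pixels_per_mm / 2

def pixel_pitch_um(pixels_per_mm):
    """Center-to-center sample spacing in micrometres."""
    return 1000 / pixels_per_mm

# LaserPro 16: 8.6 pixels/mm -> 4.3 lp/mm limit and ~116 um pitch,
# consistent with the 116 um spot size reported in the table.
assert nyquist_lp_per_mm(8.6) == 4.3
assert round(pixel_pitch_um(8.6)) == 116
```

The predicate's 10.09 pixels/mm works out to roughly 5.0 lp/mm, which is the quantitative basis for the "lower resolution" comments in the table.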

    Study That Proves the Device Meets Acceptance Criteria:

    The document describes the study in section (8) EFFECTIVENESS:

    "Program testing and calibration using Stoeffer T4110 gray-scale strip, linearity test patterns and typical x-ray film samples has demonstrated the LaserPro 16's conformance to its defined specifications."

    2. Sample Size Used for the Test Set and Data Provenance

    • Sample Size: Not explicitly stated. The description mentions "typical x-ray film samples" but does not quantify the number of samples used (e.g., number of films, number of images).
    • Data Provenance: Not explicitly stated. The document does not specify the country of origin of the "typical x-ray film samples" or if they were retrospective or prospective. Given the context of a 510(k) submission for a digitizer, these would likely be existing, retrospective films.

    3. Number of Experts Used to Establish Ground Truth for the Test Set and Qualifications

    • Number of Experts: Not mentioned.
    • Qualifications of Experts: Not mentioned.

    It's highly probable that for a device like a film digitizer, "ground truth" would be established by comparing the digitized output to the original film's properties (e.g., optical density, resolution, linearity) rather than relying on human interpretation of the digitized image for diagnostic accuracy comparisons at this stage of approval. Therefore, expert radiologists might not have been directly involved in establishing the "ground truth" for the technical performance of the digitizer itself.

    4. Adjudication Method for the Test Set

    • No adjudication method is mentioned, as the study described is technical validation of digitization performance against physical standards (gray-scale strips, linearity patterns) and "typical x-ray film samples," not a diagnostic study requiring human interpretation and consensus.

    5. If a Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study was Done

    • No Multi-Reader Multi-Case (MRMC) comparative effectiveness study was performed or described. The study focuses on the technical performance of the digitizer, not its impact on human reader performance or diagnostic accuracy. There is no mention of comparing human readers with AI assistance vs. without AI assistance.

    6. If a Standalone (Algorithm Only Without Human-in-the-Loop Performance) Study was Done

    • Yes, a standalone study was done. The "Program testing and calibration" described in section (8) "EFFECTIVENESS" is a standalone evaluation of the device's technical specifications. The device, being a digitizer, is the algorithm/hardware combination that performs the digitization process. The study evaluates its output against known standards, independent of human interpretation or interaction beyond setting up the test.

    7. The Type of Ground Truth Used

    • The ground truth used appears to be based on technical standards and physical measurements.
      • Stoeffer T4110 gray-scale strip: A standardized tool used to evaluate the gray-scale reproduction and dynamic range of imaging systems.
      • Linearity test patterns: Used to assess the geometric accuracy and linearity of the digitization process.
      • Typical x-ray film samples: These would have known optical densities, resolutions, and content, allowing for evaluation of how accurately the digitizer captures these features.
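A linearity evaluation of this kind could, for example, fit a line to measured versus expected values and report the coefficient of determination. This is a stdlib-only sketch under that assumption; the document does not describe the actual method used.

```python
def linearity_r_squared(expected, measured):
    """Coefficient of determination (R^2) for measured digitizer output
    against the expected values of a linearity test pattern, via a plain
    least-squares line fit."""
    n = len(expected)
    mean_x = sum(expected) / n
    mean_y = sum(measured) / n
    sxx = sum((x - mean_x) ** 2 for x in expected)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(expected, measured))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    ss_res = sum((y - (slope * x + intercept)) ** 2
                 for x, y in zip(expected, measured))
    ss_tot = sum((y - mean_y) ** 2 for y in measured)
    return 1 - ss_res / ss_tot

# A perfectly linear response gives R^2 == 1.0.
assert linearity_r_squared([0, 1, 2, 3], [10, 20, 30, 40]) == 1.0
```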

    8. The Sample Size for the Training Set

    • Not applicable. This device is a film digitizer, not an AI/ML algorithm that requires a "training set" in the conventional sense. Its "training" is in its design, engineering, and factory calibration to meet physical specifications, not in learning from a dataset.

    9. How the Ground Truth for the Training Set was Established

    • Not applicable. As explained above, there is no "training set" for a film digitizer in the context of the provided document. The ground truth for its calibration and validation relies on standardized physical references like the Stoeffer T4110 and linearity test patterns.