
510(k) Data Aggregation

    K Number: DEN040012
    Manufacturer:
    Date Cleared: 2004-12-23 (50 days)
    Product Code:
    Regulation Number: 862.2570
    Reference & Predicate Devices: N/A
    Predicate For: N/A
    Intended Use

    The Affymetrix GeneChip® Microarray Instrumentation System consisting of GeneChip® 3000Dx scanner with autoloader, FS450Dx fluidics station and GCOSDx software is intended to measure fluorescence signals of labeled DNA target hybridized to GeneChip® arrays.

    Device Description

    The Affymetrix GeneChip Microarray Instrumentation System is designed to work with microarrays based on Affymetrix GeneChip® technology. The system includes the FS450Dx Fluidics Station, GCS3000Dx Scanner, and GCOSDx Software. The FS450Dx performs hybridization, washing, and staining. The GCS3000Dx Scanner is a wide-field, epifluorescent, confocal, scanning laser microscope that scans the chip after staining. The GCOSDx Software provides the interface between the user and the instruments, controls the instruments, and manages array processing and data collection.

    AI/ML Overview

    The provided document describes the Affymetrix GeneChip Microarray Instrumentation System, which includes the FS450Dx Fluidics Station, GCS3000Dx Scanner, and GCOSDx Software. This system is intended to measure fluorescence signals of labeled DNA target hybridized to GeneChip® arrays, to be used with separately cleared GeneChip microarray assays.

    The submission is for a new device with no predicate. The performance characteristics described focus on analytical reproducibility; accuracy, linearity, carryover, and interfering substances are to be assessed during the clearance of the individual assays run on the system.

    Here is an analysis of the provided information:

    1. Table of Acceptance Criteria and Reported Device Performance

    Given that this is an instrumentation system intended for use with various assays, the acceptance criteria are largely focused on the consistency and reproducibility of the hardware components. The document explicitly mentions targets for uniformity CVs.

    • Scanner Uniformity (Global)
      • Acceptance criterion: CV for global uniformity < 10% (for the array used to assess scanner performance alone).
      • Reported performance: the between-scan %CV range for all bright features was 4.2-5.2% for scanner 1, 3.2-4.0% for scanner 2, and 0.3-1.9% for scanner 3. The between-scanner %CV ranges for all bright features were 4.4-10.9% across different time points. Individual scanners therefore met the <10% target, while cross-scanner comparisons slightly exceeded it at later time points.
    • Scanner Uniformity (Local)
      • Acceptance criterion: CV for local uniformity < 1% (each 400 µm² gridded cell with its surrounding 4 cells, averaged over alternating cells of the 32x32 array).
      • Reported performance: not explicitly reported with a specific numerical value; only global CVs are given for scanner performance. The overall average of CVs for 72 images of hybridized control oligonucleotide (reflecting cumulative variability of all system components) was 8.0%.
    • Feature Position Check (Scanner)
      • Acceptance criterion: minimal displacement introduced by the scanner (no explicit numerical criterion).
      • Reported performance: the scanner introduced no more than 2.5 microns of feature displacement.
    • Overall System Reproducibility (Hybridization Uniformity)
      • Acceptance criterion: signal %CV reflecting cumulative variability of all system components.
      • Reported performance: the overall average of CVs for 72 images for the control oligonucleotide was 8.0%; hybridization uniformity was consistent across 36 arrays.
    • Chip-to-Chip Variability (PM Features)
      • Acceptance criterion: signal CV for perfect-match features (no explicit criterion provided).
      • Reported performance: one perfect-match feature had a signal CV > 13%.
    • Chip-to-Chip Variability (MM Features)
      • Acceptance criterion: signal CV for mismatch features (no explicit criterion provided).
      • Reported performance: no mismatch feature had signal variation > 15%.
    • Discrimination Score Reproducibility
      • Acceptance criterion: discrimination scores within the 95% confidence interval.
      • Reported performance: all 36 average discrimination scores (each the average of 2 scanned images per array, across 36 arrays) fell within the 95% confidence interval.
    • System Component Variability (ANOVA)
      • Acceptance criterion: no statistically significant variability introduced by the fluidics station, fluidics station port, or scanner.
      • Reported performance: p-values > 0.05 for the fluidics station, fluidics station port, and scanner, indicating no statistically significant variability introduced.
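The %CV and ANOVA metrics above can be illustrated with a short sketch. The submission's actual analysis pipeline is not described; the intensity values and variable names below are hypothetical, and this is only a minimal example of how a coefficient of variation and a one-way ANOVA F statistic might be computed from replicate scan intensities.

```python
import numpy as np

def percent_cv(values):
    """Coefficient of variation in percent: 100 * sample SD / mean."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA: between-group MS / within-group MS."""
    groups = [np.asarray(g, dtype=float) for g in groups]
    all_vals = np.concatenate(groups)
    grand_mean = all_vals.mean()
    k, n = len(groups), all_vals.size
    ss_between = sum(g.size * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical control-oligonucleotide intensities (arbitrary units) for the
# same chip imaged on three scanners; values are illustrative only.
scanner1 = np.array([1020.0, 980.0, 1005.0, 995.0])
scanner2 = np.array([1010.0, 990.0, 1000.0, 1002.0])
scanner3 = np.array([1015.0, 985.0, 998.0, 1001.0])

# Pooled %CV across all measurements (analogous in spirit to the 8.0% figure),
# and an F statistic for scanner as a factor (the submission reported p > 0.05
# for the fluidics station, port, and scanner factors).
overall_cv = percent_cv(np.concatenate([scanner1, scanner2, scanner3]))
f_stat = one_way_anova_f([scanner1, scanner2, scanner3])
print(f"overall %CV = {overall_cv:.1f}%, scanner F = {f_stat:.3f}")
```

A p-value for the F statistic would additionally require the F-distribution CDF (e.g. from a statistics library); the sketch stops at the statistic itself.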

    2. Sample Size Used for the Test Set and Data Provenance

    • Sample Size (for comprehensive system reproducibility study):
      • Arrays: 36 individual GeneChip microarrays (12 chips on 3 Fluidics Stations, each chip scanned in duplicate on each of 3 scanners).
      • Scans: 72 total scans of the same test material (6 scans per chip: 2 duplicates x 3 scanners).
      • Fluidics Stations: 3
      • Scanners: 3
    • Data Provenance: Not explicitly stated (e.g., country of origin). The data appears to be prospective as it describes a specific study designed and executed to assess the device's performance.
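The counts above are internally consistent; a quick arithmetic check (variable names are ours, not the submission's):

```python
# Reproducing the study's array and scan counts from the figures above.
chips = 12               # chips run per configuration
fluidics_stations = 3
scanners = 3
duplicate_scans = 2      # each array scanned in duplicate per scanner

arrays = chips * fluidics_stations            # 12 chips x 3 stations = 36 arrays
scans_per_chip = duplicate_scans * scanners   # 2 duplicates x 3 scanners = 6
total_scans = chips * scans_per_chip          # 12 chips x 6 scans = 72 images

print(arrays, scans_per_chip, total_scans)    # 36 6 72
```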

    3. Number of Experts Used to Establish the Ground Truth for the Test Set and Qualifications of Those Experts

    Not applicable. This study is focused on the analytical performance (reproducibility and consistency) of the instrumentation system itself, not on the clinical interpretation of results from specific assays. Therefore, there is no "ground truth" derived from expert review of patient data in the context of this submission. The "ground truth" for this study is essentially the known concentration and design of the control probes and target samples used for the reproducibility assessment.

    4. Adjudication Method for the Test Set

    Not applicable. As there is no expert interpretation of clinical data or qualitative assessments requiring adjudication, no adjudication method was used. The study relies on quantitative measurements (fluorescence intensity, CVs, discrimination scores).

    5. If a Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study Was Done

    No, an MRMC comparative effectiveness study was not done. This study evaluates the instrumentation's ability to consistently generate raw data, not the diagnostic performance of an AI algorithm or the improvement of human readers with AI assistance. The device is an instrument system, not an AI diagnostic tool.

    6. If a Standalone (i.e., algorithm only without human-in-the-loop performance) Was Done

    Yes, in essence, a standalone performance assessment of the instrument system was conducted. The study explicitly evaluates the performance of the FS450Dx Fluidics Station, GCS3000Dx Scanner, and GCOSDx Software in generating consistent and reproducible raw data (CEL files). The "algorithm" here refers to the GCOSDx software's image processing and cell intensity calculation algorithms. The study measures the output of the system (intensity readings, CVs, discrimination scores) as an objective assessment of its analytical capabilities, independent of human interpretation of those results.

    7. The Type of Ground Truth Used

    The ground truth used for this analytical performance study consists of:

    • Known concentrations: Two target sets (for control and discrimination) at specific known concentrations (0.1, 0.5, 1, and 3nM).
    • Known array design: The precisely designed array with specific probe sequences (control probes, discrimination controls, PM/MM pairs) located in specific patterns.
    • Known chip lot: A single chip lot was used to minimize chip-to-chip variability as a source of error.

    Essentially, the "ground truth" is the expected and designed behavior of the system when processing known, controlled inputs, focusing on the system's ability to consistently and accurately (in terms of reproducibility) detect and quantify these known elements.

    8. The Sample Size for the Training Set

    Not applicable. This document describes the performance validation of a hardware and software system, not a machine learning or AI model that requires a training set. The GCOSDx software contains algorithms for image processing and data analysis, but these are likely deterministic algorithms, not adaptive machine learning models that would be "trained" on data in the modern sense.

    9. How the Ground Truth for the Training Set Was Established

    Not applicable. Since there is no "training set" for an AI model, the concept of establishing ground truth for it does not apply in this context.
