The STUDIO on the Cloud Data Manager Software is intended for use by both patients and healthcare professionals to assist people with diabetes and their healthcare professionals in the review, analysis and evaluation of historical CGM data to support effective diabetes management. It is intended for use as an accessory to CGM devices with data interface capabilities.
The STUDIO on the Cloud Data Management ("STUDIO") Software is comprised of a data analysis and storage platform, report generation software, and an information delivery service.
Specifically, the proposed STUDIO Software performs the following functions:
- Data Upload: the SweetSpot Fetch Utility application will be used to access data from a Receiver, using either Mac or PC operating systems;
- Data Analysis: certain SweetSpot Platform functions will be used to validate, aggregate, and analyze (e.g., correlate) CGM data, and to create charts and reports that mimic the current STUDIO Pattern and Glucose Strips charts and reports;
- Reports: the current STUDIO Pattern charts will be displayed on the user's computer screen, and both the Pattern and Glucose Strip charts can be saved to the user's computer in PDF format; both reports may be printed by the user as PDF documents.
The STUDIO Software uses only retrospective data stored on the G4 PLATINUM device to create statistical reports, and does not make treatment recommendations.
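As a rough illustration only (the submission does not describe Dexcom's actual implementation), the kind of retrospective validation and aggregation the Data Analysis function performs could be sketched in Python as follows; the `Reading` record, the 40-400 mg/dL validity range, and the summary fields are all assumptions made for the sketch:

```python
from dataclasses import dataclass
from datetime import datetime
from statistics import mean

# Hypothetical CGM reading record; a real receiver export contains more fields.
@dataclass
class Reading:
    timestamp: datetime
    glucose_mg_dl: float

def validate(readings):
    """Keep only readings within a plausible sensor reporting range (assumed 40-400 mg/dL)."""
    return [r for r in readings if 40 <= r.glucose_mg_dl <= 400]

def summarize(readings, low=70, high=180):
    """Aggregate retrospective readings into simple pattern-style statistics."""
    values = [r.glucose_mg_dl for r in validate(readings)]
    if not values:
        return {}
    in_range = [v for v in values if low <= v <= high]
    return {
        "count": len(values),
        "mean_mg_dl": round(mean(values), 1),
        "min_mg_dl": min(values),
        "max_mg_dl": max(values),
        "pct_time_in_range": round(100 * len(in_range) / len(values), 1),
    }

if __name__ == "__main__":
    demo = [
        Reading(datetime(2013, 5, 1, 8, 0), 95.0),
        Reading(datetime(2013, 5, 1, 8, 5), 110.0),
        Reading(datetime(2013, 5, 1, 8, 10), 190.0),
    ]
    print(summarize(demo))
    # {'count': 3, 'mean_mg_dl': 131.7, 'min_mg_dl': 95.0, 'max_mg_dl': 190.0, 'pct_time_in_range': 66.7}
```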
Here's an analysis of the acceptance criteria and study for the STUDIO on the Cloud Data Management Software, based on the provided text:
Acceptance Criteria and Device Performance for STUDIO on the Cloud Data Management Software
The provided document describes a Continuous Glucose Monitor Data Management System which primarily involves software for data analysis and display, not a device with analytical performance characteristics in the traditional sense (like a diagnostic test). Therefore, the acceptance criteria are focused on the software's functionality, usability, and data accuracy in transferring and displaying retrospective CGM data.
1. Table of Acceptance Criteria and Reported Device Performance
Given the nature of this software device, the "performance" is largely about its functional correctness and usability.
| Acceptance Criteria Category | Specific Acceptance Criteria | Reported Device Performance |
|---|---|---|
| Data Accuracy | All data fields uploaded from the Dexcom G4 PLATINUM receiver should be 100% accurate when compared to the data downloaded to a PC. | 100% Accurate: ("All data fields were reported to be 100% accurate.") |
| Usability (Ease of Use) | Users (lay and professional) should be able to complete assigned tasks without assistance, demonstrating ease of use and label comprehension. | 96% Task Completion: ("96% of assigned tasks were able to be completed by users without assistance.") |
| Software Functionality | The software should correctly perform data upload, analysis (validation, aggregation, correlation), and report generation, mimicking the existing STUDIO Pattern and Glucose Strips charts. (Implicit criteria for software development processes) | Acceptable: Documentation related to software development (hazard analysis, requirements, design, traceability, V&V testing) was reviewed and found acceptable. The software performs the listed functions as described in the device description. (Implicit from successful review) |
| Risk Mitigation | Identified risks (e.g., device malfunction leading to diabetes mismanagement) must be adequately mitigated by general controls, including design controls. | Adequately Mitigated: Risks determined to be adequately mitigated by general controls, design controls, and prescription device restrictions. |
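The Data Accuracy criterion reduces to a field-by-field equality check between the data uploaded through the software and the same data downloaded directly to a PC. A minimal sketch of how such a check could be automated is below; the record structure and field names are assumptions for illustration, not the actual test procedure:

```python
def compare_records(uploaded, reference):
    """Field-by-field comparison of software-uploaded records against the
    PC-downloaded reference. Both arguments are lists of dicts keyed by field
    name; the acceptance criterion is an empty discrepancy list (100% accuracy)."""
    discrepancies = []
    if len(uploaded) != len(reference):
        discrepancies.append(f"record count differs: {len(uploaded)} vs {len(reference)}")
    for i, (up, ref) in enumerate(zip(uploaded, reference)):
        for field, expected in ref.items():
            if up.get(field) != expected:
                discrepancies.append(
                    f"record {i}, field '{field}': {up.get(field)!r} != {expected!r}"
                )
    return discrepancies

# Example with hypothetical field names: identical records yield no discrepancies.
reference = [{"timestamp": "2013-05-01T08:00", "glucose_mg_dl": 95, "event": None}]
uploaded = [{"timestamp": "2013-05-01T08:00", "glucose_mg_dl": 95, "event": None}]
assert compare_records(uploaded, reference) == []
```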
2. Sample Size Used for the Test Set and Data Provenance
- Data Accuracy (Bench Testing):
  - Sample Size: Forty (40) Dexcom G4 PLATINUM receivers.
  - Data Provenance: Not explicitly stated, but it implies data from these physical receivers, which would contain retrospective CGM data. The context suggests this was internal bench testing, likely with a mix of real or simulated data representative of data produced by the G4 PLATINUM system. It is retrospective in nature, as it uses data already stored on the receivers.
- Usability Study:
  - Sample Size: Forty-four (44) lay and professional users.
  - Data Provenance: The study was likely prospective in the sense that these users performed tasks during the study. The demographic characteristics (age, sex, and education level) of the users were varied. The location/country of origin of these users is not specified, but for FDA submissions, participants are typically US-based unless otherwise stated.
3. Number of Experts Used to Establish the Ground Truth for the Test Set and Qualifications
- Data Accuracy (Bench Testing): The "ground truth" for the data accuracy test was the same data downloaded directly to a PC. This implies a direct file comparison or database comparison. Therefore, no external experts were used to establish a subjective "ground truth." The ground truth was the original, unaltered data itself.
- Usability Study: No explicit mention of experts establishing a "ground truth" for task completion. The "ground truth" for success was likely objective: did the user successfully complete the assigned task? The assessment would have been done by study administrators observing user interactions.
4. Adjudication Method for the Test Set
- Data Accuracy (Bench Testing): Not applicable in the traditional sense of expert adjudication. The comparison was direct, likely automated or manual side-by-side comparison of data fields between the software-uploaded data and the PC-downloaded data. Any discrepancy would be a definitive error, not something requiring adjudication.
- Usability Study: The document does not specify an adjudication method. Task completion would typically be assessed by study observers or through automated logging of user actions. The reported result, "96% of assigned tasks were able to be completed by users without assistance," implies a clear pass/fail for each task for each user, summed into the total percentage (as sketched below). If there were ambiguities, a predefined scoring rubric or internal review process would be used, but this is not detailed.
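A minimal sketch of that pass/fail tally, assuming hypothetical observer records (the actual scoring procedure and task counts are not described in the submission):

```python
def completion_rate(observations):
    """observations: list of (user_id, task_id, completed_without_assistance) tuples,
    as an observer might record them during the usability study."""
    total = len(observations)
    completed = sum(1 for _, _, ok in observations if ok)
    return 100.0 * completed / total if total else 0.0

# Hypothetical tally: 48 of 50 assigned tasks completed without assistance -> 96%.
observations = [("user1", "task1", True)] * 48 + [("user2", "task2", False)] * 2
print(f"{completion_rate(observations):.0f}% of assigned tasks completed without assistance")
```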
5. If a Multi Reader Multi Case (MRMC) Comparative Effectiveness Study was Done
- No, a Multi-Reader Multi-Case (MRMC) comparative effectiveness study was not done. This type of study is typically performed for diagnostic or screening devices where human readers interpret medical images or data, and the AI's assistance to these readers is being evaluated.
- The STUDIO software is a data management tool for retrospective CGM data, not an AI-powered diagnostic aide. Its purpose is to present existing data, not to make interpretations or recommendations itself, nor to assist human readers in making new interpretations.
6. If a Standalone (i.e., algorithm only without human-in-the-loop performance) was Done
- Yes, a form of standalone performance was assessed regarding data accuracy. The "Bench Testing" compared the data processed by the software against the original data downloaded to a PC. This is an algorithm-only evaluation of data integrity during transfer and processing. The software's ability to accurately present the retrospective data is its primary "standalone" function.
- The "usability study" is human-in-the-loop, but it evaluates the human's interaction with the software, not the software's inherent analytical capabilities on its own.
7. The Type of Ground Truth Used
- Data Accuracy: The ground truth was the original, raw CGM data as directly downloaded from the Dexcom G4 PLATINUM receiver to a PC. This is a very objective, "reference data" type of ground truth.
- Usability: The ground truth for usability was successful completion of pre-defined tasks by users without assistance, measured objectively by study observers or system logs.
8. The Sample Size for the Training Set
- The document does not specify a separate training set size. This is typical for data management software that primarily performs aggregation, storage, and visualization of existing data, rather than machine learning models that require explicit "training." The software's design and functionality are based on defined rules and processes for handling CGM data, not on learning from a dataset.
9. How the Ground Truth for the Training Set Was Established
- As no explicit training set is mentioned for a machine learning model, this question is not applicable. The "ground truth" for the software's development (which could be considered analogous to a training phase in a different context) would be the specifications and expected behavior, informed by the Dexcom G4 PLATINUM data format and intended display logic. This would be established through engineering requirements and design (based on established CGM data structures and diabetes management reporting needs).
§ 862.2120 Continuous glucose monitor data management system.
(a) Identification. A continuous glucose monitor data management system is an electronic device intended to acquire, process, and correlate retrospective data from a continuous glucose monitoring device. This device is intended to be used by patients or their healthcare providers when determining therapeutic strategies. A continuous glucose monitor data management system is not a drug dose calculator and does not provide treatment recommendations.
(b) Classification. Class I (general controls). The device is exempt from the premarket notification procedures in subpart E of part 807 of this chapter, subject to the limitations in § 862.9.