(102 days)
A software system used with the Microsoft Kinect, intended to support repetitive task practice for rehabilitation of adults under supervision of a medical professional in a clinical or home setting. The system includes simulated activities of daily living (ADLs) for the upper extremity with audio-visual feedback and graphic movement representations for patients, as well as patient performance metrics for the medical professional. Patient assessment, exercise guidance, and approval by the medical professional are required prior to use.
The VOTA software system comprises a patient-facing VOTA application and a provider-facing Provider Dashboard application. The patient-facing VOTA application supports repetitive task practice exercises for the upper extremity that are consistent with the Standard of Care for physical rehabilitation of adults. The software runs on a personal computer under the Windows 8.1 operating system (or later) and uses a Microsoft Xbox One Kinect Sensor (hereafter referred to as the Kinect Sensor) to track patient arm movements. These arm movements are translated into equivalent movements of a graphical avatar that represents the patient in a virtual environment. The patient is thus able to practice activities of daily living (ADLs) that involve meaningful tasks and evoke functional movements with graduated levels of difficulty. The activities are organized into a virtual "Road to Recovery" that traverses a series of four islands, each organized around a central theme. There is no physical contact between the patient and the device during exercises, and thus no energy is directed to the patient. Assessment by a medical professional, and selection of exercises and settings, are required prior to use.
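To make the tracking-to-avatar flow concrete, the following is a minimal Python sketch of how tracked arm joint positions might drive an avatar pose. It is illustrative only: the class and function names, the coordinate convention, and the simple scale-and-offset mapping are assumptions, not VOTA's actual implementation (a real system would apply calibrated transforms and skeletal retargeting).

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Joint3D:
    """A tracked joint position in sensor (camera) coordinates, in meters. Hypothetical type."""
    x: float
    y: float
    z: float

def sensor_to_avatar(joint: Joint3D, scale: float = 1.0,
                     offset: Joint3D = Joint3D(0.0, 0.0, 0.0)) -> Joint3D:
    """Map a sensor-space joint position into the avatar's coordinate frame.

    A real system would use a calibrated rigid-body transform and skeletal
    retargeting; this scale-and-offset mapping stands in for that step.
    """
    return Joint3D(joint.x * scale + offset.x,
                   joint.y * scale + offset.y,
                   joint.z * scale + offset.z)

# One frame of tracked shoulder/elbow/wrist positions driving the avatar.
frame = {
    "shoulder": Joint3D(-0.20, 0.45, 1.80),
    "elbow":    Joint3D(-0.05, 0.30, 1.70),
    "wrist":    Joint3D(0.10, 0.40, 1.55),
}
avatar_pose = {name: sensor_to_avatar(p) for name, p in frame.items()}
```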
The provider-facing VOTA Provider Dashboard application enables the medical professional to view patient performance metrics and participation history using data produced by the VOTA patient-facing application. The application runs on the same personal computer and operating system as the patient-facing application.
All hardware associated with VOTA consists of commercial off-the-shelf consumer hardware items. The VOTA system ships with the following:
- Microsoft Xbox One Kinect Sensor and Kinect power supply;
- Microsoft Xbox Kinect Adapter for Xbox One;
- Kinect TV Mount for Xbox One;
- Personal computer (preloaded with VOTA software) and computer power supply;
- Wireless keyboard;
- HDMI cable;
- Getting Started Guide; and
- Third-party Labeling Package.
The provided text describes the 510(k) premarket notification for the Virtual Occupational Therapy Application (VOTA). However, it does not contain a specific table of acceptance criteria or a detailed study proving that the device meets specific acceptance criteria in the way typically seen for a new AI/ML device submission with quantifiable performance metrics (e.g., sensitivity, specificity, accuracy).
The document focuses on demonstrating substantial equivalence to a predicate device (Jintronix Rehabilitation System (JRS)) by comparing intended use, technological characteristics, and safety characteristics, rather than establishing quantifiable performance acceptance criteria for VOTA itself. The clinical testing described is primarily to show effectiveness for rehabilitation, not to meet pre-defined, quantitative performance metrics for a diagnostic or assistive AI system.
The sections below therefore extract and synthesize the information available in the document regarding the device's performance, the type of testing conducted, and the evidence provided to support its safety and effectiveness relative to its intended use and predicate device, and explain why some requested information (such as specific quantitative acceptance criteria and AI-specific study details) is not present in this type of submission.
Here's the closest representation of the requested information based on the provided text:
1. A table of acceptance criteria and the reported device performance
The document does not provide a formal table of quantitative acceptance criteria with corresponding performance metrics like sensitivity, specificity, or accuracy, as would be typical for an AI/ML diagnostic or predictive device. Instead, the "acceptance criteria" are implied through the demonstration of substantial equivalence to a predicate device and clinical usability/effectiveness for its intended rehabilitative purpose.
The "performance" is primarily assessed in terms of clinical effectiveness for rehabilitation and safety.
Implied "Acceptance Criteria" Category | Description / Reported Performance |
---|---|
Functional Gain / Clinical Effectiveness | Acceptance Implied by: Demonstration of clinically significant improvement in upper extremity (UE) motor performance. |
Reported Performance: Stroke patients (n=15) using VOTA for ~1 hour, 3 times/week, over 8 weeks (24 total sessions) achieved an average Fugl-Meyer UE (FMUE) improvement of 6 points. This was measured pre- and post-intervention using the FMUE, a widely-recognized and clinically-relevant measure. | |
Safety | Acceptance Implied by: Absence of adverse events, compliance with safety standards, and no unique safety concerns compared to predicate. |
Reported Performance: No adverse incidents or injuries were reported over the entire period of actual VOTA use by stroke patients in the clinical testing, spanning 240 total sessions of approximately 1 hour each. The device also complies with consumer electrical safety standards (e.g., UL) and laser Class 1 standard (IEC 60825-1:2007) for the Kinect sensor. The risk analysis (ISO 14971) indicated a "Moderate Level of Concern" due to a small, non-zero risk of minor injury from overexertion if incorrectly used, which is mitigated by medical professional supervision as stipulated in the Indications for Use. | |
Usability | Acceptance Implied by: Assessment using a widely-accepted instrument and systematic comparison to Standard of Care by licensed therapists. |
Reported Performance: Clinical testing included "assessment of usability using a widely-accepted instrument" and "systematic comparison of VOTA to Standard of Care by licensed therapists." (Specific scores or detailed results are not provided in this summary). | |
Accuracy of Tracking | Acceptance Implied by: Sufficiency of Kinect-based tracking for intended application and established literature. |
Reported Performance: Clinical testing "demonstrated that VOTA's Kinect-based upper extremity tracking produces valid results for the intended application." The Kinect-based tracking solution was found to be "sufficient, both to permit patients to successfully perform virtual ADL exercises and to support derivation of speed-based motor performance metrics." References were provided for existing literature demonstrating the accuracy of Kinect-based upper extremity tracking. | |
Functional Equivalence | Acceptance Implied by: Demonstration that core functionality aligns with predicate and supports Indications for Use. |
Reported Performance: Bench testing validated "the core functionality of the software system" and established "substantial equivalency to the Predicate." Traceability was provided between Indications for Use, system-level requirements, test plans, and documented test results showing success criteria are met. |
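As referenced in the "Accuracy of Tracking" row above, VOTA derives speed-based motor performance metrics from the tracked upper extremity. The summary does not define those metrics, so the following is only a hedged sketch of one plausible metric (mean hand speed over a tracked movement); the function name and the 30 Hz sample rate are assumptions, not details from the submission.

```python
import math

def mean_hand_speed(positions, sample_rate_hz=30.0):
    """Average hand speed (meters/second) over a sequence of (x, y, z) samples."""
    if len(positions) < 2:
        return 0.0
    dt = 1.0 / sample_rate_hz
    total_distance = 0.0
    for p_prev, p_next in zip(positions, positions[1:]):
        total_distance += math.dist(p_prev, p_next)
    return total_distance / (dt * (len(positions) - 1))

# Example: a short reaching movement sampled at 30 Hz (synthetic data).
reach = [(0.10, 0.40, 1.55), (0.11, 0.41, 1.53), (0.12, 0.42, 1.51), (0.14, 0.43, 1.49)]
print(f"mean hand speed ~ {mean_hand_speed(reach):.2f} m/s")
```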
2. Sample size used for the test set and the data provenance
- Test Set Sample Size: 15 stroke survivors with upper extremity impairment participated in the clinical testing.
- Data Provenance: The clinical testing was conducted by the University of Virginia (UVa) Department of Physical Medicine and Rehabilitation and the UVa HealthSouth Rehabilitation Hospital under the approval and governance of the UVa Institutional Review Board for Human Subject Research (IRB-HSR). This indicates prospective data collection from a specific clinical setting in the USA.
3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts
- Number of Experts: Not explicitly stated as a distinct "ground truth" expert panel in the document. The clinical study involved:
- Licensed occupational therapists who supervised the sessions.
- Experienced therapists who assessed safety (over 200 hours of actual patient contact time using the VOTA system).
- Qualifications of Experts: Licensed occupational therapists; experienced therapists (implied clinical background). The Fugl-Meyer UE assessment (FMUE) is a gold-standard, clinician-administered test, meaning the scores collected by the trained therapists serve as the "ground truth" for motor performance.
4. Adjudication method for the test set
- The document does not describe a formal adjudication method (e.g., 2+1, 3+1) for establishing ground truth for the test set. For the FMUE assessment, it is a standardized clinical measure typically administered by a single trained therapist for each assessment. Inter-rater reliability (if multiple therapists assessed the same patient) or a consensus process is not mentioned.
5. Whether a multi-reader, multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI versus without AI assistance
- No, an MRMC comparative effectiveness study, as typically understood for evaluating AI assistance for human readers/clinicians, was not performed. This device is a direct patient-facing rehabilitation tool with a clinician supervising, not a diagnostic AI system assisting human interpretation of images or other data. The study was a clinical trial evaluating the therapeutic effect of the device on patients.
6. Whether a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done
- This device is not a standalone diagnostic algorithm. It is a patient-facing application that requires human-in-the-loop (medical professional supervision) as stated in its Indications for Use: "under supervision of a medical professional" and "Patient assessment, exercise guidance, and approval by the medical professional is required prior to use."
- The "standalone" performance closest to what might be considered is the accuracy of the Kinect-based tracking (which is an algorithm within the system). The document states this tracking "produces valid results for the intended application" and was "sufficient" for performing exercises and deriving metrics, citing clinical testing and existing literature. This implies an internal validation of the tracking component, but not as a separately defined "standalone" study in the context of an AI-only performance claim.
7. The type of ground truth used
- The primary "ground truth" for evaluating the device's effectiveness was clinical outcomes data – specifically, pre- and post-intervention scores from the Fugl-Meyer UE (FMUE) assessment. This is a clinician-administered, standardized functional outcome measure.
- For safety, the "ground truth" was observation of adverse events/injuries by supervising therapists.
- For tracking accuracy, the ground truth was implied by the ability of patients to successfully perform virtual activities and the feasibility of deriving motor performance metrics, supported by existing literature on Kinect accuracy.
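To illustrate how a cohort-level result such as the reported 6-point average FMUE improvement is derived from pre- and post-intervention scores, here is a minimal sketch. The scores below are invented for the example and are not the study's actual data.

```python
def mean_improvement(pre_scores, post_scores):
    """Average per-patient change (post minus pre) across a cohort."""
    deltas = [post - pre for pre, post in zip(pre_scores, post_scores)]
    return sum(deltas) / len(deltas)

# Hypothetical pre-/post-intervention FMUE scores for a small cohort (not study data).
pre  = [30, 42, 25, 38]
post = [37, 47, 30, 45]
print(f"average FMUE improvement: {mean_improvement(pre, post):.1f} points")
```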
8. The sample size for the training set
- The document does not specify a sample size for a training set for the VOTA software. This type of submission (for a device like VOTA based on existing technology like Kinect and established rehabilitation principles) is focused on demonstrating substantial equivalence and clinical effectiveness, not on detailing the dataset used to train a novel AI/ML algorithm from scratch. While VOTA is software, it's not described as a deep learning or AI model requiring a large training dataset in the typical sense of current AI medical devices. It utilizes an off-the-shelf sensor (Kinect) whose core tracking algorithms were developed by Microsoft.
9. How the ground truth for the training set was established
- Since a "training set" for a novel AI/ML algorithm is not described, the method for establishing its ground truth is also not applicable/not provided in this document. The "ground truth" relevant to VOTA's performance is established in its clinical test set, as described in point 7.
§ 890.5360 Measuring exercise equipment.
(a) Identification. Measuring exercise equipment consist of manual devices intended for medical purposes, such as to redevelop muscles or restore motion to joints or for use as an adjunct treatment for obesity. These devices also include instrumentation, such as the pulse rate monitor, that provide information used for physical evaluation and physical planning purposes. Examples include a therapeutic exercise bicycle with measuring instrumentation, a manually propelled treadmill with measuring instrumentation, and a rowing machine with measuring instrumentation.
(b) Classification. Class II (special controls). The device, when it is a measuring exerciser or an interactive rehabilitation exercise device for prescription use only, is exempt from the premarket notification procedures in subpart E of part 807 of this chapter subject to the limitations in § 890.9.