Use of Retrieval Tasks to Rehabilitate Sensorimotor Impairments Due to Brain Injury

Joe Epperson, Rachael Hudson, Eric Meyers, David Pruitt, Joel Wright, Y-Nhy Duong, Robert Rennaker, Michael Kilgard, Seth Hays, Texas Biomedical Device Center, Erik Jonsson School of Engineering and Computer Science, School of Behavioral and Brain Sciences, University of Texas at Dallas, 800 W. Campbell Road, Richardson, Texas 75080

There are many barriers to effective care for people with somatosensory dysfunction after stroke, including limited access to care and insufficient available therapy time. Additionally, current rehabilitation programs often overlook somatosensory impairments and focus primarily on motor impairments, leaving an opportunity to address somatosensory deficits alongside motor therapy via accessible technology.

Vagus nerve stimulation (VNS) can be used to enhance plasticity in active brain circuits. Preclinical models show that VNS paired with rehabilitation enhances recovery and synaptic plasticity in motor and tactile networks, and clinical trials pairing VNS with rehabilitation show enhanced motor and sensory recovery. However, this treatment is currently constrained to the clinical setting, where a therapist guides the session and subjectively decides when to deliver VNS.

The ReTrieve system we are developing at the University of Texas at Dallas is a user-friendly, automated, at-home somatosensory retraining system. ReTrieve consists of three components: 1) a box, 2) object sets, and 3) a software application. Training with ReTrieve requires users to discriminate among multiple objects using hand somatosensation alone and to retrieve a specific one. When an object is removed from the box, the software prompts the user to place it in the chute on top of the box. The ReTrieve software application runs on a front-facing tablet and guides the user through the training session. Using the embedded tablet camera, a custom computer vision algorithm automatically detects hand and object removal or insertion, allowing the software to advance the user through each trial. Preliminary results show that stroke patients can perform the task and that the chosen object sets provide varying levels of discrimination difficulty. Additionally, the triggering algorithm that pairs VNS with the exploration task yields more trials per hour than standard stroke rehabilitation.
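The abstract does not describe the computer vision algorithm itself; as a minimal illustrative sketch only, one common way to detect a hand or object entering or leaving a camera's field of view is frame differencing. The function name, thresholds, and simulated frames below are hypothetical and are not taken from the ReTrieve implementation.

```python
import numpy as np

def removal_event(prev_frame: np.ndarray, curr_frame: np.ndarray,
                  pixel_threshold: int = 25,
                  min_changed_frac: float = 0.02) -> bool:
    """Hypothetical frame-differencing detector: returns True if enough
    pixels changed between consecutive grayscale frames to suggest a
    hand or object moving through the camera's view."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed = diff > pixel_threshold
    return bool(changed.mean() >= min_changed_frac)

# Simulated 120x160 grayscale frames: a bright "object" patch disappears.
before = np.zeros((120, 160), dtype=np.uint8)
before[40:80, 60:100] = 200                    # object present
after = np.zeros((120, 160), dtype=np.uint8)   # object removed

print(removal_event(before, after))    # large change -> True
print(removal_event(before, before))   # identical frames -> False
```

A production system would likely add background modeling and region-of-interest masking around the chute, but the core idea of comparing successive frames against a change threshold is the same.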


Additional Abstract Information

Presenter: Joseph Epperson

Institution: University of Texas at Dallas

Type: Poster

Subject: Engineering

Status: Approved

Time and Location

Session: Poster 6
Date/Time: Tue 2:00pm-3:00pm
Session Number: 4619