Decoding brain activity related to gait during exoskeleton-assisted walking.
Lower-limb robotic exoskeletons have emerged as aids for over-ground, bipedal ambulation for individuals with motor limitations. The usability and clinical relevance of these robotic systems could be further enhanced by brain-machine interfaces (BMIs). Over the last decade, different approaches have been explored to use EEG-based BMIs to interact with robotic exoskeletons. One of these approaches is based on detecting users' motor imagery related to walking. In this regard, our group has developed several BMIs that explore the use of motor imagery for commanding exoskeletons. Moreover, since the performance of current BMIs must improve before exoskeletons can be commanded not only in clinical environments but also at home or outdoors, our group is currently implementing a new BMI based on the combination of two paradigms: motor imagery and the user's attention during walking. We have recently published a paper showing promising results with this new BMI. However, in that BMI, motor imagery is decoded only while users walk on typical flat ground, and users' attention is estimated from EEG. In this proposal we will use several EUROBENCH scenarios to collect data that will allow us: (1) to verify that the attention estimated from EEG by our algorithm is correlated with the attention provided by EUROBENCH while users walk wearing an exoskeleton; and (2) to validate that the algorithm we have developed to detect users' motor imagery can also be applied when they walk on non-flat terrains. The EEG signals recorded in the EUROBENCH scenarios and the outputs of our algorithms (decoded motor imagery and estimated user attention) will be incorporated into the EUROBENCH database. This information will be a valuable resource for researchers interested in controlling lower-limb exoskeletons from EEG signals, allowing them to develop, test, and compare their algorithms.
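The first validation objective amounts to checking the agreement between two attention time series: one estimated from EEG and one provided by the EUROBENCH scenario. A minimal sketch of that check, using a Pearson correlation over per-window attention scores, could look as follows (the variable names and the example values are purely illustrative, not data from the proposal):

```python
from math import sqrt

def pearson_corr(x, y):
    """Pearson correlation coefficient between two equal-length 1-D series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    xm = [v - mx for v in x]
    ym = [v - my for v in y]
    num = sum(a * b for a, b in zip(xm, ym))
    den = sqrt(sum(a * a for a in xm)) * sqrt(sum(b * b for b in ym))
    return num / den

# Hypothetical per-window attention scores during an exoskeleton-assisted walk:
# one series decoded from EEG, one provided by the benchmarking scenario.
eeg_attention = [0.2, 0.4, 0.5, 0.7, 0.9]
bench_attention = [0.25, 0.35, 0.55, 0.65, 0.85]

r = pearson_corr(eeg_attention, bench_attention)
print(f"correlation: {r:.3f}")
```

A correlation close to 1 over many walking trials would support the claim that the EEG-based estimator tracks the benchmark's attention measure; in practice, other measures of agreement (e.g., rank correlation) may also be relevant depending on how EUROBENCH reports attention.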