Versatile implementation of a hardware–software architecture for development and testing of brain–computer interfaces

Jorge Antonio Martinez-Ledezma, Jose Hugo Barron-Zambrano, Alan Diaz-Manriquez, Juan Carlos Elizondo-Leal, Vicente Paul Saldivar-Alonso, Horacio Rostro-Gonzalez

Research output: Indexed journal article › Article › peer-review

2 Citations (Scopus)

Abstract

Brain–computer interfaces (BCIs) have traditionally focused on improving the quality of life of people with motor or communication disabilities. However, this technology has found new applications, such as augmenting human capabilities, and several research groups are now exploring the human capacity to control multiple robotic devices simultaneously. Designing a BCI is an intricate task that requires a long implementation time. For this reason, this article presents an architecture for designing and implementing different types of BCIs. The architecture has a modular design capable of reading various electroencephalography (EEG) sensors and controlling several robotic devices in a plug-and-play fashion. To test the proposed architecture, a BCI capable of commanding a hexapod robot and a drone was implemented. First, a mobile robotic platform was designed and built. The BCI is based on eye blinking, where a single blink represents a robot command: for the hexapod robot, the command starts or stops its locomotion, and for the drone, a blink triggers takeoff or landing. The blink signals are acquired by EEG sensors placed over the prefrontal and frontal regions of the head. The signals are then filtered using temporal filters with cutoff frequencies corresponding to the delta, theta, alpha, and beta bands. The filtered signals were labeled and used to train a classifier based on the multilayer perceptron (MLP) model. To generate the robot command, the proposed BCI uses two MLP models to confirm the prediction: when both classifiers make the same prediction within a defined time interval, the signal is sent to the robot to start or stop its movement. The results show that the hexapod robot can be controlled with high precision, reaching 91.7% with an average of 81.4%.
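The abstract outlines a pipeline of band-limited temporal filtering followed by two MLP classifiers that must agree before a robot command is issued. The sketch below only illustrates that idea under stated assumptions; it is not the authors' code, and the sampling rate, band cutoffs, feature layout, network sizes, and dummy training data are placeholders (SciPy and scikit-learn stand in for whatever tooling the authors actually used).

```python
# Illustrative sketch only (not the authors' implementation): filter EEG epochs
# into delta/theta/alpha/beta bands, train two MLPs on labeled epochs, and
# issue the robot command only when both models agree.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.neural_network import MLPClassifier

FS = 250  # assumed sampling rate in Hz; not specified in the abstract


def bandpass(x, low, high, fs=FS, order=4):
    """Zero-phase Butterworth band-pass filter for a single EEG channel."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, x)


def extract_features(epoch):
    """Concatenate the epoch filtered into the delta, theta, alpha, beta bands."""
    bands = [(0.5, 4.0), (4.0, 8.0), (8.0, 13.0), (13.0, 30.0)]
    return np.concatenate([bandpass(epoch, lo, hi) for lo, hi in bands])


# Two MLPs with different initializations stand in for the paper's pair of
# classifiers; real training data would be labeled blink / no-blink epochs.
clf_a = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf_b = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=1)

rng = np.random.default_rng(0)
X = np.stack([extract_features(rng.standard_normal(FS)) for _ in range(40)])
y = rng.integers(0, 2, size=40)  # dummy labels: 1 = blink, 0 = no blink
clf_a.fit(X, y)
clf_b.fit(X, y)


def blink_command(epoch):
    """Return True (i.e. send start/stop or takeoff/landing) only when both
    classifiers predict a blink for the same epoch."""
    x = extract_features(epoch).reshape(1, -1)
    return clf_a.predict(x)[0] == 1 and clf_b.predict(x)[0] == 1


print(blink_command(rng.standard_normal(FS)))
```

In the article, the agreement must additionally occur within a defined time interval, and the confirmed command is forwarded to either the hexapod robot (start/stop locomotion) or the drone (takeoff/landing); that timing and transport layer is omitted from the sketch.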

Original language: English
Article number: 1729881420980256
Number of pages: 18
Journal: International Journal of Advanced Robotic Systems
Volume: 17
Issue number: 6
DOIs
Publication status: Published - 2020
Externally published: Yes

Keywords

  • BCI
  • digital signal processing
  • drone
  • HW-SW architecture
  • legged robot
  • neural networks
  • robot control
