TY - JOUR
T1 - Versatile implementation of a hardware–software architecture for development and testing of brain–computer interfaces
AU - Martinez-Ledezma, Jorge Antonio
AU - Barron-Zambrano, Jose Hugo
AU - Diaz-Manriquez, Alan
AU - Elizondo-Leal, Juan Carlos
AU - Saldivar-Alonso, Vicente Paul
AU - Rostro-Gonzalez, Horacio
N1 - Publisher Copyright:
© The Author(s) 2020.
PY - 2020
Y1 - 2020
N2 - Brain–computer interfaces (BCIs) have traditionally focused on improving the lives of people with motor or communication disabilities. However, this technology has also found new applications, such as augmenting human capabilities. Several researchers are now exploring the human capacity to control multiple robotic devices simultaneously. Designing a BCI is an intricate task that takes a long time to implement. For this reason, an architecture for designing and implementing different types of BCIs is presented in this article. The architecture has a modular design, capable of reading various electroencephalography (EEG) sensors and controlling several robotic devices in a plug-and-play fashion. To test the proposed architecture, a BCI capable of controlling a hexapod robot and a drone was implemented. First, a mobile robotic platform was designed and built. The BCI is based on eye blinking, where a single blink represents a robot command. For the hexapod robot, the command orders it to start or stop its locomotion; for the drone, a blink represents the takeoff or landing order. The blinking signals are obtained from the prefrontal and frontal regions of the head by EEG sensors. The signals are then filtered using temporal filters, with cutoff frequencies based on the delta, theta, alpha, and beta bands. The filtered signals were labeled and used to train a classifier based on the multilayer perceptron (MLP) model. To generate the robot command, the proposed BCI uses two MLP models to confirm the classifier prediction: when the two classifiers make the same prediction within a defined time interval, the signal is sent to the robot to start or stop its movement. The results show that the hexapod robot can be controlled with high precision, reaching 91.7%, with an average of 81.4%.
AB - Brain–computer interfaces (BCIs) have traditionally focused on improving the lives of people with motor or communication disabilities. However, this technology has also found new applications, such as augmenting human capabilities. Several researchers are now exploring the human capacity to control multiple robotic devices simultaneously. Designing a BCI is an intricate task that takes a long time to implement. For this reason, an architecture for designing and implementing different types of BCIs is presented in this article. The architecture has a modular design, capable of reading various electroencephalography (EEG) sensors and controlling several robotic devices in a plug-and-play fashion. To test the proposed architecture, a BCI capable of controlling a hexapod robot and a drone was implemented. First, a mobile robotic platform was designed and built. The BCI is based on eye blinking, where a single blink represents a robot command. For the hexapod robot, the command orders it to start or stop its locomotion; for the drone, a blink represents the takeoff or landing order. The blinking signals are obtained from the prefrontal and frontal regions of the head by EEG sensors. The signals are then filtered using temporal filters, with cutoff frequencies based on the delta, theta, alpha, and beta bands. The filtered signals were labeled and used to train a classifier based on the multilayer perceptron (MLP) model. To generate the robot command, the proposed BCI uses two MLP models to confirm the classifier prediction: when the two classifiers make the same prediction within a defined time interval, the signal is sent to the robot to start or stop its movement. The results show that the hexapod robot can be controlled with high precision, reaching 91.7%, with an average of 81.4%.
KW - BCI
KW - digital signal processing
KW - drone
KW - HW-SW architecture
KW - legged robot
KW - neural networks
KW - robot control
UR - http://www.scopus.com/inward/record.url?scp=85098000257&partnerID=8YFLogxK
UR - https://www.webofscience.com/api/gateway?GWVersion=2&SrcApp=pure_univeritat_ramon_llull&SrcAuth=WosAPI&KeyUT=WOS:000602956800001&DestLinkType=FullRecord&DestApp=WOS_CPL
U2 - 10.1177/1729881420980256
DO - 10.1177/1729881420980256
M3 - Article
AN - SCOPUS:85098000257
SN - 1729-8806
VL - 17
JO - International Journal of Advanced Robotic Systems
JF - International Journal of Advanced Robotic Systems
IS - 6
M1 - 1729881420980256
ER -