Robot guiding with obstacle avoidance algorithm for uncertain environments based on DTCNN

J. Albós-Canals, Jose Ás Villasante-Bembibre, S. Consul-Pacareu, Jordi Riera-Baburés, X. Vilasis-Cardona

Research output: Book chapter › Conference contribution › peer-review

1 Citation (Scopus)

Abstract

This paper introduces two applications of Discrete Time Cellular Non-Linear Networks (DTCNN) in a robot-guiding, obstacle-avoidance algorithm and proves the feasibility of both: a high-data-rate application using a CMOS camera, and a low-data-rate one using ultrasonic sensors. The key value of DTCNNs lies in their local connections and their parallel processing. These characteristics permit a hardware implementation, in our case on a Field Programmable Gate Array (FPGA), with real-time, template-based algorithm processing. A camera and an ultrasonic sensor are used as the obstacle-avoidance system, and the two implementations require different input information: the former handles complex environment information, while the latter handles basic situational information where an impulsive response is required. Both inputs can exhibit enhanced behaviour within the DTCNN structure.
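To make the template-based processing concrete, the following is a minimal sketch of a synchronous DTCNN update, y(k+1) = sgn(A ⊛ y(k) + B ⊛ u + I), applied to a toy bipolar "sensor image". The grid size, the edge-extraction templates A, B, I, and the function name `dtcnn_step` are illustrative assumptions, not taken from the paper; the paper's actual templates and FPGA implementation are not reproduced here.

```python
import numpy as np

def dtcnn_step(y, u, A, B, I):
    """One synchronous DTCNN update: y(k+1) = sgn(A (x) y(k) + B (x) u + I),
    where (x) is correlation over a 3x3 neighbourhood with zero padding."""
    def corr(img, tpl):
        p = np.pad(img, 1)  # zero boundary condition outside the grid
        out = np.zeros_like(img, dtype=float)
        for di in range(3):
            for dj in range(3):
                out += tpl[di, dj] * p[di:di + img.shape[0], dj:dj + img.shape[1]]
        return out
    x = corr(y, A) + corr(u, B) + I   # cell state from local neighbourhood only
    return np.where(x >= 0, 1.0, -1.0)  # bipolar output: +1 / -1

# Illustrative edge-extraction templates (no feedback; B responds to contrast):
A = np.zeros((3, 3))
B = np.array([[-1., -1., -1.],
              [-1.,  8., -1.],
              [-1., -1., -1.]])
I = -1.0

# Toy bipolar input: a 4x4 obstacle (+1) on free space (-1).
u = -np.ones((8, 8))
u[2:6, 2:6] = 1.0

y = dtcnn_step(u.copy(), u, A, B, I)  # +1 only on the obstacle's contour
```

Because every cell depends only on its 3x3 neighbourhood, each update maps naturally onto a parallel FPGA fabric, which is the hardware advantage the abstract refers to.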

Original language: English
Title of host publication: 2010 IEEE World Congress on Computational Intelligence, WCCI 2010 - 2010 International Joint Conference on Neural Networks, IJCNN 2010
DOIs
Publication status: Published - 2010
Event: 2010 6th IEEE World Congress on Computational Intelligence, WCCI 2010 - 2010 International Joint Conference on Neural Networks, IJCNN 2010 - Barcelona, Spain
Duration: 18 Jul 2010 → 23 Jul 2010

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks

Conference

Conference: 2010 6th IEEE World Congress on Computational Intelligence, WCCI 2010 - 2010 International Joint Conference on Neural Networks, IJCNN 2010
Country/Territory: Spain
City: Barcelona
Period: 18/07/10 → 23/07/10
