Robust on-line neural learning classifier system for data stream classification tasks

Andreu Sancho-Asensio*, Albert Orriols-Puig, Elisabet Golobardes

*Corresponding author for this work

    Research output: Indexed journal article › Article (peer-reviewed)

    12 Citations (Scopus)

    Abstract

    The increasing integration of technology in the different areas of science and industry has resulted in the design of applications that generate large amounts of data on-line. Extracting information from these data is often key to gaining a better understanding of the processes they describe. Learning from these data poses new challenges to traditional machine learning techniques, which are not typically designed to deal with data in which concepts and noise levels may vary over time. The purpose of this paper is to present the supervised neural constructivist system (SNCS), an accuracy-based neural-constructivist learning classifier system that uses multilayer perceptrons to learn from data streams and reacts quickly to concept changes. The behavior of SNCS on data stream problems with different characteristics is carefully analyzed and compared with other state-of-the-art techniques in the field. This comparison is also extended to a large collection of real-world problems. The results show that SNCS produces accurate classifications across a variety of problem settings, whether the data are static or arrive in dynamic streams.
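    SNCS itself combines an accuracy-based learning classifier system with constructivist multilayer perceptrons and is not reproduced here. As a hedged illustration of the setting the abstract describes, the sketch below shows on-line learning under an abrupt concept drift, evaluated prequentially (test-then-train). All names (`make_stream`, `OnlineLogistic`, `prequential`) and the simple logistic learner are illustrative assumptions, not the paper's method:

    ```python
    import math
    import random

    def make_stream(n, flip_at, rng):
        """Synthetic binary stream: the concept (label rule) flips at `flip_at`."""
        for t in range(n):
            x = [rng.random(), rng.random()]
            concept = x[0] > 0.5
            y = int(concept) if t < flip_at else int(not concept)
            yield x, y

    class OnlineLogistic:
        """Minimal on-line logistic classifier updated with per-example SGD."""
        def __init__(self, dim, lr=0.5):
            self.w = [0.0] * dim
            self.b = 0.0
            self.lr = lr

        def predict_proba(self, x):
            z = self.b + sum(wi * xi for wi, xi in zip(self.w, x))
            return 1.0 / (1.0 + math.exp(-z))

        def predict(self, x):
            return int(self.predict_proba(x) >= 0.5)

        def learn(self, x, y):
            # Gradient of the log-loss for a single example.
            err = self.predict_proba(x) - y
            for i, xi in enumerate(x):
                self.w[i] -= self.lr * err * xi
            self.b -= self.lr * err

    def prequential(stream, model):
        """Test-then-train: predict each example first, then update on it."""
        hits = []
        for x, y in stream:
            hits.append(model.predict(x) == y)
            model.learn(x, y)
        return hits

    rng = random.Random(0)
    model = OnlineLogistic(dim=2)
    hits = prequential(make_stream(4000, flip_at=2000, rng=rng), model)
    acc_before_drift = sum(hits[1500:2000]) / 500  # window just before the flip
    acc_after_drift = sum(hits[3500:4000]) / 500   # window after recovering
    ```

    A plain SGD learner recovers from the flip only because its gradient keeps adapting; the paper's point is that a constructivist LCS can react to such changes faster and more robustly than generic learners.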

    Original language: English
    Pages (from-to): 1441-1461
    Number of pages: 21
    Journal: Soft Computing
    Volume: 18
    Issue number: 8
    DOIs
    Publication status: Published - Aug 2014

    Keywords

    • Concept drift
    • Data streams
    • Genetic algorithms
    • Learning classifier systems
    • Neural constructivism
    • Neural networks
