Superiority of Nonlinear Mapping in Decoding Multiple Single-Unit Neuronal Spike Trains: A Simulation Study

Cited 18 times in Web of Science; cited 24 times in Scopus
Authors

Kim, Kyung Hwan; Kim, Sung Shin; Kim, Sung June

Issue Date
2006-01
Publisher
Elsevier
Citation
Journal of Neuroscience Methods 2006;150:202–211
Keywords
Brain–machine interface; Spike train decoding; Linear filter; Multilayer perceptron; Support vector machine
Abstract
One of the most important building blocks of the brain–machine interface (BMI) based on neuronal spike trains is the decoding algorithm,
a computational method for the reconstruction of desired information from spike trains. Previous studies have reported that a simple linear
filter is effective for this purpose and that no noteworthy gain is achieved from the use of nonlinear algorithms. In order to test this premise, we
designed several decoding algorithms based on the linear filter, and two nonlinear mapping algorithms using multilayer perceptron (MLP) and
support vector machine regression (SVR). Their performances were assessed using multiple neuronal spike trains generated by a biophysical
neuron model and by a directional tuning model of the primary motor cortex. The nonlinear algorithms generally performed better, and their advantage was most pronounced when false-positive or false-negative errors were present in the spike trains. MLPs trained by trial-and-error often showed disappointing performance, comparable to that of the linear filter. SVR achieved the highest performance, which may be attributed to its superiority in training and generalization.
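The linear-filter baseline the abstract refers to is, in its common form, a least-squares (Wiener) filter fit on lagged binned spike counts. The sketch below illustrates that baseline on synthetic, cosine-tuned spike data; it is not the paper's implementation, and every parameter value (number of units, bin count, filter lags, tuning model) is illustrative only.

```python
import numpy as np

# Illustrative sketch of linear-filter decoding of binned spike trains,
# i.e. a least-squares (Wiener) filter -- the baseline the paper compares
# nonlinear MLP/SVR decoders against. All data are synthetic and all
# parameters are hypothetical choices, not taken from the paper.

rng = np.random.default_rng(0)
n_units, n_bins, lag = 10, 2000, 5  # neurons, time bins, filter taps

# Simulated 1-D hand velocity (decoding target) as a slow random walk,
# and spike counts whose rates depend linearly on velocity per unit.
velocity = np.cumsum(rng.normal(size=n_bins)) * 0.1
pref = rng.normal(size=n_units)                      # per-unit tuning gains
rates = np.clip(5.0 + np.outer(velocity, pref), 0, None)
spikes = rng.poisson(rates)                          # shape (n_bins, n_units)

# Lagged design matrix: each row stacks the most recent `lag` bins
# of all units, plus a constant bias column.
X = np.hstack([spikes[lag - k - 1 : n_bins - k] for k in range(lag)])
X = np.hstack([X, np.ones((X.shape[0], 1))])
y = velocity[lag - 1 :]

# Least-squares fit of the linear filter, then decode.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ w

corr = np.corrcoef(y, y_hat)[0, 1]
print(f"decoding correlation: {corr:.2f}")
```

Because the simulated rates are rectified at zero, the mapping from velocity to spike counts is not perfectly linear, which hints at why nonlinear regressors such as SVR can outperform this filter when spike trains are further corrupted by detection errors.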
ISSN
0165-0270
Language
English
URI
https://hdl.handle.net/10371/8857
DOI
https://doi.org/10.1016/j.jneumeth.2005.06.015
Items in S-Space are protected by copyright, with all rights reserved, unless otherwise indicated.
