Publications

Detailed Information

Dimensionality reduction based on ICA for regression problems

Cited 19 times in Web of Science; cited 20 times in Scopus
Authors

Kwak, Nojun; Kim, Chunghoon; Kim, Hwangnam

Issue Date
2008-08
Publisher
Elsevier
Citation
NEUROCOMPUTING, Vol.71 No.13-15, pp.2596-2603
Abstract
In manipulating data, such as in supervised learning, we often extract new features from the original input variables in order to reduce the dimensionality of the input space and achieve better performance. In this paper, we show how standard algorithms for independent component analysis (ICA) can be extended to extract attributes for regression problems. The advantage is that general ICA algorithms become applicable to the task of dimensionality reduction for regression problems by maximizing the joint mutual information between the target variable and the new attributes. We applied the proposed method to a couple of real-world regression problems as well as some artificial problems and compared its performance with that of other conventional methods. Experimental results show that the proposed method can efficiently reduce the dimensionality of the input space without degrading regression performance. (C) 2008 Elsevier B.V. All rights reserved.
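
The abstract describes extracting ICA components and selecting those that carry the most information about the regression target. The following is a minimal sketch of that general idea, assuming a scikit-learn environment; it uses FastICA for component extraction and a simpler post-hoc mutual-information ranking (mutual_info_regression) rather than the paper's joint-mutual-information maximization, and the function name ica_reduce_for_regression and all parameter values are illustrative only.

# Sketch only: ICA-based dimensionality reduction for regression,
# approximated by ranking ICA components by their estimated mutual
# information with the target (not the paper's exact algorithm).
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.feature_selection import mutual_info_regression
from sklearn.linear_model import LinearRegression

def ica_reduce_for_regression(X, y, n_components=10, n_keep=3, random_state=0):
    """Project X onto ICA components, rank them by estimated mutual
    information with y, and return the n_keep highest-ranked components."""
    ica = FastICA(n_components=n_components, random_state=random_state)
    S = ica.fit_transform(X)                      # independent components
    mi = mutual_info_regression(S, y, random_state=random_state)
    keep = np.argsort(mi)[::-1][:n_keep]          # highest-MI components first
    return S[:, keep], keep

# Toy usage on synthetic data with a nonlinear target.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=500)

Z, kept = ica_reduce_for_regression(X, y, n_components=20, n_keep=5)
print("kept components:", kept)
print("R^2 of a linear fit on reduced features:",
      LinearRegression().fit(Z, y).score(Z, y))

In this sketch the mutual-information estimate is used only to rank components after extraction; the paper's contribution is to fold the information criterion into the extraction itself.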
ISSN
0925-2312
URI
https://hdl.handle.net/10371/208348
DOI
https://doi.org/10.1016/j.neucom.2007.11.036
Files in This Item:
There are no files associated with this item.
Appears in Collections:

Related Researcher

  • Graduate School of Convergence Science & Technology
  • Department of Intelligence and Information
Research Area: Feature Selection and Extraction, Object Detection, Object Recognition
