
Differentiable Forward and Backward Fixed-Point Iteration Layers

DC Field                  Value
dc.contributor.author     Jeon, Younghan
dc.contributor.author     Lee, Minsik
dc.contributor.author     Choi, Jin Young
dc.date.accessioned       2022-10-05T04:09:58Z
dc.date.available         2022-10-05T04:09:58Z
dc.date.created           2021-03-05
dc.date.issued            2021-01
dc.identifier.citation    IEEE Access, Vol.9, pp.18383-18392
dc.identifier.issn        2169-3536
dc.identifier.uri         https://hdl.handle.net/10371/185291
dc.description.abstract   Recently, several studies have proposed methods to utilize some classes of optimization problems in designing deep neural networks to encode constraints that conventional layers cannot capture. However, these methods are still in their infancy and require special treatments, such as the analysis of the Karush-Kuhn-Tucker (KKT) condition, to derive the backpropagation formula. In this paper, we propose a new formulation called the fixed-point iteration (FPI) layer, which facilitates the use of more complicated operations in deep networks. The backward FPI layer, which is motivated by the recurrent backpropagation (RBP) algorithm, is also proposed. However, in contrast to RBP, the backward FPI layer yields the gradient using a small network module without explicitly calculating the Jacobian. In actual applications, both forward and backward FPI layers can be treated as nodes in the computational graphs. All the components of our method are implemented at a high level of abstraction, which allows efficient higher-order differentiations on the nodes. In addition, we present two practical methods, the neural net FPI (FPI_NN) layer and the gradient descent FPI (FPI_GD) layer, whereby the FPI update operations are a small neural network module and a single gradient descent step based on a learnable cost function, respectively. FPI_NN is intuitive and simple, while FPI_GD can be used to efficiently train energy function networks that have been studied recently. While RBP and related studies have not been applied to practical examples, our experiments show that the FPI layer can be successfully applied to real-world problems such as image denoising, optical flow, and multi-label classification.
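The abstract's core mechanism can be illustrated with a minimal scalar sketch: run a forward fixed-point iteration x = f(x, b) to convergence, then obtain the gradient with a backward fixed-point (adjoint) iteration in the spirit of RBP, without forming an explicit Jacobian inverse. This is a toy illustration only, not the paper's implementation; the update function tanh(w*x + b) and the loss L = 0.5*x*^2 are hypothetical choices made so the example is self-contained and checkable against finite differences.

```python
import math

# Toy scalar FPI layer: the update f(x, b) = tanh(w*x + b) is a
# contraction for |w| < 1, so forward iteration converges to a fixed point.
w, b = 0.5, 0.3

def forward(b):
    x = 0.0
    for _ in range(100):
        x = math.tanh(w * x + b)
    return x

x = forward(b)          # fixed point x* satisfying x* = tanh(w*x* + b)

# Illustrative loss L = 0.5 * x*^2, so dL/dx* = x*
g = x
s = 1.0 - math.tanh(w * x + b) ** 2   # tanh'(.) evaluated at the fixed point
J = s * w                             # df/dx at the fixed point

# Backward FPI (RBP-style adjoint iteration): v converges to g / (1 - J),
# i.e. it applies (1 - J)^(-1) by iteration instead of explicit inversion.
v = 0.0
for _ in range(100):
    v = g + J * v

grad_b = s * v                        # chain rule: dL/db = (df/db) * v

# Finite-difference check of the implicit gradient
eps = 1e-6
fd = (0.5 * forward(b + eps) ** 2 - 0.5 * forward(b - eps) ** 2) / (2 * eps)
```

In the vector case the scalar recursion v = g + J*v becomes v = g + Jᵀv, which is exactly the linear fixed-point system the backward FPI layer solves with a small network module rather than an explicit Jacobian.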
dc.language               English
dc.publisher              Institute of Electrical and Electronics Engineers Inc.
dc.title                  Differentiable Forward and Backward Fixed-Point Iteration Layers
dc.type                   Article
dc.identifier.doi         10.1109/ACCESS.2021.3053764
dc.citation.journaltitle  IEEE Access
dc.identifier.wosid       000615026600001
dc.identifier.scopusid    2-s2.0-85106767831
dc.citation.endpage       18392
dc.citation.startpage     18383
dc.citation.volume        9
dc.description.isOpenAccess      Y
dc.contributor.affiliatedAuthor  Choi, Jin Young
dc.type.docType           Article
dc.description.journalClass      1