Publications

Detailed Information

Structured Energy Network as a Loss Function

Authors

Lee, Jay-Yoon; Patel, Dhruvesh; Goyal, Purujit; Zhao, Wenlong; Xu, Zhiyang; McCallum, Andrew

Issue Date
2022
Publisher
Advances in Neural Information Processing Systems
Citation
Advances in Neural Information Processing Systems, Vol.35
Abstract
Belanger & McCallum (2016) and Gygli et al. (2017) have shown that energy networks can capture arbitrary dependencies amongst the output variables in structured prediction; however, their reliance on gradient-based inference (GBI) makes inference slow and unstable. In this work, we propose Structured Energy As Loss (SEAL) to take advantage of the expressivity of energy networks without incurring the high inference cost. SEAL is a novel learning framework that uses an energy network as a trainable loss function (loss-net) to train a separate neural network (task-net), which is then used to perform inference through a forward pass. We establish SEAL as a general framework wherein various learning strategies, such as margin-based, regression, and noise-contrastive learning, can be employed to learn the parameters of the loss-net. Through extensive evaluation on multi-label classification, semantic role labeling, and image segmentation, we demonstrate that SEAL provides various useful design choices, is faster at inference than GBI, and leads to significant performance gains over the baselines.
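The two-stage idea in the abstract can be illustrated with a deliberately tiny sketch: first fit the loss-net's energy to a known task loss on perturbed outputs (the regression strategy mentioned above), then train the task-net to minimize that frozen energy, so inference is a single forward pass. Everything below is an illustrative assumption for a 1-D toy problem — the parameterizations, names, and update rules are not the authors' implementation.

```python
import random

random.seed(0)

# Toy 1-D data: inputs x with ground-truth outputs y_star = 2*x.
data = [(x, 2.0 * x) for x in [0.5, 1.0, 1.5, 2.0]]

# loss-net (illustrative): E_theta(x, y) = theta * (y - 2*x)**2
theta = 0.1
# task-net (illustrative): f_w(x) = w * x
w = 0.0
lr = 0.05

# Phase 1 -- regression-style loss-net learning: fit the energy so it
# matches a known task loss L(y, y_star) = (y - y_star)**2 on perturbed
# candidate outputs.
for _ in range(1000):
    x, y_star = random.choice(data)
    y = y_star + random.uniform(-1.0, 1.0)   # perturbed candidate output
    target = (y - y_star) ** 2               # true task loss
    energy = theta * (y - 2.0 * x) ** 2
    # gradient of 0.5 * (energy - target)**2 w.r.t. theta
    grad_theta = (energy - target) * (y - 2.0 * x) ** 2
    theta -= lr * grad_theta

# Phase 2 -- train the task-net by minimizing the (frozen) loss-net energy.
for _ in range(200):
    x, _ = random.choice(data)
    y_hat = w * x
    # gradient of theta * (w*x - 2*x)**2 w.r.t. w
    grad_w = theta * 2.0 * (y_hat - 2.0 * x) * x
    w -= lr * grad_w

# Inference is now a single forward pass of the task-net: y_hat = w * x.
print(round(theta, 2), round(w, 2))
```

In this toy, theta approaches 1 (the energy recovers the true loss) and w approaches 2 (the task-net recovers the ground-truth map), so the expensive part — shaping the energy — happens at training time only, which is the cost structure SEAL is after.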
ISSN
1049-5258
URI
https://hdl.handle.net/10371/200916
Files in This Item:
There are no files associated with this item.
Appears in Collections:

  • Graduate School of Data Science

Related Researcher

Research Area: Constraint injection, Energy-based models, Structured Prediction


Items in S-Space are protected by copyright, with all rights reserved, unless otherwise indicated.