Publications

Detailed Information

Generative Neural Fields by Mixtures of Neural Implicit Functions

Cited 0 times in Web of Science; cited 0 times in Scopus
Authors

You, Tackgeun; Kim, Jungtaek; Kim, Mijeong; Han, Bohyung

Issue Date
2023-12
Publisher
Neural Information Processing Systems Foundation
Citation
Advances in Neural Information Processing Systems, Vol.36
Abstract
We propose a novel approach to learning generative neural fields represented by linear combinations of implicit basis networks. Our algorithm learns the basis networks, in the form of implicit neural representations, and their coefficients in a latent space by either conducting meta-learning or adopting an auto-decoding paradigm. The proposed method easily enlarges the capacity of generative neural fields by increasing the number of basis networks, while keeping the network used for inference small through weighted model averaging. Consequently, sampling instances with the model is efficient in terms of latency and memory footprint. Moreover, we customize a denoising diffusion probabilistic model for the target task to sample latent mixture coefficients, which allows our final model to generate unseen data effectively. Experiments show that our approach achieves competitive generation performance on diverse benchmarks for images, voxel data, and NeRF scenes without sophisticated designs for specific modalities and domains.
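The core idea of the abstract — combining several basis networks into a single small inference network by averaging their weights with latent mixture coefficients — can be illustrated with a minimal NumPy sketch. This is a hypothetical toy reconstruction, not the authors' code: the network shapes, the `mix_params` helper, and the Dirichlet-sampled coefficients are all illustrative assumptions.

```python
import numpy as np

def mlp_forward(params, x):
    # params: list of (W, b) layers; ReLU hidden layers, linear output.
    h = x
    for W, b in params[:-1]:
        h = np.maximum(h @ W + b, 0.0)
    W, b = params[-1]
    return h @ W + b

def mix_params(bases, coeffs):
    # Weighted model averaging: collapse K basis networks into one
    # network whose per-layer weights are the coefficient-weighted sums.
    mixed = []
    for layer in zip(*bases):  # layer = ((W_1, b_1), ..., (W_K, b_K))
        W = sum(c * Wk for c, (Wk, _) in zip(coeffs, layer))
        b = sum(c * bk for c, (_, bk) in zip(coeffs, layer))
        mixed.append((W, b))
    return mixed

rng = np.random.default_rng(0)
K, d_in, d_hidden, d_out = 4, 2, 8, 3  # toy sizes, not from the paper
bases = [
    [(rng.standard_normal((d_in, d_hidden)), rng.standard_normal(d_hidden)),
     (rng.standard_normal((d_hidden, d_out)), rng.standard_normal(d_out))]
    for _ in range(K)
]
# In the paper the coefficients come from a learned latent space
# (sampled by a diffusion model); here we just draw a simplex point.
coeffs = rng.dirichlet(np.ones(K))

params = mix_params(bases, coeffs)       # one small network for inference
coords = rng.standard_normal((5, d_in))  # query coordinates of the field
out = mlp_forward(params, coords)
print(out.shape)  # (5, 3): one field value per query coordinate
```

Note the efficiency claim this illustrates: inference touches only the single averaged network, so latency and memory stay constant as the number of basis networks K grows.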
ISSN
1049-5258
URI
https://hdl.handle.net/10371/204811
DOI
https://doi.org/10.48550/arXiv.2310.19464
Files in This Item:
There are no files associated with this item.

Items in S-Space are protected by copyright, with all rights reserved, unless otherwise indicated.
