Publications

Detailed Information

CONTEXTUAL LABEL TRANSFORMATION FOR SCENE GRAPH GENERATION

DC Field: Value

dc.contributor.author: Lee, Wonhee
dc.contributor.author: Kim, Sungeun
dc.contributor.author: Kim, Gun Hee
dc.date.accessioned: 2022-06-24T00:26:19Z
dc.date.available: 2022-06-24T00:26:19Z
dc.date.created: 2022-05-09
dc.date.issued: 2021-01
dc.identifier.citation: Proceedings - International Conference on Image Processing, ICIP, Vol.2021-September, pp.2533-2537
dc.identifier.issn: 1522-4880
dc.identifier.uri: https://hdl.handle.net/10371/183767
dc.description.abstract: © 2021 IEEE. For scene graph generation, it is crucial to properly understand the relationships of objects within the context of the image. We design a label transformation method using a Transformer-VAE (Variational Autoencoder) structure, which converts bounding box labels into auxiliary labels that contain each object's context in an unsupervised manner. The auxiliary labels are then trained jointly with bounding box labels and relation labels in a multi-task way. Our approach does not require any external datasets or language priors and is applicable to any graph generation model that infers the relationship between pairs of objects. We validate our method's effectiveness and scalability with state-of-the-art scene graph generation models on the VRD and VG datasets.
dc.language: English
dc.publisher: IEEE
dc.title: CONTEXTUAL LABEL TRANSFORMATION FOR SCENE GRAPH GENERATION
dc.type: Article
dc.identifier.doi: 10.1109/ICIP42928.2021.9506213
dc.citation.journaltitle: Proceedings - International Conference on Image Processing, ICIP
dc.identifier.scopusid: 2-s2.0-85125591467
dc.citation.endpage: 2537
dc.citation.startpage: 2533
dc.citation.volume: 2021-September
dc.description.isOpenAccess: N
dc.contributor.affiliatedAuthor: Kim, Gun Hee
dc.type.docType: Conference Paper
dc.description.journalClass: 1
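The abstract describes training the auxiliary contextual labels (produced by a Transformer-VAE) jointly with bounding box and relation labels in a multi-task objective. A minimal sketch of such an objective is below; the function names, the diagonal-Gaussian KL term, and the loss weights are illustrative assumptions, not the paper's implementation.

```python
import math

def gaussian_kl(mu, logvar):
    # KL(q(z|x) || N(0, I)) for a diagonal Gaussian posterior,
    # summed over latent dimensions -- the standard VAE regularizer.
    return 0.5 * sum(math.exp(lv) + m * m - 1.0 - lv
                     for m, lv in zip(mu, logvar))

def multi_task_loss(rel_loss, bbox_loss, recon_loss, mu, logvar,
                    w_rel=1.0, w_bbox=1.0, w_aux=0.5):
    # Auxiliary-label (VAE) loss = label reconstruction + KL term;
    # the total is a weighted sum over the three tasks. Weights are
    # hypothetical hyperparameters, not values from the paper.
    aux_loss = recon_loss + gaussian_kl(mu, logvar)
    return w_rel * rel_loss + w_bbox * bbox_loss + w_aux * aux_loss

# Example: per-task losses would come from the model's heads during training.
total = multi_task_loss(rel_loss=1.0, bbox_loss=1.0,
                        recon_loss=0.0, mu=[0.0], logvar=[0.0])
```

With a zero-mean, unit-variance posterior the KL term vanishes, so the total reduces to the weighted sum of the remaining task losses.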
Files in This Item:
There are no files associated with this item.


