Publications


Graph-based PU learning for binary and multiclass classification without class prior

Cited 1 time in Web of Science; cited 1 time in Scopus
Authors

Yoo, Jaemin; Kim, Junghun; Yoon, Hoyoung; Kim, Geonsoo; Jang, Changwon; Kang, U.

Issue Date
2022-08
Publisher
Springer-Verlag
Citation
Knowledge and Information Systems, Vol. 64, No. 8, pp. 2141-2169
Abstract
How can we classify graph-structured data with only positive labels? Graph-based positive-unlabeled (PU) learning trains a binary classifier from positive labels alone when the relationships between examples are given as a graph. The problem is of great importance for tasks such as detecting malicious accounts in a social network, which are difficult to model with supervised learning when true negative labels are absent. Previous works on graph-based PU learning assume that the prior distribution of positive nodes is known in advance, which does not hold in many real-world cases. In this work, we propose GRAB (Graph-based Risk minimization with iterAtive Belief propagation), a novel end-to-end approach to graph-based PU learning that requires no class prior. GRAB runs marginalization and update steps iteratively. The marginalization step models the given graph as a Markov network and estimates the marginals of the latent variables. The update step trains the binary classifier by utilizing the computed marginals in the objective function. We then generalize GRAB to multi-positive unlabeled (MPU) learning, where multiple positive classes exist in a dataset. Extensive experiments on five real-world datasets show that GRAB achieves state-of-the-art performance, even when the true prior is given only to the competitors.
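The alternation the abstract describes can be illustrated with a toy sketch. This is a deliberate simplification, not the authors' method: the marginalization step below replaces loopy belief propagation on the Markov network with a damped neighbor-averaging pass (positives clamped to 1), and the update step fits a one-feature logistic model to the estimated marginals as soft targets. The graph, features, and hyperparameters are all hypothetical.

```python
import math

def estimate_marginals(adj, scores, positives, n_iters=10, damping=0.5):
    """Marginalization step (simplified stand-in for belief propagation):
    each node's belief becomes a damped average of its neighbors' beliefs,
    with labeled positives clamped to 1.0."""
    beliefs = dict(scores)
    for v in positives:
        beliefs[v] = 1.0
    for _ in range(n_iters):
        new = {}
        for v, nbrs in adj.items():
            if v in positives:
                new[v] = 1.0
                continue
            nbr_avg = sum(beliefs[u] for u in nbrs) / len(nbrs) if nbrs else beliefs[v]
            new[v] = damping * beliefs[v] + (1 - damping) * nbr_avg
        beliefs = new
    return beliefs

def update_classifier(features, marginals, lr=0.1, epochs=200):
    """Update step: fit a 1-feature logistic model, using the estimated
    marginals as soft targets in the training objective."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for v, x in features.items():
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            grad = p - marginals[v]  # gradient of cross-entropy vs. soft target
            w -= lr * grad * x
            b -= lr * grad
    return w, b

# Toy chain graph: node 0 is the only labeled positive; node 3 is farthest away.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
features = {0: 2.0, 1: 1.5, 2: -1.0, 3: -2.0}
positives = {0}

scores = {v: 0.5 for v in adj}  # uninformative initial beliefs
for _ in range(3):              # iterate marginalization and update steps
    marginals = estimate_marginals(adj, scores, positives)
    w, b = update_classifier(features, marginals)
    scores = {v: 1.0 / (1.0 + math.exp(-(w * x + b))) for v, x in features.items()}
```

Note that nothing here requires a class prior: the marginals themselves, refined each round by the retrained classifier, play the role that the known prior plays in earlier PU methods.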
ISSN
0219-1377
URI
https://hdl.handle.net/10371/184810
DOI
https://doi.org/10.1007/s10115-022-01702-8
Items in S-Space are protected by copyright, with all rights reserved, unless otherwise indicated.
