Detailed Information
Advancing Beyond Identification: Multi-bit Watermark for Large Language Models via Position Allocation
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Yoo, KiYoon | - |
dc.contributor.author | Ahn, Wonhyuk | - |
dc.contributor.author | Kwak, Nojun | - |
dc.date.accessioned | 2024-08-08T01:17:32Z | - |
dc.date.available | 2024-08-08T01:17:32Z | - |
dc.date.created | 2024-08-05 | - |
dc.date.issued | 2024 | - |
dc.identifier.citation | Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL 2024, Vol.1, pp.4031-4055 | - |
dc.identifier.uri | https://hdl.handle.net/10371/204962 | - |
dc.description.abstract | We show the viability of tackling misuse of large language models beyond the identification of machine-generated text. While existing zero-bit watermarking methods focus on detection only, some malicious misuses demand tracing the adversarial user in order to counteract them. To address this, we propose Multi-bit Watermark via Position Allocation, which embeds traceable multi-bit information during language model generation. By allocating tokens to different parts of the message, we can embed longer messages in high-corruption settings without added latency. Because sub-units of the message are embedded independently, the proposed method outperforms existing work in both robustness and latency. Leveraging the benefits of zero-bit watermarking (Kirchenbauer et al., 2023a), our method enables robust extraction of the watermark without any model access, embedding and extraction of long messages (≥ 32 bits) without finetuning, and maintained text quality, while simultaneously allowing zero-bit detection. Code is released here: https://github.com/bangawayoo/mb-lmwatermarking. | - |
dc.publisher | Association for Computational Linguistics (ACL) | - |
dc.title | Advancing Beyond Identification: Multi-bit Watermark for Large Language Models via Position Allocation | - |
dc.type | Article | - |
dc.identifier.doi | 10.18653/v1/2024.naacl-long.224 | - |
dc.citation.journaltitle | Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL 2024 | - |
dc.identifier.scopusid | 2-s2.0-85199524591 | - |
dc.citation.endpage | 4055 | - |
dc.citation.startpage | 4031 | - |
dc.citation.volume | 1 | - |
dc.description.isOpenAccess | N | - |
dc.contributor.affiliatedAuthor | Kwak, Nojun | - |
dc.type.docType | Conference Paper | - |
dc.description.journalClass | 1 | - |
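The position-allocation idea described in the abstract can be sketched roughly as follows. This is a toy illustration, not the authors' released implementation: the function names, the hash-based position rule, and the majority-vote extraction are all simplifications assumed here. Each generation step is pseudo-randomly assigned to one bit position of the message, and the token is drawn from a vocabulary half ("green list") determined by that bit, in the spirit of zero-bit watermarking (Kirchenbauer et al., 2023a).

```python
import hashlib
import random

def position_of(prev_token: str, num_positions: int) -> int:
    # Pseudo-randomly allocate each generation step to a bit position,
    # seeded by the previous token (hypothetical rule for illustration).
    h = hashlib.sha256(prev_token.encode()).hexdigest()
    return int(h, 16) % num_positions

def green_list(prev_token: str, bit: int, vocab: list[str]) -> set[str]:
    # Partition the vocabulary into two disjoint halves, seeded by the
    # previous token; the message bit selects which half is "green".
    rng = random.Random(hashlib.sha256(prev_token.encode()).digest())
    shuffled = vocab[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return set(shuffled[:half]) if bit == 0 else set(shuffled[half:])

def embed(message: list[int], length: int, vocab: list[str], seed: int = 0) -> list[str]:
    # Toy "generation": at each step, sample a token from the green list
    # selected by the bit allocated to this step's position. A real LM
    # would instead bias its logits toward the green list.
    rng = random.Random(seed)
    tokens = ["<s>"]
    for _ in range(length):
        pos = position_of(tokens[-1], len(message))
        greens = sorted(green_list(tokens[-1], message[pos], vocab))
        tokens.append(rng.choice(greens))
    return tokens[1:]

def extract(tokens: list[str], num_bits: int, vocab: list[str]) -> list[int]:
    # Model-free extraction: recompute each step's position and count
    # which bit's green list the observed token fell into, then take a
    # majority vote per position.
    votes = [[0, 0] for _ in range(num_bits)]
    prev = "<s>"
    for tok in tokens:
        pos = position_of(prev, num_bits)
        for bit in (0, 1):
            if tok in green_list(prev, bit, vocab):
                votes[pos][bit] += 1
        prev = tok
    return [0 if v0 >= v1 else 1 for v0, v1 in votes]
```

Because the sub-units (bit positions) are embedded independently, corrupting some spans of text only degrades the votes for the positions those spans were allocated to, which is the robustness property the abstract highlights.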
- Appears in Collections:
- Files in This Item:
- There are no files associated with this item.
Related Researcher
- Graduate School of Convergence Science & Technology
- Department of Intelligence and Information
Items in S-Space are protected by copyright, with all rights reserved, unless otherwise indicated.