TTM-RE: Memory-Augmented Document-Level Relation Extraction (ACL 2024)
Please see the scripts folder for example run files; run_roberta_rank.sh reproduces the best-performing configuration. Pretrained weights for roberta-large are available in the GitHub releases: https://github.com/chufangao/TTM-RE/releases/tag/v0.2
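For example, a typical invocation might look like the following (the exact arguments, weight filenames, and data paths are assumptions; check run_roberta_rank.sh and the release page for specifics):

git clone https://github.com/chufangao/TTM-RE.git
cd TTM-RE
# download the released roberta-large weights from the release page linked above
bash scripts/run_roberta_rank.sh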
Please cite:
@inproceedings{gao-etal-2024-ttm,
title = "{TTM}-{RE}: Memory-Augmented Document-Level Relation Extraction",
author = "Gao, Chufan and
Wang, Xuan and
Sun, Jimeng",
editor = "Ku, Lun-Wei and
Martins, Andre and
Srikumar, Vivek",
booktitle = "Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = aug,
year = "2024",
address = "Bangkok, Thailand",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2024.acl-long.26",
doi = "10.18653/v1/2024.acl-long.26",
pages = "443--458",
}
Many thanks to https://github.com/www-Ye/SSR-PU and https://github.com/wzhouad/ATLOP for making their code open source.