Code for "Unsupervised Pretraining for Fact Verification by Language Model Distillation" (ICLR 2024)

AdrianBZG/SFAVEL


SFAVEL: Unsupervised Pretraining for Fact Verification by Language Model Distillation

This is the official implementation of the paper "Unsupervised Pretraining for Fact Verification by Language Model Distillation".

Code is coming soon. Stay tuned!

Citation

@inproceedings{bazaga2024unsupervised,
  title={Unsupervised Pretraining for Fact Verification by Language Model Distillation},
  author={Adrián Bazaga and Pietro Liò and Gos Micklem},
  booktitle={The Twelfth International Conference on Learning Representations},
  year={2024},
  url={https://openreview.net/forum?id=1mjsP8RYAw}
}

Contact

For feedback, questions, or press inquiries, please contact Adrián Bazaga.