How to use vesteinn/FoBERT with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="vesteinn/FoBERT")

# Or load the tokenizer and model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("vesteinn/FoBERT")
model = AutoModelForMaskedLM.from_pretrained("vesteinn/FoBERT")
```

This is a Faroese language model. It was trained by adapting the ScandiBERT-no-faroese model on the FC3 corpus for 50 epochs.
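Under the hood, the fill-mask pipeline runs the masked-LM head and takes a softmax over the vocabulary logits at the `[MASK]` position, returning the top-scoring candidates. A minimal sketch of that scoring step, using random logits and a toy Faroese vocabulary in place of the real model and tokenizer (both hypothetical stand-ins, so no download is needed):

```python
import torch

# Toy vocabulary standing in for the real tokenizer's vocab (hypothetical).
vocab = ["oyggj", "land", "hav", "fjall", "bygd"]

torch.manual_seed(0)
# Random logits standing in for the model's output at the [MASK] position.
logits = torch.randn(len(vocab))

# Softmax turns the logits into a probability distribution over the vocabulary.
probs = torch.softmax(logits, dim=-1)

# Top-k candidates, analogous to what the fill-mask pipeline returns.
topk = torch.topk(probs, k=3)
for score, idx in zip(topk.values.tolist(), topk.indices.tolist()):
    print(f"{vocab[idx]}: {score:.3f}")
```

With the real model, `pipe("Føroyar eru ein <mask> í Atlantshavinum.")` performs the same computation over the full vocabulary (check the tokenizer's actual mask token before use).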
If you find this model useful, please cite:
```bibtex
@inproceedings{snaebjarnarson-etal-2023-transfer,
    title = "{T}ransfer to a Low-Resource Language via Close Relatives: The Case Study on Faroese",
    author = "Snæbjarnarson, Vésteinn and
      Simonsen, Annika and
      Glavaš, Goran and
      Vulić, Ivan",
    booktitle = "Proceedings of the 24th Nordic Conference on Computational Linguistics (NoDaLiDa)",
    month = "may 22--24",
    year = "2023",
    address = "Tórshavn, Faroe Islands",
    publisher = "Link{\"o}ping University Electronic Press, Sweden",
}
```