How to use J-Seo/BIH with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="J-Seo/BIH")

# Or load the model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("J-Seo/BIH")
model = AutoModelForSequenceClassification.from_pretrained("J-Seo/BIH")
```
BIH (BERT Imitates Human) Model
This is a fine-tuned model based on the pretrained klue/roberta-large.
BIH learns from examples that native Korean speakers evaluated for their 'fit for commonsense'.
How to use
Please see the GitHub repository J-Seo/SRLev-BIH.
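Since BIH is a sequence-classification model, its per-label scores are read by applying softmax to the logits of the classification head. A minimal sketch of that step, with illustrative logit values (the actual label names and mapping are defined in the model's config, not shown here):

```python
import math

def softmax(logits):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for one sentence from a 2-label head
# (e.g. "fits commonsense" vs. "does not"); real values would come from
# AutoModelForSequenceClassification.from_pretrained("J-Seo/BIH").
logits = [1.2, -0.8]
probs = softmax(logits)
```

The `pipeline("text-classification", ...)` helper performs this conversion internally and returns the top label with its probability.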
BibTeX entry and citation info
```bibtex
@inproceedings{jay2022SRLev-BIH,
  title={SRLev-BIH: An Evaluation Metric for Korean Generative Commonsense Reasoning},
  author={Jaehyung Seo and Yoonna Jang and Jaewook Lee and Hyeonseok Moon and Sugyeong Eo and Chanjun Park and Aram So and Heuiseok Lim},
  booktitle={Proceedings of the 34th Annual Conference on Human \& Cognitive Language Technology},
  affiliation={Korea University, NLP \& AI},
  month={October},
  year={2022}
}
```