How to use UdS-LSV/mcse-coco-roberta-base with the Transformers library:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="UdS-LSV/mcse-coco-roberta-base")

# Or load the model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("UdS-LSV/mcse-coco-roberta-base")
model = AutoModel.from_pretrained("UdS-LSV/mcse-coco-roberta-base")
```
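Once the model is loaded, a sentence embedding is typically taken from the encoder's `last_hidden_state` (e.g. the `[CLS]` token at position 0, as in SimCSE-style models) and sentences are compared by cosine similarity. A minimal sketch of that pooling and comparison step, using a random array in place of the real model output (the pooling logic itself does not depend on the model):

```python
import numpy as np

def cls_pool(last_hidden_state):
    # Take the [CLS] token (position 0) as the sentence embedding,
    # the usual choice for SimCSE-style contrastive encoders.
    return last_hidden_state[:, 0, :]

def cosine_similarity(a, b):
    a = a / np.linalg.norm(a, axis=-1, keepdims=True)
    b = b / np.linalg.norm(b, axis=-1, keepdims=True)
    return float((a * b).sum(axis=-1))

# Dummy stand-in for model(**inputs).last_hidden_state:
# shape (batch_size=2, seq_len=5, hidden_size=768)
rng = np.random.default_rng(0)
hidden = rng.standard_normal((2, 5, 768))

embeddings = cls_pool(hidden)          # shape (2, 768)
sim = cosine_similarity(embeddings[0], embeddings[1])
print(embeddings.shape, sim)
```

With the real model, `hidden` would come from `model(**tokenizer(sentences, return_tensors="pt", padding=True)).last_hidden_state` (converted to NumPy); whether `[CLS]` or mean pooling works best may depend on the checkpoint, so this is an assumption worth verifying against the MCSE repository.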
MCSE: Multimodal Contrastive Learning of Sentence Embeddings (NAACL 2022)
Paper link: https://aclanthology.org/2022.naacl-main.436/
GitHub: https://github.com/uds-lsv/MCSE
Authors: Miaoran Zhang, Marius Mosbach, David Adelani, Michael Hedderich, Dietrich Klakow
Model Details
- Base model: roberta-base
- Training data: Wiki1M + MS-COCO
Evaluation Results
| STS12 | STS13 | STS14 | STS15 | STS16 | STSBenchmark | SICKRelatedness | Avg. |
|---|---|---|---|---|---|---|---|
| 70.79 | 82.81 | 76.16 | 83.13 | 81.76 | 81.72 | 70.81 | 78.17 |