Instructions to use textattack/roberta-base-MRPC with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use textattack/roberta-base-MRPC with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="textattack/roberta-base-MRPC")

# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("textattack/roberta-base-MRPC")
model = AutoModelForSequenceClassification.from_pretrained("textattack/roberta-base-MRPC")
```
- Notebooks
- Google Colab
- Kaggle
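MRPC is a sentence-pair (paraphrase) task, so the model returns one logit per class; turning those logits into a prediction is a softmax followed by an argmax. A minimal sketch of that post-processing step, using hypothetical logits in place of real model output (the label names and their order are assumptions, not taken from the model config):

```python
import math

def softmax(logits):
    # Numerically stable softmax: subtract the max before exponentiating
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits, standing in for the output of
# model(**tokenizer(sentence1, sentence2, return_tensors="pt"))
logits = [-1.2, 2.3]
probs = softmax(logits)

labels = ["not_paraphrase", "paraphrase"]  # assumed label order for illustration
pred = labels[probs.index(max(probs))]
```

The pipeline helper performs this step internally; the sketch only shows what happens between raw logits and the returned label.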
- Xet hash: 8f5b32f8a50405200eab6502b0dddbef9a17c3e19d0b9902c4e1379fed1fb500
- SHA256: f0b18579cbcbc116214655e09738909f368fe87dbb230b68640e26c606f79efa
- Size of remote file: 499 MB
Xet efficiently stores large files inside Git by splitting them into unique chunks, which accelerates uploads and downloads.