How to use JeremiahZ/roberta-base-wnli with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="JeremiahZ/roberta-base-wnli")
```

```python
# Load the model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("JeremiahZ/roberta-base-wnli")
model = AutoModelForSequenceClassification.from_pretrained("JeremiahZ/roberta-base-wnli")
```

This model is a fine-tuned version of roberta-base on the GLUE WNLI dataset. On the evaluation set it reaches its best validation loss of 0.6849 and an accuracy of 0.5634 (the epoch 1 row of the training results table below).
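WNLI is a sentence-pair entailment task, so the classifier needs both the premise and the candidate hypothesis. A minimal inference sketch under that assumption; the sentence pair below is illustrative, and the printed label names come from the checkpoint's id2label config (GLUE's WNLI convention is 0 = not_entailment, 1 = entailment):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification, pipeline

model_id = "JeremiahZ/roberta-base-wnli"
premise = "The trophy doesn't fit into the suitcase because it is too large."
hypothesis = "The trophy is too large."

# Pipeline route: pass the pair as a dict with "text" and "text_pair" keys.
pipe = pipeline("text-classification", model=model_id)
print(pipe({"text": premise, "text_pair": hypothesis}))

# Direct route: tokenize the pair and read class probabilities off the logits.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
inputs = tokenizer(premise, hypothesis, return_tensors="pt")
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1).squeeze()
print({model.config.id2label[i]: round(p.item(), 4) for i, p in enumerate(probs)})
```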
Model description: more information needed.

Intended uses & limitations: more information needed.

Training and evaluation data: more information needed.
Training hyperparameters: more information needed.

Training results per epoch ("No log" in the training-loss column means no training loss was recorded within the epoch's 40 steps):
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---|---|---|---|---|
| No log | 1.0 | 40 | 0.6849 | 0.5634 |
| No log | 2.0 | 80 | 0.6912 | 0.5634 |
| No log | 3.0 | 120 | 0.6918 | 0.5634 |
| No log | 4.0 | 160 | 0.6964 | 0.4366 |
| No log | 5.0 | 200 | 0.6928 | 0.5634 |
| No log | 6.0 | 240 | 0.7005 | 0.4366 |
| No log | 7.0 | 280 | 0.6964 | 0.3099 |
| No log | 8.0 | 320 | 0.6986 | 0.3521 |
| No log | 9.0 | 360 | 0.6969 | 0.5493 |
| No log | 10.0 | 400 | 0.6976 | 0.5634 |
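For reference, a minimal sketch of recomputing the table's validation accuracy against the "wnli" config of nyu-mll/glue (its validation split has 71 sentence pairs, so the 0.5634 above corresponds to 40/71 correct); assumes the datasets library is installed:

```python
import torch
from datasets import load_dataset
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "JeremiahZ/roberta-base-wnli"
dataset = load_dataset("nyu-mll/glue", "wnli", split="validation")  # 71 sentence pairs

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

correct = 0
for ex in dataset:
    inputs = tokenizer(ex["sentence1"], ex["sentence2"], return_tensors="pt", truncation=True)
    with torch.no_grad():
        pred = model(**inputs).logits.argmax(dim=-1).item()
    correct += int(pred == ex["label"])

print(f"accuracy: {correct / len(dataset):.4f}")
```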