How to use suayptalha/MoE-Router with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="suayptalha/MoE-Router")
```
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("suayptalha/MoE-Router")
model = AutoModelForSequenceClassification.from_pretrained("suayptalha/MoE-Router")
```
The classifier routes each input to one of four labels:

```python
labels = ['code', 'if', 'math', 'medical']
```
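To turn the model's raw logits into one of the labels above, apply a softmax and take the argmax. A minimal sketch of that step, using made-up logit values in place of the real `model(**inputs).logits` output:

```python
import math

labels = ['code', 'if', 'math', 'medical']

# Hypothetical logits for one input; with the real model these come
# from model(**inputs).logits (shape: [batch_size, num_labels])
logits = [0.2, -1.3, 3.1, 0.4]

# Softmax: exponentiate and normalize so the scores sum to 1
exps = [math.exp(x) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]

# The predicted route is the highest-probability label
pred = labels[max(range(len(probs)), key=probs.__getitem__)]
print(pred)  # prints: math
```

The `pipeline("text-classification", ...)` helper performs this softmax-and-argmax step internally and returns the label with its score directly.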