Instructions to use answerdotai/ModernBERT-base with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use answerdotai/ModernBERT-base with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="answerdotai/ModernBERT-base")
```

```python
# Load the model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("answerdotai/ModernBERT-base")
model = AutoModelForMaskedLM.from_pretrained("answerdotai/ModernBERT-base")
```
- Notebooks
- Google Colab
- Kaggle
tokenizer (#24), opened by ulasarikaya
There isn't a specific ModernBertTokenizer like there is for DistilBert (transformers.DistilBertTokenizer).
Could anyone clarify whether an equivalent tokenizer class exists for ModernBERT, or do I have to use a generic one such as AutoTokenizer?
Thanks
It uses the PreTrainedTokenizerFast class; you can see this in the model's tokenizer_config.json. AutoTokenizer will select that tokenizer class automatically.
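As a rough mental model (a simplified sketch, not the actual transformers internals), AutoTokenizer reads the `tokenizer_class` field from the repo's tokenizer_config.json and instantiates the class it names:

```python
# Simplified sketch of how AutoTokenizer resolves the tokenizer class.
# The "tokenizer_class" field is real; the resolution logic shown here
# is an assumption-level simplification of what the library does.
import json

# Minimal stand-in for ModernBERT-base's tokenizer_config.json contents.
config_text = '{"tokenizer_class": "PreTrainedTokenizerFast"}'
config = json.loads(config_text)

# AutoTokenizer consults this field to decide which class to instantiate.
tokenizer_class = config.get("tokenizer_class")
print(tokenizer_class)
```

So calling `AutoTokenizer.from_pretrained("answerdotai/ModernBERT-base")` returns a PreTrainedTokenizerFast instance; there is no dedicated ModernBertTokenizer class to import.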
How can I use it with BertTokenizerFast?