# MBPP SFT CodeParrot

A fine-tuned variant of `codeparrot-small`, trained on 500 samples from the MBPP (Mostly Basic Python Problems) dataset.

## Usage

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "vanishingradient/mbpp-sft-codeparrot"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Prompt with an MBPP-style natural-language task description.
inputs = tokenizer("Write a python function to reverse a string.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=120)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
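Small causal code models often keep generating past the requested function (extra top-level statements, stray prints, or a second definition). A minimal post-processing sketch is shown below; the helper name and the set of stop sequences are illustrative choices, not part of this model's API:

```python
def truncate_completion(text, stop_sequences=("\nclass ", "\ndef ", "\nprint", "\nif __name__")):
    # Cut the generated text at the earliest stop sequence, keeping only
    # the first top-level block (typically the requested function).
    cut = len(text)
    for stop in stop_sequences:
        idx = text.find(stop)
        if idx != -1:
            cut = min(cut, idx)
    return text[:cut]
```

Applied to a raw decode, `truncate_completion` keeps the first function and drops any trailing top-level code the model appended.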
Model size: 0.1B parameters, F32 tensors, stored in Safetensors format.