
This is my glorious attempt to understand the Mistral 7B model. Since Mistral AI has open-sourced the model code, I tried to replicate a small version of it. Like... really small. A whopping eight million parameters. Needless to say, the model is useless for any practical task.
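To get a feel for where a number like eight million can come from, here is a back-of-the-envelope parameter count for a tiny Mistral-style transformer. All hyperparameters below are my own illustrative guesses, not the actual config of this model:

```python
# Hypothetical tiny Mistral-style config (illustrative guesses, NOT the real one)
vocab_size = 32_000              # Mistral's tokenizer vocabulary size
d_model    = 128                 # embedding / hidden dimension
n_layers   = 16
n_heads    = 4                   # query heads
n_kv_heads = 2                   # grouped-query attention: fewer key/value heads
head_dim   = d_model // n_heads  # 32
hidden_dim = 4 * d_model         # 512, SwiGLU MLP width

# Attention projections (no biases, as in Mistral)
attn = (
    d_model * n_heads * head_dim        # q_proj
    + d_model * n_kv_heads * head_dim   # k_proj
    + d_model * n_kv_heads * head_dim   # v_proj
    + n_heads * head_dim * d_model      # o_proj
)
# SwiGLU MLP: gate, up, and down projections
mlp = 3 * d_model * hidden_dim
# Two RMSNorm weight vectors per layer
norms = 2 * d_model

per_layer = attn + mlp + norms
# Token embeddings (tied with the output head) plus the final RMSNorm
total = vocab_size * d_model + n_layers * per_layer + d_model

print(f"{total:,} parameters")  # prints "8,032,384 parameters"
```

Most of the budget goes into the token embeddings; the transformer layers themselves are tiny at this scale.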

The model was trained on a handful of examples from the Cosmopedia dataset, an open-source synthetic textbook dataset in the style of the data used for the Phi models. Check out my GitHub to see the code used: https://github.com/LeonardPuettmann/understanding-mistral

How to use

Please don't. You should probably use Mistral 7B instead: mistralai/Mistral-7B-v0.3. Or, if you are (very) GPU rich, you can try to run their model yourself: https://github.com/mistralai/mistral-inference

In the inference folder you'll find a small script that lets you chat with the 7B-parameter model. All you need is a free Hugging Face API token.
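The actual script lives in the repo, but a minimal sketch of talking to the 7B model through the serverless Hugging Face Inference API, using only the standard library, might look like this (the endpoint URL and Mistral's `[INST]` chat template follow the public docs; `HF_TOKEN` is a placeholder environment variable):

```python
# Minimal sketch: chat with Mistral-7B via the serverless HF Inference API.
# The repo's own inference script may differ; this is an illustrative version.
import json
import os
import urllib.request

API_URL = "https://api-inference.huggingface.co/models/mistralai/Mistral-7B-Instruct-v0.3"

def build_prompt(user_message: str) -> str:
    """Wrap a user message in Mistral's [INST] chat template."""
    return f"<s>[INST] {user_message} [/INST]"

def query(prompt: str, token: str) -> str:
    """POST the prompt to the Inference API and return the generated text."""
    payload = json.dumps(
        {"inputs": prompt, "parameters": {"max_new_tokens": 128}}
    ).encode()
    req = urllib.request.Request(
        API_URL,
        data=payload,
        headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)[0]["generated_text"]

if __name__ == "__main__":
    hf_token = os.environ.get("HF_TOKEN")  # your free Hugging Face API token
    if hf_token:
        print(query(build_prompt("Why is the sky blue?"), hf_token))
```

Swap in this tiny 8M model's repo id and you'll see just how useless it is compared to the real thing.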


License: Apache 2.0
