Add `library_name` to metadata

#9, opened by nielsr (HF Staff)

This PR enhances the model card by adding `library_name: transformers` to the metadata.

This tag is justified by the `config.json` file, which specifies `"architectures": ["LlamaForCausalLM"]` and `"model_type": "llama"`. Llama-based models are typically integrated and used with the Hugging Face `transformers` library, and the tag enables a predefined usage snippet for users on the Hub.
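Concretely, the change adds one key to the YAML front matter at the top of the model card's README.md. A minimal sketch (the `license` and `pipeline_tag` fields here are illustrative placeholders, not part of this PR):

```yaml
---
# Existing metadata (example values, not from this PR)
license: apache-2.0
pipeline_tag: text-generation
# Added by this PR: tells the Hub which library loads the model
library_name: transformers
---
```

With `library_name: transformers` set, the Hub can show the standard "Use this model" snippet for the `transformers` library on the model page.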

No sample usage code snippet has been added as the provided GitHub README does not contain a suitable Python example for programmatic inference via a library.
