# MobileLLM 80M (Replication)

This is a replicated version of the MobileLLM 80M model, based on the paper *MobileLLM: Optimizing Sub-billion Parameter Language Models for On-Device Use Cases*.

- **Model Size:** 80M parameters
- **Architecture:** Llama-based (Deep & Thin)
- **Status:** Research / Testing
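"Deep & Thin" refers to the paper's finding that, at a fixed sub-billion parameter budget, stacking more transformer layers with a narrower hidden size outperforms the conventional wide-and-shallow layout. As a rough sanity check on how such a budget is spent, here is a sketch of a Llama-style parameter counter; the dimensions in the example call are illustrative assumptions, not the actual configuration of this checkpoint:

```python
def llama_param_count(vocab: int, d_model: int, n_layers: int, d_ff: int,
                      tied_embeddings: bool = True) -> int:
    """Approximate parameter count for a Llama-style decoder
    (ignores rotary embeddings, which add no learned weights)."""
    # token embedding, plus a separate output head if untied
    emb = vocab * d_model * (1 if tied_embeddings else 2)
    # per layer: attention projections (q, k, v, o)
    attn = 4 * d_model * d_model
    # per layer: SwiGLU MLP (gate, up, down projections)
    mlp = 3 * d_model * d_ff
    # per layer: two RMSNorm weight vectors
    norms = 2 * d_model
    # final RMSNorm before the output head
    return emb + n_layers * (attn + mlp + norms) + d_model


# Illustrative "deep & thin" dims (assumed, not from this model card):
# a narrow hidden size with many layers lands near an 80M budget.
total = llama_param_count(vocab=32000, d_model=512, n_layers=20, d_ff=1408)
print(f"{total / 1e6:.1f}M parameters")
```

With these assumed dimensions the count comes out around 80M, showing how a thin hidden size keeps the embedding table from dominating the budget while leaving room for depth.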