# Kjio - Educational AI Assistant

Developed by Synaptom | Founded by Joniethanel F. Babor
## Overview

- Parameters: 109,870,848 (~110M; verified in the check below)
- Architecture: GPT-2 style (10 layers, 768 hidden size, 12 attention heads)
- Context length: 512 tokens
- Training: 45,000 samples in 32.5 minutes
- Purpose: homework help, Q&A, and educational tutoring
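The parameter count is easy to verify once the model is loaded (loading is covered under Quick Start below). This is a quick sanity check, not part of the official card:

```python
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("Synaptom/Kjio")

# Sum the element counts of all weight tensors
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params:,}")  # expected: 109,870,848
```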
## Quick Start

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer from the Hugging Face Hub
model = AutoModelForCausalLM.from_pretrained("Synaptom/Kjio")
tokenizer = AutoTokenizer.from_pretrained("Synaptom/Kjio")

# Kjio expects a "User: ...\nKjio:" chat format
prompt = "User: Who are you?\nKjio:"
inputs = tokenizer(prompt, return_tensors="pt")

# do_sample=True is needed for temperature to take effect
outputs = model.generate(**inputs, max_new_tokens=100, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
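For repeated questions, the prompt format can be wrapped in a small helper. This is a minimal sketch reusing the `model` and `tokenizer` loaded above; the `ask` function and its stop-at-`User:` trimming are illustrative, not part of the released card:

```python
def ask(question: str, max_new_tokens: int = 100) -> str:
    """Ask Kjio a question using its 'User: ...\\nKjio:' prompt format."""
    prompt = f"User: {question}\nKjio:"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(
        **inputs, max_new_tokens=max_new_tokens, do_sample=True, temperature=0.7
    )
    text = tokenizer.decode(outputs[0], skip_special_tokens=True)
    # Keep only the reply: drop the prompt, stop at any invented follow-up turn
    return text[len(prompt):].split("User:")[0].strip()

print(ask("What is 25 x 17?"))
```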
## GGUF Downloads

For llama.cpp (CPU inference):

- Kjio-Q4_K_M.gguf - recommended (best size/quality balance)
- Kjio-Q5_K_M.gguf - higher quality, larger file
- Kjio-F16.gguf - full precision
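As a sketch of CPU inference with a quantized file, the following assumes the llama-cpp-python bindings (`pip install llama-cpp-python`) and that Kjio-Q4_K_M.gguf has already been downloaded; the local path is an assumption:

```python
from llama_cpp import Llama

# Load the quantized model from a local path (assumed filename/location)
llm = Llama(model_path="./Kjio-Q4_K_M.gguf", n_ctx=512)

output = llm(
    "User: Who created you?\nKjio:",
    max_tokens=100,
    temperature=0.7,
    stop=["User:"],  # stop before the model starts a new user turn
)
print(output["choices"][0]["text"].strip())
```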
## Sample Outputs

Q: Who are you?
A: I'm Kjio, an AI assistant by Synaptom!

Q: Who created you?
A: Synaptom created me. Founded by Joniethanel F. Babor.

Q: What is 25 × 17?
A: 425
## Training Details

- Research-backed dataset design
- Identity reinforcement (identity examples heavily weighted in the mix)
- Safety training (refusal examples)
- Mixed-precision FP16 training (see the sketch below)
- 1,200 training steps
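For readers who want to reproduce a comparable run, here is a minimal FP16 fine-tuning sketch with the Hugging Face Trainer. Only `fp16=True` and `max_steps=1_200` come from the details above; the base checkpoint, batch size, and placeholder corpus are assumptions (the real run used 45,000 samples), and fp16 requires a CUDA GPU:

```python
from datasets import Dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Stock GPT-2 stands in for Kjio's 10-layer variant (assumption)
tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Tiny placeholder corpus in Kjio's chat format
texts = ["User: Who are you?\nKjio: I'm Kjio, an AI assistant by Synaptom!"]
dataset = Dataset.from_dict({"text": texts}).map(
    lambda ex: tokenizer(ex["text"], truncation=True, max_length=512),
    remove_columns=["text"],
)

args = TrainingArguments(
    output_dir="kjio-finetune",
    max_steps=1_200,                 # matches the step count above
    fp16=True,                       # mixed-precision FP16, as noted above
    per_device_train_batch_size=8,   # assumption, not documented
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```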
## Limitations

- Small model (~110M parameters), so capabilities are limited
- May produce incorrect information
- English only
- Not intended for critical decisions
## License

Apache 2.0 - free for commercial and research use
## Citation

```bibtex
@misc{kjio2025,
  title={Kjio: Educational AI Assistant},
  author={Babor, Joniethanel F. and Synaptom},
  year={2025},
  url={https://huggingface.co/Synaptom/Kjio}
}
```
Made with ❤️ by Synaptom

Training time: 32.5 minutes | Total time: 41.7 minutes