Model search results (active filter: "sea")
mlx-community/SeaLLM-7B-v2-4bit-mlx • 9 downloads • 3 likes
LoneStriker/SeaLLM-7B-v2-GGUF • 7B • 189 downloads • 6 likes
LoneStriker/SeaLLM-7B-v2-3.0bpw-h6-exl2 • Text Generation
LoneStriker/SeaLLM-7B-v2-4.0bpw-h6-exl2 • Text Generation • 2
LoneStriker/SeaLLM-7B-v2-5.0bpw-h6-exl2 • Text Generation • 1
LoneStriker/SeaLLM-7B-v2-6.0bpw-h6-exl2 • Text Generation
LoneStriker/SeaLLM-7B-v2-8.0bpw-h8-exl2 • Text Generation • 1
LoneStriker/SeaLLM-7B-v2-AWQ • Text Generation • 7B • 2
sail/Sailor-1.8B-Chat-gguf • 2B • 282 downloads • 3 likes
sail/Sailor-0.5B-Chat-gguf • 0.6B • 522 downloads • 4 likes
SeaLLMs/SeaLLM-7B-v2.5-GGUF • 9B • 125 downloads • 8 likes
SeaLLMs/SeaLLM-7B-v2.5-mlx-quantized • Text Generation • 2B • 2 downloads • 2 likes
NikolayKozloff/Sailor-7B-Q8_0-GGUF • 8B • 9 downloads • 1 like
QuantFactory/SeaLLM-7B-v2.5-GGUF • Text Generation • 9B • 128 downloads • 1 like
QuantFactory/SeaLLM-7B-v2-GGUF • Text Generation • 7B • 79 downloads • 1 like
NghiemAbe/SeaLLM-7B-v2.5-AWQ • Text Generation • 6