Jacob Nielsen
JacobBITLABS
AI & ML interests: None yet
Recent Activity
- New activity in danish-foundation-models/danish-dynaword:Governmentpdfs (6 days ago)
- Authored a paper (about 2 months ago): "Continual Quantization-Aware Pre-Training: When to transition from 16-bit to 1.58-bit pre-training for BitNet language models?"
- Authored a paper (about 2 months ago): "DeToNATION: Decoupled Torch Network-Aware Training on Interlinked Online Nodes"