arxiv:2508.02271
Jacob Nielsen (JacobBITLABS)
AI & ML interests: None yet
Recent Activity
New activity, 3 days ago: danish-foundation-models/danish-dynaword: Governmentpdfs
Authored a paper, about 1 month ago: Continual Quantization-Aware Pre-Training: When to transition from 16-bit to 1.58-bit pre-training for BitNet language models?
Authored a paper, about 1 month ago: DeToNATION: Decoupled Torch Network-Aware Training on Interlinked Online Nodes