news
May 15, 2025 | 🚀 Our paper “Continual Quantization-Aware Pre-Training: When to transition from 16-bit to 1.58-bit pre-training for BitNet language models?” has been accepted to ACL 2025 Findings! |
---|---|
Feb 01, 2025 | I officially started as an Industrial PhD student at Ordbogen A/S and the University of Southern Denmark. |
Jan 24, 2024 | I successfully defended my master's thesis, “Object Detection with Transformers from A Drone Perspective”, which was graded 12 on the Danish grading scale (ECTS grade: A). |