Research

Machine Learning

My research focuses on Deep Learning, with a current emphasis on latent spaces and reasoning in Large Language Models.

First Author Publications

Daniel Kaiser, Arnoldo Frigessi, Ali Ramezani-Kebrya, Benjamin Ricaud

We decompose LLM reasoning token efficiency into truncation robustness, conditional correctness, and workload- and trace-quality-normalized verbosity, showing that efficiency rankings can diverge from accuracy rankings while revealing distinct sources of wasted tokens (verbosity, looping, or logic errors).

Daniel Kaiser, Arnoldo Frigessi, Ali Ramezani-Kebrya, Benjamin Ricaud

A synthetic benchmark grounded in Cognitive Load Theory (CLT) that generates natural-language logic puzzles with independently tunable parameters to diagnose LLM reasoning bottlenecks.

NeurIPS 2025: Workshop on Efficient Reasoning

Daniel Kaiser, Arnoldo Frigessi, Ali Ramezani-Kebrya, Benjamin Ricaud

Building on the CogniLoad benchmark, this work introduces a novel efficiency metric for LLMs, tokens generated per solved puzzle, to evaluate computational cost alongside accuracy.
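The metric above can be sketched in a few lines (a minimal illustration; the function name, interface, and numbers are my own assumptions, not taken from the paper):

```python
def tokens_per_solved(results):
    """Efficiency metric: total tokens generated divided by the number
    of puzzles solved (lower is better).

    `results` is a list of (tokens_generated, solved) pairs; this
    interface is an illustrative assumption, not the paper's API.
    """
    total_tokens = sum(tokens for tokens, _ in results)
    num_solved = sum(1 for _, solved in results if solved)
    # Undefined when nothing is solved; report infinite cost per solve.
    return total_tokens / num_solved if num_solved else float("inf")

# Hypothetical example: 3 attempts, 600 tokens total, 2 solved.
runs = [(150, True), (250, False), (200, True)]
print(tokens_per_solved(runs))  # 300.0
```

Unlike plain accuracy, this metric penalizes models that reach correct answers only through long, token-hungry reasoning traces.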

Amir Amel-Zadeh, Jan-Peter Calliess, Daniel Kaiser, Stephen Roberts

Authors are listed in alphabetical order. Please refer to my thesis for full details.

Investigates the application of machine learning methods to forecasting stock movements. The first study to successfully apply machine learning to quantitative financial statement data.

Academic Achievements

Ranked in the top 1.5% of the most downloaded authors on SSRN.

Received a full scholarship for research at the Oxford-Man Institute for Quantitative Finance.

Second place at the CQA Fall 2020 conference academic competition.

Commendation from MPLS Division (Oxford) for exceptional viva performance.

Completed the MSc (by Research) degree in half the standard time.

Graduated from the BSc program among the top 1% by study speed and GPA.

Won the Austrian foreign-language competition in English three years in a row.