thinking about distributed, privacy-preserving and collaborative ml 🤔


ML Research @ University of Cambridge & Flower Labs;
community lead: ml-theory @ cohere labs.

Research
Read research overview

At present, my machine learning interests centre on distributed, privacy-preserving, efficient and collaborative machine learning. In particular, I have a great affinity for paradigms such as federated learning and neural network compression, and their interplay with optimisation theory. Generally speaking, my motivation is that I am sceptical of the promise of centralised machine learning paradigms, which require enormous amounts of data and computational resources to achieve their state-of-the-art performance. Instead, I believe there is much benefit to be derived from investigating alternative paradigms that can scale while inherently preserving privacy, supported by sound mathematical foundations.

If you find any of my research interesting, or you would like to collaborate on any ideas, please do not hesitate to reach out.

year title authors tags paper code misc
2026 LoRDO: Distributed Low-Rank Optimization with Infrequent Communication jovanović, iacob, safaryan, modoranu, sani, shen, qiu, alistarh, lane 🌳📈 arXiv
2025 MT-DAO: Multi-Timescale Distributed Adaptive Optimizers with Local Updates iacob, jovanović, safaryan, kurmanji, sani, horváth, shen, qiu, lane 🌳📈 ICLR 2026
2025 DES-LOC: Desynced Low Communication Adaptive Optimizers for Training Foundation Models iacob, sani, safaryan, giampouras, horváth, jovanović, kurmanji, aleksandrov, shen, qiu, lane 🌳📈 ICLR 2026
2025 Position: It's Time to Act on the Risk of Efficient Personalized Text Generation iofinova, jovanović, alistarh 🔖 arXiv
2025 Panza: Design and Analysis of a Fully-Local Personalized Text Writing Assistant nicolicioiu, iofinova, jovanović, kurtic, nikdan, panferov, markov, shavit, alistarh 💻🗣️🌀 CPAL 2026 and ICLR 2025 Workshop on Foundation Models in the Wild bonus video
2024 Second-Order Optimisation and Imbalanced Class Distribution in Emotional Analysis jovanović 🌀📈 MPhil Thesis @ Cambridge
2023 Rumour Detection in the Wild: A Browser Extension for Twitter jovanović & ross 💻🗣️🌀 NLP-OSS @ EMNLP2023 ACL
2023 EDAC: Efficient Deployment of Audio Classification Models For COVID-19 Detection jovanović*, mihaly* & donaldson* 🌳🌀 arXiv preprint

resources + posts



education

experience

  • May 2025 - Oct 2025

    research assistant | camlsys & flower labs

    Conducting research on federated learning applied to the (pre)training of large-scale models.

  • Oct 2024 - March 2025

    research assistant | institute of science and technology austria (Alistarh Group)

    Working as a research assistant under the supervision of Dan Alistarh in the Distributed Algorithms and Systems (DAS) group, focusing on compressed federated (pre)training of foundation models.

  • Summer 2023

    applied ai research intern | jp morgan chase & co. (Applied Innovation of Artificial Intelligence (AI2))

    Developed unsupervised learning methods for anomaly detection, together with an associated parallel data-processing pipeline, improving on the previous system's performance by 30%.

  • Summer 2022

    ml eng intern | jp morgan chase & co. (Applied Innovation of Artificial Intelligence (AI2))

    Pursued independent research in natural language processing, building a search engine powered by dense passage retrieval with question-answering capabilities. Also assisted researchers in delivering proof-of-concept projects and onboarded them to CI/CD platforms.

collaboration
view here

I have been fortunate to work with many kind and talented individuals throughout my career so far. In particular, thank you to the following people for inspiring me (in no particular order):
alessandro palmarini
simon yu
sree harsha nelaturu