
I was a postdoc at the Singapore-MIT Alliance for Research and Technology, working on neurosymbolic methods for efficient adaptation. I did my PhD at EPFL, supervised by Amir Zamir, on making models more reliable under changing environments. In a past life, I was a quant in New York and London, building systematic investment strategies for equity portfolios (or, less charitably, working as a glorified coin flipper).
Controlled Training Data Generation with Diffusion Models
T. Yeo*, A. Atanov*, H. Benoit^, A. Alekseev^, R. Ray, P. Esmaeil Akhoondi, A. Zamir
ViPer: Visual Personalization of Generative Models via Individual Preference Learning
S. Salehi, M. Shafiei, R. Bachmann, T. Yeo, A. Zamir
🔍 Spotlight
4M: Massively Multimodal Masked Modeling
D. Mizrahi, R. Bachmann, O. F. Kar, T. Yeo, M. Gao, A. Dehghan, A. Zamir
Rapid Network Adaptation: Learning to Adapt Neural Networks Using Test-Time Feedback
T. Yeo, O. F. Kar, Z. Sodagar, A. Zamir
Task Discovery: Finding the Tasks that Neural Networks Generalize on
A. Atanov, A. Filatov, T. Yeo, A. Sohmshetty, A. Zamir
🎤 Oral
3D Common Corruptions and Data Augmentation
O. F. Kar, T. Yeo, A. Atanov, A. Zamir
🎤 Oral
Robust Learning Through Cross-Task Consistency
A. Zamir*, A. Sax*, T. Yeo, O. F. Kar, N. Cheerla, R. Suri, Z. Cao, J. Malik, L. Guibas
🎤 Oral
Iterative Classroom Teaching
T. Yeo, P. Kamalaruban, A. Singla, A. Merchant, T. Asselborn, L. Faucon, P. Dillenbourg, V. Cevher
EPFL
Ph.D. in Computer Science
Advisors: Amir Zamir, Pierre Dillenbourg
Thesis: Making Computer Vision Models Robust and Adaptive
University of Cambridge
M.Phil. in Machine Learning and Machine Intelligence
Thesis: Bayesian Optimization for Natural Language Processing
Postdoctoral Researcher — Singapore-MIT Alliance for Research and Technology
Teaching Assistant — EPFL
Spring 2018, 2019, 2020: EE559 Deep Learning
Fall 2019: CS433 Machine Learning