I recently completed my PhD at UNSW Sydney, where I was advised by Dr. Lina Yao and Dr. Dong Gong.
Prior to that, I completed my Erasmus Mundus Joint Master's Degree in Advanced Systems Dependability at the University of St Andrews, UK, and l'Université de Lorraine, France. During my master's, I interned with the MULTISPEECH group at Inria Nancy, where I worked with Dr. Emmanuel Vincent.
My current research interests span multimodal generative models, agentic frameworks, and continual learning.
Mentor: Dr. Shengju Qian
Worked on controllable image generation and preference optimization for multimodal LLMs.
Mentor: Dr. Shiqi Yang
Worked on continual personalization of pre-trained text-to-image diffusion models.
Supervisor: Dr. Joost van de Weijer
Worked on rehearsal-free continual learning (CL) for Vision Transformers (ViTs).
Supervisor: Dr. Emmanuel Vincent
Worked on learning domain-specific language models for speech recognition.
Mentor: Keval Dave
Worked on improving FactSet's named entity recognition service.
ICLR 2025
We propose using diffusion classifier scores to regularize the parameter space and function space of text-to-image diffusion models, enabling continual personalization.
Our work proposes Continual LeArning with Probabilistic finetuning (CLAP), a probabilistic modeling framework over visual-guided text features per task, thus providing more calibrated CL finetuning.
We propose a neural process-based continual learning approach with task-specific modules arranged in a hierarchical latent variable model. We tailor regularizers on the learned latent distributions to alleviate forgetting.
ICCV 2023
To model attribute-object entanglement, we design a reverse-and-distill strategy that learns disentangled representations of the elementary components in the training data, supervised by reverse attention and knowledge distillation.
We investigate the continual learning of Vision Transformers (ViTs) in the challenging exemplar-free scenario, with a special focus on how to efficiently distill the knowledge of their crucial self-attention mechanism.