Research Projects

My recent projects in deep learning, continual learning, and neural network interpretability.

Generative Sample Removal in Continual Learning (Published at the CVPR 2025 SynData4CV Workshop)

We investigate whether synthetic data generated by GANs and diffusion models can replace natural samples in continual task learning. Our EpochLoss strategy removes uninformative training examples, improving both adversarial robustness and generalization.
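One plausible reading of the EpochLoss idea is to track each sample's loss across training epochs and discard those whose loss stays consistently near zero, since they contribute little signal. The sketch below illustrates that reading; the function name, the mean-loss ranking, and the `keep_fraction` parameter are assumptions for illustration, not the paper's actual algorithm.

```python
import numpy as np

def epochloss_filter(per_epoch_losses, keep_fraction=0.8):
    """Hypothetical sketch of an EpochLoss-style filter.

    per_epoch_losses: array of shape (n_epochs, n_samples), the loss of
    each training sample at each epoch. Samples are ranked by their mean
    loss across epochs; the lowest-loss (least informative) samples are
    dropped. Returns the indices of the samples to keep.
    """
    mean_loss = per_epoch_losses.mean(axis=0)       # average difficulty per sample
    n_keep = int(len(mean_loss) * keep_fraction)
    # Keep the highest-loss (most informative) samples.
    return np.argsort(mean_loss)[::-1][:n_keep]

# Toy example: 3 epochs, 5 samples; samples 1 and 3 are trivially easy.
losses = np.array([[0.9, 0.10, 0.5, 0.05, 0.7],
                   [0.8, 0.05, 0.4, 0.02, 0.6],
                   [0.7, 0.02, 0.3, 0.01, 0.5]])
kept = epochloss_filter(losses, keep_fraction=0.6)  # drops samples 1 and 3
```

In a real continual-learning pipeline the kept indices would then select which natural samples to replace with synthetic ones, but that wiring depends on details not stated here.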

Network Compression

Explaining Deep Network Compression via Latent Spaces (Published in ACM Transactions on Probabilistic Machine Learning)

A theoretical framework that explains DNN pruning through information-theoretic divergences between the latent spaces of the original and pruned networks. It introduces two novel projection metrics, AP2 and AP3, and validates them on ResNet, VGG16, and ViT architectures.
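The definitions of AP2 and AP3 are not reproduced here, so the sketch below only illustrates the general shape of such a metric: fit a simple distribution to the latent activations of the original and pruned networks and measure a divergence between the fits. The diagonal-Gaussian symmetric-KL choice is a generic stand-in, not the paper's metric.

```python
import numpy as np

def latent_divergence(z_orig, z_pruned, eps=1e-6):
    """Symmetric KL divergence between diagonal-Gaussian fits of two
    batches of latent activations (rows = samples, cols = latent dims).
    A generic stand-in for an information-theoretic pruning metric.
    """
    mu1, var1 = z_orig.mean(0), z_orig.var(0) + eps
    mu2, var2 = z_pruned.mean(0), z_pruned.var(0) + eps
    kl12 = 0.5 * np.sum(np.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1)
    kl21 = 0.5 * np.sum(np.log(var1 / var2) + (var2 + (mu2 - mu1) ** 2) / var1 - 1)
    return 0.5 * (kl12 + kl21)

rng = np.random.default_rng(0)
z_a = rng.normal(0.0, 1.0, size=(256, 8))  # original network's latents
z_b = rng.normal(0.0, 1.0, size=(256, 8))  # well-pruned: same distribution
z_c = rng.normal(2.0, 1.0, size=(256, 8))  # badly pruned: shifted latents
d_good = latent_divergence(z_a, z_b)       # small divergence
d_bad = latent_divergence(z_a, z_c)        # large divergence
```

The intuition this captures: a good pruning preserves the latent distribution, so its divergence from the original stays small, while a destructive pruning shifts it and the divergence grows.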