Robust knowledge distillation training pipeline that improves the predictive performance of lightweight vision models.
Python
2023
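The entry above names knowledge distillation without detailing it. As background only, a minimal sketch of the standard distillation objective (hard-label cross-entropy blended with temperature-softened KL divergence, after Hinton et al., 2015) might look like the following; the function names, `temperature`, and `alpha` values are illustrative, not taken from the actual pipeline:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, true_label,
                      temperature=4.0, alpha=0.5):
    """Blend hard-label cross-entropy with soft-target KL divergence.

    alpha weights the hard loss; (1 - alpha) weights the soft loss,
    which is scaled by T^2 so its gradient magnitude stays comparable.
    """
    # Hard loss: cross-entropy against the ground-truth label.
    p_student = softmax(student_logits)
    hard = -math.log(p_student[true_label])

    # Soft loss: KL(teacher || student) at temperature T.
    p_t = softmax(teacher_logits, temperature)
    p_s = softmax(student_logits, temperature)
    soft = sum(pt * math.log(pt / ps) for pt, ps in zip(p_t, p_s))

    return alpha * hard + (1 - alpha) * (temperature ** 2) * soft
```

When the student's logits match the teacher's, the soft term vanishes, so the loss reduces to plain cross-entropy weighted by `alpha`.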
First-ever Kaggle competition. Submitted solutions ranked in the top 36% on the leaderboard track and the top 9% on the efficiency track, out of more than 2.5k participants globally.
2022
Optimized BERT model predictive performance through transfer learning, data augmentation, and knowledge distillation.
2021
Data Science Consulting
Consulted for a Malaysia-based SME, using advanced data analytics to identify key findings and recommend ways to better target sales efforts and optimize monthly purchasing.