Welcome to the DeepWok Lab
The DeepWok Lab is an ML research group led by Dr. Aaron Zhao. Its members are mainly from Imperial College London and the University of Cambridge.
Members
- Aaron Zhao (Faculty, PI)
- Cheng Zhang (PhD Student, co-supervised with Prof. George Constantinides)
- Victor Zhao (PhD Student, co-supervised with Prof. Pietro Lio)
- Zehui Li (PhD Student, co-supervised with Prof. Guy-Bart Stan)
- Mingzhu Shen (PhD Student, co-supervised with Prof. Christos Bouganis)
- Pedro Gimenes (PhD Student, co-supervised with Prof. George Constantinides)
- Timon Schneider (PhD Student, co-supervised with Prof. Tom Ellis, and Prof. Guy-Bart Stan)
- Eleanor Clifford (PhD Student, co-supervised with Prof. Ross Anderson and Prof. Robert Mullins)
- Can Xiao (PhD Student, co-supervised with Dr. Jianyi Cheng)
- Jeffrey Tsz Hang Wong (PhD Student, co-supervised with Prof. Wayne Luk)
- Hanna Foerster (PhD Student, co-supervised with Prof. Robert Mullins)
Student Projects
Each year, we supervise a number of undergraduate and master's projects at Imperial College London and the University of Cambridge. We also host a large number of summer research internships.
We have finished hiring for the 2024 summer; please only contact me about 2025 MEng/MPhil/MSc/Part II/Part III/Intern projects.
Instead of listing individual projects here, I find it easier to share the research topics we are interested in, so that interested students can reach out to us. Here are some of the topics and larger projects we are continuously working on:
- Project MASE: MASE aims to provide a unified representation for software-defined ML heterogeneous system exploration. We are looking for students interested in working on this project, ideally with a background and interest in ML systems, efficient ML, and ML hardware acceleration.
- Beyond Structured Data: We are interested in projects that involve unstructured and multimodal data, such as graphs, hypergraphs, and combinatorial complexes. We envision these data types as the enabler for the next generation of AI systems that go beyond simple images and text, and we are looking for students who are interested in working on these projects.
- Efficient AI: We are interested in different aspects of efficient AI, including efficient training, efficient inference, efficient model search, and efficient model deployment with state-of-the-art GenAI models (e.g. language and diffusion models).
- System-level AI Safety: With the increasing capability of GenAI models and the growing complexity of AI systems, we are interested in projects that focus on system-level AI safety, including robustness, security, and red-teaming these models to understand new vulnerabilities.
I am also happy to host self-proposed projects if they match the Lab's research interests. Feel free to contact a.zhao@imperial.ac.uk if you would like to do a project with us!
Publications
Year 2024
MD-DiT: Step-aware Mixture-of-Depths for Efficient Diffusion Transformers; M Shen, P Chen, P Ye, G Xia, T Chen, C Bouganis, Y Zhao; NeurIPS 2024 Workshop on Adaptive Foundation Models (NeurIPS 2024, AFM Workshop)
Architectural Neural Backdoors from First Principles; H Langford, I Shumailov, Y Zhao, R Mullins, N Papernot; IEEE Symposium on Security and Privacy 2024 (S&P 2024)
GV-Rep: A Large-Scale Dataset for Genetic Variant Representation Learning; Z Li, V Subasi, G Stan, Y Zhao, B Wang; The Thirty-eighth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (NeurIPS 2024, Datasets and Benchmarks Track)
Absorb & Escape: Overcoming Single Model Limitations in Generating Heterogeneous Genomic Sequences; Z Li, Y Ni, G Xia, W Beardall, A Das, G Stan, Y Zhao; The Thirty-eight Conference on Neural Information Processing Systems (NeurIPS 2024)
AI models collapse when trained on recursively generated data; I Shumailov, Z Shumaylov, Y Zhao, N Papernot, R Anderson, Y Gal; Nature 631 (Front Cover)
Enhancing Node Representations for Real-World Complex Networks with Topological Augmentation; X Zhao, Z Li, M Shen, G Stan, P Lio, Y Zhao; European Conference on Artificial Intelligence (ECAI 2024)
Unlocking the Global Synergies in Low-Rank Adapters; Z Zhang, C Zhang, X Gao, R Mullins, G Constantinides, Y Zhao; ICML 2024 Workshop on Efficient Systems for Foundation Models (ES-FoMo-II)
Optimised Grouped-Query Attention Mechanism for Transformers; Y Chen, C Zhang, X Gao, R Mullins, G Constantinides, Y Zhao; ICML 2024 Workshop on Efficient Systems for Foundation Models (ES-FoMo-II)
HASS: Hardware-Aware Sparsity Search for Dataflow DNN Accelerators; Z Yu, S Sreeram, K Agrawal, J Wu, A Montgomerie-Corcoran, C Zhang, J Cheng, C Bouganis, Y Zhao; The International Conference on Field-Programmable Logic and Applications (FPL 2024)
LQER: Low-Rank Quantization Error Reconstruction for LLMs; C Zhang, J Cheng, G Constantinides, Y Zhao; International Conference on Machine Learning (ICML 2024)
ImpNet: Imperceptible and Blackbox-undetectable Backdoors in Compiled Neural Networks; E Clifford, I Shumailov, Y Zhao, R Anderson, R Mullins; 2nd IEEE Conference on Secure and Trustworthy Machine Learning (SaTML 2024)
Year 2023
Will More Expressive Graph Neural Networks do Better on Generative Tasks?; X Zou, X Zhao, P Lio, Y Zhao; The Second Learning on Graphs Conference (LOG 2023)
Latent Diffusion Model for DNA Sequence Generation; Z Li, Y Ni, T Huygelen, A Das, G Xia, G Stan, Y Zhao; Conference on Neural Information Processing Systems, AI for Science Workshop (NeurIPS 2023, AI for Science Workshop)
MASE: An Efficient Representation for Software-Defined ML Hardware System Exploration; C Zhang, J Cheng, Z Yu, Y Zhao; Conference on Neural Information Processing Systems, Machine Learning for Systems Workshop (NeurIPS 2023, ML for Systems Workshop)
Dynamic Stashing Quantization for Efficient Transformer Training; G Yang, D Lo, R Mullins, Y Zhao; The 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP 2023, Findings)
Revisiting Block-based Quantisation: What is Important for Sub-8-bit LLM Inference?; C Zhang, J Cheng, I Shumailov, G Constantinides, Y Zhao; The 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP 2023)
MiliPoint: A Point Cloud Dataset for mmWave Radar; H Cui, S Zhong, J Wu, Z Shen, N Dahnoun, Y Zhao; Conference on Neural Information Processing Systems (NeurIPS 2023, Datasets and Benchmarks Track)
Revisiting Structured Dropout; Y Zhao, O Dada, X Gao, R Mullins; The 15th Asian Conference on Machine Learning (ACML 2023)
Genomic Interpreter: A Hierarchical Genomic Deep Neural Network with 1D Shifted Window Transformer; Z Li, A Das, WAV Beardall, Y Zhao, GB Stan; The 2023 ICML Workshop on Computational Biology (ICML-WCB 2023, contributed talk, best paper award)
Revisiting Automated Prompting: Are We Actually Doing Better?; Y Zhou, Y Zhao, I Shumailov, R Mullins, Y Gal; Association for Computational Linguistics 2023 (ACL 2023)
Task-Agnostic Graph Neural Network Evaluation via Adversarial Collaboration; X Zhao, H Stärk, D Beaini, P Liò, Y Zhao; ICLR 2023 - Machine Learning for Drug Discovery workshop (ICLR 2023 MLDD workshop)
Augmentation Backdoors; J Rance, Y Zhao, I Shumailov, R Mullins; ICLR 2023 Workshop on Backdoor Attacks and Defenses in Machine Learning (ICLR 2023 BANDS Workshop)
Adaptive Channel Sparsity for Federated Learning under System Heterogeneity; X Gao, D Liao, Y Zhao, C Xu; The IEEE / CVF Computer Vision and Pattern Recognition Conference (CVPR 2023)
Architectural Backdoors in Neural Networks; M Bober-Irizar, I Shumailov, Y Zhao, R Mullins, N Papernot; The IEEE / CVF Computer Vision and Pattern Recognition Conference (CVPR 2023)
Year 2022
Revisiting Embeddings for Graph Neural Networks; S Purchase, Y Zhao, R Mullins; The First Learning on Graphs Conference (LOG 2022)
Wide Attention Is The Way Forward For Transformers; J R Brown, Y Zhao, I Shumailov, R Mullins; All Things Attention: Bridging Different Perspectives on Attention, Oral (NeurIPS 2022 Workshop)
DARTFormer: Finding The Best Type Of Attention; J R Brown, Y Zhao, I Shumailov, R Mullins; ICBINB (NeurIPS 2022 Workshop)
Rapid Model Architecture Adaption for Meta-Learning; Y Zhao, X Gao, I Shumailov, N Fusi, R Mullins; Advances in Neural Information Processing Systems 35 (NeurIPS 2022)
DAdaQuant: Doubly-adaptive quantization for communication-efficient Federated Learning; R Hönig, Y Zhao, R Mullins; International Conference on Machine Learning (ICML 2022)
Past and Current Students
Academic Year 2023/2024
- Henry Li (Summer Research Intern, University of Cambridge)
- Ben Zhang (Summer Research Intern, University of Cambridge)
- Harry Langford (Summer Research Intern, University of Cambridge)
- Sanjit Raman (Summer Research/Teaching Intern, Imperial College London)
- Kevin Lau (Summer Research/Teaching Intern, Imperial College London)
- Xiandong Zou (Summer Research Intern, Imperial College London)
- Roshan Aekote (Summer Research Intern, Imperial College London)
- Li Wang (MSc Project, Imperial College London)
- Charles Jin (MSc Project, Imperial College London)
- Yichen Li (MSc Project, Imperial College London)
- Przemyslaw Forys (MSc Project, Imperial College London)
- Yuhe Zhang (Final Year Project, Imperial College London)
- Bryan Tan (Final Year Project, Imperial College London)
- Balint Szekely (Final Year Project, Imperial College London)
- Derek Lai (Final Year Project, Imperial College London)
- Bakhtiar Mohammadzadeh (Final Year Project, Imperial College London)
- Tsz Hang Wong (Final Year Project, Imperial College London)
- Ben Zhang (Part II Project, University of Cambridge)
- Bradley Chen (Part II Project, University of Cambridge)
- Kate Liang (Part II Project, University of Cambridge)
Academic Year 2022/2023
- David Gyulamiryan (Summer Research Intern, University of Cambridge)
- Eduard Burlacu (Summer Research Intern, University of Cambridge)
- Harry Langford (Summer Research Intern, University of Cambridge)
- Ben Zhang (Summer Research Intern, University of Cambridge)
- Leah He (Summer Research Intern, University of Cambridge)
- Junyi Wu (Summer Research Intern, Imperial College London)
- Harry Ni (Summer Research Intern, Imperial College London)
- Xiandong Zou (Summer Research Intern, Imperial College London)
- Anthony Bolton (Summer Research Intern, Imperial College London)
- Aaron Thomas (Summer Research Intern, Imperial College London)
- Sudarshan Sreeram (Summer Research Intern, Imperial College London)
- Diego Van Overberghe (Summer Research Intern, Imperial College London)
- Bryan Tan (Summer Research Intern, Imperial College London)
- Tsz Hang Wong (Summer Research Intern, Imperial College London)
- Aman Vernekar (Summer Research Intern, University of Cambridge)
- Haoliang Shang (BEng Project, Imperial College London / ETH Zurich)
- Jacky Choi (BEng Project, Imperial College London / ETH Zurich)
- Can Xiao (MSc Project, Imperial College London)
- Sheng Luo (MSc Project, Imperial College London)
- Chuiyu Wang (MSc Project, Imperial College London)
- Pedro Gimenes (Final Year Project, Imperial College London)
- Nickolaos Ilioudis (Final Year Project, Imperial College London)
- Issa Bqain (Final Year Project, Imperial College London)
- Tobias Cook (Final Year Project, Imperial College London)
- Peter Barabas (Final Year Project, Imperial College London)
- Ritvik Shyam (Final Year Project, Imperial College London)
- Harry Knighton (Part II Project, University of Cambridge)
- Fredrik Ekholm (Part II Project, University of Cambridge)
- Thomas Yuan (Part II Project, University of Cambridge)
- Kyra Zhou (Part II Project, University of Cambridge)
Academic Year 2021/2022
- Eleanor Clifford (Summer Research Intern, University of Cambridge)
- Joseph Rance (Summer Research Intern, University of Cambridge)
- Victor Zhao (Summer Research Intern, University of Cambridge)
- Skye Purchase (Summer Research Intern, University of Cambridge)
- Cindy Wu (Summer Research Intern, University of Cambridge)
- Guo Yang (Summer Research Intern, University of Cambridge)
- Prisha Satwani (Summer Research Intern, LSE)
- Jason Brown (Summer Research Intern, University of Cambridge)