CV
Thank you for your interest in my experience. Here is my full CV with more details. My transcripts for both undergraduate and graduate studies are also available here. For a brief summary of my previous research experience, please refer to these slides (last updated April 2024).
Education
- MS in Data Science and Information Technology, Tsinghua University, China (2020-2023)
- BEng in Information Security, Wuhan University, China (2016-2020)
*Note: Before transferring to the School of Cyber Science and Engineering in 2018, I spent two years in the School of Computer Science and Engineering at Wuhan University.
National Awards
- National Scholarship (10/2018)
- Second Prize, National College Student Information Security Contest (08/2019)
- Cyber Security Scholarship (10/2019) [Scholarship Details Page][Awardee List]
Work Experience
- Fall 2023: Research Assistant
- Tsinghua-Berkeley Shenzhen Institute
- Duties: research and team management
- Supervisor: Prof. Wenbo Ding
- Summer 2022: Research Intern
- Meituan UAV Lab
- Contributions: a 2D-to-3D platform for large-scale UAV simulation
- Mentor: Dr. Tianjian Chen
- Summer 2021: Research Intern
- Tencent Robotics-X Lab
- Contributions: an open-source JAX-based rigid-body dynamics algorithm library
- Mentor: Dr. Cheng Zhou
English Proficiency
- IELTS: Overall Band 8.0
Skills
- Coding
- Python/C/C++/CUDA
- PyTorch/TensorFlow
- Ubuntu/Docker
- Academic
- LaTeX/Zotero
- Paper Writing
- Presentation
Publications
- Mao, Y., Ping, S., Zhao, Z., Liu, Y., & Ding, W. (2024). Enhancing parameter efficiency and generalization in large-scale models: A regularized and masked low-rank adaptation approach. arXiv preprint arXiv:2407.12074 [paper]
- Li, J., Zhao, C., Mao, Y., Chen, X., Ding, W., Qu, X., & Wang, J. (2024). FormerReckoning: Physics inspired transformer for accurate inertial navigation. In the 7th International Workshop on Physics Embedded AI Solutions in Mobile Computing (MobiCom Picasso Workshop 2024)
- Ping, S.*, Mao, Y.*, Liu, Y., Zhang, X. P., & Ding, W. (2024). FL-TAC: Enhanced fine-tuning in federated learning via low-rank, task-specific adapter clustering. In ICLR 2024 Workshop on Large Language Model (LLM) Agents [paper]
- Zhao, Z.*, Mao, Y.*, Shi, Z., Liu, Y., Lan, T., Ding, W., & Zhang, X. P. (2023). AQUILA: Communication efficient federated learning with adaptive quantization in device selection strategy. IEEE Transactions on Mobile Computing (TMC). DOI: 10.1109/TMC.2023.3332901 [paper]
- Mao, Y., Zhao, Z., Yang, M., Liang, L., Liu, Y., Ding, W., Lan, T., & Zhang, X. P. (2023). SAFARI: Sparsity-enabled federated learning with limited and unreliable communications. IEEE Transactions on Mobile Computing (TMC). DOI: 10.1109/TMC.2023.3296624 [paper]
- Zhao, Z., Mao, Y., Liu, Y., Song, L., Ouyang, Y., Chen, X., & Ding, W. (2023). Towards efficient communications in federated learning: A contemporary survey. Journal of the Franklin Institute (JFI), 360(12), 8669-8703. DOI: 10.1016/j.jfranklin.2022.12.053 [paper]
- Mao, Y., Zhao, Z., Yan, G., Liu, Y., Lan, T., Song, L., & Ding, W. (2022). Communication-efficient federated learning with adaptive quantization. ACM Transactions on Intelligent Systems and Technology (TIST), 13(4), 1-26. DOI: 10.1145/3510587 [paper][video]
(* indicates equal contribution)
Services and Leadership
- Reviewer Services: UbiComp, TIST, TMC
- Currently leading an active and productive research group of seven master's students