Haoyu Wang


I am an Assistant Professor of Computer Science at SUNY Albany. Before joining SUNY Albany, I received my Ph.D. from the School of Electrical and Computer Engineering at Purdue University, advised by Prof. Jing Gao. I received my B.Eng. degree from the University of Electronic Science and Technology of China, advised by Prof. Defu Lian, and my M.S. degree from SUNY Buffalo. Feel free to drop me an email if you are interested in my research or would like to collaborate.


Awards

  • Future Leaders in Data Science and Artificial Intelligence, University of Michigan - Ann Arbor, 2024

  • Bilsland Dissertation Fellowship in the School of ECE, Purdue University

  • Distinguished Paper Award, AAAI 2023

  • Ross Fellowship in the School of ECE, Purdue University

News

May 20, 2025 Recruiting PhDs and Interns: I am seeking students for Ph.D. positions starting in Spring/Fall 2026, as well as for research intern roles. Please email me your CV and a brief description of your preferred research topics, and mark the subject line with [PhD/Research Intern Application]. NOTE: Unfortunately, due to the large volume of emails, I won't be able to reply to all of them, but I still encourage you to apply.
May 15, 2025 Our paper “RoseRAG: Robust Retrieval-augmented Generation with Small-scale LLMs via Margin-aware Preference Optimization” was accepted at ACL’25 Findings.
May 01, 2025 Our paper “Mitigating Heterogeneous Token Overfitting in LLM Knowledge Editing” was accepted at ICML’25.

Selected Publications

  1. EMNLP’24
    BlendFilter: Advancing Retrieval-Augmented Large Language Models via Query Generation Blending and Knowledge Filtering
    Haoyu Wang, Ruirui Li, Haoming Jiang, Jinjin Tian, Zhengyang Wang, Chen Luo, Xianfeng Tang, Monica Cheng, Tuo Zhao, and Jing Gao
    In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, 2024
  2. EMNLP’24
    RoseLoRA: Row and Column-wise Sparse Low-rank Adaptation of Pre-trained Language Model for Knowledge Editing and Fine-tuning
    Haoyu Wang, Tianci Liu, Ruirui Li, Monica Cheng, Tuo Zhao, and Jing Gao
    In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, 2024
  3. ICML’25
    Mitigating Heterogeneous Token Overfitting in LLM Knowledge Editing
    Tianci Liu, Ruirui Li, Zihan Dong, Hui Liu, Xianfeng Tang, Qingyu Yin, Linjun Zhang, Haoyu Wang, and Jing Gao
    In The Forty-Second International Conference on Machine Learning, 2025
  4. ACL’25
    RoseRAG: Robust Retrieval-augmented Generation with Small-scale LLMs via Margin-aware Preference Optimization
    Tianci Liu*, Haoxiang Jiang*, Tianze Wang, Ran Xu, Yue Yu, Linjun Zhang, Tuo Zhao, and Haoyu Wang
    In Findings of the Association for Computational Linguistics: ACL 2025, 2025