I am a final-year Ph.D. student at Michigan State University supervised by Dr. Jiliang Tang. I also collaborate with Neil Shah, Yao Ma, Michael Galkin, and Rongrong Wang, each of whom has offered a unique perspective on scientific thinking throughout my research journey. I am an organizer of the Learning on Graphs Conference 2024, an exciting new conference focused on machine learning over graphs and geometry. The call for papers is available here, with a submission deadline of September 11th, 2024. Do not hesitate to submit your work!

My research focuses on developing better graph deep learning models (especially Graph Foundation Models, GFMs) grounded in network science insights. Check our recent blog and paper for more details. I believe GFMs can be the next graph learning paradigm, and the key to achieving scaling behavior is to cleverly leverage graph transferability principles drawn from network analysis and expressiveness. Along my research journey, I address the following research questions:

  • What are the drawbacks of current GNNs?
  • What can be transferred across diverse graphs that seem unrelated, ranging from social networks to molecular graphs?
  • Can GFMs benefit from pre-training on large-scale graph data and large model scale?
  • How can we develop practical GFMs leveraging LLMs with rich knowledge and emerging world-understanding capabilities?
  • Is there a universal structure space among graphs from different domains?
  • What should an ideal GFM look like? A good vocabulary design paired with a Transformer could be a viable solution.
  • How can we apply GFMs to industry-scale graphs, balancing effectiveness and efficiency?
  • Can we adapt general GFMs to specific AI4Science applications?
  • What are the unique advantages of GFMs over FMs in other modalities? How can we combine FMs toward more general intelligence?

Selected Publications

The order indicates my personal preference.

  • Position: Graph Foundation Models Are Already Here
    Haitao Mao*, Zhikai Chen*, Wenzhuo Tang, Jianan Zhao, Yao Ma, Tong Zhao, Neil Shah, Mikhail Galkin, Jiliang Tang
    ICML 2024 Spotlight (335/9473)
    Collaboration with Snapchat and Intel Labs
    [pdf] [Primary Blog] [Blog] [Reading List 1] [Reading List 2]

  • On the Intrinsic Self-Correction Capability of LLMs: Uncertainty and Latent Concept
    Haitao Mao*, Guangliang Liu*, Bochuan Cao, Zhiyu Xue, Kristen Johnson, Jiliang Tang, Rongrong Wang
    Preprint [pdf]

  • Revisiting Link Prediction: A Data Perspective
    Haitao Mao, Juanhui Li, Harry Shomer, Bingheng Li, Wenqi Fan, Yao Ma, Tong Zhao, Neil Shah, Jiliang Tang
    ICLR 2024
    Collaboration with Snapchat
    [pdf] [Slides] [Video]

  • Cross-Domain Graph Data Scaling: A Showcase with Diffusion Models
    Wenzhuo Tang, Haitao Mao, Danial Dervovic, Ivan Brugere, Saumitra Mishra, Yuying Xie, Jiliang Tang
    Collaboration with JP Morgan
    Preprint [pdf]

  • Demystifying Structural Disparity in Graph Neural Networks: Can One Size Fit All?
    Haitao Mao, Zhikai Chen, Wei Jin, Haoyu Han, Yao Ma, Tong Zhao, Neil Shah, Jiliang Tang
    NeurIPS 2023
    Collaboration with Snapchat
    [pdf] [Code] [Slides] [Poster] [Video]

  • Neuron Campaign for Initialization Guided by Information Bottleneck Theory
    Haitao Mao, Xu Chen, Qiang Fu, Lun Du, Shi Han, Dongmei Zhang
    CIKM 2021 Best Short Paper (1/626)
    Work during internship at Microsoft Research Asia
    [pdf] [Code] [Blog] [Chinese Blog] [Poster] [Slides] [Video]

  • Text-space Graph Foundation Models: A Comprehensive Benchmark and New Insights
    Zhikai Chen, Haitao Mao, Jingzhe Liu, Yu Song, Bingheng Li, Wei Jin, Bahare Fatemi, Anton Tsitsulin, Bryan Perozzi, Hui Liu, Jiliang Tang
    Collaboration with Google
    Preprint [pdf] [Code]

  • A Data Generation Perspective to the Mechanism of In-Context Learning
    Haitao Mao, Guangliang Liu, Yao Ma, Rongrong Wang, Jiliang Tang
    Preprint [pdf]

  • Exploring the Potential of Large Language Models (LLMs) in Learning on Graphs
    Zhikai Chen, Haitao Mao, Hang Li, Wei Jin, Hongzhi Wen, Xiaochi Wei, Shuaiqiang Wang, Dawei Yin, Wenqi Fan, Hui Liu, Jiliang Tang
    SIGKDD Explorations 2023
    Collaboration with Baidu
    [pdf] [Code] [Slides]

  • Source Free Graph Unsupervised Domain Adaptation
    Haitao Mao, Lun Du, Yujia Zheng, Qiang Fu, Zelin Li, Xu Chen, Shi Han, Dongmei Zhang
    WSDM 2024 Best Paper Honorable Mention (3/615)
    Work during internship at Microsoft Research Asia
    [pdf] [Blog] [Code]

Awards

  • WSDM 2024 Best Paper Honorable Mention Award (First Author) (3/615)
  • CIKM 2021 Best Short Paper Award (First Author) (1/626)
  • ICML 2024 Spotlight (First Author) (335/9473)
  • WSDM 2024 Student Travel Award
  • NeurIPS 2023 Scholar Award
  • Excellent Student of Higher Education in Sichuan Province (30/763)
  • Outstanding Graduate of the University of Electronic Science and Technology of China (74/763)
  • Star of Tomorrow Intern Award at Microsoft Research Asia (Top 10%)
  • National First Prize in the Chinese Software Cup (20/45,000) [GitHub]

Professional Experience

  • Visiting Scholar at Hong Kong Polytechnic University (March 2023 - September 2023): mentored by Research Assistant Professor Wenqi Fan and Professor Qing Li
  • Research Intern at Baidu (March 2022 - September 2022): Search Strategy Department, mentored by Dr. Lixin Zou
  • Research Intern at Microsoft Research Asia (January 2021 - November 2021)

Support

This page is supported by Hanlin Lan, one of my best friends from my undergraduate years. Thanks for his great help.