[Email] [Github] [Google Scholar]
I am a third-year Ph.D. student in the Department of Computer Science at the University of Hong Kong (HKU), supervised by Dr. Lingpeng Kong and Dr. Tao Yu in HKUNLP. I received my Master's degree from Fudan University, supervised by Prof. Qi Zhang in the FudanNLP Group, and my Bachelor's degree from Sun Yat-sen University.
My recent research focuses on the following topics:
Text diffusion
Complex reasoning and planning
Interacting with and understanding LLMs
Data synthesis principles
Publications
(*: equal contribution)
Text Diffusion
Implicit Search via Discrete Diffusion: A Study on Chess
Jiacheng Ye, Zhenyu Wu, Jiahui Gao, Zhiyong Wu, Xin Jiang, Zhenguo Li, Lingpeng Kong.
ICLR 2025. [code] [chess-agent]
Beyond Autoregression: Discrete Diffusion for Complex Reasoning and Planning
Jiacheng Ye, Jiahui Gao, Shansan Gong, Lin Zheng, Xin Jiang, Zhenguo Li, Lingpeng Kong.
ICLR 2025. [code]
Scaling Diffusion Language Models via Adaptation from Autoregressive Models
Shansan Gong*, Shivam Agarwal*, Yizhe Zhang, Jiacheng Ye, Lin Zheng, Mukai Li, Chenxin An, Peilin Zhao, Wei Bi, Jiawei Han, Hao Peng, Lingpeng Kong.
ICLR 2025. [code]
Diffusion of Thoughts: Chain-of-Thought Reasoning in Diffusion Language Models
Jiacheng Ye*, Shansan Gong*, Liheng Chen*, Lin Zheng, Jiahui Gao, Han Shi, Chuan Wu, Zhenguo Li, Wei Bi, Lingpeng Kong.
NeurIPS 2024. [code]
In-context Learning
Compositional Exemplars for In-context Learning
Jiacheng Ye, Zhiyong Wu, Jiangtao Feng, Tao Yu, Lingpeng Kong.
ICML 2023. [code]
OpenICL: An Open-Source Framework for In-context Learning
Zhenyu Wu*, Yaoxiang Wang*, Jiacheng Ye*, Jiangtao Feng, Jingjing Xu, Yu Qiao, Zhiyong Wu.
ACL 2023, demo. [code]
Self-adaptive In-context Learning
Zhiyong Wu*, Yaoxiang Wang*, Jiacheng Ye*, Lingpeng Kong.
ACL 2023. [code]
Data Synthesis
G-LLaVA: Solving Geometric Problem with Multi-Modal Large Language Model
Jiahui Gao*, Renjie Pi*, Jipeng Zhang, Jiacheng Ye, Wanjun Zhong, Yufei Wang, Lanqing Hong, Jianhua Han, Hang Xu, Zhenguo Li, Lingpeng Kong.
ICLR 2025. [code]
Generating Data for Symbolic Language with Large Language Models
Jiacheng Ye, Chengzu Li, Lingpeng Kong, Tao Yu.
EMNLP 2023. [code]
Self-Guided Noise-Free Data Generation for Efficient Zero-Shot Learning
Jiahui Gao*, Renjie Pi*, Yong Lin, Hang Xu, Jiacheng Ye, Zhiyong Wu, Xiaodan Liang, Zhenguo Li, Lingpeng Kong.
ICLR 2023, spotlight. [code]
ProGen: Progressive Zero-shot Dataset Generation via In-context Feedback
Jiacheng Ye, Jiahui Gao, Zhiyong Wu, Jiangtao Feng, Tao Yu, and Lingpeng Kong.
EMNLP-Findings 2022. [code]
ZeroGen: Efficient Zero-shot Learning via Dataset Generation
Jiacheng Ye*, Jiahui Gao*, Qintong Li, Hang Xu, Jiangtao Feng, Zhiyong Wu, Tao Yu and Lingpeng Kong.
EMNLP 2022. [code] [PaperDigest Most Influential Papers]
Understanding LLMs
Language Versatilists vs. Specialists: An Empirical Revisiting on Multilingual Transfer Ability
Jiacheng Ye, Xijia Tao, Lingpeng Kong.
Preprint. [code]
Before LLMs
Heterogeneous Graph Neural Networks for Keyphrase Generation
Jiacheng Ye*, Ruijian Cai*, Tao Gui and Qi Zhang.
EMNLP 2021. [code]
Keyphrase Generation with Fine-Grained Evaluation-Guided Reinforcement Learning
Yichao Luo*, Yige Xu*, Jiacheng Ye, Xipeng Qiu and Qi Zhang.
EMNLP-Findings 2021. [code]
TextFlint: Unified Multilingual Robustness Evaluation Toolkit for Natural Language Processing
ACL 2021. [platform] [code] [blog (zh)]
One2Set: Generating Diverse Keyphrases as a Set
Jiacheng Ye, Tao Gui, Yichao Luo, Yige Xu, and Qi Zhang.
ACL 2021. [code] [blog (zh)]
Leveraging Document-Level Label Consistency for Named Entity Recognition
Tao Gui*, Jiacheng Ye*, Qi Zhang, Yaqian Zhou, Yeyun Gong, Xuanjing Huang.
IJCAI 2020. [code]
Uncertainty-Aware Label Refinement for Sequence Labeling
Tao Gui*, Jiacheng Ye*, Qi Zhang, Zhengyan Li, Zichu Fei, Yeyun Gong and Xuanjing Huang.
EMNLP 2020. [code]
Constructing Multiple Tasks for Augmentation: Improving Neural Image Classification with K-means Features
Tao Gui*, Lizhi Qing*, Qi Zhang, Jiacheng Ye, Hang Yan, Zichu Fei and Xuanjing Huang.
AAAI 2020. [code]
Experience
Nov. 2021 - Jul. 2023
Research Intern, Shanghai AI Lab.
Mentor: Lingpeng Kong.
Research on pre-trained language models and text generation.
Jun. 2021 - Nov. 2021
Research Intern, Tencent.
Mentors: Zhihui Lao and Lifeng Wang.
Research on improved pre-ranking paradigms for advertising and recommendation systems.
Aug. 2018 - Dec. 2018
Engineering Intern, NetEase.
Worked on data engineering.
Awards
Outstanding Graduate, Shanghai, 2022.
National Scholarship (1%), Ministry of Education of China, 2021.
Glarun Scholarship of CETC-NRIET (5%), Fudan University, 2020.
Second-Class Scholarship (10%), Sun Yat-sen University, 2016/2017/2018.
Faculty Scholarship (10%), Sun Yat-sen University, 2016/2017.
Services
Conference Reviewer: EMNLP (2022 - 2023), ACL (2022 - 2023), NeurIPS (2023 - 2024), ICLR (2024 - 2025), ICML (2025)