Hi! I currently work at Alibaba DAMO Academy. I received my Ph.D. from the joint program of ShanghaiTech University and the University of Chinese Academy of Sciences, where I was very fortunate to be advised by Prof. Kewei Tu. I am interested in machine learning and natural language processing.
My current research mainly focuses on entity understanding, information retrieval (query/document understanding), language model pretraining, multilingual NLP, and structured prediction. I also ship these cutting-edge technologies to real products and platforms.
During my Ph.D., I mainly worked on learning latent variable models for NLP and ML problems.
Spotlight of our recent work:
We have research intern positions available at Alibaba DAMO Academy. If you are interested in NLP and ML, please feel free to contact me: jiangyong.ml@gmail.com.
Ph.D. in Computer Science, 2019
ShanghaiTech University
Ph.D. in Computer Science, 2019
University of Chinese Academy of Sciences
A SOTA system that performs entity typing over 10k entity types.
We utilize Wikipedia to improve the RaNER model, which won the SemEval 2022 competition and received the best system paper award.
Our first work on entity linking. Stay tuned for follow-up work.
This paper achieves SOTA performance on 24 datasets across 6 tasks (NER, POS tagging, chunking, dependency parsing, semantic parsing, and aspect extraction), following the More Embeddings, Better Sequence Labelers paper.
The first retrieval-augmented NER (RaNER) system, achieving SOTA performance across multiple domains.
One of my favorite works on cross-lingual structured prediction. The idea is super intuitive.
The current SOTA model for unsupervised dependency parsing.
One model for multiple languages.
The first autoencoder approach to unsupervised dependency parsing.
The first neural approach to unsupervised dependency parsing.
Conference Program Committee/Reviewer:
2022: AAAI, ICLR
2021: AAAI, CCL, EACL, EMNLP, ICLR, ICML, NAACL, NeurIPS
2020: AAAI, AACL, EMNLP, IJCAI, NeurIPS
2019: AAAI, ACL, EMNLP, NAACL
I am very lucky to collaborate with the following research interns and co-mentored students:
Xinyu Wang (ShanghaiTech, 2019.10-Now): 10 papers published during internship, including 4 ACL papers, 3 EMNLP papers, and 1 NAACL paper.
Jiong Cai (ShanghaiTech, 2020.10-Now): EMNLP 2017, SemEval 2022, EMNLP 2022
Chengyue Jiang (ShanghaiTech, 2021.8-Now): EMNLP 2022
Wei Liu (ShanghaiTech, 2022.7-Now)
Zixia Jia (ShanghaiTech, 2022.7-Now)
Chaoyi Ai (ShanghaiTech, 2022.7-Now)
Yinghui Li (THU, 2022.6-Now)
Yuchen Zhai (ZJU, 2022.1-Now)
Zeqi Tan (ZJU, 2022.5-Now)
Xiaoze Liu (ZJU, 2022.9-Now)
Xin Zhang (TJU, 2021.11-2022.4, 2022.11-Now): COLING 2022
Yupeng Zhang (BUAA, 2022.11-Now)
Xuming Hu (THU, 2021.8-2022.10)
Jinyuan Fang (SYSU, 2022.4-2022.10)
Zhichao Lin (TJU, 2022.5-2022.10)
Yu Zhang (SUDA, 2020.7-2021.3): COLING 2022
Zechuan Hu (ShanghaiTech, 2019.1-2021.7): 3 papers published, including 2 ACL papers and 1 EMNLP paper.
Yongliang Shen (ZJU, 2021.11-2022.3): SemEval 2022
Tao Ji (ECNU, 2020-2021): 2 EMNLP papers.
Xinyin Ma (ZJU, 2021): 1 EMNLP paper.
Jun Mei (ShanghaiTech, 2017-2018): AAAI 2018
Songlin Yang (ShanghaiTech, 2019-2020): COLING 2020
Yunzhe Yuan (ShanghaiTech, 2018-2019): AAAI 2019
Jun Li (ShanghaiTech, 2018-2019): ACL 2020
We are hiring research interns at Alibaba DAMO Academy. Please send me an email if you are interested!
I closely collaborate(d) with the following researchers:
Nguyen Bach, Wenjuan Han, Fei Huang, Zhongqiang Huang, Kewei Tu, Pengjun Xie.