About Me
I am currently a third-year PhD student in the Department of Computing at The Hong Kong Polytechnic University, advised by Prof. Jing Li. Before that, I received my B.S. and M.S. degrees in Computer Science from Jilin University, where I was supervised by Prof. Yi Chang. During my graduate studies, I also worked closely with Prof. Lan Du from Monash University.
What's new:
[09/2024] Our paper "CoSafe: Evaluating Large Language Model Safety in Multi-Turn Dialogue Coreference" has been accepted by EMNLP 2024. Hope to see you in Miami.
[05/2024] Check our latest work on LLM coreference safety: "CoSafe: Evaluating Large Language Model Safety in Multi-Turn Dialogue Coreference".
[05/2024] Our paper "RePALM: Popular Quote Tweet Generation via Auto-Response Augmentation" has been accepted by Findings of ACL 2024.
[02/2024] Our paper "PopALM: Popularity-Aligned Language Models for Social Media Trendy Response Prediction" has been accepted by COLING 2024.
[10/2022] Our paper "Learning Semantic Textual Similarity via Topic-informed Discrete Latent Variables" has been accepted by EMNLP 2022.
[09/2022] Our paper "Towards Unified Representations of Knowledge Graph and Expert Rules for Machine Learning and Reasoning" has been accepted for presentation at AACL-IJCNLP 2022.
[10/2020] Our work on distantly supervised relation extraction was accepted by COLING 2020!
Publications
- “CoSafe: Evaluating Large Language Model Safety in Multi-Turn Dialogue Coreference.”, Erxin Yu, Jing Li, Ming Liao, Siqi Wang, Zuchen Gao, Fei Mi, Lanqing Hong, in EMNLP 2024 [paper]
- “RePALM: Popular Quote Tweet Generation via Auto-Response Augmentation.”, Erxin Yu, Jing Li, Chunpu Xu, in Findings of ACL 2024.
- “PopALM: Popularity-Aligned Language Models for Social Media Trendy Response Prediction.”, Erxin Yu, Jing Li, Chunpu Xu, in COLING 2024 [paper]
- “Learning Semantic Textual Similarity via Topic-informed Discrete Latent Variables.”, Erxin Yu, Lan Du, Yuan Jin, Zhepei Wei, Yi Chang, in EMNLP 2022 [paper]
- “Towards Unified Representations of Knowledge Graph and Expert Rules for Machine Learning and Reasoning.”, Zhepei Wei, Yue Wang, Jinnan Li, Zhining Liu, Erxin Yu, Yuan Tian, Xin Wang, Yi Chang, in AACL-IJCNLP 2022 [paper]
- “ToHRE: A Top-Down Classification Strategy with Hierarchical Bag Representation for Distantly Supervised Relation Extraction”, Erxin Yu, Wenjuan Han, Yuan Tian, Yi Chang, in Proceedings of the 28th International Conference on Computational Linguistics (COLING 2020), Barcelona, Spain, December 2020. (Virtual Event). [paper]
- “Context and Type Enhanced Representation Learning for Relation Extraction”, Erxin Yu, Yantao Jia, Shang Wang, Fengfu Li, Yi Chang, in 11th IEEE International Conference on Knowledge Graph (ICKG 2020), Nanjing, China, August 2020. (Virtual Event). [paper]
- “A Two-Level Noise-Tolerant Model for Relation Extraction with Reinforcement Learning”, Erxin Yu, Yantao Jia, Yuan Tian, Yi Chang, in 11th IEEE International Conference on Knowledge Graph (ICKG 2020), Nanjing, China, August 2020. (Virtual Event). [paper]