Xing Wang
Tencent AI Lab
Verified email at tencent.com
Title · Cited by · Year
Is ChatGPT a good translator? A preliminary study
W Jiao, W Wang, J Huang, X Wang, Z Tu
arXiv preprint arXiv:2301.08745, 2023
Cited by 798* · 2023
Encouraging divergent thinking in large language models through multi-agent debate
T Liang, Z He, W Jiao, X Wang, Y Wang, R Wang, Y Yang, Z Tu, S Shi
EMNLP 2024, 2024
Cited by 242 · 2024
Context-aware self-attention networks
B Yang, J Li, DF Wong, LS Chao, X Wang, Z Tu
AAAI 2019, 2019
Cited by 129 · 2019
Neural machine translation advised by statistical machine translation
X Wang, Z Lu, Z Tu, H Li, D Xiong, M Zhang
Proceedings of the AAAI Conference on Artificial Intelligence 31 (1), 2017
Cited by 115 · 2017
Exploiting Deep Representations for Neural Machine Translation
ZY Dou, Z Tu, X Wang, S Shi, T Zhang
EMNLP 2018, 2018
Cited by 93 · 2018
Modeling Recurrence for Transformer
J Hao, X Wang, B Yang, L Wang, J Zhang, Z Tu
NAACL 2019, 2019
Cited by 92 · 2019
Self-attention with structural position representations
X Wang, Z Tu, L Wang, S Shi
EMNLP 2019, 2019
Cited by 80 · 2019
Exploring Human-Like Translation Strategy with Large Language Models
Z He, T Liang, W Jiao, Z Zhang, Y Yang, R Wang, Z Tu, S Shi, X Wang
Transactions of the Association for Computational Linguistics, 2024
Cited by 73 · 2024
Translating Phrases in Neural Machine Translation
X Wang, Z Tu, D Xiong, M Zhang
EMNLP 2017, 2017
Cited by 71 · 2017
Multi-granularity self-attention for neural machine translation
J Hao, X Wang, S Shi, J Zhang, Z Tu
EMNLP 2019, 2019
Cited by 67 · 2019
ParroT: Translating during chat using large language models tuned with human translation and feedback
W Jiao, J Huang, W Wang, Z He, T Liang, X Wang, S Shi, Z Tu
EMNLP 2023, 2023
Cited by 63* · 2023
On the diversity of multi-head attention
J Li, X Wang, Z Tu, MR Lyu
Neurocomputing 454, 14-24, 2021
Cited by 59 · 2021
Incorporating statistical machine translation word knowledge into neural machine translation
X Wang, Z Tu, M Zhang
IEEE/ACM Transactions on Audio, Speech, and Language Processing 26 (12 …, 2018
Cited by 59 · 2018
Dynamic Layer Aggregation for Neural Machine Translation with Routing-by-Agreement
ZY Dou, Z Tu, X Wang, L Wang, S Shi, T Zhang
AAAI 2019, 2019
Cited by 54 · 2019
Towards understanding neural machine translation with word importance
S He, Z Tu, X Wang, L Wang, MR Lyu, S Shi
EMNLP 2019, 2019
Cited by 47 · 2019
Information Aggregation for Multi-Head Attention with Routing-by-Agreement
J Li, B Yang, ZY Dou, X Wang, MR Lyu, Z Tu
NAACL 2019, 2019
Cited by 46 · 2019
Understanding and Improving Sequence-to-Sequence Pretraining for Neural Machine Translation
W Wang, W Jiao, Y Hao, X Wang, S Shi, Z Tu, M Lyu
ACL 2022, 2022
Cited by 45 · 2022
Self-training sampling with monolingual data uncertainty for neural machine translation
W Jiao, X Wang, Z Tu, S Shi, MR Lyu, I King
ACL 2021, 2021
Cited by 44* · 2021
Topic-based coherence modeling for statistical machine translation
D Xiong, M Zhang, X Wang
IEEE/ACM Transactions on Audio, Speech, and Language Processing 23 (3), 483-493, 2015
Cited by 39 · 2015
How does selective mechanism improve self-attention networks?
X Geng, L Wang, X Wang, B Qin, T Liu, Z Tu
ACL 2020, 2020
Cited by 34 · 2020
Articles 1–20