


Zhaopeng Tu

2020 – today
- 2022
- [j11]Xinwei Geng, Longyue Wang, Xing Wang, Mingtao Yang, Xiaocheng Feng, Bing Qin, Zhaopeng Tu:
Learning to refine source representations for neural machine translation. Int. J. Mach. Learn. Cybern. 13(8): 2199-2212 (2022) - [j10]Wenxiang Jiao, Xing Wang, Shilin He, Zhaopeng Tu, Irwin King, Michael R. Lyu:
Exploiting Inactive Examples for Natural Language Generation With Data Rejuvenation. IEEE ACM Trans. Audio Speech Lang. Process. 30: 931-943 (2022) - [c83]Liang Ding, Longyue Wang, Shuming Shi, Dacheng Tao, Zhaopeng Tu:
Redistributing Low-Frequency Words: Making the Most of Monolingual Data in Non-Autoregressive Translation. ACL (1) 2022: 2417-2426 - [c82]Wenxuan Wang, Wenxiang Jiao, Yongchang Hao, Xing Wang, Shuming Shi, Zhaopeng Tu, Michael R. Lyu:
Understanding and Improving Sequence-to-Sequence Pretraining for Neural Machine Translation. ACL (1) 2022: 2591-2600 - [c81]Zhiwei He, Xing Wang, Rui Wang, Shuming Shi, Zhaopeng Tu:
Bridging the Data Gap between Training and Inference for Unsupervised Neural Machine Translation. ACL (1) 2022: 6611-6623 - [i67]Zhiwei He, Xing Wang, Rui Wang, Shuming Shi, Zhaopeng Tu:
Bridging the Data Gap between Training and Inference for Unsupervised Neural Machine Translation. CoRR abs/2203.08394 (2022) - [i66]Wenxuan Wang, Wenxiang Jiao, Yongchang Hao, Xing Wang, Shuming Shi, Zhaopeng Tu, Michael R. Lyu:
Understanding and Improving Sequence-to-Sequence Pretraining for Neural Machine Translation. CoRR abs/2203.08442 (2022) - [i65]Wenxuan Wang, Wenxiang Jiao, Shuo Wang, Zhaopeng Tu, Michael R. Lyu:
Understanding and Mitigating the Uncertainty in Zero-Shot Translation. CoRR abs/2205.10068 (2022) - [i64]Shuo Wang, Peng Li, Zhixing Tan, Zhaopeng Tu, Maosong Sun, Yang Liu:
A Template-based Method for Constrained Neural Machine Translation. CoRR abs/2205.11255 (2022)
- 2021
- [j9]Jian Li, Xing Wang, Zhaopeng Tu, Michael R. Lyu:
On the diversity of multi-head attention. Neurocomputing 454: 14-24 (2021) - [j8]Baosong Yang, Longyue Wang, Derek F. Wong, Shuming Shi, Zhaopeng Tu:
Context-aware Self-Attention Networks for Natural Language Processing. Neurocomputing 458: 157-169 (2021) - [j7]Xintong Li, Lemao Liu, Zhaopeng Tu, Guanlin Li, Shuming Shi, Max Q.-H. Meng:
Attending From Foresight: A Novel Attention Mechanism for Neural Machine Translation. IEEE ACM Trans. Audio Speech Lang. Process. 29: 2606-2616 (2021) - [c80]Liang Ding, Longyue Wang, Xuebo Liu, Derek F. Wong, Dacheng Tao, Zhaopeng Tu:
Progressive Multi-Granularity Training for Non-Autoregressive Translation. ACL/IJCNLP (Findings) 2021: 2797-2803 - [c79]Wenxiang Jiao, Xing Wang, Zhaopeng Tu, Shuming Shi, Michael R. Lyu, Irwin King:
Self-Training Sampling with Monolingual Data Uncertainty for Neural Machine Translation. ACL/IJCNLP (1) 2021: 2840-2850 - [c78]Liang Ding, Longyue Wang, Xuebo Liu, Derek F. Wong, Dacheng Tao, Zhaopeng Tu:
Rejuvenating Low-Frequency Words: Making the Most of Parallel Data in Non-Autoregressive Translation. ACL/IJCNLP (1) 2021: 3431-3441 - [c77]Xuebo Liu, Longyue Wang, Derek F. Wong, Liang Ding, Lidia S. Chao, Shuming Shi, Zhaopeng Tu:
On the Copying Behaviors of Pre-Training for Neural Machine Translation. ACL/IJCNLP (Findings) 2021: 4265-4275 - [c76]Shuo Wang, Zhaopeng Tu, Zhixing Tan, Shuming Shi, Maosong Sun, Yang Liu:
On the Language Coverage Bias for Neural Machine Translation. ACL/IJCNLP (Findings) 2021: 4778-4790 - [c75]Xuebo Liu, Longyue Wang, Derek F. Wong, Liang Ding, Lidia S. Chao, Shuming Shi, Zhaopeng Tu:
On the Complementarity between Pre-Training and Back-Translation for Neural Machine Translation. EMNLP (Findings) 2021: 2900-2907 - [c74]Jie Hao, Linfeng Song, Liwei Wang, Kun Xu, Zhaopeng Tu, Dong Yu:
RAST: Domain-Robust Dialogue Rewriting as Sequence Tagging. EMNLP (1) 2021: 4913-4924 - [c73]Xuebo Liu, Longyue Wang, Derek F. Wong, Liang Ding, Lidia S. Chao, Zhaopeng Tu:
Understanding and Improving Encoder Layer Fusion in Sequence-to-Sequence Learning. ICLR 2021 - [c72]Liang Ding, Longyue Wang, Xuebo Liu, Derek F. Wong, Dacheng Tao, Zhaopeng Tu:
Understanding and Improving Lexical Choice in Non-Autoregressive Translation. ICLR 2021 - [c71]Cunxiao Du, Zhaopeng Tu, Jing Jiang:
Order-Agnostic Cross Entropy for Non-Autoregressive Machine Translation. ICML 2021: 2849-2859 - [c70]Yongchang Hao, Shilin He, Wenxiang Jiao, Zhaopeng Tu, Michael R. Lyu, Xing Wang:
Multi-Task Learning with Shared Encoder for Non-Autoregressive Machine Translation. NAACL-HLT 2021: 3989-3996 - [c69]Longyue Wang, Mu Li, Fangxu Liu, Shuming Shi, Zhaopeng Tu, Xing Wang, Shuangzhi Wu, Jiali Zeng, Wen Zhang:
Tencent Translation System for the WMT21 News Translation Task. WMT@EMNLP 2021: 216-224 - [c68]Xing Wang, Zhaopeng Tu, Shuming Shi:
Tencent AI Lab Machine Translation Systems for the WMT21 Biomedical Translation Task. WMT@EMNLP 2021: 874-878 - [i63]Guoping Huang, Lemao Liu, Xing Wang, Longyue Wang, Huayang Li, Zhaopeng Tu, Chengyan Huang, Shuming Shi:
TranSmart: A Practical Interactive Machine Translation System. CoRR abs/2105.13072 (2021) - [i62]Liang Ding, Longyue Wang, Xuebo Liu, Derek F. Wong, Dacheng Tao, Zhaopeng Tu:
Rejuvenating Low-Frequency Words: Making the Most of Parallel Data in Non-Autoregressive Translation. CoRR abs/2106.00903 (2021) - [i61]Wenxiang Jiao, Xing Wang, Zhaopeng Tu, Shuming Shi, Michael R. Lyu, Irwin King:
Self-Training Sampling with Monolingual Data Uncertainty for Neural Machine Translation. CoRR abs/2106.00941 (2021) - [i60]Shuo Wang, Zhaopeng Tu, Zhixing Tan, Shuming Shi, Maosong Sun, Yang Liu:
On the Language Coverage Bias for Neural Machine Translation. CoRR abs/2106.03297 (2021) - [i59]Cunxiao Du, Zhaopeng Tu, Jing Jiang:
Order-Agnostic Cross Entropy for Non-Autoregressive Machine Translation. CoRR abs/2106.05093 (2021) - [i58]Liang Ding, Longyue Wang, Xuebo Liu, Derek F. Wong, Dacheng Tao, Zhaopeng Tu:
Progressive Multi-Granularity Training for Non-Autoregressive Translation. CoRR abs/2106.05546 (2021) - [i57]Shuo Wang, Zhaopeng Tu, Zhixing Tan, Wenxuan Wang, Maosong Sun, Yang Liu:
Language Models are Good Translators. CoRR abs/2106.13627 (2021) - [i56]Xuebo Liu, Longyue Wang, Derek F. Wong, Liang Ding, Lidia S. Chao, Shuming Shi, Zhaopeng Tu:
On the Copying Behaviors of Pre-Training for Neural Machine Translation. CoRR abs/2107.08212 (2021) - [i55]Xuebo Liu, Longyue Wang, Derek F. Wong, Liang Ding, Lidia S. Chao, Shuming Shi, Zhaopeng Tu:
On the Complementarity between Pre-Training and Back-Translation for Neural Machine Translation. CoRR abs/2110.01811 (2021)
- 2020
- [j6]Zi-Yi Dou, Xing Wang, Shuming Shi, Zhaopeng Tu:
Exploiting deep representations for natural language processing. Neurocomputing 386: 1-7 (2020) - [c67]Jian Li, Xing Wang, Baosong Yang, Shuming Shi, Michael R. Lyu, Zhaopeng Tu:
Neuron Interaction Based Representation Composition for Neural Machine Translation. AAAI 2020: 8204-8211 - [c66]Yong Wang, Longyue Wang, Shuming Shi, Victor O. K. Li, Zhaopeng Tu:
Go From the General to the Particular: Multi-Domain Translation with Domain Transformation Networks. AAAI 2020: 9233-9241 - [c65]Xinwei Geng, Longyue Wang, Xing Wang, Bing Qin, Ting Liu, Zhaopeng Tu:
How Does Selective Mechanism Improve Self-Attention Networks? ACL 2020: 2986-2995 - [c64]Shuo Wang, Zhaopeng Tu, Shuming Shi, Yang Liu:
On the Inference Calibration of Neural Machine Translation. ACL 2020: 3070-3079 - [c63]Yongquan He, Zhihan Wang, Peng Zhang, Zhaopeng Tu, Zhaochun Ren:
VN Network: Embedding Newly Emerging Entities with Virtual Neighbors. CIKM 2020: 505-514 - [c62]Deyu Zhou, Shuangzhi Wu, Qing Wang, Jun Xie, Zhaopeng Tu, Mu Li:
Emotion Classification by Jointly Learning to Lexiconize and Classify. COLING 2020: 3235-3245 - [c61]Liang Ding, Longyue Wang, Di Wu, Dacheng Tao, Zhaopeng Tu:
Context-Aware Cross-Attention for Non-Autoregressive Translation. COLING 2020: 4396-4402 - [c60]Qintong Li, Hongshen Chen, Zhaochun Ren, Pengjie Ren, Zhaopeng Tu, Zhumin Chen:
EmpDG: Multi-resolution Interactive Empathetic Dialogue Generation. COLING 2020: 4454-4466 - [c59]Wenxuan Wang, Zhaopeng Tu:
Rethinking the Value of Transformer Components. COLING 2020: 6019-6029 - [c58]Yong Wang, Longyue Wang, Victor O. K. Li, Zhaopeng Tu:
On the Sparsity of Neural Machine Translation Models. EMNLP (1) 2020: 1060-1066 - [c57]Wenxiang Jiao, Xing Wang, Shilin He, Irwin King, Michael R. Lyu, Zhaopeng Tu:
Data Rejuvenation: Exploiting Inactive Training Examples for Neural Machine Translation. EMNLP (1) 2020: 2255-2266 - [c56]Yilin Yang, Longyue Wang, Shuming Shi, Prasad Tadepalli, Stefan Lee, Zhaopeng Tu:
On the Sub-Layer Functionalities of Transformer Decoder. EMNLP (Findings) 2020: 4799-4811 - [c55]Jinhuan Liu, Xuemeng Song, Zhaochun Ren, Liqiang Nie, Zhaopeng Tu, Jun Ma:
Auxiliary Template-Enhanced Generative Compatibility Modeling. IJCAI 2020: 3508-3514 - [c54]Chuan Meng, Pengjie Ren, Zhumin Chen, Weiwei Sun, Zhaochun Ren, Zhaopeng Tu, Maarten de Rijke:
DukeNet: A Dual Knowledge Interaction Network for Knowledge-Grounded Conversation. SIGIR 2020: 1151-1160 - [c53]Shuangzhi Wu, Xing Wang, Longyue Wang, Fangxu Liu, Jun Xie, Zhaopeng Tu, Shuming Shi, Mu Li:
Tencent Neural Machine Translation Systems for the WMT20 News Translation Task. WMT@EMNLP 2020: 313-319 - [c52]Longyue Wang, Zhaopeng Tu, Xing Wang, Li Ding, Liang Ding, Shuming Shi:
Tencent AI Lab Machine Translation Systems for WMT20 Chat Translation Task. WMT@EMNLP 2020: 483-491 - [c51]Xing Wang, Zhaopeng Tu, Longyue Wang, Shuming Shi:
Tencent AI Lab Machine Translation Systems for the WMT20 Biomedical Translation Task. WMT@EMNLP 2020: 881-886 - [i54]Shilin He, Xing Wang, Shuming Shi, Michael R. Lyu, Zhaopeng Tu:
Assessing the Bilingual Knowledge Learned by Neural Machine Translation Models. CoRR abs/2004.13270 (2020) - [i53]Shuo Wang, Zhaopeng Tu, Shuming Shi, Yang Liu:
On the Inference Calibration of Neural Machine Translation. CoRR abs/2005.00963 (2020) - [i52]Xinwei Geng, Longyue Wang, Xing Wang, Bing Qin, Ting Liu, Zhaopeng Tu:
How Does Selective Mechanism Improve Self-Attention Networks? CoRR abs/2005.00979 (2020) - [i51]Wenxiang Jiao, Xing Wang, Shilin He, Irwin King, Michael R. Lyu, Zhaopeng Tu:
Data Rejuvenation: Exploiting Inactive Training Examples for Neural Machine Translation. CoRR abs/2010.02552 (2020) - [i50]Yong Wang, Longyue Wang, Victor O. K. Li, Zhaopeng Tu:
On the Sparsity of Neural Machine Translation Models. CoRR abs/2010.02646 (2020) - [i49]Yilin Yang, Longyue Wang, Shuming Shi, Prasad Tadepalli, Stefan Lee, Zhaopeng Tu:
On the Sub-Layer Functionalities of Transformer Decoder. CoRR abs/2010.02648 (2020) - [i48]Yongchang Hao, Shilin He, Wenxiang Jiao, Zhaopeng Tu, Michael R. Lyu, Xing Wang:
Multi-Task Learning with Shared Encoder for Non-Autoregressive Machine Translation. CoRR abs/2010.12868 (2020) - [i47]Liang Ding, Longyue Wang, Di Wu, Dacheng Tao, Zhaopeng Tu:
Context-Aware Cross-Attention for Non-Autoregressive Translation. CoRR abs/2011.00770 (2020) - [i46]Wenxuan Wang, Zhaopeng Tu:
Rethinking the Value of Transformer Components. CoRR abs/2011.03803 (2020) - [i45]Jie Hao, Linfeng Song, Liwei Wang, Kun Xu, Zhaopeng Tu, Dong Yu:
Robust Dialogue Utterance Rewriting as Sequence Tagging. CoRR abs/2012.14535 (2020) - [i44]Liang Ding, Longyue Wang, Xuebo Liu, Derek F. Wong, Dacheng Tao, Zhaopeng Tu:
Understanding and Improving Lexical Choice in Non-Autoregressive Translation. CoRR abs/2012.14583 (2020) - [i43]Xuebo Liu, Longyue Wang, Derek F. Wong, Liang Ding, Lidia S. Chao, Zhaopeng Tu:
Understanding and Improving Encoder Layer Fusion in Sequence-to-Sequence Learning. CoRR abs/2012.14768 (2020)
2010 – 2019
- 2019
- [c50]Zi-Yi Dou, Zhaopeng Tu, Xing Wang, Longyue Wang, Shuming Shi, Tong Zhang:
Dynamic Layer Aggregation for Neural Machine Translation with Routing-by-Agreement. AAAI 2019: 86-93 - [c49]Baosong Yang, Jian Li, Derek F. Wong, Lidia S. Chao, Xing Wang, Zhaopeng Tu:
Context-Aware Self-Attention Networks. AAAI 2019: 387-394 - [c48]Xiang Kong, Zhaopeng Tu, Shuming Shi, Eduard H. Hovy, Tong Zhang:
Neural Machine Translation with Adequacy-Oriented Learning. AAAI 2019: 6618-6625 - [c47]Baosong Yang, Longyue Wang, Derek F. Wong, Lidia S. Chao, Zhaopeng Tu:
Assessing the Ability of Self-Attention Networks to Learn Word Order. ACL (1) 2019: 3635-3644 - [c46]Xing Wang, Zhaopeng Tu, Longyue Wang, Shuming Shi:
Exploiting Sentential Context for Neural Machine Translation. ACL (1) 2019: 6197-6203 - [c45]Jie Hao, Xing Wang, Shuming Shi, Jinfeng Zhang, Zhaopeng Tu:
Multi-Granularity Self-Attention for Neural Machine Translation. EMNLP/IJCNLP (1) 2019: 887-897 - [c44]Longyue Wang, Zhaopeng Tu, Xing Wang, Shuming Shi:
One Model to Learn Both: Zero Pronoun Prediction and Translation. EMNLP/IJCNLP (1) 2019: 921-930 - [c43]Zaixiang Zheng, Shujian Huang, Zhaopeng Tu, Xin-Yu Dai, Jiajun Chen:
Dynamic Past and Future for Neural Machine Translation. EMNLP/IJCNLP (1) 2019: 931-941 - [c42]Shilin He, Zhaopeng Tu, Xing Wang, Longyue Wang, Michael R. Lyu, Shuming Shi:
Towards Understanding Neural Machine Translation with Word Importance. EMNLP/IJCNLP (1) 2019: 953-962 - [c41]Jie Hao, Xing Wang, Shuming Shi, Jinfeng Zhang, Zhaopeng Tu:
Towards Better Modeling Hierarchical Structure for Self-Attention with Ordered Neurons. EMNLP/IJCNLP (1) 2019: 1336-1341 - [c40]Xing Wang, Zhaopeng Tu, Longyue Wang, Shuming Shi:
Self-Attention with Structural Position Representations. EMNLP/IJCNLP (1) 2019: 1403-1409 - [c39]Deng Cai, Yan Wang, Wei Bi, Zhaopeng Tu, Xiaojiang Liu, Shuming Shi:
Retrieval-guided Dialogue Response Generation via a Matching-to-Generation Framework. EMNLP/IJCNLP (1) 2019: 1866-1875 - [c38]Jie Hao, Xing Wang, Baosong Yang, Longyue Wang, Jinfeng Zhang, Zhaopeng Tu:
Modeling Recurrence for Transformer. NAACL-HLT (1) 2019: 1198-1207 - [c37]Deng Cai, Yan Wang, Wei Bi, Zhaopeng Tu, Xiaojiang Liu, Wai Lam, Shuming Shi:
Skeleton-to-Response: Dialogue Generation Guided by Retrieval Memory. NAACL-HLT (1) 2019: 1219-1228 - [c36]Jian Li, Baosong Yang, Zi-Yi Dou, Xing Wang, Michael R. Lyu, Zhaopeng Tu:
Information Aggregation for Multi-Head Attention with Routing-by-Agreement. NAACL-HLT (1) 2019: 3566-3575 - [c35]Baosong Yang, Longyue Wang, Derek F. Wong, Lidia S. Chao, Zhaopeng Tu:
Convolutional Self-Attention Networks. NAACL-HLT (1) 2019: 4040-4045 - [i42]Baosong Yang, Jian Li, Derek F. Wong, Lidia S. Chao, Xing Wang, Zhaopeng Tu:
Context-Aware Self-Attention Networks. CoRR abs/1902.05766 (2019) - [i41]Zi-Yi Dou, Zhaopeng Tu, Xing Wang, Longyue Wang, Shuming Shi, Tong Zhang:
Dynamic Layer Aggregation for Neural Machine Translation with Routing-by-Agreement. CoRR abs/1902.05770 (2019) - [i40]Jie Hao, Xing Wang, Baosong Yang, Longyue Wang, Jinfeng Zhang, Zhaopeng Tu:
Modeling Recurrence for Transformer. CoRR abs/1904.03092 (2019) - [i39]Jian Li, Baosong Yang, Zi-Yi Dou, Xing Wang, Michael R. Lyu, Zhaopeng Tu:
Information Aggregation for Multi-Head Attention with Routing-by-Agreement. CoRR abs/1904.03100 (2019) - [i38]Baosong Yang, Longyue Wang, Derek F. Wong, Lidia S. Chao, Zhaopeng Tu:
Convolutional Self-Attention Networks. CoRR abs/1904.03107 (2019) - [i37]Zaixiang Zheng, Shujian Huang, Zhaopeng Tu, Xin-Yu Dai, Jiajun Chen:
Dynamic Past and Future for Neural Machine Translation. CoRR abs/1904.09646 (2019) - [i36]Baosong Yang, Longyue Wang, Derek F. Wong, Lidia S. Chao, Zhaopeng Tu:
Assessing the Ability of Self-Attention Networks to Learn Word Order. CoRR abs/1906.00592 (2019) - [i35]Xing Wang, Zhaopeng Tu, Longyue Wang, Shuming Shi:
Exploiting Sentential Context for Neural Machine Translation. CoRR abs/1906.01268 (2019) - [i34]Shilin He, Zhaopeng Tu, Xing Wang, Longyue Wang, Michael R. Lyu, Shuming Shi:
Towards Understanding Neural Machine Translation with Word Importance. CoRR abs/1909.00326 (2019) - [i33]Longyue Wang, Zhaopeng Tu, Xing Wang, Shuming Shi:
One Model to Learn Both: Zero Pronoun Prediction and Translation. CoRR abs/1909.00369 (2019) - [i32]Xing Wang, Zhaopeng Tu, Longyue Wang, Shuming Shi:
Self-Attention with Structural Position Representations. CoRR abs/1909.00383 (2019) - [i31]Jie Hao, Xing Wang, Shuming Shi, Jinfeng Zhang, Zhaopeng Tu:
Towards Better Modeling Hierarchical Structure for Self-Attention with Ordered Neurons. CoRR abs/1909.01562 (2019) - [i30]Jie Hao, Xing Wang, Shuming Shi, Jinfeng Zhang, Zhaopeng Tu:
Multi-Granularity Self-Attention for Neural Machine Translation. CoRR abs/1909.02222 (2019) - [i29]Qintong Li, Hongshen Chen, Zhaochun Ren, Zhumin Chen, Zhaopeng Tu, Jun Ma:
EmpGAN: Multi-resolution Interactive Empathetic Dialogue Generation. CoRR abs/1911.08698 (2019) - [i28]Jian Li, Xing Wang, Baosong Yang, Shuming Shi, Michael R. Lyu, Zhaopeng Tu:
Neuron Interaction Based Representation Composition for Neural Machine Translation. CoRR abs/1911.09877 (2019) - [i27]Yong Wang, Longyue Wang, Shuming Shi, Victor O. K. Li, Zhaopeng Tu:
Go From the General to the Particular: Multi-Domain Translation with Domain Transformation Networks. CoRR abs/1911.09912 (2019)
- 2018
- [j5]Zaixiang Zheng, Hao Zhou, Shujian Huang, Lili Mou, Xinyu Dai, Jiajun Chen, Zhaopeng Tu:
Modeling Past and Future for Neural Machine Translation. Trans. Assoc. Comput. Linguistics 6: 145-157 (2018) - [j4]Zhaopeng Tu, Yang Liu, Shuming Shi, Tong Zhang:
Learning to Remember Translation History with a Continuous Cache. Trans. Assoc. Comput. Linguistics 6: 407-420 (2018) - [j3]Xing Wang, Zhaopeng Tu, Min Zhang:
Incorporating Statistical Machine Translation Word Knowledge Into Neural Machine Translation. IEEE ACM Trans. Audio Speech Lang. Process. 26(12): 2255-2266 (2018) - [c34]Longyue Wang, Zhaopeng Tu, Shuming Shi, Tong Zhang, Yvette Graham, Qun Liu:
Translating Pro-Drop Languages With Reconstruction Models. AAAI 2018: 4937-4945 - [c33]Yong Cheng, Zhaopeng Tu, Fandong Meng, Junjie Zhai, Yang Liu:
Towards Robust Neural Machine Translation. ACL (1) 2018: 1756-1766 - [c32]Jian Li, Zhaopeng Tu, Baosong Yang, Michael R. Lyu, Tong Zhang:
Multi-Head Attention with Disagreement Regularization. EMNLP 2018: 2897-2903 - [c31]Longyue Wang, Zhaopeng Tu, Andy Way, Qun Liu:
Learning to Jointly Translate and Predict Dropped Pronouns with a Shared Reconstruction Mechanism. EMNLP 2018: 2997-3002 - [c30]Zi-Yi Dou, Zhaopeng Tu, Xing Wang, Shuming Shi, Tong Zhang:
Exploiting Deep Representations for Neural Machine Translation. EMNLP 2018: 4253-4262 - [c29]Baosong Yang, Zhaopeng Tu, Derek F. Wong, Fandong Meng, Lidia S. Chao, Tong Zhang:
Modeling Localness for Self-Attention Networks. EMNLP 2018: 4449-4458 - [c28]Fandong Meng, Zhaopeng Tu, Yong Cheng, Haiyang Wu, Junjie Zhai, Yuekui Yang, Di Wang:
Neural Machine Translation with Key-Value Memory-Augmented Attention. IJCAI 2018: 2574-2580 - [c27]Xintong Li, Lemao Liu, Zhaopeng Tu, Shuming Shi, Max Meng:
Target Foresight Based Attention for Neural Machine Translation. NAACL-HLT 2018: 1380-1390 - [i26]Longyue Wang, Zhaopeng Tu, Shuming Shi, Tong Zhang, Yvette Graham, Qun Liu:
Translating Pro-Drop Languages with Reconstruction Models. CoRR abs/1801.03257 (2018) - [i25]Zhaopeng Tu, Xiaojiang Liu, Lei Shu, Shuming Shi:
Generative Stock Question Answering. CoRR abs/1804.07942 (2018) - [i24]Yong Cheng, Zhaopeng Tu, Fandong Meng, Junjie Zhai, Yang Liu:
Towards Robust Neural Machine Translation. CoRR abs/1805.06130 (2018) - [i23]Fandong Meng, Zhaopeng Tu, Yong Cheng, Haiyang Wu, Junjie Zhai, Yuekui Yang, Di Wang:
Neural Machine Translation with Key-Value Memory-Augmented Attention. CoRR abs/1806.11249 (2018) - [i22]Deng Cai, Yan Wang, Victoria Bi, Zhaopeng Tu, Xiaojiang Liu, Wai Lam, Shuming Shi:
Skeleton-to-Response: Dialogue Generation Guided by Retrieval Memory. CoRR abs/1809.05296 (2018) - [i21]Longyue Wang, Zhaopeng Tu, Andy Way, Qun Liu:
Learning to Jointly Translate and Predict Dropped Pronouns with a Shared Reconstruction Mechanism. CoRR abs/1810.06195 (2018) - [i20]Zi-Yi Dou, Zhaopeng Tu, Xing Wang, Shuming Shi, Tong Zhang:
Exploiting Deep Representations for Neural Machine Translation. CoRR abs/1810.10181 (2018) - [i19]Baosong Yang, Zhaopeng Tu, Derek F. Wong, Fandong Meng, Lidia S. Chao, Tong Zhang:
Modeling Localness for Self-Attention Networks. CoRR abs/1810.10182 (2018) - [i18]Jian Li, Zhaopeng Tu, Baosong Yang, Michael R. Lyu, Tong Zhang:
Multi-Head Attention with Disagreement Regularization. CoRR abs/1810.10183 (2018) - [i17]Xiang Kong, Zhaopeng Tu, Shuming Shi, Eduard H. Hovy, Tong Zhang:
Neural Machine Translation with Adequacy-Oriented Learning. CoRR abs/1811.08541 (2018) - [i16]Xinwei Geng, Longyue Wang, Xing Wang, Bing Qin, Ting Liu, Zhaopeng Tu:
Learning to Refine Source Representations for Neural Machine Translation. CoRR abs/1812.10230 (2018)
- 2017
- [j2]Longyue Wang, Zhaopeng Tu, Xiaojun Zhang, Siyou Liu, Hang Li, Andy Way, Qun Liu:
A novel and robust approach for pro-drop language translation. Mach. Transl. 31(1-2): 65-87 (2017) - [j1]Zhaopeng Tu, Yang Liu, Zhengdong Lu, Xiaohua Liu, Hang Li:
Context Gates for Neural Machine Translation. Trans. Assoc. Comput. Linguistics 5: 87-99 (2017) - [c26]Zhaopeng Tu, Yang Liu, Lifeng Shang, Xiaohua Liu, Hang Li:
Neural Machine Translation with Reconstruction. AAAI 2017: 3097-3103 - [c25]Xing Wang, Zhengdong Lu, Zhaopeng Tu, Hang Li, Deyi Xiong, Min Zhang:
Neural Machine Translation Advised by Statistical Machine Translation. AAAI 2017: 3330-3336 - [c24]Hao Zhou, Zhaopeng Tu, Shujian Huang, Xiaohua Liu, Hang Li, Jiajun Chen:
Chunk-Based Bi-Scale Decoder for Neural Machine Translation. ACL (2) 2017: 580-586 - [c23]Junhui Li, Deyi Xiong, Zhaopeng Tu, Muhua Zhu, Min Zhang, Guodong Zhou:
Modeling Source Syntax for Neural Machine Translation. ACL (1) 2017: 688-697 - [c22]Xing Wang, Zhaopeng Tu, Deyi Xiong, Min Zhang:
Translating Phrases in Neural Machine Translation. EMNLP 2017: 1421-1431 - [c21]