[2017-01-10] Towards Better Adequacy of Neural Machine Translation

Published: 2017-01-09

  

  Seminar Announcement 

    

  Title: Towards Better Adequacy of Neural Machine Translation 

  Speaker: Zhaopeng Tu (Huawei Noah's Ark Lab, Hong Kong, China)

                    http://www.zptu.net/

  Time: 10:00am, January 10th, 2017 

  Venue: Room 337, Level 3, Building 5,

        Institute of Software, Chinese Academy of Sciences.

    

  Abstract: 

  Although end-to-end Neural Machine Translation (NMT) has achieved remarkable progress in the past two years, it suffers from a major drawback: translations generated by NMT systems often lack adequacy. We attribute this to the following limitations of NMT:

  1. The source representation misses information necessary for decoding (e.g., coverage);

  2. Source information is undervalued and only roughly transformed during decoding;

  3. The likelihood objective favors translation fluency over adequacy. 

  To address these problems, we propose three models that enhance the adequacy of NMT and are complementary to each other. The works were published in ACL 2016, TACL 2017, and AAAI 2017, respectively.
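
  For readers unfamiliar with the coverage issue mentioned in limitation 1, the sketch below illustrates one common way a coverage vector can be folded into attention-based NMT: each source position accumulates the attention it has received, and that accumulated value is fed back into the attention scoring function. The function and parameter names (coverage_attention_step, Wa, Ua, Va, va) are illustrative assumptions, not necessarily the exact formulation presented in the talk.

import numpy as np

def softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def coverage_attention_step(s_prev, H, coverage, Wa, Ua, Va, va):
    """One decoder step of a (hypothetical) coverage-augmented attention.

    s_prev   : previous decoder state, shape (d,)
    H        : source annotations, shape (n, d)
    coverage : accumulated attention per source position, shape (n,)
    Wa, Ua   : projection matrices, shape (d, d)
    Va       : projection of the scalar coverage value, shape (d,)
    va       : scoring vector, shape (d,)
    """
    n = H.shape[0]
    # Additive (Bahdanau-style) scores, with the coverage of each source
    # position injected as an extra input to the scoring function.
    scores = np.array([
        va @ np.tanh(Wa @ s_prev + Ua @ H[i] + Va * coverage[i])
        for i in range(n)
    ])
    alpha = softmax(scores)          # attention weights over source positions
    context = alpha @ H              # context vector fed to the decoder
    coverage = coverage + alpha      # accumulate attention into the coverage
    return context, alpha, coverage

# Toy usage with random parameters, just to show the shapes involved.
rng = np.random.default_rng(0)
d, n = 4, 6
context, alpha, cov = coverage_attention_step(
    rng.standard_normal(d), rng.standard_normal((n, d)),
    np.zeros(n), rng.standard_normal((d, d)),
    rng.standard_normal((d, d)), rng.standard_normal(d),
    rng.standard_normal(d))

  Because positions that have already been attended to carry larger coverage values, the scoring function can learn to discount them, which discourages over-translation and leaving source words untranslated.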

    

  Biography: 

  Zhaopeng Tu is a researcher at Huawei Noah's Ark Lab, Hong Kong. Prior to that, he was a postdoctoral researcher at the University of California, Davis from 2013 to 2014. He obtained his PhD from the Institute of Computing Technology, CAS, in 2013, and his bachelor's degree from Beihang University in 2008. His research focuses on natural language processing (NLP) and its applications. He has published 15 papers in top-tier conferences and journals, including ACL, TACL, NAACL, AAAI, FSE, and ICSE.