WNMT 18

Workshop on Neural Machine Translation and Generation


Location

  • Melbourne, Australia

Schedule

9:00 Welcome and opening remarks
Findings of the Second Workshop on Neural Machine Translation and Generation
Alexandra Birch, Andrew Finch, Minh-Thang Luong, Graham Neubig, Yusuke Oda
9:10 Keynote 1
Real-time High-quality Neural MT Decoding on Mobile Devices
Jacob Devlin
10:00 Shared Task Overview
10:30 ☕️
11:00 Marian: Fast Neural Machine Translation in C++
Marcin Junczys-Dowmunt, Roman Grundkiewicz, Tomasz Dwojak, Hieu Hoang, Kenneth Heafield, Tom Neckermann, Frank Seide, Ulrich Germann, Alham Fikri Aji, Nikolay Bogoychev, André F. T. Martins, Alexandra Birch
11:30 Keynote 2
Why the Time Is Ripe for Discourse in Machine Translation (https://homepages.inf.ed.ac.uk/rsennric/wnmt2018.pdf)
Rico Sennrich
12:20 🍴
13:20 Best Paper Session
13:50 Keynote 3
Beyond Softmax: Sparsity, Constraints, Latent Structure -- All End-to-End Differentiable! (https://blogs.helsinki.fi/language-technology/files/2018/10/FoTran2018-martins.pdf)
André Martins
14:40 Keynote 4
Towards Flexible but Controllable Language Generation
Yulia Tsvetkov
15:30 ☕️
16:00 Poster session
A Shared Attention Mechanism for Interpretation of Neural Automatic Post-Editing Systems
Inigo Jauregi Unanue, Ehsan Zare Borzeshi, Massimo Piccardi
Iterative Back-Translation for Neural Machine Translation
Vu Cong Duy Hoang, Philipp Koehn, Gholamreza Haffari, Trevor Cohn
Inducing Grammars with and for Neural Machine Translation
Yonatan Bisk, Ke Tran
Regularized Training Objective for Continued Training for Domain Adaptation in Neural Machine Translation
Huda Khayrallah, Brian Thompson, Kevin Duh, Philipp Koehn
Controllable Abstractive Summarization
Angela Fan, David Grangier, Michael Auli
Enhancement of Encoder and Attention Using Target Monolingual Corpora in Neural Machine Translation
Kenji Imamura, Atsushi Fujita, Eiichirō Sumita
Document-Level Adaptation for Neural Machine Translation
Sachith Sri Ram Kothur, Rebecca Knowles, Philipp Koehn
On the Impact of Various Types of Noise on Neural Machine Translation
Huda Khayrallah, Philipp Koehn
Bi-Directional Neural Machine Translation with Synthetic Parallel Data
Xing Niu, Michael Denkowski, Marine Carpuat
Multi-Source Neural Machine Translation with Missing Data
Yuta Nishimura, Katsuhito Sudoh, Graham Neubig, Satoshi Nakamura
Towards one-shot learning for rare-word translation with external experts
Ngoc-Quan Pham, Jan Niehues, Alexander Waibel
NICT Self-Training Approach to Neural Machine Translation at NMT-2018
Kenji Imamura, Eiichirō Sumita
Fast Neural Machine Translation Implementation
Hieu Hoang, Tomasz Dwojak, Rihards Krislauks, Daniel Torregrosa, Kenneth Heafield
OpenNMT System Description for WNMT 2018: 800 words/sec on a single-core CPU
Jean Senellart, Dakun Zhang, Bo Wang, Guillaume Klein, Jean-Pierre Ramatchandirin, Josep Crego, Alexander Rush
Marian: Cost-effective High-Quality Neural Machine Translation in C++
Marcin Junczys-Dowmunt, Kenneth Heafield, Hieu Hoang, Roman Grundkiewicz, Anthony Aue
On Individual Neurons in Neural Machine Translation
D. Anthony Bau, Yonatan Belinkov, Hassan Sajjad, Nadir Durrani, Fahim Dalvi, James Glass
Parameter Sharing Strategies in Neural Machine Translation
Sébastien Jean, Stanislas Lauly, Kyunghyun Cho
Modeling Latent Sentence Structure in Neural Machine Translation
Joost Bastings, Wilker Aziz, Ivan Titov, Khalil Simaan
Extreme Adaptation for Personalized Neural Machine Translation
Paul Michel, Graham Neubig
Exploiting Semantics in Neural Machine Translation with Graph Convolutional Networks
Diego Marcheggiani, Joost Bastings, Ivan Titov
17:30 Closing remarks

Tasks

  • Efficiency
  • Accuracy



Licensed under CC-BY-SA-4.0.
