Neural Machine Translation
Cambridge University Press, June 18, 2020. 406 pages.

Deep learning is revolutionizing how machine translation systems are built today. This book introduces the challenge of machine translation and its evaluation, including historical, linguistic, and applied context, then develops the core deep learning methods used for natural language applications. Code examples in Python give readers a hands-on blueprint for understanding and implementing their own machine translation systems. The book also provides extensive coverage of machine learning tricks, issues involved in handling various forms of data, model enhancements, and current challenges and methods for analysis and visualization. Summaries of the current research in the field make this a state-of-the-art textbook for undergraduate and graduate classes, as well as an essential reference for researchers and developers interested in other applications of neural methods in the broader field of human language processing.
Contents
The Translation Problem | 3 |
Uses of Machine Translation | 19 |
History | 29 |
Evaluation | 41 |
Part II | 47 |
Neural Networks | 67 |
Computation Graphs | 89 |
Neural Language Models | 103 |
Decoding | 143 |
Machine Learning Tricks | 171 |
Revisiting Words | 213 |
Adaptation | 239 |
Beyond Parallel Corpora | 263 |
Current Challenges | 293 |
Analysis and Visualization | 311 |
Bibliography | 343 |
Common terms and phrases
adaptation adding additional alignment allows Association for Computational attention beam better called Chapter combination common computation graph Computational Linguistics conditioning Conference consider context convolutional corpus correct cost decoder deep distribution domain encoder English error et al evaluation example Figure function give given gradient hidden human idea improve input sentence input word International language model layer learning Long machine translation models machine translation systems mapping match matrix meaning methods monolingual multiple Natural Language Processing neural machine translation node objective optimization output word parallel parameters performance phrase prediction probability problem Proceedings produce propose recurrent neural network relevant representation requires score semantic sentence sentence pairs sequence Short similar single Softmax space statistical machine translation step task Technologies token training data typically updates values vector Volume weight word embeddings