Neural Machine Translation and Sequence-to-sequence Models: A Tutorial


    https://arxiv.org/abs/1703.01619
    Graham Neubig (Submitted on 5 Mar 2017)

    This tutorial introduces a new and powerful set of techniques variously called "neural machine translation" or "neural sequence-to-sequence models". These techniques have been used in a number of tasks involving the handling of human language, and can be a powerful tool in the toolbox of anyone who wants to model sequential data of some sort. The tutorial assumes that the reader knows the basics of math and programming, but does not assume any particular experience with neural networks or natural language processing. It attempts to explain the intuition behind the various methods covered, then delves into them with enough mathematical detail to understand them concretely, and culminates with a suggestion for an implementation exercise, where readers can test in practice that they have understood the content.

    Comments: 65 pages
    Subjects: Computation and Language (cs.CL); Learning (cs.LG); Machine Learning (stat.ML)
    Cite as: arXiv:1703.01619 [cs.CL] (or arXiv:1703.01619v1 [cs.CL] for this version)

    Submission history

    From: Graham Neubig  [v1] Sun, 5 Mar 2017 16:10:11 GMT (619kb,D)
    Please credit the original URL when reposting: https://ju.6miu.com/read-673270.html
