6 days ago · This paper introduces a novel neural network architecture called RNN Encoder-Decoder for improving phrase-based Statistical Machine Translation (SMT). The model ...
Oct 4, 2024 · Learning phrase representations using RNN encoder-decoder for statistical machine translation. In EMNLP, 2014. Shmuel Bar David, Itamar Zimerman, Eliya ...
Sep 21, 2024 · In this work, we evaluate two approaches that use neural networks to generate obfuscated texts. The first approach uses Generative Adversarial Networks.
Oct 2, 2024 · In this section, we review recurrent neural networks (RNNs). RNNs are recurrent sequence models that maintain a hidden state across time steps, capturing ...
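The snippet above describes RNNs as sequence models that maintain a hidden state across time steps. A minimal vanilla-RNN forward pass in NumPy illustrates this; all names, dimensions, and weight scales here are illustrative, not taken from any of the cited papers:

```python
import numpy as np

def rnn_forward(inputs, Wxh, Whh, bh):
    """Vanilla RNN: h_t = tanh(Wxh @ x_t + Whh @ h_{t-1} + bh).

    The hidden state is carried from one time step to the next,
    so each state summarizes the sequence seen so far.
    """
    hidden = np.zeros(Whh.shape[0])  # initial hidden state h_0 = 0
    states = []
    for x in inputs:                 # one update per time step
        hidden = np.tanh(Wxh @ x + Whh @ hidden + bh)
        states.append(hidden)
    return states

# Toy dimensions: input size 3, hidden size 4, sequence length 5.
rng = np.random.default_rng(0)
Wxh = rng.normal(size=(4, 3)) * 0.1
Whh = rng.normal(size=(4, 4)) * 0.1
bh = np.zeros(4)
seq = [rng.normal(size=3) for _ in range(5)]
states = rnn_forward(seq, Wxh, Whh, bh)
print(len(states), states[-1].shape)  # 5 (4,)
```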
Sep 16, 2024 · Schwenk, and Y. Bengio, Learning phrase representations using RNN encoder-decoder for statistical machine translation, arXiv preprint 1406.1078, 2014. Crossref.
5 days ago · The Encoder-Decoder architecture enabled flexible processing of these sequences, making it effective for tasks like machine translation, where input and output ...
Oct 2, 2024 · The encoder RNN processes an input sequence into a sequence of hidden vectors, and the decoder RNN processes the sequence of hidden vectors to an output ...
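The last two snippets describe the encoder-decoder pattern: one RNN compresses the input sequence into hidden vectors, and a second RNN unrolls from that summary to produce the output sequence. A bare-bones NumPy sketch of the idea follows; the weight shapes, initialization, and the fixed number of decoding steps are assumptions for illustration, not the paper's actual training setup:

```python
import numpy as np

def rnn_step(x, h, Wx, Wh, b):
    """Shared vanilla-RNN cell used by both encoder and decoder."""
    return np.tanh(Wx @ x + Wh @ h + b)

def encode(inputs, Wx, Wh, b):
    """Encoder RNN: fold the whole input sequence into one summary vector."""
    h = np.zeros(Wh.shape[0])
    for x in inputs:
        h = rnn_step(x, h, Wx, Wh, b)
    return h  # fixed-length representation of the source sequence

def decode(summary, steps, Wx, Wh, b, Wout):
    """Decoder RNN: start from the summary, feed back its own outputs."""
    h = summary
    y = np.zeros(Wout.shape[0])
    outputs = []
    for _ in range(steps):
        h = rnn_step(y, h, Wx, Wh, b)
        y = Wout @ h  # project hidden state to an output vector
        outputs.append(y)
    return outputs

# Toy dimensions: 3-dim inputs/outputs, 4-dim hidden state.
rng = np.random.default_rng(1)
d_in, d_h, d_out = 3, 4, 3
enc = (rng.normal(size=(d_h, d_in)) * 0.1,
       rng.normal(size=(d_h, d_h)) * 0.1, np.zeros(d_h))
dec = (rng.normal(size=(d_h, d_out)) * 0.1,
       rng.normal(size=(d_h, d_h)) * 0.1, np.zeros(d_h))
Wout = rng.normal(size=(d_out, d_h)) * 0.1
src = [rng.normal(size=d_in) for _ in range(6)]
summary = encode(src, *enc)
out = decode(summary, 4, *dec, Wout)
```

Note the input and output sequence lengths differ (6 in, 4 out), which is the flexibility the snippets attribute to this architecture.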
Sep 15, 2024 · Deep learning allows computational models that are composed of multiple processing layers to learn representations of data with multiple levels of abstraction.
Video for Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation
Sep 19, 2024 · proposed gated recurrent unit (GRU) in their paper "Learning Phrase Representations using ...
Duration: 14:22
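The video entry above refers to the gated recurrent unit (GRU) proposed in this paper. A single GRU step can be sketched as below, following the paper's formulation in which the new state interpolates between the previous state and a candidate activation; the parameter names, dimensions, and random initialization are illustrative assumptions:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, params):
    """One GRU step with update gate z, reset gate r, candidate h_tilde."""
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = sigmoid(Wz @ x + Uz @ h)              # update gate: keep vs. replace
    r = sigmoid(Wr @ x + Ur @ h)              # reset gate: forget old state
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))  # candidate activation
    return z * h + (1 - z) * h_tilde          # interpolate old and new state

# Toy run: input size 3, hidden size 4, sequence length 5.
rng = np.random.default_rng(2)
d_in, d_h = 3, 4
params = (
    rng.normal(size=(d_h, d_in)) * 0.1, rng.normal(size=(d_h, d_h)) * 0.1,  # Wz, Uz
    rng.normal(size=(d_h, d_in)) * 0.1, rng.normal(size=(d_h, d_h)) * 0.1,  # Wr, Ur
    rng.normal(size=(d_h, d_in)) * 0.1, rng.normal(size=(d_h, d_h)) * 0.1,  # Wh, Uh
)
h = np.zeros(d_h)
for x in [rng.normal(size=d_in) for _ in range(5)]:
    h = gru_step(x, h, params)
```

The gating is what distinguishes the GRU from the vanilla cell: when z is near 1 the unit copies its previous state forward, which helps preserve information over long sequences.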
Sep 30, 2024 · Learning phrase representations using RNN encoder-decoder for statistical machine translation. Chung, J. et al.