Examples of using 递归神经网络 in Chinese and their translations into English
我们将首先讨论一些关于“正常”前馈神经网络的重要事实,你需要知道,以正确理解递归神经网络。
We will first discuss some important facts about "normal" feed-forward neural networks that you need to know in order to understand recurrent neural networks properly.
至于回归的技术方面,各种类型的递归神经网络效果最好。
As for technical aspects of regression, various types of recurrent neural networks work best.
最常见的技术称为Word2Vec,但我将向您展示递归神经网络也可以用来创建词向量。
The most common technique for this is called Word2Vec, but I will show you how recurrent neural networks can also be used for creating word vectors.
最后,我们还将使用双向递归神经网络(BRNN)。
Finally, we will also be making use of bidirectional recurrent neural networks (BRNN).
它成立了一所AI学校,提供各种各样的课程,从哲学到伦理学,再到为序列问题开发递归神经网络。
It has launched an AI school that offers classes in everything from philosophy and ethics to building recurrent neural networks for sequencing problems.
超过60种可用的神经网络模块,允许构建卷积和递归神经网络以及任意图结构的网络。
Over 60 types of available neural network modules that allow you to build convolutional and recurrent neural networks, as well as networks with arbitrary graph structure.
This code pattern provides an introduction to a generative language model that uses long short-term memory (LSTM) layers and a recurrent neural network (RNN).
说到递归神经网络,我们不得不提及前面提到的LSTM模型。
Speaking of recurrent neural networks, we would be remiss not to bring up the LSTM model we mentioned earlier.
明天是属于深度学习、递归神经网络以及相似技术的,但是今天,长期建立的语言工程方法仍占上风。
Tomorrow belongs to deep learning, to recurrent neural networks and the like, but for today, long-established language-engineering approaches still prevail.
长短期记忆网络(LSTM)是递归神经网络的一种扩展,其基本上扩展了它们的记忆。
Long Short-Term Memory (LSTM) networks are an extension of recurrent neural networks that basically extends their memory.
这是一个利用递归神经网络(RNN)来预测图像在二维空间中缺失像素的系统。
This is a system that makes use of a Recurrent Neural Network (RNN) to predict the missing pixels in an image along two spatial dimensions.
递归神经网络等模型因其识别时间的能力而在自然语言理解中得以应用。
Models such as recurrent neural networks have found application in natural language understanding because of their ability to recognize time.
这个很简单的想法真的起作用了:训练递归神经网络是可能的。
This fairly simple idea actually worked: it was possible to train recurrent neural nets.
The method they proposed, presented in a paper published in Elsevier's Applied Energy journal, is based on a long short-term memory (LSTM) recurrent neural network.
These models are a special extension of recurrent neural networks (RNNs) that are more accurate, especially in noisy environments, and they are blazingly fast!
递归神经网络:数据可向任何方向流动的地方。
Recurrent Neural Networks: where data can flow in any direction.
一个简单的递归神经网络只适用于短期记忆。
The simple recurrent neural network works well only for short-term memory.
在2014年底,递归神经网络获得了更多的关注。
At the end of 2014, recurrent neural networks gained much more attention.
2017年4月,Facebook发布Caffe2,加入了递归神经网络等新功能。
In April 2017, Facebook announced Caffe2, which includes new features such as Recurrent Neural Networks.
几乎所有基于递归神经网络的令人兴奋的成果都是用它们实现的。
Almost all exciting results based on recurrent neural networks are achieved with them.
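Several of the example sentences above repeat the same technical claims: a simple recurrent neural network works well only for short-term memory, an LSTM extends that memory, and a bidirectional recurrent neural network (BRNN) lets data flow through the sequence in both directions. As a minimal, hypothetical sketch of those three points (assuming PyTorch, which none of the example sentences actually name), the layers can be compared side by side:

```python
# A hedged illustration, not taken from any of the sources quoted above.
import torch
import torch.nn as nn

batch, seq_len, n_features, hidden = 4, 10, 8, 16
x = torch.randn(batch, seq_len, n_features)  # a toy batch of input sequences

# Simple (Elman) RNN: works well only for short-term memory.
rnn = nn.RNN(n_features, hidden, batch_first=True)
out_rnn, h_rnn = rnn(x)

# LSTM: extends the recurrent network with a gated cell state, i.e. longer memory.
lstm = nn.LSTM(n_features, hidden, batch_first=True)
out_lstm, (h_n, c_n) = lstm(x)

# Bidirectional LSTM (a BRNN): processes the sequence forwards and backwards.
brnn = nn.LSTM(n_features, hidden, batch_first=True, bidirectional=True)
out_brnn, _ = brnn(x)

print(out_rnn.shape)   # torch.Size([4, 10, 16])
print(out_lstm.shape)  # torch.Size([4, 10, 16])
print(out_brnn.shape)  # torch.Size([4, 10, 32])  forward + backward concatenated
```

The doubled last dimension of the bidirectional output is simply the forward and backward hidden states concatenated at each time step.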