LSTMs in Chinese translation

lstms → LSTM

Examples of using LSTMs in English and their translations into Chinese

The CNN LSTM architecture involves using Convolutional Neural Network (CNN) layers for feature extraction on input data, combined with LSTMs to support sequence prediction.
CNN LSTM结构涉及在输入数据中使用卷积神经网络(CNN)层做特征提取,并结合LSTM来支持序列预测。
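As a rough illustration of the CNN LSTM pattern described in the sentence above, here is a minimal Keras sketch; the frame count, image size and layer widths are hypothetical choices for the example, not details taken from the quoted text.

```python
# Minimal CNN-LSTM sketch (hypothetical shapes): a TimeDistributed CNN
# extracts features from each frame, and an LSTM models the sequence.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(10, 64, 64, 1)),                                # 10 frames of 64x64 grayscale images
    layers.TimeDistributed(layers.Conv2D(16, 3, activation="relu")),   # CNN feature extraction per frame
    layers.TimeDistributed(layers.MaxPooling2D()),
    layers.TimeDistributed(layers.Flatten()),
    layers.LSTM(64),                                                   # sequence modelling over per-frame features
    layers.Dense(1, activation="sigmoid"),                             # sequence-level prediction
])
model.compile(optimizer="adam", loss="binary_crossentropy")
```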
With that in mind, let's take a moment to think about what RNNs, and particularly LSTMs, provide that makes them so ubiquitous in NLP.
考虑到这一点,让我们花点时间来思考RNN,特别是LSTM提供了什么,使它们在NLP中无处不在。
(2015) tested more than ten thousand RNN architectures, finding some that worked better than LSTMs on certain tasks.
(2015)则在超过1万种RNN架构上进行了测试,发现一些架构在某些任务上取得了比LSTM更好的结果。
Yu et al. (2019) find winning ticket initialisations also for LSTMs and Transformers in NLP and RL models.
Yu等人(2019)也在NLP和RL模型中发现了LSTM和Transformer的“中奖彩票”初始化。
RNN (if there is a densely connected unit and a nonlinearity, nowadays f is generally an LSTM or a GRU).
RNN(如果存在密集连接的单元和非线性,则现在的f通常是LSTM或GRU)。
Gated networks like LSTMs and GRUs, on the other hand, can handle comparatively longer sequences, but even these networks have their limits!
另一方面,像LSTM和GRU这样的门控网络可以处理相对较长的序列,但即使是这些网络也有其局限性!为了更好地理解这一问题,还可以研究梯度消失和梯度爆炸问题。
However, unlike standard RNN units, LSTMs can hang on to their memories, which have read/write properties akin to memory registers in a conventional computer.
然而,与标准的RNN单元不同,LSTM能够保留其记忆,这些记忆具有类似于传统计算机中内存寄存器的读/写特性。
Thankfully, LSTMs don't have this problem!
幸运的是,LSTM并没有这个问题!
Long short-term memory networks (LSTMs) address this problem.
长短期记忆网络(LSTM)解决了这个问题。
LSTMs are explicitly designed to avoid the long-term dependency problem.
LSTM被明确设计为避免长期依赖问题。
LSTMs can deal with sequences of hundreds of past inputs.
LSTM可以处理包含数百个过去输入的序列。
This way, LSTMs can selectively remember or forget things.
这样,LSTM可以选择性地记住或忘记事物。
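To make the "selectively remember or forget" behaviour above concrete, here is a minimal single-step LSTM cell in NumPy; the dimensions, parameter layout and function names are illustrative assumptions rather than anything taken from the quoted sentences.

```python
# One LSTM step in NumPy: the forget gate f decides how much old memory to keep,
# the input gate i decides how much new content to write, and the output gate o
# decides how much of the memory to expose.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    z = W @ x + U @ h_prev + b
    f, i, o, g = np.split(z, 4)
    f, i, o = sigmoid(f), sigmoid(i), sigmoid(o)   # gates in (0, 1)
    g = np.tanh(g)                                 # candidate memory content
    c = f * c_prev + i * g                         # selectively forget old / remember new
    h = o * np.tanh(c)                             # hidden state read out from the cell
    return h, c

# Hypothetical sizes: 4-dim input, 3-dim hidden state, random parameters.
rng = np.random.default_rng(0)
n_in, n_hid = 4, 3
W = rng.standard_normal((4 * n_hid, n_in))
U = rng.standard_normal((4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)
h, c = np.zeros(n_hid), np.zeros(n_hid)
h, c = lstm_step(rng.standard_normal(n_in), h, c, W, U, b)
```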
LSTMs are specifically designed to avoid the problem of long-term dependencies.
LSTM被明确设计为避免长期依赖问题。
LSTMs are explicitly designed to avoid the long-term dependency problem.
LSTM被明确设计用来避免长期依赖性问题。
Deep Learning for NLP: ANNs, RNNs and LSTMs explained!
NLP的深度学习:ANN、RNN和LSTM详解!
LSTMs are specifically designed to avoid the problem of long-term dependencies.
LSTM就是专门为避免长程依赖问题而设计的。