Examples of using 的激活函数 ("activation function") in Chinese and their translations into English
例如,以下是一些常用的预测建模问题类型,以及它们可以在输出层使用的结构和标准的激活函数:
For example, below are some common predictive modeling problem types and the structure and standard activation function that you can use in the output layer.
我们继续用softmax来作为最后一层的激活函数,因为它在分类问题中工作的最好。
We keep softmax as the activation function on the last layer because that is what works best for classification.
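To make the softmax sentence above concrete, here is a minimal NumPy sketch of the function; the names and test values are illustrative, not taken from the quoted text.

import numpy as np

def softmax(z):
    # Subtract the max before exponentiating for numerical stability.
    e = np.exp(z - np.max(z))
    return e / e.sum()

scores = np.array([2.0, 1.0, 0.1])
probs = softmax(scores)
print(probs, probs.sum())  # roughly [0.659 0.242 0.099], summing to 1

Softmax suits the last layer of a classifier because it turns arbitrary scores into a probability distribution over the classes.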
它们使用了线性修正单位(ReLUs,使用的激活函数是max(0,x))。
They have rectified linear units (ReLUs, with activation function max(0, x)).
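The formula max(0, x) quoted above is the whole definition of a rectified linear unit; a one-line NumPy sketch (names illustrative):

import numpy as np

def relu(x):
    # Elementwise max(0, x): negative inputs are clamped to zero.
    return np.maximum(0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 3.0])))  # [0. 0. 0. 3.]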
One big problem with RNNs is the vanishing (or exploding) gradient problem where, depending on the activation functions used, information rapidly gets lost over time.
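As a rough illustration of how the vanishing-gradient problem depends on the activation function, the sketch below multiplies tanh derivatives across time steps; the step count and input value are arbitrary choices for the demonstration.

import numpy as np

def tanh_grad(x):
    # d/dx tanh(x) = 1 - tanh(x)^2, which is at most 1 and usually well below it.
    return 1.0 - np.tanh(x) ** 2

grad = 1.0
for _ in range(50):          # 50 backpropagation steps through time
    grad *= tanh_grad(1.0)   # each step scales the gradient by about 0.42
print(grad)                  # on the order of 1e-19: the signal has vanished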
这里我们使用的是tanh作为隐藏层的激活函数。
Here we use tanh as the activation function for the hidden layer.
非线性的激活函数可以帮助我们处理非线性的假设。
A nonlinear activation function is what allows us to fit nonlinear hypotheses.
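The claim above can be checked numerically: without a nonlinear activation, two stacked linear layers collapse into a single linear map, so nothing nonlinear can be fit. A small sketch with random placeholder weights:

import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(2, 4))
x = rng.normal(size=3)

two_layers = W2 @ (W1 @ x)        # two linear layers, no activation between them
one_layer = (W2 @ W1) @ x         # one linear layer with merged weights
print(np.allclose(two_layers, one_layer))  # True: the extra depth added nothing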
浅谈深度学习中的激活函数 - The Activation Function in Deep Learning.
Most commonly used activation functions in deep learning.
在中间层上,我们将使用最经典的激活函数:sigmoid。
On intermediate layers, however, we will use the most classical activation function: the sigmoid.
为此,我们将使用名为“softmax”的激活函数。
Instead, we are going to use the softmax function.
但在中间层,我们要使用最经典的激活函数:sigmoid函数。
On intermediate layers, however, we will use the most classical activation function: the sigmoid.
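For reference alongside the two sigmoid sentences above, a minimal sketch of the classical sigmoid (names illustrative):

import numpy as np

def sigmoid(z):
    # Squashes any real input into the interval (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

print(sigmoid(np.array([-5.0, 0.0, 5.0])))  # approximately [0.007 0.5 0.993]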
我们初始化一种使用的RNN单元格(大小100)和我们想要的激活函数的类型。
We initialize a type of RNN cell to use (size 100) and the type of activation function we want.
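The sentence above comes from some specific tutorial whose code is not quoted here; as a hedged equivalent, this is how the same initialization could look in Keras (the library choice is an assumption, not the source's):

import tensorflow as tf

# An RNN cell of size 100 with an explicit choice of activation function.
cell = tf.keras.layers.SimpleRNNCell(units=100, activation="tanh")
layer = tf.keras.layers.RNN(cell)  # wrap the cell so it can run over sequences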
我们只将Dropout作用在模型的训练阶段,即我们可以把模型的激活函数修改为:.
It's possible to model the application of Dropout, in the training phase only, to the given projection as a modified activation function.
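The sentence above describes viewing dropout as a modified activation function that acts only during training; a sketch of that view, with the keep probability and base nonlinearity chosen for illustration:

import numpy as np

rng = np.random.default_rng(0)

def dropout_activation(z, p_keep=0.8, training=True):
    # "Modified activation": apply the base nonlinearity, then, in training
    # only, zero each unit with probability 1 - p_keep and rescale the rest
    # (inverted dropout) so the expected activation is unchanged.
    a = np.maximum(0, z)  # base activation (ReLU here, as an assumption)
    if not training:
        return a
    mask = rng.random(a.shape) < p_keep
    return a * mask / p_keep

z = np.array([0.5, -1.0, 2.0, 0.3])
print(dropout_activation(z))                  # some units randomly zeroed
print(dropout_activation(z, training=False))  # plain activation at test time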
在他们的网络中,研究人员使用了一种有效的激活函数,称为整流线性单元(relu)。
In their network, the researchers used an efficient activation function called a rectified linear unit (ReLU).
当我们使用一个不同的激活函数,最大的变化是公式(5)中用于偏导数的特定值的改变。
The main thing that changes when we use a different activation function is the particular values of the partial derivatives in Equation (5).
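"Equation (5)" belongs to the quoted source, not to this page, but the point generalizes: changing the activation changes the derivative factor that enters the backpropagation equations. A sketch comparing two such factors:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)          # sigmoid'(z) = sigmoid(z) * (1 - sigmoid(z))

def tanh_prime(z):
    return 1.0 - np.tanh(z) ** 2  # tanh'(z) = 1 - tanh(z)^2

z = 0.5
print(sigmoid_prime(z), tanh_prime(z))  # same input, different derivative values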
CEC被连接到许多非线性自适应单元上(有一些单元具有乘法的激活函数),因此需要学习非线性行为。
CECs are connected to several nonlinear adaptive units (some with multiplicative activation functions) needed for learning nonlinear behavior.
Even leading techniques can calculate only polynomial functions, a nonstarter for the many activation functions in machine learning that are non-polynomial.
在早期,Sigmoid函数和tanh函数是人们经常使用的激活函数。
Earlier, sigmoid and tanh were the most widely used activation functions.