ACTIVATION FUNCTIONS - Translation into Chinese

[ˌæktɪˈveɪʃn ˈfʌŋkʃnz]
激活函数 (activation function, the mathematical sense used in machine learning)
激活功能 (activation feature, e.g. of a device or system)

Examples of the use of Activation functions in English and their translations into Chinese

Although newer activation functions are gaining traction, most deep neural networks these days use ReLU or one of its closely related variants.
尽管较新的激活函数越来越受欢迎,如今大多数深度神经网络仍在使用ReLU或与其密切相关的某个变体。
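As an illustrative aside (not part of the original example set): ReLU and a closely related variant such as Leaky ReLU can be sketched in a few lines of Python. The function names and the 0.01 negative slope are conventional choices assumed here, not taken from the source.

```python
import numpy as np

def relu(x):
    # ReLU: element-wise max(0, x)
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: a closely related variant that keeps a small
    # slope alpha for negative inputs instead of zeroing them
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))        # [0.    0.    0.    1.5]
print(leaky_relu(x))  # [-0.02  -0.005  0.     1.5]
```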
ReLUs are often used as activation functions in Deep Neural Networks.
ReLU常被用作深度神经网络的激活函数。
Indeed, the problem is harder even than I have described, for there are infinitely many possible activation functions.
实际上,这个问题比我所描述的还要困难,因为可能的激活函数有无穷多种。
The basic structure of Artificial Neural Networks was presented, as well as some of the most commonly used activation functions.
其中介绍了人工神经网络的基本结构,以及一些最常用的激活函数。
These assumptions appear everywhere in deep learning literature, from weight initialization, to activation functions, to the optimization algorithms which train the network.
这些假设在深度学习文献中随处可见,从权重初始化,到激活函数,再到训练网络的优化算法。
In particular, neural layers, cost functions, optimizers, initialization schemes, activation functions, regularization schemes are all standalone modules that you can combine to create new models.
特别是神经网络层、损失函数、优化器、初始化方法、激活函数、正则化方法,它们都是可以结合起来构建新模型的模块。
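This sentence describes a modular design in the style of Keras; below is a minimal sketch, assuming tensorflow.keras is available, of combining a layer, an activation, an initializer, a regularizer, an optimizer, and a loss into one model. All specific choices are illustrative.

```python
from tensorflow import keras
from tensorflow.keras import layers

# Each building block (layer, activation, initializer, regularizer,
# optimizer, loss) is an independent module combined into one model.
model = keras.Sequential([
    layers.Dense(64, activation="relu",
                 kernel_initializer="he_normal",
                 kernel_regularizer=keras.regularizers.l2(1e-4)),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
```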
Most neural network architectures require standardized data, because the most common activation functions of the network's neurons, such as tanh or sigmoid, are defined on the [-1, 1] or [0, 1] interval respectively.
大多数神经网络架构都需要标准化数据,因为tanh和sigmoid等大多数神经元的激活函数分别定义在[-1,1]或[0,1]区间内。
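A minimal sketch of those ranges, assuming only NumPy (the sigmoid below follows the standard formula 1/(1+e^(-x)); nothing here comes from the source page): tanh stays inside (-1, 1) and the sigmoid inside (0, 1), which is why inputs are usually standardized to a comparable scale.

```python
import numpy as np

def sigmoid(x):
    # Standard logistic sigmoid: 1 / (1 + e^(-x)), output in (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([-10.0, -1.0, 0.0, 1.0, 10.0])
print(np.tanh(x))   # values bounded in (-1, 1)
print(sigmoid(x))   # values bounded in (0, 1)
```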
One big problem with RNNs is the vanishing (or exploding) gradient problem, where, depending on the activation functions used, information rapidly gets lost over time.
RNN的一个大问题是梯度消失(或爆炸)问题:取决于所使用的激活函数,信息会随着时间的推移而迅速丢失。
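A toy numeric sketch of the vanishing case (an assumed illustration, not from the source): backpropagating through T timesteps multiplies the gradient by the activation's derivative at each step, and the sigmoid's derivative never exceeds 0.25, so the product shrinks geometrically.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # Derivative s(x) * (1 - s(x)); its maximum value is 0.25 at x = 0
    s = sigmoid(x)
    return s * (1.0 - s)

grad = 1.0
for t in range(1, 21):
    grad *= sigmoid_grad(0.0)  # best case: 0.25 per timestep
    if t % 5 == 0:
        print(f"after {t:2d} timesteps: {grad:.2e}")
# even in this best case the gradient is ~1e-12 after 20 steps
```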
Option A is true, because an activation function can be a reciprocal function.
选项A是正确的,因为激活函数可以是倒数函数。
What is an activation function, and why use them?
什么是激活函数,为什么要使用它们?
In fact, the screen switch also has the system activation function.
事实上,屏幕开关也具有系统激活功能。
There are other functions which can replace this activation function.
还有其他功能可以代替这种激活功能。
You still need to add a bias and feed the result through an activation function.
您仍然需要加上偏置,并将结果传入激活函数。
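A minimal sketch of that step in NumPy (W, x, and b below are invented illustrative values): take the weighted sum, add the bias, and feed the result through an activation function.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

W = np.array([[0.5, -0.2],
              [0.1,  0.8]])   # illustrative weight matrix
x = np.array([1.0, 2.0])      # input vector
b = np.array([0.1, -0.3])     # bias vector

# Weighted sum, plus bias, fed through the activation
y = relu(W @ x + b)
print(y)  # [0.2 1.4]
```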
What other activation functions can we use besides the Sigmoid function?
除了Sigmoid函数之外,我们还可以使用哪些激活函数?
Why an Activation Function?
为什么需要激活函数?
Selection of Activation Function.
激活函数的选择。
If we use the ReLU activation function to predict the price of a house based on its size, this is how the predictions may look.
如果我们使用ReLU激活函数根据房子的大小来预测房价,预测结果可能如下所示。
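A hedged sketch of what such predictions could look like (the single-neuron model and every number below are invented for illustration): the ReLU output is zero until the weighted size crosses a threshold, then grows linearly with size.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

# Hypothetical one-neuron model: price = relu(w * size + b)
w, b = 2.0, -100.0                            # invented parameters
sizes = np.array([30.0, 50.0, 80.0, 120.0])   # assumed house sizes
prices = relu(w * sizes + b)
print(prices)  # [  0.   0.  60. 140.] -- flat at zero, then linear
```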