SOFTMAX in English translation

Examples of using Softmax in Chinese and their translations into English

如果你很熟悉logistic回归,可以把softmax看作是它在多类别上的一般化。
If you're familiar with logistic regression, you can think of softmax as its generalization to multiple classes.
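To see the generalization concretely (a worked check, not part of the original example): with only two classes and logits $z_1, z_2$, the softmax probability of the first class reduces to the logistic sigmoid of the difference of the logits,

    \frac{e^{z_1}}{e^{z_1} + e^{z_2}} = \frac{1}{1 + e^{-(z_1 - z_2)}} = \mathrm{sigmoid}(z_1 - z_2),

so two-class softmax and logistic regression describe the same model.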
如果模型要解决多类别分类问题,则对数通常变成softmax函数的输入。
If the model is solving a multi-class classification problem, logits typically become an input to the softmax function.
在前一节中,我们利用自编码器来学习输入至softmax或logistic回归分类器的特征。
In the previous section, you used an autoencoder to learn features that were then fed as input to a softmax or logistic regression classifier.
其它常见激活函数还有对数几率(又称作sigmoid),tanh和softmax
Other activation functions you will see are the logistic (often called the sigmoid), tanh, and softmax functions.
基于采样的方法则是完全去掉softmax层,优化其它目标函数来近似softmax
Sampling-based approaches on the other hand completely do away with the softmax layer and instead optimise some other loss function that approximates the softmax.
我们输入一个33层卷积的结构,然后是完全连接的层和softmax
We arrive at an architecture which is 33 layers of convolution followed by a fully connected layer and a softmax.
基于sampling的方法则完全抛弃了softmax层,而是优化其它形式的损失函数来代替softmax
Sampling-based approaches on the other hand completely do away with the softmax layer and instead optimise some other loss function that approximates the softmax.
然后,将证据转换成我们预测的概率y通过使用“softmax”函数:.
We then convert the evidence tallies into our predicted probabilities, y, using the softmax function.
例如,ImageNet上经过预先训练的网络带有1000个类别的softmax层。
For example, a network pre-trained on ImageNet comes with a softmax layer with 1000 categories.
后续在第6章中,我们有时会使用softmax输出层搭配log-likelihood代价函数。
Later, in Chapter 6, we will sometimes use a softmax output layer, with log-likelihood cost.
但是如果我们使用不同的,我们会得到不同的函数,尽管如此,最后得到的结果也和softmax很相似。
But if we use a different value of $c$, we get a different function, which is nonetheless qualitatively rather similar to the softmax.
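The surrounding context for $c$ is truncated in the example above; presumably it refers to a scaled ("temperature"-style) softmax,

    \sigma(z)_i = \frac{e^{c z_i}}{\sum_j e^{c z_j}},

which recovers the standard softmax at $c = 1$ and approaches a hard arg-max as $c$ grows.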
而不是通过逻辑函数,它通过softmax函数,它被写为,.
Instead of passing through the logistic function, it passes through the softmax function, which is written as follows.
要将这些对数转换为每个类别的概率,请使用softmax函数:.
To convert these logits to a probability for each class, use the softmax function.
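As a minimal sketch of that conversion (using NumPy; the logit values here are made up for illustration):

    import numpy as np

    def softmax(logits):
        # Shift by the max logit for numerical stability before exponentiating.
        shifted = logits - np.max(logits)
        exps = np.exp(shifted)
        return exps / np.sum(exps)

    logits = np.array([2.0, 1.0, 0.1])   # hypothetical scores for three classes
    probs = softmax(logits)
    print(probs)                          # ~[0.659 0.242 0.099], sums to 1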
例如,在PyTorch中,我会混淆NLLLoss和CrossEntropyLoss,因为一个需要softmax输入,而另一个不需要。
For example, in PyTorch I would mix up the NLLLoss and CrossEntropyLoss as the former requires a softmax input and the latter doesn't.
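A small sketch of that difference (assuming a standard PyTorch install; the tensors are invented): torch.nn.functional.cross_entropy works on raw logits and applies log-softmax internally, while torch.nn.functional.nll_loss expects log-probabilities, so log_softmax must be applied first.

    import torch
    import torch.nn.functional as F

    logits = torch.tensor([[2.0, 1.0, 0.1]])   # one sample, three classes
    target = torch.tensor([0])                  # true class index

    # cross_entropy takes raw logits (log-softmax is applied internally).
    loss_ce = F.cross_entropy(logits, target)

    # nll_loss expects log-probabilities, so apply log_softmax first.
    loss_nll = F.nll_loss(F.log_softmax(logits, dim=1), target)

    print(loss_ce.item(), loss_nll.item())      # the two values match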
在第一个例子中,三个类别是互斥的,因此更适于选择softmax回归分类器。
In the first case, the classes are mutually exclusive, so a softmax regression classifier would be appropriate.
这次,Softmax的40人的团队和NamcoBandai合作,通过使用最新的虚幻引擎3技术来创建针对Xbox360平台的Magnacarta2。
This time around, a team of 40 at Softmax partnered with Namco Bandai to create Magnacarta 2 for Xbox 360 using the latest Unreal Engine 3 technology.
softmax回归.
Softmax Regression.
Softmax函数定义为:.
The softmax function is defined as.
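The formula itself is cut off in the example above; the standard definition, for a vector of logits $z = (z_1, \dots, z_K)$, is

    \sigma(z)_i = \frac{e^{z_i}}{\sum_{j=1}^{K} e^{z_j}}, \qquad i = 1, \dots, K,

so each output lies in $(0, 1)$ and the outputs sum to 1.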
LogSumExp、Softmax函数、广义均值。
LogSumExp, softmax function, generalized mean.
其中softmax定义为:.
Softmax is defined as.