IN NEURAL NETWORKS in Chinese translation

[ɪn ˈnjʊərəl ˈnetwɜːks]
神经网络

Examples of using “in neural networks” in English and their translations into Chinese

If you have been told many times that hidden layers in neural networks “abstract functions”, you should be a little bit surprised by this.
如果你曾数次听说过神经网络中的隐藏层可以「抽象化函数」,那么你应该对此感到惊讶。
The resurgent research in neural networks has given rise to the invention of models with thousands of layers.
神经网络的复兴研究导致了具有数千层的模型的发明。
Now, dropout layers have a very specific function in neural networks.
现在,Dropout 层在神经网络中有一个非常特殊的功能。
In neural networks, we restrict the search to a continuous subset of the program space.
在神经网络示例中,我们将搜索限制在程序空间的一个连续子集里。
In 2017, we will see more reinforcement learning in neural networks, and more research on neural networks in NLP & vision.
2017 年,我们将看到更多的强化学习在神经网络上的应用,以及更多神经网络领域的自然语言处理和视觉的研究。
But in neural networks there are large numbers of parameters and hyper-parameters, and extremely complex interactions between them.
但是在神经网络中,存在大量的参数和超参数及其间极其复杂的交互。
Attention in Neural Networks has a long history, particularly in image recognition.
注意机制在神经网络中的使用由来已久,特别是用在图像识别中。
The dot product is a very common operation in neural networks, so let's see it in action.
点乘在神经网络中是一种非常常用的运算,所以一起看看它。
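As an aside to the "see it in action" example above (not part of the original dictionary entries), a minimal sketch in plain Python of the dot product as a neural network uses it, with all names being illustrative:

```python
# Sketch: a neuron's pre-activation is the dot product of its
# weight vector and its input vector.
def dot(weights, inputs):
    """w · x: sum of element-wise products."""
    return sum(w * x for w, x in zip(weights, inputs))

# One neuron with three inputs.
weights = [0.2, -0.5, 1.0]
inputs = [1.0, 2.0, 3.0]
print(dot(weights, inputs))  # ≈ 2.2
```

In practice this operation is performed in bulk as a matrix multiplication, one dot product per neuron per example, which is why hardware that accelerates matrix multiplication speeds up neural networks.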
New and improved training techniques such as unsupervised pre-training and layer-wise training have caused a resurgence in neural networks.
近年来,新的、改进的训练技术,如无监督的预训练和分层贪婪训练,复苏了人们对神经网络的兴趣。
Many of the most magical pieces of consumer technology we have today are thanks to advances in neural networks and machine learning.
我们今天所拥有的许多最神奇的消费技术,都要归功于神经网络和机器学习的进步。
Optical computers are also especially suitable for fast matrix multiplication, one of the key operations in neural networks.
光学计算机也特别适用于快速矩阵乘法,这是神经网络中的关键运算之一。
In the mid-1980s and early 1990s, many important architectural advancements were made in neural networks.
在20世纪80年代中期和90年代初期,许多重要的模型架构进步都是在神经网络中进行的。
Hence, switching from a sigmoid activation function to ReLU (Rectified Linear Unit) is one of the biggest breakthroughs we have seen in neural networks.
因此,从sigmoid激活函数转换到ReLU(整流线性单元)是我们在神经网络中看到的最大突破之一。
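As an illustrative aside (again, not part of the original entries), the two activation functions named above can be sketched in a few lines of plain Python:

```python
import math

# sigmoid squashes inputs into (0, 1) and saturates for large |x|,
# which makes gradients vanish in deep networks; ReLU passes positive
# inputs through unchanged, which avoids that saturation.
def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def relu(x):
    return max(0.0, x)

print(sigmoid(0.0))           # 0.5
print(relu(-3.0), relu(3.0))  # 0.0 3.0
```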
It works directly in the browser, supports multiple learning techniques, and is rather low-level, making it suitable for people with more experience in neural networks.
它可以直接在浏览器中跑起来,支持多种学习技术,而且相当底层,所以适合于对神经网络具有较多经验的人来使用。
To gain expertise in working with neural networks, try out our deep learning practice problem: Identify the Digits.
想要获取神经网络的专业知识,请尝试深度学习的练习题:Identify the Digits。
Recent developments in neural network (aka “deep learning”) approaches have greatly advanced the performance of these state-of-the-art visual recognition systems.
近期神经网络(也就是“深度学习”)方法上的进展极大地提升了这些代表当前发展水平的视觉识别系统的性能。
The development of stable and speedy optimizers is a major field in neural network and deep learning research.
稳定、快速的优化器的开发,一直是神经网络和深度学习领域的重要研究。
Overcoming catastrophic forgetting in neural networks.
论文:克服神经网络中的灾难性遗忘.
Activation functions are an important concept in neural networks.
激活函数是神经网络中极其重要的概念。
