TENSORS in Chinese translation

张量
tensor
tensors

Examples of using Tensors in English and their translations into Chinese

Constant(12): the Tensor object will promote all math operations to tensor operations, and as such all return values will be tensors.
Constant(12)Tensor对象将把所有数学运算转化为张量运算,因此所有返回值皆将为张量。
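A minimal Python sketch of this behaviour, assuming the TensorFlow API (tf.constant); the values are only illustrative:

```python
import tensorflow as tf

a = tf.constant(12)               # a rank-0 (scalar) Tensor object
b = a * 3 + 1                     # plain Python ints are promoted; the math runs as tensor ops
print(isinstance(b, tf.Tensor))   # True -- the return value is a tensor
print(b.numpy())                  # 37
```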
Because TensorFlow.js uses the GPU to accelerate math operations, it's necessary to manage GPU memory when working with tensors and variables.
TensorFlow.js使用GPU来加速数学运算,因此在处理张量和变量时需要管理GPU内存。
Batch_first: If True, then the input and output tensors are provided as (batch, seq, feature).
Batch_first:如果为True,那么输入和输出Tensor的形状为(batch,seq,feature)。
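A short sketch of this parameter, assuming PyTorch's nn.LSTM; the sizes are arbitrary:

```python
import torch
import torch.nn as nn

# With batch_first=True the LSTM expects inputs shaped (batch, seq, feature)
lstm = nn.LSTM(input_size=10, hidden_size=20, batch_first=True)
x = torch.randn(32, 5, 10)   # (batch=32, seq=5, feature=10)
out, (h, c) = lstm(x)
print(out.shape)             # torch.Size([32, 5, 20]) -- output is batch-first as well
```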
This explains why you often hear that scalars are tensors of rank 0: since they have no direction, you can represent them with one number.
这就是为什么你常会听到人们说标量是0秩的张量:因为没有方向,你可以使用一个数字表示它。
Multiprocessing allows all the tensors sent through queues, or shared via other mechanisms, to be moved to shared memory.
Multiprocessing可以将所有通过队列发送或通过其它机制共享的张量移动到共享内存。
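One possible sketch of this, assuming torch.multiprocessing: passing a tensor through a queue moves its storage to shared memory, so an in-place change made by the child process is visible to the parent.

```python
import torch
import torch.multiprocessing as mp

def worker(q):
    t = q.get()      # the received tensor is backed by shared memory
    t.add_(1)        # an in-place update is visible to the parent process

if __name__ == "__main__":
    q = mp.Queue()
    x = torch.zeros(3)
    p = mp.Process(target=worker, args=(q,))
    p.start()
    q.put(x)         # sending x through the queue moves it to shared memory
    p.join()
    print(x)         # tensor([1., 1., 1.])
```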
The modern (component-free) approach views tensors initially as abstract objects, expressing some definite type of multi-linear concept.
现代(无分量)方法把张量首先视为抽象对象,表达了多线性概念的某种确定类型。
Remember that our model accepts tensors of the shape [N, 28, 28, 1].
记住,我们的模型接受形状为[N,28,28,1]的张量。
Feeding is a mechanism in the tf.Session API that allows you to substitute different values for one or more tensors at run time.
Feeding是TensorFlow Session API的一种机制,它允许你在运行时用不同的值替换一个或多个tensor的值。
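A minimal sketch of feeding, using the TF1-style API exposed through tf.compat.v1; the placeholder and values are illustrative:

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

x = tf.compat.v1.placeholder(tf.float32, shape=())   # a tensor with no fixed value
y = x * 2.0

with tf.compat.v1.Session() as sess:
    # feed_dict substitutes concrete values for the placeholder tensor at run time
    print(sess.run(y, feed_dict={x: 3.0}))   # 6.0
    print(sess.run(y, feed_dict={x: 5.0}))   # 10.0
```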
Tidy executes a function and purges any intermediate tensors created, freeing up their GPU memory.
Tidy执行一个函数并清除所有创建的中间张量,释放它们的GPU内存。
TensorFlow programs use the tensor data structure to represent all data; only tensors are passed between operations in the computation graph.
TensorFlow程序使用tensor数据结构来代表所有的数据;计算图中,操作间传递的数据都是tensor。
And then you can have tensors with 3, 4, 5 or more dimensions.
然后你可以有3,4,5或者更多维的张量
Because tensors are immutable, these ops do not change their values; instead, ops return new tensors.
由于Tensor是不可改变的,这些Op不会改变它们的值,而会返回新的Tensor
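A small sketch of this immutability in the TensorFlow Python API:

```python
import tensorflow as tf

x = tf.constant([1, 2, 3])
y = tf.add(x, 10)      # does not modify x; a new tensor is returned
print(x.numpy())       # [1 2 3]  -- unchanged
print(y.numpy())       # [11 12 13]
```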
It is possible, however, to serialize arbitrary data structures as strings and store those in tf.Tensors.
但是,可以将任意数据结构序列化为string并将其存储在tf.Tensor中。
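One way to sketch this, assuming Python's pickle as the serializer (the original sentence does not specify a format):

```python
import pickle
import tensorflow as tf

data = {"name": "example", "values": [1, 2, 3]}   # an arbitrary Python structure
blob = pickle.dumps(data)                         # serialize it to bytes
t = tf.constant(blob)                             # stored as a scalar string tensor
print(t.dtype)                                    # tf.string
print(pickle.loads(t.numpy()))                    # round-trips back to the original dict
```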
Add(a, b) will create an operation node that takes two tensors a and b as input and produces their sum c as output.
Add(a,b)将创建一个操作节点,它以两个张量a和b作为输入,并产生它们的和c作为输出。
TensorFlow and Numpy are friends: when preparing the computation graph, you only manipulate TensorFlow tensors and commands such as tf.matmul, tf.reshape and so on.
TensorFlow和Numpy是朋友:在准备计算图时,你只需要操纵TensorFlow张量和命令,比如tf.matmul,tf.reshape等。
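A brief sketch of mixing the two, assuming the TensorFlow Python API and NumPy; the shapes are arbitrary:

```python
import numpy as np
import tensorflow as tf

a = np.arange(6, dtype=np.float32).reshape(2, 3)   # a NumPy array
b = tf.reshape(a, (3, 2))                          # TensorFlow ops accept it directly
c = tf.matmul(a, b)                                # (2, 3) x (3, 2) -> (2, 2) tensor
print(c.numpy())                                   # and back to a NumPy array
```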
A second-order tensor is a matrix (2 indices); third-order tensors (3 indices) and higher are called higher-order tensors (more than 3 indices).
二阶张量是矩阵(2个指标);三阶张量(3个指标)及更高阶的称为高阶张量(超过3个指标)。
In an N-dimensional space, scalars will still require only one number, while vectors will require N numbers, and tensors will require N^R numbers.
在一个N维空间中,标量仍然只需要一个数字,而向量则需要N个数字,张量则需要N^R个数字。
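A quick numerical check of the N^R claim with NumPy (N and R chosen arbitrarily):

```python
import numpy as np

N, R = 3, 4                        # 3-dimensional space, rank-4 tensor
scalar = np.float64(1.0)           # rank 0: always one number
vector = np.zeros(N)               # rank 1: N numbers
tensor = np.zeros((N,) * R)        # rank R: N^R numbers
print(vector.size, tensor.size)    # 3 81
print(tensor.size == N ** R)       # True
```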
However, for constructing low-rank tensors, we recommend using the following functions to enhance code readability: tf.scalar, tf.tensor1d, tf.tensor2d, tf.tensor3d and tf.tensor4d.
但是,为了构造低秩的Tensor,建议使用tf.scalar、tf.tensor1d、tf.tensor2d、tf.tensor3d和tf.tensor4d,这样也会增强代码的可读性。
Tensors are important in physics and engineering.
张量在物理和工程学中很重要。
Returns the current strategy for sharing CPU tensors.
返回用于共享CPU张量的当前策略.
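A short sketch of querying and changing that strategy, assuming PyTorch's torch.multiprocessing:

```python
import torch.multiprocessing as mp

print(mp.get_sharing_strategy())        # current strategy, e.g. 'file_descriptor' on Linux
print(mp.get_all_sharing_strategies())  # strategies supported on this platform

# Switch strategies, e.g. when running into open-file-descriptor limits
mp.set_sharing_strategy("file_system")
print(mp.get_sharing_strategy())        # 'file_system'
```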