随机森林 - translation into English

random forest
随机森林
random forests
随机森林

Examples of using 随机森林 in Chinese and their translations into English

在每个分割点处要搜索的特征的数量被指定为随机森林算法的参数。
The number of features to be searched at each split point is specified as a parameter to the random forest algorithm.
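For illustration only (nothing in the source specifies a library), that parameter is usually called mtry in R's randomForest and max_features in scikit-learn; a minimal Python sketch with assumed data:

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Synthetic data, purely illustrative.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# max_features=4: only 4 randomly chosen features are searched at each split point.
forest = RandomForestClassifier(n_estimators=100, max_features=4, random_state=0)
forest.fit(X, y)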
在这个比较中,在训练数据中表现最好的预测方法是排名方法和随机森林
Within this comparison, the best-performing prediction methods on the training data turn out to be the ranking methods and the random forests.
我使用R语言“随机森林包”中的Randomforest()函数,该函数采用非参数Breiman随机森林算法来产生回归模型。
I used the randomForest() function from the R package "randomForest", which uses Breiman's non-parametric random forest algorithm to produce regression models.
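The R call itself is not shown in the source; as a rough Python analogue (an assumption on my part), scikit-learn's RandomForestRegressor fits the same kind of non-parametric regression model:

import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Illustrative data standing in for the unspecified dataset in the example.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(400, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.2, size=400)

model = RandomForestRegressor(n_estimators=500, random_state=0)
model.fit(X, y)
print(model.predict([[0.5]]))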
在每个分叉点要搜索的特征数量被指定为随机森林算法的一个参数。
The number of features to be searched at each split point is specified as a parameter to the random forest algorithm.
黑匣子(不透明)模型:深度神经网络、随机森林和梯度增强机器可以在此类别中考虑。
Black box (opaque) models: Deep neural networks, random forests, and gradient boosting machines can be considered in this category.
Jared:最近看到的最流行的算法和模型包括弹性网络(ElasticNet)、决策树和随机森林
Jared: The most popular algorithms and models I have seen lately are the Elastic Net, Decision Trees and Random Forests.
集成的方法,如随机森林(RF)或梯度提升树(GBM),则能结合许多独立训练树的预测。
Ensemble methods, such as Random Forests (RF) and Gradient Boosted Trees (GBM), combine predictions from many individual trees.
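As a quick sketch in Python (the sentence names no library, so the scikit-learn classes below are an assumption), the two ensemble families sit side by side:

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=15, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Both models aggregate many individual trees, but train them differently:
# RF grows trees independently on bootstrap samples, GBM grows them sequentially.
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
gbm = GradientBoostingClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("RF:", rf.score(X_test, y_test), "GBM:", gbm.score(X_test, y_test))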
随机森林的缺点是,学习速度可能会很慢(取决于参数设置),并且不能对已经生成的模型进行迭代改进。
With Random Forest, however, learning may be slow (depending on the parameterization) and it is not possible to iteratively improve the generated models.
所以随机森林模型产生的最终决策是所有决策树投票的结果。
So the final decision produced by the random forest model is the result of voting by all the decision trees.
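To make the voting idea concrete, each fitted tree can be polled directly (a sketch with assumed data; note that scikit-learn's predict() actually averages class probabilities rather than counting hard votes):

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=300, n_features=10, random_state=0)
forest = RandomForestClassifier(n_estimators=25, random_state=0).fit(X, y)

# One vote per tree for every sample, then take the majority (binary labels 0/1).
votes = np.stack([tree.predict(X) for tree in forest.estimators_])
majority = (votes.mean(axis=0) >= 0.5).astype(int)
print("agreement with forest.predict:", np.mean(majority == forest.predict(X)))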
随机森林(RF)和梯度增强树(GBM)等集成方法结合了许多单独树的特性。
Ensemble methods, such as Random Forests (RF) and Gradient Boosted Trees (GBM), combine predictions from many individual trees.
基于树的方法,例如随机森林和梯度提升机(分别是左下角和右上角的图)比一般线性模型更好。
Tree-based methods, such as Random Forests and Gradient Boosted Machines (the lower-left and upper-right plots, respectively), perform better than a general linear model.
随机森林的学习速度可能很慢(取决于参数化),而且不可能迭代地改进生成模型。
With Random Forest, however, learning may be slow (depending on the parameterization) and it is not possible to iteratively improve the generated models.
定义随机森林建立多个决策树并将它们合并在一起以获得更准确和稳定的预测。
Definition: a random forest builds multiple decision trees and merges them to obtain a more accurate and stable prediction.
随机森林是通过组合多棵决策树分类器进行预测的,因此形成了“森林”,这也就是其名称的由来。
With random forests, predictions from many decision tree classifiers are combined, thus forming a "forest," which is where the name comes from.
例如,随机森林算法将随机决策树与Bagging相结合,以实现更高的分类准确度。
As an example, the random forest algorithm combines random decision trees with bagging to achieve higher classification accuracy.
随机森林训练算法把bagging的一般技术应用到树学习中。
The training algorithm for random forests applies the general technique of bagging to tree learners.
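A from-scratch sketch of that combination (the sampling scheme and tree settings below are illustrative assumptions, not the example's actual code): draw a bootstrap sample per tree, restrict the features tried at each split, and aggregate by vote:

import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=10, random_state=0)
rng = np.random.default_rng(0)

trees = []
for i in range(50):
    # Bagging: each tree is trained on a bootstrap sample drawn with replacement.
    idx = rng.integers(0, len(X), size=len(X))
    # The "random" part of a random forest: limit the features tried per split.
    tree = DecisionTreeClassifier(max_features="sqrt", random_state=i)
    trees.append(tree.fit(X[idx], y[idx]))

# Combine the individual trees' predictions by majority vote.
votes = np.stack([t.predict(X) for t in trees])
print("bagged accuracy:", np.mean((votes.mean(axis=0) >= 0.5).astype(int) == y))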
例如:随机森林算法中树的个数或K-近邻算法中设定的邻居数。
Examples would be the number of trees in a random forest or the number of neighbors used in the K-nearest neighbors algorithm.
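Both of those values are hyperparameters, set by the user before training rather than learned from the data; a short sketch (scikit-learn names, assumed rather than taken from the source):

from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier

forest = RandomForestClassifier(n_estimators=200)  # number of trees in the forest
knn = KNeighborsClassifier(n_neighbors=5)          # number of neighbors in K-NN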
随机森林算法就是一种集合方法,结合了许多用不同数据集样本训练的决策树。
The Random Forest algorithm is an ensemble method that combines many Decision Trees trained on different samples of the data set.
集成建模的最佳例子是随机森林树,其中许多决策树用于预测结果。
One of the best examples of ensemble modeling is the random forest, where several decision trees are used to predict outcomes.
但是,我可以尝试一下随机森林(或者SVM或者……这里脑补自己偏爱的技术)也能够工作的。
On the other hand, I can try a random forest [or SVM or ... insert your own favorite technique] and it just works.