1)  negative gradient descent
负梯度下降法
1.
To address the problem that the nearest-neighbor clustering learning algorithm cannot properly reflect the distribution of the training samples, an improved learning algorithm is proposed in which the centers, widths, and weights of the network are adjusted online by negative gradient descent. The resulting radial basis function (RBF) neural network not only determines its structure automatically but also adapts to the distribution of the training samples, which effectively improves its learning accuracy.
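Below is a minimal sketch of the kind of online update described above, assuming a single-output Gaussian RBF network and a squared-error loss; the array shapes, symbols, and learning rate are illustrative and not taken from the cited work.

import numpy as np

def rbf_online_step(x, y, centers, widths, weights, lr=0.01):
    """One negative-gradient (steepest-descent) update of a Gaussian RBF
    network's centers, widths and weights on a single sample (x, y),
    minimizing the squared error 0.5 * (y_hat - y)**2."""
    d2 = np.sum((centers - x) ** 2, axis=1)      # squared distance to each center
    phi = np.exp(-d2 / (2.0 * widths ** 2))      # Gaussian hidden activations
    y_hat = phi @ weights                        # scalar network output
    e = y_hat - y                                # output error

    grad_w = e * phi                                                     # dE/dw_j
    grad_c = (e * weights * phi / widths ** 2)[:, None] * (x - centers)  # dE/dc_j
    grad_s = e * weights * phi * d2 / widths ** 3                        # dE/ds_j

    weights -= lr * grad_w                       # move against the gradient
    centers -= lr * grad_c
    widths -= lr * grad_s
    return y_hat

Each quantity is nudged along its own negative gradient, so regions where training samples occur frequently gradually pull the centers and widths toward them.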
2)  method of negative gradient descent
负梯度下降
1.
To reduce the number of iterations of the standard BP algorithm and speed up its convergence, an improved weight-update method is proposed that combines negative gradient descent with the DFP variable-metric algorithm. In the early stage of the error optimization, the standard BP algorithm is applied first, since each of its iterations requires little computation and storage and places few demands on the starting point.
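A rough sketch of such a two-phase scheme, assuming only a generic gradient function grad and a starting point x0: plain negative-gradient steps come first, then DFP variable-metric (quasi-Newton) steps; the fixed step length stands in for the line search a real implementation would use.

import numpy as np

def hybrid_gd_dfp(grad, x0, n_gd=10, n_dfp=40, step=0.1):
    """Two-phase weight optimization: cheap negative-gradient iterations first,
    then DFP variable-metric iterations with an inverse-Hessian approximation H."""
    x = np.asarray(x0, dtype=float).copy()

    # Phase 1: plain steepest descent (little work and storage per iteration)
    for _ in range(n_gd):
        x -= step * grad(x)

    # Phase 2: DFP rank-two updates of H
    H = np.eye(x.size)
    g = grad(x)
    for _ in range(n_dfp):
        p = -H @ g                          # quasi-Newton search direction
        x_new = x + step * p
        g_new = grad(x_new)
        s, yk = x_new - x, g_new - g
        if abs(s @ yk) > 1e-12 and abs(yk @ H @ yk) > 1e-12:
            H = H + np.outer(s, s) / (s @ yk) \
                  - (H @ np.outer(yk, yk) @ H) / (yk @ H @ yk)
        x, g = x_new, g_new
    return x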
3)  gradient descent method
梯度下降法
1.
On the basis of data mining, a new method is developed that identifies a fuzzy model, updates its parameters, and determines the optimal partition of the output space simultaneously by means of fuzzy set theory and an improved gradient descent method. The method not only prunes redundant and conflicting initial fuzzy rules but also, by introducing a dynamic error-transfer factor, resolves the conflict between convergence speed and oscillation in the gradient descent method.
2.
The Levenberg-Marquardt (LM) algorithm combines a neural network with the Gauss-Newton method and the gradient descent method, so it has the self-learning ability of the neural network, the fast convergence of the Gauss-Newton method, and the global search property of the gradient descent method (a one-step sketch of the LM update follows this entry).
3.
Two methods are used to design the network: one is the direct energy descent method and the other is the gradient descent method.
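For example 2 above, this is a one-step sketch of the standard damped Levenberg-Marquardt update, written generically rather than as the cited paper's implementation: a large damping factor mu pushes the step toward gradient descent, a small one toward Gauss-Newton.

import numpy as np

def lm_step(residual, jacobian, x, mu):
    """One Levenberg-Marquardt step for minimizing 0.5 * ||residual(x)||**2."""
    r = residual(x)                       # residual vector at x
    J = jacobian(x)                       # Jacobian of the residuals
    g = J.T @ r                           # gradient of the cost
    A = J.T @ J + mu * np.eye(x.size)     # damped normal equations
    return x - np.linalg.solve(A, g)

In practice mu is increased when a step fails to reduce the cost and decreased when it succeeds, which is how LM interpolates between the two methods.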
4)  gradient descent algorithm
梯度下降法
1.
Genetic-algorithm-based parameter optimization of a UPFC fuzzy damping controller and its comparison with the gradient descent algorithm.
2.
Some or all of the controller's parameters are adjusted with the gradient descent algorithm so as to minimize the output error (a generic sketch of this adjustment follows this entry).
3.
A robust speech recognition method based on discriminative learning of environmental features is proposed: under the minimum classification error criterion, the environmental features are learned iteratively with a gradient descent algorithm, and a command-speech recognition system for high-noise backgrounds is realized.
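A generic sketch of the adjustment described in example 2 above (and used in a similar spirit in example 3), assuming a scalar-output model whose parameters theta are tuned by plain gradient descent on the squared output error; the finite-difference gradient keeps the sketch model-agnostic, since the controller itself is not specified here.

import numpy as np

def fit_parameters(model, x, target, theta, lr=0.05, steps=200, eps=1e-5):
    """Tune some or all parameters theta by gradient descent so that the
    scalar output model(x, theta) approaches target."""
    theta = np.asarray(theta, dtype=float).copy()
    for _ in range(steps):
        y = model(x, theta)
        err = y - target                               # output error
        grad = np.zeros_like(theta)
        for i in range(theta.size):
            t = theta.copy()
            t[i] += eps
            grad[i] = err * (model(x, t) - y) / eps    # d(0.5*err**2)/d(theta_i)
        theta -= lr * grad                             # negative-gradient update
    return theta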
5)  gradient descending method
梯度下降法
1.
To address the genetic algorithm's weak local search ability, it is combined with the gradient descent method to strengthen its local search, and the combination is tested (a bare-bones sketch of such a hybrid follows this entry).
2.
The gradient descent method is then applied to optimize the weights iteratively so that the error between the computed and the expected objective values is minimized, and an iterative algorithm model is established.
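A bare-bones sketch of the GA-plus-gradient-descent hybrid mentioned in example 1, assuming a differentiable objective f with gradient grad_f; the mutation-and-selection loop and all rates are illustrative, not the tested implementation.

import numpy as np

rng = np.random.default_rng(0)

def ga_with_gradient_polish(f, grad_f, dim, pop=20, gens=50, lr=0.05):
    """Genetic algorithm for the global search, with a few negative-gradient
    steps polishing every offspring to strengthen the local search."""
    population = rng.normal(size=(pop, dim))
    for _ in range(gens):
        fitness = np.array([f(ind) for ind in population])
        survivors = population[np.argsort(fitness)[: pop // 2]]        # keep the better half
        children = survivors + 0.3 * rng.normal(size=survivors.shape)  # mutate survivors
        for child in children:                                         # local gradient polish
            for _ in range(5):
                child -= lr * grad_f(child)
        population = np.vstack([survivors, children])
    fitness = np.array([f(ind) for ind in population])
    return population[np.argmin(fitness)]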
6)  gradient-descent
梯度下降法
1.
Estimating the bias field requires solving for the parameters of the basis functions, but conventional methods such as gradient descent tend to get trapped in local optima. To address this, a genetic algorithm is introduced into the parameter estimation; because the conventional genetic algorithm has high time complexity and is itself prone to local optima, it is improved so that the global optimum is reached more easily and the time complexity is lower.
2.
The authors model the bias field with Legendre polynomials and seek the parameters that minimize the entropy. Since conventional methods such as gradient descent tend to get trapped in local optima, a genetic algorithm is introduced to search for the best parameters for estimating the bias field; because the conventional genetic algorithm has high time complexity and does not always reach the global optimum either, it is improved so that the global optimum is obtained more easily with lower time complexity.
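A small illustration of the limitation both examples hinge on: on a multimodal objective (a toy function here, not the entropy cost of the cited work), plain gradient descent converges to whichever local minimum lies nearest its starting point, which is why a global search such as a genetic algorithm is brought in.

import numpy as np

def f(x):
    """Toy multimodal objective with several local minima."""
    return np.sin(3.0 * x) + 0.1 * x ** 2

def df(x):
    return 3.0 * np.cos(3.0 * x) + 0.2 * x

def gradient_descent(x0, lr=0.05, steps=500):
    x = x0
    for _ in range(steps):
        x -= lr * df(x)
    return x

# Different starting points settle in different local minima, so no single
# run is guaranteed to reach the global optimum.
for x0 in (-3.0, 0.0, 3.0):
    x = gradient_descent(x0)
    print(f"start {x0:+.1f} -> x = {x:+.3f}, f(x) = {f(x):+.3f}")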