1)  multilevel activation function
多层激活函数
1.
The structure and training algorithm of a quantum neural network based on a multilevel activation function are presented in this dissertation.
(2) The quantum neural network model with multilevel activation functions is analyzed, and a detailed derivation of the model's quantum-interval training algorithm is given. An improved quantum neural network model is then proposed: the arctangent function, which is steeper than the sigmoid, is introduced as the activation function of the hidden-layer neurons, and a pseudo-saturation prevention function is introduced to keep the network from falling into saturation, improving the model's convergence speed.
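In the quantum-neural-network literature, a multilevel activation function is commonly formed as a superposition of sigmoids shifted by learned "quantum intervals"; a minimal sketch under that assumption (the arctangent variant mirrors the steeper activation described above; parameter names are illustrative, not taken from the cited work):

```python
import math

def sigmoid(x, beta=1.0):
    # standard logistic function with slope parameter beta
    return 1.0 / (1.0 + math.exp(-beta * x))

def multilevel_sigmoid(x, thetas, beta=1.0):
    # multilevel (quantum) activation: average of sigmoids shifted by
    # the quantum intervals in thetas, giving a graded, staircase-like
    # output that can represent several states per neuron
    return sum(sigmoid(x - t, beta) for t in thetas) / len(thetas)

def multilevel_arctan(x, thetas):
    # steeper variant using arctan rescaled to (0, 1), in the spirit of
    # the improved model described above (the exact rescaling is an
    # assumption made here for illustration)
    return sum(0.5 + math.atan(x - t) / math.pi for t in thetas) / len(thetas)
```

With `thetas = [0.0, 1.0, 2.0]`, the output rises in three graded steps rather than one, which is what lets a single quantum neuron express multiple magnitudes.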
2)  Multi-level transfer function
多层激励函数
1.
An approach to facial expression recognition based on multi-level transfer function quantum neural networks (QNN) and multi-layer classifiers is presented in order to improve the recognition rate and recognition reliability.
2.
Aiming at pattern recognition problems in which data from different patterns overlap, a pattern recognition algorithm based on the multi-level transfer function quantum neural network (QNN) is presented.
3.
An approach to handwritten digit recognition based on multi-level transfer function quantum neural networks (QNN) and multi-layer classifiers is presented; the MNIST database is used for training and testing.
3)  activating function
激活函数
1.
A fast method and formula for computing the activating function are provided, the Hodgkin-Huxley model of a nerve fiber under magnetic stimulation is derived, and the relationship between neural excitation and the circuit parameters of the magnetic stimulator is established.
2.
Building on the traditional cable model and adding the effect of the radial electric field, a modified cable equation and activating function are obtained to describe the excitation of a neuraxon under magnetic stimulation; simulation results verify their correctness.
3.
The selection of the learning method, the number of hidden layers and hidden units, the learning step length, the learning samples, and the activating function, as well as methods for avoiding local minima, are all introduced in detail.
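In nerve-stimulation modelling, the activating function referred to in these entries is conventionally the second spatial derivative of the extracellular potential along the fiber; a discrete sketch of that quantity (the fiber-specific scaling constants that the full cable model would supply are omitted here):

```python
def activating_function(ve, dx):
    # discrete activating function along an axon: the second spatial
    # difference of the extracellular potential ve, sampled every dx;
    # positive values predict sites of depolarization (scaling constants
    # from the cable model are deliberately left out of this sketch)
    return [(ve[i - 1] - 2.0 * ve[i] + ve[i + 1]) / (dx * dx)
            for i in range(1, len(ve) - 1)]
```

A linear potential profile yields a zero activating function everywhere, while curvature in the profile (as produced by a stimulating coil) yields nonzero values at the predicted excitation sites.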
4)  activation function
激活函数
1.
This paper presents improvements to the convergence criterion and activation function of the traditional BP (back propagation) neural network algorithm, along with measures to prevent oscillation, accelerate convergence, and avoid falling into local minima.
2.
Perfect artificial neural network (ANN) learning should include optimization of the types of neuronal activation functions; the tradition of optimizing only the network weights in ANN learning is not consistent with biology. Taking typical feedforward network design as an example, the importance of optimizing the activation function types of neurons during network learning is discussed further.
3.
This paper studies the influence of different activation functions on the convergence speed of the BP algorithm and concludes that a combined activation function can improve the convergence of BP networks.
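One plausible reading of "combined activation function" is a convex mixture of standard activations; a minimal sketch under that assumption, with the derivative that backpropagation's delta rule would need (the mixing scheme is illustrative, not taken from the cited paper):

```python
import math

def combined_activation(x, a=0.5):
    # combined activation: convex mix of a logistic unit and a tanh unit,
    # weighted by a in [0, 1] (an assumed form of "combined" activation)
    s = 1.0 / (1.0 + math.exp(-x))
    return a * s + (1.0 - a) * math.tanh(x)

def combined_activation_grad(x, a=0.5):
    # analytic derivative of the mix, as required by BP weight updates
    s = 1.0 / (1.0 + math.exp(-x))
    return a * s * (1.0 - s) + (1.0 - a) * (1.0 - math.tanh(x) ** 2)
```

At the origin the mixed unit is steeper than a plain sigmoid (gradient 0.625 vs. 0.25 for `a = 0.5`), which is one mechanism by which such combinations can speed up convergence.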
5)  multi-wavelet incentive function
多层小波激励函数
1.
The quantum neurons in the model's hidden layer use a linear superposition of wavelet basis functions as the incentive function, called a multi-wavelet incentive function; such hidden-layer neurons can not only represent more states and magnitudes but also improve the network's convergence accuracy and speed.
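The superposition described above can be sketched directly; the source only says "a linear superposition of wavelet basis functions", so the choice of the Mexican-hat (Ricker) mother wavelet and the (weight, scale, shift) parameterization here are assumptions for illustration:

```python
import math

def mexican_hat(t):
    # Mexican-hat (Ricker) mother wavelet, peak value 1 at t = 0
    return (1.0 - t * t) * math.exp(-t * t / 2.0)

def multi_wavelet_incentive(x, params):
    # incentive function as a linear superposition of dilated and
    # translated wavelets; params is a list of (weight, scale, shift)
    # triples -- the basis and parameterization are assumed, not quoted
    return sum(w * mexican_hat((x - b) / a) for w, a, b in params)
```

Because each term is localized around its shift `b` at its own scale `a`, a single neuron can respond with different magnitudes in different input regions, matching the "more states and magnitudes" claim above.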
6)  neuronal activation function
神经元激活函数
1.
Optimizing neuronal activation function types based on GP in constructive FNN design
Supplementary material: multilayer deposit (多层沉积层)
Molecular formula:
CAS No.:

Properties: a deposit composed of two or more successively deposited metal layers. The layers may consist of the same metal with different properties or of different metals. For example, duplex or triplex nickel coatings are sometimes used to improve the protective performance of nickel deposits, and copper/nickel/chromium triple coatings are used for protective-decorative purposes.
