Title
Backpropagation Neural Tree
Authors
Abstract
We propose a novel algorithm called Backpropagation Neural Tree (BNeuralT), which is a stochastic computational dendritic tree. BNeuralT takes random repeated inputs through its leaves and imposes dendritic nonlinearities through its internal connections, much as a biological dendritic tree does. Given these biologically plausible dendritic-tree properties, BNeuralT is a single-neuron neural tree model whose internal sub-trees resemble dendritic nonlinearities. The BNeuralT algorithm produces an ad hoc neural tree which is trained using a stochastic gradient descent optimizer such as gradient descent (GD), momentum GD, Nesterov accelerated GD, Adagrad, RMSprop, or Adam. BNeuralT training has two phases, each computed in a depth-first search manner: the forward pass computes the neural tree's output in a post-order traversal, while the error backpropagation during the backward pass is performed recursively in a pre-order traversal. A BNeuralT model can be considered a minimal subset of a neural network (NN), meaning it is a "thinned" NN whose complexity is lower than that of an ordinary NN. Our algorithm produces high-performing and parsimonious models, balancing complexity with descriptive ability on a wide variety of machine learning problems: classification, regression, and pattern recognition.
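The two-phase traversal described above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: the tree structure, the `tanh` nonlinearity, the plain-GD update, and the toy data are all assumptions for the sake of the example. It shows only the core idea that the forward pass evaluates children before their parent (post-order), while the backward pass updates a node's weights before recursing into its children (pre-order).

```python
import math
import random

class Node:
    """A neural tree node: leaves read raw inputs, internal nodes
    apply a weighted sum followed by a nonlinearity (here tanh,
    an assumption standing in for a dendritic nonlinearity)."""
    def __init__(self, children=None, input_index=None):
        self.children = children or []      # empty list for leaves
        self.input_index = input_index      # used only by leaves
        self.weights = [random.uniform(-1, 1) for _ in self.children]
        self.bias = random.uniform(-1, 1)
        self.output = 0.0

    def forward(self, x):
        # Post-order: evaluate all children first, then this node.
        if not self.children:               # leaf: pass the input through
            self.output = x[self.input_index]
            return self.output
        s = self.bias + sum(w * c.forward(x)
                            for w, c in zip(self.weights, self.children))
        self.output = math.tanh(s)
        return self.output

    def backward(self, grad, lr):
        # Pre-order: update this node's parameters, then recurse.
        if not self.children:
            return
        local = grad * (1.0 - self.output ** 2)     # tanh derivative
        for i, c in enumerate(self.children):
            w = self.weights[i]                     # pre-update weight
            self.weights[i] -= lr * local * c.output
            c.backward(local * w, lr)               # propagate error down
        self.bias -= lr * local

# Toy usage: a small tree with repeated inputs at the leaves,
# trained by plain GD on a two-sample regression problem.
random.seed(0)
leaves = [Node(input_index=0), Node(input_index=1)]
root = Node(children=[Node(children=leaves), Node(input_index=0)])
data = [([0.1, 0.9], 0.5), ([0.8, 0.2], 0.3)]

def loss():
    return sum((root.forward(x) - y) ** 2 for x, y in data)

before = loss()
for _ in range(200):
    for x, y in data:
        pred = root.forward(x)
        root.backward(2.0 * (pred - y), lr=0.1)  # squared-error gradient
after = loss()
```

Note that a leaf may appear directly under the root as well as deeper in the tree, mirroring the abstract's point that inputs are randomly repeated through the leaves; swapping the plain-GD update for momentum, Adagrad, RMSprop, or Adam changes only the weight-update lines.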