
Backpropagation neural network: too many neurons in a layer causing the output to be too high


A neural network with a lot of inputs causes problems for me:

The network gets stuck: the feed-forward pass always outputs 1.0 because the output sum is too large, and during backpropagation the sum of the gradients is also too large, which makes the learning steps far too drastic.
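To make the failure mode concrete, here is a minimal, self-contained sketch (my assumptions: weights drawn uniformly from [-0.5, 0.5) and all inputs at 1.0 as a worst case) showing how the pre-activation sum grows with the input count and pins tanh at 1.0:

```java
import java.util.Random;

public class TanhSaturationDemo {
    public static void main(String[] args) {
        Random rng = new Random(42);
        for (int n : new int[] {10, 100, 1000}) {
            double sum = 0.0;
            for (int i = 0; i < n; i++) {
                double weight = rng.nextDouble() - 0.5; // uniform in [-0.5, 0.5)
                sum += weight * 1.0;                    // assume every input is 1.0 (worst case)
            }
            // tanh is effectively +/-1 for |x| > ~3, and its derivative
            // 1 - tanh(x)^2 is ~0 there, so the neuron stops learning.
            System.out.printf("n=%4d  sum=%8.3f  tanh(sum)=%.6f%n", n, sum, Math.tanh(sum));
        }
    }
}
```

With enough inputs the sum lands far outside the linear region of tanh, which matches the constant 1.0 output I am seeing.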

The network uses tanh as the activation function in all layers. After giving it a lot of thought, I came up with the following solutions:

  1. Initializing smaller random weight values (WeightRandom / PreviousLayerNeuronCount) (see the sketch after this list)

or

  2. After calculating the sum of either the outputs or the gradients, dividing it by the number of neurons in the previous layer (for the output sum) or by the number of neurons in the next layer (for the gradient sum), and only then passing the result into the activation/derivative function.
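Here is roughly what I mean by solution 1 (a sketch; the class and method names are just illustrative):

```java
import java.util.Random;

// Sketch of solution 1: shrink each random weight by the previous layer's
// neuron count so the weighted sum stays bounded regardless of layer width.
public class ScaledWeightInit {
    static double[] initWeights(int previousLayerNeuronCount, Random rng) {
        double[] weights = new double[previousLayerNeuronCount];
        for (int i = 0; i < weights.length; i++) {
            double weightRandom = rng.nextDouble() * 2.0 - 1.0; // uniform in [-1, 1)
            weights[i] = weightRandom / previousLayerNeuronCount;
        }
        return weights;
    }
}
```

For comparison, as far as I know standard schemes such as Xavier/LeCun initialization divide by the square root of the fan-in rather than the fan-in itself, so that the variance of the sum stays near 1.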

I don't feel comfortable with either of the solutions I came up with.

Solution 1 does not solve the problem entirely; the output or gradient sum can still get too high. Solution 2 seems to solve the problem, but I fear that it changes the network's behavior so much that it might no longer be able to solve some problems.
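For clarity, solution 2 would look roughly like this (again a sketch with illustrative names):

```java
public class AveragedSumNeuron {
    // Sketch of solution 2: divide the weighted sum by the previous layer's
    // neuron count before applying tanh. Algebraically this is the same as
    // scaling every incoming weight by 1/fanIn, so it bounds the sum but
    // also rescales how strongly each weight update moves the output.
    static double forwardNeuron(double[] weights, double[] inputs) {
        double sum = 0.0;
        for (int i = 0; i < weights.length; i++) {
            sum += weights[i] * inputs[i];
        }
        return Math.tanh(sum / weights.length);
    }
}
```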

What would you suggest in this situation, keeping in mind that reducing the neuron count in the layers is not an option?

Thanks in advance!

