
Adding momentum term in online back propagation weight update?

I have implemented an ANN with a two-layer network, and I need to modify my weight-update code to include a momentum term, but I am not sure how to do it. Below is a snippet of just the weight update. It updates the weights after each example it sees; hiddenWeights holds the hidden-layer weights and outputWeights the output-layer weights.

    % Initialise once, before the loop: one velocity per weight matrix,
    % with matching shape:
    %   l1_v = zeros(size(hiddenWeights));  l2_v = zeros(size(outputWeights));
    for examplen = 1:nTrainingExamples
        inputVector = inputs(:, examplen);
        HiddenLayerOutput = sigmoid(hiddenWeights * inputVector);
        OutputLayerOutput = sigmoid(outputWeights * HiddenLayerOutput);

        % Output-layer error and delta (learning rate folded into the delta)
        l2_error = OutputLayerOutput - targets(:, examplen);
        l2_delta = learningRates(1, i) .* (OutputLayerOutput .* (1 - OutputLayerOutput)) .* l2_error;

        % Backpropagate through the (not-yet-updated) output weights
        l1_delta = learningRates(1, i) .* (HiddenLayerOutput .* (1 - HiddenLayerOutput)) .* (outputWeights' * l2_delta);

        % Momentum: accumulate the full gradient matrix (delta * activation'),
        % not the bare delta vector, so each velocity has the same shape as
        % the weight matrix it updates
        l2_v = mu * l2_v - l2_delta * HiddenLayerOutput';
        l1_v = mu * l1_v - l1_delta * inputVector';

        % weights = weights + v
        outputWeights = outputWeights + l2_v;
        hiddenWeights = hiddenWeights + l1_v;
    end
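For illustration, here is the same online update with classical momentum sketched in NumPy (the layer sizes, learning rate `lr`, and momentum coefficient `mu` below are made-up toy values, not from the original code):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# Toy shapes (assumptions): 4 inputs, 3 hidden units, 2 outputs, 5 examples
inputs  = rng.standard_normal((4, 5))
targets = rng.standard_normal((2, 5))
hiddenWeights = rng.standard_normal((3, 4)) * 0.1
outputWeights = rng.standard_normal((2, 3)) * 0.1

lr, mu = 0.1, 0.9
# Velocities must have the same shape as the weight matrices they update
l1_v = np.zeros_like(hiddenWeights)
l2_v = np.zeros_like(outputWeights)

for n in range(inputs.shape[1]):
    x = inputs[:, [n]]                      # input as a column vector
    h = sigmoid(hiddenWeights @ x)          # hidden-layer output
    o = sigmoid(outputWeights @ h)          # network output

    # Deltas with the learning rate folded in, as in the original code
    l2_error = o - targets[:, [n]]
    l2_delta = lr * o * (1 - o) * l2_error
    l1_delta = lr * h * (1 - h) * (outputWeights.T @ l2_delta)

    # Classical momentum: v <- mu*v - (lr-scaled gradient); w <- w + v
    l2_v = mu * l2_v - l2_delta @ h.T
    l1_v = mu * l1_v - l1_delta @ x.T
    outputWeights = outputWeights + l2_v
    hiddenWeights = hiddenWeights + l1_v
```

The key point is that the velocity is accumulated per weight matrix (`l2_delta @ h.T` has the shape of `outputWeights`), so past updates keep contributing a `mu`-weighted fraction to each new step.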
