I am trying to solve the 3-bit parity problem with a functional link neural network (Pao, 1988). I update the weights with backpropagation and extend the input using the outer-product model proposed by Pao, i.e. the enhanced input is x1, x2, x3, x1x2, x1x3, x2x3, x1x2x3.
Learning rate 0.01, momentum 0.1, transfer function log-sigmoid.
But even after 1000 iterations the weights do not classify the patterns correctly; the FLNN fails on the 0,0,0 input. I would highly appreciate any ideas for improving the result.
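For reference, a minimal sketch (my own illustration, not the original poster's code) of the outer-product enhancement applied to each 3-bit pattern:

```python
from itertools import product

def enhance(x1, x2, x3):
    # Original inputs plus the outer-product (higher-order) terms.
    return [x1, x2, x3, x1 * x2, x1 * x3, x2 * x3, x1 * x2 * x3]

for bits in product([0, 1], repeat=3):
    print(bits, '->', enhance(*bits), 'parity =', sum(bits) % 2)
```

Note that for the (0,0,0) pattern every enhanced component is 0, so without a bias/threshold term the network's output for that pattern cannot be adjusted at all, which may explain why it fails on exactly that input.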
This is Example 2 of: Klassen, M., Y. H. Pao, and V. Chen, "Characteristics of the functional link net: a higher order delta rule net," Proc. IEEE International Conference on Neural Networks, 1988.
The problem was solved by using the perceptron learning rule

W_new = W_old + learning_rate * error * input

instead of the generalized delta rule described in the paper. With a learning rate of 0.9 it converged within 100 iterations, whereas the network with a hidden layer took about 1000 iterations.
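A runnable sketch of this fix (my reconstruction under the assumptions above, not the answerer's exact code): the full outer-product expansion plus a bias makes 3-bit parity linearly separable, so the plain perceptron rule converges.

```python
from itertools import product

def expand(x1, x2, x3):
    # Bias term, original inputs, and the outer-product terms.
    return [1, x1, x2, x3, x1 * x2, x1 * x3, x2 * x3, x1 * x2 * x3]

# All eight 3-bit patterns with their parity targets.
patterns = [(bits, sum(bits) % 2) for bits in product([0, 1], repeat=3)]

w = [0.0] * 8
lr = 0.9  # learning rate from the answer

for epoch in range(100):
    mistakes = 0
    for bits, target in patterns:
        x = expand(*bits)
        y = 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0
        err = target - y
        if err != 0:
            mistakes += 1
            # Perceptron rule: W_new = W_old + lr * error * input
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
    if mistakes == 0:
        break
```

The bias component is what lets the (0,0,0) pattern be handled: it is the only nonzero feature for that input, so the weight update can still move the decision for it.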
Answered By – iqbalnaved