BeagleBone: BackPropagation Neural Network Prototype

The Universal Approximator, the core algorithm of the BackPropagation Neural Network, has been tested successfully on the BeagleBone.

Code is at:

http://svn.nn-os.org/public

The code builds a 10x64x10 three-layer Neural Network to handle the 10-input, 10-output samples below.

Below, a smooth multivariate function from R^10 -> R^10 is sampled:

for (i = 0; i < Units[0]; i++) {
    input1[i] = (REAL)i;
    target1[i] = input1[i]*input1[i]/(REAL)(Units[0]*Units[0]);
    printf("%d %f\n", i, target1[i]);
}

printf("\n\n");

for (i = 0; i < Units[0]; i++) {
    input2[i] = (REAL)(i + 1);
    target2[i] = input2[i]/(REAL)(Units[0]);
    printf("%d %f\n", i, target2[i]);
}
input1 and input2 are distinct inputs to the neural network; target1 and target2 are the desired outputs. All four are arrays of 10 real numbers (double). Mathematically, f(input1) = target1 and f(input2) = target2.

Look at the output below, for example at index 5:
target1[5] =  0.250000
target2[5] =  0.600000
The Universal Approximator trains for 1800 iterations on this repeated set:
for (i = 0; i < 1800; i++) {
    nnSetInput(&netQaudCopter, input1);
    nnSimulateNet(&netQaudCopter, input1, output1, target1, TRUE);

    nnSetInput(&netQaudCopter, input2);
    nnSimulateNet(&netQaudCopter, input2, output2, target2, TRUE);
}
output1[5] =  0.248753
output2[5] =  0.601834

See how close the outputs are to the desired targets!
Note that we did not code any information about the function f into the nn Control main loop!
This entry was posted in Beaglebone, C/C++, embedded, Software, test.