Porting Back Propagation to Konro
Back Propagation is a machine-learning algorithm that trains the weights of the connecting nodes in a layered neural network. The application comprises two phases: the Forward Phase, in which activations are propagated from the input layer to the output layer, and the Backward Phase, in which the error between the observed and requested values in the output layer is propagated backward to adjust the weights and bias values. Within each layer, all nodes can be processed in parallel.
Our implementation is an excerpt from the back propagation algorithm described in Machine Learning (Tom Mitchell, McGraw Hill, 1997), and provides CUDA/OCL versions of the bpnn_train kernel.
- Convert Back Propagation from C89 to C++20
- Insert DLB initialization in facetrain.cpp::setup()
- Insert the Konro Add request in facetrain.cpp::setup()
- Wrap the main computation kernel in a loop in facetrain.cpp::setup(), sending new feedback on each iteration according to a fixed schedule
Back Propagation sends feedback to Konro at the start of each new iteration. Konro interprets these values as follows:
- 0 - 70: ask for more resources
- 70 - 130: nothing to be done
- 130 - 200: ask for fewer resources
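The ranges above can be sketched as a small decision helper. The enum and function names here are our own illustration, not part of the konrolib API, and we assume half-open boundaries (70 means "nothing to be done", 130 means "fewer resources") since the listed ranges share their endpoints.

```cpp
#include <cassert>

// Possible interpretation of a feedback value on Konro's 0-200 scale.
enum class Action { MoreResources, NoChange, FewerResources };

Action interpret(int feedback) {
    if (feedback < 70)  return Action::MoreResources;  // 0 - 70
    if (feedback < 130) return Action::NoChange;       // 70 - 130
    return Action::FewerResources;                     // 130 - 200
}
```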
- CMake 3.1 or later
- Clang++ 14 or later
- dlb 3.2 or later
- konro
Create a build directory and, inside it, run the CMake configuration after setting all the correct paths:
cmake -DCMAKE_CXX_COMPILER=path_to_compiler -DPATH_TO_KONROLIB=path_to_libkonrolib.a -DCMAKE_INSTALL_PREFIX="path_to_install_directory" -DCMAKE_BUILD_TYPE=Release ..
To build and install, run
cmake --build . --target install
With Konro running in the background as a daemon, go to ./dist/bin and run
export OMP_NUM_THREADS=1
export DLB_ARGS="--drom --ompt"
./backprop 16500000
