Using the Backpropagation Library in your Application

In this short document I'll show you how to quickly set up the backpropagation library for your application.

Visual Studio Users

First off, you have to get the library. If you don't have it yet, you can get it here.

  • Start a new project in the Visual Studio IDE.
  • If the MLBP libraries are in a different folder than your project, add the location of the (.h) header files to the include directories in your Visual Studio configuration. Make sure to add the libs folder too.
  • Defining global macros: if you want to use the double-precision library, add the macro DOUBLE_PRECISION to your global defines. I also recommend adding the macro SHARED_LIBRARY, in case you get linker errors.
  • Adding libraries to the project: so that Visual Studio can find the library it should link against, go to your project properties -> VC++ Directories and add the directory where the .lib files are stored.
  • Add #pragma comment(lib,"mlbp_stsfp.lib"), or #pragma comment(lib,"mlbp_stdfp.lib") if you enabled DOUBLE_PRECISION, to any of your source files, as in the sketch below.
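
For example, at the top of one of your source files (a minimal sketch; the library names are the ones given above):

#ifdef DOUBLE_PRECISION
// Double-precision build: link the double-precision library
#pragma comment(lib,"mlbp_stdfp.lib")
#else
// Default build: link the single-precision library
#pragma comment(lib,"mlbp_stsfp.lib")
#endif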

Qt Creator Users

  • Start a new project on Qt Creator.
  • In the .pro file, use INCLUDEPATH to add the folder where the header files are located. Example: INCLUDEPATH+=c:/mlbp/include
  • Add the library to link against using LIBS. Example: LIBS+=c:/mlbp/libs/vc_x86/mlbp_stsfp.lib
  • Defining global macros: if you want to use the double-precision library, add the statement DEFINES+=DOUBLE_PRECISION. If you are getting linker errors, also add the macro SHARED_LIBRARY. A complete example follows below.
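
Putting it all together, your .pro file might look like this (the paths are just examples; point them at wherever you installed MLBP):

# Example MLBP setup in a qmake .pro file
INCLUDEPATH += c:/mlbp/include
# Single-precision library...
LIBS += c:/mlbp/libs/vc_x86/mlbp_stsfp.lib
# ...or, for double precision, define the macro and link mlbp_stdfp.lib instead:
# DEFINES += DOUBLE_PRECISION
# LIBS += c:/mlbp/libs/vc_x86/mlbp_stdfp.lib
# If you get linker errors, also define:
# DEFINES += SHARED_LIBRARY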

Now you are ready to start coding your project.

Initializing the Neural Network

1. First, bring in the namespace where the library is grouped, mlbp:

using namespace mlbp;

2. Declare a bp object; this is the neural network object.

bp net;

3. Initialize the neural network to your needs with the function bp::create():

if(!net.create(PATTERN_SIZE,OUTPUT_SIZE,INPUTNEURON_COUNT))
{
        cout << "Could not create network";
        return 0;
}

4. Initialize all neuron weights to random values with the function bp::setStartValues():

net.setStartValues();

5. If you have the multithreaded version of the library and you want to use it, call the function bp::setMultiThreaded(true).
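
Putting steps 1 through 5 together, a minimal initialization sketch looks like this (the header name and the size constants are placeholder assumptions; substitute your own):

#include <iostream>
#include "mlbp.h" // header name assumed; use the header shipped with the library

using namespace mlbp;
using namespace std;

int main()
{
    const int PATTERN_SIZE      = 2; // inputs per pattern (example value)
    const int OUTPUT_SIZE       = 1; // outputs per pattern (example value)
    const int INPUTNEURON_COUNT = 4; // neurons in the input layer (example value)

    bp net;
    if(!net.create(PATTERN_SIZE,OUTPUT_SIZE,INPUTNEURON_COUNT))
    {
        cout << "Could not create network";
        return 0;
    }

    net.setStartValues();       // randomize all neuron weights
    net.setMultiThreaded(true); // only if you have the multithreaded version

    // ... training goes here ...
    return 0;
}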

Network ready for training

Now the network is ready for training. Use the function bp::train() or any variation of it to start training the network. The train function is called inside a loop, and all training patterns must pass through it inside that loop. The idea is to keep looping until bp::train() returns an error close to zero.

You can do it this way:

float error = 1.0f; // start above the threshold so the loop runs
while( error > 0.001f )
{
    error = 0;
    for(int i = 0; i < patterncount; i++)
    {
        error += net.train(desiredOutput[i], input[i], 0.09f, 0.1f);
    }
    error /= patterncount;
    cout << "ERROR:" << error << endl;
}

Or you can simply run a fixed number of iterations:

for(int epoch = 0; epoch < 45000; epoch++)
{
    error = 0;
    for(int i = 0; i < patterncount; i++)
    {
        error += net.train(desiredOutput[i], input[i], 0.09f, 0.1f);
    }
    error /= patterncount;
    cout << "ERROR:" << error << endl;
}

I recommend the second way, because with the first one, if for any reason the network never reaches that minimum error, your program will hang in an infinite loop.
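
You can also combine the two approaches: cap the number of iterations, but exit early once the error is low enough.

float error = 1.0f;
for(int epoch = 0; epoch < 45000 && error > 0.001f; epoch++)
{
    error = 0;
    for(int i = 0; i < patterncount; i++)
    {
        error += net.train(desiredOutput[i], input[i], 0.09f, 0.1f);
    }
    error /= patterncount;
    cout << "ERROR:" << error << endl;
}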

After Training

Training is usually intensive when you do it with large patterns, and even more so when you have many of them. So you will probably want to save what you have done. There are two ways to do it.

Using the Save Function

The easiest way is to use the bp::save() function. The two user IDs tag the file so that only your application can read it back.

if(!net.save("network.net",USERID1,USERID2))
{
    if(net.getError()==BP_E_FILEWRITE_ERROR)
    {
        cout << "Could not open file for writing";
    }
    else if(net.getError()==BP_E_EMPTY)
    {
        cout << "Network is empty";
    }
}

Getting a linear buffer of the network

The second way is easy too, and useful if you want to save the network in your own file format. Just call the function bp::getRawData() and store the result in a bpBuffer. Use the function bpBuffer::get() to access the buffer as an unsigned char array. This way you can write the buffer to a file of your own.

bpBuffer buff;
buff=net.getRawData();
if(!buff.isEmpty())
{
    FILE *f;
    f=fopen("yourfile.dat","wb"); // open for writing in binary mode
    if(f)
    {
        fwrite(buff.get(),sizeof(char),buff.size(),f);
        fclose(f);
    }
}

To create a network from a buffer, just load a bpBuffer with the unsigned char array and call bp::setRawData(bpBuffer).

bpBuffer buff;
unsigned char *ucbuffer;
unsigned int usize;

// ... load ucbuffer and usize here ...

buff.set(ucbuffer,usize);
if(!net.setRawData(buff))
{
    if(net.getError()==BP_E_CORRUPTED_DATA)
    {
        cout << "Invalid buffer";
    }
    //...
}
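
If you saved the buffer with fwrite as shown above, one way to load it back is a plain C-style read (whether bpBuffer::set copies the data is an assumption here; check the reference documentation before freeing the array):

FILE *f = fopen("yourfile.dat","rb"); // requires <cstdio>
if(f)
{
    fseek(f, 0, SEEK_END);
    usize = (unsigned int)ftell(f); // the whole file is the buffer
    fseek(f, 0, SEEK_SET);

    ucbuffer = new unsigned char[usize];
    fread(ucbuffer, sizeof(char), usize, f);
    fclose(f);

    buff.set(ucbuffer, usize);
    delete[] ucbuffer; // assumes bpBuffer::set copies the data
}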

Using for Production

If your application won't perform training every time the end user runs it, or will only ever do it once, you only have to replace the random-initialization and training steps with loading the neural network data from a file, and then call bp::run every time your application needs the services of the neural network. A rough sketch follows.
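
For example (a sketch only: a bp::load() counterpart to bp::save() and the call shape of bp::run() are assumptions here, so check the reference documentation for the real signatures):

bp net;

// Assumed counterpart to bp::save(); alternatively, read your own file
// into a bpBuffer and call net.setRawData() as shown earlier.
if(!net.load("network.net",USERID1,USERID2))
{
    cout << "Could not load network";
    return 0;
}

// No setStartValues() or training here: the weights come from the file.
// The signature of bp::run is assumed; input and output are your pattern arrays.
net.run(output, input);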

Check an example program here.

Check the reference documentation here.

Multi-Layer Backpropagation Library v1.0

If by any chance you want to create an application that needs the power of a neural network, then I have a solution for you. Let me introduce the Multi-Layer Feed-Forward Library, which uses the back-propagation training algorithm.

This library will surely make it easier to implement a neural network in your application. For now, the library only supports feed-forward networks.

You probably know what a feed-forward network is; if not, I'll explain it briefly. A feed-forward neural network is a structure where the inputs are propagated and processed through the neurons from the input layer to the output layer. It is by far one of the simplest and most useful neural networks around.
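
In symbols, each layer l computes its activations from the previous layer's outputs:

a^(l) = f( W^(l) * a^(l-1) + b^(l) )

where a^(0) is the input pattern, W^(l) and b^(l) are the layer's weights and biases, f is the activation function, and the last layer's activations are the network's output.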

What you can do with this library

This library allows you to create, train, and use neural networks in your application, all by writing just a few lines of code.

Supported Programming Languages

So far the library supports C++ on Windows; it will be available for Linux and mobile devices soon.

Features

  • Creation and training of neural networks of unlimited size and number of layers.
  • An easy way to store the neural network data in files with a custom ID, so that only your application can access it.
  • Easy-to-use data structures for safely handling lists and floating-point arrays
  • Multithreaded execution during training
  • Source code examples
  • Full documentation
  • Full support via email

Requirements

  • Visual C++ compiler or GNU C++
  • Microsoft Visual C++ 2008 runtime for the 32-bit version, Microsoft Visual C++ 2010 runtime for the 64-bit version, or the MinGW runtime, depending on which compiler you have.
  • Available as a dynamic link library

Price

The Multi-Layer Feed-Forward Library is free.

By using it you accept that this software is provided ‘as-is’, without any express or implied warranty. In no event will the authors be held liable for any damages arising from the use of this software.
You can use it in non-commercial and commercial applications under the following restrictions:

  • You must not take ownership of this library or claim that you wrote it.
  • An acknowledgment in your application's documentation is required, along with a link to the NeuroAI website.
  • You may not reverse engineer, decompile, or disassemble any of the binary files included in this package.
  • You may not modify any source files and libraries included in this package and republish them as yours.
  • Copyright and license notices on source files may not be removed or altered.

Download

UPDATE

Download the most recent version: MLBP v1.0.1a.
FIXED SOME BUGS!

Older versions of the library are here. Feel free to report any bugs you find.

Documentation: Multi-Layer Backpropagation Library Reference

Examples: Simple Implementation Source code