Implementing the Neural Network Trainer

Thursday, April 03, 2008

While designing the neural network trainer, I decided to implement the backpropagation algorithm for testing. Before explaining the implementation of this algorithm, I will explain how the neural network trainer operates.

The trainer consists of a user interface that facilitates interaction with the training models. Initially the system is being developed for the Windows operating system; I will port it to other operating systems later.

The program is developed in C++, and the architecture is as follows:

The program consists of a core that connects the user interface to independent libraries, each representing a training algorithm. In this case I decided to call the library MLBP.DLL (Multilayer Backpropagation); it is a dynamic link library. Because the library is generic, it can be used by any program as long as its exported functions are published.
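
To illustrate, the exported functions of a library like MLBP.DLL could be declared roughly as follows; the names and signatures here are my assumptions for the sketch, not the project's actual exports.

// mlbp_exports.h -- hypothetical sketch of the functions a training
// library could publish; the names and signatures are assumptions.
#ifdef MLBP_EXPORTS
#define MLBP_API extern "C" __declspec(dllexport)
#else
#define MLBP_API extern "C" __declspec(dllimport)
#endif

// Create a network from a model script and return it as a generic handle.
MLBP_API void* CreateNetwork(const char* scriptFile);

// Train the network with one input pattern and its desired output.
MLBP_API void Train(void* network, const double* input, const double* desired);

// Run the network: fill 'output' from 'input'.
MLBP_API void Run(void* network, const double* input, double* output);

// Release the network created by CreateNetwork.
MLBP_API void DestroyNetwork(void* network);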

The core of the application, NNCORE.DLL, is linked at design time and connects directly to the user interface. The libraries representing training models, such as MLBP.DLL, are loaded at runtime, and the neural network is passed around as a generic pointer (void*), so we only need to provide inputs and receive outputs; when training, we only need to provide the inputs and the desired output for each set of inputs.
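
As a sketch of how the core might use such a library on Windows, the following loads the DLL at runtime with LoadLibrary/GetProcAddress and works with the network only through a void* handle. It reuses the hypothetical export names from the sketch above, and the script file name is also an assumption.

#include <windows.h>
#include <cstdio>

// Hypothetical pointer types matching the exports sketched above.
typedef void* (*CreateNetworkFn)(const char*);
typedef void  (*RunFn)(void*, const double*, double*);
typedef void  (*DestroyNetworkFn)(void*);

int main()
{
    HMODULE lib = LoadLibraryA("MLBP.DLL");
    if (!lib) { std::printf("Could not load MLBP.DLL\n"); return 1; }

    CreateNetworkFn  CreateNetwork  = (CreateNetworkFn) GetProcAddress(lib, "CreateNetwork");
    RunFn            Run            = (RunFn)           GetProcAddress(lib, "Run");
    DestroyNetworkFn DestroyNetwork = (DestroyNetworkFn)GetProcAddress(lib, "DestroyNetwork");
    if (!CreateNetwork || !Run || !DestroyNetwork) { FreeLibrary(lib); return 1; }

    void* net = CreateNetwork("model.txt");   // the core only sees a void* handle
    double input[3600] = { 0.0 };             // sizes taken from the script example below
    double output[4]   = { 0.0 };
    Run(net, input, output);                  // provide inputs, receive outputs

    DestroyNetwork(net);
    FreeLibrary(lib);
    return 0;
}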

Generic structure of training model libraries

The generic structure refers to the functions exported by MLBP.DLL and the other model libraries. In addition, each model defines its own command script; in other words, the developer of the library provides a kind of scripting language for configuring the model. Example:

LIBRARY BACKPROPAGATION
INPUT-SIZE 3600
INPUT-NEURONS 2
HIDDEN-LAYERS 3
HIDDEN-LAYER-NEURONS 4
OUTPUT-SIZE 4
BEGIN-TRAINING-PATTERNS
INPUT-FILE NUMBER1.BMP BITMAP
INPUT-FILE NUMBER2.BMP BITMAP
INPUT-FILE NUMBER3.BMP BITMAP
END-TRAINING-PATTERNS
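
As a sketch under the assumption of the keywords shown above, a library could read this script into a simple configuration structure before building the network; the ModelConfig structure and ParseScript function are hypothetical names used only for the illustration.

#include <fstream>
#include <sstream>
#include <string>
#include <vector>

// Hypothetical structure holding the values read from the script.
struct ModelConfig {
    int inputSize, inputNeurons, hiddenLayers, hiddenLayerNeurons, outputSize;
    std::vector<std::string> trainingFiles;
    ModelConfig() : inputSize(0), inputNeurons(0), hiddenLayers(0),
                    hiddenLayerNeurons(0), outputSize(0) {}
};

ModelConfig ParseScript(const std::string& path)
{
    ModelConfig cfg;
    std::ifstream in(path.c_str());
    std::string line;
    while (std::getline(in, line)) {
        std::istringstream ls(line);
        std::string key;
        ls >> key;
        if      (key == "INPUT-SIZE")           ls >> cfg.inputSize;
        else if (key == "INPUT-NEURONS")        ls >> cfg.inputNeurons;
        else if (key == "HIDDEN-LAYERS")        ls >> cfg.hiddenLayers;
        else if (key == "HIDDEN-LAYER-NEURONS") ls >> cfg.hiddenLayerNeurons;
        else if (key == "OUTPUT-SIZE")          ls >> cfg.outputSize;
        else if (key == "INPUT-FILE") {         // e.g. INPUT-FILE NUMBER1.BMP BITMAP
            std::string file;
            ls >> file;
            cfg.trainingFiles.push_back(file);
        }
    }
    return cfg;
}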

The reason for implementing it this way is that each neural network is different; if we compare Hopfield with Backpropagation, we see that the two are very different, so their scripts would also be very different. But if we want a generic trainer, we must implement a generic method.

Implementing Backpropagation

I have realized that many visitors to the site are looking for a simple solution to neural networks. It is important to note that neural networks seem simple, but they are not. Before copying the source code and implementing it without understanding it, I recommend understanding it first.
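
To make the technique concrete, here is a minimal sketch of one backpropagation training step for a single hidden layer with sigmoid activations. It is only an illustration of the algorithm, not the trainer's source code; the layer sizes and learning rate are arbitrary values.

#include <cmath>

// Minimal one-hidden-layer network trained with backpropagation.
// Sizes and learning rate are arbitrary values for the illustration.
const int    IN = 2, HID = 4, OUT = 1;
const double RATE = 0.5;

// Weights should be initialized to small random values before training.
double w1[HID][IN + 1];   // hidden weights (last column is the bias)
double w2[OUT][HID + 1];  // output weights (last column is the bias)

double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

// One training step: forward pass, then propagate the error backwards.
void TrainStep(const double* input, const double* desired)
{
    double hidden[HID], output[OUT];

    // Forward pass.
    for (int h = 0; h < HID; ++h) {
        double sum = w1[h][IN];                       // bias
        for (int i = 0; i < IN; ++i) sum += w1[h][i] * input[i];
        hidden[h] = sigmoid(sum);
    }
    for (int o = 0; o < OUT; ++o) {
        double sum = w2[o][HID];                      // bias
        for (int h = 0; h < HID; ++h) sum += w2[o][h] * hidden[h];
        output[o] = sigmoid(sum);
    }

    // Output layer error terms (delta = error * derivative of sigmoid).
    double deltaOut[OUT];
    for (int o = 0; o < OUT; ++o)
        deltaOut[o] = (desired[o] - output[o]) * output[o] * (1.0 - output[o]);

    // Hidden layer error terms, propagated back through w2.
    double deltaHid[HID];
    for (int h = 0; h < HID; ++h) {
        double sum = 0.0;
        for (int o = 0; o < OUT; ++o) sum += deltaOut[o] * w2[o][h];
        deltaHid[h] = sum * hidden[h] * (1.0 - hidden[h]);
    }

    // Weight updates.
    for (int o = 0; o < OUT; ++o) {
        for (int h = 0; h < HID; ++h) w2[o][h] += RATE * deltaOut[o] * hidden[h];
        w2[o][HID] += RATE * deltaOut[o];             // bias
    }
    for (int h = 0; h < HID; ++h) {
        for (int i = 0; i < IN; ++i) w1[h][i] += RATE * deltaHid[h] * input[i];
        w1[h][IN] += RATE * deltaHid[h];              // bias
    }
}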



Category: Neural network trainer project
