Thursday, December 11, 2008

Sample code for CvANN_MLP

Here is a sample for using CvANN_MLP:

// CvANN_MLPDemo.cpp : Defines the entry point for the console application.
//

// Sample code for CvANN_MLP Artificial Neural Network module of OpenCV.
// We stick pretty much to the basics here. See full reference at
// http://www.seas.upenn.edu/~bensapp/opencvdocs/ref/opencvref_ml.htm
//
// Author : Feroz M Basheer
//
// DISCLAIMER: THIS CODE IS PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, WHETHER
// EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE WARRANTIES OF
// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, AND NON-INFRINGEMENT. THE
// ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THIS CODE IS WITH YOU.

#include "stdafx.h"
#include "cv.h"
#include "ml.h"

// The neural network
CvANN_MLP machineBrain;

// Read the training data and train the network.
void trainMachine()
{
    int i;
    //The number of training samples.
    int train_sample_count;

    //The training data matrix.
    //Note that we are limiting the number of training data samples to 1000 here.
    //Each data sample consists of two inputs and an output. That's why 3.
    float td[1000][3];

    //Read the training file.
    /*
    A sample file's contents (say we are training the network to compute
    the mean of two numbers) would be:

    5
    12 16 14
    10 5 7.5
    8 10 9
    5 4 4.5
    12 6 9

    */
    FILE *fin = fopen("train.txt", "r");
    if (fin == NULL)
    {
        printf("Could not open train.txt\n");
        return;
    }

    //Get the number of samples.
    fscanf(fin, "%d", &train_sample_count);
    if (train_sample_count > 1000)
        train_sample_count = 1000; //Don't overrun td.
    printf("Found training file with %d samples...\n", train_sample_count);

    //Create the matrices

    //Input data samples. Matrix of order (train_sample_count x 2)
    CvMat* trainData = cvCreateMat(train_sample_count, 2, CV_32FC1);

    //Output data samples. Matrix of order (train_sample_count x 1)
    CvMat* trainClasses = cvCreateMat(train_sample_count, 1, CV_32FC1);

    //The weight of each training data sample. We'll later set all to equal weights.
    CvMat* sampleWts = cvCreateMat(train_sample_count, 1, CV_32FC1);

    //The matrix representation of our ANN. We'll have four layers.
    CvMat* neuralLayers = cvCreateMat(4, 1, CV_32SC1);

    CvMat trainData1, trainClasses1, neuralLayers1, sampleWts1;

    cvGetRows(trainData, &trainData1, 0, train_sample_count);
    cvGetRows(trainClasses, &trainClasses1, 0, train_sample_count);
    cvGetRows(sampleWts, &sampleWts1, 0, train_sample_count);
    cvGetRows(neuralLayers, &neuralLayers1, 0, 4);

    //Set the number of neurons on each layer of the ANN:
    /*
    Layer 1: 2 neurons (2 inputs)
    Layer 2: 3 neurons (hidden layer)
    Layer 3: 3 neurons (hidden layer)
    Layer 4: 1 neuron  (1 output)
    */
    cvSet1D(&neuralLayers1, 0, cvScalar(2));
    cvSet1D(&neuralLayers1, 1, cvScalar(3));
    cvSet1D(&neuralLayers1, 2, cvScalar(3));
    cvSet1D(&neuralLayers1, 3, cvScalar(1));

    //Read and populate the samples.
    for (i = 0; i < train_sample_count; i++)
        fscanf(fin, "%f %f %f", &td[i][0], &td[i][1], &td[i][2]);

    fclose(fin);

    //Assemble the ML training data.
    for (i = 0; i < train_sample_count; i++)
    {
        //Input 1
        cvSetReal2D(&trainData1, i, 0, td[i][0]);
        //Input 2
        cvSetReal2D(&trainData1, i, 1, td[i][1]);
        //Output
        cvSet1D(&trainClasses1, i, cvScalar(td[i][2]));
        //Weight (setting everything to 1)
        cvSet1D(&sampleWts1, i, cvScalar(1));
    }

    //Create our ANN.
    machineBrain.create(neuralLayers);

    //Train it with our data.
    //See the machine learning reference at http://www.seas.upenn.edu/~bensapp/opencvdocs/ref/opencvref_ml.htm#ch_ann
    machineBrain.train(
        trainData,
        trainClasses,
        sampleWts,
        0,
        CvANN_MLP_TrainParams(
            cvTermCriteria(
                CV_TERMCRIT_ITER + CV_TERMCRIT_EPS,
                100000,
                1.0
            ),
            CvANN_MLP_TrainParams::BACKPROP,
            0.01,
            0.05
        )
    );
}

// Predict the output with the trained ANN given the two inputs.
void Predict(float data1, float data2)
{
    float _sample[2];
    CvMat sample = cvMat(1, 2, CV_32FC1, _sample);
    float _predout[1];
    CvMat predout = cvMat(1, 1, CV_32FC1, _predout);

    sample.data.fl[0] = data1;
    sample.data.fl[1] = data2;

    machineBrain.predict(&sample, &predout);

    printf("%f \n", predout.data.fl[0]);
}

int _tmain(int argc, _TCHAR* argv[])
{
    int wait;

    // Train the neural network with the samples.
    trainMachine();

    // Now try predicting some values with the trained network.
    Predict(15.0, 20.0);
    Predict(1.0, 5.0);
    Predict(12.0, 3.0);

    //I'll wait for an integer. :)
    scanf("%d", &wait);
    return 0;
}
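If you want more than a handful of hand-typed samples, a small helper can generate a bigger train.txt for the same mean-of-two-numbers task. This is just a quick sketch I'd compile separately (the function name `writeTrainFile` is mine, not part of the demo above); the file format matches what trainMachine() expects: a count line, then one "input input output" line per sample.

```cpp
#include <cstdio>
#include <cstdlib>

// Writes n random (a, b, mean) samples to the given path in the
// train.txt format: first the sample count, then one sample per line.
// Returns n on success, -1 if the file could not be opened.
int writeTrainFile(const char *path, int n)
{
    FILE *fout = fopen(path, "w");
    if (fout == NULL)
        return -1;

    fprintf(fout, "%d\n", n);
    for (int i = 0; i < n; i++)
    {
        float a = (float)(rand() % 20);
        float b = (float)(rand() % 20);
        fprintf(fout, "%g %g %g\n", a, b, (a + b) / 2.0f);
    }
    fclose(fout);
    return n;
}
```

Calling writeTrainFile("train.txt", 100) once before trainMachine() gives the network a hundred samples to chew on instead of five.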

Best Project... :)

I forgot to update this here: we did win the Award for the Best Academic Project of the Computer Science and Engineering department. The project was officially named "Monoscopic Vision Based Autonomous Ground Vehicle".

The award was presented to us during our convocation.

Here is the link to the newspaper article.

Wednesday, December 10, 2008

More...

See Eohan's homepage for some more details on Project AGV.

Monday, November 17, 2008

AGV : Another Video

Eohan has added this video on YouTube. Here is another glimpse of our AGV :)

Monday, November 12, 2007

Some action :)

This video was taken in April. I have been thinking of posting it for a while now... Thought I'd do it now. :)

Friday, March 23, 2007

BlogDay 71: Of Serial, Steering and Software Lab

Hmmm... that's a long delay in posting, I agree. But we weren't idle.

The most important thing from my side is that the serial port communication is now fully operational. I spent a month on it. Really, one whole month. There was all this example code on the net for serial port communication, but it seemed to evade me all these days. HyperTerminal in Windows communicates with the code programmed into the AVR (by Sri Raj and Eoeo) just fine, but whatever code I wrote, the port seemed to ignore.

The code wasn't the only difficulty. The laptop provided to us by the department does not have a serial port. So we bought this small smart device which gives us a serial port on one end and a USB port on the other. The driver for the thing is pretty nasty though. (Lots of blue screens in Windows...)

I was pretty stuck at this point, and no important coding got done all these days. Two days back I stumbled upon the idea of searching SourceForge again. It had given us OpenCV a few months back. I simply did it and voila! I got a bunch of serial port libraries. These were much different from the commercial ones I had been trying... no nag screens, no complex stuff...

It was just like love at first sight! ;) The very first run of the code told me that this would work.
The UI was pretty professional, and I was able to send even whole sentences to the port and read them back.
So the next day was spent on integrating the serial port thing into our code. It wasn't easy at first. The serial code was a good one, with MFC and stuff, and Miranda was a small (?) console application. Then I did the most obvious thing.

Now the serial port code contains Miranda. Yes. I put Miranda into it rather than putting it into Miranda.

Miranda V 0.3 has got a GUI now, a serial communication part, and two threads communicating with each other. Professional enough.

Now to the mech side of it.

The 'thing' we ('we'...? well, I'm just a spectator in the hardware part... read 'we' as Eoeo, Sri Raj, Bipin, Abhilash, Maruthu...) made has got problems with the front wheel and steering.
There isn't enough turning. Either we'll have to fine-tune the steering and front wheels, or we need a much more powerful motor. Time constraints tell us that ordering a new motor is not feasible. We'll have to find some way of making it work better.

But we have got the rear wheels all fixed. Even a brake. Our AGV is taking shape.

Of software lab? Hmmm... it's this small thing... I shifted all my work to the lab again. Working in the room isn't paying off well... At least we get much more computing power there. One month of serial porting here didn't give me anything. I took another session in the lab to make things right. ;)

So almost two weeks to go. I mean, for the evaluation. Will we make it?

Thursday, February 8, 2007

BlogDay 28: Miranda Version 0.2

Finally added a pretty much working model of neural networks to the code...
The ML libraries are really a boon... they are pretty well optimized and efficient. I was amazed by the accuracy of the predictions... More about it later...

I spent the whole of yesterday optimizing our code... it runs much faster now...