
Neural networks using Machine Learning Toolkit

I'm trying to do handwritten character recognition using the Machine Learning Toolkit, but it seems to give me random outputs. The database and classes seem fine, but the output is either zero or random numbers.

0 Kudos
Message 1 of 12
(5,801 Views)

Could you please upload an example so we can identify the problem? Thanks.

0 Kudos
Message 2 of 12
(5,783 Views)

Thanks for your concern.

New.vi is the program, and book.xls is the database of handwritten numbers (0-9) in binary format. It has 8 rows of 1-D arrays representing each number.

Book class.xls is the class value for each number.

 

-$hiv@
0 Kudos
Message 3 of 12
(5,761 Views)

The data from the xls files are 80-by-x matrices. Each row represents one sample, which is exactly the format MLT expects, so you do not have to transpose the matrices.
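To make that layout concrete, here is a rough NumPy sketch with placeholder data in the same shape the thread describes (80 rows, x columns). The array names and the value of x are assumptions for illustration, not the toolkit's actual VIs:

```python
import numpy as np

# Placeholder with the thread's dimensions: 80 samples (8 per digit x 10 digits),
# x pixels per sample -- swap in the arrays read from book.xls / Book class.xls.
x = 64
data = np.random.default_rng(0).integers(0, 2, size=(80, x)).astype(float)
classes = np.repeat(np.arange(10), 8)

# MLT expects one sample per ROW, so an 80-by-x array is already in the right
# orientation; only transpose if a file comes out as x-by-80 instead.
if data.shape[0] != classes.shape[0]:
    data = data.T

print(data.shape, classes.shape)   # (80, 64) (80,)
```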

 

In the meantime, I found a bug in MLT. I have attached the new VIs in the zip file. Please copy NILabs_MLT.lvlib to <LabVIEW>\vi.lib\addons\Machine Learning and the other VIs to <LabVIEW>\vi.lib\addons\Machine Learning\Supervised Learning.

 

However, I do not know what the best parameters are for this particular application. You may have to try different parameters.
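One way to try different parameters is a small search over a validation split. The sketch below uses scikit-learn's MLPClassifier on placeholder data as a stand-in for the toolkit's network; the hidden-layer sizes and iteration limits are arbitrary starting points, not recommended values:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Placeholder data shaped like the thread's set: 80 samples, 64 pixels, classes 0-9.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(80, 64)).astype(float)
y = np.repeat(np.arange(10), 8)

X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y)

# Try a few hidden-layer sizes and iteration limits and keep the best combination.
best = None
for hidden in (16, 32, 64):
    for max_iter in (200, 500, 1000):
        clf = MLPClassifier(hidden_layer_sizes=(hidden,), max_iter=max_iter,
                            random_state=0)
        clf.fit(X_train, y_train)
        acc = clf.score(X_val, y_val)
        if best is None or acc > best[0]:
            best = (acc, hidden, max_iter)

print("best validation accuracy %.2f with hidden=%d, max_iter=%d" % best)
```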

Message 4 of 12
(5,748 Views)

Thank you for sending the updated VI. It seems to work fine when I give it test inputs from that training test VI, but when I give different inputs (not the test data output) to the system, it gives the same class value for different inputs.

For example, 200154 is read as 111111, 222222, or 888888.

Does the size of the input array affect it?

 

0 Kudos
Message 5 of 12
(5,721 Views)

The input data, 'Training data samples', is a 2D array. Each row represents one sample. The number of columns is usually determined by the model; in your case, that is the number of pixels in the image. Of course, if you can preprocess your data to reduce the size, that will help. The number of rows is equal to the number of samples you have for training. Usually the more, the better; however, a larger data set will take longer to train.
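If preprocessing helps, here is a minimal sketch of one common way to reduce the number of columns: block-averaging each image before flattening it into a row. The 16x16 image size and factor of 2 are assumptions for illustration only:

```python
import numpy as np

def downsample(image, factor=2):
    """Average non-overlapping factor x factor blocks of a 2D image."""
    h, w = image.shape
    h, w = h - h % factor, w - w % factor              # crop so blocks divide evenly
    blocks = image[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

# Hypothetical 16x16 binary digit image; downsampling before flattening
# gives 64 columns per training row instead of 256.
img = np.random.default_rng(1).integers(0, 2, size=(16, 16)).astype(float)
row = downsample(img).ravel()
print(row.shape)   # (64,)
```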

 

You mentioned that when you give different inputs to the system, it gives the same class value for different inputs. I would suggest you add more training samples and increase the value of 'max iteration'.
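As a text-based analogy of what 'max iteration' controls (this is not the toolkit's BP Learn VI, just a minimal single-hidden-layer backpropagation loop on placeholder data), more training rows and more iterations both tend to improve the fit at the cost of training time:

```python
import numpy as np

def bp_learn(X, Y, hidden=16, max_iteration=1000, rate=0.01, seed=0):
    """Minimal single-hidden-layer backprop trainer; max_iteration plays the
    same role as the 'max iteration' control."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.1, size=(X.shape[1], hidden))
    W2 = rng.normal(scale=0.1, size=(hidden, Y.shape[1]))
    for _ in range(max_iteration):
        h = np.tanh(X @ W1)                         # hidden activations
        out = 1.0 / (1.0 + np.exp(-(h @ W2)))       # sigmoid outputs
        d_out = (Y - out) * out * (1 - out)         # output-layer error term
        d_h = (d_out @ W2.T) * (1 - h ** 2)         # hidden-layer error term
        W2 += rate * h.T @ d_out
        W1 += rate * X.T @ d_h
    return W1, W2

def bp_evaluate(X, W1, W2):
    """Return the predicted class index for each row of X."""
    out = 1.0 / (1.0 + np.exp(-(np.tanh(X @ W1) @ W2)))
    return out.argmax(axis=1)

# Placeholder set: 80 samples, 64 pixels, 10 classes encoded one-hot.
X = np.random.default_rng(2).integers(0, 2, size=(80, 64)).astype(float)
y = np.repeat(np.arange(10), 8)
W1, W2 = bp_learn(X, np.eye(10)[y], max_iteration=2000)
print((bp_evaluate(X, W1, W2) == y).mean())         # training accuracy
```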

0 Kudos
Message 6 of 12
(5,704 Views)

Hi duetcat,

 

Has this bug been resolved in the newer versions of the toolkit, or do we still need to replace the VIs?

 

Thank you.

0 Kudos
Message 7 of 12
(5,409 Views)

Hi Socrat,

 

We have not had a new release of this toolkit recently, so you still have to replace the VIs manually.

 

Regards,

duetcat

0 Kudos
Message 8 of 12
(5,394 Views)

Hi, my work on accelerometer-based hand gesture recognition resembles yours, and it would be of great help if you could help me out. I have generated a database array comprising 6 features for 50 samples. What modification should I make to the above VI, given that my test data is the feature set generated from a real-time accelerometer signal?
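For what it's worth, a rough sketch of the expected shape, with placeholder accelerometer traces and arbitrary feature choices (mean and standard deviation per axis), might look like this; the toolkit just needs the same row-per-sample layout:

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder: 50 recorded gestures, each a (200, 3) accelerometer trace,
# 5 gesture classes with 10 repetitions each (swap in the real recordings).
recordings = [rng.normal(size=(200, 3)) for _ in range(50)]
classes = np.repeat(np.arange(5), 10)

# 6 features per gesture: mean and standard deviation of each axis.
features = np.array(
    [np.concatenate([sig.mean(axis=0), sig.std(axis=0)]) for sig in recordings])
print(features.shape)   # (50, 6): one row per sample, as the toolkit expects
```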

 

0 Kudos
Message 9 of 12
(5,073 Views)

I have connected the database array and training output directly to BP Learn without using the training and testing block. Similarly, I have applied the test data array to the BP Evaluate block. I don't know what to do with the test output terminals.
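One common use of the test outputs is simply to compare them with the known classes of the same test rows to get a recognition rate; the numbers below are made up just to show the idea:

```python
import numpy as np

# Hypothetical "test output" from the evaluate step: one predicted class per test row.
predicted = np.array([2, 0, 0, 1, 5, 4])
expected = np.array([2, 0, 0, 1, 5, 3])    # known classes for the same rows

accuracy = np.mean(predicted == expected)
print("recognition rate: %.0f%%" % (100 * accuracy))
```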

 

 

0 Kudos
Message 10 of 12
(5,060 Views)