04-17-2014 03:38 AM - edited 04-17-2014 04:01 AM
Best regards to all NI users.
We are finishing an application that acquires data from various sensors and stores it in a Firebird database.
The database is running and data is inserted into the different tables correctly. The problem is with one of the tables, where each record has 4 primary-key columns and 6 sensor variables, so 10 columns in total, and each test performed produces around 1,500 records.
The data is stored correctly, but the time the write cycle takes is unworkable: saving the data takes about 2-3 minutes. I attach a picture of the code showing how this query is run.
I wonder if there is any way to optimize this data storage, because at two or three minutes per recorded test, the application becomes unusable.
I hope you can help me, because I have been stuck on this table for more than two weeks.
Thanks and greetings to all.
04-20-2014 06:32 AM
Taking DB Open Connection, DB List Columns and DB Close Connection out of the loop, and keeping only DB Insert inside the loop, should speed things up a lot!
04-23-2014 08:53 AM
rolfk,
Thank you for your response. I changed the code according to your suggestion and the speed improved a little, but it is still not right. For the first test inserted (around 1,500 rows) the time is 30 seconds, but by the tenth test we insert into the database the time is around 3 or 4 minutes, and it seems to grow exponentially. Is this behaviour normal for a database? Can it be solved?
Thanks and regards.
04-23-2014 09:15 AM
04-24-2014 06:24 AM
Hi Dennis,
Maybe I didn't explain it properly. If the table is empty and I insert the first acquired test, the time is around 30 seconds, which is perfectly acceptable for the application. The problem is that as the number of stored tests increases, those 30 seconds grow considerably, apparently exponentially, over time. We now have around 120 tests and 900,000 rows in the table, and the insert loop for a new test takes up to 5 minutes.
I don't know whether this increase in time with the number of rows, or with the size of the table, is normal.
Is it possible from LabVIEW to compact a table in a database, or to compact all the information in the DB? I reviewed the DB Toolkit manual and couldn't find anything.
I also looked at the stored procedures you mentioned; maybe the solution is to program the insertion inside the database itself and have LabVIEW only call that procedure. I will look into it.
Thanks for your help.
BR.
04-24-2014 10:24 AM
I don't know whether this increase in time with the number of rows, or with the size of the table, is normal.
It's not normal for a good database engine. But I have never heard of Firebird, and from your observations it would seem this database engine was not written to handle the amount of data you want to throw at it efficiently.
Maybe evaluate a different database engine that is capable of handling your data requirements?!