10-15-2010 07:34 AM
Hi All
I am using LabVIEW 2009. The application typically runs on Windows XP.
I have a subVI that inserts data into MS SQL Server. This VI uses the Database Connectivity Toolkit to write to the database.
I made this VI reentrant, and my top-level VI dynamically calls its clones.
Now suppose my top-level VI opens 20 clones of the subVI; everything seems to be OK. As expected, all 20 clones insert data into the database in parallel, once per second (20 records/second).
However, if my top-level VI opens 100 or 1000 clones of the subVI, the subVIs do not insert 100 or 1000 records/second into the database. Their execution speed decreases (approx. 35 records/second).
Are my database VIs creating this overhead or jitter? My subVI is reentrant, but when I examined the Database Toolkit, I found that almost all of its VIs are not reentrant.
So I edited all the Database Toolkit VIs and made them reentrant, but the problem is still unsolved.
10-15-2010 07:47 AM
Deepak,
the database itself is a limiting factor. Since access to it has to be managed by the database provider, I would even say that it cannot handle parallel access at all. This is the reason the database connectivity VIs are non-reentrant by default. Changing this can lead to unwanted behavior, and possibly even a slowdown.
If the VIs are non-reentrant, access is serialized by LV itself, so if any lock occurs, LV can resolve it on its own (about 2 ms). If the lock occurs in the database, resolving the situation will most probably take much longer.
Therefore I suggest that you do not clone access to a single database. Instead, collect everything you want to pass to the database in LV and write to the database from a single writer.
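For illustration, here is a minimal sketch of that single-writer pattern outside LabVIEW, in Python, with sqlite3 standing in for SQL Server (the file, table, and column names are invented for the example): many "clones" only enqueue records, and exactly one thread owns the database connection.

```python
import queue
import sqlite3
import threading

DB_PATH = "demo.db"  # stand-in for the real SQL Server database

def writer(q):
    """The single writer: the only thread that touches the database."""
    conn = sqlite3.connect(DB_PATH)
    conn.execute("DROP TABLE IF EXISTS readings")
    conn.execute("CREATE TABLE readings (clone_id INTEGER, value REAL)")
    while True:
        record = q.get()
        if record is None:        # sentinel: shut the writer down
            break
        conn.execute("INSERT INTO readings VALUES (?, ?)", record)
    conn.commit()
    conn.close()

q = queue.Queue()
t = threading.Thread(target=writer, args=(q,))
t.start()

# 100 "clones" enqueue records; none of them holds a DB connection.
for clone_id in range(100):
    q.put((clone_id, clone_id * 0.5))
q.put(None)
t.join()
```

In LabVIEW terms, the queue plays the role of a named queue shared by the clones, and the writer loop is a separate non-reentrant consumer VI.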
hope this helps,
Norbert
10-15-2010 08:01 AM
The DB itself could be the bottle-neck.
I'd first try moving the DB to another machine, and also make sure you are using a server that is efficient (I don't remember which version was fast, maybe Jet?).
Ben
10-15-2010 08:41 AM
As I understand it, the DB Toolkit VIs work through ActiveX, which itself is not reentrant and may be a bottleneck.
We are battling the same problem and haven't found a good solution. It looks like we will have to go with two queues: one high priority, and one lower priority that only accesses the DB on a best-effort basis. Not ideal.
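For what it's worth, the two-queue idea can be sketched in plain code. This is a hypothetical Python analogue (the record strings are invented): the consumer always drains the high-priority queue before it touches the best-effort one.

```python
import queue

high = queue.Queue()  # records that must reach the DB
low = queue.Queue()   # best-effort records, served only when idle

def next_record():
    """Return the next record for the DB writer, high priority first."""
    try:
        return high.get_nowait()
    except queue.Empty:
        pass
    try:
        return low.get_nowait()
    except queue.Empty:
        return None           # nothing pending

low.put("trend sample")
high.put("alarm event")
assert next_record() == "alarm event"   # high priority wins
assert next_record() == "trend sample"  # then best-effort
```

Under sustained load the low queue can simply be allowed to grow (or be trimmed), which is exactly the "best effort" compromise described above.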
From conversations with NI, I believe the LV 2010 DSC add-on is multi-threaded with regard to DB access. I don't know whether those improvements are available through the DB Toolkit, though, and we are stuck on 8.5.1 for this particular project.
Matt
10-15-2010 08:45 AM
@Matthew Williams wrote:
As I understand it, the DB Toolkit VIs work through ActiveX, which itself is not reentrant and may be a bottleneck.
We are battling the same problem and haven't found a good solution. It looks like we will have to go with two queues: one high priority, and one lower priority that only accesses the DB on a best-effort basis. Not ideal.
From conversations with NI, I believe the LV 2010 DSC add-on is multi-threaded with regard to DB access. I don't know whether those improvements are available through the DB Toolkit, though, and we are stuck on 8.5.1 for this particular project.
Matt
Mike Porter has figured out how to access the DB using ADO directly, avoiding the toolkits completely. He is way beyond me in this area, but maybe that is an avenue that could help you.
Ben
10-15-2010 09:01 AM
If you have lots of queries, maybe you can combine them into fewer but bigger ones? Maybe all of these 200 inserts can be combined into 1 large insert? As mentioned, the database is a limiting factor, so throwing more threads at it won't help.
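As a sketch of what batching buys you, here is the idea in Python with sqlite3 (table and column names are invented; against SQL Server you would instead build one multi-row INSERT or a parameterized batch): one statement and one commit replace 200 separate round trips.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (clone_id INTEGER, value REAL)")

# 200 records collected over one interval, inserted in a single batch.
rows = [(i, i * 0.5) for i in range(200)]
conn.executemany("INSERT INTO readings VALUES (?, ?)", rows)
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0])  # -> 200
```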
/Y
10-15-2010 09:04 AM
@Yamaeda wrote:
If you have lots of queries, maybe you can combine them into fewer but bigger ones? Maybe all of these 200 inserts can be combined into 1 large insert? As mentioned, the database is a limiting factor, so throwing more threads at it won't help.
/Y
Similar to trying to speed up a soda jerk who isn't scooping fast enough by adding more customers to the counter screaming for chocolate.
Ben
10-16-2010 01:30 AM
Thanks All,
Is there any other method to insert data into the database without using the Database Toolkit?
My database is installed on one machine, and the cloned applications are running on a different machine.
According to the application requirements, we cannot hold instantaneous data in queues. We need to insert it into the database as it comes.
Also, each clone has its own frequency of inserting data (ms to seconds); if I club together the data of a group of clones, the requirement isn't fulfilled.
I also inquired about some other programming languages. They manage an individual thread for each new connection, so they are easily capable of inserting data in a truly parallel fashion for more than 1000 connections.
Is LabVIEW lagging behind for this type of database operation requirement?
This is quite critical for our application development, so rather than waste much more time, we may select another language for this requirement.
10-18-2010 04:46 AM
I don't think the language is the limiting factor. Database and network speeds matter a lot. What database are you using, and how good is the network? How big is the data being inserted?
"Now suppose my top-level VI opens 20 clones of the subVI; everything seems to be OK. As expected, all 20 clones insert data into the database in parallel, once per second (20 records/second)."
Do you open and close the database connection between each insert, or do you keep it open? That makes a big difference in speed.
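To illustrate why, here is a rough Python sketch with sqlite3 standing in for SQL Server (absolute timings will differ; the point is the per-connect overhead in the first version versus one persistent connection in the second):

```python
import os
import sqlite3
import tempfile
import time

fd, path = tempfile.mkstemp(suffix=".db")
os.close(fd)
init = sqlite3.connect(path)
init.execute("CREATE TABLE t (v INTEGER)")
init.close()

def insert_reconnect(n):
    """Open and close a fresh connection for every single record."""
    for i in range(n):
        conn = sqlite3.connect(path)
        conn.execute("INSERT INTO t VALUES (?)", (i,))
        conn.commit()
        conn.close()

def insert_persistent(n):
    """Keep one connection open for all records."""
    conn = sqlite3.connect(path)
    for i in range(n):
        conn.execute("INSERT INTO t VALUES (?)", (i,))
    conn.commit()
    conn.close()

t0 = time.perf_counter()
insert_reconnect(500)
t1 = time.perf_counter()
insert_persistent(500)
t2 = time.perf_counter()
print(f"reconnect: {t1 - t0:.3f}s  persistent: {t2 - t1:.3f}s")
```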
/Y
10-18-2010 07:47 AM
We are using MS SQL Server 2008; the database machine is directly connected to our local LAN.
The size of the data is approx. 500 bytes/second.
We need to insert data every second from all the clones, which is why we are not closing the database connection every time.
Each clone has its own connection to the database.