06-25-2015 08:33 AM
I have numerous tables on a SQL server and I want to do some analysis of the data. I have created parameterized queries to get only the data I want.
However, it takes an extremely long time to return the results.
Query run in Microsoft SQL Management Studio - avg. 7 seconds
Fetch All - 9.6 seconds
Convert data from database variant to datatype - 55 seconds
The current dataset I am running is 52 columns by 5227 rows (4 days of data). We would like to be able to analyze longer periods, but then the conversion alone takes minutes.
To compile everything into an array for analysis, I am converting everything to a string. The datatypes I have in the database are Timestamp, String, Boolean (bit), INT and Double.
I thought the OpenG Variant to String would work, but since the Database Toolkit returns a database variant rather than a normal variant, the OpenG VI treats all the elements as default (refnums), and this causes an error.
Variant To Flattened String is very fast (less than a second), but the data is not converted correctly.
Here is how I am doing this currently, but it is SLOOOW.
06-25-2015 08:53 AM
Ok, I solved my own question.
I used post 5 in this thread on LAVA: https://lavag.org/topic/7555-convert-variant-2d-array-to-string-and-return-back/
Basically, convert the data by column instead of one cell at a time, since every cell in a column has the same datatype.
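Since LabVIEW is graphical, here is a rough Python sketch of the idea (the data and conversion rules are hypothetical, just to show the structure): the type decision is made once per column instead of once per cell, which is where the savings come from.

```python
# Per-column conversion: detect the datatype once per column,
# then apply the same conversion to every cell in that column.
from datetime import datetime

# Hypothetical result set with one datatype per column
# (timestamp, int, double), mirroring the database rows above.
rows = [
    (datetime(2015, 6, 25, 8, 0), 1, 3.14),
    (datetime(2015, 6, 25, 8, 1), 2, 2.72),
]

def convert_by_column(rows):
    columns = list(zip(*rows))        # transpose: one tuple per column
    out_cols = []
    for col in columns:
        # Decide the conversion ONCE per column, not once per cell.
        if isinstance(col[0], datetime):
            out_cols.append([v.isoformat() for v in col])
        elif isinstance(col[0], float):
            out_cols.append([f"{v:g}" for v in col])
        else:
            out_cols.append([str(v) for v in col])
    # Transpose back so the result is again row-major.
    return [list(r) for r in zip(*out_cols)]

print(convert_by_column(rows))
```

The same structure applies in LabVIEW: transpose (or index by column), convert each column with a single datatype-specific path, then reassemble.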
06-25-2015 04:39 PM
You should also create the string array up front at the correct size and then use Replace Array Subset.
That will make the memory handling in LabVIEW much faster.
Currently LabVIEW needs to reallocate the string array on every iteration of the two for-loops as the array grows.
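A small Python sketch of that preallocate-then-replace pattern (the dimensions are taken from the dataset mentioned above; the per-cell conversion is just a placeholder). It mimics LabVIEW's Initialize Array followed by Replace Array Subset inside the loops:

```python
# Preallocate the full 2D string array once, up front,
# instead of growing it on every loop iteration.
n_rows, n_cols = 5227, 52

# Equivalent of LabVIEW's Initialize Array: allocate everything now.
result = [[""] * n_cols for _ in range(n_rows)]

# Equivalent of Replace Array Subset: overwrite elements in place,
# so the array never has to grow (no reallocation per iteration).
for r in range(n_rows):
    for c in range(n_cols):
        result[r][c] = f"{r},{c}"   # placeholder for the real conversion
```

In LabVIEW the effect is larger than in Python, because a growing LabVIEW array forces the memory manager to reallocate and copy the whole buffer repeatedly, while a preallocated array is written in place.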