LabVIEW Idea Exchange

0 Kudos
OneNorse

Array to Variant Function

Status: Declined

Any idea that has not received any kudos within a year after posting will be automatically declined.

I'm writing a VI for a lab which writes to a database. The number of measurements varies depending on which fixture is set up in the lab, and on whether the experimenter decides to add extra ones. This results in a varying-size array of doubles which gets written into a single database table with generic column names c1-c200. The column names are cross-referenced in another table keyed to the lab fixture.
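[Editor's note] Since LabVIEW is graphical, here is a minimal sketch of the schema pattern described above in Python/sqlite3. The table names, fixture names, and sensor names are all hypothetical, and the generic columns are truncated to c1-c5 for brevity:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Measurement table with generic column names c1..c200 (truncated to c1..c5 here).
cur.execute("CREATE TABLE measurements (RunID INTEGER, " +
            ", ".join(f"c{i} REAL" for i in range(1, 6)) + ")")

# Cross-reference table mapping each fixture's sensors to generic columns.
cur.execute("CREATE TABLE sensor_map (Fixture TEXT, Sensor TEXT, ColumnName TEXT)")
cur.executemany("INSERT INTO sensor_map VALUES (?, ?, ?)",
                [("FixtureA", "Thermocouple1", "c1"),
                 ("FixtureA", "Pressure1", "c2")])

# At run time, look up which generic columns this fixture actually uses.
cols = [row[0] for row in cur.execute(
    "SELECT ColumnName FROM sensor_map WHERE Fixture = ? ORDER BY ColumnName",
    ("FixtureA",))]
print(cols)  # ['c1', 'c2']
```

The point of the cross-reference table is that only the mapping changes per fixture; the wide measurement table keeps one fixed schema.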

The method I found here is to convert array -> cluster -> variant -> database insert. The problem with this is that a cluster's size must be known at compile time, which makes handling the varying array size difficult. It also limits the size of the array/database entry to 256 elements in the array -> cluster conversion.

I know I could just populate and write a 200-element array into the database, but that increases database activity, is inelegant, and forces me to put zeros in the unused columns instead of leaving them null.

 

Attached is a VI which demonstrates what I would like, though it only works for doubles and is still limited by the 256 element limit. It could very easily be changed for integers or strings, but it would be best if it could take any type of array input, with a much larger size limit. 

7 Comments
AristosQueue (NI)
NI Employee (retired)

First of all... save your "Array To Variant.vi" with the .vim file extension in LV 2017 or later, and turn on inlining. Done with that? Great. You now have a malleable VI that is fully polymorphic to handle any type of array, not just doubles. So first problem solved.

 

Now, about the "must make decision at compile time"... unless I'm wildly misunderstanding something, it's the database that has a fixed size in this example... that's the reason that you have to use clusters, so that there's one field per field of the table. No matter what functionality LV adds, you'd still have to decide the number of items at compile time.

 

For the record, the "To Variant" primitive can already convert an array to a variant... indeed, it handles any LV data type. But the database needs a variant that represents a cluster specifically with the same number of fields as the table has columns, so using To Variant on your array isn't going to help.

OneNorse
Member

Thanks for that. The .vim extension is new to me, so I tried it, and it works great as far as it goes. 'Array to Variant' still has the 256-element limit, and is still a kludge with 256 case values, one per size of incoming array.

 

At the beginning of each test, a sensor map is read from another table, and an array of column names is generated, listing only the columns used in that particular run. The double array is created to have the same size. 

During the data collection, each row is added to the database using 'DB Tools Insert Data.vi' with a few standard fields, like DataRunID and Status. This results in a new row with ~5 filled in values and 200 Null elements. The row is immediately updated with 'DB Tools Update Data.vi' whose inputs are the array of column names and the (variant) array of data*. The size of the variant input must match the size of the column name array, which is determined at run time. After the update, anywhere from 30 to 150 of the double columns will be filled (with 50 more for future use) and the rest will remain Null in the DB.
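[Editor's note] The insert-then-update flow described above can be sketched in Python/sqlite3; the table and column names are hypothetical, and the generic columns are truncated to c1-c3. The key point is that the SET clause is built at run time from the column-name list, so unused columns stay NULL:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE measurements (DataRunID INTEGER, Status TEXT, "
            "c1 REAL, c2 REAL, c3 REAL)")

# Step 1: insert the standard fields; all data columns stay NULL.
cur.execute("INSERT INTO measurements (DataRunID, Status) VALUES (?, ?)",
            (42, "OK"))

# Step 2: update only the columns used in this run. The column-name list and
# the data array are both sized at run time, so the SET clause is built
# dynamically; their lengths must match.
cols = ["c1", "c3"]            # from the sensor map, determined at run time
data = [1.5, 2.5]              # same length as cols
assert len(cols) == len(data)
set_clause = ", ".join(f"{c} = ?" for c in cols)
cur.execute(f"UPDATE measurements SET {set_clause} WHERE DataRunID = ?",
            (*data, 42))

print(cur.execute("SELECT * FROM measurements").fetchone())
# (42, 'OK', 1.5, None, 2.5)   -- c2 remains NULL
```

This mirrors the length-match requirement that 'DB Tools Update Data.vi' enforces between its column-name and data inputs.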

 

Using the 'To Variant' primitive to convert the double array before piping it into the 'Update Data.vi' results in an error complaining that the sizes of the two inputs are different, presumably because the variant array does not have the extra cluster information and is deemed too small.

 

* The reason for two steps is that building a cluster out of, say, 2 ints, 3 strings, and an array of 50 doubles gets you a cluster with 6 elements, one of which has 50 doubles. What would be needed for a single DB Insert would be a cluster with 55 elements.
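[Editor's note] The 6-versus-55 distinction in the footnote can be shown concretely; the values below are placeholders:

```python
# A record holding 2 ints, 3 strings, and an array of 50 doubles. Treating the
# array as a single member yields 6 fields; a one-shot per-column DB insert
# would need every double as its own field, i.e. 55.
ints = [7, 8]
strings = ["a", "b", "c"]
doubles = [0.0] * 50

as_cluster = [ints[0], ints[1], strings[0], strings[1], strings[2], doubles]
flattened = ints + strings + doubles   # one scalar per database column

print(len(as_cluster), len(flattened))  # 6 55
```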


drjdpowell
Trusted Enthusiast

When I used the DB Toolkit about a decade ago, I quickly hit the problem of it only working with clusters. However, I noticed that the toolkit converts the cluster to an array of variants internally, so it was easy to make a copy of one of the methods and convert it to accept an array of variants directly. I suggest you do that (and NI should consider adding such a function).

drjdpowell
Trusted Enthusiast

BTW, see the conversation in this closely-related idea: Convert-an-Array-of-Variants-into-a-Cluster.

OneNorse
Member

Thanks for the suggestion. I took a look in the toolkit, and that approach looks good for when I get time for it. I also noticed a thread indicating that a 64-bit version of the DB Toolkit is in beta. Hopefully that will include upgrades to usability along with the 64-bit compatibility.

AristosQueue (NI)
NI Employee (retired)

@OneNorse wrote:
Hopefully that will include upgrades to usability along with the 64-bit compatibility.

Last I heard, it will not. The 64-bit work is only testing existing functionality to confirm operation on 64-bit platforms and fixing any bugs needed to make existing functionality work.

Darren
Proven Zealot
Status changed to: Declined

Any idea that has not received any kudos within a year after posting will be automatically declined.