03-01-2006 07:16 AM
INSERT INTO table_name VALUES (value1, value2,....)
command. I have used this to get DB data sets. I have not tried inserting, but I find it easier to parse the strings iteratively, using string concatenation to build the values inside a loop with a shift register, rather than dealing with clusters, which tend to have to be predefined.
Paul
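Since a LabVIEW diagram can't be shown in text, here is a Python sketch of the same pattern: an accumulator string (playing the role of the shift register) that grows by one concatenated value per loop iteration. The function name and single-quote escaping are illustrative assumptions, not part of the original post.

```python
def build_insert(table, values):
    """Build an INSERT INTO statement by iterative string concatenation,
    mirroring the loop-with-shift-register approach described above."""
    acc = ""  # accumulator, analogous to the shift register
    for v in values:
        if acc:
            acc += ", "
        # quote each value and double embedded single quotes (basic SQL escaping)
        acc += "'" + str(v).replace("'", "''") + "'"
    return "INSERT INTO " + table + " VALUES (" + acc + ")"

print(build_insert("table_name", ["value1", 2, "O'Brien"]))
```

Note that in production code you would normally pass the values as parameters to the database driver rather than concatenating them into the statement, to avoid SQL injection.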
07-08-2010 12:25 PM - edited 07-08-2010 12:26 PM
I had need of dynamically variable size arrays of integers and was contemplating an approach just as you suggested, but your tip regarding converting to a variant was what I needed to actually implement it.

FYI, we have a very high data rate application, and we needed to develop a custom binary file format with configurable channel counts, etc., and with as little overhead as was feasible. We stored information about each channel with the data, so I needed to find a way of scaling the stored array sizes to match the number of channels configured. I used my channel count property as the selector for a case structure that created my cluster (then variant) for each possible number of channels (1 through 16 in my case). On the other end, just as I needed to write the data to the file, I had another case structure and a Variant to Data that converted it back to its proper format before writing it in binary to the file (the File I/O operation was also in the case structure).

Worked like a charm. Thanks for your tip, it probably shaved a day or two of effort off my work :).
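As a rough text-language analogue of the scheme above: in Python, the binary layout can be parameterized directly by the channel count, so one code path covers every count instead of a 1-through-16 case structure. The frame layout here (a 16-bit count followed by doubles) is an assumption for illustration, not the poster's actual file format.

```python
import struct

def pack_frame(samples):
    """Pack a variable-length list of channel samples into binary.
    The format string is built from the channel count at run time,
    replacing the per-count case structure with a single path."""
    fmt = "<H" + "d" * len(samples)  # little-endian: uint16 count, then doubles
    return struct.pack(fmt, len(samples), *samples)

def unpack_frame(data):
    """Recover the sample list: read the count, then that many doubles."""
    count = struct.unpack_from("<H", data)[0]
    return list(struct.unpack_from("<" + "d" * count, data, 2))
```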
01-19-2016 05:19 PM - edited 01-19-2016 05:24 PM
In case anyone is still having the same problem, I've attached an INSERT_NEW_ROW VI that accepts a string array as the new row values, adding each value to the corresponding column of the database table. Basically, I build the SQL INSERT statement from the array.
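For readers without the attachment, a minimal sketch of what such a VI computes, written in Python since the diagram itself can't be shown here. Mapping each value to a named column, and the function name, are assumptions based on the post's description.

```python
def insert_new_row(table, columns, row):
    """Build an INSERT statement from parallel arrays of column names
    and string values, one value per column."""
    cols = ", ".join(columns)
    vals = ", ".join("'" + v.replace("'", "''") + "'" for v in row)
    return f"INSERT INTO {table} ({cols}) VALUES ({vals})"

print(insert_new_row("table_name", ["id", "name"], ["1", "alpha"]))
```

As with any string-built SQL, a parameterized query through the database driver is the safer choice when the values come from user input.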