
Database to TDMS Converter (Query Help)

So I have some database files that I'd like to convert into TDMS.  At the moment we have some very large Microsoft Access databases (around 1 GB), and I'd like to do some testing: converting them to TDMS and adding various properties to the TDMS file, like min, max, and average values for channels when appropriate.

 

I have very little database experience.  So little I almost don't feel like mentioning it.  But I have a huge database and the Database Connectivity Toolkit, and I am able to query a table and the columns in that table.  The thing I struggle with is getting rows (records?).

 

Attached is an example database and a VI that will open the database and convert it into a TDMS file.  The problem is it does this by reading all rows in a table for a single column, then writes that to a TDMS file.  My database may have 8 million rows, and I can't do that in a single read.  What I would like to do is find the number of rows in a column (in a table), then read a number of them, say 10,000 at a time, and write those samples as chunks to the TDMS file.

 

I've tried various strings in the Conditions input, like WHERE Rows > 5, but I always end up getting errors.  Can someone help me determine the number of rows in a column, and come up with a query that has basically an offset and a length for the samples to read?  Thanks.

Message 1 of 12

Generally a database table will have a column that is the primary key.

So, you would use a SELECT statement on that column.

http://www.w3schools.com/sql/sql_where.asp
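
For example, if the table had an integer key column (RecordID and MyTable are hypothetical names here), you could pull the data back in chunks with a range condition on that key:

SELECT * FROM MyTable WHERE RecordID >= 10000 AND RecordID < 20000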

Message 2 of 12

Just a thought: why not export the database table to a CSV file and then iterate over the file line by line using just LabVIEW primitives?

 

My experience is that the DB Toolkit is not very effective when dealing with very large data sets.

Message 3 of 12

When making your query:

where (SystemName='TempController') and (FileName='this.file')

Here is a partial SQL string from a working query fed into a LabVIEW select VI. SystemName and FileName are columns in an MS SQL table.

See if that helps to get what you want.

Message 4 of 12

Message 5 of 12

I don't think you can count how many rows are in a column, but you can see how many rows are in a table. Put this into your query:

 

SELECT COUNT(*) FROM YourTableName

 

(Attachment: Count.png)

Message 6 of 12

Try DB Tools Get Properties. It has:

Column count, record index, record count, etc...

Message 7 of 12

@Hooovahh wrote:
 The problem is it does this by reading all rows in a table for a single column, then writes that to a TDMS file.  My database may have 8 million rows, and I can't do that in a single read.  What I would like to do is find the number of rows in a column (in a table), then read a number of them, say 10,000 at a time, and write those samples as chunks to the TDMS file.

You might be able to use the Advanced palette of the DB tools to go through a selected recordset row by row, rather than ever working with a giant array.   I would look at "DB Tools Fetch Element Data".

 

Alternatively, see if Access has an equivalent of SQLite's LIMIT and OFFSET keywords, which allow one to select in chunks.
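
For reference, the SQLite form would look something like this (MyTable is just a placeholder name):

SELECT * FROM MyTable LIMIT 10000 OFFSET 20000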

Message 8 of 12

Thanks for all of your responses; sorry I haven't responded sooner, I was a bit busy.

 

@nyc This data doesn't have a column that could be used as a primary key.  The data is just rows of data, like in a text file.  There is a Total Time column that is ever increasing that I might be able to use, but the value difference between rows is inconsistent.  I think I could make this work, but it might not be optimal, getting too few rows in some cases and too many in others.

 

@PhillipBrooks How would you suggest I do the conversion?  The only function I saw was Save Dataset to File, which only supports XML or ADTG.  While the XML export did work without crashing or running out of memory, it also increased the size of the file.  The database was about a 1 GB MDB, which turned into a 2.9 GB XML file, which I don't think is any easier to work with.

 

@PatrickLye I believe I know how to get each column one at a time, which my posted code does by using SELECT [Column] FROM [Table].  My issue is how do I get only N rows, starting at a row offset?

 

@Ben I found that cheat sheet and it is awesome, but I didn't see anything on there about getting a row selection.

 

@Eric1977 Awesome, that is a good first step, and it returns quickly even with 8.7M rows.

 

@PatrickLye I see that Get Properties shows the Record Index and Record Count, but Set Properties doesn't have an equivalent way of setting them, at least none that I saw.

 

@drjdpowell I get error -2147217900 when I try to use "SELECT * FROM accdbtable LIMIT 2" or "SELECT * FROM accdbtable OFFSET 2", so I think that means I can't use that syntax.

 

EDIT: the SELECT TOP 2 * FROM accdbtable does work, so that's a start.
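
Maybe I can page with nested TOP queries.  Something like this might pull rows 10,001 through 20,000 (just a sketch; TotalTime stands in for my Total Time column, and since its values aren't unique the page boundaries might not be perfectly stable):

SELECT * FROM (
    SELECT TOP 10000 * FROM (
        SELECT TOP 20000 * FROM accdbtable ORDER BY TotalTime ASC
    ) AS FirstRows ORDER BY TotalTime DESC
) AS PageRows ORDER BY TotalTime ASC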

Message 9 of 12

Can you get this data to open in Access? If so, you should be able to add a primary key column and write some basic code to increment the number if autonumbering the field doesn't do it for you. Then you can access the data via SQL calls on this field.

If you don't have that available, you can download MS SQL Express, load it, import the Access data into a new table, add the primary key column, and autonumber the column or run some SQL from Management Studio to fill it. Then you can access the data via SQL calls on this field.
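
In SQL Server the added column could be an identity column, and then the chunked reads become simple range queries. A rough sketch with placeholder table and column names:

ALTER TABLE MyTable ADD RowID INT IDENTITY(1,1);

SELECT * FROM MyTable WHERE RowID BETWEEN 10001 AND 20000;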

Message 10 of 12