Logging multiple unsynchronised data sources

I am developing an application which acquires and logs data from multiple data streams (~15 in total), all running on their own clocks at different data rates and wholly unsynchronised. There is nothing I can do about this, as they come from different manufacturers and transmit data over UDP. A couple of them split their data into sections, so I have one device sending out 7 different blocks of data which aren't synchronised with each other even though they come from the same device.

 

In order to roughly synchronise them, I am oversampling these devices and logging at a much slower rate into one file, which works fine.

 

I have decided that it would probably be useful, for debugging and more advanced usage in the future, if I could log all of the data coming from all of the devices (this is an edge case, so I am not fussed if it isn't particularly elegant).

 

Any suggestions on a file structure to do this, i.e. log ~15 data streams all acquiring at different rates?

 

Ideal requirements are:

- Plain text file, as the customer is going to be looking at this in Excel (TDMS is out of the question)

- Ideally all data in one file

- Don't miss any data from any of the devices

 

I am pretty sure the only way of doing this is to create an individual file for each device, which I am not a fan of. I am just hoping that someone has a more elegant solution.

Message 1 of 12

NI has a File Type designed for Streaming multiple data sources, including data coming in at different rates, called TDMS (which, of course, stands for Technical Data Management Streaming).  While I confess that I have not used it myself (I try to keep my data sources synchronized, simple, and the files easy to parse), if you search the LabVIEW Help for Data Files, or do a Web Search for LabVIEW Data File Format, you can learn about a number of options that should solve your problem.

 

I have seen some talks at NI Week where industrial data were taken from very different systems at very different rates (including one that recorded Public Transit data from trains and trolleys in a metropolitan area, monitoring hundreds of variables, and the NI File Structure made it easy to examine and analyze these data).  So don't get discouraged -- NI might have a Solution all designed for you.

 

Bob Schor

Message 2 of 12

I'm aware of TDMS files...although I haven't used them either for the same reasons that you said.

 

I may resort to them, but this is for a customer who is only going to have access to Excel and MATLAB, so ideally I need to keep the data in a plain text format.

 

The question is what that format should be. Without going down the road of TDMS files, I think it has to be separate CSV files. I was just hoping someone had a better idea.

Message 3 of 12

Text files for data are typically rows and columns with maybe some headers.

 

This is a problem for asynchronous data: typically you need to write at least one row at a time, and it is almost impossible to insert a point into an existing row.

 

Suggestion: make it easy for yourself. Save to TDMS first, then convert to a text file before your program finishes.
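
In case a picture of that conversion step helps, here is a minimal sketch in Python using the npTDMS package (the file name is hypothetical); in LabVIEW the equivalent would be TDMS Read followed by something like Write Delimited Spreadsheet.

```python
# Post-run conversion sketch: dump each TDMS group to its own plain-text CSV.
# Assumes the npTDMS package (pip install npTDMS); "run_0001.tdms" is made up.
import csv
from itertools import zip_longest

from nptdms import TdmsFile

tdms = TdmsFile.read("run_0001.tdms")

for group in tdms.groups():
    channels = group.channels()
    with open(f"run_0001_{group.name}.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([ch.name for ch in channels])
        # Channels in one group can have different lengths because the sources
        # are asynchronous; pad the short ones so Excel still sees a rectangle.
        for row in zip_longest(*(ch[:] for ch in channels), fillvalue=""):
            writer.writerow(row)
```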

mcduff

Message 4 of 12

I third the motion for TDMS.  And then point your customer to NI's TDMS importer for Excel.  Async data at different rates doesn't tend to fit very naturally in a single flat text file.

There are ways to handle it, but they'll take quite a bit more work than sending your customer that link.

 

 

-Kevin P

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 5 of 12

@Niatross wrote:

I was just hoping someone had a better idea.


Another idea: write all data into the same file (plain-text spreadsheet). Give each signal a unique name and write three columns: timestamp, name and data.

Create a functional global variable for this (non-reentrant).

Excel can read the file and is able to filter one or more signals by name.
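
To make that layout concrete, here is a minimal sketch (in Python rather than LabVIEW; the file name and signal names are hypothetical) of the kind of file this produces:

```python
# Sketch of the single-file "long" layout: one row per sample, three columns
# (timestamp, signal name, value).  File and signal names are hypothetical.
import csv
import time

def log_sample(path, name, value):
    # In LabVIEW this append would live inside the non-reentrant functional
    # global so that every acquisition loop funnels through one writer.
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([f"{time.time():.6f}", name, value])

log_sample("all_streams.csv", "imu_accel_x", 0.012)
log_sample("all_streams.csv", "gps_altitude", 153.7)

# Resulting rows, filterable by the "name" column in Excel:
#   1700000000.123456,imu_accel_x,0.012
#   1700000000.223456,gps_altitude,153.7
```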

 

Message 6 of 12

That's something I haven't seen or thought of before, Martin. I probably won't use it because it would be so horrendously memory-inefficient at the data rates I am looking at.

 

I will have a look at the TDMS importer Kevin mentioned.

Message 7 of 12

I have started actually looking into this and decided to have a play with TDMS files. See my testbed attached so that you can see what I am trying to do.

 

Just to reiterate: I have multiple data sources, each with their own unsynchronised clock, transmitting data over UDP, so the data arrives non-deterministically and waveforms can't be used. A couple of the sensors also split their data into sections and transmit each section separately.

 

What I want to end up with is all of this data stored in a way that lets me easily create graphs overlaying time histories of data from different sources (basically an XY graph).

 

In my testbed you can see my first attempt (simplified to two sensors, one of which splits its data into two groups that are transmitted separately, sometimes at different rates).

 

This testbed doesn't work for a couple of reasons, primarily because it turns out that each call of the TDMS Write function doesn't write a new 'line' to the file but just appends the data for each channel to the end of that channel's existing data. This results in the timestamp column having twice as many entries as the actual data columns.

 

The only way I can see of doing this is either:

- Create a group for every block of data and write the received timestamp to the first column of that group. I would end up with multiple groups for the same sensor (in one case 15 groups, because the data is split across 15 data transmissions; see the sketch after this list)

- Write every channel as a timestamp-data pair
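
To make the first option concrete, here is roughly the layout I have in mind, sketched with the npTDMS Python writer rather than LabVIEW (group/channel names and values are just placeholders):

```python
# Option 1 sketch: one group per received block, with the receive timestamp
# written alongside the data every time a block arrives.  Successive writes
# append to the same channels, mirroring the TDMS Write behaviour above.
import numpy as np
from nptdms import TdmsWriter, ChannelObject

with TdmsWriter("testbed.tdms") as writer:
    # First block from sensor A, section 1
    writer.write_segment([
        ChannelObject("SensorA_sec1", "rx_time", np.array([0.013])),
        ChannelObject("SensorA_sec1", "force", np.array([1.21])),
    ])
    # A later block from the same section appends to the same channels
    writer.write_segment([
        ChannelObject("SensorA_sec1", "rx_time", np.array([0.027])),
        ChannelObject("SensorA_sec1", "force", np.array([1.19])),
    ])
```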

 

I am leaning towards the first option, but with either method I can't work out how to create an XY chart in DIAdem. The only way I can create the time histories I want would be to read the data back into LabVIEW.

 

Am I missing something simple, or am I on the right lines?

Message 8 of 12

Just for completeness, other file format options for this kind of thing include HDF5 and SQLite.  
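
If SQLite were acceptable, a minimal sketch of the same "long" idea using Python's built-in sqlite3 module (table, column and signal names are made up) would be:

```python
# Hypothetical SQLite version: one long table that any of the ~15 asynchronous
# streams can append to, queried per-signal afterwards.
import sqlite3

con = sqlite3.connect("all_streams.sqlite")
con.execute("CREATE TABLE IF NOT EXISTS samples (t REAL, signal TEXT, value REAL)")
con.executemany(
    "INSERT INTO samples VALUES (?, ?, ?)",
    [(0.000, "imu_accel_x", 0.012), (0.013, "gps_altitude", 153.7)],
)
con.commit()

# Pull one signal back out as an XY pair for plotting or overlaying.
rows = con.execute(
    "SELECT t, value FROM samples WHERE signal = ? ORDER BY t",
    ("imu_accel_x",),
).fetchall()
con.close()
```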

Message 9 of 12

I would write every channel (or perhaps channel group) as a timestamp - data pair.

 

Each channel (or channel group) arrives at its own asynchronous pace, which will probably not be perfectly regular.  So you can expect that each will need to store its own unique timing information.  When I read from async instruments, I generally set up blocking reads with short timeouts that I can ignore or clear so that I get my data immediately as it arrives.  As soon as I get data, I grab a timestamp for it.

   Time-of-receipt isn't *quite* time-of-occurrence, but it's at least consistent and is often as good as you can reasonably do.

 

If I had force data being transmitted, I would make 2 channels for it, named something like "force_time" and "force_data".  I'd probably store the time info as relative time in seconds while also storing a "force_t0" timestamp as a channel (or channel group) property.

 

(I'd probably use a single common "t0" for all the channels and maybe also save it in a property one level up in the file-group-channel hierarchy.  Yeah it's redundant, but it makes your ability to overlay all the different data more straightforward.)
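
To show how the overlay falls out of that scheme, here is a minimal read-back sketch using the npTDMS Python package; the group name, signal names, and the shared "t0" property follow the naming convention above and are otherwise assumptions.

```python
# Rebuild XY traces from "<signal>_time" / "<signal>_data" channel pairs.
# Assumes npTDMS; group name, signal names, and the "t0" property are made up.
from nptdms import TdmsFile

tdms = TdmsFile.read("run_0001.tdms")
group = tdms["Sensors"]                  # hypothetical group name
t0 = group.properties["t0"]              # common start time stored as a property

traces = {}
for signal in ("force", "pressure"):     # hypothetical signal names
    t = group[f"{signal}_time"][:]       # relative seconds since t0
    y = group[f"{signal}_data"][:]
    traces[signal] = (t, y)              # ready to overlay on one XY graph

# Because every pair is relative to the same t0, the traces line up on a common
# time axis even though each stream arrived at its own rate.
```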

 

 

-Kevin P

Message 10 of 12