LabVIEW


Data loss write to file

Solved!

I have a VI that reads an A2D and then logs two channels to a binary file. I am having issues with data loss when I write to the file and I cannot figure out where I am losing the data. I am sampling at 512 Hz and I believe I have all the DAQmx VIs set up correctly. I read up on the DAQmx VIs I am using but I cannot find a problem with them. I have tried several different A2Ds to make sure it isn't the hardware, and they all produce the same data.

 

You can ignore the Open and Write VIs outside the while loop, as they simply open the file and write header information for the program I am using to open the binary file.

 

Any thoughts on where the data is getting lost?

 

Message 1 of 12

Are you seeing all of the data on the Graph? (btw, that graph can be replaced with a chart and you wouldn't need the shift registers.)

 

I don't think this is the problem, but you could change your file position setting to be "0" and "End" instead of pulling the file size.

 

When you say you're missing data, what do you mean? Fewer samples per second than you expected, or gaps in the data? You need to clarify a bit.

Cheers


--------,       Unofficial Forum Rules and Guidelines                                           ,--------

          '---   >The shortest distance between two nodes is a straight wire>   ---'


Message 2 of 12

Yes, all the data appears to be on the graph.

 

When I open the file, the "lost" data actually shows up as zeros. I have checked a different acquisition program using the same input signal to verify that the signal generator isn't the problem. Attached is a picture of what the data looks like when it is "lost".

Message 3 of 12

Those look like pretty big chunks of data, not just a data point or two. It's definitely following a pattern too. Weird.

 

You can try not writing to file every single iteration, or just not read so fast. The file I/O could be getting overloaded since you're reading and writing as fast as it will let you. Add a Wait and see if that fixes the problem.

Cheers




Message 4 of 12

1. No need to set the file position.

2. No need to index your data and then recombine it to write the data.  Just write directly.

3. Use a Chart instead of a graph.  A chart keeps a history, eliminating the need for you to build up your array.

4. It would be really nice to see your Setup Channels VI so we can see how your task is set up.

5. You might want to consider having your DAQmx Read read a specific number of samples instead of -1, which is all available data.  This would allow you to sort of set the loop rate and ensure you always have data when you save it.
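[A sketch of point 5 in Python, since the real DAQmx Read is a block-diagram node rather than text code. `read_chunk` is a hypothetical stand-in for a hardware-timed DAQmx Read; the point is that a fixed-size blocking read paces the loop by itself.]

```python
# Reading a fixed number of samples per iteration instead of -1 (all available).
# read_chunk() is a hypothetical stand-in for DAQmx Read: a hardware-timed
# read blocks until the requested samples arrive, which paces the loop.

SAMPLE_RATE = 512       # Hz, as in the original post
SAMPLES_PER_READ = 256  # fixed chunk -> the read completes about twice a second

def read_chunk(n):
    """Placeholder for DAQmx Read: returns exactly n samples."""
    return [0.0] * n    # dummy data

total = 0
for _ in range(4):      # four loop iterations ~= two seconds of acquisition
    data = read_chunk(SAMPLES_PER_READ)
    total += len(data)  # always exactly 256 samples, never an empty array

print(total)  # 1024
```

With -1 wired in, each iteration returns however many samples happen to be buffered (possibly zero); with a fixed count you always have a full chunk when you write.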


GCentral
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 5 of 12

 


@James.M wrote:

Those look like pretty big chunks of data, not just a data point or two. It's definitely following a pattern too. Weird.

 

You can try not writing to file every single iteration, or just not read so fast. The file I/O could be getting overloaded since you're reading and writing as fast as it will let you. Add a Wait and see if that fixes the problem.


Tried this and it does the same thing as below when I read in a specific number of samples. See attached image.

 

@crossrulz wrote:

1. No need to set the file position.

2. No need to index your data and then recombine it to write the data.  Just write directly.

3. Use a Chart instead of a graph.  A chart keeps a history, eliminating the need for you to build up your array.

4. It would be really nice to see your Setup Channels VI so we can see how your task is set up.

5. You might want to consider having your DAQmx Read read a specific number of samples instead of -1, which is all available data.  This would allow you to sort of set the loop rate and ensure you always have data when you save it.


The VI that I created uses code from a much larger program, so some of the stuff that isn't needed in this program is there because I was recreating the program that had issues. Also, the channel setup VI should be in the first post; I'll reattach it in case you can't get to it.

 

I did get rid of the set file position, changed the graph to a chart, and changed the samples to read to 256 so it should write to the file twice a second. What I end up getting is data, zero data, and then data again where it left off (see attached image). To me it seems like the Write to File is writing to the file even when there isn't any data. This happens even if I add a case structure that checks whether the array is empty and skips the write when it is.

Message 6 of 12

Oh, and if I wire the DAQmx Read data directly into the Write to File, I get a bunch of junk data. The two attached files show the data I get with -1 wired to the samples to read, and with 256 wired to the samples to read.

Message 7 of 12
Solution
Accepted by topic author adekruif

(You renamed those VIs after you saved the main VI, so they wouldn't load properly)

 

Now we're getting somewhere... it's a repeatable pattern, and it looks like every other half-second you are inserting zero data, not losing it.

 

You're probably getting all zeros from Channel 1 and writing them along with Channel 0 in the same line of data. I don't know how binary files work with 2D arrays, but try writing just a single channel to see if that fixes the issue.

Cheers




Message 8 of 12

You have two tasks, acquiring N channels of data at some rate, and saving data to disk.  Ideally, these would run simultaneously (or "in parallel"), but you have constrained them to run sequentially (or "in series").  This means that while one task (say, disk I/O) is running, the other task (A/D conversion) is "blocked".

 

What you need is a Producer/Consumer Design Pattern (open a blank VI, go to File, New ..., Create New from Template, Frameworks, Design Patterns, Producer/Consumer Design Pattern (Data) to see what this looks like).  In your Producer loop, set up your A/D not for "unlimited buffer sizes" (-1 samples), but some "reasonable" sample size (say, 10 samples).  Your Consumer loop does the Writes, without needing to reposition the file (since it is always at the end of the file following a write).  Note that your graphing should also be in the Consumer loop, as graphing can be slow (you might want to consider using a Chart and updating your points no faster than 20-50 points/second, which you can do by plotting every Nth point).
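[The two-loop structure described above can be sketched in Python, with threads and a queue standing in for the two LabVIEW while loops and the LabVIEW queue. `read_chunk` is again a hypothetical stand-in for DAQmx Read, not a real API.]

```python
import queue
import threading

# Producer/Consumer sketch: the producer acquires fixed-size chunks and hands
# them off through a queue; the consumer writes them out. Acquisition is never
# blocked waiting on disk I/O, which is the whole point of the pattern.

q = queue.Queue()
CHUNKS = 5

def read_chunk(n=10):
    """Placeholder for DAQmx Read returning a 'reasonable' chunk of samples."""
    return [0.0] * n

def producer():
    for _ in range(CHUNKS):
        q.put(read_chunk())   # acquire and enqueue immediately
    q.put(None)               # sentinel: acquisition finished

def consumer(out):
    while True:
        chunk = q.get()
        if chunk is None:
            break
        out.extend(chunk)     # stand-in for Write to Binary File

written = []
t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer, args=(written,))
t1.start(); t2.start()
t1.join(); t2.join()
print(len(written))  # 50 -- every sample produced was consumed
```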

 

I have an application where I acquire up to 24 channels at 1 kHz on a PXI platform, stream them (via TCP) to a PC in "chunks" of 50 samples, spool all of the data to disk, and plot every 50th point (which makes my 24-channel chart update at 20 points/second).  I have a "clock" channel as one of my data channels so that I can verify that I never miss a sample.  Needless to say, I use Producer/Consumer several times in this process ...

 

Bob Schor

Message 9 of 12

@James.M wrote:

(You renamed those VIs after you saved the main VI, so they wouldn't load properly)

 

Now we're getting somewhere... it's a repeatable pattern, and it looks like every other half-second you are inserting zero data, not losing it.

 

You're probably getting all zeros from Channel 1 and writing them along with Channel 0 in the same line of data. I don't know how binary files work with 2D arrays, but try writing just a single channel to see if that fixes the issue.


Aha! I was getting the data from Channel 1 in the data for Channel 0. When I transpose the array I get what I need with no weird data.
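[For future readers, a small NumPy sketch of why the transpose matters, using hypothetical 4-sample numbers rather than the original data: a 2D DAQmx read returns one row per channel, and writing a 2D array to a binary file flattens it row by row.]

```python
import numpy as np

# DAQmx-style 2D array: one ROW per channel (2 channels x 4 samples).
data = np.array([[1, 2, 3, 4],    # channel 0
                 [0, 0, 0, 0]])   # channel 1 (all zeros, as in this thread)

# Row-major flattening writes one channel's whole chunk, then the next:
print(data.flatten().tolist())    # [1, 2, 3, 4, 0, 0, 0, 0]
# -> in the file this reads as "data, then a block of zeros"

# Transposing first gives one SAMPLE per row, channels side by side:
print(data.T.flatten().tolist())  # [1, 0, 2, 0, 3, 0, 4, 0]
# -> each record now holds ch0 and ch1 for the same sample
```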

 


@Bob_Schor wrote:

You have two tasks, acquiring N channels of data at some rate, and saving data to disk.  Ideally, these would run simultaneously (or "in parallel"), but you have constrained them to run sequentially (or "in series").  This means that while one task (say, disk I/O) is running, the other task (A/D conversion) is "blocked".

 

What you need is a Producer/Consumer Design Pattern (open a blank VI, go to File, New ..., Create New from Template, Frameworks, Design Patterns, Producer/Consumer Design Pattern (Data) to see what this looks like).  In your Producer loop, set up your A/D not for "unlimited buffer sizes" (-1 samples), but some "reasonable" sample size (say, 10 samples).  Your Consumer loop does the Writes, without needing to reposition the file (since it is always at the end of the file following a write).  Note that your graphing should also be in the Consumer loop, as graphing can be slow (you might want to consider using a Chart and updating your points no faster than 20-50 points/second, which you can do by plotting every Nth point).

 

I have an application where I acquire up to 24 channels at 1 kHz on a PXI platform, stream them (via TCP) to a PC in "chunks" of 50 samples, spool all of the data to disk, and plot every 50th point (which makes my 24-channel chart update at 20 points/second).  I have a "clock" channel as one of my data channels so that I can verify that I never miss a sample.  Needless to say, I use Producer/Consumer several times in this process ...

 

Bob Schor


 

 

Thanks Bob. In my actual program I have several loops: a data acquisition loop, a logging loop, and a display loop. As far as samples to display on the front panel, I am decimating by a factor of four at a sample rate of 512 Hz. Based on your suggestion of 20-50 points/second displayed, I will decimate at a much higher rate.
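[The decimation arithmetic, sketched with numbers taken from this thread: 512 Hz acquisition, a display target inside the suggested 20-50 points/second range. The target rate here is an assumed example, not something the poster stated.]

```python
SAMPLE_RATE = 512          # Hz, acquisition rate from the post
TARGET_DISPLAY_RATE = 25   # points/second, assumed, within the 20-50 suggestion

# "Plot every Nth point": the decimation factor is rate / target,
# roughly 20 here instead of the original factor of 4.
decimation = SAMPLE_RATE // TARGET_DISPLAY_RATE   # -> 20

one_second = list(range(SAMPLE_RATE))   # dummy second of samples
displayed = one_second[::decimation]    # keep every 20th sample

print(decimation, len(displayed))  # 20 26
```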

Message 10 of 12