Measurement Studio for .NET Languages


Writing to file continuously?

Hi all,
 
Does anyone know how to log acquired data to a file continuously, using C#?  The example, ContAcqVoltageSamples_IntClk_ToFile, seems to write the file only after all the data has been acquired (when CloseFile() is called).  I'm collecting a lot of data for a long time, so I'd like to flush the buffer to file as often as possible...
 
Thanks!
John
Message 1 of 9
Hi John,

This example writes the header strings, the total number of data rows, and finally the actual data when CloseFile() is called. It does this at the end so that it knows how many data rows there are (if you are streaming, there could be any number of rows). If you change it to write on the fly, the read code can no longer "know" how many data rows are stored in the file. Instead of allocating a fixed-size array on read, you will need to use something like ArrayList and loop until you get an EndOfStreamException from ReadDouble().
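The read-back loop described above can be sketched as follows; a generic List&lt;double&gt; plays the role of the ArrayList here, and the file name is hypothetical:

```csharp
using System;
using System.Collections.Generic;
using System.IO;

class ReadBackSketch
{
    // Reads an unknown number of doubles from a binary log file.
    // No row count is stored, so we read until the stream runs out.
    public static List<double> ReadAllDoubles(string path)
    {
        var values = new List<double>();
        using (var reader = new BinaryReader(File.OpenRead(path)))
        {
            try
            {
                while (true)
                    values.Add(reader.ReadDouble()); // throws at end of stream
            }
            catch (EndOfStreamException)
            {
                // Expected: this is how we detect the end of the data.
            }
        }
        return values;
    }

    public static void Main()
    {
        // Write a small file, then read it back without knowing its length.
        using (var writer = new BinaryWriter(File.Create("log.bin")))
            foreach (double d in new[] { 1.5, 2.5, 3.5 })
                writer.Write(d);

        List<double> data = ReadAllDoubles("log.bin");
        Console.WriteLine(data.Count); // 3
    }
}
```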

As far as modifying the code to write continuously, all you really need to do is move the code from CloseFile() into LogData(). In LogData(), instead of saving to the savedData array, just write to the file:

if (useTextFileWrite)
{
    for (int j = 0; j < channelCount; j++)
    {
        // Writes data to file
        ArrayList l = savedData[j] as ArrayList;
        double dataValue = (double)l[i];
        fileStreamWriter.Write(dataValue.ToString("e6"));
        fileStreamWriter.Write("\t"); // separate the data for each channel
    }
    fileStreamWriter.WriteLine(); // end the row
}
else
{
    for (int j = 0; j < channelCount; j++)
    {
        // Writes data to file
        ArrayList l = savedData[j] as ArrayList;
        double dataValue = (double)l[i];
        fileBinaryWriter.Write(dataValue);
    }
}


Some of the variables will need to be changed, but overall it should look similar. Also notice that the outer for-loops are no longer needed (remember, we don't know the number of data rows), and fileStreamWriter.Close() stays in CloseFile(), since we need to leave the file open while we stream.

Regards,
Message 2 of 9
Hi James,
 
I started writing my data to file during each callback, but now my UI crashes (due to buffer overruns).  I'm acquiring 64 channels of data at ~20 kHz/channel.  Is there a way to do this efficiently?
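For scale, here is a back-of-the-envelope calculation (not from the original post) of the aggregate data rate that 64 channels at 20 kS/s per channel implies when samples are stored as 8-byte doubles:

```csharp
using System;

class DataRate
{
    public static void Main()
    {
        const int channels = 64;
        const int ratePerChannel = 20_000;         // samples per second per channel
        const int bytesPerSample = sizeof(double); // 8 bytes

        long samplesPerSecond = (long)channels * ratePerChannel; // 1,280,000 S/s
        long bytesPerSecond = samplesPerSecond * bytesPerSample; // 10,240,000 B/s

        Console.WriteLine($"{samplesPerSecond} S/s aggregate");
        Console.WriteLine($"{bytesPerSecond / 1_000_000.0:F2} MB/s to disk"); // 10.24 MB/s
    }
}
```

At roughly 10 MB/s sustained, any per-sample overhead in the write path adds up quickly, which is why the disk-writing strategy matters here.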
Message 3 of 9
Hi John,

What do you mean by "buffer overruns"? Does DAQmx throw a buffer overflow error because you aren't reading from the card fast enough? Could you post a screenshot?

Regards,
Message 4 of 9
Here's a screen-shot (in PNG format) of the error message.
Message 5 of 9

Hi John,

What DAQ device do you have?  If you are sampling each channel at approximately 20 kS/s, the rule of thumb is to read from the buffer 10 times per second.  This means that your samples to read per channel (your buffer read size) should be 2,000 samples, i.e. one tenth of a second of data.

There are several reasons you could be getting this error, but the most common is not reading the data off the buffer fast enough.  The two typical causes of that are not setting up your buffer correctly, or taking too long doing other processing between reads.  Even if you set your buffer up as explained above, waiting two seconds between calls to DAQmx Read will still overrun your buffer.
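The rule of thumb above amounts to a one-line calculation; a minimal sketch (helper name is mine, not from the NI examples):

```csharp
using System;

class BufferSizing
{
    // Rule of thumb from the post: read from the buffer about 10 times per
    // second, so samples-to-read per channel is one tenth of the per-channel
    // sample rate. In the NI-DAQmx .NET examples this is the value passed as
    // samplesPerChannel when configuring the sample clock.
    public static int SamplesToRead(int ratePerChannel, int readsPerSecond = 10)
        => ratePerChannel / readsPerSecond;

    public static void Main()
    {
        Console.WriteLine(BufferSizing.SamplesToRead(20_000)); // 2000
    }
}
```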

Respectfully,

Rob F
Test Engineer
Condition Measurements
National Instruments
Message 6 of 9
Hi, and thanks for the response.  The problem isn't my buffer-reading frequency; it's purely the disk writing.  I can substitute junk code for the file writing (for example, performing long analyses where I'd normally write a value to disk) and the program won't crash.  It's only when I use the hard disk that things go bad.

As a temporary solution, I've increased the buffer size of the FileStream from the default (4kB) to 128 kB.  That seems to have solved the problem, for now, though I still feel like there should be a better solution out there (for instance, this only seems to work with BinaryWriter; trying to write bytes directly to the stream continues to crash).  Another thing that worked was serializing the collected data objects and writing the serialized objects to disk.  That worked without changing any buffer sizes.
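The larger-buffer workaround looks something like the following sketch (path and data are illustrative); the FileStream constructor overload with a bufferSize argument replaces the 4 kB default:

```csharp
using System;
using System.IO;

class BufferedLogSketch
{
    public static void Main()
    {
        const int bufferSize = 128 * 1024; // 128 kB instead of the 4 kB default

        var stream = new FileStream("data.bin", FileMode.Create,
                                    FileAccess.Write, FileShare.None, bufferSize);
        using (var writer = new BinaryWriter(stream))
        {
            // Each DAQ callback would append its samples here; the FileStream
            // only touches the disk when its 128 kB buffer fills (or on
            // Flush/Close), so small frequent writes stay in memory.
            for (int i = 0; i < 1000; i++)
                writer.Write((double)i);
        }

        Console.WriteLine(new FileInfo("data.bin").Length); // 8000 bytes
    }
}
```

Fewer, larger physical writes is exactly what the bigger buffer buys you here.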

If anyone has any better ideas for writing to disk efficiently, I'd love to hear them.

Thanks,
John
Message 7 of 9
Hi
 
I ran into the same problem in LabVIEW before, even without any file-writing.  Could you please paste the related code here?  Many things can cause this problem.  For continuous acquisition, you need to put the acquire method inside a loop, set the sample rate before the loop, and read out all the data (or write it to a file) inside the loop; DAQ automatically manages the buffer for continuous acquisition based on your sample rate.
 
If you do a finite, timed acquisition of, for example, 10 s or 100 s, you can use the sample rate and samples-to-read to control the total time: if you set the sample rate to 10 and samples-to-read to 100, DAQ will finish the task at the tenth second, which means you can write 100 samples at once.
 
 
********************************
*The best Chinese farmer*
********************************
Message 8 of 9
I am writing 64 channels at 20 kHz sampling.  I gave up on writing a CSV file and had to write to a binary file in the callback loop instead: writing text was too slow, impossible at this data rate.
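The size difference alone explains much of this: a double is 8 bytes in binary, but 13 characters (plus a separator) in the example's "e6" text format, before counting the cost of formatting each value. A rough sketch comparing the two (file names and sample count illustrative):

```csharp
using System;
using System.IO;

class BinaryVsText
{
    public static void Main()
    {
        const int samples = 100_000;

        // Raw binary: 8 bytes per double.
        using (var bin = new BinaryWriter(File.Create("samples.bin")))
            for (int i = 0; i < samples; i++)
                bin.Write((double)i);

        // Text in the example's "e6" format, tab-separated.
        using (var txt = new StreamWriter("samples.txt"))
            for (int i = 0; i < samples; i++)
                txt.Write(((double)i).ToString("e6") + "\t");

        long binBytes = new FileInfo("samples.bin").Length; // 800,000 bytes
        long txtBytes = new FileInfo("samples.txt").Length;
        Console.WriteLine($"binary: {binBytes} B, text: {txtBytes} B");
    }
}
```

On top of the smaller files, binary writes skip the per-sample number-to-string conversion entirely, which matters at 1.28 MS/s aggregate.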


Philip Newman
General Dynamics
Electric Boat
Message 9 of 9