I'm writing a data logging program that may be used for long periods of time. A typical test could run anywhere from days to months at a sample rate in the kHz range. So, a lot of data will be collected, but I don't necessarily care about all of it. If everything is running as expected for several days, then I don't need to keep the data from those days.
So my question is: what would be the best way to deal with this problem? All of the data will need to be looked over by a human at some point, so how hard would it be to start writing to a different file every day? Is there a way to do that without losing any data while one file is being closed and the next is being opened? Is there, perhaps, a way to edit a file while it is still being written to?
I really don't know how to move forward, so any ideas or suggestions are welcome.
I have based my program on the attached, standard example VI, and I have included a picture of the logging portion of my code.
I would try to implement a simple producer/consumer architecture. This will allow you to decouple data acquisition from storage.
For saving, you could stream the data to a TDMS file. Since storage is decoupled from acquisition, the consumer loop is free to close the current file and create a new one every so often (e.g. once per day) without stalling the acquisition loop.
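The producer/consumer pattern above can be sketched in text form. Since the original is a LabVIEW VI, here is a minimal Python analogue (a sketch, not the poster's actual code): a producer thread stands in for the DAQ read loop and enqueues simulated data, while a consumer thread drains the queue and rotates to a new file after a fixed number of chunks. The chunk sizes, file names, and rotation cutoff are all made-up values for illustration; a real program would write TDMS rather than raw binary.

```python
import queue
import struct
import threading

SAMPLES_PER_CHUNK = 100   # samples per DAQ read (assumed value)
CHUNKS_PER_FILE = 5       # rotate files after this many chunks (assumed cutoff)
NUM_CHUNKS = 12           # total chunks for this demo run

def producer(q):
    """Acquisition loop: a real program would read from the DAQ here.
    We enqueue simulated samples so the sketch is self-contained."""
    for i in range(NUM_CHUNKS):
        chunk = [float(i)] * SAMPLES_PER_CHUNK  # placeholder for DAQ data
        q.put(chunk)
    q.put(None)  # sentinel tells the consumer to shut down

def consumer(q, files_written):
    """Storage loop: drains the queue and rotates to a new file every
    CHUNKS_PER_FILE chunks. Rotation happens between queued chunks, so
    no samples are lost at rollover -- they just wait in the queue."""
    file_index = 0
    chunks_in_file = 0
    out = open(f"log_{file_index:04d}.bin", "wb")
    files_written.append(out.name)
    while True:
        chunk = q.get()
        if chunk is None:
            break
        if chunks_in_file >= CHUNKS_PER_FILE:
            out.close()
            file_index += 1
            chunks_in_file = 0
            out = open(f"log_{file_index:04d}.bin", "wb")
            files_written.append(out.name)
        out.write(struct.pack(f"{len(chunk)}d", *chunk))  # doubles, 8 bytes each
        chunks_in_file += 1
    out.close()

q = queue.Queue()
files_written = []
t_prod = threading.Thread(target=producer, args=(q,))
t_cons = threading.Thread(target=consumer, args=(q, files_written))
t_prod.start(); t_cons.start()
t_prod.join(); t_cons.join()
print(files_written)  # 12 chunks at 5 per file -> 3 files
```

The key design point is the same as in the LabVIEW version: the queue buffers samples while the consumer is busy closing one file and opening the next, which is exactly what prevents data loss at the file boundary.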
The VI you posted does not match your picture at all. But based on your other thread, I see you are using DAQmx Configure Logging. That makes things REALLY simple for you: use a DAQmx Read Property Node and set Logging->Samples Per File. With that, DAQmx will automagically create a new file every X samples that are logged.
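For readers who prefer text code, roughly the same configuration can be expressed with NI's nidaqmx Python package, where the LabVIEW property node Logging->Samples Per File maps to `in_stream.logging_samps_per_file`. This is a hardware-dependent configuration sketch, not runnable standalone: the device name `Dev1/ai0` and the file path are assumptions.

```python
# Configuration sketch (assumes NI hardware and the nidaqmx package).
import nidaqmx
from nidaqmx.constants import AcquisitionType, LoggingMode, LoggingOperation

RATE_HZ = 1000.0
SAMPLES_PER_DAY = int(RATE_HZ) * 60 * 60 * 24  # one file per day at 1 kHz

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")  # device name is an assumption
    task.timing.cfg_samp_clk_timing(RATE_HZ, sample_mode=AcquisitionType.CONTINUOUS)
    # Stream straight to TDMS; equivalent of DAQmx Configure Logging in LabVIEW.
    task.in_stream.configure_logging(
        "C:\\data\\test.tdms",  # assumed path
        LoggingMode.LOG_AND_READ,
        operation=LoggingOperation.CREATE,
    )
    # DAQmx starts a new TDMS file automatically every SAMPLES_PER_DAY samples.
    task.in_stream.logging_samps_per_file = SAMPLES_PER_DAY
    task.start()
    # ... read/monitor in a loop; DAQmx handles file rotation in the background.
```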
FYI, I'm not sure you really want to be storing all of that data. Storing a double value (8 bytes) 1000 times a second generates 8 × 1000 × 60 × 60 × 24 = 691,200,000 bytes ≈ 691.2 MB of data *per day*. That's a LOT of data. If you need it, you need it, but it's at least worth pointing out.
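That back-of-envelope estimate generalizes to other sample rates and test lengths; a small helper makes it easy to check before committing to a months-long run (the 10 kHz and 30-day figures below are just illustrative cases, not from the original post):

```python
# Storage estimate for logging one channel of 8-byte doubles.
BYTES_PER_SAMPLE = 8
SECONDS_PER_DAY = 60 * 60 * 24

def bytes_per_day(rate_hz):
    """Bytes generated per day by one double-precision channel at rate_hz."""
    return BYTES_PER_SAMPLE * rate_hz * SECONDS_PER_DAY

print(bytes_per_day(1000) / 1e6)       # 691.2  -> MB/day at 1 kHz
print(bytes_per_day(10000) / 1e9)      # 6.912  -> GB/day at 10 kHz
print(bytes_per_day(1000) * 30 / 1e9)  # 20.736 -> GB for a 30-day test at 1 kHz
```

Multiply by the channel count for multi-channel tasks; at the upper end of "kHz range" over months, this lands firmly in terabyte territory.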