I have a loop with data acquisition and I would like to write this data to a file.
What is the best option: open the file once at the beginning of the acquisition and close it when it is done, open and close it every time I write a line to the file, or something else?
If there is no best option, I would like to understand the good and bad points of each option.
I would choose the first option: open before DAQ loop, write in loop, close after loop.
Depending on the data size of each write access, I would also collect data into bigger write blocks (instead of writing 20 bytes 100 times a second, it is usually better to write 2000 bytes once a second...).
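In text form the batching idea looks roughly like this (a Python sketch standing in for the LabVIEW diagram; the file name, block size, and sample values are made-up examples):

```python
# Sketch of batched writes: collect small samples in memory and write
# them to disk in larger blocks. All names/sizes here are arbitrary.
SAMPLES_PER_BLOCK = 100          # flush to disk once per 100 samples

buffer = []
with open("acquisition.log", "w") as f:   # open ONCE, before the loop
    for i in range(250):                  # stand-in for the DAQ loop
        sample = i * 0.01                 # stand-in for one acquired value
        buffer.append(f"{sample:.4f}\n")
        if len(buffer) >= SAMPLES_PER_BLOCK:
            f.writelines(buffer)          # one big write instead of 100 small ones
            buffer.clear()
    if buffer:                            # write whatever is left over
        f.writelines(buffer)
# the file is closed once, after the loop
```

The trade-off is that up to one block of samples sits only in memory, so a crash can lose at most that much data.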
The best approach is always to open the file before the acquisition, write the acquired data, and then close it when the acquisition is done:
1) Usually the time taken to open and close a file is much greater than the time taken to write the data.
2) There is also no point in opening and closing the same file in each iteration of the loop just to write the new data.
If the data you get each iteration is of no vital importance, then opening once per run of the loop, or even of the whole program, is enough.
And if each iteration's data must be protected, I would certainly try to find some system time to open and close the file every single iteration - why not?
One more thing,
I am running a master-slave structure.
The file is already open and I have its reference at the slave and at the master.
The slave is writing data at the end of the file every 500ms.
The master has its event structure and, once in a while, it is going to read the entire file.
Will this work fine, or will I have problems reading and writing at the same time?
Cheers from Brazil,
It depends on the file functions you're using. It should technically be possible, but some problems might arise.
For instance, if you read the file in chunks and at some point write to it you might write in the middle of the file (where the file cursor is currently located). Of course you can set the file position before writing, but this might lead to race conditions.
If possible I would limit the file access to one place or use a semaphore to make sure only one process is accessing the file at any time.
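In a text language the "only one process at a time" idea would be a mutex around every file access. Here is a Python sketch (the `threading.Lock` plays the role of a LabVIEW semaphore; file name and record contents are made up):

```python
import os
import threading

file_lock = threading.Lock()       # plays the role of the semaphore

def append_line(f, text):
    with file_lock:                # acquire before touching the file...
        f.seek(0, os.SEEK_END)     # make sure we are at the end of the file
        f.write(text + "\n")
        f.flush()                  # like "Flush File": push the buffer to the OS
                                   # ...lock is released automatically here

def read_all(f):
    with file_lock:                # the reader takes the same semaphore
        f.seek(0)                  # rewind to the beginning
        return f.read()            # cursor is left at the end afterwards

with open("shared.log", "w+") as f:
    append_line(f, "first")
    append_line(f, "second")
    contents = read_all(f)         # sees both complete records
    append_line(f, "third")        # still appends at the end, not mid-file
```

Because the writer explicitly seeks to the end while holding the lock, a read that moved the cursor can never cause a write in the middle of the file.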
By the way, to prevent data loss you can use the "Flush File" function to force the OS to write the buffer to the file.
I am currently using a LabVIEW datalog file.
The writing occurs at the current cursor position (it should be at the end of the file) and the reading sets the cursor to the beginning of the file and then it reads it all (leaving the cursor at the end of the file).
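The failure mode to worry about here is the shared cursor. If the master's read ever stops part-way through the file, the slave's next write lands wherever the cursor happens to be. A plain-text analogy in Python (using `io.StringIO` as a stand-in for the datalog file; the record text is made up) shows the corruption:

```python
import io

f = io.StringIO()          # stand-in for the shared datalog file
f.write("record 1\n")      # slave appends a record (cursor now at the end)
f.seek(0)                  # master rewinds to read the whole file...
f.read(4)                  # ...but only gets part-way through it
f.write("record 2\n")      # slave writes at the CURRENT cursor: mid-file!
corrupted = f.getvalue()   # both records are now mangled together
```

Here `corrupted` ends up as `"recorecord 2\n"`: the second record overwrote the middle of the first. That is why the write must either happen under a semaphore or explicitly seek to the end first.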
It appears to me that semaphores would be a good way of preventing errors, but wouldn't it be easier to simply open and close the file every time I need to read or write (or would it make no difference)?
Well, Gabriel, you certainly will have the problem you're talking about.
Just avoid this simultaneous read-write situation.
I would insist upon determinism as far as file read-write is concerned.
You should make your master and your slave "communicate" with each other and synchronise their read and write activities.
There are, by the way, many ways to get there in LV.
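One common pattern for that synchronisation is to give the file a single owner and have everyone else send it messages. In LabVIEW a queue between the loops plays this role; a Python sketch of the same idea (all names here are illustrative, not from the original post):

```python
import queue
import threading

commands = queue.Queue()   # master and slave both talk to this queue
results = queue.Queue()    # the owner sends read results back here

def file_owner(path):
    """The only piece of code that ever touches the file."""
    with open(path, "w+") as f:
        while True:
            cmd, payload = commands.get()
            if cmd == "write":          # slave: append one record
                f.seek(0, 2)            # always write at the end
                f.write(payload + "\n")
                f.flush()
            elif cmd == "read":         # master: read the whole file
                f.seek(0)
                results.put(f.read())
            elif cmd == "stop":
                break

owner = threading.Thread(target=file_owner, args=("owned.log",))
owner.start()
commands.put(("write", "sample 1"))
commands.put(("write", "sample 2"))
commands.put(("read", None))            # a read can never interrupt a write
commands.put(("stop", None))
owner.join()
snapshot = results.get()
```

Because the queue serialises every request, reads always see complete records and no semaphore around the file itself is needed.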