Given a circular-buffer acquisition technique, what is the best way to calculate a time for each sample? Using the NI-DAQ functions from C++:
...
SCAN_Start
...
DAQ_DB_HalfReady
...
DAQ_DB_Transfer
...
(dot-dot-dots being other code).
I can either take a time stamp just before SCAN_Start and then calculate the time of each sample from the sample rate and scan rate as each half-buffer comes in. The problems here are that I don't know how much skew there will be over the (up to) two days the acquisition runs for, nor how long the delay is between SCAN_Start and the first sample; or,
I can take a time stamp just before each DAQ_DB_HalfReady call and, when the half-buffer is ready, assume that this is the time of the last sample in the half-buffer, then calculate backwards the time of every other sample using the sample rate and scan rate. The problem here is the somewhat fiddly method of calculating the times 'backwards' for a buffer of 10 channels each with 5000 samples, and getting an accurate time for that last sample.
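For what it's worth, the backwards calculation in the second option needn't be that complicated. Below is a minimal sketch of what I have in mind, assuming the time stamp taken at DAQ_DB_HalfReady marks the completion of the last scan, that scans are spaced 1/scanRate apart, and that all 10 channels within one scan can share that scan's time (ignoring any inter-channel delay). The function name and parameters are mine, purely for illustration:

```cpp
#include <vector>
#include <cassert>
#include <cmath>

// Hypothetical helper: given tLast, the time stamp (in seconds) taken when
// the half-buffer completed, work backwards to a time for every scan in it.
// Scan i (0-based) is assumed to have occurred (scansPerHalf - 1 - i) scan
// periods before the last scan.
std::vector<double> backCalcScanTimes(double tLast,
                                      double scanRate,    // scans per second
                                      int    scansPerHalf) // e.g. 5000
{
    std::vector<double> t(scansPerHalf);
    const double dt = 1.0 / scanRate;
    for (int i = 0; i < scansPerHalf; ++i)
        t[i] = tLast - dt * (scansPerHalf - 1 - i);
    return t;
}
```

Each of the 10 channel readings within scan i would then be stamped with t[i], so only one array of 5000 times is needed per half-buffer rather than 50000.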
I realise Windows 2000 isn't a real-time OS, but to within the precision Windows can manage (about 10 ms) I'd like to get a time for each sample. Is there a better method?
Thanks,