I would like to timestamp changes on my digital inputs using change detection and store them to a file. I know I could just sample the inputs, but my 6281 card's analog input tasks are already running at a lower rate than the timestamp resolution I need for the digital inputs.
My current thought is to create a timing source tied to the port's change detection, then use that to drive a Timed Loop (LabVIEW RT) which would timestamp the changes and store them to a structure via an RT FIFO. Does that sound like a reasonable thing to do?
On a related note, is there a common structure used for storing digital data? I know and have used the digital waveform structure, but that isn't compressed (it's dt-based, not discrete-timestamp-based).
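For reference, a discrete-timestamp representation (as opposed to the dt-based digital waveform) can be as simple as a list of change records. A Python sketch, with hypothetical field names (LabVIEW has no direct text-code equivalent, so this just illustrates the idea):

```python
from dataclasses import dataclass

@dataclass
class DigitalChange:
    timestamp: float  # seconds since task start (or an absolute timestamp)
    line: int         # which digital line changed
    state: bool       # new level after the change

# Only transitions are stored, so a line that is idle for minutes costs
# nothing, unlike a dt-based waveform that stores every sample interval.
changes = [
    DigitalChange(0.000123, line=0, state=True),
    DigitalChange(0.004987, line=0, state=False),
]
```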
I have a routine that records from multiple 16-bit Analog Channels at 1 kHz, reads (digital) Values from a 16-bit Encoder at 1 kHz, and also handles "occasional" Events, such as the time the Subject pushes a Push Button, the time I turn on an LED, or when I play a Stimulus (all Boolean "digital" Events).
For the 16-bit digital Encoder being sampled at the same rate as my Analog signal, I just treat it as another Analog signal, but process it differently (i.e. I may have 12 channels of "voltages", where 32767 represents +10 V and -32767 represents -10 V, while the Encoder channels get interpreted as Gray code (and converted into Binary)).
But what about the "occasional" Event Data, which comes in "whenever it comes in", but at a much lower rate than 1 kHz? I save this to a separate "Events" file, with each Event being saved as a Cluster: some of its elements identify what the Event was (Button Press, LED On/Off, etc.), and another Element is the count of a 32-bit 1 kHz "clock" that gets updated at the same rate as the A/D channels. Note that this is definitely a "roll-your-own File I/O system", and requires you to develop analysis routines that know and understand such a proprietary format.
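In text terms, such a roll-your-own Event record might look like the following Python sketch. The field names, event codes, and byte layout are all hypothetical, not the actual format described above; the point is just the pairing of an event identifier with a count of the shared 1 kHz clock:

```python
import struct
from dataclasses import dataclass
from enum import IntEnum

class EventType(IntEnum):
    BUTTON_PRESS = 0
    LED_ON = 1
    LED_OFF = 2
    STIMULUS = 3

@dataclass
class EventRecord:
    event_type: EventType  # which Event occurred
    clock_count: int       # 32-bit count of the 1 kHz clock shared with the A/D channels

    _FMT = "<HI"  # uint16 event code + uint32 clock count, little-endian

    def pack(self) -> bytes:
        """Serialize one Event for appending to the "Events" file."""
        return struct.pack(self._FMT, self.event_type, self.clock_count)

    @classmethod
    def unpack(cls, data: bytes) -> "EventRecord":
        """Read one Event back; the analysis routine must know this layout."""
        code, count = struct.unpack(cls._FMT, data)
        return cls(EventType(code), count)
```

Because the clock count ticks at the same 1 kHz rate as the A/D samples, an Event can be aligned with the analog record simply by indexing sample `clock_count` in the analog file.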
There are other data formats, however, that can probably also be adapted for handling such a "mixed" data stream, including TDMS and others. As I don't have much experience with these, I'll welcome comments from other Forum users ...
My digital inputs are all of the occasional variety: generally, TTL outputs from multiple systems which need to be measured for their time differences and correlated to multiple analog inputs. Hence the need for higher-resolution discrete timestamping.
Is timestamping these via an RT loop triggered by the change detection event reasonable for near 10 us resolution or better? I haven't tried it yet. It does sound like I will need to roll my own structure, though.
A few questions:
- What computer is the PCI card in?
- What OS is it running?
- What hardware is running the real-time system?
Timed loops are generally only reliable at 1 kHz or slower. Anything faster would need an FPGA (which runs at 40 MHz).
It’s a PXI card with an 8135 RT (Phar Lap) controller in an 8-slot PXIe chassis. I would hope timed loops on this system can perform better than 1 kHz.
That Controller should certainly be adequate. I presume you have a Timer card in Slot 2 -- does it have a MHz clock? [Clearly if the Master Clock everything is using is 1 kHz, you will have trouble running faster than that ...].
I presume you know that you (ideally) want the code inside your Timed Loop to execute much faster than (or at least within) the loop period, i.e. if the Timed Loop is running at 10 kHz, all the code inside the loop must finish in 100 microseconds. This is generally accomplished by having only the DAQ read take place inside the loop, then "exporting" the data out of the loop using a Queue, RT FIFO, Channel Wire, or similar means, and sending it to a Processing Loop for any further processing.
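The acquisition/processing split described above is the classic Producer/Consumer pattern. A minimal text-language stand-in (Python threads and a bounded queue in place of the Timed Loop and RT FIFO; the rates, sizes, and placeholder "DAQ read" are illustrative only):

```python
import queue
import threading

data_fifo = queue.Queue(maxsize=100)  # stands in for the RT FIFO / Queue

def acquisition_loop(n_iterations: int):
    # Stands in for the Timed Loop: only fast work (read + enqueue) happens here.
    for i in range(n_iterations):
        sample = i  # placeholder for a DAQ read
        data_fifo.put(sample)  # export data; no processing in this loop
    data_fifo.put(None)  # sentinel: acquisition finished

def processing_loop(results: list):
    # Slower analysis/logging runs here, so it cannot stall the acquisition loop.
    while True:
        item = data_fifo.get()
        if item is None:
            break
        results.append(item * 2)  # placeholder for real processing

results = []
producer = threading.Thread(target=acquisition_loop, args=(5,))
consumer = threading.Thread(target=processing_loop, args=(results,))
producer.start(); consumer.start()
producer.join(); consumer.join()
```

The bounded queue gives the same back-pressure behavior as a fixed-size RT FIFO: if the consumer falls far enough behind, the producer blocks (or, in an RT FIFO configured to overwrite, data is lost) rather than memory growing without limit.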
If you are doing mostly A/D, you might not need a Timed Loop at all: you could use the A/D sample clock, set the number of samples to read to a reasonable number (e.g. 1000), set the Sampling Frequency (e.g. 10 kHz), and use Continuous Sampling (which will run the While loop at 10 Hz: 10,000 samples/sec / 1000 samples per read = 10 Hz).
Actually, I wanted to run the timed loop using a change detection timing source and not based on the sample clock.