Multifunction DAQ


High time-specificity on DAQ without bogging down RAM

Currently, we're using a multifunction DAQ system (or hoping to, soon) to select a series of specific voltage input patterns.  The time spacing of these events is very specific, but the overall information content of the events is minimal.  This is a problem because the system ends up creating a huge (albeit sparse) matrix that clogs up the RAM.  For example, let's say we wanted a voltage signal to sit at 0 V until t=0.934934624 seconds, then jump to 5 V, then to 10 V at t=3.334625346, and then to 15 V at t=5.234235243, and so forth...  It's quite sub-optimal to feed the system a matrix with time steps of 10^-9 s (the level of precision given in the numbers above--don't worry, they're completely arbitrary), so I'm wondering whether there's some driver or other method we could use to run only the few relevant (time, voltage) pairs through the computer's RAM.
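
To illustrate, here's roughly the compact representation I'm hoping for, versus what a dense matrix would cost (plain Python, arbitrary numbers again):

```python
# Compact event list: one (time_s, volts) pair per step.
events = [
    (0.0,         0.0),   # start at 0 V
    (0.934934624, 5.0),   # jump to 5 V
    (3.334625346, 10.0),  # jump to 10 V
    (5.234235243, 15.0),  # jump to 15 V
]

# Dense alternative: one value every 1 ns over ~5.23 s.
n_samples = int(5.234235243 / 1e-9)                   # ~5.2e9 samples
print(f"dense:  ~{n_samples * 8 / 1e9:.0f} GB at 8 bytes/sample")
print(f"sparse: {len(events)} pairs")
```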


I apologize in advance if this question seems a bit novice... I'm making an early foray into NI, so a lot of this is preliminary evaluation.


Thanks!


Sam

Message 1 of 2

Hi Sam,


Are you using the DAQ device to output the voltages or to read them in?  I can't be sure, but it sounds like you're looking to output the voltages from the DAQ device.


If you need to generate the voltages, there is actually a clever way to do so on X Series DAQ devices (or 2nd-generation CompactDAQ).  You can write a buffered array of pulse high/low durations to a counter output task, and use the output of the counter to clock an analog output task.  The analog output only needs one sample per desired voltage level (within the ±10 V maximum range), and it updates to the next sample on each edge of the counter output.  X Series devices can use a 100 MHz timebase for the counters, so this gives you 10 ns resolution (not 1 ns like you mentioned in your post).
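
Here's a rough sketch of that configuration with the nidaqmx Python API, assuming an X Series device named Dev1 (the channel names, rate estimate, and step times are placeholders; I reused the arbitrary numbers from your post and dropped the 15 V step, since it's out of range):

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType
from nidaqmx.types import CtrTime

# One AO sample per voltage level (15 V would exceed the +/-10 V range).
levels = [0.0, 5.0, 10.0]

# Times at which each level should be latched; the first edge fires
# right away, and each pulse period sets how long a level is held.
step_times = [0.0, 0.934934624, 3.334625346]
holds = [b - a for a, b in zip(step_times, step_times[1:])] + [1.0]

with nidaqmx.Task() as ao, nidaqmx.Task() as co:
    # Counter output: one pulse per level, period = hold time.
    co.co_channels.add_co_pulse_chan_time("Dev1/ctr0")
    co.timing.cfg_implicit_timing(
        sample_mode=AcquisitionType.FINITE, samps_per_chan=len(levels))
    co.write([CtrTime(high_time=h / 2, low_time=h / 2) for h in holds],
             auto_start=False)

    # Analog output clocked by the counter's internal output terminal.
    ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")
    ao.timing.cfg_samp_clk_timing(
        rate=1000.0,                      # upper-bound estimate only
        source="/Dev1/Ctr0InternalOutput",
        sample_mode=AcquisitionType.FINITE,
        samps_per_chan=len(levels))
    ao.write(levels, auto_start=False)

    ao.start()    # arm the AO first so it waits on counter edges
    co.start()    # counter pulses now clock out each level
    co.wait_until_done(timeout=10.0)
```

Only the handful of levels and pulse durations ever sit in memory, so RAM usage stays tiny no matter how fine the timing resolution is.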


If you are acquiring the voltages, you essentially have two options (option 1 is far better than option 2, but it requires some external components):


1.  Make the comparison in hardware using a separate external comparator for each level you want to detect.  You can feed the result of each comparator to the digital input lines of an X Series device (TTL).  The X Series can be configured for "Change Detection", which generates a pulse in hardware whenever one of the chosen digital input lines changes state.  You can use this pulse to sample the internal 100 MHz timebase with one of the counters (still 10 ns resolution); there's a rough sketch of this after the list.


2.  Make the comparison in software.  You would have to actually sample the input at the required rate, which immediately costs you about two orders of magnitude of resolution if you want to use a multifunction DAQ device.  A higher-rate oscilloscope could be used instead, but that defeats the whole point of what you're trying to do.  You could read back the relatively large arrays but store only the actual transitions (see the second sketch below), so memory usage would remain fairly constant while the application is running.
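
Here's a minimal sketch of option 1 using the nidaqmx Python API.  The device, counter, and line names (Dev1, ctr0, port0/line0) are placeholders, and the rate argument is only an upper-bound estimate:

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as di, nidaqmx.Task() as ci:
    # Digital input with change detection on the comparator output line.
    di.di_channels.add_di_chan("Dev1/port0/line0")
    di.timing.cfg_change_detection_timing(
        rising_edge_chan="Dev1/port0/line0",
        falling_edge_chan="Dev1/port0/line0",
        sample_mode=AcquisitionType.CONTINUOUS)

    # Counter counts the internal 100 MHz timebase; each change-detection
    # event latches the running count, i.e. a 10 ns resolution timestamp.
    ci_chan = ci.ci_channels.add_ci_count_edges_chan("Dev1/ctr0")
    ci_chan.ci_count_edges_term = "/Dev1/100MHzTimebase"
    ci.timing.cfg_samp_clk_timing(
        rate=1e6,                          # upper-bound estimate only
        source="/Dev1/ChangeDetectionEvent",
        sample_mode=AcquisitionType.CONTINUOUS)

    ci.start()
    di.start()
    ticks = ci.read(number_of_samples_per_channel=10, timeout=60.0)
    times_s = [t / 100e6 for t in ticks]   # ticks since start -> seconds
```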
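
And a rough sketch of the transition-only bookkeeping for option 2, in plain Python/NumPy (in practice each block would come from a DAQmx read, and the tolerance value is arbitrary):

```python
import numpy as np

def keep_transitions(samples, t0, dt, tol=0.5):
    """Collapse a raw sample block to (time_s, volts) pairs at transitions.

    samples: one block from the DAQ read; t0: time of samples[0];
    dt: sample period; tol: change (in volts) that counts as a transition.
    """
    samples = np.asarray(samples, dtype=float)
    jumps = np.flatnonzero(np.abs(np.diff(samples)) > tol) + 1
    idx = np.concatenate(([0], jumps))     # keep the starting level too
    return [(t0 + i * dt, float(samples[i])) for i in idx]

# An 8-sample block collapses to 3 pairs; only those are kept in RAM.
block = [0.0, 0.0, 0.0, 5.0, 5.0, 10.0, 10.0, 10.0]
print(keep_transitions(block, t0=0.0, dt=1e-6))
# prints the three surviving (time, volts) pairs
```

Across blocks you'd also compare each block's first sample against the last stored level, so repeated levels never accumulate.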


Best Regards,

John Passiak
Message 2 of 2