07-03-2012 08:16 AM
I'm fairly new to LabView. I have an NI-myDAQ, and I'm trying to accomplish the following:
Output 10kHz square wave, 50% duty cycle.
Input at a 200kHz sample rate, synced with the output such that I receive 20 analog input samples per square-wave period, and I know which samples line up with the high and low halves of my square wave.
So far, I have used a counter to create the 10kHz square wave, outputting on a digital output line. I have tried to learn from the following document (http://www.ni.com/white-paper/4322/en), but I'm unsure how to sample at a different rate than my clock pulse; that example seems intended to sample a single analog input per clock pulse. Perhaps there is some way to create a faster clock (200kHz) in software, and use it to synchronize the analog input acquisition as well as the slower 10kHz square wave output generation?
I will eventually need to use both analog input channels for getting data, and one analog output channel for writing out data, so I need the square wave pulse to output on a digital pin.
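For concreteness, here is the timing relationship I'm after, as a quick sanity-check sketch (plain Python arithmetic, assuming ideal clocks; the variable names are just illustrative):

```python
# Sanity check of the desired timing relationship (ideal clocks assumed).
ai_rate = 200_000      # analog input sample rate, Hz
pulse_freq = 10_000    # counter square-wave frequency, Hz
duty_cycle = 0.5       # 50% duty cycle

samples_per_period = ai_rate // pulse_freq               # samples per square-wave period
high_samples = int(samples_per_period * duty_cycle)      # samples during the high half
low_samples = samples_per_period - high_samples          # samples during the low half

print(samples_per_period, high_samples, low_samples)  # 20 10 10
```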
How would someone accomplish this in LabView?
07-06-2012 12:36 PM
Hi Erick, take a look at the specifications for the myDAQ. You don't need to worry about the 200 kHz sample rate; that is actually the maximum rate the device can achieve. To sample at different rates, simply use one timing VI per task. I hope this helps.
07-06-2012 03:13 PM
All of the subsystems (AI, AO, counters) derive their clocks from the STC3, so they won't drift relative to each other. To align your AI sample clock with the pulse train you are generating from the counter, though, you'll want to trigger one of the tasks off of the other. I would start with a couple of examples from Example Finder >> Hardware Input and Output >> DAQmx.
You can trigger AI off of the pulse train. For the counter, start with Gen Digital Pulse Train-Continuous - you're probably already using a VI like this to generate the 10k pulse train. For AI, start with an example like Cont Acq&Graph Voltage-Ext Clk-Dig Start.vi. You'll want to use the internal sample clock, so just delete the "Clock Source" control and the task will fall back to the internal clock. From there, you just need to set the "Trigger Source" to either the PFI line the counter is generating on, or "/<your device name>/Ctr0InternalOutput" (assuming you are using counter 0).
Make sure you start the AI task before the counter task, so that AI is armed and ready to trigger off of the first pulse. The two should be aligned at this point.
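To spell out what "aligned" means once the AI task triggers off the counter's first rising edge: with AI at 200 kHz and a 50% duty-cycle 10 kHz pulse train, each 20-sample block of the AI data spans exactly one square-wave period, so you can tell from the sample index alone which half of the pulse it landed in. A simulated sketch (not DAQmx code, just the index arithmetic; names are illustrative):

```python
# Simulated view of an AI task triggered on the counter's first rising edge.
# Samples 0-9 of each 20-sample block coincide with the high half of the
# pulse, samples 10-19 with the low half.
ai_rate = 200_000
pulse_freq = 10_000
samples_per_period = ai_rate // pulse_freq  # 20

def pulse_state(sample_index):
    """Return True if this AI sample lands in the high half of the pulse."""
    return (sample_index % samples_per_period) < samples_per_period // 2

states = [pulse_state(n) for n in range(25)]
print(states[:10])    # ten True values (high half)
print(states[10:20])  # ten False values (low half)
```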
For debug, you can use DAQmx Export Signal to export the Sample clock - you can then scope the PFI line and pulse train to make sure they are aligned.
Hope this helps,
07-06-2012 03:39 PM
I can't thank you all enough.
After charting some data, it was, of course, not drifting, just as you all stated. The issue was how to start the tasks at the same time.
As a simple stop-gap solution last night, we found that the 10kHz pulse was always starting after the analog input task, so we simply wired the error out of the 10kHz DAQ Assistant clock-output instance to the error in of the analog input DAQ Assistant. This seems to work, with the caveat that it's somewhat random whether the analog input starts on a high or a low half of the 10kHz pulse. In this application, though, we're just computing a difference of means, so it's fine if the halves get flipped, since we take the absolute value.
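To illustrate why the random high/low start doesn't matter here: taking the absolute value makes the difference of means identical whether acquisition begins on a high or a low half-period. A small sketch (names and sample values are made up for illustration):

```python
from statistics import mean

def diff_of_means(period_samples):
    """|mean(first half) - mean(second half)| for one 20-sample period."""
    half = len(period_samples) // 2
    return abs(mean(period_samples[:half]) - mean(period_samples[half:]))

period = [1.0] * 10 + [0.2] * 10   # acquisition started on the high half
flipped = [0.2] * 10 + [1.0] * 10  # acquisition started on the low half

print(diff_of_means(period))   # 0.8
print(diff_of_means(flipped))  # 0.8 - same result either way
```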
It is apparent to me that doing it the way you describe is the better approach, and I will get that up and running when I have time. I will say that while I've gone through the DAQ tutorials, I haven't found a good, succinct explanation of how and why the different DAQmx VIs are used and connected. I definitely need to do more reading.