07-05-2011 12:15 PM
I am trying to generate digital output highs and lows with particular delays on different lines. Each DAQ Assistant activates a single line on a port of a USB-6501. There are more complex high/low patterns that I need to generate, with variable time differences between high and low. The code below accomplishes what I am trying to achieve, but for a long pattern of high and low signals it is very time consuming to do it this way. I am sure there is a better way to do this; I am not an expert in LabVIEW, so I haven't uncovered its full potential. Can anybody suggest a more effective and quicker way to do this? I would highly appreciate it. Thanks!
I have not shown it in the code below, but through a DAQ Assistant I have initialized the lines to low logic level.
07-05-2011 12:18 PM
First you need to get rid of the DAQ Assistants. They are slow and clumsy. They need to be replaced with real code to ever have a chance at precise timing.
What kind of timing accuracy are you looking to achieve?
07-05-2011 12:24 PM
Not very concerned with accuracy as long as it is within +/-50 ms. But my code is expanding, and this is making builds and compilation very slow. I definitely need to get rid of the DAQ Assistants, but I don't know any other approach.
07-05-2011 12:28 PM
Can you post your code so that I can help you? It would be much easier to help with the real code.
07-05-2011 12:30 PM
That is the real code. I want to know how to make it less bulky and more efficient.
07-05-2011 12:33 PM
No, that is a picture of the real code. Can you make a VI of the section that you attached and save it so that I can help?
07-05-2011 12:35 PM
My whole code is basically a bunch of these, but with different sequences of lines and highs and lows, and they are very long. All these units are triggered by switches (array + switch).
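[Editor's note: since the thread's point is replacing one DAQ Assistant per high/low step with a single data-driven loop, here is a minimal sketch of that idea in Python. In LabVIEW this would be an array of clusters fed through a For Loop into one DAQmx Write; the `write` callback below is a hypothetical stand-in for that hardware call, so the sketch runs without a device. All names here are illustrative, not from the original posts.]

```python
import time

# Hypothetical pattern table: each step sets one digital line high or low,
# then waits. One table row replaces one DAQ Assistant + delay pair from
# the original diagram, so long patterns become data instead of wiring.
PATTERN = [
    # (line index on the port, new state, delay in seconds after the write)
    (0, True,  0.05),
    (1, True,  0.10),
    (0, False, 0.05),
    (1, False, 0.00),
]

def run_pattern(pattern, write=None, sleep=time.sleep):
    """Apply each (line, state, delay) step to an 8-line port image.

    `write` stands in for the hardware output (e.g. a DAQmx Write to the
    USB-6501 port); it defaults to None so the sketch runs standalone.
    Returns the final port state as a list of booleans.
    """
    port = [False] * 8  # USB-6501 ports are 8 lines wide
    for line, state, delay in pattern:
        port[line] = state
        if write is not None:
            write(list(port))  # hardware write would happen here
        sleep(delay)  # software timing, fine for the +/-50 ms requirement
    return port

# Demo run with delays skipped; every line ends low, as in the pattern.
final = run_pattern(PATTERN, sleep=lambda s: None)
print(final)
```

The same structure handles "different sequences of lines and highs and lows" by swapping in a different table, and the switch-triggered units become one loop selecting among several tables.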
07-05-2011 12:35 PM
Okay
07-05-2011 12:40 PM
See the attached file
07-05-2011 12:50 PM
Are you going to have more than one DAQ card installed in your system?