10-11-2008 03:32 AM
I'm a complete LabVIEW newbie trying to set up a fast loop to do some software processing.
I created a timed loop and wired the NI DAQ frequency source to it.
I did the following:
1) wired a DBL numeric to FREQUENCY (call it my_freq)
2) wired DEV1/CTR0 to the counter input
3) wired "i" to a numeric output indicator
4) wired an integer constant = 1 to the period of the timed loop
I can change my_freq up to about 20,000 (20 kHz) and it seems to be relatively accurate: counting 10 seconds on the Windows clock between when I hit start and stop gives around 200,000.
Anything above this and it starts to lose counts. Is 20 kHz the max speed I can run this loop? Isn't the clock base 80 MHz?
My system is:
1) PCI 6221
2) Quad-core 3 GHz CPU (Core 2)
3) Geforce9600
Shouldn't I be getting way above 20 kHz? I would like to sample and write to the DIO lines at the max clocked rate of 1 MHz.
Is there any way I can speed up the above code? I know there are some commands that let you sample, say, 1000 samples every clock base, but I'd like to use the fast hardware-based timer if possible to do some software processing.
Am I limited by the update speed of the visual numeric control (while the internal processing is actually faster)?
I wrote the following pseudocode (still figuring out how; I'm used to C, not LabVIEW). If I put this inside the loop, would I get 10x performance?
while (1) {
    count = count + 1;
    if (count == 10) {
        Numeric_output = Numeric_output + 1;
        count = 0;
    }
}
Thanks for the help!
10-11-2008 03:57 AM
To read and write clocked IO you shouldn't use software (single-point read/write) timing.
You should use a hardware clock (from your counter, for instance) and read/write multiple samples in one burst. The DAQmx examples show pieces of code that use a different clock.
Look at timed IO. Be aware that you can only use the first 8 lines to read/write timed DIO.
Ton
10-11-2008 04:13 AM
Thanks for the response!
I am currently using the hardware counter (dev1/ctr0) to control the timing.
Max speed seems to be around 20 kHz per read without dropping timing.
I will look into the burst reads. Does the burst read give timing information?
10-11-2008 04:54 AM
I thought about it some more, and it's bothering me: I can find software that runs on a Pentium III and gives 100 kHz timing on the PC parallel port under Windows using QueryPerformanceCounter.
Is LabVIEW's performance so poor that even with a hardware timer I can only get 20 kHz loops?
Do I need to set up the clock base or something to increase the frequency of my counter? I would imagine it should be in the 80 MHz range.
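For reference, the QueryPerformanceCounter-style measurement mentioned above has a POSIX analogue. A small C sketch (the function name elapsed_us is my own, not from the thread) computing elapsed microseconds from two monotonic timestamps:

```c
#include <time.h>

/* Elapsed microseconds between two monotonic timestamps,
 * the POSIX analogue of QueryPerformanceCounter arithmetic. */
static long elapsed_us(const struct timespec *start, const struct timespec *end)
{
    return (end->tv_sec - start->tv_sec) * 1000000L
         + (end->tv_nsec - start->tv_nsec) / 1000L;
}
```

In practice you would fill the two timespecs with clock_gettime(CLOCK_MONOTONIC, ...). Note that, like QPC, this only measures time finely; it does not stop the OS from preempting the loop between the two reads.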
10-11-2008 05:00 AM
Here is my VI.
What am I doing wrong that if I increase the FREQ to above 20,000 it starts to lose counts?
20 kHz seems very slow. I would imagine I should be getting at least 1 MHz with hardware timing on a PC.
I found some LabVIEW 7 VIs that do software timing in the microsecond range. Is there one that works in LabVIEW 8.5?
10-11-2008 06:17 AM
I don't think you can run a HW-timed loop that fast. Every tick has to travel a long way through the system.
But what is your goal?
Ton
10-11-2008 09:17 AM
Hi Henry,
Measuring time in the µs range is no problem under Windows (apart from accuracy). If you have LV7 VIs, they will most likely also run in LV8.5.
But your task seems to be quite different: you don't want to measure time, you want to clock some data input. You should use HW timing here, as mentioned before, since Windows will interfere with your timing heavily!
What should the attached VI do? Simply updating the indicator at 20 kHz doesn't make sense: you can't watch it that fast, and your screen (probably) updates at 60 Hz...
10-11-2008 10:39 AM
I agree with all the answers: you seem to have some serious misconceptions here. There is absolutely no way to reliably run such a loop under a general-purpose OS (such as Windows), and you should not even try. Even if your simplistic experiment shows something near 20 kHz, it won't be regular or reliable. Sometimes the OS goes off to do other things (networking, checking for updates, swap file management, servicing any of the other 50+ processes running, switching to the UI thread to update the indicators, etc.).
You will have the same problems under C.
henry99 wrote:
Shouldn't I be getting way above 20khz? I would like to sample and write to the DIO lines at the max clocked 1MHz.
Is there any way I can speed up the above code? I know there's some commands that let you sample say 1000 samples every clock base but I'd like to use the fast hardware based timer if possible to do some software processing.
If you want to do things like that in software, you might look into LabVIEW FPGA. Here you can do single-cycle timed loops.
10-11-2008 02:38 PM
Hi thanks for the responses.
My reason for buying a DAQ card and LabVIEW is to control things. What things is open ended, as I am using this device as a hobby tool more than anything else.
Right now my application/hobby of the week is to control 4 axis movement of a XYZ+rotary table with 4 stepper motors I bought off ebay for $85.
NI makes a very nice pulse-output command with the DAQ Assistant, but you can output pulses only on the CTR0 and CTR1 lines on my 6221. So how am I going to control 4 motors with 2 lines?
Well, my idea is to have CTR0 control my timed loop. Then, on every timer tick, I can software-output to all 24 DIO bits at once. I would use the timed loop as a fast and accurate delay routine.
My pseudo code would be:
function timed_loop_delay(input_ticks) {
    timed while loop {
        if (i >= input_ticks) then stop
    }
}

function stepper(num_steps_input, delay) {
    for i = 1 to num_steps_input {
        data_io_lines = on
        call timed_loop_delay(delay)
        data_io_lines = off
        call timed_loop_delay(delay)
    }
}
As you can see if I have a very good timed loop, I can make my own stepper output with a few lines of code.
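The pseudocode above can also be turned inside out for the buffered approach the other replies suggest: instead of toggling the lines in a software loop, precompute one port-0 bit pattern per clock tick and let the hardware play the buffer back. A minimal C sketch (the function name build_step_pattern is mine, not from the thread):

```c
#include <stdint.h>
#include <stddef.h>

/* Build the per-tick port patterns for one motor's step line.
 * Each step is `delay` ticks high followed by `delay` ticks low,
 * so the buffer needs num_steps * 2 * delay entries.
 * step_bit selects which DO line (0..23) carries the pulse.
 * Returns the number of patterns written, or 0 if buf is too small. */
static size_t build_step_pattern(uint32_t *buf, size_t buf_len,
                                 unsigned num_steps, unsigned delay,
                                 unsigned step_bit)
{
    size_t n = (size_t)num_steps * 2u * delay;
    if (delay == 0 || n > buf_len)
        return 0;
    for (size_t i = 0; i < n; i++) {
        /* the first `delay` ticks of each step are high, the rest low */
        unsigned phase = (unsigned)(i / delay) % 2u;
        buf[i] = (phase == 0) ? (1u << step_bit) : 0u;
    }
    return n;
}
```

With 3 steps, a delay of 2 ticks, and line 0, the buffer comes out as 1,1,0,0,1,1,0,0,1,1,0,0. ORing together one such buffer per motor (each on its own bit) gives a single pattern array driving all 4 steppers from one hardware-clocked write.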
I am well aware that I can buy a more expensive board with more counters, but most M-series boards still only have 2 hardware timers. I could buy a counter board or a motion control card, but this is a hobby. I picked up a used 6221 off eBay with an SCB-68 + cable + LabVIEW license for $250 and don't want to spend more than that. I could easily program my SiLabs ToolStick, which costs $20 including the development system and runs at 20 MHz, to do the two loops above no problem, and control it with simple commands over RS-232 on the USB port.
However, I want to make my life easier and use the very nice windows system and 6221 I just bought.
I don't buy the argument that Windows is the problem. I have 4 CPUs that run at 3 GHz each, very fast 1000+ MHz memory, etc. Programs like Mach3, which costs $100, can do 6-axis CNC motion control on a cheap computer under Windows using the PC parallel port and software timing up to 100 kHz.
So in summary, all I want to do is write a fast and accurate time-delay routine. I am using the hardware timer on my 6221, which is supposed to have an 80 MHz clock base, hooked up to a timed loop. I should then be able to do single reads/writes at 1 µs intervals (the speed of the board). I might upgrade my board to one that does 10 MHz clocked I/O, but if I can't do fast timing past 20 kHz, there's no point.
10-12-2008 01:14 AM
You have 8 DO lines that can be timed on the 6221 (port 0).
Have a look at the following example to get an idea of how to do timed IO:
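(The linked example is not preserved in this archive. As a rough, hardware-dependent sketch only: buffered hardware-clocked DO with the NI-DAQmx C API might look like the following. The device name Dev1, the 100 kHz rate, and borrowing ctr0 as the DO sample clock are assumptions for illustration, and all error checking is omitted.)

```c
#include <NIDAQmx.h>

/* Sketch: play a precomputed buffer of port-0 patterns out at a fixed
 * hardware clock rate, instead of writing one point per software loop.
 * M-series DO has no clock of its own, so ctr0 supplies the sample clock. */
void write_buffered_do(const uInt32 *patterns, int32 num_samples)
{
    TaskHandle clk = 0, task = 0;
    int32 written = 0;

    /* ctr0 generates a continuous 100 kHz pulse train as the clock */
    DAQmxCreateTask("", &clk);
    DAQmxCreateCOPulseChanFreq(clk, "Dev1/ctr0", "", DAQmx_Val_Hz,
                               DAQmx_Val_Low, 0.0, 100000.0, 0.5);
    DAQmxCfgImplicitTiming(clk, DAQmx_Val_ContSamps, 1000);

    /* port0 is the only hardware-timed DIO port on the 6221 */
    DAQmxCreateTask("", &task);
    DAQmxCreateDOChan(task, "Dev1/port0", "", DAQmx_Val_ChanForAllLines);
    DAQmxCfgSampClkTiming(task, "/Dev1/Ctr0InternalOutput", 100000.0,
                          DAQmx_Val_Rising, DAQmx_Val_FiniteSamps,
                          num_samples);
    DAQmxWriteDigitalU32(task, num_samples, 0, 10.0,
                         DAQmx_Val_GroupByChannel, patterns, &written, NULL);

    DAQmxStartTask(task);
    DAQmxStartTask(clk);
    DAQmxWaitUntilTaskDone(task, 10.0);
    DAQmxClearTask(task);
    DAQmxClearTask(clk);
}
```

This moves the per-tick work off the CPU entirely: the software only fills the buffer, and the board clocks it out at the full rate regardless of what Windows is doing.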
Good luck,
Ton