LabVIEW


accurate timing between events

hi all

I have a question regarding timing. I am using LabVIEW to output a digital 1 followed by a 0 through a PCI-6516 card to a valve: 1 opens the valve and 0 closes it. The valve is connected to a micro-needle capable of dispensing microliter volumes, so the time delay between the two instructions determines how long the valve stays open and hence the droplet volume. Right now I am doing this with a sequence structure, where I have added a frame between the on and off writes that contains a 'Wait Until Next ms Multiple' function. The delay is on the order of milliseconds (1-100 ms) in order to get the requisite volume.
The program works fine, but I am wondering what the accuracy of this setup is and whether there is a way to make it more robust. I connected the output from the card to an oscilloscope, and looking at the square waves generated, the period does not seem very accurate at higher frequencies.
I am attaching the VI to this message (LabVIEW 8). Any help would be greatly appreciated; I am just starting to get the hang of this.
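For reference, here is a rough text sketch of what the VI described above does, written in Python with the nidaqmx package rather than LabVIEW just to show the structure; the line name "Dev1/port0/line0" and the 20 ms pulse width are placeholders, not values taken from the attached VI.

```python
# Minimal sketch of the software-timed pulse described above, using the
# nidaqmx Python package instead of LabVIEW G. "Dev1/port0/line0" and the
# 20 ms pulse width are placeholders, not values from the attached VI.
import time
import nidaqmx

PULSE_MS = 20  # desired valve-open time (the 1-100 ms range discussed above)

with nidaqmx.Task() as task:
    task.do_channels.add_do_chan("Dev1/port0/line0")
    task.write(True)                 # digital 1: open the valve
    time.sleep(PULSE_MS / 1000.0)    # software delay -> subject to OS jitter
    task.write(False)                # digital 0: close the valve
```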


--
Abhishek Sahay
MS Candidate
Biomedical Engineering Department
Rutgers University
732-789-7313
Message 1 of 17
Unless you're using a counter on your card to do timing, you'll only be accurate to about 16 ms, because that's about as accurate as your OS will get you.  You could look into hardware timing or a real-time OS if it's critical.
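A quick way to see this on any machine is to request a fixed software wait repeatedly and measure what you actually get. A small Python sketch (the 10 ms request is arbitrary):

```python
# Quick check of software-timer accuracy: request a 10 ms wait repeatedly
# and record how long each wait actually took. The 10 ms value is arbitrary.
import time

requested_ms = 10
errors = []
for _ in range(100):
    start = time.perf_counter()
    time.sleep(requested_ms / 1000.0)
    actual_ms = (time.perf_counter() - start) * 1000.0
    errors.append(actual_ms - requested_ms)

print(f"mean error {sum(errors) / len(errors):.2f} ms, "
      f"worst error {max(errors):.2f} ms")
```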
Message 2 of 17
Hi Jeff
Thanks for the input. As I do not have access to hardware timing, I am focusing on any way I can increase accuracy by changing the current configuration of the VI, such as using a different timing function or structure, because I am not sure this is the optimal way to do it.

Regards
Abhishek
Message 3 of 17
Wait Until Next ms Multiple is good for longer time limits, but if you set it to, say, 50 ms and your program finishes at 51 ms, it will wait the extra 49 ms. This means you may have a full 100 ms between iterations. However, if you just put a Wait (ms) in parallel with your code, it will guarantee the code takes at least 50 ms to execute (though the code may also take longer).
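In other words, the extra wait depends on where the code happens to finish relative to the multiple. A toy model of the two behaviours, in Python, using the 50/51 ms numbers above (illustration only, not measurements):

```python
# Toy model of the two LabVIEW timing functions discussed above
# (illustration only; the numbers are examples, not measurements).

def wait_until_next_multiple(elapsed_ms, multiple_ms):
    """Extra wait added by 'Wait Until Next ms Multiple'."""
    return (multiple_ms - elapsed_ms % multiple_ms) % multiple_ms

def wait_in_parallel(elapsed_ms, wait_ms):
    """Extra wait when a fixed 'Wait (ms)' runs in parallel with the code."""
    return max(0, wait_ms - elapsed_ms)

# Code finishes at 51 ms with a 50 ms setting:
print(51 + wait_until_next_multiple(51, 50))  # 100 -> a full 100 ms iteration
print(51 + wait_in_parallel(51, 50))          # 51  -> just the code time
```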


Message Edited by JeffOverton on 06-09-2008 01:36 PM
Message 4 of 17
There is no reason to be starting and stopping the task with each iteration of your loop. You can also use the timed sequence instead of the Wait Until Next ms Multiple.
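A sketch of that structure, using Python/nidaqmx for illustration (the line name, 20 ms pulse, and 500 ms pause are placeholders): the task is created once, and only the writes and the delays sit inside the loop.

```python
# Sketch of the structure suggested above: create the task once and keep
# only the writes and the delays inside the loop.
# "Dev1/port0/line0", the 20 ms pulse and the 500 ms pause are placeholders.
import time
import nidaqmx

with nidaqmx.Task() as task:                      # created once, not per droplet
    task.do_channels.add_do_chan("Dev1/port0/line0")
    for _ in range(10):                           # dispense 10 droplets
        task.write(True)                          # open valve
        time.sleep(0.020)                         # valve-open time
        task.write(False)                         # close valve
        time.sleep(0.500)                         # pause between droplets
```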
Message 5 of 17
Wait Until Next ms Multiple is completely inadequate for this! It will simply NOT work as you expect, because the two waits will interact. The result will depend strongly on how the two times relate to each other. For example, if the loop time is not an integer multiple of the wait, your timing could be all over the map. This makes for very unpredictable code. It is mathematically incorrect!
 
See also the following explanation:
 
If you really want to do software timing, a Wait (ms) is the correct way to implement the duration delay. (You can keep the 'Wait Until Next ms Multiple' for the delay between loop iterations; except for the first iteration, it will keep the loop pacing correct as long as the code can keep up with the loop rate.)
 
I agree you should go with a hardware-timed solution; it shouldn't be too difficult with your setup. Maybe all you need is one 'multiple samples' write (two samples in this case) with the desired delay between them for each iteration of the loop.
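A sketch of that two-samples-per-buffer idea, in Python/nidaqmx for illustration. The device and terminal names are assumptions, and on an M-series board such as the 6251 the digital-output sample clock has to be borrowed from another subsystem (here an assumed counter output, whose own configuration is not shown):

```python
# Sketch of the "two samples per drop" idea: write [on, off] as one
# hardware-timed buffer so the pulse width is set by the sample clock
# (pulse width = 1 / rate). Device/line names are placeholders, and the
# DO sample clock is assumed to come from a counter output
# ("/Dev1/Ctr0InternalOutput") that is generating pulses elsewhere.
import nidaqmx
from nidaqmx.constants import AcquisitionType

PULSE_S = 0.020                       # desired valve-open time

with nidaqmx.Task() as task:
    task.do_channels.add_do_chan("Dev1/port0/line0")
    task.timing.cfg_samp_clk_timing(
        rate=1.0 / PULSE_S,
        source="/Dev1/Ctr0InternalOutput",   # borrowed/external DO clock
        sample_mode=AcquisitionType.FINITE,
        samps_per_chan=2)
    task.write([True, False], auto_start=True)   # on, then off, one clock apart
    task.wait_until_done()
```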
Message 6 of 17
Hi Altenbach

Thanks for the help. I understand now what it means when you say that this is mathematically incorrect. I made the change you suggested, and my readings changed completely for the better! I am now getting a linear relationship between the time delay and droplet weight, which was nonlinear before the change. In the next iterations of the experiments I will definitely change over to hardware timing.
Thanks once again to everyone for the feedback.

Abhishek
Message 7 of 17

Hi everyone

 

I am working on the same project and have come to the point where I need hardware timing. I am attaching the final version of the program I am currently using. The DAQ card is a PCI-6516. We also have a PCI-6251, which I understand can be used for hardware timing, but I am not really sure how to go about it. I would be grateful if someone could push me in the right direction.

 

Regards

Abhishek

Message 8 of 17

Abhishek,

 

If you want to use hardware-timed analog output, it looks as though you will just need to add a DAQmx Timing VI. This allows you to set the rate at which you want the hardware to write data out. I have attached a link below that covers 10 standard DAQmx functions that are commonly used. One of those is DAQmx Timing, and that would be a good place for you to start.

 

NI-DAQmx Timing

Aaron W.
National Instruments
CLA, CTA and CPI
Message 9 of 17

Hi Aaron

 

Thanks for the help. Actually, I want a hardware-timed digital output. I am controlling a valve that is switched on or off based on a digital '1' or '0', and I want to control the time it remains on. I was using the 'Wait Until Next ms Multiple' function until now, but it is not very accurate for short durations. I wanted to find out if I could use the 6251 card to time the pulse.
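One way to do that with the 6251 is to let one of its counters generate the pulse itself, so the on-time is set entirely in hardware. A hedged sketch in Python/nidaqmx for illustration: "Dev2/ctr0" and the 20 ms pulse are assumptions, and the valve would then be driven from the counter's output terminal rather than the 6516 line.

```python
# Sketch of a hardware-timed pulse from a counter on the PCI-6251:
# the counter output pin goes high for exactly `high_time` seconds, so the
# valve-open time no longer depends on software timing. "Dev2/ctr0" and the
# 20 ms pulse are assumptions, not taken from the attached program.
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.co_channels.add_co_pulse_chan_time(
        "Dev2/ctr0",
        low_time=0.001,     # idle time around the pulse
        high_time=0.020)    # valve-open time, timed in hardware
    task.timing.cfg_implicit_timing(
        sample_mode=AcquisitionType.FINITE,
        samps_per_chan=1)   # generate exactly one pulse
    task.start()
    task.wait_until_done()
```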

Message 10 of 17