
time base to multiply with tick count

Hi,

Happy Easter!

 

A quick question, please! I have designed a VI to calculate the speed of a signal. I'm using a simulated signal to make sure it works before applying it to real hardware. In this case I know the simulated frequency, which is 10.1 Hz (0.099 s). Using the Tick Count function I am getting roughly 16,000 ticks as the period.

This VI was created as a new VI from the Getting Started window. I have the Real-Time and FPGA modules and LabVIEW 2009 on my machine.

Am I on the right track so far?

If so, what is the timebase I should multiply my 16,000 ticks by to get the period (and from that the frequency)? I believe I should get 0.099 s to ensure I'm on the right track?
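To make the arithmetic I have in mind explicit, here is a minimal C sketch; the 1 ms timebase is just a placeholder guess, since that value is exactly what I'm asking about:

#include <stdio.h>

int main(void)
{
    double ticks = 16000.0;   /* measured ticks per signal period */
    double timebase_s = 1e-3; /* placeholder: seconds per tick (the unknown I'm asking about) */

    double period_s = ticks * timebase_s; /* period = ticks * timebase */
    double freq_hz  = 1.0 / period_s;     /* frequency = 1 / period */

    printf("period = %.4f s, frequency = %.3f Hz\n", period_s, freq_hz);
    /* I expect the period to come out near 0.099 s (10.1 Hz) if the timebase is right. */
    return 0;
}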

Thanks in advance for your help.

Cheers

 

0 Kudos
Message 1 of 4
(4,701 Views)

On all desktop platforms, the timer ticks are in fact in milliseconds. However, this timer tick is based on the underlying software timer tick of the OS, and as such, although fairly accurate, it is nothing you should use for very precise timing. It is driven by a software interrupt that gets triggered by a hardware interrupt, which derives its clock from a crystal oscillator on your system board. This crystal oscillator is typically not very high precision, since every ppm of accuracy costs an extra fraction of a penny, and with today's pricing schemes, a few pence saved here and there can make the difference between a motherboard that still makes some profit and one that doesn't. Also, the timer tick interrupt can get preempted by higher-priority interrupts, so its interval is really more an indication than a very accurate measurement.

 

That said, using the real-time clock in the PC is typically even less accurate, as its resolution in current Windows versions is typically about 16 ms.
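As a quick illustration of that granularity, here is a minimal C sketch (assuming a Windows machine; the step size you observe depends on the OS timer period) that spins until the reported millisecond tick advances:

#include <stdio.h>
#include <windows.h>

int main(void)
{
    DWORD prev = GetTickCount();
    for (int i = 0; i < 5; ++i) {
        DWORD now;
        while ((now = GetTickCount()) == prev)
            ;  /* spin until the reported millisecond tick advances */
        printf("tick stepped by %lu ms\n", (unsigned long)(now - prev));
        prev = now;
    }
    return 0;
}

On a typical Windows system you will see the tick advance in steps of roughly 15 to 16 ms, even though the value itself is expressed in milliseconds.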

Rolf Kalbermatter
My Blog
0 Kudos
Message 2 of 4
(4,668 Views)



rolfk wrote:

It's run by a software interrupt that gets triggered by a hardware interrupt that derives its clock from a crystal oscillator on your system board.

Thanks for the reply, Rolf! Let me explain a bit more about what I am trying to achieve.

My original code is in an FPGA VI, where the Tick Count function has an internal counter with a clock frequency of 40 MHz, and the counter size can be chosen as 2^8, 2^16, or 2^32 counts, after which the counter rolls over and starts again from zero. Using this, it is possible to count the ticks/clock cycles between two events, such as the start and stop of a loop, and then multiply by 1/40 M to get the loop period.
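To make that concrete, here is a minimal C sketch of the tick arithmetic, assuming a free-running 32-bit counter at 40 MHz; the unsigned subtraction also handles a single rollover between the two readings:

#include <stdint.h>
#include <stdio.h>

#define FPGA_CLOCK_HZ 40000000u  /* 40 MHz FPGA timebase */

/* Elapsed ticks between two readings of a free-running 32-bit counter.
   Unsigned subtraction wraps correctly modulo 2^32, so one rollover
   between start and stop is handled automatically. */
static uint32_t elapsed_ticks(uint32_t start, uint32_t stop)
{
    return stop - start;
}

int main(void)
{
    uint32_t start = 4294967000u;  /* just before the counter rolls over */
    uint32_t stop  = 3000304u;     /* after the counter has wrapped */

    uint32_t ticks = elapsed_ticks(start, stop);
    double period_s = (double)ticks / FPGA_CLOCK_HZ;  /* ticks * 1/40e6 */

    printf("ticks = %u, period = %.6f s\n", ticks, period_s);
    return 0;
}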

I have posted the code as a normal LabVIEW VI, where the Tick Count function looks different from the Tick Count in the FPGA VI, and I guess the knowledge you shared (particularly, that the hardware interrupt derives its clock from the crystal oscillator on the system board) applies to the Tick Count function when it is used in a normal VI?

The reason I tried a normal LabVIEW VI is that the FPGA VI palette doesn't have a Simulate Signal (sine wave) function. For now I am testing the code on the development computer, running it on a random number source. I get the loop period as 40,000 ticks, which at the 40 MHz timebase implies the random number generator, when the FPGA VI runs on the development computer, updates at 1 kHz (40,000,000 / 40,000 = 1,000 Hz). Is that true? If so, then I guess my code is verified.

 

Ta

 

0 Kudos
Message 3 of 4
(4,660 Views)

The tick count functions are ideal for timing your applications. Timestamps (in PC-based applications) and tick count functions are commonly used for benchmarking. Have a look at the following link for an example of their use.

 

3 methods of creating a software stop watch

http://decibel.ni.com/content/docs/DOC-9811

 

Furthermore, please have a look at the attached .vi (LV 2009), which uses tick counts to benchmark a variety of different methods of data transfer in LabVIEW.

It shows the efficiency difference between wires, variables and property nodes.
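For reference, the same start-tick/stop-tick stopwatch pattern looks like this as a minimal C sketch (the dummy loop stands in for whatever code you want to benchmark; the attached VI is the LabVIEW version):

#include <stdio.h>
#include <time.h>

/* A millisecond "tick count" built on the standard C clock(),
   mirroring the start-tick / stop-tick stopwatch pattern.
   Note: clock() measures processor time, which is adequate for
   CPU-bound benchmarking sketches like this one. */
static unsigned long tick_count_ms(void)
{
    return (unsigned long)((double)clock() * 1000.0 / CLOCKS_PER_SEC);
}

int main(void)
{
    unsigned long start = tick_count_ms();

    volatile double acc = 0.0;  /* dummy workload to time */
    for (long i = 0; i < 10000000L; ++i)
        acc += (double)i;

    unsigned long stop = tick_count_ms();
    printf("elapsed: %lu ms (acc = %f)\n", stop - start, acc);
    return 0;
}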

 

Thanks for your time,

Rich Roberts
Senior Marketing Engineer, National Instruments
Connect on LinkedIn: https://www.linkedin.com/in/richard-roberts-4176a27b/
0 Kudos
Message 4 of 4
(4,617 Views)