LabVIEW


How long does a tick last for the Tick Count (ms) VI in LabVIEW?

I'm trying to compare the timing performance between a VI implemented in LabVIEW and another VI implemented with the LabVIEW FPGA module. I use Tick Count in both VIs (each in its own environment), so I want to know: how long does a tick last in standard LabVIEW, running on the computer?

Thanks

Message 1 of 14

1 millisecond.

 

That is why it is even called "Tick Count (ms)" and the output is called "millisecond timer value".
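On a PC, the function returns an unsigned 32-bit millisecond counter, so it eventually wraps around; elapsed time should be computed as the wrap-safe difference of two readings. A minimal sketch of the arithmetic in Python (LabVIEW code is graphical, so this only illustrates the idea):

```python
def elapsed_ms(start_tick: int, end_tick: int) -> int:
    """Wrap-safe difference of two u32 millisecond tick readings."""
    return (end_tick - start_tick) % (1 << 32)

# Example: the counter wrapped past 2**32 between the two readings.
print(elapsed_ms(4294967290, 4))  # 10
print(elapsed_ms(100, 250))       # 150
```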

Message 2 of 14

I ask because I read that a tick lasts 55 ms in the following source:

http://books.google.com.mx/books?id=en1GKs2huTcC&pg=PA33&dq=tick+count+(ms)+labview+55+milliseconds&...

My tick count gives better results than the hardware, which should be impossible because it is the same algorithm. If I assume the 55 ms tick instead, then the FPGA really is faster.

                     Hardware (ms)   Software, 1 ms tick (ms)   Software, 55 ms tick (ms)
Algorithm 1          49.48           1.5552                     84.535
Algorithm 2          0.8875          0.032                      1.87
Algorithm 3          0.1756          0.0241                     1.43
Algorithm 4          0.27            0.27                       1.32

 

What's your opinion?

Thanks again

Message 3 of 14

Without seeing your code and how you calculated the time, it is impossible to say anything about your timing.
But I doubt the PC is running your algorithm in 1.5 ms. How did you even get 1.5 ms out of the ms timer?

 

You do know that a VI can run on both an FPGA and a PC?

You also know that an FPGA is so different from a PC running an OS that this comparison makes little sense?

 

What timing requirement does your algorithm have?

How complex is your algorithm?

How often does your algorithm run?

How is the algorithm connected to the rest of your program?


How fast you can get the FPGA to run your code depends very much on how you have structured the code.

But when running an algorithm on an FPGA compared to a PC, you should see a big difference. The FPGA should be faster.

Message 4 of 14

OK, I used the following VI to measure the elapsed time of the four algorithms with Tick Count in both LabVIEW and the LabVIEW FPGA module. I want to highlight the advantages of using the FPGA instead of the PC. The four algorithms were coded in both LabVIEW and LabVIEW FPGA, so they differ in the elements used to calculate the final result; for instance, the FPGA VI uses memory elements while the software VI uses arrays. The number of algorithmic iterations is the same in both cases. In the end, I want to compare them so I can recommend the FPGA for real-time applications.

Message 5 of 14

Hello vitrion,

 

It appears that you're using the Tick Count Express VI available in LabVIEW RT and LabVIEW FPGA rather than the Tick Count (ms) function as stated in your question. What time source is the Express VI configured for in the RT code? In the FPGA code?

 

Regards,

Tom L.
Message 6 of 14

I forgot to say that in LabVIEW I used the following code to measure times below one millisecond: I repeated the code 1000 times and divided the final result by 1000.
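That repeat-and-divide approach can be sketched like this (Python's `time.monotonic` stands in for Tick Count (ms) here; note the loop overhead is included in the average, which matters for very short bodies):

```python
import time

def average_duration_ms(func, repeats=1000):
    """Time `repeats` calls of `func` with one pair of clock readings,
    then divide by the repeat count to resolve sub-millisecond durations."""
    start = time.monotonic()
    for _ in range(repeats):
        func()
    total_ms = (time.monotonic() - start) * 1000.0
    return total_ms / repeats

# Example: a short computation, timed as an average over 1000 runs.
avg = average_duration_ms(lambda: sum(range(100)))
print(f"{avg:.4f} ms per call")
```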

Message 7 of 14

@vitrion wrote:

I ask because I read that a tick lasts 55 ms in the following source:

http://books.google.com.mx/books?id=en1GKs2huTcC&pg=PA33&dq=tick+count+(ms)+labview+55+milliseconds&...

My tick count gives better results than the hardware, which should be impossible because it is the same algorithm. If I assume the 55 ms tick instead, then the FPGA really is faster.

                     Hardware (ms)   Software, 1 ms tick (ms)   Software, 55 ms tick (ms)
Algorithm 1          49.48           1.5552                     84.535
Algorithm 2          0.8875          0.032                      1.87
Algorithm 3          0.1756          0.0241                     1.43
Algorithm 4          0.27            0.27                       1.32

 

What's your opinion?

Thanks again


I think you are misunderstanding what the value of a "tick" is. From what I gathered from reading your link (which I couldn't get to from your post, but accidentally ran into on a search), the 55 ms figure refers to a tick of the OS clock. Curiously, though - why 55 ms? That was mentioned as the tick of a Windows 95/98 OS!

 

I guess you can say that it returns the value at the next "tick" of the OS clock, expressed in ms. So the resolution of a tick varies from OS to OS. In Windows 95/98, for example, the OS clock only updates every 55 ms or so, but the function still returns a value that represents ms.
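One way to see this effect is to spin on a millisecond clock until its value steps, which reveals the clock's actual update granularity rather than its units. A sketch using Python's `time.monotonic` as a stand-in for the OS clock:

```python
import time

def timer_granularity_ms(samples=5):
    """Estimate the update granularity of a millisecond clock by
    spinning until its value changes and recording each step size."""
    ms = lambda: int(time.monotonic() * 1000)
    steps = []
    for _ in range(samples):
        t0 = ms()
        while ms() == t0:          # busy-wait for the next clock update
            pass
        steps.append(ms() - t0)
    return min(steps)

print(timer_granularity_ms(), "ms")
```

On a modern OS this typically prints 1 ms; on a Windows 95/98-era timer it would have reported roughly 55 ms even though the value is expressed in ms.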

Bill
CLD
(Mid-Level minion.)
My support system ensures that I don't look totally incompetent.
Proud to say that I've progressed beyond knowing just enough to be dangerous. I now know enough to know that I have no clue about anything at all.
Humble author of the CLAD Nugget.
Message 8 of 14

In LabVIEW FPGA I'm using the 40MHz clock, so each tick lasts 25ns. I installed only the LabVIEW FPGA module and the required components.
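The conversion follows directly from the clock rate: at 40 MHz, one tick is 1/40,000,000 s = 25 ns. A quick sketch of the arithmetic (the constant and function names here are my own, not LabVIEW's):

```python
# Converting raw FPGA tick counts to time at a given clock rate.
CLOCK_HZ = 40_000_000            # 40 MHz onboard clock
TICK_NS = 1e9 / CLOCK_HZ         # nanoseconds per tick

def ticks_to_us(ticks: int) -> float:
    """Convert raw FPGA ticks to microseconds."""
    return ticks * TICK_NS / 1000.0

print(TICK_NS)            # 25.0
print(ticks_to_us(4000))  # 100.0
```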

Message 9 of 14

OK, you're right. That is not the duration of a tick. But if I use LabVIEW to measure the elapsed time with the VI I provided, the result is less than 2 ms, which is not correct, as you can see in the previous messages.

Message 10 of 14