08-13-2019 11:32 AM
Hi all
I'm controlling a robot with myRIO and I need to read time in microsecond resolution.
I have real-time module, so I use "Tick count express vi" from real-time module by choosing microsecond output.
But I do not get accurate data. For example, I want to measure the sampling time of a loop, so I put a 1 ms wait in the loop; the elapsed time I read oscillates (which is natural), but it sometimes reads values below 1000 microseconds, which is wrong since there is a 1 ms wait in the loop.
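The measurement above (timestamping consecutive iterations of a loop that contains a 1 ms wait) can be sketched in Python as an illustration of the technique; here `time.perf_counter_ns` stands in for the microsecond tick counter, and the function name and iteration count are my own choices, not anything from the myRIO code:

```python
import time

def measure_loop_periods(n_iters=200, wait_s=0.001):
    """Measure the period of a software-timed loop using a
    monotonic high-resolution counter (analogous to reading a
    microsecond tick count at the top of each iteration)."""
    periods_us = []
    last = time.perf_counter_ns()
    for _ in range(n_iters):
        time.sleep(wait_s)                        # the 1 ms wait inside the loop
        now = time.perf_counter_ns()
        periods_us.append((now - last) / 1000.0)  # ns -> us
        last = now
    return periods_us

periods = measure_loop_periods()
print(f"min={min(periods):.1f} us  max={max(periods):.1f} us")
```

With a monotonic clock the measured period can never be shorter than the wait itself, so readings below 1000 us point at the timer being read, not at the loop.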
I am wondering: when we run a VI on the myRIO from Windows, which timer does it use?
If it uses the Windows timer, it is natural to get wrong results (since the Windows timing resolution is 1 ms); however, if it uses the myRIO's timers, I should get accurate results.
Does anybody have any ideas?
Best
Farshid
08-14-2019 04:52 PM
I found the answer to my question: it uses the myRIO's own timer.
Now the problem is that when I run a loop at almost 1 kHz and plot the sampling time (the time between consecutive iterations), there is a lot of jitter. Also, as can be seen in the attached plot, the duration of the hiccups shows an increasing trend.
I tried to attenuate this by changing the priority of the VI, but it did not help.
Is there any way to avoid this? My application needs to run at around 1 kHz.
Best
Farshid
08-19-2019 11:08 AM
Can you attach your code to this post so that people can see what you're using to measure?
08-19-2019 11:43 AM
Hi
I found the answer.
It uses its own clock.
The issue that can be seen in the picture was caused by sending data out of the while loop through a tunnel.
The solution was to use two parallel while loops: one does the time-critical work and shares data through shared variables, while the other does the graphing and other processing at whatever rate it can.
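A LabVIEW block diagram can't be shown in text, but the two-loop (producer/consumer) structure described above can be sketched in Python, with threads standing in for the parallel while loops and a queue standing in for the shared variables; the loop bodies here are placeholders, not the actual control or graphing code:

```python
import queue
import threading
import time

data_q = queue.Queue()  # hand-off channel between the two loops

def critical_loop(n_iters=100):
    """Time-critical loop: does the control work and only enqueues
    results; it never blocks on display code or a loop-output tunnel."""
    for i in range(n_iters):
        sample = i * i            # placeholder for the real control computation
        data_q.put(sample)        # non-blocking hand-off to the other loop
        time.sleep(0.001)         # ~1 kHz software timing
    data_q.put(None)              # sentinel: producer is finished

def display_loop(results):
    """Non-critical loop: consumes data at whatever rate it can."""
    while True:
        item = data_q.get()
        if item is None:
            break
        results.append(item)      # stand-in for graphing/logging

results = []
t1 = threading.Thread(target=critical_loop)
t2 = threading.Thread(target=display_loop, args=(results,))
t1.start(); t2.start()
t1.join(); t2.join()
print(len(results))
```

The key design point is the same as in the LabVIEW version: the critical loop never waits on the slow consumer, so graphing hiccups cannot add jitter to the 1 kHz loop.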