Get date time in seconds under the hood

Solved!

Hi everybody,

 

I'm working on a test bench with a Meinberg PTP270PEX PCI card.

 

This card is able to overwrite the Windows system time via a Windows service that works transparently.

 

Since Windows 8, a new time-management facility has been introduced, making it possible to get timestamps with true µs resolution. So it's possible to get high-resolution time under Windows using the "GetSystemTimePreciseAsFileTime" function of Windows.

 

My question is: what exactly does the standard LabVIEW "Get Date/Time In Seconds" do under the hood?

Does it call GetSystemTimePreciseAsFileTime, or the standard GetSystemTime?

 

I don't want to make a DLL call from LabVIEW, because the call overhead of the DLL would introduce a time offset in the timestamp.

 

Best regards,

Message 1 of 12

Why would it be any faster if LabVIEW called GetSystemTimePreciseAsFileTime than if you did? Also, unless you are on a real-time OS, you aren't going to get sub-microsecond accuracy anyway.

Message 2 of 12

According to MSDN, GetSystemTimePreciseAsFileTime() was introduced in Windows 8. Therefore, as long as LabVIEW supports Windows 7, it can't really use that function anyhow.

 

Aside from that, everything majoris says is true too. It doesn't matter whether you call that API or LabVIEW does; the time for the call is about the same. And Windows, as a non-RT system, CAN certainly take much more than a few microseconds to call this API no matter how you do it.

Rolf Kalbermatter
My Blog
Message 3 of 12
Because
Message 4 of 12

Because of the time overhead of a DLL call from a diagram.

Message 5 of 12

Disable debugging in the Call Library Node after you have made sure everything works perfectly, and the overhead compared to when LabVIEW calls this function internally is totally negligible! The inaccuracy of Windows scheduling, in terms of getting your code executed in a particular thread and between other tasks (processes) on your computer, is orders of magnitude bigger anyhow!

Rolf Kalbermatter
My Blog
Message 6 of 12

I know... So why is there a microsecond timer in the palette?

Message 7 of 12

The manufacturer of the 1588 PCI card explained to me that what you say was true before Win8. Since then, major improvements have been made to time management in Windows. To be confirmed!

Message 8 of 12

The timer function's precision isn't a lie; just the accuracy (a.k.a. the repeatability) is. And I think the same will be true for the precise time in the Windows API. My general reading seems to indicate that GetSystemTimePreciseAsFileTime uses the query performance counter (QPC) for its precision, and the QPC API has been around since XP. How the QPC is actually calculated depends on the hardware. Very likely, calling GetSystemTimePreciseAsFileTime will cost you a syscall, which will cost you a context switch, which is slow when we are talking about tens of nanoseconds. But maybe the driver for your PCI card is magic and prevents all that somehow (e.g. by writing the time to a user-mode memory-mapped location).

Message 9 of 12

I agree... but reading this has brought me to see things in a different way:

https://www.greyware.com/software/domaintime/v5/overview/w32time.asp

Message 10 of 12