
LabVIEW Temperature Controller

Solved!

Hey, I am using LabVIEW to interface with a temperature controller. To acquire the data, I am using a while loop. The reading interval for the device is entered in milliseconds, and it's supposed to gather one value per interval. So when I input an interval of 1000, it should acquire one value per second. However, when I acquire 1000 data points, it actually takes ~1075 seconds. Any idea why this may be happening?

Message 1 of 7
Solution
Accepted by topic author pvally

I don't see anything in your pic that would control the loop timing, but even if there were, there is always some overhead in VISA communication, plus the possibility that the OS (unless you're using a real-time OS) will take control and delay your loop.

 

Basically, if you want perfect timing and deterministic behavior, you have to use a dedicated real-time system. Otherwise you are going to get some timing inconsistencies.

 

An extra 75 seconds in 1000 seconds seems excessive, though. Maybe try making your hardware collect as fast as it can and adding a 1000 ms Wait.vi in your loop?
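Since LabVIEW diagrams can't be pasted as text, here's a rough Python sketch of that pattern (read_temperature() is a made-up stand-in for your instrument query, and the 25 ms is an assumed, not measured, communication cost). The idea is to pace the loop against absolute deadlines so the per-read overhead gets absorbed instead of added to every period:

import time

def read_temperature():
    # Stand-in for the real VISA query; assume ~25 ms round trip.
    time.sleep(0.025)
    return 23.4

INTERVAL = 1.0  # desired sample period, seconds
next_deadline = time.monotonic()
samples = []

for _ in range(1000):
    samples.append(read_temperature())
    next_deadline += INTERVAL
    # Sleep only for what's left of this period; the read's ~25 ms
    # is absorbed rather than stacked on top of the full second.
    time.sleep(max(0.0, next_deadline - time.monotonic()))

On a desktop OS each individual period can still jitter by a few milliseconds, but the error no longer accumulates over 1000 iterations the way your extra 75 seconds did.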

LabVIEW Pro Dev & Measurement Studio Pro (VS Pro) 2019 - Unfortunately now moving back to C#, .NET, Python due to forced change to subscription model by NI. 8^{
Message 2 of 7

I'm guessing that you're doing a lot of other stuff in series with your acquisition.

Message 3 of 7

Because of overhead on either side of the call. What it's actually doing is sending a request, then waiting for a reply; then it looks like it sends two other messages as well, then loops again. So while your sample setting is 1 second, the send/receive time to your device is probably about 25 milliseconds: 1 second + 0.025 × 3 = 1.075 seconds per sample, × 1000 = about 1075 seconds.
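Putting that arithmetic in one place (the 25 ms round trip and the three-messages-per-sample count are assumptions about the protocol, not measurements):

# Assumed: three round trips per sample (the request/reply plus the
# two extra messages per iteration), each costing ~25 ms.
overhead_s = 0.025
messages_per_sample = 3
interval_s = 1.0

period_s = interval_s + messages_per_sample * overhead_s  # 1.075 s per sample
total_s = 1000 * period_s                                 # 1075 s for 1000 samples
print(period_s, total_s)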

Message 4 of 7

I can always calculate the extra time and set the reading interval slightly lower than 1000 if I want 1 value per second.
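A sketch of that correction using the numbers from this thread (assuming the extra time really is a constant per-sample overhead, which isn't guaranteed):

# Observed: 1000 samples at a 1000 ms setting took ~1075 s.
measured_total_s = 1075.0
n_samples = 1000
nominal_interval_ms = 1000.0

# Average overhead per sample beyond the nominal interval (~75 ms here).
overhead_ms = measured_total_s * 1000.0 / n_samples - nominal_interval_ms

# Ask the device for a slightly shorter interval to land on 1 s per sample.
corrected_interval_ms = nominal_interval_ms - overhead_ms
print(corrected_interval_ms)  # -> 925.0

That only holds on average; if the overhead varies, the sample spacing will still wander around 1 second.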

Message 5 of 7

I added a wait (1000 ms) and changed the reading interval to 100, and it acquired ~1000 data points in ~1000 seconds. Not too sure why this worked, but thank you for the help.

Message 6 of 7

It worked because the time it took to collect and read the results was less than 1000 ms. That means the Wait VI (set at 1000 ms) was providing all the loop timing. This will normally be pretty stable, but there are no guarantees without an RT OS.
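A Python analogue of why the Wait dominates (read_and_log() is a stand-in for the ~100 ms-interval acquisition; the LabVIEW behavior being modeled is that the Wait executes in parallel with the rest of the loop body, so each iteration takes roughly the longer of the two):

import time
from concurrent.futures import ThreadPoolExecutor

def read_and_log():
    # Stand-in for the acquisition; assume it finishes well under 1 s.
    time.sleep(0.2)
    return 23.4

WAIT_S = 1.0  # the 1000 ms Wait
samples = []

with ThreadPoolExecutor(max_workers=1) as pool:
    for _ in range(5):
        t0 = time.monotonic()
        work = pool.submit(read_and_log)  # loop body, running in parallel
        time.sleep(WAIT_S)                # the Wait, alongside it
        samples.append(work.result())    # iteration ends when both are done
        print(f"iteration: {time.monotonic() - t0:.3f} s")  # ~max(0.2, 1.0)

As long as the work stays under the Wait, the period is just the Wait, plus the small OS scheduling jitter mentioned above.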

Message 7 of 7