
Time Stamp Issue When Running in subVI vs In Main Loop

Solved!

I have code capturing some input signals using a real-time controller in a PXI rack. When I run the acquisition code as a subVI, the timing does not track the 2 msec rate I want exactly; when I run the same code in the main loop, I get exactly a 2 msec rate. Can someone explain why this might be? Does the call to the subVI sometimes introduce a delay? I am also not sure whether this is affecting my data: if a step in the time base was missed, will a data point be missing as well? TDMS files showing the timing in the subVI and in the main loop are attached.

Message 1 of 7

Hi Chuck,

 


Chuck@Eaton wrote:

When I run my code as a subVI the timing does not track exactly to the 2msec rate that I want.   when I run the code in the Main Loop, I get exactly a 2 msec rate.


No, you don't get "exactly 2 ms in the MAIN loop", as can be seen from your own TDMS data!
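One quick way to check the actual rate is to look at the differences between consecutive timestamps in the saved data. A minimal sketch in Python (not LabVIEW; the timestamp values below are made up for illustration, since the real ones would come from the attached TDMS file):

```python
# Made-up timestamps in seconds, standing in for the time channel of the
# attached TDMS file (the real values would be read from that file).
timestamps = [0.000, 0.002, 0.004, 0.0061, 0.008, 0.010]

# Period between consecutive samples
periods = [b - a for a, b in zip(timestamps, timestamps[1:])]

target = 0.002  # the intended 2 ms loop period
worst_jitter = max(abs(p - target) for p in periods)

print(f"mean period:  {sum(periods) / len(periods) * 1000:.4f} ms")
print(f"worst jitter: {worst_jitter * 1e6:.0f} us")
```

If the worst-case jitter is nonzero even for the "main loop" file, the loop was never running at exactly the target rate to begin with.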

 


Chuck@Eaton wrote:

Does the call to the SubVi cause a delay sometimes? 


How should we know when you don't attach any code?

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 2 of 7

Sorry, I meant 1 msec. Regardless, is there any reason code running in a subVI, versus in the main loop, would have a varying time base?

Message 3 of 7

If you can't or won't post code, it is really hard to help with any degree of certainty, but in general the overhead of calling a subVI should be very low, i.e. at most single-digit microseconds per call.
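For a rough feel for what fixed per-call overhead looks like, here is a sketch in CPython rather than LabVIEW (the absolute numbers don't carry over to an RT target; it only illustrates that a call costs a small, roughly constant amount per invocation):

```python
import timeit

def work(x):
    # trivial body, so the measurement is dominated by call overhead
    return x + 1

N = 1_000_000

# total seconds for N calls vs. N evaluations of the inlined expression
call_total = timeit.timeit("work(1)", globals=globals(), number=N)
inline_total = timeit.timeit("x + 1", setup="x = 1", number=N)

# with N = 1e6, (total seconds) equals (microseconds per iteration)
print(f"per-call cost:   {call_total:.3f} us")
print(f"inlined cost:    {inline_total:.3f} us")
print(f"call overhead: ~ {call_total - inline_total:.3f} us per call")
```

The point is that this overhead is constant per call; it would shift all timestamps slightly, but it would not by itself make the loop period vary from iteration to iteration.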

 

You could try setting your subVI to be inlined if it's that important, assuming nothing in the subVI prevents it from being inlined.

 

Have a look at the "SubVI Overhead" section of this page:

 

https://www.ni.com/docs/en-US/bundle/labview/page/lvconcepts/vi_execution_speed.html#subvi_overhead

Message 4 of 7
Solution
Accepted by topic author Chuck@Eaton

If you're running in LabVIEW Real-Time you should be able to get very repeatable timing, but without seeing the code we can't begin to debug. I'd make sure all of your loop priorities are set correctly, so that nothing is being preempted.

 

And if you're using DAQmx, use hardware timing instead of software timing and you'll get very good timing data.
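The software-timed half of that distinction can be sketched in plain Python (again, not LabVIEW RT): a software-timed loop depends on the OS scheduler releasing it, so its period jitters, whereas with hardware timing the sample spacing comes from the DAQ device's own sample clock and is exact by construction.

```python
import time

TARGET = 0.001  # intended 1 ms loop period
stamps = []

# Software-timed loop: sleep() hands control to the OS scheduler, so each
# period is at least TARGET but overshoots by an unpredictable amount.
for _ in range(20):
    stamps.append(time.perf_counter())
    time.sleep(TARGET)

periods = [b - a for a, b in zip(stamps, stamps[1:])]
print(f"mean period:  {sum(periods) / len(periods) * 1000:.3f} ms")
print(f"worst period: {max(periods) * 1000:.3f} ms")
```

On a desktop OS the worst period is typically well above 1 ms; an RT target with correct loop priorities does much better, and a hardware sample clock removes the scheduler from the picture entirely.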

Message 5 of 7

Thanks Bert. I will look at the loop priorities. I will also look at the code (I didn't write it), but I don't know what would affect the timing in a subVI versus the main loop. DAQmx is not being used, as all data is stored on the rack computer; after the test runs, the data is transferred to the PC.

Message 6 of 7

Hi Bert,

 

The subVI prioritization was exactly the problem. Thanks for your help!

Message 7 of 7