Unit Test Framework doesn't set the right inputs


I'm trying to set up a unit test with LabVIEW's "Unit Test Framework". I have a typedef cluster as input; one of its parameters is a double-precision float value called "time". The problem is that LabVIEW doesn't take the value I'm setting in the test case, but uses a default value of 2 instead. How can I make LabVIEW use the value defined in the test case?


Message 1 of 3



It's quite suspicious to me that the parameter is called "InputTime" in the test and "time" in the VI.


Other than that, it's hard to say without being able to interact with the VI.

Message 2 of 3

"InputTime" is just the name of an indicator I created to get the value for the screenshot; it is not influencing the test case. Is it possible that a default value set in the input typedef can cause such behaviour? Is the test framework unable to overwrite the default value?
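To make the suspicion concrete, here is a minimal sketch in Python (LabVIEW itself is graphical, so this is only an analogy; the class, function, and field names are made up, not any LabVIEW or Unit Test Framework API). It shows how a default baked into a type definition can silently win over a test-supplied value if the harness builds its input from the type's defaults instead of the test vector:

```python
from dataclasses import dataclass


@dataclass
class InputCluster:
    """Stand-in for the typedef cluster; the default mirrors the stuck value of 2."""
    time: float = 2.0


def vi_under_test(inputs: InputCluster) -> float:
    # Stand-in for the VI: just scale the time input so the effect is visible.
    return inputs.time * 10


# The test case specifies time = 5.0 ...
test_vector_time = 5.0

# ... but a harness that constructs the input from the typedef's defaults
# ignores the test vector, and the default of 2 shows up in the result:
buggy_input = InputCluster()
print(vi_under_test(buggy_input))      # 20.0 -- default wins

# A harness that applies the test vector produces the expected result:
correct_input = InputCluster(time=test_vector_time)
print(vi_under_test(correct_input))    # 50.0 -- test value wins
```

If the framework behaves like the "buggy" path above, the typedef's default would explain why 2 keeps appearing regardless of the test case.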

Message 3 of 3