Multifunction DAQ


DAQ Recommended warm-up time

Solved!

I am using the PCIe-6321 and was wondering if anyone has additional details on the recommended warm-up time specification.

  1. When does the 15 min timer start? (task creation, task start, computer power)
  2. What is the maximum signal error if the warm-up time is not allowed to elapse before reading data? (The room temperature doesn't drift more than ±5 °F.)

Thank you in advance.

Message 1 of 7

Is there a reason why you cannot wait the specified warm-up time? 

The reason for the warm-up time is so the device can reach a stable temperature. Once the device has reached that stable temperature, it will operate within its specified accuracy. The warm-up time starts after the device has been powered on (see the Calibration section on the following page: http://www.ni.com/product-documentation/53090/en/).

Message 2 of 7

The reason for the fast start requirement is manufacturing cycle-time concerns.

 

My thoughts are this:

  1. Run self-calibration and store the card temperature at that time.
  2. On software startup, compare the current temperature to the self-calibration temperature.
  3. Re-run self-calibration if the temperature difference is > 1 degree.
  4. Re-check every hour for temperature drift?
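The steps above can be sketched as a small drift monitor. This is a minimal sketch, not NI's implementation: `read_device_temp_c` and `run_self_calibration` are hypothetical stubs standing in for the real DAQmx calls (e.g. a self-calibration routine and a device temperature query) for your hardware.

```python
# Hypothetical hardware hooks -- replace with real DAQmx calls for your device.
def read_device_temp_c():
    """Return the card's onboard temperature in degrees C (stub)."""
    return 36.5

def run_self_calibration():
    """Trigger the device's self-calibration routine (stub)."""
    pass

class DriftMonitor:
    """Re-runs self-calibration when the card temperature drifts more
    than `threshold_c` from the temperature at the last calibration."""

    def __init__(self, read_temp, self_cal, threshold_c=1.0):
        self.read_temp = read_temp
        self.self_cal = self_cal
        self.threshold_c = threshold_c
        self.last_cal_temp = None

    def calibrate(self):
        # Step 1: self-calibrate, then store the temperature at that time.
        self.self_cal()
        self.last_cal_temp = self.read_temp()

    def check(self):
        """Steps 2-3: compare the current temperature to the stored
        calibration temperature and re-calibrate if the drift exceeds
        the threshold. Returns True if a self-calibration was run."""
        current = self.read_temp()
        if (self.last_cal_temp is None
                or abs(current - self.last_cal_temp) > self.threshold_c):
            self.calibrate()
            return True
        return False
```

Step 4 would then be a periodic call to `check()`, e.g. from an hourly timer loop (`time.sleep(3600)` between checks).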
Message 3 of 7

Exactly how precise do your measurements need to be?

 

Will being one degree high or one degree low drastically affect the integrity of your measurements? Does the system turn on and off regularly? Does it turn off long enough that the card will need to warm back up? Is your environment's temperature rising or falling quickly? I don't feel that a 1-degree change in card temperature is going to change your end measurement a significant amount, unless you are doing ultra-precise measurements, in which case the specific card you have might not be the best fit.

Message 4 of 7

As accurate as possible. I need to be able to quantify the integrity of the measurements. I can implement software checks to verify accurate measurements as I proposed previously.

 

Message 5 of 7
Solution
Accepted by topic author jnj_nross

Understood! Do you have a formula, function, or algorithm that uses the card temperature to quantify the integrity of the measurement? If so, logging the card temperature with some kind of property node in LabVIEW during each measurement might be your best bet (You can use the System Configuration API). You can use the temperature and whatever other factors you are monitoring to give an integrity rating of your measurement or something similar.
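The suggested logging can be sketched in Python rather than LabVIEW. This is a hedged sketch only: `read_measurement` and `read_device_temp_c` are hypothetical stubs standing in for a real DAQmx read and a System Configuration API temperature query.

```python
import csv
import time

# Hypothetical hardware hooks -- swap in a real DAQmx read and a
# System Configuration API temperature query for your hardware.
def read_measurement():
    return 1.234   # stub voltage reading

def read_device_temp_c():
    return 36.5    # stub card temperature

def log_measurement(writer):
    """Record one measurement together with the card temperature, so the
    temperature can feed an integrity/accuracy estimate afterwards."""
    row = {
        "timestamp": time.time(),
        "value": read_measurement(),
        "card_temp_c": read_device_temp_c(),
    }
    writer.writerow(row)
    return row

with open("measurements.csv", "w", newline="") as f:
    writer = csv.DictWriter(
        f, fieldnames=["timestamp", "value", "card_temp_c"])
    writer.writeheader()
    row = log_measurement(writer)
```

Keeping the card temperature in the same row as each reading means the integrity rating can be computed offline, after the fact, from the logged file.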

Message 6 of 7

Page 4 of the manual has the formula that I used to quantify the error.

http://www.ni.com/pdf/manuals/374461b.pdf
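For readers without the PDF handy: the X Series specifications express absolute accuracy as Reading × GainError + Range × OffsetError + NoiseUncertainty, where the gain and offset errors each include temperature-coefficient terms that scale with the temperature change since the last internal (self) and external calibrations. A minimal sketch of that shape follows; the coefficient values below are illustrative placeholders, not the PCIe-6321's actual spec-table numbers, which should be taken from the linked manual.

```python
import math

# ILLUSTRATIVE placeholder coefficients -- not the PCIe-6321's actual
# spec-table values; substitute the numbers from the device manual.
ILLUSTRATIVE_COEFFS = {
    "residual_gain_ppm": 60.0,
    "gain_tempco_ppm_per_c": 10.0,
    "reference_tempco_ppm_per_c": 1.0,
    "residual_offset_ppm": 20.0,
    "offset_tempco_ppm_per_c": 10.0,
    "inl_ppm": 60.0,
    "random_noise_uvrms": 200.0,
}

def absolute_accuracy_v(reading_v, range_v, c,
                        dtemp_internal_c, dtemp_external_c,
                        averaged_points=100):
    """Absolute accuracy (volts) in the X Series form:
    Reading*GainError + Range*OffsetError + NoiseUncertainty, with
    tempco terms scaled by the temperature change since the last
    internal (self) and external calibrations."""
    gain_ppm = (c["residual_gain_ppm"]
                + c["gain_tempco_ppm_per_c"] * dtemp_internal_c
                + c["reference_tempco_ppm_per_c"] * dtemp_external_c)
    offset_ppm = (c["residual_offset_ppm"]
                  + c["offset_tempco_ppm_per_c"] * dtemp_internal_c
                  + c["inl_ppm"])
    # 3-sigma noise uncertainty, reduced by averaging N points.
    noise_v = c["random_noise_uvrms"] * 1e-6 * 3 / math.sqrt(averaged_points)
    return (abs(reading_v) * gain_ppm * 1e-6
            + range_v * offset_ppm * 1e-6
            + noise_v)
```

Evaluating this at increasing `dtemp_internal_c` shows directly how much error each degree of drift since the last self-calibration adds, which is exactly the quantity at stake when the warm-up time is cut short.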

 

Thank you for everyone's help.

Message 7 of 7