When you checked with MAX, were the settings for sampling, etc. the same as in your program? There really shouldn't be any difference if the settings are the same.

As for measuring with a meter: meters have very slow response times compared with most data acquisition hardware, so they effectively average out noise unless it is really big and really slow. Most, if not all, measurement fluctuations like you mention are traceable to noise on the input signal.

The difference between MAX and a program may be in the sampling rate, the gain settings (full-scale definition), or whether autoscaling is turned on in one or the other. I've seen the reverse, where I saw what looked like really big "noise" swings in MAX, only to realize that I had autoscaling turned on: the graph's scale was from 4.9 to 5.1, and what looked like a big swing was a few mV. The same can happen if you are looking at the output of your DAQ on a graph, so be sure to check how the scaling is set.

If the sampling rate for the DAQ is much higher than in MAX, MAX may effectively "average" the readings, since it doesn't see a rapidly changing value, whereas the DAQ, sampling at a higher frequency, may catch every noise fluctuation.
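To illustrate the averaging effect, here's a minimal Python sketch (not LabVIEW code): it simulates a steady level with added noise, then compares the swing seen by a fast acquisition reporting every raw sample against a slow "meter-like" reading that averages blocks of samples. The 5 V level, 50 mV noise, and sample counts are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate a steady 5 V signal with ~50 mV RMS of input noise
# (both values are illustrative assumptions).
true_level = 5.0
noise_rms = 0.05

def read_samples(n):
    """Simulate n raw readings of the noisy signal."""
    return true_level + rng.normal(0.0, noise_rms, size=n)

# A fast DAQ reporting every raw sample shows the full noise swing...
raw = read_samples(1000)
print(f"raw samples:      min={raw.min():.4f} V, max={raw.max():.4f} V")

# ...while a slow meter (or a lower-rate read) effectively averages
# blocks of samples, shrinking the apparent swing by roughly sqrt(N).
averaged = read_samples(1000).reshape(-1, 100).mean(axis=1)
print(f"100-sample means: min={averaged.min():.4f} V, max={averaged.max():.4f} V")
```

The block-averaged readings fluctuate about a tenth as much as the raw samples, which is why a meter (or a slower acquisition) can look "quiet" while a high-rate DAQ capture looks noisy on the same signal.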
P.M. Putnam
Certified LabVIEW Developer
Senior Test Engineer, North Shore Technology, Inc.
Currently using LV 2012-LabVIEW 2018, RT8.5

LabVIEW Champion