07-07-2006 03:12 PM
Hi,
I'm using LabVIEW drivers I downloaded from the ni.com website to control an Agilent 34401A digital multimeter. All I'm trying to do right now is read min/max/avg measurements after putting the device into DC current mode, configuring it for min/max measurements, and then pausing for a couple of seconds. The program successfully puts the multimeter into the correct modes, but it always returns 0 for each measurement. Could you take a look at the program and see if you can find a simple mistake on my part? It should be attached to this message. Thanks very much.
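[For readers without the attachment: the steps described above correspond roughly to the SCPI sequence sketched below. It is shown in Python with PyVISA purely for illustration, since the actual program is a LabVIEW VI; the GPIB address and the exact command spellings are assumptions to be checked against the 34401A programming manual.]

# Rough SCPI equivalent of the steps described above (not the attached VI).
# The instrument address and command forms are assumptions.
import time
import pyvisa

rm = pyvisa.ResourceManager()
dmm = rm.open_resource("GPIB0::22::INSTR")   # assumed GPIB address

dmm.write("*RST")
dmm.write("CONF:CURR:DC")      # put the meter into DC current mode
dmm.write("CALC:FUNC AVER")    # select the min/max/average math function
dmm.write("CALC:STAT ON")      # enable the math function

time.sleep(2)                  # pause a couple of seconds, as described

# Query the accumulated statistics
print(dmm.query("CALC:AVER:MIN?"))
print(dmm.query("CALC:AVER:MAX?"))
print(dmm.query("CALC:AVER:AVER?"))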
07-07-2006 03:50 PM - edited 07-07-2006 03:50 PM
I don't have the instrument or manual handy, but is it possible that the instrument is waiting for a series of triggers before it actually takes a series of measurements? Do you see the display changing? You can try setting a breakpoint at the min/max function and manually triggering the instrument to see if that makes a difference. Also, you don't need the sequence structure at all. Most of the time, sequence structures just serve to make your programs harder to read and debug. Use dataflow to control execution order, as in the picture I've attached.
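[If the meter is indeed sitting idle waiting for triggers, the corresponding fix at the SCPI level is to start a burst of readings before querying the statistics. A minimal sketch along those lines, again in Python/PyVISA, with the address, trigger source, and sample count as assumed values to verify against the manual:]

# Sketch of triggering readings so the min/max/avg statistics have data
# to accumulate. Command forms and counts are assumptions.
import time
import pyvisa

rm = pyvisa.ResourceManager()
dmm = rm.open_resource("GPIB0::22::INSTR")   # assumed GPIB address

dmm.write("*RST")
dmm.write("CONF:CURR:DC")      # DC current mode
dmm.write("CALC:FUNC AVER")    # min/max/average math function
dmm.write("CALC:STAT ON")

dmm.write("TRIG:SOUR IMM")     # trigger immediately once initiated
dmm.write("SAMP:COUN 10")      # assumed: 10 samples per trigger
dmm.write("INIT")              # start taking the measurements

time.sleep(2)                  # let the readings complete

print("min:", dmm.query("CALC:AVER:MIN?"))
print("max:", dmm.query("CALC:AVER:MAX?"))
print("avg:", dmm.query("CALC:AVER:AVER?"))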
Message Edited by Dennis Knutson on 07-07-2006 02:51 PM