Instrument Control (GPIB, Serial, VISA, IVI)


USB-GPIB-HS readings from Agilent 34401A furiously slow

Solved!

Hi everyone,

 

I'm trying to get decently fast DC voltage measurements over GPIB with an Agilent 34401A. I've been trying different commands from the manual for about 3 hours now. The commands all work fine, but nothing has improved my reading time, which is stuck at about 343 ms per sample.

 

Here's the manual I'm using: HP 34401A User's Guide (purdue.edu)

 

and I'll paste my Python code below as well. As far as I can tell from the manual, the READ? command is the fastest way to get an instantaneous reading, but it's still infuriatingly slow.

 

Any idea how to speed up my sample rate?

 

Thanks in advance!

 

 

# Python code starts here

import pyvisa as visa
import time

rm = visa.ResourceManager()
# open_resource() replaces the long-deprecated get_instrument()
agilent = rm.open_resource("GPIB0::18")
print(agilent.query('*IDN?'))
agilent.write('DISP OFF')  # blanking the display shaves a little time per reading
while True:
    before = time.perf_counter()
    agilent.query('READ?')  # trigger and fetch a single reading
    after = time.perf_counter()
    print(after - before)
Message 1 of 4
Solution
Accepted by topic author EvanReeds

You may have read the manual for 3 hours, but you still missed this highlighted part on p. 55 of the manual.

[Screenshot: manual table on p. 55 relating resolution to integration time in PLC]

The resolution setting of the DMM determines how long it takes to measure the signal; measurement time translates directly into resolution. Your Python code never configures the resolution, so the default setting of 10 PLC applies. PLC means Power Line Cycles: 1 PLC is the period of one cycle of your mains AC supply, which depending on your region is 20 ms (50 Hz) or 16.67 ms (60 Hz). 10 PLC therefore translates to roughly 167 to 200 ms of measurement time per reading.
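As a back-of-the-envelope check (my own arithmetic, not a table from the manual), the integration window implied by an NPLC setting is just NPLC divided by the mains frequency:

```python
def integration_time_ms(nplc, line_freq_hz):
    """A/D integration time in ms for a given NPLC setting.

    This is only the integration window; trigger delay, autozero and
    GPIB overhead add more time on top of it.
    """
    return nplc / line_freq_hz * 1000.0

print(integration_time_ms(10, 60))    # default 10 PLC at 60 Hz mains: ~166.7 ms
print(integration_time_ms(10, 50))    # default 10 PLC at 50 Hz mains: 200 ms
print(integration_time_ms(0.02, 60))  # fastest NPLC setting: well under 1 ms
```

That alone accounts for more than half of the 343 ms you are seeing.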

Another thing to look at is Auto Zero: it precedes the measurement phase and also adds time. The default setting is AutoZero ON.
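If you want to turn both knobs explicitly, the 34401A's SCPI subsystem exposes them directly (command names as I recall them from the 34401A manual; verify the exact syntax against the command reference chapter of your copy):

```
VOLT:DC:NPLC 1    -- integrate for 1 power-line cycle instead of the default 10
ZERO:AUTO OFF     -- skip the autozero phase before each reading
```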

[Screenshot: manual section describing the AutoZero setting]

 

The manual does not give a specific readings-per-second figure for the default configuration with AutoZero ON, but even with it OFF the best the instrument can do at that resolution is only 6 per second.

[Screenshot: manual table of maximum readings per second versus configuration]

 

 

At 343 ms per reading you are getting almost 3 readings per second, which I think is acceptable given that you did not optimize your settings for speed; you took the default settings and started automating with Python.

 

-Santhosh
Semiconductor Validation & Production Test
Soliton Technologies
NI CLD, CTD
LabVIEW + TestStand + TestStand Semiconductor Module (2013 - 2020)
NI STS for Mixed signal and RF
Message 2 of 4

Chapter 8 of your manual, starting on page 217, shows the settings to use to get the speed you want out of the hardware.

  • Choose a fixed Range instead of Autorange for quicker results
  • Use lower resolution for quicker measurements
  • ...
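Putting those bullets together, a small helper along these lines could be dropped into the Python script (my own sketch, not from the manual; check the SCPI strings against the command reference before trusting them):

```python
def configure_for_speed(dmm, vrange=10, resolution=0.001):
    """Configure a 34401A-style DMM for fast DC voltage readings.

    dmm is any object with a pyvisa-like write() method.
    """
    dmm.write('DISP OFF')                             # display updates cost time
    dmm.write(f'CONF:VOLT:DC {vrange},{resolution}')  # fixed range, low resolution
    dmm.write('ZERO:AUTO OFF')                        # skip autozero per reading

# The helper only issues writes, so it can be exercised without hardware:
class FakeDMM:
    def __init__(self):
        self.sent = []
    def write(self, cmd):
        self.sent.append(cmd)

fake = FakeDMM()
configure_for_speed(fake)
print(fake.sent)  # -> ['DISP OFF', 'CONF:VOLT:DC 10,0.001', 'ZERO:AUTO OFF']
```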
Message 3 of 4

Thank you for the thorough response! It was definitely the resolution slowing me down. I'll paste my updated code below for anyone who runs into the same roadblock. The command I added to decrease the measurement time is 'CONF:VOLT:DC 10,0.001', which configures the DMM for DC voltage readings in the 10 V range with 4.5-digit resolution. Each reading now takes about 7 ms.

 

# Python code starts here

import pyvisa as visa
import time

rm = visa.ResourceManager()
# open_resource() replaces the long-deprecated get_instrument()
agilent = rm.open_resource("GPIB0::18")
print(agilent.query('*IDN?'))
agilent.write('DISP OFF')  # blanking the display shaves a little time per reading
agilent.write('CONF:VOLT:DC 10,0.001')  # 10 V range, 4.5-digit resolution
while True:
    before = time.perf_counter()
    agilent.query('READ?')  # trigger and fetch a single reading
    after = time.perf_counter()
    print(after - before)
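If the per-query GPIB overhead ever becomes the bottleneck, the 34401A can also take a burst of samples per trigger and return them all in one READ? round trip (SAMP:COUN and TRIG:COUN are described in the manual's triggering chapter; treat this as an untested-on-hardware sketch):

```python
def read_burst(dmm, n=100):
    """Take n readings in one READ? round trip and return them as floats.

    dmm is any object with pyvisa-like write()/query() methods.
    """
    dmm.write(f'SAMP:COUN {n}')   # n samples per trigger event
    dmm.write('TRIG:COUN 1')      # a single trigger event
    raw = dmm.query('READ?')      # comma-separated ASCII readings
    return [float(x) for x in raw.split(',')]

class FakeDMM:  # stand-in so the parsing can be checked without a GPIB bus
    def write(self, cmd):
        pass
    def query(self, cmd):
        return '+1.0E+00,+2.0E+00,+3.0E+00'

print(read_burst(FakeDMM(), 3))  # -> [1.0, 2.0, 3.0]
```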
Message 4 of 4