Increasing reading rate of 34401a

Hi,

 

I am working on characterizing a 12-bit ADC in terms of DNL/INL by looking at the transition voltages. For this purpose, I am using a 16-bit DAC (NI 9263) as the input to the DUT. I have two questions.

First, I am using Read Multiple Points.vi to read 5 samples for each applied DAC code and average them because of the noise of the DAC. However, the VI reads that many samples very slowly. I tested the Read VI in a simple program (it only reads data in a while loop, without any delay) and it reads 50 samples in about 8 seconds. Since I need to read the ADC input for each of the 2^16 DAC codes, this takes a very long time. That VI uses the Immediate Trigger option, by the way.

Second, I tried applying an external trigger to the DMM, which seems to be faster, but with that trigger option I want to trigger and read the data at specific times. In other words, the DMM should wait for my signal before reading the ADC input voltage for each DAC code, and this should happen continuously. I have looked at the LabVIEW examples on external and software triggering, but both take all the data at once, so I would need to re-initiate the Read task each time. Is there a better way to solve this problem? For GPIB, I am using NI's GPIB-USB-HS adapter.

 

Thanks,

 

Ouz

NSC

Message 1 of 8

Ouz, I'm a bit confused by your question. Is the DMM you are using an Agilent 34401A or something else? What version of LabVIEW are you using? Can you post your code?

 

If you are using the Agilent 34401A, use the low-level driver to create a simple VI in this manner:

Initialize.vi (with the proper VISA resource name) ... Configure Measurement.vi (use an appropriate manual range and shut off autorange) ... Read (Multiple Points).vi (with the sample count set to 5) ... Close.vi.

No need for a loop at this time. This will yield an array of 5 elements (dbl) which you can average. Is this fast enough?
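If it helps to see what that VI chain actually sends to the instrument, here is a rough text-based equivalent using PyVISA and the 34401A's standard SCPI commands. This is only a sketch: the GPIB address and the 10 V range are placeholders, not values from the original post, and it is not meant to replace the LabVIEW driver VIs.

    import pyvisa

    rm = pyvisa.ResourceManager()
    dmm = rm.open_resource("GPIB0::22::INSTR")   # Initialize.vi: open the VISA session (address 22 is only an example)

    dmm.write("*RST")                            # start from a known state
    dmm.write("CONF:VOLT:DC 10,0.001")           # Configure Measurement.vi: fixed 10 V range, so autorange is off
    dmm.write("SAMP:COUN 5")                     # Read (Multiple Points).vi: 5 samples per trigger
    dmm.write("TRIG:SOUR IMM")                   # immediate trigger, as in the simple example

    readings = dmm.query_ascii_values("READ?")   # trigger and fetch the 5 readings as floats
    average = sum(readings) / len(readings)      # average them in software

    dmm.close()                                  # Close.vi

The sequence maps one-to-one onto the driver VIs above.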

Now Using LabVIEW 2019SP1 and TestStand 2019
Message 2 of 8

Hi GovBob,

 

Thanks for your reply. I am using an Agilent 34401A as the DMM and LabVIEW 2009. Right now I am at home, but I have tested the example under Hardware>>Instruments in the LabVIEW examples, which does

 

" Initialize.vi (with proper VISA resource name)...Configure measurement.vi (use appropriate manual range and shut off auto range)...Read (Multiple Points).vi (with the sample count set to 5)....Close.vi.  "

 

As you said, there is no need for a loop for this. But the reading rate is too slow: I read 50 samples in 8 seconds. (I used 50 samples to make the timing easier and assumed a linear relation between the number of samples and the reading time. By the way, I did not do any averaging in that program; I am only interested in the reading rate.)

 

All of my code is on my laptop at work, so I can't post it until Monday. In my final program, I want to read multiple samples for each analog voltage applied to the input of the 12-bit ADC under test. However, this takes a lot of time since I am applying the input to the ADC with a 16-bit DAC (NI 9263). I need to finish the reading process in about 10 ms for each applied DAC code.

 

I hope that makes it clearer 😉

 

Thanks,

 

Ouz

Message 3 of 8

If memory serves, the instrument is not capable of speeds that fast, so on Monday, read the manual. You will certainly want to turn autozero and autorange off. Reducing the resolution may also help.
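For reference, those settings correspond to a handful of SCPI commands on the 34401A. A minimal sketch, continuing the hypothetical PyVISA session dmm from the earlier example (the 10 V range is again only an illustration):

    dmm.write("ZERO:AUTO OFF")          # turn autozero off (autozero roughly doubles each measurement)
    dmm.write("VOLT:DC:RANG:AUTO OFF")  # turn autorange off
    dmm.write("VOLT:DC:RANG 10")        # fix the range manually
    dmm.write("VOLT:DC:NPLC 0.2")       # shorter integration time: lower resolution, faster readings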

 

I don't think this will help with your sync problems, though. Even with you providing a trigger to the meter, 10 ms is too fast to acquire and transfer the data. I strongly suggest using something more appropriate, such as an NI DAQ card.

Message 4 of 8

Along with Dennis, I believe that you are using the wrong hardware. A 10 ms reading interval at >16-bit resolution is just not going to happen easily (and not with the 34401A).

 

Generally, we can discuss a few classes of instruments that make DC voltage readings: O'scopes (or digitizers), DMMs, and direct digitizing multimeters (DDMMs).

 

Modern digital O'scopes generally have high-speed 8-bit digitizers and can operate at very high sample rates. In case you wonder, 8 bits is about 0.4% of full scale, roughly as fine as the human eye can distinguish on the display on the front of the scope. Scopes do not need greater resolution and sacrifice it for speed and bandwidth.

 

DMMs achieve the opposite effect, increasing resolution by either averaging readings (like a DDMM) or increasing the measurement time to let the inputs to the ADC settle. The exact architecture of the ADC has some effect, but you still need to sacrifice speed for accuracy.

 

What it appears you need is the speed of a digitizer and the accuracy of a DMM. This requires a "precision" approach. One such approach is to prescale the input by adding a precision DC offset to the signal, digitizing at very high gain (say a ±10 mV range), and re-adding the offset after the measurement. This requires a precision, calibrated, programmable custom offset generator.
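To make that arithmetic concrete (the numbers here are purely illustrative): if the offset generator is set to V_offset = 5.000 V and the digitizer reads V_meas = +3.2 mV on its ±10 mV range, the reconstructed input is V_in = V_offset + V_meas = 5.0032 V. The error budget is then dominated by the offset generator's calibration, since the digitizer only has to resolve the small residual on its 10 mV span.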

 

Another method is to define a different measurement approach altogether. Rather than depending on the primary count of your ADC, it is sometimes possible to measure the reaction of your system when a specific ADC value is present for a time.

 

And with modern FPGAs it is possible to design custom hardware solutions that may meet your needs with custom IP.


"Should be" isn't "Is" -Jay
Message 5 of 8

Hi,

 

Thanks for your replies, Dennis and Jeff. As I understand it, it is not easy to get the required precision at the required reading rate. I am wondering whether you know the usual (or maximum) reading rate I can achieve with the 34401A. I have read the 34401A manual and it says 1000 readings/s direct to GPIB, which I couldn't achieve.

 

Thanks,

 

Ouz.

Message 6 of 8

I don't have the manual handy, so I can't confirm that rate. If you can set it up manually for a scan that fast, you can certainly do it programmatically. Always try a manual setup first, just to give you an idea of whether it's possible and what you might have to do in your program.

 

You can do the timing yourself for a single read. Use a sequence structure where, in the first frame, you get the tick count. In the second frame, do a read. In the third frame, get the tick count again and subtract the first frame's value from the third frame's result.
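The same tick-count idea can be sketched in text form, assuming a PyVISA session dmm like the one earlier in the thread; this only illustrates the timing pattern, not the LabVIEW code itself:

    import time

    t0 = time.perf_counter()            # first frame: get tick count
    value = float(dmm.query("READ?"))   # second frame: do one read
    t1 = time.perf_counter()            # third frame: get tick count again and subtract
    print("one reading took %.1f ms" % ((t1 - t0) * 1000.0))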

 

 

Message 7 of 8

Have a look at page 57 of the user manual.

You need to change the integration time from its default of 10 power line cycles, which limits you to about 6 readings per second (for 60 Hz power).

To get to 1000 readings per second you would have to set the parameter to 0.02, but then the instrument's resolution drops to only 4 digits.

To get your required resolution, assuming that you can maximize the range, you would need the next step up, 0.2 NPLC (number of power line cycles).

The reading rate for that is 300 Hz (again for 60 Hz line power) and should be OK for verifying linearity on a 16-bit DAC if you can go close to full range.
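As a quick sanity check on those numbers (60 Hz line frequency assumed, and the PyVISA call is only an illustrative way of sending the command, continuing the hypothetical dmm session from earlier):

    10   NPLC (default) -> 10/60 s   = about 167 ms of integration -> roughly 6 readings/s
    0.2  NPLC           -> 0.2/60 s  = about 3.3 ms of integration -> up to about 300 readings/s
    0.02 NPLC           -> 0.02/60 s = about 0.33 ms of integration -> up to about 1000 readings/s, at reduced resolution

    dmm.write("VOLT:DC:NPLC 0.2")   # set the integration time to 0.2 power line cycles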

Klaus

 

Message 8 of 8