Automatically selecting the highest precision current range for an SMU

I've been developing a set of LabVIEW VIs for automated characterization of FETs, and I'm having trouble selecting the appropriate current range/limit for this purpose. Part of the problem is that I don't properly understand in detail what the 'Current Limit', 'Current Limit Range', and 'Current Limit Autorange' properties do.

 

Equipment is two PXIe-4138 SMUs, one connected between drain and source and one between gate and source. The FETs my lab produces have on currents ranging from 100 nA to 10 mA, and off currents can get down to 10 pA. I know I can't accurately measure that off current with these SMUs, but I'm trying to get the best measurements I can with what I have. There's also a matrix switch involved, but that's working just fine at the moment.

 

What I want is for the VI to connect to the FETs on a chip sequentially through the matrix switch, check the on current of each FET, select the current range that will give the most accuracy without clipping the signal, and then sweep the gate voltage while measuring drain current at a constant Vds (a rough sketch of this flow is below). As an alternative to the range-selection step, I'd also be happy with something that can dynamically switch between ranges in the middle of a gate sweep, but that seems more complicated, and I might get strange results if my VI switches ranges partway through a sweep. Throughout all of this I want a current limit that protects the FET, which would be well above the ranges I'd need to select for accuracy reasons.
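
To make the intended flow concrete, here's a rough sketch written in Python against the nidcpower driver, since a block diagram won't paste well here and the nidcpower property names appear to mirror the LabVIEW property nodes. The resource name, Vds value, damage threshold, range list, and the pick_range helper are all my own placeholders, so please read this as pseudocode for the logic I'm after, not something I've verified against the hardware.

import nidcpower

# Current ranges I believe the PXIe-4138 offers (double-check against the spec).
CURRENT_RANGES = [1e-6, 10e-6, 100e-6, 1e-3, 10e-3, 100e-3, 1.0]
I_DAMAGE = 10e-3  # hypothetical absolute-maximum drain current for my FETs


def pick_range(i_on):
    """Smallest range that holds the measured on current with ~20% headroom."""
    for r in CURRENT_RANGES:
        if abs(i_on) * 1.2 <= r:
            return r
    return CURRENT_RANGES[-1]


with nidcpower.Session(resource_name='PXI1Slot2') as drain_smu:
    drain_smu.output_function = nidcpower.OutputFunction.DC_VOLTAGE

    # Step 1: coarse measurement of the on current on a wide, safe range.
    drain_smu.current_limit_range = I_DAMAGE
    drain_smu.current_limit = I_DAMAGE
    drain_smu.voltage_level = 1.0  # Vds used for the on-state check (placeholder)
    with drain_smu.initiate():
        i_on = drain_smu.channels['0'].measure(nidcpower.MeasurementTypes.CURRENT)

    # Step 2: switch to the tightest range that still fits the on current.
    best_range = pick_range(i_on)
    drain_smu.current_limit_range = best_range
    drain_smu.current_limit = best_range  # what I'd really like is a protective
                                          # limit well above this range - see the
                                          # questions below

    # Step 3: gate sweep with the second (gate) SMU goes here, measuring drain
    # current at constant Vds on the range chosen above.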

 

My first attempt at solving this was quick and dirty: I set 'Current Limit Autorange' to on and 'Current Limit' to the maximum current the device can survive, hoping the autorange function would do the dynamic range switching I wanted for maximum precision. Evidently that was wrong; I got extremely inaccurate results. In the interim I'm just manually setting both 'Current Limit' and 'Current Limit Range' to 1 uA to get high precision and restarting the VI with a higher limit if the measurement clips (roughly as sketched below).
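
For reference, my interim workaround boils down to the following, again sketched with nidcpower property names as stand-ins for the LabVIEW property nodes (resource name and Vds are placeholders):

import nidcpower

# Interim workaround: pin everything to the 1 uA range and rerun with a
# larger value if the reading clips at the limit.
with nidcpower.Session(resource_name='PXI1Slot2') as drain_smu:
    drain_smu.output_function = nidcpower.OutputFunction.DC_VOLTAGE
    drain_smu.current_limit_autorange = False  # no dynamic range switching
    drain_smu.current_limit_range = 1e-6       # fixes the legal values for the limit
    drain_smu.current_limit = 1e-6             # and, as I understand it, the measurement range
    drain_smu.voltage_level = 1.0              # constant Vds (placeholder value)
    with drain_smu.initiate():
        i_d = drain_smu.channels['0'].measure(nidcpower.MeasurementTypes.CURRENT)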

 

Questions:

  • It seems like dynamically switching ranges for maximum precision is something a lot of people would want to do, but I haven't found an easy solution yet. Is there a clean way of doing this that has escaped my googling? Preferably a method that will also stop the VI from selecting a range/limit that would allow the device to get damaged.
  • The specifications for my SMUs lay out which current ranges are available and the accuracy/resolution associated with each. Which property do I need to set to access these ranges: 'Current Limit', 'Current Limit Range', both, or something else entirely? I've read conflicting things.
  • What exactly do 'Current Limit Autorange', 'Current Limit', and 'Current Limit Range' each do? I understand that 'Current Limit Range' sets the allowable values for 'Current Limit', but not much more than that. When I was digging through the LabVIEW help, I clicked a link that said it would explain how autorange selects a range, but I didn't find that information.

Thanks very much in advance for any help.
