Real-Time Measurement and Control


Current Measurement with a shunt (very noisy)


I am trying to use a shunt to measure current. 


This is the setup:

- cRIO 9045

- Module 9205

- 1 MΩ resistor at the module

- RSB-500-100 shunt


Unfortunately, the raw readings are very noisy. After applying the multiplier to convert the raw voltage into current, I end up with essentially useless data, since it is not accurate.
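For reference, the scaling itself is just a linear multiplier set by the shunt ratio. A minimal Python sketch of that conversion; the 5000 A/V factor matches the ×5000 used later in the thread (consistent with a 500 A / 100 mV shunt, i.e. 500 / 0.1 = 5000), so check it against your shunt's actual rating:

```python
# Convert raw shunt voltage readings (V) into current (A).
# Assumed shunt scaling: 100 mV across the shunt corresponds to 500 A,
# giving 500 / 0.1 = 5000 A per volt (matches the x5000 used in this thread).
SHUNT_GAIN_A_PER_V = 5000.0

def volts_to_amps(raw_volts):
    """Apply the linear shunt scaling to a list of raw voltage samples."""
    return [v * SHUNT_GAIN_A_PER_V for v in raw_volts]

# Example raw readings in volts (the min/mid/max values from this thread)
readings_v = [0.0157, 0.0160, 0.0164]
print(volts_to_amps(readings_v))
```

Any noise on the raw voltage gets multiplied by the same factor, which is why small voltage wobble turns into amps of apparent current change.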


To verify that the hardware setup is correct, I also ran a test with a different DAQ (Keysight). Here is the comparison:



And this is the current:


Does anyone have an idea what might cause this? 





Message 1 of 4



Whilst it isn't very clear, the variability you are seeing in the voltage appears to span a very long time period (hours), so are you sure the current isn't simply varying a little, or that these aren't effects from slowly changing environmental conditions? The DAQ trend without the variability looks to cover a much shorter time scale.

There can be all sorts of reasons why noise / variability can arise - electrical interference, the use of a multiplexed ADC, anti-aliasing issues related to sample time and frequency content, code problems (deterministic vs. non-deterministic execution, buffer overflows, etc.), graphing issues - but most of these tend to happen at high frequency or leave some sort of clue in what the resulting signal looks like.

Eliminate any software causes by looking at the signal in NI MAX.


Have a think about whether the trends you are showing are actually the raw data (sample by sample), what the sample time is compared to the timing of the loops gathering the data (timed loops, or while loops with a wait?), the Scan Interface timing settings, and whether the cRIO is doing anything else (keep CPU load < 80%). Also note that some I/O modules only support prescribed sample rates.

Sorry I can't be more specific - but I hope the above gives you some things to think about.

Consultant Control Engineer
Message 2 of 4

Thanks for the answer.


To make sure that it is not our setup or the LabVIEW VI, I ran a test using a separate cRIO, the 9205 module, and NI MAX - no LabVIEW and nothing else connected.
The cRIO is grounded and the module was self-calibrated through MAX.
Here is a screenshot:

The minimum reading was 15.7 mV and the maximum 16.4 mV. That is a delta of about 0.7 mV.

The load was set to 80 A.
0.0157 V * 5000 = 78.5 A
0.0164 V * 5000 =  82 A
This is too much of a difference. Shouldn't the module in combination with the cRIO be able to do better than that?
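To put numbers on the spread above: the 0.7 mV delta scales to 3.5 A peak-to-peak at the 80 A load, i.e. roughly 4% of reading. A quick Python sketch of that arithmetic, using the values from this post:

```python
# Error arithmetic for the 80 A test described in this thread.
GAIN_A_PER_V = 5000.0            # shunt scaling used above
v_min, v_max = 0.0157, 0.0164    # observed min/max raw readings (V)
load_a = 80.0                    # current the load was set to

i_min = v_min * GAIN_A_PER_V     # ~78.5 A
i_max = v_max * GAIN_A_PER_V     # ~82.0 A
spread_a = i_max - i_min         # ~3.5 A peak-to-peak
error_pct = spread_a / load_a * 100  # ~4.4 % of the 80 A load

print(f"{i_min:.1f} A .. {i_max:.1f} A, spread {spread_a:.1f} A ({error_pct:.1f} %)")
```

Seen this way, a ±0.35 mV wobble on a ~16 mV signal is the whole problem: the shunt gain amplifies it into several amps.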
To compare the results, I ran the same test on a Keysight multimeter as well as on a Keysight DAQ. Both gave me the same good results:


Is there anything we can do? Or is this just the way it is?
Yes, I have tried different cRIOs and different modules.
Message 3 of 4

Turns out that changing the input range limits to +/- 200 mV makes a big difference:
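That fits with the module's resolution. Assuming the 9205 is the 16-bit module with selectable +/- 200 mV / 1 V / 5 V / 10 V input ranges (check the spec sheet for your hardware revision), a ~16 mV shunt signal on the default +/- 10 V range only spans a few dozen ADC codes, while the +/- 200 mV range gives roughly 50x finer steps. A rough sketch of the code-width arithmetic, assuming ideal 16-bit quantization:

```python
# Ideal ADC code width for a bipolar input range with a 16-bit converter.
BITS = 16

def lsb_volts(full_scale_v):
    """Volts per ADC code for a +/- full_scale_v bipolar range (ideal 16-bit)."""
    return 2 * full_scale_v / 2**BITS

for fs in (10.0, 0.2):
    step = lsb_volts(fs)
    # x5000 is the shunt scaling used in this thread
    print(f"+/-{fs} V range: {step * 1e6:.1f} uV per code, "
          f"{step * 5000:.3f} A per code after the x5000 shunt scaling")
```

On the +/- 10 V range one ideal code is already ~0.3 mV (~1.5 A after scaling), so the smaller range is essential for a 100 mV-class shunt signal; the module's noise specs also improve on the smaller ranges.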







Message 4 of 4