I have a B&K microphone with preamplifier from which I need to measure the dB level. I have read several articles/white papers on how to do this, but I'm still hazy on it, and the results of my calculations do not appear to be correct. This is obviously my first stab at sound measurement. The specs of my preamplifier, microphone sensitivity, and DAQ signal input are as follows:
Preamplifier peak output = 7 V
Microphone sensitivity = 49 mV/Pa
From B&K, the "Ref = 1 V/Pa". I'm hazy on this; where does this factor into my calculation?
DAQ input (3rd party, not NI) = ±10 V, 24-bit (other input ranges are available, including ±5 V, ±1.25 V, ±600 mV, etc.). I chose the ±10 V range since the peak output of the preamplifier is 7 V. However, I will never measure levels of this magnitude.
I'm coding the calculation in a text-based language, not LabVIEW. This NI white paper states that P = V (mV) / Sensitivity (mV/Pa). So in my case, is the value for "Sensitivity" just a straight 49 in the denominator, or is it 49/1000? Given the values specified above, can someone write out an example that I can reference, with real numbers in a real calculation? Thank you.
My interpretation would be that the mic/preamp combination gives you a nominal 49 mV of output for every pascal of sound pressure input, i.e. 1 V out of the preamp corresponds to 1 / 49e-3 ≈ 20.4 Pa at the mic. The peak output of 7 V tells you the maximum sound pressure you can measure is 7 / 49e-3 ≈ 142.9 Pa. As for the units question: the sensitivity just has to match your voltage units, so divide millivolts by 49, or volts by 49e-3 (i.e. 49/1000). The "Ref = 1 V/Pa" is presumably just the reference B&K uses when quoting the sensitivity in dB (20 * log10(0.049 / 1) ≈ -26.2 dB re 1 V/Pa); it doesn't enter the SPL calculation directly.
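In code (Python here as an illustrative choice, since you didn't name your language; the constant and function names are mine), the voltage-to-pressure conversion is just a division once the sensitivity is expressed in V/Pa:

```python
# Mic/preamp sensitivity: 49 mV/Pa, written in V/Pa so it divides volts directly
SENSITIVITY_V_PER_PA = 49e-3  # 0.049 V/Pa

def volts_to_pascals(v):
    """Convert preamp output voltage (volts) to sound pressure (pascals)."""
    return v / SENSITIVITY_V_PER_PA

# Maximum measurable pressure at the 7 V peak preamp output:
p_max = volts_to_pascals(7.0)
print(p_max)  # approximately 142.9 Pa
```

If you keep the DAQ samples in volts, 49e-3 is the right denominator; if your driver hands back millivolts, divide by 49 instead.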
Pascals are converted to sound pressure level (SPL) in dB using the equation SPL = 20 * log10(p / p_ref). p_ref is the reference pressure of 20 micropascals (= 2e-5 Pa).
If your measured voltage is X volts, then the SPL is 20 * log10((X / 49e-3) / p_ref) dB.
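Putting the two steps together (again a Python sketch under the same assumptions; `volts_to_spl_db` is a name I made up):

```python
import math

SENSITIVITY_V_PER_PA = 49e-3   # mic/preamp sensitivity, 49 mV/Pa in V/Pa
P_REF = 2e-5                   # reference pressure, 20 micropascals

def volts_to_spl_db(v):
    """Convert measured preamp voltage (volts) to sound pressure level in dB SPL."""
    pressure_pa = v / SENSITIVITY_V_PER_PA
    return 20.0 * math.log10(pressure_pa / P_REF)

# Sanity check: 0.049 V corresponds to 1 Pa, which is the well-known 94 dB SPL
print(volts_to_spl_db(0.049))  # approximately 93.98
```

Note this maps a single instantaneous voltage to a level; for a broadband signal you would normally apply it to the RMS voltage over a measurement window.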