I would like to compare an RMS signal that was computed in analog by an acoustic emission sensor coupler with one computed digitally from the raw signal.
The data were recorded for 20 s at a sampling frequency of 10 kHz. The analog RMS converter used a time constant of 1.2 s. What would that be equivalent to in terms of "one-sided interval width as % of channel length"?
I have tried 6*10^(-7)% [time constant*100/(overall time*sampling frequency)], but the software will not compute the RMS, saying "set width is too low: Width 1 corresponds to 0.0005882(...)%".
Your RMS converter used a 1.2-second time constant on a 20-second record, so 1.2/20 = 0.06, i.e. 6% of the channel length if you want the same averaging span (note that if the parameter is strictly one-sided and the window is centred, 3% per side would give the same 1.2 s total window). Otherwise, decide how many neighbouring values on each side of each sample you want in the interval, and express that as a percentage of the total number of samples (10 kHz for 20 seconds = 200,000 samples), bearing in mind the width must satisfy 1E-9 <= RMSWidth <= 100.
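If it helps, here is a minimal sketch of the digital moving-RMS for comparison, assuming NumPy and using a 50 Hz sine as a stand-in for your raw AE signal (all names are illustrative, not from your software):

```python
import numpy as np

fs = 10_000      # sampling frequency, Hz
duration = 20.0  # record length, s
tau = 1.2        # analog RMS time constant, s

n = int(fs * duration)              # 200,000 samples total
t = np.arange(n) / fs
signal = np.sin(2 * np.pi * 50 * t)  # placeholder raw signal

win = int(round(tau * fs))          # 12,000-sample window
# Moving RMS via a running sum of the squared signal (rectangular window);
# this is much faster than convolving a 12,000-tap kernel directly.
cs = np.concatenate(([0.0], np.cumsum(signal ** 2)))
rms = np.sqrt((cs[win:] - cs[:-win]) / win)

print(win / n * 100)  # window as % of channel length -> 6.0
```

One caveat: the analog converter applies an exponential weighting with time constant tau, whereas this sketch uses a flat rectangular window of the same length, so the two traces will agree in level but differ slightly in how fast they respond to transients.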