Currently, I'm measuring fluorescence by scanning wavelengths with a monochromator and averaging the signal on the oscilloscope, storing each value in an array and plotting the result at the end.
My goal is to time-delay the value recorded at each wavelength so that rapidly decaying noise in my samples is excluded. The light source will be chopped, and a trigger line feeds into the oscilloscope.
Can this be done either:
by adjusting instrument settings (I'm using a Tektronix 220, interfacing through GPIB),
or by manipulating the triggered output waveform in software, integrating it between specified boundaries relative to the trigger time (sketched below)?
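For the second approach, here is a minimal sketch of what I have in mind: pull the averaged trace from the scope over GPIB, build a time axis relative to the trigger, and integrate only the window that starts after the fast-decaying component has died away. The command names follow the general TDS 200-series programming-manual style but would need to be checked against the actual manual; the GPIB address and the gate boundaries are placeholders, and `pyvisa`/`numpy` are just one way to do the transfer and the math.

```python
import numpy as np
import pyvisa

rm = pyvisa.ResourceManager()
scope = rm.open_resource("GPIB0::1::INSTR")   # hypothetical GPIB address

# Request the averaged trace from channel 1 as ASCII curve data.
# Command spellings assume TDS 200-series conventions; verify on the instrument.
scope.write("DATA:SOURCE CH1")
scope.write("DATA:ENCDG ASCII")
scope.write("DATA:WIDTH 1")

# Waveform preamble values needed to scale raw points into volts and seconds.
xincr = float(scope.query("WFMPRE:XINCR?"))   # time per point (s)
xzero = float(scope.query("WFMPRE:XZERO?"))   # time of first point relative to trigger (s)
ymult = float(scope.query("WFMPRE:YMULT?"))   # volts per digitizer level
yoff  = float(scope.query("WFMPRE:YOFF?"))    # digitizer-level offset
yzero = float(scope.query("WFMPRE:YZERO?"))   # vertical offset (V)

raw = np.array(scope.query("CURVE?").split(","), dtype=float)
volts = (raw - yoff) * ymult + yzero
time = xzero + np.arange(raw.size) * xincr    # time axis relative to the trigger

# Integrate only between t_start and t_stop after the trigger, skipping the
# rapidly decaying part of the record; this integral is the value stored for
# the current wavelength.
t_start, t_stop = 2e-3, 8e-3                  # hypothetical gate boundaries (s)
gate = (time >= t_start) & (time <= t_stop)
signal = np.trapz(volts[gate], time[gate])
print(signal)
```

The gated integral would replace the simple average currently stored per wavelength; the same two boundaries would be applied at every point in the scan so the results stay comparable across wavelengths.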