We are using three PXI-4071 DMMs in parallel to measure high voltages with precision. The program is written in LabVIEW 8.5.1.
An additional test requirement has been added which needs a quadrature decoder and the DMMs to be used synchronously.
We expected this to be straightforward using backplane trigger 0 (TTL0).
However, something odd is happening.
With a cut-down VI that uses just one DMM, we get a sample time of 100 microseconds when running with internal triggers. However, if either the overall trigger or the sample trigger is set to TTL0, the sample time becomes 5.1 milliseconds. It seems very strange that even just setting the overall trigger has this effect, since one would expect it to affect only the time to the first sample, not the time between samples. Also, the data sheet for the DMM gives a maximum trigger rate of 6 kHz (about 167 microseconds per trigger), so 5.1 milliseconds is roughly 30 times slower than the specified limit.
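For reference, this is roughly what the cut-down configuration does. The real program is a LabVIEW 8.5.1 VI, so the following is only a sketch in Python using the nidmm package; the resource name "PXI1Slot2", the range, and the digits are placeholders rather than our actual settings.

import nidmm

with nidmm.Session("PXI1Slot2") as session:
    session.configure_measurement_digits(nidmm.Function.DC_VOLTS, 300.0, 6.5)

    # Case 1: fully internal triggering -- observed sample time ~100 us.
    session.configure_trigger(nidmm.TriggerSource.IMMEDIATE)
    session.configure_multi_point(trigger_count=1, sample_count=100,
                                  sample_trigger=nidmm.SampleTrigger.IMMEDIATE)
    internal = session.read_multi_point(array_size=100)

    # Case 2: route the overall trigger to backplane line TTL0 (setting the
    # sample trigger to TTL0 instead has the same effect) -- observed sample
    # time jumps to ~5.1 ms even though only the start trigger changed.
    session.configure_trigger(nidmm.TriggerSource.TTL0)
    external = session.read_multi_point(array_size=100)  # blocks until TTL0 fires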
We have confirmed that this reported sample time is independent of the speed of the clock actually connected to TTL0: if the clock is faster than one edge per 5.1 milliseconds, we still get the reported sample time; if it is slower, the samples occur on the clock edges.
Does anyone know whether there is a parameter whose default changes depending on the trigger source, and which can be changed to get round this problem, please?
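In case it helps to make the question concrete: the check we have in mind is whether any timing attribute is silently given a different default when the trigger source changes. A hypothetical probe, again sketched in Python with the nidmm package (our actual code is LabVIEW; "PXI1Slot2" and the measurement settings are placeholders), would read back the applied timing attributes under each trigger source:

import nidmm

def dump_timing(session, label):
    # Read back the timing attributes the driver actually applied.
    print(label,
          "settle_time =", session.settle_time,
          "trigger_delay =", session.trigger_delay,
          "aperture_time =", session.aperture_time)

with nidmm.Session("PXI1Slot2") as session:
    session.configure_measurement_digits(nidmm.Function.DC_VOLTS, 300.0, 6.5)

    session.configure_trigger(nidmm.TriggerSource.IMMEDIATE)
    dump_timing(session, "internal trigger:")

    session.configure_trigger(nidmm.TriggerSource.TTL0)
    dump_timing(session, "TTL0 trigger:    ")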