Hello everyone,
Until now we have been successfully running a resonant laser scanning system (at 8 kHz), controlled by a PCI-MIO-16E-4 multifunction card together with an ATEME DSP counter/timer board (fast internal clock of up to 15 MHz). The system scans at a line frequency of 8 kHz, while the DSP clock (run at about 5 MHz) serves as the pixel clock, so we get about 600 pixels per line.
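Just to show how these numbers fit together (nominal rates, the exact values vary slightly):

    5 MHz pixel clock / 8 kHz line rate = 625 pixel-clock ticks per scanner line,

which is the "about 600 pixels per line" quoted above.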
Because we had started to reach the limits of this system under LV 5.0 on a Pentium II machine running W2K (where everything worked fine), we decided to make a major upgrade: we moved to LV 7.1 on a P4 3.6 GHz running Win XP Pro.
To rule out hardware problems, we first installed Win2K and LV 5.0 on the new machine, and all cards worked fine. We then wanted to proceed step by step and installed LV 7.1 on W2K. From that point on there has been a problem:
Our VI uses 2 CINs called by subVIs. We use the standard Counter VIs (Group Config, Set Attribute, Control, ...) provided with LabVIEW 7.1, as well as the standard AO/AI VIs. Nevertheless, the acquired data is erroneous: entire lines are missing from our scans. (Background: the scanner sends TTL sync pulses at 8 kHz; this signal starts the DSP clock, which counts the number of photons received in each pixel defined by the pixel clock at about 5 MHz.) For verification we can count the number of pixel-clock pulses per scanner sync. For about a quarter of all lines this count is 0 or 1. So the DSP board does see the scanner sync (otherwise there would be no line at all!) but does not receive the pixel clock.
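For context, our CINs follow the standard skeleton from the CIN reference manual. Here is a simplified sketch only; the actual DSP-board I/O is omitted and the parameter is purely illustrative:

    /* Minimal CIN skeleton, compiled against the cintools headers
       of the LabVIEW version that loads it. */
    #include "extcode.h"   /* from the LabVIEW cintools directory */

    CIN MgErr CINRun(int32 *pixelsPerLine);

    CIN MgErr CINRun(int32 *pixelsPerLine)
    {
        /* The real code talks to the DSP board here; this stub just
           returns the nominal pixel count per line. */
        *pixelsPerLine = 625;
        return noErr;
    }

The .lsb files were originally compiled against the LV 5.0 cintools.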
Now my question: were there any important changes to the above-mentioned VIs between LV 5.0 and LV 7.1? Or is there a known problem with CINs (Code Interface Nodes) when upgrading, e.g. do they need to be recompiled?
We are currently stuck, and the problem seems to come from LV 7.1. Has anyone out there experienced the same kind of problem?