LabVIEW


Absurd output values after running the program on a different PC

Solved!

Hello everyone,
I added new functions to an old program whose task was to measure the time until an analog input reaches a reference voltage on a rising edge. I added a function that does the same on a falling edge with a different reference voltage, and one that measures the input voltage over a period of 1 minute.
I programmed and tested this on a laptop and everything runs fine.
But after I transferred the program to the PC next to the measurement station, the output of the program part that measures the time on a falling edge is either 0 or far too high.
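The measurement described above boils down to finding the first sample at which the input crosses the reference voltage. A minimal sketch of that logic in Python (the original is a LabVIEW VI; the function name, trace, and voltages here are illustrative, not from the actual program):

```python
# Hypothetical sketch: find the time at which a sampled signal first
# crosses a reference voltage, on a rising or a falling edge.

def crossing_time(samples, sample_rate, v_ref, rising=True):
    """Return the time (s) of the first reference crossing, or None."""
    for i in range(1, len(samples)):
        prev, cur = samples[i - 1], samples[i]
        if rising and prev < v_ref <= cur:
            return i / sample_rate
        if not rising and prev > v_ref >= cur:
            return i / sample_rate
    return None

# A toy 1 kS/s trace that rises past 2.5 V and later falls below 1.0 V.
trace = [0.0, 1.0, 2.0, 3.0, 4.0, 4.0, 3.0, 2.0, 0.5, 0.0]
t_on = crossing_time(trace, 1000, 2.5, rising=True)    # 0.003 s
t_off = crossing_time(trace, 1000, 1.0, rising=False)  # 0.008 s
```

If `crossing_time` returns `None` (no crossing found) or the crossing lands at the very end of the trace, the reported time comes out as 0 or absurdly large, which matches the symptom described above.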

The laptop has an NI DAQCard-6036E, Windows 7 32-bit, and the LabVIEW 2010 SP1 Developer Suite.

The PC has an NI PCI-6229, Windows XP 32-bit, and the LabVIEW 2010 Run-Time Engine.

Both have MAX 4.7.7, NI-DAQmx 9.2.3, NI-488.2 2.81, and NI-VISA 5.03.

I hope someone has an idea why this happens.

 

Message 1 of 6

Without seeing the code, there isn't much we can help you with. Please attach the code.

Message 2 of 6

Hello,

It's a trigger problem.
I implemented the new functions but didn't know how the old program worked.
First some background: the system measures the switching time an automatic welding filter needs to turn on and off.
The person who built this in the first place wired an analog output, which switches the device through IR LEDs, to the digital trigger that starts the measurement.

If the PCI-6229 is used, the signal from the analog output is recognized as a digital edge when the device is switched on (voltage rising), but not when it is switched off (voltage falling).
I don't know why, but if the DAQCard is used, both signals are recognized as digital edges.
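One plausible reading of this behaviour: a digital (PFI) input decides high/low against fixed logic thresholds, roughly TTL levels, so an analog signal that rises cleanly but never falls cleanly below the low threshold produces no falling edge. A sketch of that model in Python (the threshold values and the example waveform are assumptions for illustration, not measured values from this setup):

```python
# Hypothetical sketch of why a digital input can see the rising
# transition of an analog signal but miss the falling one. Assumed
# TTL-style thresholds with hysteresis; values are illustrative.

V_IH = 2.0  # assumed high threshold (V)
V_IL = 0.8  # assumed low threshold (V)

def digital_edges(samples):
    """Classify edges as seen by a threshold-with-hysteresis input."""
    edges, state = [], samples[0] > V_IH
    for v in samples[1:]:
        if not state and v > V_IH:
            state = True
            edges.append("rising")
        elif state and v < V_IL:
            state = False
            edges.append("falling")
    return edges

# Fast rise, but the "off" level only decays to ~1.2 V (e.g. residual
# charge on a capacitive node): no falling edge is ever reported.
signal = [0.0, 3.3, 3.3, 1.2, 1.2, 1.2]
print(digital_edges(signal))  # ['rising']
```

Under this model, a device with slightly different input thresholds or input impedance (DAQCard-6036E vs. PCI-6229) could plausibly see both edges on one system and only the rising edge on the other.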
 
Regards,
Lucas
 
 
Message 3 of 6

Double-check the MAX configurations on both machines. Does your program rely on MAX to set up the DAQ tasks and scaling, or is it all done in the program?

---------------------
Patrick Allen: FunctionalityUnlimited.ca
Message 4 of 6

The MAX configurations are the same, and the DAQ setup and scaling are done in the program.
The problem is the capacitance of my circuit.

Over the last week I tried to reduce the capacitance and used a Zener diode to clip the signal. Even though it now triggers from time to time, it isn't stable.
I think I will implement a new clock trigger that switches the IR LEDs and starts the measurement at the same time.

I hope the time deviation is not greater than 5 µs. (That should have been my first idea :D)
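A quick way to see how much circuit capacitance stretches the falling edge: an RC node discharging from V0 reaches a threshold Vth after t = R·C·ln(V0/Vth). A sketch with made-up component values (nothing here is measured from this circuit):

```python
import math

# Hypothetical sketch: how circuit capacitance stretches the falling
# edge. An RC node discharging from v0 reaches v_th after
# t = R * C * ln(v0 / v_th). All component values are illustrative.

def discharge_time(r_ohm, c_farad, v0, v_th):
    """Time (s) for an RC node to decay from v0 down to v_th."""
    return r_ohm * c_farad * math.log(v0 / v_th)

# 10 kΩ driving 10 nF: decaying from 5 V to a 0.8 V TTL low threshold
# takes about 183 µs, far more than a 5 µs timing budget.
t = discharge_time(10e3, 10e-9, 5.0, 0.8)
print(f"{t * 1e6:.0f} µs")
```

Even modest stray capacitance can therefore push the falling transition well past a microsecond-scale deadline, which is consistent with the instability described above.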

Message 5 of 6
Solution
Accepted by topic author LucasWolf

After changing the trigger, the program still didn't work properly.
But I got a different error, -200609, which says the buffer size is too small.
After I configured the input buffer, everything ran fine.
I think the buffer was the problem all along.

I don't have any clue why it worked on one system but not on the other.
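For reference, DAQmx error -200609 indicates the input buffer cannot hold the number of samples the acquisition needs. The arithmetic behind a safe buffer size is just rate times duration, plus headroom; a minimal sketch with assumed numbers (the 100 kS/s rate and 2× headroom are illustrative, not from the original program):

```python
# Hypothetical sketch of the sizing behind DAQmx error -200609: the
# input buffer must hold at least sample_rate * duration samples per
# channel. Values are illustrative, not from the original VI.

def required_buffer(sample_rate_hz, duration_s, safety_factor=2):
    """Samples per channel the input buffer should hold, with headroom."""
    return int(sample_rate_hz * duration_s * safety_factor)

# 100 kS/s for the 1-minute voltage measurement mentioned earlier:
# 6,000,000 samples minimum, 12,000,000 with 2x headroom.
n = required_buffer(100_000, 60)
print(n)  # 12000000
```

This would also explain the machine-to-machine difference: DAQmx picks a default buffer size based on the device and sample rate, so a size that happens to suffice on one board can fall short on another until the buffer is configured explicitly.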

Message 6 of 6