My program injects current into a substrate and measures the voltage across it; resistance is then computed from these two values. When the voltage is within the 0.5–5 V range, the program continues to run at that current. If the voltage is out of range, the program increases or decreases the current until the measured voltage falls within range. The problem: my program does increase the current while the voltage is out of range, but as soon as the voltage comes back into range it reverts to the initial set current instead of keeping the adjusted value. I use a Case structure for this logic; is there something wrong with the way I've set up the cases? Please help! I have attached the program and the drivers for my measuring equipment. Thanks.
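
For reference, here is a sketch (in Python, not LabVIEW) of the control loop you describe. Everything here is hypothetical: `read_voltage`, the step size, and the iteration limit are placeholders, not your actual instrument driver calls. The key point is that `current` must be carried forward from one iteration to the next; in LabVIEW, the symptom you describe (reverting to the initial current once the voltage is in range) usually means the loop re-reads the initial setpoint each iteration instead of feeding the adjusted value back through a shift register.

```python
def adjust_current(initial_current, read_voltage, step=0.001,
                   v_min=0.5, v_max=5.0, max_iters=1000):
    """Adjust the injected current until the measured voltage falls
    inside [v_min, v_max], then keep that current.

    read_voltage is a placeholder for the instrument call that sources
    `current` amps and returns the measured voltage in volts.
    """
    current = initial_current      # must persist across iterations
    for _ in range(max_iters):     # (a shift register in LabVIEW)
        voltage = read_voltage(current)
        if v_min <= voltage <= v_max:
            # In range: keep the ADJUSTED current, not the initial one,
            # and compute resistance R = V / I from the two values.
            return current, voltage, voltage / current
        if voltage < v_min:
            current += step        # voltage too low: raise current
        else:
            current -= step        # voltage too high: lower current
    raise RuntimeError("could not bring voltage into range")
```

For example, simulating a 1 kΩ substrate with `read_voltage = lambda i: i * 1000.0` and an initial current of 0.1 mA, the loop steps the current up until the voltage enters the window and then returns the adjusted current rather than the initial one.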