I had a similar problem with the same info on my transducer, except mine is rated for 2000 psi. On to my problem...
I supply power to my transducer via my SCXI-1000 chassis, then read the output with a multimeter. There is an increase in millivolts as pressure increases, so I assume everything is wired up correctly.
In the past, when I go into MAX to calibrate transducers and hit Run under Tasks, the numbers fluctuate if MAX is taking continuous samples.
So this new transducer is working according to the multimeter, but when I try to calibrate it the values do not fluctuate; they are stuck at a single value throughout the pressure range. What are some steps to troubleshoot this?
Sorry KowdTek, I didn't mean to hijack your thread, but I figured instead of making a new post I could use yours.
I need some more information. What DAQ card are you using? What sensor? What versions of MAX, DAQmx, and operating system?
How do you have your task set up? Can you post a screenshot of the task and a screenshot of the calibrate window?
Also, can you give me the steps of how you calibrate your device? I'm just trying to get a better understanding of your situation, because this does seem quite strange.
I happened to make it work. I am still new to LabVIEW and am understanding more by the day; I was miraculously able to make it work, but now I am trying to understand the basics of how to make things work...
I have an SCXI-1000 chassis with an SCXI-1600 USB module connected to a laptop (Windows XP).
I then have an SCXI-1121 module with an SCXI-1321 terminal block that's wired to my 20k psi transducer (Omega PX602-20KGV, same properties as above, just a higher pressure rating).
I have my gain set to 1 and my excitation set to 10 V (on the 1121 module and in MAX).
Now to my problem... I have tried BOTH a linear and a custom table scale in MAX, but my values are not correct.
For example, these are my data points:
Using the first and last points, I derived the slope and linear equation and used that equation for my linear scale, but my readings were off: when it should have read 30 psi, it read 120 psi.
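For anyone following along, here's roughly the arithmetic a two-point linear scale does (a Python sketch; the 0 psi/0 mV and 20,000 psi/100 mV calibration points below are assumptions for illustration, not the poster's actual data). One thing worth noting: a reading that's off by a clean factor (120 psi vs. 30 psi is exactly 4x) often points to a gain or units mismatch in the slope rather than a wiring problem.

```python
def linear_scale(p_lo, mv_lo, p_hi, mv_hi):
    """Return (slope, intercept) mapping mV -> psi from two calibration points."""
    slope = (p_hi - p_lo) / (mv_hi - mv_lo)   # psi per mV
    intercept = p_lo - slope * mv_lo
    return slope, intercept

# Hypothetical calibration: 0 psi -> 0 mV and 20000 psi -> 100 mV
# (a 10 mV/V bridge at 10 V excitation would give roughly this span).
slope, intercept = linear_scale(0.0, 0.0, 20000.0, 100.0)

def to_psi(mv):
    """Apply the linear scale to a raw millivolt reading."""
    return slope * mv + intercept
```

With these numbers the slope comes out to 200 psi/mV, so a 0.15 mV reading maps to about 30 psi. If MAX expects the scale in volts rather than millivolts (or vice versa), the result will be off by a constant factor, which is the first thing I'd check.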
I then tried using a custom table in MAX. To do this I listed every single psi level from 0 to 20k with the corresponding mV value obtained from the linear equation. This did not work either; my readings were still off.
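A custom table scale is essentially piecewise-linear interpolation between your (mV, psi) points. Here's a sketch of what that lookup does (the table values are invented for illustration). It's also worth checking how your version of MAX handles readings outside the table: this sketch clamps at the endpoints, and if MAX does the same, that would match readings going flat above a certain pressure rather than extrapolating.

```python
from bisect import bisect_left

def table_scale(mv, table):
    """Piecewise-linear lookup of psi from a sorted list of (mV, psi) pairs.
    Readings outside the table are clamped to the endpoints (no extrapolation)."""
    mvs = [m for m, _ in table]
    psis = [p for _, p in table]
    if mv <= mvs[0]:
        return psis[0]
    if mv >= mvs[-1]:
        return psis[-1]
    i = bisect_left(mvs, mv)
    frac = (mv - mvs[i - 1]) / (mvs[i] - mvs[i - 1])
    return psis[i - 1] + frac * (psis[i] - psis[i - 1])

# Hypothetical table: 4 points are enough for a linear sensor; listing
# every psi from 0 to 20k shouldn't be necessary.
table = [(0.0, 0.0), (25.0, 5000.0), (50.0, 10000.0), (100.0, 20000.0)]
```

One practical point: if the sensor really is linear, a table with thousands of entries adds nothing over a few well-measured points, and it multiplies the chances of a typo in the table.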
I then calibrated my scales up to 1000 psi, which worked, until I reached 1000 psi; everything after that, the readings were off. I assume the scale only applied up to that point. The pressure levels I need to test are rather high, so I would like to avoid calibrating at those values if I don't have to. Is there a way around the calibration using the information I provided?
I made a custom scale using those data points as well, and it seems pretty linear. Without physically doing the calibration, your best bet would be to do what you did and extrapolate the linear scale to fit the higher values. I imagine this would become more accurate the more points you can calibrate into the scale at the higher ranges.
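To illustrate the "more points" idea: an ordinary least-squares fit over all the calibrated points, rather than just the first and last, reduces the effect of noise in any single measurement. A small Python sketch (the sample points below are hypothetical):

```python
def fit_linear(points):
    """Least-squares fit of psi = slope * mV + intercept over (mV, psi) pairs."""
    n = len(points)
    sx = sum(m for m, _ in points)
    sy = sum(p for _, p in points)
    sxx = sum(m * m for m, _ in points)
    sxy = sum(m * p for m, p in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

# Hypothetical low-pressure calibration points (mV, psi); the fitted line
# is then extrapolated to the higher pressures you can't easily calibrate.
pts = [(0.0, 0.0), (10.0, 2000.0), (25.0, 5000.0), (50.0, 10000.0)]
slope, intercept = fit_linear(pts)
```

You could do the same fit in Excel with LINEST; the point is just to use every calibration point you have, not only two of them.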
I did do that using Excel and it did not work... I somehow made it work, although my scale/transducer seems to work for an hour and then loses "calibration"; after that it seems to fail (pressure readings go off scale) while I am running my VI.
I am doing pressure testing, and this, in my opinion, is not very safe. What is going wrong?
I called NI and they said I am doing everything correctly so far, so I contacted Omega (the pressure transducer manufacturer) and they said that maybe the excitation voltage is throwing it off. I did check my transducer's mV output and it reported three different readings at 500 psi. How can I check whether the excitation voltage is dropping while I run the VI? I'd prefer not to run my actual test while doing so; is there anything I can do to check whether the excitation voltage changes? Run a dummy test?
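One possible approach (an assumption on my part, not NI's or Omega's advice): route the excitation output to a spare AI channel at the terminal block, or park your multimeter on the excitation terminals of the 1321, and log the voltage alongside pressure during a dummy run of the VI. The analysis half of that check might look like this in Python; the sample log values showing a sag are invented for illustration:

```python
import statistics

def drift_report(samples, nominal=10.0, tol=0.05):
    """Check a list of measured excitation voltages against the nominal value.
    Returns (mean, max_deviation, ok); ok is True when every sample stays
    within tol volts of nominal."""
    mean = statistics.mean(samples)
    max_dev = max(abs(s - nominal) for s in samples)
    return mean, max_dev, max_dev <= tol

# Hypothetical hour-long log, one sample every 10 minutes: the excitation
# sags from 10.00 V to 9.80 V, a 2% drop that would shift a ratiometric
# bridge's output by the same 2%.
log = [10.00, 9.99, 9.97, 9.92, 9.85, 9.80]
mean, max_dev, ok = drift_report(log)
```

Since the bridge output is ratiometric with excitation, even a small sag shows up directly in the scaled pressure, which would fit the "works for an hour, then drifts" symptom.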