LabVIEW

How do I convert the x-axis label from "sample #" to time in microseconds relative to t = 0? (with known acquisition rate)

I have acquired data and stored it to a file. When I read the data back and graph it, the default x-axis label is the sample number. I want to convert these sample numbers into time in microseconds since the beginning of the acquisition, which is t = 0. The acquisition rate is also stored with the file. I cannot seem to accomplish this without affecting the range of the graph, which skews my display. Is there an easy way to do this that I am not aware of? Any help is greatly appreciated! Thanks!
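For reference, the arithmetic being asked about is just a linear relabeling of the sample index. A minimal sketch in Python (not LabVIEW; the function name and the 500 kS/s rate are made up for illustration):

```python
# Sketch: mapping sample numbers to time in microseconds, assuming a
# known acquisition rate in samples per second and t = 0 at sample 0.
def sample_to_microseconds(sample_index, rate_hz):
    """Time of a sample relative to t = 0, in microseconds."""
    return sample_index / rate_hz * 1e6

rate_hz = 500_000.0  # hypothetical 500 kS/s acquisition rate
times_us = [sample_to_microseconds(n, rate_hz) for n in range(5)]
# samples are spaced 2 microseconds apart at 500 kS/s
```

The key point is that the data values never change; only the axis labels need to be scaled by the sample period.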
Message 1 of 4
Your problem can be solved in several ways, I think; different people have different ideas. What you want is to change the appearance of the X scale on your graph. Right-click the graph on the front panel:
>>X Scale
>>>Formatting
From there you can do what you want, for example switching to scientific or engineering notation. I think you get what I mean; I hope this works, and if not, forgive me.
Regards,
Saw Naing Aye
Message 2 of 4
To do it programmatically, create a property node for the graph and select Range.XScale.Multiplier. Change the property to write and wire the sample interval (the reciprocal of your acquisition rate) to it. If the X scale is set to autoscale, the display range shouldn't change, since it's just a label change, but you may need to change the format of the X scale in order to display floating-point numbers.
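The offset/multiplier approach above can be sketched as follows (Python, with assumed names; this is only a model of what the graph's scale properties compute, not the LabVIEW API itself):

```python
# Sketch of what an X-scale offset/multiplier pair does: the displayed
# axis value is offset + multiplier * sample_index, so the plotted data
# and its range are untouched -- only the labels change.
def displayed_x(sample_index, offset=0.0, multiplier=1.0):
    return offset + multiplier * sample_index

rate_hz = 250_000.0            # hypothetical 250 kS/s acquisition rate
multiplier_us = 1e6 / rate_hz  # microseconds per sample -> 4.0
x_label = displayed_x(10, offset=0.0, multiplier=multiplier_us)
# sample 10 is now labeled 40.0 (microseconds) instead of 10
```

Using 1e6 / rate as the multiplier gives microseconds directly, matching what the original question asked for; 1 / rate would give seconds.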
Message 3 of 4
Thank you so much! Using your idea I was able to make it work: when I read in a graph, I set the X axis to autoscale, then set the formatting with X0 = 0 and dX = 1/(scan rate), which labels the X axis in seconds for me.
Thanks again!
Josh
Message 4 of 4