Hi everyone... this is my first post here, but I think I have found a pretty large bug! Let me give a little background information on what I am trying to do. My VI takes an arbitrary number of data buffers that have been recorded at various sample rates and for differing lengths of time, and graphs the data on a single waveform graph. Because of the different sample rates, sample start/stop times, and sample durations, I make sure that each plot is bundled with an appropriate X0 and deltaX, and is also correlated with the correct Y scale. All of the "heavy lifting" of associating plots with the correct Y scales and labels is done with property nodes (ActPlot, ActYScl, YScale.Offset, YScale.Multiplier, YScale.NameLbl.Text, etc.). Furthermore, I manually set the min/max on both Y scales, as well as the min/max on the X scale.
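For reference, the per-plot bookkeeping my VI does amounts to something like the following (a Python sketch, since I can't paste G code here; the function names are just illustrative, not actual LabVIEW calls):

```python
# Illustrative sketch (not LabVIEW): each buffer carries its own start time
# (x0), sample interval (dx), and length, so each plot's X extent has to be
# computed independently before the graph's X scale min/max can be set.

def plot_extent(x0, dx, n):
    """X range covered by a buffer of n samples starting at x0 with spacing dx."""
    return (x0, x0 + (n - 1) * dx)

def overall_x_range(buffers):
    """Union of all plot extents -- the values I then write to X scale min/max."""
    extents = [plot_extent(x0, dx, len(data)) for (x0, dx, data) in buffers]
    return (min(lo for lo, _ in extents), max(hi for _, hi in extents))

# Example: two buffers with different sample rates and start times
buffers = [
    (0.0, 1e-6, [0.0] * 10),   # 10 samples, 1 us spacing, starts at t = 0
    (2e-6, 5e-7, [0.0] * 20),  # 20 samples, 0.5 us spacing, starts at 2 us
]
print(overall_x_range(buffers))  # prints the full X span covering both buffers
```

The X scale min/max I set by hand are exactly this kind of union of the individual plot extents, so all of the data should fall inside the displayed range.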
Now, the problem: it seems that somewhere in the process of changing all of these properties, LabVIEW gets confused about what (and how much) it should actually be displaying. Here is the image, with the problem areas in the red box.
I have tried changing __ALL__ of the options on the X scale, and even removing the extra Y scale (the blank region appears to be about the width of the 2nd Y-scale label). Each time the VI runs, I manually set the min to -3E-6 and the max to 1.1E-5.
Can anyone figure out what the heck is going on that would cause LabVIEW to not display all of the data within the min/max range, or to display whitespace beyond the maximum (as in this case)?
Any help is appreciated,
Derek Steinkamp
derekste [at] fnal [dot] gov
Accelerator Division, Fermi National Accelerator Laboratory