waveform chart does not show time properly

Solved!

Hello,

 

I am trying to plot the voltage from a Keithley multimeter vs. time using a waveform chart. The voltage is recorded properly, but the time does not match: the chart shows roughly 10 times more seconds than have actually elapsed. See the attached VI, where I also tried an XY graph as an alternative. I would like the waveform chart to display time the way the XY graph does.

 

Additionally, in the XY graph I had to establish a zero time in order to obtain relative time. Is there a more straightforward way to do this than that trick?

 

Thank you.

 

Message 1 of 22

Access the chart properties

 

Set the XScale Offset to 0.

Set the XScale Multiplier to the actual time between samples.
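
In case it helps to see the arithmetic the chart applies: it never stores timestamps, it just maps each sample index to X = offset + index * multiplier. A minimal sketch (plain Python rather than LabVIEW; the offset, multiplier, and voltages below are assumed example values, not taken from the original VI):

```python
# Sketch of how a waveform chart derives its X axis from XScale.Offset and
# XScale.Multiplier (all values here are illustrative assumptions).
offset = 0.0        # XScale.Offset: time assigned to the first sample
multiplier = 0.1    # XScale.Multiplier: seconds between consecutive samples

voltages = [1.02, 1.05, 1.01, 0.98]   # samples appended to the chart
times = [offset + i * multiplier for i in range(len(voltages))]
for t, v in zip(times, voltages):
    print(f"{t:.1f} s -> {v} V")
```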

Message 2 of 22

OK, thanks. It seems to work with a 0.1 multiplier. However, what if the multiplier is not exactly 0.1? If it is, say, 0.104, the time scale will accumulate a delay after some seconds, and in fact that is what I am observing. I could calculate the proper factor by comparing it with the real elapsed time, but why isn't the correct time shown in the first place? Is there another way to get the proper time without this trick?

 

On the other hand, when I export the data to the clipboard and paste it, the time column only shows whole seconds (no decimals). How can I get, for example, 3 decimal places?

 

Additionally, is there any way to choose the sampling rate? The VI acquires 10 samples per second even though the loop repeats every 50 ms, so it should read a lot more. I am using a GPIB connection. Is it possible that GPIB limits the acquisition to one reading every 100 ms?

 

Finally, any hints on the other question:

In the XY graph I had to establish a zero time in order to obtain relative time. Is there a more straightforward way to do this than that trick?

 

Thanks.

Message 3 of 22

Waveform data is, by definition, data with *equally spaced* X values. It's what you expect from an oscilloscope or any instrument that controls its own sampling clock, but not from measurements taken one at a time under software control. So you CANNOT use real, measured times as the X data in a waveform chart or waveform graph. If you want the points plotted at their correct, unequally spaced X positions... well, you've already found the solution. That's what XY graphs are for, and precisely why they are different from waveform charts/graphs.
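
(Not LabVIEW, but in data terms the distinction looks like this; the numbers are made up for illustration:)

```python
# Waveform chart/graph: X values are never stored, they are implied as t0 + i*dt,
# i.e. forced to be equally spaced regardless of when the samples were really taken.
t0, dt = 0.0, 0.1                            # assumed start time and sample interval
voltages = [1.02, 1.05, 1.01, 0.98]
waveform_x = [t0 + i * dt for i in range(len(voltages))]

# XY graph: X values are whatever you measured, so uneven spacing is preserved.
measured_t = [0.000, 0.104, 0.211, 0.309]    # e.g. from Get Date/Time In Seconds
xy_points = list(zip(measured_t, voltages))
```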

 

You may also have been confused by the fact that LabVIEW names the default X axis "Time", because that's often what the X axis is, but it's just a label and a default choice. It is not meant to imply that LabVIEW automatically records the time each point is written to the chart. So if you can reasonably count on the time between samples being 0.104 s, or you don't care that it's approximate, go ahead and set the multiplier to 0.104. Otherwise, just use the XY graph with measured times.

 

Why does the export round the time to the nearest integer? That seems buggy to me, but you have several workarounds. You can fill spreadsheet cells with times separated by 0.104 s quite easily, or you can export the measured times from the XY graph (which works). There are also various ways to save data from LabVIEW directly to spreadsheet files; if you're still learning as you build a real experiment, I recommend working on that soon.
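
As one concrete (non-LabVIEW) illustration of getting 3 decimal places when you save: write the measured (time, voltage) pairs yourself with an explicit format, which is essentially the same idea as wiring a format string such as "%.3f" into LabVIEW's Write To Spreadsheet File VI. A sketch with made-up data:

```python
# Write tab-delimited (time, voltage) pairs with 3-decimal times.
measured_t = [0.000, 0.104, 0.211, 0.309]   # made-up example times (s)
voltages   = [1.02, 1.05, 1.01, 0.98]       # made-up example readings (V)

with open("voltage_vs_time.txt", "w") as f:
    f.write("time_s\tvoltage_V\n")
    for t, v in zip(measured_t, voltages):
        f.write(f"{t:.3f}\t{v:.6f}\n")
```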

 

If the 0.104 s is limited by communication with the multimeter, you probably cannot count on it staying steady. The specified bit rate for GPIB is plenty fast to transfer far more than 10 numbers per second, but there's a lot of handshaking going on behind the scenes when you request a single measurement from software. You'll only see anything close to the maximum speed (100 kb/s?) when transferring large blocks of data. Does the multimeter even take measurements faster than that? (Not a rhetorical question; I'm not familiar with that model.) I ask because you're calling it a multimeter, as opposed to an A/D card or a scope. They all do the same basic task (converting analog voltages to digital numbers, although "multi" implies current and resistance measurement as well), but when I hear "multimeter" I think of a device primarily designed to be read by human eyes, one that may have a computer interface as a bonus but probably doesn't bother with a fast update rate.

 

 

-Ian Konen
Message 4 of 22

Ikonen, thanks for your accurate explanation. The different graph types are clear to me now. I think my Keithley multimeter can sample faster than 10 readings per second; the GPIB interface is one of the main features of the model. However, as you suggested, there could be something behind the scenes limiting this.

 

Can you give me any help with this question:

In the XY graph I had to establish a zero time in order to obtain relative time. Is there a more straightforward and accurate way to do it than this trick? I am also getting a few repeated times logged (i.e., the same time for two consecutive measurements), which shouldn't happen.

 

Thanks.

Message 5 of 22

You basically have the right idea for relative time: sample the absolute time once before the loop starts and subtract it from later readings. What's kludgey about your implementation is using a control and a local variable to pass the value into the loop instead of a single wire. Connect a wire straight from that first time measurement outside the loop to the subtraction node inside the loop, and get rid of the "zero time" control entirely (unless for some reason you want the user to be able to change the zero time in the middle of an experiment).
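
In text form (Python here, since I can't paste a block diagram), the pattern is simply: read the clock once before the loop, then subtract inside the loop. read_multimeter below is just a hypothetical stand-in for the GPIB read in your VI:

```python
import time

def read_multimeter():
    # Hypothetical placeholder for the actual Keithley/GPIB read in the VI.
    return 1.0

t_start = time.time()   # like Get Date/Time In Seconds, read ONCE before the loop

relative_times, voltages = [], []
for _ in range(5):
    voltages.append(read_multimeter())
    relative_times.append(time.time() - t_start)   # seconds since the experiment began
    time.sleep(0.05)                               # the 50 ms loop period from the original VI
```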

 

The help for Get Date/Time In Seconds says the precision varies by system but doesn't give an estimate. Given the maximum sampling rate of about 20 Hz you said you've seen, I'm surprised you're seeing repeated values (that would mean the precision is on the order of 0.1 s, or there's another reason entirely). The help also says the millisecond timer ("Tick Count") has higher precision. Although it returns values in ms, its actual resolution is only about 1/60th of a second (roughly 16 ms), but that should still work for you. So replace Get Date/Time In Seconds with Tick Count, divide by 1000, and see if the results improve.

-Ian Konen
Message 6 of 22

I have made the suggested changes, but Tick Count has an undefined base reference time, as indicated in the help, so when I use it I get a very large number of seconds. It does not work.

Message 7 of 22

But you said you wanted relative time, not absolute. In fact, you already demonstrated the solution by sampling the time once before entering your data-collection loop and subtracting it from all subsequent readings, so that the recorded time represents time since the experiment began, not time since 12:00 a.m., Friday, January 1, 1904. If you didn't subtract an initial reading of the current time, you would also get a "very large value of seconds" from Get Date/Time In Seconds, since a lot of seconds have passed since 1904. Why would you drop that subtraction when you switch to the Tick Count VI?

 

I'm not sure that's really the issue, though. Although the Get Date/Time help says it loses precision when converted to a double float, on my system at least both timers seem to have 1 ms precision. It does say it's system dependent: perhaps your computer's clock only resolves time down to the second? I'm skeptical, but I'm not an IT expert; maybe that's more common than I would guess. If so, Tick Count should improve things, because it's a different timing source and should be more accurate than 1 s.

-Ian Konen
Message 8 of 22

Yes, I want relative time, but the relative time I get is very large and I do not know why. I do subtract the initial time, yet a large number is shown, far larger than the time elapsed since the experiment began.

Message 9 of 22

Did you divide by 1000? Tick Count returns an integer number of *milliseconds*. If that's not it, I think you'll have to post another sample of your code.
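
To spell out the whole conversion in one place (a Python sketch; time.monotonic is just standing in for LabVIEW's Tick Count):

```python
import time

def tick_count_ms():
    # Stand-in for Tick Count (ms): integer milliseconds from an arbitrary reference.
    return int(time.monotonic() * 1000)

tick_start = tick_count_ms()                               # read once before the loop
for _ in range(3):
    time.sleep(0.05)
    elapsed_s = (tick_count_ms() - tick_start) / 1000.0    # subtract first, then ms -> s
    print(f"{elapsed_s:.3f} s since the loop started")
```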

-Ian Konen
Message 10 of 22