LabVIEW


One Interesting observation about Waveform Chart

I am not sure why you have so many tab pages when, it appears, one can select any channel for a given graph.  Is having the history data on the chart a requirement?  If not, here's a suggestion: since the user will be viewing only one of the tab pages at any given time, update only that tab page's chart.  This can be done by using a Case Structure and wiring the Tab3 terminal to its case selector.  Then move the respective charts into their cases.
 
This way you are only writing to one chart every loop.
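The idea can be sketched in Python (not LabVIEW G code, and the names here are illustrative, not from the actual VI): each loop iteration, only the chart belonging to the currently visible tab page receives new data, mirroring a Case Structure wired to the tab control's terminal.

```python
def update_active_chart(active_tab, charts, new_samples):
    """Append data only to the chart on the visible tab page.

    active_tab  -- index of the currently selected tab (the Tab3 terminal)
    charts      -- one chart (here, a plain list) per tab page
    new_samples -- the latest sample for every channel
    """
    # The Case Structure selects exactly one case per iteration,
    # so only one chart is written to; the others are left untouched.
    charts[active_tab].append(new_samples[active_tab])
    return charts

# Three tab pages, user currently viewing tab index 1
charts = [[], [], []]
update_active_chart(1, charts, [10, 20, 30])
# Only the second chart grew; the other two were not redrawn this loop
```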
 
-Khalid
 
 
Message 11 of 26

Hi Khalid,

Thanks for the suggestion, but it won't work for me. When the VI initializes, all the graphs initialize to the current date and time, but after that only the graph currently being updated advances the time on its X-axis; the others stop and stay halted at the initialization time. If I then switch over to a graph on another tab, that graph starts updating its X-axis time from the initialization time rather than the current time, while the graph I switched away from stops updating and halts at its last time. That's why I have to update all the graphs. As for the history chart, I have to keep that much history because the user requires it, but I will try to convince the user to minimize the chart history length. Anyway, thanks for the valuable input; I hope it will help me in my other projects.

Thanks,

Nishant

Message 12 of 26

Nishant,

Please carefully review this article by DF Gray entitled "Managing Large Data Sets in LabVIEW".

http://zone.ni.com/devzone/conceptd.nsf/webmain/6A56C174EABA7BBD86256E58005D9712

The section labeled "How to reduce the data used to plot graphs" will help you.
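The core technique that section discusses is reducing (decimating) the data before it ever reaches the graph, so the plot receives only about as many points as it has pixels. A minimal Python sketch of one common min/max decimation approach (my own illustration, not code from the article):

```python
def minmax_decimate(samples, max_points):
    """Shrink `samples` to roughly `max_points` values for plotting.

    Splits the data into buckets and keeps each bucket's min and max,
    so short spikes remain visible after the reduction.
    """
    if len(samples) <= max_points:
        return list(samples)
    # Each bucket contributes two points (its min and its max)
    bucket = max(1, len(samples) // (max_points // 2))
    out = []
    for i in range(0, len(samples), bucket):
        chunk = samples[i:i + bucket]
        out.append(min(chunk))
        out.append(max(chunk))
    return out

# 1000 raw samples reduced to ~100 points for the chart
reduced = minmax_decimate(list(range(1000)), 100)
```

Feeding the reduced array to the chart instead of the raw history keeps memory and redraw cost bounded regardless of how much data is acquired.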

This is very deep material and you may want to read it a couple of times before it all registers.

Ben

Message Edited by Ben on 03-22-2006 06:16 AM

Retired Senior Automation Systems Architect with Data Science Automation | LabVIEW Champion | Knight of NI and Prepper
Message 13 of 26

Hi Ben,

Thanks, that is a really good document about memory management in LabVIEW. Beyond that, the link covers many other options for managing memory, including information about when LabVIEW creates a new data buffer, etc.

Thanks,

Nishant

Message 14 of 26
Hi Megan,

Have you found anything about this problem? I have tried everything Ben and Altenbach suggested, and have also tried to do the memory management, but I am still facing the problem. Can you please suggest something, or should I post it as a bug?
 
Thanks,
Nishant
Message 15 of 26
I just took a look at your code and noticed one major problem: you really should not use so many property nodes and globals. It will go a lot faster if you don't.
Message 16 of 26

Hi vivi,

I know there are many property nodes and globals, but I have already reduced some of them, and it is not possible for me to reduce them further; it would not give me the output I want. As for my original output, it is correct: the chart advances one second of time in one second. The problem is with widening the graph on the monitor screen. If the screen is 17", should that mean the one-second procedure must complete in 1/2 or 1/3 of the sweep, so that it still shows as completed in 1 second on the wide screen?

Thanks,

Nishant

Message 17 of 26

Hello Nishant,

By "screen size" are you referring to the diagram size?  And are you wondering whether that has an effect on the performance?

-Khalid

Message 18 of 26
Hi Khalid,

I am not saying it is affecting the performance; let me explain with an example. Say I set a time span of 1 second between the start and end points of the X-axis, and my screen is 17". On that screen I stretch the graph horizontally to 12"-13", which covers 70-75% of the screen and gives a larger display. In exactly 1 second my cursor line (the green line in the attached figure) should travel from right to left; instead, with the graph 12"-13" wide, it takes 1.256 seconds.

Now say my screen is 14", so the graph is 9"-10" wide, with the same one-second span between the start and end of the X-axis. At this size the green line takes either slightly under 1 second (say 0.8 s) or roughly 1 second for the same span; only the monitor and graph sizes differ.

This creates a problem in my case: I cannot dictate a fixed graph size to the customer, and in my opinion this should not happen. That is why I am so puzzled.

In short: the larger the graph, the longer the cursor takes to travel from right to left across the screen.
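For what it's worth, a quick back-of-the-envelope calculation (the pixel density and widths below are assumed for illustration, roughly matching the sizes quoted above) shows why a wider chart demands more drawing work per second when the X-axis span is fixed:

```python
def sweep_rate_px_per_s(plot_width_px, timespan_s):
    # Pixels the sweep cursor must advance per second of chart time:
    # with a fixed span, every extra inch of width is extra redraw work.
    return plot_width_px / timespan_s

# Assume ~96 px per inch and a 1-second X-axis span in both cases
wide = sweep_rate_px_per_s(12 * 96, 1.0)   # ~12" plot on the 17" screen
narrow = sweep_rate_px_per_s(9 * 96, 1.0)  # ~9" plot on the 14" screen
# The wider chart must push about a third more pixels per second,
# so any per-pixel drawing cost shows up sooner on the larger display.
```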
 
Thanks,
Nishant
 
        
Message 19 of 26

Nishant,

Ceteris paribus -- with everything else being exactly the same -- just changing the display size of the chart slows down the system?  Are you sure you're not changing the X-axis start and end points?  Does this happen on a single machine when you stretch the chart?

-Khalid

Message 20 of 26