LabVIEW

LabVIEW graphics speed?

Hello,


Could someone tell me the "typical" LabVIEW chart/graphics speed
to "show" one million points on screen?
To be more precise ... how long does it take LabVIEW
graphics/chart/panel windows to plot (with a simple plotting function)
one million points loaded from an array in memory (kept in memory for speed)?
I tried to do this in a Visual Basic application but the results were
not very ... well ... convincing ...
One million points take something like 45 seconds to plot (roughly the
"array to screen" time ...).

Thanks in advance,
Best regards,
Pedro


Note: I have sent another post similar to this one ... DejaNews did not
confirm the posting ... so to be sure ... here goes the same ...


Note 2:
Please reply also to:
samp@ ->replace<-cfn.ist.utl.pt

And remove the obvious anti-spam string.


Message 1 of 7
It will depend on the PC and processor. The graphics depth also has a big
influence on this: set your screen to 24- or 32-bit true colour and it'll
slow to a crawl.


Message 2 of 7
Just a note:
most monitors have fewer than 1024 columns and roughly one million pixels
in total. Unless you want to fill your entire screen with points, the
question boils down to:
how much computation do you want to spend to reduce the data so that you
still get a visually accurate representation on your screen? If you just
shuffle one million points to your graphics card, you're doing something wrong.
LabVIEW does a great job of data reduction, keeping all key features (e.g.
local spikes) in view.

Here are some numbers (no tweaking with debugging/priority settings):

--> 1 million random array to a waveform graph: less than 2 seconds (AMD
K6-2, 400 MHz)
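
(For anyone who wants to reproduce this kind of measurement outside LabVIEW,
here is a minimal timing sketch in Python, with matplotlib standing in as the
plotting library; it is illustrative only, not what anyone in this thread used.)

    import time
    import numpy as np
    import matplotlib
    matplotlib.use("Agg")        # off-screen backend; use an interactive one to watch it draw
    import matplotlib.pyplot as plt

    data = np.random.rand(1_000_000)   # 1 million points, already in memory

    fig, ax = plt.subplots()
    start = time.perf_counter()
    ax.plot(data)                # hand the whole array to the plotting layer
    fig.canvas.draw()            # force actual rendering, not just the plot call
    print(f"array-to-screen time: {time.perf_counter() - start:.2f} s")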

Cheers
Christian

Message 3 of 7
See your text...

Christian Altenbach wrote:
>
> [...]
> --> 1 million random array to a waveform graph: less than 2 seconds (AMD
> K6-2, 400 MHz)

But you don't show all these points at the same time. As you pointed out
yourself, the 1024-wide screen allows display of only 0.1 percent of the
data. Is it like shifting through the data like on a strip chart recorder?

-David
Message 4 of 7
David wrote in message
news:382CA0A4.F38C7EDA@visto.com...
> [...]
> But you don't show all these points at the same time. As you pointed out
> yourself, the 1024-wide screen allows display of only 0.1 percent of the
> data. Is it like shifting through the data like on a strip chart recorder?
>
> -David

David

This was just to provide a rough number to Pedro; he was asking: "To be more
precise ... How long does it take LabVIEW ...". I actually send 1000000
points to the graph terminal and leave it up to LabVIEW to deal with the
display.

As I mentioned, the more important question is: if you graph a 1000000-point
array to an autoscaling waveform graph, will you still see the essential
features of the data? How does LabVIEW map the 1000000 points into the
<1024 "one pixel-wide" bins? Try it with a random array (0..1) that you pipe
through 1/x. There are rare spikes that are only one point wide, but LabVIEW
shows all of them. Each bin shows the full min-max range in that particular
data segment.
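
(What is described here matches min-max decimation: for each pixel-wide bin,
keep both the minimum and the maximum of the samples that fall into it, so
even one-point-wide spikes survive. A minimal Python sketch of that idea,
including the 1/x spike test; this is illustrative only, not LabVIEW's actual
internal algorithm.)

    import numpy as np

    def minmax_decimate(data, n_bins=1024):
        """Reduce data to one (min, max) pair per pixel column."""
        n = len(data) - len(data) % n_bins   # drop the ragged tail for simplicity
        bins = data[:n].reshape(n_bins, -1)  # one row per pixel-wide bin
        return bins.min(axis=1), bins.max(axis=1)

    # The spike test from the post: uniform noise (0..1) piped through 1/x
    # gives rare, very tall, one-point-wide spikes.
    data = 1.0 / np.random.rand(1_000_000)
    lo, hi = minmax_decimate(data)
    n = len(data) - len(data) % 1024
    print("tallest spike preserved:", hi.max() == data[:n].max())  # True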

So, yes, of course not all points are shown, but at any time you can zoom
into any small area of your graph to see any detail you want, even if the
program is no longer running!

Cheers
Christian
Message 5 of 7
Hello,

I must thank everyone who followed this discussion.
Meanwhile I could not reply because I did not have time to do so. In the
frequency domain things are not good either ... 🙂 my bandwidth is very,
very small.
In between I started doing more tests on my application and realised that
my problem is far more complex than just simple "plotting".
I have used several Windows tools like Visual Basic, Delphi and LabVIEW.
The idea to test the speed is to plot a large amount of data directly from
an array in memory (not from disk, to avoid the disk-access bottleneck).
My starting point was my own (and very slow) direct plotting with a "custom
made" reduction/compression technique. That led me to ask this question.
Of course I am now convinced that the problem lies in my graphical
reduction technique.
Without any mathematical spline technique the plotting is a bit slow using
either VB or Delphi. If the reduction technique is combined with polynomial
curve fitting then the speed increases a lot ... but my custom-made program
is still slow.
Obviously these techniques and many more are "embedded" in commercial
products. I tried the TeeChart Pro demo version and the results are much
better. I also think LabVIEW is a very good choice for this sort of
application.
In case anyone is interested, the example in the TeeChart ActiveX is very
good, very enlightening.
The specific problem, as I said earlier, is not the graphics speed, but the
fact that the data are transferred from a remote computer over the LAN, and
the Ethernet cards I have are limited to 10 Mbit/s!
The system also extracts data from a VME control module ... not simple.
Anyway, I am currently working on data reduction at the source ... the
remote machines ... Maybe it will work.
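
(As a rough illustration of the reduce-at-the-source idea, a hypothetical
Python sketch of the remote side, reusing the min-max scheme discussed above.
The port and binary layout are made up, and the real system with its VME
module will look quite different; the point is that 1 million float64
samples shrink from ~8 MB to ~16 KB per screenful, which matters a lot at
10 Mbit/s.)

    import socket
    import numpy as np

    def serve_reduced(data, port=5000, n_bins=1024):
        """Remote side: send per-bin min/max pairs instead of the raw array."""
        n = len(data) - len(data) % n_bins
        bins = data[:n].reshape(n_bins, -1)
        reduced = np.stack([bins.min(axis=1), bins.max(axis=1)])
        with socket.create_server(("0.0.0.0", port)) as srv:  # hypothetical port
            conn, _ = srv.accept()
            with conn:
                conn.sendall(reduced.astype(np.float64).tobytes())  # 2 * n_bins values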

Thanks once again,
Best regards,
Pedro







Message 7 of 7
> Could someone tell me the typical LabVIEW chart/graphics speed to show
> one million points on screen?
> [...]

I got a new computer recently, a G4, and in 32-bit TrueColor
it takes about 600 ms for the default-sized graph, and 800 ms
for a large one that is almost full-screen.

The speed will vary a great deal depending on whether the
graph has point styles turned on, and whether it is a waveform
graph, an XY graph, or a bar plot. This is because of the number of
optimizations that are possible for the different types of plots
and the ones that are actually implemented. The waveform graph
with just lines is quite optimized, though.

Greg McKaskle
Message 6 of 7