LabVIEW

cDAQ High CPU Use

Solution
Accepted by ashesman1

I don't know if this applies to Linux and cDAQ, but I remember several cases where overlapping front panel objects and transparency caused a lot of redraws that hogged the CPU. Could this be the case here? Basically, each redraw set a dirty bit, which triggered yet another redraw.
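
If it helps to picture the feedback loop, here's a tiny toy model of that dirty-bit behaviour (plain Python, purely illustrative; it's not how LabVIEW's drawing code actually works) where two transparent, overlapping items keep invalidating each other, so the panel never goes idle:

```python
# Toy model of the dirty-bit feedback: a transparent item can't be drawn
# on its own, so drawing it forces whatever it overlaps to be redrawn too,
# which marks that item dirty again -- and the panel never settles.

class Item:
    def __init__(self, transparent, overlaps):
        self.transparent = transparent
        self.overlaps = overlaps      # names of the items it overlaps
        self.dirty = True             # needs a redraw

def redraw_cycle(items):
    """One UI update pass; returns how many redraws it performed."""
    redraws = 0
    for item in items.values():
        if item.dirty:
            item.dirty = False
            redraws += 1
            if item.transparent:
                for name in item.overlaps:
                    items[name].dirty = True   # invalidate what's underneath
    return redraws

items = {
    "string_ind":  Item(transparent=True, overlaps=["numeric_ind"]),
    "numeric_ind": Item(transparent=True, overlaps=["string_ind"]),
}

for cycle in range(5):
    print(f"cycle {cycle}: {redraw_cycle(items)} redraws")
# With transparent=False the count drops to 0 after the first cycle;
# with both transparent it stays at 2 forever, i.e. constant CPU load.
```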

/Y

 

G# - Award winning reference based OOP for LV, for free! - Qestit VIPM GitHub

Qestit Systems
Certified LabVIEW Developer
0 Kudos
Message 11 of 18
(1,364 Views)

You could be on to something... I definitely think there is something going on with controls or indicators; I just can't put my finger on it exactly. I have been on leave for a few weeks, so I will get back into it and see if I can narrow it down some more.

0 Kudos
Message 12 of 18
(1,339 Views)

The problem was two indicators that used transparency!!! Replacing them with standard indicators (one string, one numeric) dropped CPU use by 110% of the 200% available across both cores.

Message 13 of 18
(1,332 Views)

They might not need replacing, but having them both transparent and overlapping seems to be the cause. I'm glad you got it working!

/Y

G# - Award winning reference based OOP for LV, for free! - Qestit VIPM GitHub

Qestit Systems
Certified LabVIEW Developer
0 Kudos
Message 14 of 18
(1,325 Views)

I'm a bit interested in this because I'm also working with the same controller.  Is this with the embedded UI enabled, or is high CPU usage seen even when the embedded UI is disabled?

0 Kudos
Message 15 of 18
(1,300 Views)

Using the embedded UI, with a 1920x1080 screen plugged into the cDAQ-9132's display port. I am now down to a typical combined CPU use of 60-70%, which is not bad. The OS on its own seems to use about 40-50%. When we use the IP camera, it maxes the CPU out if you try to use a high resolution (1024x768) or frame rate (20 Hz). NI support have confirmed that this is just the way it is!
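
In case it's useful, this is roughly how I've been sampling it over SSH: a quick throwaway Python sketch (my own script, nothing from NI) that diffs /proc/stat, reported on the same 200%-for-two-cores scale as the numbers above:

```python
# Quick throwaway script: sample overall CPU use by diffing /proc/stat,
# the same counters top reads on NI Linux Real-Time (or any Linux box).
import time

def cpu_times():
    with open("/proc/stat") as f:
        vals = [int(v) for v in f.readline().split()[1:]]   # aggregate "cpu" line
    idle = vals[3] + vals[4]          # idle + iowait
    return idle, sum(vals)

idle1, total1 = cpu_times()
time.sleep(2.0)
idle2, total2 = cpu_times()

busy = 1.0 - (idle2 - idle1) / (total2 - total1)
# Report on the per-core scale used above (dual-core Atom => 200% max).
print(f"combined CPU use: {busy * 2 * 100:.0f}% of 200%")
```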

0 Kudos
Message 16 of 18
(1,299 Views)

Thanks for the info; those all seem like higher values than I'd expect. Sure, these are just Atom-based processors, and I can't predict what they would do with a camera at those rates and resolutions without any graphics processor for encoding. But for just the OS, I'm looking at between 10% and 20% with the embedded UI disabled. With it enabled, and on a relatively low-resolution monitor, I'm probably between 20% and 30% running no code. Maybe driving a 1080p monitor really is a big effort for the CPU.
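
If you want to see where that baseline actually goes (the X server vs. the LabVIEW runtime vs. everything else), something like this quick-and-dirty Python sketch works on any stock /proc layout; it's just my own hack, not an NI tool:

```python
# Quick-and-dirty: see which processes the baseline goes to (e.g. the
# X server vs. the LabVIEW runtime). Assumes a stock /proc layout.
import os, time

def proc_ticks():
    ticks = {}
    for pid in filter(str.isdigit, os.listdir("/proc")):
        try:
            with open(f"/proc/{pid}/stat") as f:
                raw = f.read()
        except OSError:
            continue                       # process exited mid-scan
        name = raw[raw.index("(") + 1:raw.rindex(")")]
        fields = raw[raw.rindex(")") + 2:].split()
        ticks[(pid, name)] = int(fields[11]) + int(fields[12])   # utime + stime
    return ticks

before = proc_ticks()
time.sleep(2.0)
after = proc_ticks()
hz = os.sysconf("SC_CLK_TCK")

usage = sorted((after[k] - before.get(k, 0), k[1]) for k in after)
for delta, name in usage[-5:]:             # five busiest processes
    print(f"{name:20s} {delta / hz / 2.0 * 100:5.1f}%")
```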

0 Kudos
Message 17 of 18
(1,289 Views)

What happens if you lower the color depth to 16 or 8 bit on the desktop?

/Y

G# - Award winning reference based OOP for LV, for free! - Qestit VIPM GitHub

Qestit Systems
Certified LabVIEW Developer
0 Kudos
Message 18 of 18
(1,282 Views)