LabVIEW


Priority when minimizing/maximizing windows

Alright, I've got a fun little problem with a quick little application I wrote last week.

 

Short version:

Using a USB-6251 BNC, I'm throwing out an excitation signal at 2.8 MS/s and reading back from a laser vibrometer at 1.25 MS/s.  The customer specified using the device to its full capability (sampling rate, specifically), even though most excitations are under 5 kHz.

 

Since the excitation can be an arbitrary length, I'm breaking up the writes and generating the samples on the fly in 100 kS chunks.  This works fine for any period of time that I've tested (up to 2 minutes).  The acquisition side is currently artificially limited to 2 seconds to prevent out-of-memory problems from allocating a gargantuan array, and it is working with no issues.
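For anyone curious what chunked on-the-fly generation looks like, here's a minimal sketch in Python (the function names and the 5 kHz test tone are mine, not from the original program). The key detail is deriving each chunk's phase from the absolute sample index, so consecutive 100 kS chunks join without a discontinuity:

```python
import numpy as np

SAMPLE_RATE = 2_800_000   # 2.8 MS/s output rate from the post
CHUNK_SAMPLES = 100_000   # 100 kS chunks

def chunk_starts(total_samples, chunk=CHUNK_SAMPLES):
    """Yield (start, count) pairs covering an arbitrary-length run."""
    for start in range(0, total_samples, chunk):
        yield start, min(chunk, total_samples - start)

def make_chunk(start, count, freq_hz=5_000.0, rate=SAMPLE_RATE):
    """Generate one chunk of the excitation on the fly.

    Phase is computed from the absolute sample index, so chunk
    boundaries are seamless when the chunks are written back to back.
    """
    n = np.arange(start, start + count)
    return np.sin(2.0 * np.pi * freq_hz * n / rate)
```

Concatenating every chunk reproduces exactly what a single monolithic generation would have produced, without ever allocating the full array.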

 

However, if you minimize or maximize any window on the system, there's enough of a glitch to cause a buffer underflow on the output.  You can actually launch Excel from scratch, open a file in it, and work on it, and the output will chug right along as long as you don't minimize, maximize, or restore it from the minimized state.  So far, the only things I've tried are raising the process priority to 'Above Normal' using the kernel32.dll trick from here, and raising the priority (internal to the program) of the output VI.
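For reference, the 'Above Normal' trick boils down to a single kernel32 call. A hedged Python/ctypes sketch of the same idea (the wrapper function name is mine; the Win32 constants and calls are the standard ones):

```python
import sys

def raise_process_priority():
    """Request 'Above Normal' priority for the current process (Windows only).

    Same kernel32.dll call made via the Call Library Function Node in
    LabVIEW; returns True on success, False on non-Windows platforms.
    """
    if sys.platform != "win32":
        return False
    import ctypes
    ABOVE_NORMAL_PRIORITY_CLASS = 0x00008000
    kernel32 = ctypes.windll.kernel32
    handle = kernel32.GetCurrentProcess()  # pseudo-handle, no cleanup needed
    return bool(kernel32.SetPriorityClass(handle, ABOVE_NORMAL_PRIORITY_CLASS))
```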

 

A little more about the architecture of the 'on-the-fly' generation: I'm spawning off a VI in parallel that begins the generation and puts the chunks into a queue (currently size-limited to 4 elements) until it reaches the trial run time, at which point it exits.  My writeSamples VI consumes these elements until the queue is empty, then stops.
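That architecture is the classic producer/consumer pattern; a minimal sketch of it, assuming a sentinel value to signal end-of-generation (the original stops on an empty queue instead, which can race if the producer falls behind):

```python
import queue
import threading
import numpy as np

CHUNK_SAMPLES = 100_000
QUEUE_DEPTH = 4          # size-limited queue from the post

def producer(q, n_chunks):
    """Generate chunks up front of the consumer, blocking when the queue is full."""
    for _ in range(n_chunks):
        data = np.zeros(CHUNK_SAMPLES)   # stand-in for the real waveform math
        q.put(data)                      # blocks once QUEUE_DEPTH is reached
    q.put(None)                          # sentinel: generation finished

def consumer(q, write_samples):
    """Drain the queue into the output write until the sentinel arrives."""
    while True:
        chunk = q.get()
        if chunk is None:
            break
        write_samples(chunk)             # e.g. one DAQmx Write per chunk
```

Usage: run `producer` in its own thread while `consumer` feeds the hardware write; the bounded queue caps memory while keeping a few chunks of output pre-generated.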

 

Anyway, I thought I'd put this out there to see if anyone has come across something like this before.  I suppose I could increase my 'chunk' size to 500 kS or something like that.  Thinking about it, my generation loop is not running at a higher internal priority level; I should fix that as well.

Message 1 of 4

My colleague reported the same problem to me this morning, so you are not alone! For him the problem seems to be associated with one particular PC.

 

I will get him to post a solution if he comes up with one. In the meantime if you discover a solution we will be interested to know what it is.

 

Good luck!

David
www.controlsoftwaresolutions.com
Message 2 of 4

Apparently, minimizing and maximizing windows is very processor-intensive in short bursts.  I wound up having to make my producer queue a bit deeper (10 elements vs. 4).  I also increased the priority of the producer loop, which may have helped.
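Deepening the queue works because a full queue is exactly the amount of pre-generated output the card can burn through while the OS stalls the producer. With the rates from this thread, the arithmetic looks like:

```python
OUTPUT_RATE = 2_800_000   # S/s, from the original post
CHUNK = 100_000           # samples per queue element

def stall_headroom_ms(depth, chunk=CHUNK, rate=OUTPUT_RATE):
    """Worst-case OS stall (ms) a full queue can absorb before underflow."""
    return 1000.0 * depth * chunk / rate

# 4 elements buys ~143 ms of buffered output; 10 elements buys ~357 ms
```

So going from 4 to 10 elements more than doubles the tolerance to a window-animation hiccup, at the cost of six more 100 kS arrays in memory.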

 

I did come across an interesting little bug, though.  Initially, I was programmatically changing the priority of the producer VI with a property node.  This worked fine during debugging, but caused a lockup in the compiled version.  I removed the programmatic change and relied on the manual VI setting instead.  Whether it actually takes effect, I'm not sure, but the program is working now, so my curiosity wanes ...  :smileyhappy:

Message 3 of 4

I know this is an old topic, but I had this problem last week on a very powerful PC.  My time-sensitive tasks were having issues due to simply minimizing and maximizing windows.

 

The solution for me was found in:

 

Control Panel
Performance Information and Tools
Adjust Visual effects

 

To fix:

Uncheck "Let Windows choose what's best for my computer"
Check "Adjust for best performance"


Or you can manually turn options on and off as you choose; the issue here was "Animate windows when minimizing and maximizing".
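If you'd rather not change this by hand on every deployment machine, the same animation setting can be toggled programmatically. A sketch, assuming the Win32 SystemParametersInfo call with SPI_SETANIMATION (the wrapper name is mine; Windows only):

```python
import sys

def set_minmax_animation(enabled):
    """Toggle the minimize/maximize window animation (Windows only).

    Programmatic equivalent of the "Animate windows when minimizing
    and maximizing" checkbox; returns True on success, False on
    non-Windows platforms.
    """
    if sys.platform != "win32":
        return False
    import ctypes

    class ANIMATIONINFO(ctypes.Structure):
        _fields_ = [("cbSize", ctypes.c_uint),
                    ("iMinAnimate", ctypes.c_int)]

    SPI_SETANIMATION = 0x0049
    info = ANIMATIONINFO(ctypes.sizeof(ANIMATIONINFO), 1 if enabled else 0)
    return bool(ctypes.windll.user32.SystemParametersInfoW(
        SPI_SETANIMATION, info.cbSize, ctypes.byref(info), 0))
```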

Message 4 of 4