
Reducing Windows Background Processes

I'm in charge of my company's data acquisition system and we've finally got everything running about how we want it.  The last issue we're having is that whenever the screen updates, the program slows down measurably.  I've got the program running in a loop that collects data, processes data, waits for user input, sends out commands to other hardware, and then repeats.  The program is configured to only display one data point (64 channels, plus 5 on a chart and three other numeric gauge indicators) out of every 100 collected.  The effect is that while the program is not displaying data the loop executes in about 2-3 ms, but that time jumps to ~60 ms when it has to update the front panel.


My question is: how can I improve the performance of my code on a Windows XP system?

Is there some kind of indicator that I should use over another kind?

Does anyone have a guide on how to disable Windows background processes (the computer the DAQ system is on isn't running much, but I'm not a very advanced user in that area)?

Any other general tips and tricks for this kind of problem?


I know the best solution would be to move to a real-time operating system, but that just isn't an option right now.  Any suggestions on how to improve performance on the system I've got would be appreciated.

0 Kudos
Message 1 of 10

How exactly have you designed your application? Have you used the Producer/Consumer design pattern and taken advantage of buffering and multi-threading?
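LabVIEW code is graphical, so it can't be pasted here as text, but the Producer/Consumer idea mentioned above can be sketched in Python: one thread acquires scans as fast as the hardware allows and pushes them into a queue, while a separate thread drains the queue and does the slow work (logging, display). The `fake_scan` function and the counts are illustrative stand-ins, not part of any real DAQ API.

```python
import queue
import threading

def producer(data_source, q, n_scans):
    """Acquire scans as fast as possible and hand them off without blocking."""
    for _ in range(n_scans):
        q.put(data_source())     # enqueue one scan of all 64 channels
    q.put(None)                  # sentinel: acquisition finished

def consumer(q, results):
    """Process scans at its own pace; the producer never waits on the UI."""
    while True:
        scan = q.get()
        if scan is None:
            break
        results.append(scan)     # stand-in for logging / display work

# Fake acquisition source: returns a constant 64-channel scan (demo only).
fake_scan = lambda: [0.0] * 64

q = queue.Queue()
results = []
t_prod = threading.Thread(target=producer, args=(fake_scan, q, 500))
t_cons = threading.Thread(target=consumer, args=(q, results))
t_prod.start(); t_cons.start()
t_prod.join(); t_cons.join()
print(len(results))  # all 500 scans made it through the queue
```

The key property is that a slow consumer only grows the queue; it never stalls the acquisition loop itself.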


A real-time OS won't solve the issue. You don't seem to need any determinism in your application, do you?

Adnan Zafar
Certified LabVIEW Architect
Coleman Technologies
0 Kudos
Message 2 of 10

Hi cm_opi,


Can you please elaborate on your data acquisition setup? What kind of signals are you acquiring (analog/digital)? What is your sampling rate? How are you architecting your application to only display one data point out of every 100 collected? Are the five data points on the chart and the three on the numeric gauge indicators included in the 64 channels you mention? How are you benchmarking your loop speed to come up with the 2-3ms and ~60ms numbers? What version of LabVIEW are you running this on?


Thank you for choosing National Instruments.


Aaron Pena

National Instruments

Applications Engineer

0 Kudos
Message 3 of 10

Some clarifications, as requested.  We're acquiring 32 analog channels and 32 thermocouple channels.  The gauges and chart indicators are just a way of displaying a subset of the data in a more obvious form.  All channels are also displayed in numeric indicators.  The program is set up to run a loop with the following DAQ scheme: calculate a timestamp (relative to the start of the program), acquire a single scan of the analog channels, acquire a single scan of the thermocouple channels, and then do other data processing tasks.  There's no set data rate; we're trying to acquire the data as fast as the program will allow, so the 2-3 ms loop times are what we see in the output data file.  One data point will occur at time 0.003, and the next would get the timestamp 0.006.

Every time the screen is updated, in the post-processing part, the time to execute the whole loop increases dramatically.  In the data file we'd see a point recorded at time 0.006, and then the next one would be at time 0.066 (something to that effect).  Since screen updating seems to be the cause of the slowdown I'm interested in what I can do to get that part to run as fast as possible.


We're running LabVIEW 7.1, and to get the screen to only update occasionally I have it set to keep a counter going that is incremented each time through the loop.  If the counter gets to the value of 100, then in post-processing the current batch of data is sent to the front panel indicators and the counter is reset to 0.  For all other counter values the data just goes into the output file and the screen indicators are left alone.
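The counter-and-reset scheme described above can be sketched in Python (LabVIEW itself is graphical, so this is only a conceptual model; the function and parameter names are invented for the demo):

```python
def run_loop(scans, update_every=100):
    """Decimate front-panel updates: log every iteration, but refresh the
    indicators only on every `update_every`-th iteration."""
    logged, displayed = [], []
    counter = 0
    for scan in scans:
        logged.append(scan)          # every scan goes to the output file
        counter += 1
        if counter == update_every:  # the poster's counter-and-reset scheme
            displayed.append(scan)   # only this scan touches the indicators
            counter = 0
    return logged, displayed

logged, displayed = run_loop(range(1000), update_every=100)
print(len(logged), len(displayed))   # 1000 logged, 10 displayed
```

Note that even with this decimation, the one iteration that does update the display still pays the full UI cost inline, which is exactly the 60 ms spike being reported.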

0 Kudos
Message 4 of 10

Your strategy does not sound very good. If you want speed, you should not use single-scan acquisition. And what is the point of updating indicators every 3 ms? If you want samples with accurate timing, you should use continuous sampling and pull data from the buffer at a rate of around 50 times per second. Remember, the human eye cannot detect fast changes. If I am not wrong, the frame rate of a motion picture is 25 frames per second.

Besides which, my opinion is that Express VIs Carthage must be destroyed deleted
(Sorry no Labview "brag list" so far)
0 Kudos
Message 5 of 10

Coq is right on. The simplest solution would be to collect the samples into a shift register and only output them to the chart display once every half second or second, using modulo and case structures keyed to a certain number of loop iterations. Also, single-point acquisition means that everything in your loop has to happen before it goes and looks for another data point, including updating the graph on screen. It would be better to request a quarter or half second's worth of data at a time, still implementing the slower graph update method. Let us know how that works out.
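The chunked-read idea above can be sketched in Python (conceptual only; `read_chunk` is a hypothetical stand-in for a buffered driver read, not a real DAQ call). At ~300 Hz, a chunk of 75 scans is roughly a quarter second of data, and redrawing every 2 chunks gives roughly one chart update per half second:

```python
def chunked_acquire(read_chunk, total_scans, chunk_size, display_every_chunks):
    """Read `chunk_size` scans per driver call instead of one, and redraw
    the chart only every `display_every_chunks` chunks."""
    logged = []
    chunks_read = 0
    chart_updates = 0
    while len(logged) < total_scans:
        logged.extend(read_chunk(chunk_size))   # one buffered read = many scans
        chunks_read += 1
        if chunks_read % display_every_chunks == 0:
            chart_updates += 1                  # stand-in for one chart redraw
    return logged, chart_updates

# Hypothetical driver read: returns n identical 64-channel scans (demo only).
read = lambda n: [[0.0] * 64 for _ in range(n)]
logged, updates = chunked_acquire(read, total_scans=3000, chunk_size=75,
                                  display_every_chunks=2)
print(len(logged), updates)  # 3000 scans logged, 20 chart redraws
```

The point is that the per-call overhead (driver call, loop bookkeeping, UI check) is paid once per 75 scans instead of once per scan, while the hardware buffer preserves sample timing.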

[will work for kudos]
0 Kudos
Message 6 of 10
I'm not sure if I explained the data rate vs. display rate very well, so let me try again.  When the screen is not updating, the program trucks along at around 300 Hz (3 ms loop execution time).  This is just great for our application (we're really looking for 100 Hz or better), so the single scan is working for us.  The problem comes in when that 100th loop iteration rolls around and we update the screen indicators.  This action delays the program by 60-70 ms (14-16 Hz).  We're not updating indicators with every data scan, and I intentionally made it so the indicators only update occasionally because otherwise the data was too difficult to view.  The user does have to see the data at some reasonable rate while the program is running, so I want to keep the screen updating in there.  I am interested in how to avoid/minimize that big jump in execution time whenever the indicators update.  Is there some way to configure LabVIEW/Windows so that action doesn't take as long?
0 Kudos
Message 7 of 10
And again I must say that your strategy is bad. If one of my colleagues had presented such an approach, I would immediately have taken him aside in a friendly way and told him that such an approach is very much "no can do."
Message Edited by Coq rouge on 06-22-2009 04:39 PM

Besides which, my opinion is that Express VIs Carthage must be destroyed deleted
(Sorry no Labview "brag list" so far)
0 Kudos
Message 8 of 10



What are you doing with the extra data that you are not displaying on your front panel? Do you need to display the data to the front panel immediately? If the display is not absolutely time sensitive, I would like to bring up something Adnan brought up earlier, have you tried using a Producer/Consumer architecture?


Thank you for choosing National Instruments.


Aaron Pena

National Instruments

Applications Engineer

0 Kudos
Message 9 of 10

Hi cm_opi,


We have been helping people with that type of issue for years and the best thing you can do is to post your code so we can review it.


If you do not want to post your code, then the best we can do is advise you, by playing twenty questions, on setting up your architecture so that the operations are streamlined and optimized.


So please post your code and let the volunteers take a look and give you solid suggestions.





Use continuous acquisition into queues read by the display and logging functions. The hardware will keep your acquisition going, and the display (maybe with deferred front-panel updates) and logging can cope with Windows.
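That split can be sketched in Python as one producer feeding two queues, where the logger keeps every scan and the display consumer decimates (LabVIEW queues work differently in detail; the numbers and names here are illustrative only):

```python
import queue
import threading

def acquire(n, qs):
    """Producer: push each scan into every consumer queue, never blocking on UI."""
    for i in range(n):
        for q in qs:
            q.put(i)             # i stands in for one scan's worth of data
    for q in qs:
        q.put(None)              # sentinel: acquisition finished

def drain(q, out, keep_every):
    """Consumer: the logger keeps everything; the display keeps a subset."""
    i = 0
    while True:
        item = q.get()
        if item is None:
            break
        if i % keep_every == 0:
            out.append(item)
        i += 1

log_q, disp_q = queue.Queue(), queue.Queue()
log_out, disp_out = [], []
threads = [
    threading.Thread(target=acquire, args=(300, [log_q, disp_q])),
    threading.Thread(target=drain, args=(log_q, log_out, 1)),     # log all
    threading.Thread(target=drain, args=(disp_q, disp_out, 100)), # show 1 in 100
]
for t in threads: t.start()
for t in threads: t.join()
print(len(log_out), len(disp_out))  # 300 logged, 3 displayed
```

Because the display consumer runs in its own thread, a 60 ms front-panel redraw delays only the display queue, not the acquisition or the logged timestamps.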

Retired Senior Automation Systems Architect with Data Science Automation | LabVIEW Champion | Knight of NI
Message 10 of 10