
Program overhead issue

Hi all,

 

I have a LabVIEW program that is used to acquire/read data from a FIFO board. The program is in LV 11.0.

 

I have a few questions :

 

  1. The program that reads data from the FIFO (via USB) takes a certain amount of time, and that time stays approximately the same no matter how little or how much data is stored on the FIFO. I think this is what is called software overhead delay.
  2. So I want to redesign the program to collect a certain number of samples, say 1, then collect another set of samples, this time 2, and so on. When the FIFO is full, I want it to pass all of the collected data to my computer.
  3. Is this possible? I could save a lot of time, right?

An analogy for the situation: no matter how much work I assign to the program, it takes the same amount of time to complete the task!
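The fixed-cost behaviour described above can be modelled numerically: each read call pays a fixed overhead regardless of how many samples it transfers, so fewer, larger reads amortize that overhead. The sketch below uses made-up placeholder numbers, not measurements from this hardware:

```python
# Simple timing model for USB FIFO reads: each read call pays a fixed
# overhead no matter how many samples it transfers.
# OVERHEAD_S and PER_SAMPLE_S are hypothetical placeholders.

OVERHEAD_S = 0.005      # assumed fixed cost per read call (seconds)
PER_SAMPLE_S = 1e-6     # assumed transfer time per sample (seconds)

def total_time(num_reads, samples_per_read):
    """Total time to move num_reads * samples_per_read samples."""
    return num_reads * (OVERHEAD_S + samples_per_read * PER_SAMPLE_S)

# Moving 100,000 samples in 100 small reads vs. 1 large read:
many_small = total_time(100, 1000)    # 100 * (0.005 + 0.001) = 0.6 s
one_large  = total_time(1, 100_000)   # 1 * (0.005 + 0.1)     = 0.105 s
print(many_small, one_large)
```

Under this (assumed) model, batching the transfer at the end of the FIFO fill, as described in point 2, is exactly where the savings come from.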

 

I seek advice on this.

 

Thank you.

Best

Abhi.

Abhilash S Nair

Research Assistant @ Photonic Devices and Systems lab

[ LabVIEW Professional Development System - Version 11.0 - 32-bit ]

LabVIEW Gear:
1. NI PXI-7951R & NI 5761
2. The Imaging Source USB 3.0 monochrome camera with trigger : DMK 23UM021

OPERATING SYSTEM - [ MS windows 7 Home Premium 64-bit SP-1 ]
CPU - [ Intel Core i7-2600 CPU @ 3.40 GHz ]
MEMORY - [ 16.0 GB RAM ]
GPU - [ NVIDIA GeForce GT 530 ]
Message 1 of 3

What are you doing to control your data flow (e.g. a state machine)? You can make a system that does multiple reads before processing the data. If you do this, I would also suggest using a producer/consumer architecture so that you do not have to worry about this kind of thing: one loop reads the data and the other processes it.
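A producer/consumer split like the one above can be sketched in text form (Python here, since a LabVIEW block diagram can't be pasted as text). The queue decouples the acquisition loop from the processing loop, just as a LabVIEW queue would between two parallel while loops; `fake_fifo_read` is a stand-in for the real FIFO read call:

```python
import queue
import threading

# Producer/consumer sketch: one loop acquires, one loop processes.
# fake_fifo_read() is a placeholder for the actual USB FIFO read.

data_q = queue.Queue()
STOP = object()  # sentinel telling the consumer to exit

def fake_fifo_read(n):
    """Stand-in for the hardware read: returns n dummy samples."""
    return list(range(n))

def producer(num_blocks, samples_per_block):
    # Acquisition loop: read blocks and enqueue them, nothing else.
    for _ in range(num_blocks):
        data_q.put(fake_fifo_read(samples_per_block))
    data_q.put(STOP)

def consumer(results):
    # Processing loop: dequeue blocks and process them (here: sum).
    while True:
        block = data_q.get()
        if block is STOP:
            break
        results.append(sum(block))

results = []
t1 = threading.Thread(target=producer, args=(5, 10))
t2 = threading.Thread(target=consumer, args=(results,))
t1.start(); t2.start()
t1.join(); t2.join()
print(results)
```

Because the acquisition loop never waits on processing, it can keep up with the hardware even when individual processing steps are slow.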

Tim
GHSP
Message 2 of 3

It will be nearly impossible to design a system that can do N times the amount of work in the same time it takes to do 1 unit of work. You can create a deterministic system that has a known time for each unit of work, but this is not necessarily possible in an OS like Windows if the time period is very short. Under Windows this can only be done if the time per unit is fairly long.

 

However, since each unit of work requires some amount of time to complete, doing 3, 5, or 10 times the number of units of work will take longer than a single execution.

 

There are a variety of ways you can maximize performance in a system. In LabVIEW it is fairly easy to create parallel tasks, which can significantly improve performance. In your specific case we would need to know more about your application in order to make any meaningful suggestions.



Mark Yedinak
Certified LabVIEW Architect
LabVIEW Champion

"Does anyone know where the love of God goes when the waves turn the minutes to hours?"
Wreck of the Edmund Fitzgerald - Gordon Lightfoot
Message 3 of 3