Thank you for the suggestions! I tried both Kay's and Paul's suggestions on my VI, but they don't seem to work: the total run time is still the sum of the run time plus the loop time. I have attached the VI with Paul's suggestion. If I apply my own idea to fix this, the elapsed time count no longer matches the run time value, which I find a little odd. Is this a good fix for the problem?
I also have another question. The time elapsed count seems a little off; see the attached Front Panel picture. Is there a way to improve the count?
To answer mcduff's question -- the reason I am not using a finite set of data points is that I'd like the user to have the option to run the code continuously or to specify a run time.
Thanks for giving me "another chance to teach". Please forgive the somewhat-rambling nature of this Post -- I'm putting in the effort because I sense you are interested in becoming better (and maybe even proficient) in LabVIEW development, but you need some guidance.
I'm attaching a "partly-cleaned-up" version of your code, with lots of comments, in no particular order. I also know why your "timing" is off, and (more important) "how to fix it". That will come later. First:
Now, let's talk about Time, a key concept in LabVIEW (which deals with timed events like DAQ, and with Time as a primitive Data Type, the Timestamp). Here is a Principle -- DAQ Hardware Is the Best Clock! If you really want to time something and you have DAQ functions, use the DAQ; don't depend on the CPU clock! You appear to be doing all of your sampling at once, taking 10,000 samples a second for 10 seconds from 4 channels -- I certainly hope your cDAQ device has a 400,000-sample buffer! Assuming that it does, the DAQ Read is guaranteed to take 10 seconds, at which point it transfers all of the data to the Array of Waveforms coming out of the DAQ Read function. Until this happens, almost nothing else in the Loop can run (the Principle of Data Flow says it can't run until there is "Data on the Wire"). Then everything else in the loop needs to run before the loop can finish, which takes a little extra time (and explains why your Elapsed Time is > 10 seconds).
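The buffer arithmetic above is worth checking explicitly. A quick sketch (plain Python as a stand-in for the LabVIEW wiring; the rate, duration, and channel count are the ones from the post):

```python
# Buffer arithmetic from the post: 4 channels, 10 kS/s per channel, 10 s per DAQ Read.
sample_rate = 10_000      # samples per second, per channel
channels = 4
read_duration = 10        # seconds each DAQ Read call blocks for

samples_per_read = sample_rate * read_duration * channels
print(samples_per_read)   # 400000 -- the buffer the cDAQ device must hold per read
```

This is also why the Read itself is the 10-second "clock": it cannot return until all 400,000 samples exist.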
How can you fix this? Simple: don't do all that processing before ending the While Loop! "But I have to process the data!", you say. Yes, but not in that loop. The Principle of Data Flow also means that LabVIEW can run two loops at the same time, in something called the Producer/Consumer Design Pattern (there are LabVIEW Examples and Templates illustrating this). Here's how it works:
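LabVIEW's Producer/Consumer pattern is wired graphically with Queue functions, but the idea translates directly to text languages. Here is a minimal sketch in Python (the simulated "read" and "processing" are illustrative stand-ins, not LabVIEW APIs): the producer loop only acquires and enqueues data, while the consumer loop does the heavy processing in parallel, so processing time never delays the next acquisition.

```python
import queue
import threading

data_q = queue.Queue()

def producer(n_reads):
    """Acquisition loop: only read and enqueue -- no processing here,
    so each iteration takes just the (simulated) DAQ Read time."""
    for i in range(n_reads):
        samples = [i] * 4            # stand-in for one DAQmx Read of 4 channels
        data_q.put(samples)
    data_q.put(None)                 # sentinel: tell the consumer to shut down

def consumer(results):
    """Processing loop: FFTs, file writes, etc. happen here, in parallel,
    at whatever pace they need, without blocking the producer."""
    while True:
        samples = data_q.get()
        if samples is None:
            break
        results.append(sum(samples)) # stand-in for the heavy processing

results = []
t_prod = threading.Thread(target=producer, args=(5,))
t_cons = threading.Thread(target=consumer, args=(results,))
t_prod.start(); t_cons.start()
t_prod.join(); t_cons.join()
print(results)                       # [0, 4, 8, 12, 16]
```

The sentinel (`None` here; in LabVIEW, often releasing the Queue) is how the producer tells the consumer to stop -- the same question you'll face when wiring your Stop button.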
I'm attaching my revised version of your code. When you rewrite your DAQ loop and transform it into a Producer/Consumer design, try to copy the "style": compactness, less white space, straighter (and shorter) wires, and a level Error Line ("The Error Line Runs Through It").
Post your New and Improved Code. Does it work (much) better?
Thank you for the thorough elaboration, Bob. I will take some time to study the Producer/Consumer design and post the improved code.
I have modified my VI using your suggestion. A few questions have come up:
1) When the VI is running, the FFT Graph and the corresponding data values work fine. Yet when I press the stop button / when the VI stops, the graph is cleared and the data values are not properly shown; see the attached 123 picture.
2) I set the run time to 10 sec, yet the time elapsed indicator only shows 9 sec. I believe the loop starts counting at 0, which is why it gives me 9. Is it necessary to have this fixed?
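(The off-by-one in point 2 is the classic zero-based loop index; a quick illustration in Python, whose `range` is zero-based like a While Loop's iteration terminal:)

```python
# A loop that runs 10 times yields indices 0..9, so wiring the iteration
# count straight to an "elapsed" indicator shows 9 on the final pass.
iterations = list(range(10))
print(iterations[0], iterations[-1])   # 0 9
print(len(iterations))                 # 10 -- add 1 to the index to display the elapsed count
```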
3) I got rid of the Get Time function because, even with the producer/consumer design pattern, the timestamp count will still not be an integer; I replaced it with the loop count as the timestamp. Is this what you referred to as the "DAQ Hardware clock"?
4) In my previous version, by changing the "timeout" of the DAQmx Read to, say, 5 sec, the FFT graph would produce an FFT for that 5 sec. However, when I use the producer/consumer design pattern, I notice that a 5 sec "timeout" on the DAQmx Read multiplies the total run time. For example, with "Run Time" set to 10 sec and "timeout" set to 5 sec, the total run time becomes 50 sec instead of the 10 sec specified (while the time elapsed indicator still remains at 9 sec). How can I fix this issue?