
Time-loop at 1 MHz

Solved!

Hello everybody,

 

After searching for existing posts on the subject, I found a lot of interesting information, but nothing that directly answers my issue.

 

 

I have LabVIEW with the Real-Time module.

 

So far I have succeeded in making a Timed Loop at 1 kHz, but in the timing source selector the 1 MHz option is greyed out, so I can't choose it, even with the Real-Time module installed.

 

Questions:

 

- I heard that MS Windows is limited to approximately 55 ms of timer accuracy. Is that true? I don't really understand, because in that case, why would NI offer a 1 MHz option if Windows cannot use it?

 

- Secondly, how can I get the 1 MHz timing source for my Timed Loop? Is there some procedure with the Real-Time module to enable it?

 

 

Thanks in advance for your help.

 

Vivien

Message 1 of 12

Are you running this on a real-time operating system or in Windows?

You mention the Real-Time module, but then proceed to talk about MS Windows.  The 1 MHz clock is available only if your code runs on a real-time target.  In Windows, only the 1 kHz clock is available.

 

The accuracy of the clock in Windows is about 16 ms, not 55 ms.  (Maybe very old versions of Windows were 55 ms.)  In Windows, a loop only runs as fast as the OS allows.  A normal while loop can iterate faster than 1 MHz if the OS allows it.  The problem with Windows is that there is no determinism in how fast it will run loops.  It will run super fast at times, then steal CPU cycles when it decides to do something it considers more important, such as a virus scan or reindexing a hard drive.
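LabVIEW code is graphical, so it can't be quoted here, but the nondeterminism described above is easy to demonstrate in any language on a desktop OS. The following Python sketch (the function name is just illustrative) times a tight loop and reports the fastest and slowest iteration periods; on Windows or any general-purpose OS, the slowest is typically orders of magnitude longer than the fastest because the scheduler can preempt the loop at any moment:

```python
import time

def measure_loop_jitter(iterations=100_000):
    """Time a tight software loop and return (min, max) iteration period.

    On a desktop OS the max is typically much larger than the min,
    because the scheduler can preempt the loop at any time.
    """
    periods = []
    last = time.perf_counter()
    for _ in range(iterations):
        now = time.perf_counter()
        periods.append(now - last)  # elapsed time since previous iteration
        last = now
    return min(periods), max(periods)

fastest, slowest = measure_loop_jitter()
print(f"fastest iteration: {fastest * 1e6:.2f} us, slowest: {slowest * 1e6:.2f} us")
```

A real-time OS bounds that worst-case gap; a desktop OS makes no such guarantee, which is why the 1 MHz timing source only exists on real-time targets.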

 

What are you really trying to do with your code?  If you need a reliable 1 MHz clock, then you need to use a real-time OS.  If you just want something to run fast, without any determinism, under Windows, then use a regular while loop with no timing functions.

Message 2 of 12

Thank you for these precisions.

 

My aim is to take some measurements through DAQmx.

 

I measure an intensity as a function of the displacement of a motor, so in order to get as many points as possible, I need my acquisition to run as fast as possible. Right now it takes around 15-20 ms per point. If I can get something faster, that would be better, since my motor is already stepping at the lowest speed it can achieve.

In the end I need to know precisely at what time each measurement was taken, in order to determine the position of my motor, since I know its speed (assuming the speed is really precise).
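The time-to-position mapping described here doesn't actually need a fast loop: once the sample rate is known, each sample's timestamp is just its index divided by the rate, and position follows from the (assumed constant) motor speed. A hedged Python sketch, with illustrative names and made-up units:

```python
def sample_positions(n_samples, sample_rate_hz, motor_speed_mm_s, start_mm=0.0):
    """Map each DAQ sample index to a motor position.

    Assumes the motor moves at a constant, known speed and that
    sampling starts exactly when the motor starts moving.
    """
    positions = []
    for i in range(n_samples):
        t = i / sample_rate_hz                        # seconds since acquisition start
        positions.append(start_mm + motor_speed_mm_s * t)
    return positions

# e.g. at 48 kHz sampling and 1 mm/s, sample 48000 corresponds to t = 1 s, x = 1 mm
```

So the hardware sample clock, not the software loop, provides the precise timing.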

 

So for now, I am running LabVIEW on MS Windows.

But I don't understand what a "real-time OS" is. I know macOS, MS Windows, Linux... is it that kind of thing?

 

How can I proceed to use a real-time OS?

 

 

Thanks.

 

Message 3 of 12
Why do you think you need a real-time OS for DAQmx? The time to acquire is a function of the sample rate and the number of requested samples: just simple arithmetic. What device are you using?
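The "simple arithmetic" mentioned above can be written out in one line. A minimal Python sketch (function name is illustrative):

```python
def acquisition_time_s(n_samples, sample_rate_hz):
    """Time to acquire one finite block of samples: samples / rate.

    The DAQ hardware clocks the samples itself; no software loop has
    to run at the sample rate for this timing to hold.
    """
    return n_samples / sample_rate_hz

# e.g. 4800 samples at 48 kHz take 0.1 s, regardless of how fast any loop spins
```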
Message 4 of 12

I use an old NI USB 6009 Altera DEO.

 

The sample rate is 48 kHz.

 

To make the issue clearer, here is a short explanation:

 

- A motor moves, and the signal on a photodiode changes accordingly.

I digitize the photodiode signal through the NI USB 6009 at a sample rate of 48 kHz with N samples, in order to re-create the analog signal in LabVIEW.

 

Then I measure the RMS value, among other things.

 

I write the values into a spreadsheet in order to plot the curve: the variation of my photodiode's signal as a function of time (and, since I know the velocity of my motor, the photodiode intensity as a function of the step motor's position).
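The per-block analysis described here (RMS, then append a row to a spreadsheet file) is straightforward to express in text form. A hedged Python sketch of the same two steps, with illustrative names; in LabVIEW these would be the RMS analysis VI and a Write to Spreadsheet/Write Delimited File node:

```python
import csv
import math

def block_rms(samples):
    """RMS of one acquired block of photodiode samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def append_measurement(path, t_seconds, rms):
    """Append one (time, RMS) row to a spreadsheet-readable CSV file."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([t_seconds, rms])
```

Each acquired block then yields one (timestamp, RMS) row, and the whole curve can be plotted from the file afterwards.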

 

Maybe there is another way to do this, but until now I have done what you can see in the attached code.

 

 

Thanks.

Message 5 of 12
Solution
Accepted by topic author nobaka

What you've shown doesn't indicate any need for a while loop running at 1 MHz.  Set your DAQ Assistant to continuous samples, and collect a large number, or all that are available.  Analyze the data and append it to the file.

 

Some tips.

 

Merge Signals is expandable by dragging its border down, so you can merge all the signals in one step rather than needing multiple Merge Signals nodes.

 

By using the concatenating tunnels, you risk rapidly growing arrays that will fill up your memory.

 

You'd be better off using a producer/consumer architecture: acquire the data in one loop, then pass it to another loop for analysis and writing to a file.
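In LabVIEW, producer/consumer means two parallel loops joined by a queue: the acquisition loop only reads the DAQ and enqueues blocks, while the second loop dequeues, analyzes, and logs them, so slow file writes never stall the acquisition. A minimal Python analogy of the same pattern (names and the averaging "analysis" are stand-ins):

```python
import queue
import threading

def producer_consumer_demo(blocks):
    """Two threads joined by a queue, analogous to two LabVIEW loops.

    The producer stands in for the DAQ read loop; the consumer stands
    in for the analysis/logging loop.
    """
    q = queue.Queue()
    results = []

    def producer():
        for block in blocks:      # each block = one chunk of acquired samples
            q.put(block)
        q.put(None)               # sentinel: acquisition finished

    def consumer():
        while True:
            block = q.get()
            if block is None:     # sentinel seen: stop consuming
                break
            results.append(sum(block) / len(block))  # analysis stand-in

    t1 = threading.Thread(target=producer)
    t2 = threading.Thread(target=consumer)
    t1.start(); t2.start()
    t1.join(); t2.join()
    return results
```

The LabVIEW equivalents of `q.put`/`q.get` are Enqueue Element and Dequeue Element, with Release Queue (or a sentinel element, as here) to shut the consumer down.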

Message 6 of 12

Sorry, but I don't understand what you are saying, or more precisely, I don't see how to apply it.

 

 

I understand the point about merging all the signals; I did that.

 

My experiment takes 10 s at most, so hopefully my computer can gather a lot of data during that time.

 

What kind of loop do you recommend for the DAQmx part and the writing step?

 

Message 7 of 12
There is nothing that stops the loop after 10 seconds, and you are only writing the very last iteration to the file. A regular loop is fine if you really limit it to 10 seconds, but get rid of that silly conversion to 1D arrays, as I mentioned in the other thread you started. Auto-index the output.
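Auto-indexing a loop's output tunnel in LabVIEW accumulates every iteration's value into an array, instead of keeping only the last value the way a plain tunnel does. The difference can be sketched in Python (the squaring is just a stand-in for one acquisition/analysis step):

```python
def run_iterations(n):
    """Collect every iteration's result, as an auto-indexed tunnel does."""
    results = []
    for i in range(n):
        results.append(i * i)   # stand-in for one iteration's measurement
    return results              # all n results, written to file once at the end

def run_iterations_last_only(n):
    """Keep only the final value, as a plain (non-indexing) tunnel does."""
    last = None
    for i in range(n):
        last = i * i
    return last                 # only the last iteration survives
```

Writing the whole array to the spreadsheet once, after the loop, also avoids reopening the file on every iteration.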
Message 8 of 12

Hi,

 

I tried to do as you suggested.

 

You can see the simple code attached.

 

But it doesn't work: only the final value is written into the spreadsheet.

 

Between the two loops, I set the tunnels to "indexing".

 

What am I missing?

Message 9 of 12
You basically missed everything. You should have kept the analysis in the acquisition loop. Right-click the exit tunnel when you move it and look at the options. Read the help, and please take one or more of the free tutorials.
Message 10 of 12