Real-Time Measurement and Control


Real time analog output update and synchronization with analog input.

Solved!

Hi

I am currently working on a C-based program to control a DAQ board. I am trying to update the analog DC output and read the analog input at the same time, every time the system calls the 'EveryNCallBack' function. However, the output is not updated, and after a while the task stops for some reason.

This is a simpler version of my project. In the full project, I need to do real-time data processing to detect a peak in the current signal, and as soon as the peak is detected I need to reverse the voltage polarity. I have a LabVIEW program that can do such a task, but the sampling frequency is limited because real-time data processing in LabVIEW is too slow.

Please find the C file attached.
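For reference, here is a minimal sketch of the pattern just described (not the attached file): one AI task and one AO task, with the AI block read and the AO value updated inside the DAQmx EveryN callback. The device names, rates, and the toggled output value are placeholders.

```c
#include <stdio.h>
#include <NIDAQmx.h>

#define RATE     100000.0   /* AI sample rate, S/s (placeholder) */
#define N_SAMPS  100        /* samples per callback (placeholder) */

static TaskHandle aoTask  = 0;
static float64    aoValue = 1.0;

static int32 CVICALLBACK EveryNCallback(TaskHandle aiTask, int32 eventType,
                                        uInt32 nSamples, void *callbackData)
{
    float64 data[N_SAMPS];
    int32   read = 0;

    /* Read the block of AI samples that triggered this event... */
    DAQmxReadAnalogF64(aiTask, N_SAMPS, 10.0, DAQmx_Val_GroupByChannel,
                       data, N_SAMPS, &read, NULL);

    /* ...then update the DC output (here it is simply toggled). */
    aoValue = -aoValue;
    DAQmxWriteAnalogScalarF64(aoTask, 1, 10.0, aoValue, NULL);

    return 0;   /* NI's C examples return 0 from the callback */
}

int main(void)
{
    TaskHandle aiTask = 0;

    /* Software-timed AO task for the DC output. */
    DAQmxCreateTask("", &aoTask);
    DAQmxCreateAOVoltageChan(aoTask, "Dev1/ao0", "", -10.0, 10.0,
                             DAQmx_Val_Volts, NULL);
    DAQmxStartTask(aoTask);

    /* Hardware-timed, continuous AI task with an EveryN callback. */
    DAQmxCreateTask("", &aiTask);
    DAQmxCreateAIVoltageChan(aiTask, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                             -10.0, 10.0, DAQmx_Val_Volts, NULL);
    DAQmxCfgSampClkTiming(aiTask, "", RATE, DAQmx_Val_Rising,
                          DAQmx_Val_ContSamps, N_SAMPS);
    DAQmxRegisterEveryNSamplesEvent(aiTask, DAQmx_Val_Acquired_Into_Buffer,
                                    N_SAMPS, 0, EveryNCallback, NULL);
    DAQmxStartTask(aiTask);

    printf("Running... press Enter to stop.\n");
    getchar();

    DAQmxStopTask(aiTask);
    DAQmxClearTask(aiTask);
    DAQmxClearTask(aoTask);
    return 0;
}
```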

Message 1 of 18

What hardware are you using?  Can you post your LabVIEW code?

Message 2 of 18

Hi:

Thanks for the quick reply.

I am using a PCIe-6321 data acquisition device. I am programming the data acquisition in ANSI C for the reason above. I have uploaded the code.

 

Message 3 of 18

A desktop PC is not a real-time device.  If you require deterministic real-time data processing, you might be using the wrong hardware for the job.  That said, that card shows 250 kS/s AI and 900 kS/s AO.  What loop period are you attempting to achieve, and have you benchmarked your process to see how long it actually takes?

 

Also, do you have native LabVIEW code? Or only the C?  I am unable to troubleshoot that source code.  Is your entire application written in C? Or are you calling your processing routine from a LabVIEW VI using a Call Library Function node?
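As a rough way to benchmark the processing on Windows, something along these lines would do (the high-resolution counter calls are standard Win32; `process_block` is just a hypothetical stand-in for whatever per-iteration work your application does):

```c
#include <stdio.h>
#include <windows.h>

/* Stand-in for the real per-block processing routine. */
static void process_block(double *data, int n)
{
    double sum = 0.0;
    for (int i = 0; i < n; ++i)
        sum += data[i];
    data[0] = sum;   /* keep the compiler from optimizing the loop away */
}

int main(void)
{
    enum { N = 100, ITERS = 1000 };
    double block[N] = {0};
    LARGE_INTEGER freq, t0, t1;

    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&t0);
    for (int i = 0; i < ITERS; ++i)
        process_block(block, N);
    QueryPerformanceCounter(&t1);

    double total_us = (double)(t1.QuadPart - t0.QuadPart) * 1e6
                      / (double)freq.QuadPart;
    printf("average per-iteration time: %.1f us\n", total_us / ITERS);
    return 0;
}
```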

Message 4 of 18

Hi 

I am trying to achieve a loop period of around 1 to 2 ms, say 100 samples per loop at 100 kHz. I have complete LabVIEW code for my full application (see attached; you can ignore the data processing part), but it can only achieve 50 samples per loop at 25 kHz. A higher sampling frequency stops the data acquisition because the data processing inside the loop cannot keep up. That's why I am trying to use C-based programming for faster data processing.

But is it possible to write C code specifically for the data processing and call it from the LabVIEW code, e.g. through a Call Library Function Node?
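To illustrate what I mean, I imagine something along these lines, exported from a DLL so LabVIEW could call it through a Call Library Function Node (the function name and the simple threshold logic are just placeholders):

```c
/* peak_detect.c -- hypothetical routine exported from a DLL for use
   with a Call Library Function Node.  Build with MSVC: cl /LD peak_detect.c */
#include <stdint.h>

/* Returns the index of the first sample exceeding `threshold`,
   or -1 if no such sample exists in the block. */
__declspec(dllexport) int32_t find_peak(const double *samples,
                                        int32_t numSamples,
                                        double threshold)
{
    for (int32_t i = 0; i < numSamples; ++i) {
        if (samples[i] > threshold)
            return i;
    }
    return -1;
}
```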

Message 5 of 18

The operating system is going to prevent you from getting accurate 1 ms loop rates; C programming won't help.

Message 6 of 18

Hi 

Could you please be more specific? What if I don't use the software clock?  

Message 7 of 18

If you are programming in a Windows environment, or any other non-real-time OS, it's up to the OS when the thread running your program actually gets worked on by the CPU and exactly when the I/O to your DAQ happens. Why would the OS guarantee 10 µs accuracy/jitter for someone working on a spreadsheet or watching YouTube videos? It makes no difference to that user. So while your program is expecting to go and fetch the next 100 samples, the OS says, "Nope! I need to go work on this thread running the minimized web browser, and this other one that checks for new Windows updates." At least that's my understanding of it.

Message 8 of 18

Hi 

That is a very good explanation. Thanks.

Message 9 of 18

Preemptive operating systems such as Windows do not provide the determinism that an embedded (real-time) system does, so even a timed loop will not execute deterministically on Windows. It will still execute with a higher priority than normal LabVIEW loops in an attempt to achieve the configured timing, but it is at the mercy of the operating system scheduler.

Also, be aware that there is a hardware limitation with the Windows clock. Tick counts are obtained from the system BIOS but only provide 1 ms resolution on Windows systems (so MHz software clocks cannot be implemented on non-real-time systems). Date/time functions (such as the Get Date/Time In Seconds VI) obtain their values from the Windows calendar, which is updated only at 60 Hz, i.e. every ~16.7 ms. Even though your desktop PC may have a CPU with a clock rate in the GHz range, you don't have access to the raw hardware clock to time code execution.

Since you are attempting to time your code at 1 ms, you are at the very upper end of what is possible on a non-real-time system. A timed loop in LabVIEW will try to achieve it, but it will almost certainly be preempted at some point, resulting in late or missed iterations.

 

The acquisition rates of your DAQ device are possible through the use of buffering, so you can get lossless data streaming at high rates (high throughput) by accepting processing overhead (increased latency).  The way to address this in your application is to similarly run your processing on buffered data sets, increasing the batch size until your batch processing can iterate quickly enough to process all of the incoming data and still have some CPU cycles left over to cede to other processes.
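A rough sketch of that idea with the DAQmx C API is shown below; the device name, rate, and batch size are placeholders to tune, and error checking is omitted for brevity:

```c
#include <stdio.h>
#include <NIDAQmx.h>

#define SAMPLE_RATE  100000.0   /* acquire at the full 100 kS/s */
#define BATCH_SIZE   10000      /* 100 ms of data per callback -- tune this */

static int32 CVICALLBACK EveryNCallback(TaskHandle taskHandle, int32 eventType,
                                        uInt32 nSamples, void *callbackData)
{
    static float64 data[BATCH_SIZE];
    int32 read = 0;

    /* Pull one full batch out of the driver buffer, then process it.
       A larger batch amortizes the per-block overhead at the cost of latency. */
    DAQmxReadAnalogF64(taskHandle, BATCH_SIZE, 10.0, DAQmx_Val_GroupByChannel,
                       data, BATCH_SIZE, &read, NULL);
    /* ... batch processing (peak detection, etc.) goes here ... */
    return 0;
}

int main(void)
{
    TaskHandle aiTask = 0;

    DAQmxCreateTask("", &aiTask);
    DAQmxCreateAIVoltageChan(aiTask, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                             -10.0, 10.0, DAQmx_Val_Volts, NULL);
    /* Continuous, hardware-timed acquisition; the driver buffers the data. */
    DAQmxCfgSampClkTiming(aiTask, "", SAMPLE_RATE, DAQmx_Val_Rising,
                          DAQmx_Val_ContSamps, BATCH_SIZE);
    DAQmxRegisterEveryNSamplesEvent(aiTask, DAQmx_Val_Acquired_Into_Buffer,
                                    BATCH_SIZE, 0, EveryNCallback, NULL);
    DAQmxStartTask(aiTask);

    printf("Acquiring... press Enter to stop.\n");
    getchar();

    DAQmxStopTask(aiTask);
    DAQmxClearTask(aiTask);
    return 0;
}
```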

Message 10 of 18