
FPGA synchronisation of I/O-Node and variable values / 1 ms delay problem

Solved!
Go to solution

I built my project based on the sample "LabVIEW FPGA Waveform Acquisition and Logging on CompactRIO".

 

In my FPGA Main VI, several loops run in parallel, each with its own timing.

 

One loop decodes an encoder signal (loop timer 1 µs) [NI9411]. The result is published as "Decoder_Counts" [values 0…3599].

 

Another loop watches the counts and compares the value against the "Geber Impuls EIN" and "Geber Impuls AUS" thresholds to set or clear some DOs [NI9474]. The local variable "Lastschalter" indicates the state of the DOs.

 

In the acquisition loop (running at the 50 kHz data rate of the AI module NI9239), the values of "Lastschalter" and "Decoder_Counts" are sampled together with the AI values, so that the dependencies between all signals can be seen. These signals are then pipelined and, after some maximum detection and scaling, stored in a FIFO as 8 interleaved channels.
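The FIFO transfer described above can be pictured with a short sketch. This is illustrative Python only (the actual code is LabVIEW FPGA, and the function names here are hypothetical): 8 channels are written sample-interleaved into one stream, and the reader splits whole blocks back into per-channel data.

```python
# Illustrative only: the thread's code is LabVIEW FPGA, not Python.
NUM_CHANNELS = 8

def interleave(samples_per_channel):
    """Write 8 equally long channel lists sample-interleaved into one stream."""
    out = []
    for i in range(len(samples_per_channel[0])):
        for ch in samples_per_channel:
            out.append(ch[i])
    return out

def deinterleave(stream):
    """Split the interleaved stream back into 8 per-channel lists."""
    return [stream[ch::NUM_CHANNELS] for ch in range(NUM_CHANNELS)]
```

The important property for this thread is that one FIFO element index maps to exactly one (channel, sample) pair, so the DO state and the AI sample written in the same loop iteration stay paired on the RT side.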

FPGA_Main.JPG

 

Later, the RT VI handles these 8 channels as a block and creates an 8-channel waveform with a timestamp (Aquire Data VI).

Aquire_Data.JPG

 

The problem: monitoring the physical DO and AI signals at the cRIO with an oscilloscope, I see the expected result: the AI values change in response to a change of my DO signal. But the LabVIEW waveform result shows a difference of about 1 ms between the DO change and the AI reaction.

2015-12-10_10-14-53_LV.jpg

 

So far I have no idea what causes the delay. Since the "Lastschalter" signal appears earlier than the AI reaction, I don't expect a problem with the use of the variables. The small noise on the AI values shows there are enough value changes, so the delay is not just a one-sample effect. The "actual loop period (ticks)" indicator of the FPGA acquisition loop shows the expected 800 ticks (× 25 ns = 20 µs, i.e. 50 kHz). The data sheet of the DO module specifies an output delay of 1 µs max…
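As a quick sanity check of the loop-timing numbers above (assuming the default 40 MHz FPGA clock, i.e. 25 ns per tick):

```python
# Sanity check of the FPGA acquisition-loop timing quoted above,
# assuming the default 40 MHz FPGA clock (25 ns per tick).
clock_hz = 40_000_000
loop_ticks = 800
loop_period_s = loop_ticks / clock_hz   # 800 ticks x 25 ns = 20 us
loop_rate_hz = 1 / loop_period_s        # 50 kHz

print(f"period: {loop_period_s * 1e6:.0f} us, rate: {loop_rate_hz / 1e3:.0f} kHz")
```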

 

I hope someone has helpful advice.

Message 1 of 15

Hi MS_Sitec,

 

Please provide us with the complete code of your FPGA + RT project. I would like to check the timing of the different loops and the synchronization between them, and see if anything stands out.

 

Best regards,

Christoph

Staff Applications Engineer
National Instruments
Certified LabVIEW Developer (CLD), Certified LabVIEW Embedded Systems Developer (CLED)


Don't forget Kudos for Good Answers, and Mark a solution if your problem is solved
Message 2 of 15

Here is the complete project as a ZIP.

 

Thx

Message 3 of 15

Hi!

 

Thank you for your fast reply. I searched through your project and couldn't find any place where this delay could originate. The only exception would be the loop "Einlesen von Strom und Spannung + Weitergabe an SPS (ggf. als Mittelung)" [reading current and voltage + forwarding to the PLC, averaged if necessary] in the FPGA Main VI. Does this loop really run at 50 kS/s?

 

For a quick test of your hardware I/O, I suggest testing the general I/O behaviour first. I've attached a small VI snippet as a minimal example to reduce complexity. Please compile that FPGA VI and watch the resulting stream data.

 

FPGA_snippet.png

 

Thank you!

 

Best regards,

Christoph

Message 4 of 15

What analogue input module are you using, and what is its sampling rate? I would expect the AI node to run slower than the DIO node because of the settling/sampling time; some AI modules are quite slow to read (e.g. 2 ms, or even 52 ms). You may get better results if you sequence the digital output to occur after the AI read has occurred.


LabVIEW Champion, CLA, CLED, CTD
Message 5 of 15

What is the hardware latency you are working with?

 

I had similar "problems" before, and it turned out to be the latency of the output hardware plus the latency of the input hardware itself. While your AO and AI may be running at 50 kHz, the conversion between physical and digital may be heavily pipelined and can consume many cycles. In my case, I was working at 40 MHz but had a hardware delay of nearly 800 ns (approximately 33 cycles of my hardware clock).
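The same latency-to-cycles conversion, as a sketch (assuming a 40 MHz clock; at exactly 800 ns the result is 32 cycles, so a quoted "approximately 33 cycles" implies a measured delay slightly above 800 ns):

```python
# Convert a hardware conversion latency into clock cycles.
# At 40 MHz one cycle is 25 ns, so an exact 800 ns delay is 32 cycles.
clock_hz = 40_000_000
latency_s = 800e-9
cycles = latency_s * clock_hz
print(cycles)
```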

 

The difference from the external oscilloscope is that it only records your output values AFTER the conversion, whereas your FPGA code records the output values BEFORE the conversion.

Message 6 of 15

Hi guys,

 

I don't think it is the hardware delay, since we are talking about 1 millisecond. A few ticks perhaps, but 40,000 ticks?!

 

The LV project looks like this (the modules in question seem to be around 50–100 kS/s):

 

LV_proj.PNG

 

Best regards,

Christoph

Message 7 of 15

Many Thx for your advices.

 

The acquisition loop is really running at 800 ticks...

 

Now I have prepared a test program following your advice, along with a corresponding RT program. I will be able to check it on hardware tomorrow morning and will report back as soon as possible.

 

MS

 

 

Message 8 of 15

@Christoph_D wrote:

Hi guys,

 

I don't think it is the hardware delay, since we are talking about 1 millisecond. A few ticks perhaps, but 40,000 ticks?!

 

The LV project looks like this (the modules in question seem to be around 50–100 kS/s):

 

 

Best regards,

Christoph


The AI is running at 50 kHz.

 

1 ms is 50 cycles of the AI, not 40,000 cycles of the AI.

 

It seems to be a ballpark figure compared to my previous experience, that's all.

 

Shane

 

PS: Looking at the datasheet for the AI module used, the input delay is listed as 38.4 / fs (fs = sampling frequency).

 

That makes approximately 0.75 ms in my book.
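Putting numbers on the quoted datasheet figure (a sketch; the sample-shift compensation at the end is a hypothetical post-processing idea, not something from this thread):

```python
# Group delay of the delta-sigma AI module, per the quoted "38.4 / fs".
fs = 50_000                     # AI sample rate used in the acquisition loop
group_delay_s = 38.4 / fs       # = 768 us, close to the observed ~1 ms
lag_samples = round(38.4)       # i.e. ~38 samples of lag in the logged stream

# Hypothetical fix: delay the logged DO trace by the same number of
# samples so the DO and AI traces line up on a common time axis.
def align_do_to_ai(do_trace, lag):
    return [do_trace[0]] * lag + do_trace[:len(do_trace) - lag]

print(f"group delay: {group_delay_s * 1e3:.3f} ms")  # 0.768 ms
```

Adding the DO module's output delay (1 µs max per its datasheet) barely changes this; the delta-sigma input delay dominates and accounts for most of the observed offset.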

Message 9 of 15
Solution
Accepted by topic author MS_Sitec

AI delay specs 1.png

 

 

AI delay specs 2.png

Message 10 of 15