08-13-2025 12:59 PM
Hi,
I am running an NI-9239 input module and an NI-9260 output module in a cRIO-9053 chassis. Using a combined FPGA and RT approach, I take data in on the FPGA, apply a PID, and then send the raw input and PID output up to RT using a DMA FIFO. The FPGA runs at 40 MHz, the input and output data rates are 51.2 kS/s, and the program should run continuously.
I would like to create a time delay between these two signals that I can vary while the program runs, and I need control over the 1-500 µs range. Any ideas on how I could implement this without causing any stuttering or stalling on the output side? I have tried inserting Wait timers in the RT and FPGA code in various arrangements of while loops and sequence structures, but I always end up with a stuttering or aliasing problem.
One idea I had was to start the clocks of the two modules at different times, but I have no idea how to implement this correctly.
Any help greatly appreciated.
08-13-2025 08:30 PM
If you want a delay in the microsecond range, you should implement the delay on the FPGA.
Please show what you have tried and we can advise accordingly.
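As a rough sanity check (not LabVIEW, just arithmetic in C; the numbers come from the 40 MHz FPGA clock and 51.2 kS/s data rate in the original post), here is what the requested 1-500 µs range works out to in FPGA clock ticks and in sample periods:

```c
#include <stdio.h>

int main(void)
{
    const double fpga_clock_hz  = 40e6;    /* 40 MHz FPGA clock -> 25 ns per tick      */
    const double sample_rate_hz = 51.2e3;  /* module data rate  -> ~19.5 us per sample */

    const double delays_us[] = {1.0, 500.0};
    for (int i = 0; i < 2; i++) {
        double us      = delays_us[i];
        double ticks   = us * 1e-6 * fpga_clock_hz;   /* delay in FPGA clock ticks     */
        double samples = us * 1e-6 * sample_rate_hz;  /* delay in 51.2 kS/s samples    */
        printf("%6.1f us -> %8.0f ticks, %6.2f samples\n", us, ticks, samples);
    }
    return 0;
}
```

So 500 µs is only about 26 sample periods at 51.2 kS/s, and 1 µs is well below a single sample period, which is why this kind of resolution has to come from the FPGA rather than from waits on the RT side.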
08-15-2025 06:59 AM
Attached is a screenshot of the FPGA program I tried. My thought process was that I could start each module with some given delay relative to the other, and then pass the data up to the RT process and back down again (I will eventually do some more processing on the FPGA side, but I am keeping it simple for now). I wanted the ability to change this delay while the FPGA is running, hence the button and two while loops.
For testing purposes I send in a 1 kHz sine from a function generator. What I have annoyingly found is that the phase between the input and output signals is completely random when turning the FPGA off and on again, hence why I wanted to be able to vary the delay while the FPGA is running.
The issue I had with this FPGA attempt is that the output signal was very strange, with almost an aliasing-type effect (everything is fine when I have no delay or case structures in place).
Maybe you have some insight into why this phase is random, which might simplify the situation?
Thanks!!
08-20-2025 07:01 AM
@NKJDE wrote:
Attached is a screenshot of the FPGA program I tried. My thought process was that I could start each module with some given delay relative to the other, and then pass the data up to the RT process and back down again (I will eventually do some more processing on the FPGA side, but I am keeping it simple for now). I wanted the ability to change this delay while the FPGA is running, hence the button and two while loops.
For testing purposes I send in a 1 kHz sine from a function generator. What I have annoyingly found is that the phase between the input and output signals is completely random when turning the FPGA off and on again, hence why I wanted to be able to vary the delay while the FPGA is running.
The issue I had with this FPGA attempt is that the output signal was very strange, with almost an aliasing-type effect (everything is fine when I have no delay or case structures in place).
Maybe you have some insight into why this phase is random, which might simplify the situation?
Thanks!!
Signal delays are implemented using a buffer. The buffer's maximum size is selected at compile time. At run time, you can tell the data to stay in the buffer for the full length (maximum delay) or for just one cycle (minimum or no delay).
Your implementation is not a signal delay; it is a wait, and it will create gaps in the signal.
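To sketch the idea outside LabVIEW (this is only a conceptual illustration in C; on the FPGA you would use a memory block or FIFO inside the 51.2 kS/s sample loop, and all names and sizes here are hypothetical): every sample period you write the newest sample into a fixed-size circular buffer and read back the sample written some number of periods earlier. The buffer size is the compile-time maximum; the read offset is the run-time delay, and because one sample goes in and one comes out every period, the output never stalls.

```c
#include <stdint.h>
#include <string.h>

/* Conceptual sketch only -- names and sizes are hypothetical.
 * MAX_DELAY must cover the largest delay you will ever need (compile-time choice),
 * e.g. ~26 samples for 500 us at 51.2 kS/s, rounded up to 32 here.
 */
#define MAX_DELAY 32

typedef struct {
    double   buf[MAX_DELAY];
    uint32_t write_idx;
} delay_line_t;

static void delay_init(delay_line_t *d)
{
    memset(d, 0, sizeof(*d));
}

/* Called once per sample period: push the newest sample, return the delayed one.
 * delay_samples may change between calls (run-time adjustable) as long as it
 * stays below MAX_DELAY; the output never stalls because one sample always
 * goes in and one always comes out.
 */
static double delay_process(delay_line_t *d, double input, uint32_t delay_samples)
{
    if (delay_samples >= MAX_DELAY)
        delay_samples = MAX_DELAY - 1;

    d->buf[d->write_idx] = input;                                        /* write newest sample   */
    uint32_t read_idx = (d->write_idx + MAX_DELAY - delay_samples) % MAX_DELAY;
    d->write_idx = (d->write_idx + 1) % MAX_DELAY;                       /* advance write pointer */
    return d->buf[read_idx];                                             /* sample from N ago     */
}
```

Changing the delay while running just moves the read position, so at most you get a single transient sample when the tap jumps, rather than the gaps that inserting a wait produces.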