01-15-2016 07:11 PM
Hi
I'm trying to get a rough understanding of how input and output timing works between the FPGA and the cRIO CPU when implementing control algorithms on the cRIO CPU.
Say I have a LabVIEW program where the following happens:
- The FPGA VI reads an analog input and sends it to a target VI (cRIO CPU)
- The target VI then performs a quick calculation and sends an output back to the FPGA based on the input it received
- The FPGA then outputs this through an analog output module
Theoretically speaking, if the FPGA loop is running at 10 ms and the RT target VI is running at 1 ms, will the following happen?
- The FPGA reads the input at 0 ms.
- The target VI receives the input slightly after 0 ms and calculates an output.
- The calculated output is sent down to the FPGA before 1 ms has passed.
- The FPGA then sends this output to the analog output module at around (or before) the 1 ms mark.
Or:
- Will the calculated output be delayed until the 10 ms mark?
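The second scenario can be sketched numerically. This is a minimal, hedged model (not LabVIEW code) assuming register-style, last-value transfers between the loops, where the FPGA loop can only write a new analog output on its own next 10 ms iteration; the function name and timings are illustrative, not from any NI API.

```python
# Sketch: worst-case path when a 10 ms FPGA loop exchanges data with a
# 1 ms RT loop through last-value (register/front-panel-style) transfers.
# All numbers are illustrative.

FPGA_PERIOD_MS = 10
RT_PERIOD_MS = 1

def output_time_ms(input_read_ms):
    """Time at which the FPGA can write the calculated output.

    The RT loop picks the sample up on its next 1 ms tick, but the FPGA
    loop only writes the result out on its own next 10 ms iteration.
    """
    # RT loop sees the sample on its first tick after the FPGA read
    rt_pickup = ((input_read_ms // RT_PERIOD_MS) + 1) * RT_PERIOD_MS
    # FPGA writes the result on its first iteration after the RT result exists
    fpga_write = ((rt_pickup // FPGA_PERIOD_MS) + 1) * FPGA_PERIOD_MS
    return fpga_write

print(output_time_ms(0))  # input read at t = 0 ms -> output at t = 10 ms
```

Under these assumptions the output would indeed wait for the 10 ms mark, even though the RT calculation finished after roughly 1 ms.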
Thanks
01-15-2016 08:20 PM
What are you using to pass the data to/from the FPGA from/to the RT? DMA FIFO? Front panel controls? Scan Engine?
It would help us answer your question better if you provided some code.
01-18-2016 11:32 AM
It depends on how you structure your code. When you say the FPGA is "running at 10ms," it's not as though the FPGA code executes instantly, waits 10ms, then executes again; rather, you could have code executing throughout that 10ms period, where each individual piece of that code executes at 10ms intervals.
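To make the structural point concrete, here is a hedged sketch of the alternative structure: a single 10 ms loop whose body does read, hand-off, and write within one iteration, so the output can update well before the next 10 ms boundary. The 1 ms RT latency and the function are illustrative assumptions, not measured values or NI API calls.

```python
# Sketch: one 10 ms FPGA loop iteration whose body runs
# read -> hand off to RT -> write, all inside the same iteration.
# Timings are illustrative.

FPGA_PERIOD_MS = 10
RT_LATENCY_MS = 1  # assumed time for the RT VI to return a result

def one_iteration(start_ms):
    read_ms = start_ms                  # read the analog input
    write_ms = read_ms + RT_LATENCY_MS  # write the output once RT replies
    next_iter_ms = start_ms + FPGA_PERIOD_MS
    return write_ms, next_iter_ms

write_ms, next_iter_ms = one_iteration(0)
print(write_ms)      # output written at ~1 ms, within the same period
print(next_iter_ms)  # next iteration starts at 10 ms
```

So the same 10 ms loop rate can give either behavior; what matters is whether the output write waits for the next loop iteration or happens later in the current one.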