Trying to understand timing of inputs and outputs with LabVIEW FPGA

Hi

I'm trying to get a rough understanding of how the timing of inputs and outputs works between the FPGA and the cRIO CPU when implementing control algorithms on the cRIO CPU.

Say I have a LabVIEW program where the following happens:
- The FPGA VI reads an analog input and sends it to a target VI (cRIO CPU)
- The target VI then makes a quick calculation and sends an output to the FPGA based on the input it received
- The FPGA then outputs this through an analog output module
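In pseudocode, the round trip I have in mind looks roughly like this (an illustrative Python sketch; the function names are invented stand-ins, not real LabVIEW calls):

```python
import time

def read_from_fpga():      # hypothetical: AI sample published by the FPGA VI
    return 0.0

def controller(sample):    # the quick calculation running on the cRIO CPU
    return 2.0 * sample

def write_to_fpga(value):  # hypothetical: value the FPGA sends to the AO module
    pass

# RT target loop, running at 1 ms
while True:
    sample = read_from_fpga()
    output = controller(sample)
    write_to_fpga(output)
    time.sleep(0.001)
```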

Theoretically speaking, if the FPGA is running at 10 ms and the RT target VI is running at 1 ms, will the following happen?
- The FPGA reads the input at 0 ms.
- The target VI receives the input slightly after 0 ms and calculates an output.
- The calculated output is sent down to the FPGA before 1 ms has passed.
- The FPGA then sends this output to the analog output module somewhere around (or before) the 1 ms mark.

Or will the calculated output be delayed until the 10 ms mark?

 

Thanks

Message 1 of 3

What are you using to pass the data between the FPGA and the RT target? A DMA FIFO? Front panel controls? The Scan Engine?

 

It would help us answer your question better if you provided some code.
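The mechanism matters because the transfer semantics differ: a front panel control or indicator holds only the latest value, while a DMA FIFO buffers every sample in order. A rough illustration (a Python sketch by analogy, not LabVIEW, and not NI's API):

```python
from collections import deque

# Front-panel-style transfer: a single "latest value" register. A slow
# reader misses intermediate samples; a fast reader re-reads the same one.
latest_value = None

# DMA-FIFO-style transfer: a queue that preserves every sample in order.
fifo = deque()

def fpga_writes(sample):
    global latest_value
    latest_value = sample   # overwrite: the previous sample is gone
    fifo.append(sample)     # enqueue: nothing is lost

for s in range(5):
    fpga_writes(s)

print(latest_value)         # 4 -- only the newest sample survives
print(list(fifo))           # [0, 1, 2, 3, 4] -- the full history
```

Which of those behaviors you rely on (and how you pace the reads) changes the latency story considerably.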


Message 2 of 3

It depends on how you structure your code. When you say the FPGA is "running at 10 ms," it's not as though the FPGA code executes instantly, waits 10 ms, and then executes again; rather, you could have code executing throughout that 10 ms period, where each individual piece of that code executes at 10 ms intervals.
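To put numbers on it, here's a toy calculation (a Python sketch using the rates from your post, and assuming the RT calculation finishes well inside one 1 ms RT period):

```python
import math

FPGA_PERIOD = 10.0   # ms, the "10 ms" FPGA loop
RT_PERIOD = 1.0      # ms, the RT target loop

read_time = 0.0                           # FPGA reads the AI at t = 0
rt_result_ready = read_time + RT_PERIOD   # result is back on the FPGA ~1 ms

# Case 1: the AI read and AO write share the SAME 10 ms loop, so the loop
# does not reach its AO write again until the next iteration.
ao_same_loop = max(rt_result_ready, read_time + FPGA_PERIOD)
print(f"single 10 ms loop:     AO updates at ~{ao_same_loop} ms")

# Case 2 (assumed structure): the AO write sits in its own 1 ms loop and
# picks up the RT result at the next 1 ms tick.
AO_LOOP_PERIOD = 1.0  # ms
ao_own_loop = math.ceil(rt_result_ready / AO_LOOP_PERIOD) * AO_LOOP_PERIOD
print(f"separate 1 ms AO loop: AO updates at ~{ao_own_loop} ms")
```

So both of your scenarios are possible: with a single 10 ms loop the output is indeed delayed until the next 10 ms tick, while a separate (or faster) output loop gets you close to your first timeline.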

Message 3 of 3