07-28-2023 09:17 PM
Good evening, all,
About a year ago, I made some modifications to an application that generated pulses in a 120 MHz loop in the FPGA of an NI 9148 cRIO. I recompiled the FPGA using NI's remote compilation service and deployed the app to our laboratory. This application runs on LabVIEW 2019. Its HMI runs on a Windows PC, and the "brains" of the system is on the 9148. The app uses only one module, an NI 9401 digital I/O, which cranks out high-speed digital pulses.
This week, I attempted to make some small, relatively inconsequential changes to the FPGA algorithm. I discovered that the FPGA no longer compiles, generating a seemingly random sequence of timing errors on successive compilation attempts. The errors range from 0.00 ns on the low end to 0.95 ns on the high end.
I've searched this forum and discovered assertions that compiling an FPGA involves some significant randomness. My question is this: aside from any "good luck" I may have had in compiling the application in the past due to its random choices, has the NI compilation service gotten more "picky" in its timing analysis, say in the last 12 months? Is there any systemic reason why my app should have compiled a year ago, but not now?
Thanks,
-- Mark
Solved! Go to Solution.
07-29-2023 02:42 AM
Hi Mark,
It is unlikely to have changed, especially for the 9148, as it uses the previous-generation compiler, which is essentially frozen in time.
What could have happened is that your changes increased the code size; a more densely packed design is harder to route, which can cause timing violations. Have the changes made any difference to the utilization?
I guess the other test is to go back to the original version and try compiling that as well - see if there is any difference in success now.
07-29-2023 09:57 AM
Thanks, James. I dredged the original version up from SVN and attempted to recompile it unchanged. The result was a 0.99 ns timing error in the 8.33 ns loop.
So, the hypothesis is that the compiler is making random choices that sometimes result in timing errors, and other times not. This must mean that the algorithm my code implements is close to the maximum the 9148 is capable of at 120 MHz.
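As a back-of-the-envelope check (my own arithmetic from the numbers above, not anything the compiler reports), the worst violation says the critical path is only about a nanosecond too slow:

# Rough slack arithmetic for the failing compile (illustrative only).
loop_freq_hz = 120e6                                  # requested loop rate
period_ns = 1e9 / loop_freq_hz                        # 8.33 ns available per cycle
worst_violation_ns = 0.99                             # worst timing error reported
critical_path_ns = period_ns + worst_violation_ns     # ~9.32 ns actual path delay
achievable_freq_mhz = 1e3 / critical_path_ns          # ~107 MHz on that attempt
print(f"{period_ns:.2f} ns budget, {critical_path_ns:.2f} ns path, "
      f"closes timing at about {achievable_freq_mhz:.0f} MHz")

So the design misses by roughly 1 ns, close enough to the edge that a lucky placement can tip it either way.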
If I repeatedly run the compilation until it succeeds, can I be assured that the timing constraints are truly satisfied, and that the resulting bitfile will run properly?
07-30-2023 11:01 AM
There definitely are random effects in creating FPGA code. I had a design last year that was stretching the timing capabilities in one specific loop that I had tried to run at 125 MHz. This was for a 9651, but the principle is most likely the same for other targets.
Sometimes a compile would succeed and sometimes it would fail without any substantial code changes. I had to lower the loop frequency a bit to make the compile always succeed. It wasn't a big problem, but for a discrete digital phase-locked loop, higher frequency is always better.
07-30-2023 04:51 PM
After fooling with this for several days, I decided to collect data on the variation of the timing errors with repeated compilation of exactly the same code. So, I started writing down the worst timing error for each successive compilation.
On the 6th repetition, the compilation succeeded -- no timing errors.
In my previous efforts to finish this little project, I'd been ratcheting down the width of an internal counter in order to make the timing work. Once I realized the timing errors were essentially driven by randomness, I stepped it back up -- 20, 24, 28 and finally 32 bits. There were timing errors along the way, but they eventually went away upon repetition.
Moral to the story: be skeptical of FPGA timing errors unless they persist over several compilations of identical source code.
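For what it's worth, the bookkeeping behind that moral was trivial -- just the worst violation jotted down per attempt, plus a rule of thumb. A minimal sketch of the rule (the numbers, the function name, and the three-compile threshold are all my own illustration, not part of the actual project):

# Worst timing violation (ns) recorded after each compile of identical source.
# These values are made up for illustration; attempt 6 is the clean pass.
worst_violation_per_attempt = [0.62, 0.31, 0.88, 0.15, 0.47, 0.00]

def treat_as_real(violations_ns, min_failures=3):
    # Only believe a timing problem if it persists across several compiles
    # of identical source code with no clean pass among them.
    return len(violations_ns) >= min_failures and min(violations_ns) > 0.0

print(treat_as_real(worst_violation_per_attempt))   # False: one compile closed timing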
10-11-2023 10:49 AM
I have seen the same phenomenon, and it actually just popped up again this morning. The code compiles fine, I change something in an entirely different area of the code, and a timing error pops up somewhere else. I too want the fastest cycle time possible for a servo loop. One thing I have anecdotally found to help is changing the compilation configuration for the build. Configuring for compilation speed often causes timing errors that can be removed by configuring for performance. Configuring for speed isn't that much faster on my system anyway...
10-11-2023 11:05 AM
I've seen the same code refuse to compile for days, with dozens of attempted compiles, only to have a 100% completion rate the entire next week.
There is almost certainly some date/time thing going into the random seeding of the compile process.
We had times where our response to two failed compiles in a row was "wait until after the weekend" and then it compiled for multiple weeks without problems.
It's not a deterministic process.
10-11-2023 11:33 AM
I completely agree -- just sharing a "solution" that appears to avoid the "wait until next week" solution. Worked for me this morning...