05-12-2017 10:37 AM
I just made lots of changes...funny, you can look at the code and it seems fine, then come back to it later and find other optimizations...I've been programming LabVIEW for 22 years and this still happens (though only about 5 years, on and off, of FPGA coding). Now estimated utilization is 50% vs. 104%! Most of the savings came from taking out calculations that happen only at program start; I can do those in Excel and paste the results in as constants. I had it the other way so it was easy for someone to figure out what is going on, but very detailed comments will work just as well.
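For anyone following along, the technique here is generic: anything that only runs once at program start, with inputs known at design time, can be computed offline and pasted into the FPGA VI as literal constants, freeing up fabric resources. A minimal sketch of the idea (in Python standing in for the Excel step; the ADC range and scaling formula below are hypothetical, not from the actual design):

```python
# Hypothetical example of offline precomputation: instead of building
# this arithmetic into the FPGA design (where it costs fabric resources
# but only ever runs once), compute the constants here and paste the
# printed values into the FPGA VI as literals, with a comment in the VI
# documenting the formula they came from.

FULL_SCALE_VOLTS = 10.0   # assumed bipolar ADC input range (+/-5 V)
ADC_COUNTS = 2 ** 16      # assumed 16-bit converter

scale = FULL_SCALE_VOLTS / ADC_COUNTS   # volts per count
offset = -FULL_SCALE_VOLTS / 2          # shift raw counts to +/-5 V

print(f"scale  = {scale}")   # paste these literals into the FPGA VI
print(f"offset = {offset}")
```

The trade-off is exactly the one described above: the design is less self-explanatory, so the pasted constants need detailed comments pointing back to the formula.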
Once the compiles finish (just started them), I will provide more info.
Thanks everyone!!
05-12-2017 10:50 AM
Well, the latest changes solved the problem...the reduced utilization probably did the trick. All 4 compiles finished successfully in less than 20 minutes (vs. 60+).
Now, to download and test the code...which will have to wait...have to attend training 😞
Thanks again to everyone!
Todd
05-12-2017 11:00 AM
Another good resource is the NI LabVIEW High-Performance Developer's Guide. It includes some good techniques for optimizing resources and timing for your LabVIEW FPGA designs.
05-12-2017 12:46 PM
labviewman wrote:
all additions/subtractions to the optimized 'AddSub' from the 'high-throughput math' palette
Minor note: there's no advantage to doing this. There's nothing "optimized" about the high-throughput addition and subtraction - the standard add and subtract (and, for that matter, multiply) can already execute in a single clock cycle. They're there for consistency with the other math operations, but "National Instruments recommends that you use the LabVIEW Numeric functions unless you need the benefits that the High Throughput Math functions provide." (from the high-throughput math function help)
05-12-2017 12:49 PM
Well, not so fast...4 machines compiled OK, one did not. It still fails randomly, even though max utilization is 50%. The timing problems are on non-diagram components (see attached).
I guess I will just have to live with it and run numerous parallel compiles whenever changes are needed.
05-12-2017 12:51 PM
Do you know if that timing info is for stuff in your loop? It sounds like it might just be NI's stuff, in which case it would be nice if NI could share any benchmarks they have. Supposedly the compiler gets better every year, so it might be worthwhile downloading a trial of 2016 and seeing how your design performs there.
05-12-2017 01:04 PM
Are you by any chance trying to run at a faster-than-standard clock rate?
05-12-2017 01:26 PM
Negative, standard 40 MHz clock.
05-15-2017 10:33 AM
I wish I could use a newer version of LabVIEW. We have the SSP but I am stuck with LabVIEW 2012 and EtherCAT 2.4 because of the dyno controller software...they only support EtherCAT 2.4 at this time, and the latest version of LabVIEW that supports EtherCAT 2.4 is 2012.
05-15-2017 10:38 AM
I have no idea...it doesn't tell me where the problem is. And the problem it finds is NEVER the same thing twice.
At least it now compiles (synthesizes) with a success rate of 60% (device utilization is now at 53.5%), so in the future, if I need to make changes, I will have 4 VMs going to get at least one successful result.
I'm giving up on finding the problem...at least it now works (a successful compile/synthesis) most of the time...downloaded code works perfectly.