02-04-2017 08:24 AM
I have a problem with my FPGA (7582-R) compilations: too few DSP48s. I've reduced the high-throughput operations to the minimum, but I'm still at a required utilization of 110%. Are there any general directives, or any other way, to force the compiler to use fewer DSP48s?
Right now I don't care much about execution time, because at 110% utilization my code won't compile at all. Optimization will be the next step.
02-05-2017 10:36 AM
Hello,
Have you set the options to reduce area usage? If you right-click your build specification and go to Properties, under the Source Options category you will find optimization options for the compiler. Select Optimize for Area/Size; this should reduce usage without any code changes.
Thanks,
Ed
02-05-2017 01:49 PM
Can you provide some additional details about your application? Is the majority of your application inside an SCTL (single-cycle Timed Loop)? What high-throughput nodes are you using?
In general, if you can be flexible on execution time, I would suggest breaking your application into stages that can be optimized individually. That lets you apply logic-sharing techniques to the stages that allow it, such as using an SCTL to iterate multiple times over a single copy of a high-throughput operation, increasing latency but reducing the number of copies of the high-throughput node.
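To make the latency-for-area trade concrete, here is a plain-Python sketch (not LabVIEW, and not any NI API; all names are illustrative) of the resource-sharing idea: instead of one hardware multiplier per channel, a single multiplier is reused over several "clock cycles":

```python
def parallel_multiply(xs, ys):
    """Fully parallel model: one multiplier per channel, one cycle of latency."""
    return [x * y for x, y in zip(xs, ys)]

def shared_multiply(xs, ys):
    """Time-multiplexed model: one shared multiplier, len(xs) cycles of latency.
    Each loop iteration stands in for one tick of the SCTL reusing the
    same DSP48 on a different pair of operands."""
    results = []
    for x, y in zip(xs, ys):      # one iteration = one clock tick
        results.append(x * y)     # the single shared multiplier does this
    return results

# Both produce identical results; the shared version trades N cycles of
# latency for a 1/N reduction in multiplier count.
assert parallel_multiply([1, 2, 3], [4, 5, 6]) == shared_multiply([1, 2, 3], [4, 5, 6])
```

In hardware terms, the shared version corresponds to feeding operands through shift registers or memory into one multiplier inside an SCTL, rather than dropping N multiply nodes onto the diagram.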
02-05-2017 03:43 PM
It'll be very difficult to tell you how to reduce your usage without seeing what you've got. If the compiler is using those assets, you're likely calling functions that require them.
02-08-2017 11:58 AM
This is my current subVI; it's a frequency doubler (the input is filtered with a PLL and then frequency-doubled). When I compile it with the "Area" optimization goal, it uses about 30 DSP48s (out of the 48 available on my target). My current problem is that I need more than one instance of this VI in the rest of the code.
Is there any way to reduce this, or to force it not to use any DSP48s at all?
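As a point of reference on the multiplier budget: if the doubling stage works by squaring the PLL-filtered sinusoid (one common approach; the post doesn't say how the VI does it), the identity sin²(ωt) = (1 − cos 2ωt)/2 means a single multiply per sample produces the doubled tone, so 30 DSP48s would be going mostly to the filtering, not the doubling. A quick plain-Python check of the identity (frequencies and sample rate are arbitrary demo values, not from the VI):

```python
import math

fs = 1000.0                       # sample rate, Hz (demo value)
f = 5.0                           # input tone, Hz (demo value)
x = [math.sin(2 * math.pi * f * k / fs) for k in range(1000)]
y = [v * v for v in x]            # the one squaring multiply

def tone_power(sig, freq):
    """Power of `sig` at `freq` via a single-bin DFT."""
    re = sum(v * math.cos(2 * math.pi * freq * k / fs) for k, v in enumerate(sig))
    im = sum(v * math.sin(2 * math.pi * freq * k / fs) for k, v in enumerate(sig))
    return (re * re + im * im) / len(sig)

# The squared signal carries its AC energy at 2f, not at f.
assert tone_power(y, 2 * f) > 100 * tone_power(y, f)
```

If the DSP48s are indeed being consumed by the filter, reducing its order or sharing one multiply-accumulate across taps (as in the staging suggestion above) is where the savings would come from.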