I tried using that long ago and always had severe redraw performance issues. It seemed usable only for relatively small datasets.
Has the performance improved? If it could be made to use the GPU and do the work in hardware without a penalty, I would try it again. In that case, the suggestion would be fine.
I only use it for single-draw graphs rather than graphs that refresh constantly, but I just ran a few tests, and updating large graphs is definitely very, very slow. Interestingly, the performance seems to depend not just on the number of plots or the points per plot, but significantly on the size of the graph's plot area. Even so, I'd still like the option to set a hundred plots to anti-alias, even if the draw then takes a second.
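For what it's worth, the plot-area-size effect is easy to reproduce outside LabVIEW, since software anti-aliasing cost scales with the number of pixels each line covers. Here's a quick Python/matplotlib sketch (not LabVIEW, purely illustrative; the plot counts, point counts, and canvas sizes are arbitrary choices of mine) that times an off-screen redraw of 100 anti-aliased vs. aliased plots at several canvas sizes:

```python
import time
import numpy as np
import matplotlib
matplotlib.use("Agg")  # off-screen software renderer, no display needed
import matplotlib.pyplot as plt

def draw_time(n_plots, n_points, size_inches, antialiased):
    """Time one full render of n_plots lines on a square canvas."""
    x = np.arange(n_points)
    fig, ax = plt.subplots(figsize=(size_inches, size_inches), dpi=100)
    for i in range(n_plots):
        # 'antialiased' toggles per-line AA, roughly analogous to a
        # per-plot anti-alias setting on a LabVIEW graph
        ax.plot(x, np.random.randn(n_points) + i,
                antialiased=antialiased, linewidth=1)
    t0 = time.perf_counter()
    fig.canvas.draw()           # force the actual rasterization
    elapsed = time.perf_counter() - t0
    plt.close(fig)
    return elapsed

# Same data, same plot count; only the plot area grows.
for size in (4, 8, 16):
    aa = draw_time(100, 5_000, size, True)
    plain = draw_time(100, 5_000, size, False)
    print(f"{size}x{size} in @100dpi: AA {aa:.2f}s vs aliased {plain:.2f}s")
```

On my understanding, you should see the anti-aliased times grow noticeably faster with canvas size than the aliased ones, which matches what I'm seeing in LabVIEW.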
I agree it should be done in hardware - it's almost inconceivable that a modern program wouldn't have anti-aliased lines. But I imagine this drawing code has been in LabVIEW for 20 years.