11-06-2013 05:59 PM
Why do the ScatterGraph and the WPF Graph (MS2013) use so much memory when plotting large amounts of 1 Hz data in ContinuousChart mode? (e.g., memory increases by roughly 60 MB every second up to ~800 MB, then drops and climbs back up again.)
I've attached a sample program that uses RASTER mode, a Collapsed XAxis (similar to SuppressScaleLayout), and 4 plots using ChartCollection<double, double> with a capacity of 1,000,000. The application starts with 900K points on each plot, and we add new data at 1 Hz. The ContinuousChart adjuster adjusts the range, which admittedly requires redisplaying the entire graph, but only at 1 Hz. I'm not concerned with the CPU usage, but this sample app grows to over 900 MB of memory (particularly as an x86 process; slightly better as AnyCPU).
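For reference, here is a minimal sketch of the kind of setup the attached sample uses. The member names I'm relying on (the ChartCollection capacity constructor, Append(x, y), and assigning an array of collections to Graph.DataSource) reflect my understanding of the Measurement Studio WPF API and may not match it exactly; "graph" is the Graph control from the window's XAML, and the sine data is just a placeholder.

using System;
using System.Windows.Threading;
using NationalInstruments.Controls;   // Measurement Studio WPF Graph / ChartCollection

// Illustrative helper that reproduces the sample's data setup against an
// existing Graph control passed in from the window's code-behind.
public class ChartingSetup
{
    private readonly ChartCollection<double, double>[] _plots = new ChartCollection<double, double>[4];
    private double _x;

    public ChartingSetup(Graph graph)
    {
        for (int p = 0; p < _plots.Length; p++)
        {
            // 1M-point capacity per plot; pre-fill 900K points.
            _plots[p] = new ChartCollection<double, double>(1000000);
            for (int i = 0; i < 900000; i++)
                _plots[p].Append(i, Math.Sin(i * 0.001) + p);
        }
        _x = 900000;

        graph.DataSource = _plots;   // one collection per plot

        // Append one new point to each plot at 1 Hz.
        var timer = new DispatcherTimer { Interval = TimeSpan.FromSeconds(1) };
        timer.Tick += (s, e) =>
        {
            foreach (var plot in _plots)
                plot.Append(_x, Math.Sin(_x * 0.001));
            _x++;
        };
        timer.Start();
    }
}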
We had a similar issue with the ScatterGraph control (on x86 we get an OutOfMemoryException near ~1.6 GB). Switching to x64 helps us, but we still support some x86 computers. I had hoped that the WPF graph wouldn't inherit this memory behavior.
My only workaround so far has been pre-decimating the data and plotting smaller Point arrays (plotting down-sampled data instead of "charting"). This helps, but our users zoom in and out of the data a lot, so we have to re-sample the data every time the user zooms (something I'd hoped not to implement). Can we not use charting with 4 x 1M points without accepting this memory pattern?
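Roughly, the decimation step looks like the sketch below: a min/max reducer over the currently visible x-range that keeps peaks while shrinking 1M points down to a few thousand. The names (Decimator, MinMaxDecimate) are just illustrative, not the actual code from our app; it gets re-run whenever the user zooms, and the resulting Point array is what gets plotted.

using System;
using System.Collections.Generic;
using System.Windows;   // Point

public static class Decimator
{
    // Reduces the points whose x falls in [xMin, xMax] to at most
    // 2 * bucketCount points, keeping the min and max y of each bucket so
    // peaks survive the down-sampling. Assumes xs is sorted ascending.
    public static Point[] MinMaxDecimate(IList<double> xs, IList<double> ys,
                                         double xMin, double xMax, int bucketCount)
    {
        var result = new List<Point>(2 * bucketCount);
        double bucketWidth = (xMax - xMin) / bucketCount;
        int i = 0;

        // Skip points left of the visible range.
        while (i < xs.Count && xs[i] < xMin) i++;

        for (int b = 0; b < bucketCount && i < xs.Count; b++)
        {
            double bucketEnd = xMin + (b + 1) * bucketWidth;
            int minIndex = -1, maxIndex = -1;

            // Scan this bucket, remembering where the min and max y occur.
            for (; i < xs.Count && xs[i] <= bucketEnd && xs[i] <= xMax; i++)
            {
                if (minIndex < 0 || ys[i] < ys[minIndex]) minIndex = i;
                if (maxIndex < 0 || ys[i] > ys[maxIndex]) maxIndex = i;
            }

            if (minIndex < 0) continue;   // empty bucket

            // Emit min/max in x order so the plotted line stays monotonic in x.
            int first = Math.Min(minIndex, maxIndex);
            int second = Math.Max(minIndex, maxIndex);
            result.Add(new Point(xs[first], ys[first]));
            if (second != first) result.Add(new Point(xs[second], ys[second]));
        }

        return result.ToArray();
    }
}

With bucketCount set to roughly the plot width in pixels, the plotted arrays stay small regardless of how many raw samples we keep in memory, but we pay for it by re-sampling on every zoom.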
I'm also exploring custom decimation in a derived PlotRenderer as a workaround, because I'd like to leave Graph.DataSource bound to an ObservableCollection or ChartCollection type where I can keep all 4 x 1M points.
The attached example is based on the Charting.2012 example from MS2013.
11-07-2013 08:51 AM
Hello ellisda,
I dug into our internal resources about this, and currently there are a couple of Corrective Action Requests (CARs) that R&D is investigating. The references for this are #352370 and #377048.
Carmen C.
08-11-2015 12:33 PM
Just wanted to let you know that we have made several performance improvements in this regard for the Measurement Studio 2015 release.