
LabVIEW


Rolling summations with large 2D arrays, and other trauma.

Is there a graceful solution, or literature I can be referred to, that provides a good approach to this problem? I want to create a rolling window that acts as a totalizer for the last 24-hour period. The problem with these types of things versus cumulative totals is of course the need for memory, which was hastily confirmed to me about 45 minutes in when my CPU started approaching its limit (cRIO-9068).

 

The problem (for me) seems to be the following:

 

1.) the size of data after so long, and storing/manipulating this information as it grows.

2.) the nature of 2D arrays, whereby a user "resetting" a total would seemingly require some indexing magic with the addition of new attributing data thereafter, since 2D arrays cannot have "holes."  (right?)

 

I am currently doing this on the RT. I have considered a set of 1D arrays for the task, but I am doubtful it would resolve issue #1. 

 

Thanks,

R

 

 

 

 

Message 1 of 8

If I'm sampling 8 times a second from 12 devices, this puts me at a 12 x 691,200 sgl array at the 24hr mark.  😕
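For reference, that array's memory footprint is easy to estimate, since a LabVIEW SGL is 4 bytes (a quick back-of-the-envelope check in Python, not LabVIEW code):

```python
channels = 12
samples_per_day = 8 * 60 * 60 * 24   # 8 Hz for 24 hours
bytes_per_sgl = 4                    # LabVIEW SGL is a 32-bit float

total_bytes = channels * samples_per_day * bytes_per_sgl
print(samples_per_day)               # 691200 samples per channel
print(total_bytes / 2**20)           # ~31.6 MiB for the whole 2D array
```

So the raw data itself is on the order of 30 MB; the real cost comes from how it is grown and copied, as discussed below.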

 

Message 2 of 8

RT systems are much more sensitive to you playing with memory than desktop systems, but I don't have much practical experience with memory management on RT, so I may be missing some stuff.

 

The first thing to consider is that you need ~30 MB in RAM for your data. I assume that the cRIO can easily spare that, but I haven't checked the specs. Your issue is likely not with the CPU, but with RAM.

 

The second thing is that arrays indeed can't have holes. An array is efficient because it's very simple to get to element N, since the elements are of fixed size and contiguous, but for that to work, you can't have holes.

 

What you're doing is a natural way to do this, but is problematic (particularly in RT) because it requires regular memory manipulation (either moving all the elements when you delete the first or moving the array itself to start where the second element was).

 

One option is to use a queue of a fixed size with Lossy Enqueue Element, and then preview the queue. This takes care of the memory management behind the scenes, but I don't know how efficient it is or whether the previewing is problematic. I'm sure it creates a second copy, and hopefully that's reused each time.

 

Another option is RT FIFOs. I'm assuming they're designed specifically for this, but I never used them.

 

A third option is to make the code more complex: initialize the array to size N once, then keep an index of where you are in the array and simply replace the element at the "end" of the array (which is effectively curr-1). You would have to handle rolling over the end of the array and starting again from the start (i mod N), but it's not super complicated code and it will probably be the most memory-efficient.


___________________
Try to take over the world!
Message 3 of 8

Hi destko,

 

1.) the size of data after so long, and storing/manipulating this information as it grows.

On an RT system you should not have growing arrays. Really!

Initialize the array(s) needed at startup of your RT exe.

 

2.) the nature of 2D arrays, whereby a user "resetting" a total would seemingly require some indexing magic with the addition of new attributing data thereafter, since 2D arrays cannot have "holes."  (right?)

No, arrays can't have holes…

 

I am currently doing this on the RT. I have considered a set of 1D arrays for the task, but I am doubtful it would resolve issue #1. 

Keep this mantra: Don't use growing arrays in RT systems!

 

For your 24 h period at an 8 Hz sample rate you need 8*86400=691200 samples, so initialize an array of 691200 elements (per channel). With each array you also keep an index to the current position: when getting a new sample, you read the current element at that index, replace it with the new value, and then increment the index (with rollover at the end of the array). Then you calculate the new array sum:

array_sum := array_sum + new value - previous value

This way you don't need to add 691200 elements to get an array sum!
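That per-sample step, written out as a Python sketch (not LabVIEW; the function and variable names are illustrative):

```python
def update(buf, idx, array_sum, new_value):
    """One totalizer step: swap out the oldest sample, adjust the running sum.

    buf is modified in place; returns the new (idx, array_sum).
    """
    previous = buf[idx]               # sample falling out of the 24 h window
    buf[idx] = new_value              # replace it in place
    array_sum = array_sum + new_value - previous
    idx = (idx + 1) % len(buf)        # rollover at end of array
    return idx, array_sum
```

Each update costs one subtraction and one addition, no matter how long the window is.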

 

- Keep in mind floating-point rounding issues: summing an array of ~700k elements can produce wrong results, and adding and subtracting values repeatedly can accumulate other errors…

- Your cRIO-9068 is able to handle this amount of RAM (depending on how demanding the rest of your RT exe is programmed)…
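One common way to contain the rounding error from repeated add/subtract updates is to keep the cheap O(1) update but recompute the full sum every so often. A hedged Python sketch (the `RESYNC_EVERY` constant and the function name are illustrative, not from this thread):

```python
RESYNC_EVERY = 86400   # e.g. recompute fully once per 86400 updates

def update(buf, idx, array_sum, new_value, count):
    """Incremental sum update with a periodic full recompute."""
    previous = buf[idx]
    buf[idx] = new_value
    idx = (idx + 1) % len(buf)
    count += 1
    if count % RESYNC_EVERY == 0:
        array_sum = sum(buf)     # O(N) resync discards accumulated rounding error
    else:
        array_sum = array_sum + new_value - previous   # O(1) incremental update
    return idx, array_sum, count
```

The occasional O(N) pass puts a hard bound on how far the running sum can drift from the true total.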

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 4 of 8

Gerd also brings up a good point about using SGL - its resolution becomes low relatively quickly (the spacing between adjacent representable values reaches 1 at 2^24, which is ~16.8 million). Since you're summing 700k values, your sum could get near that range, so you might wish to use DBL, at least for the summed value, depending on your values and desired accuracy.
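That 2^24 limit can be checked without LabVIEW, using Python's `struct` module to round values to single precision (a quick demonstration, not part of the original posts):

```python
import struct

def to_sgl(x):
    """Round a Python double to single precision (LabVIEW SGL)."""
    return struct.unpack("f", struct.pack("f", x))[0]

# Every integer up to 2^24 = 16777216 is exactly representable in SGL...
assert to_sgl(2**24) == 16777216.0
# ...but 2^24 + 1 is not: it rounds back down, so adding 1 is silently lost.
assert to_sgl(2**24 + 1) == 16777216.0
```

Once a running SGL total crosses that range, small new samples can stop registering at all, which is exactly the failure mode to watch for in a long totalizer.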


___________________
Try to take over the world!
Message 5 of 8

I'm not on a new enough LV version to open your VI, but I'm taking hints from your text.

 

I keep thinking the solution *should* be more like a cumulative total that doesn't demand so much memory, because I'm having a hard time figuring out where there's a need for an algorithm that:

  • fundamentally smooths over fluctuations by "totalizing" across a 24-hour period
  • keeps updating a sliding window of totals at an 8 Hz rate, thus demanding retention of every sample taken in the previous 24 hrs.  With this kind of rolling window, each new sample contributes only 1/(8*60*60*24) to the rolling result, or just over 1 part per million.

 

Do you really need a rolling average?  Or just a 24-hour cumulative total you can reset every 24 hours?

 

 

-Kevin P

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 6 of 8

You know exactly how big the array will be, so it would help to initialize it once at a fixed size, all zeroes. Now you can get rid of all the fancy math and simply use Quotient & Remainder to keep track of the insertion point (the insertion point increments by one and wraps back to zero after the end has been reached). At each iteration, you would get the current value (or row) and place the new value (easiest with an In Place Element structure). You don't even need to sum the array elements; simply keep the running sum in a scalar shift register, where you subtract the element(s) you just removed and add the element(s) you just placed in that same location.
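The whole recipe fits in a few lines when sketched in Python (not LabVIEW; the class and method names are illustrative, and the shift registers become instance fields here):

```python
class RollingTotalizer:
    """24 h rolling total over a fixed, pre-allocated buffer.

    Mirrors the LabVIEW pattern described above: Initialize Array once,
    Quotient & Remainder for the wrap-around insertion point, in-place
    replace of the oldest element, and the running sum kept in a scalar
    (the shift-register equivalent).
    """

    def __init__(self, size):
        self.buf = [0.0] * size   # fixed size, all zeroes
        self.i = 0                # total samples seen so far
        self.total = 0.0          # running 24 h sum

    def add(self, sample):
        slot = self.i % len(self.buf)           # Q&R insertion point
        self.total += sample - self.buf[slot]   # add new, subtract the replaced value
        self.buf[slot] = sample                 # in-place replace
        self.i += 1
        return self.total
```

With `size = 691200` per channel, each sample costs one subtract, one add, and one replace, regardless of the window length.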

 

I think I made some example code very long ago; let me try to find it.

Message 7 of 8

Thank you for the input. Had to postpone this for a while but am back in the saddle. The Q&R idea is a good one and the init/RT advice was helpful. 

Message 8 of 8