
LabWindows/CVI


Is it better to make multiple arrays of data or one big array and access it many times?


I am going to acquire up to 5 samples per second on 50 channels over 30 minutes, then average ten of the channels and generate a report.  There are other bits of information needed to complete my task and graphs.  Would it be "faster" for the PC to have one large 4-D array with all the information, or faster to make several separate arrays?  Would one or many be more reliable (fewer code lockups)?

Message 1 of 4

Hello CalorimeterOperator,

 

First of all, I suspect you have posted to the wrong forum: I saw that all your other questions were related to LabVIEW, so you are probably coding in LV and should post in the relevant forum.

 

Having said this, why are you thinking of a 4-D array? I see only a 2-D array: channel vs. time.

 

Since it seems that you need to post-process the data after acquisition has completed, the measurements acquired in memory scan-by-scan can be rearranged channel-by-channel by transposing the matrix (with TransposeData () in CVI or Transpose2dArray.vi in LabVIEW) to simplify averaging the individual channels.
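The idea above can be sketched in plain C. Data arrives scan-by-scan as a flattened data[scan][channel] buffer; transposing it to channel-major order (what TransposeData() does in CVI) puts each channel's samples in one contiguous run, which makes per-channel averaging trivial. The function names here (transpose_scans, channel_mean) are illustrative, not NI APIs:

```c
#include <stddef.h>

/* Rearrange scan-major data (src[scan * n_channels + channel]) into
 * channel-major data (dst[channel * n_scans + scan]). */
void transpose_scans(const double *src, double *dst,
                     size_t n_scans, size_t n_channels)
{
    for (size_t s = 0; s < n_scans; s++)
        for (size_t c = 0; c < n_channels; c++)
            dst[c * n_scans + s] = src[s * n_channels + c];
}

/* Average one channel's samples from a channel-major buffer. */
double channel_mean(const double *by_channel, size_t n_scans, size_t channel)
{
    const double *samples = by_channel + channel * n_scans;
    double sum = 0.0;
    for (size_t s = 0; s < n_scans; s++)
        sum += samples[s];
    return sum / (double)n_scans;
}
```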



Proud to use LW/CVI from 3.1 on.

My contributions to the Developer Community
________________________________________
If I have helped you, why not give me a kudos?
Message 2 of 4

Yes, I do mean to be using LabVIEW.  I expected this crowd to have a different perspective (and I could not find the LV blog).

For each channel, I want to scale the readings using six other variables per channel (nominal high, mid, low, and reading high, mid, low).  So there are the channels and time, then the calibration values, then the calibrated readings.  If I put all that along with my test information in one large matrix, it would be simpler for me to remember where each item is; but if it is split into several matrices, the "active" matrix stays smaller while the other information goes unused.
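One way to keep the "active" matrix small is to hold the six per-channel calibration values in their own small array of structs, separate from the sample data. A minimal sketch, assuming a two-segment linear interpolation between the low/mid/high calibration points (the struct and function names are illustrative, not from the original post):

```c
/* Per-channel calibration: nominal (true) values and the raw readings
 * observed at the low, mid, and high calibration points. */
typedef struct {
    double nom_low, nom_mid, nom_high;
    double raw_low, raw_mid, raw_high;
} ChannelCal;

/* Map a raw sample to a calibrated value by linear interpolation
 * on the low..mid or mid..high segment. */
double apply_cal(const ChannelCal *cal, double raw)
{
    if (raw <= cal->raw_mid)
        return cal->nom_low + (raw - cal->raw_low) *
               (cal->nom_mid - cal->nom_low) / (cal->raw_mid - cal->raw_low);
    return cal->nom_mid + (raw - cal->raw_mid) *
           (cal->nom_high - cal->nom_mid) / (cal->raw_high - cal->raw_mid);
}
```

With this layout, an array of 50 ChannelCal structs sits beside the 2-D sample matrix, so the calibration constants never inflate the data you scan through at acquisition time.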

The NI sales rep was indicating that the computing power is much higher than what I am used to.  (I have not started programming yet; I am preparing for a huge project on a "new" (two-year-old) PC.)  I am trying to understand just how much power I have been missing since my days of GW-BASIC and QBasic, and with our current Visual Basic 6.0 running on XP.  This matrix question is new to me.

Message 3 of 4
Solution
Accepted by topic author CalorimeterOperator

This discussion has also been posted in the LabVIEW forum.

 

I can only add that you should check how many of your values actually vary, and therefore whether it is worth saving a value on every sample. I mean: the nominal min/max/average and the measured min/max are probably constant throughout the process, while you may want to keep a moving average of the actual measurements, or a cumulative average of the measurements up to some point. This could reduce memory usage and possibly eliminate a matrix dimension.
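The cumulative-average idea can be sketched like this: instead of storing every sample for a channel you only need its average, keep one running mean and a count and update them as each sample arrives, using the standard incremental form mean += (x - mean) / n. The struct name is an illustrative assumption:

```c
/* Running (cumulative) average: one mean and one count per channel
 * replaces storing the full sample history. */
typedef struct {
    double mean;
    unsigned long count;
} RunningAvg;

/* Fold one new sample into the running average. */
void running_avg_add(RunningAvg *ra, double x)
{
    ra->count++;
    ra->mean += (x - ra->mean) / (double)ra->count;
}
```

For the 50-channel case above, an array of 50 of these structs costs a few hundred bytes, versus megabytes for 30 minutes of raw samples.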



Message 4 of 4