06-04-2013 04:04 PM
I am going to acquire up to 5 samples per second on 50 channels for 30 minutes, then average ten of the channels and generate a report. There are other bits of information needed to complete my task and graphs. Would it be "faster" for the PC to have one large 4D array with all the information, or faster to make several separate arrays? Would one or many be more reliable (fewer code lockups)?
06-05-2013 02:26 AM
Hello CalorimeterOperator,
First of all, I suspect you have posted to the wrong forum: I saw that all your other questions were related to LabVIEW, so you are probably coding in LV and should post in the relevant forum.
Having said this, why are you thinking of a 4D array? I see only a 2D array: channel vs. time.
Since it seems that you need to post-process data after acquisition has completed, measurements acquired in memory per-scan can be rearranged per-channel by transposing the matrix (with TransposeData() in CVI or Transpose2dArray.vi in LabVIEW) to ease the process of averaging the individual channels.
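To make the rearrangement concrete, here is a minimal sketch in plain C (in the CVI spirit, not an NI API). It assumes the driver delivers data scan-major, i.e. `scanMajor[s * nChannels + ch]`; the function names and layout are illustrative:

```c
#include <stddef.h>

/* Rearrange a scan-major buffer into a channel-major one --
   the same effect as transposing the 2D matrix. */
void transpose_scans(const double *scanMajor, double *chanMajor,
                     int nScans, int nChannels)
{
    for (int s = 0; s < nScans; s++)
        for (int ch = 0; ch < nChannels; ch++)
            chanMajor[ch * nScans + s] = scanMajor[s * nChannels + ch];
}

/* Once the data is channel-major, averaging one channel is a single
   contiguous pass over memory. */
double channel_mean(const double *chanMajor, int nScans, int ch)
{
    double sum = 0.0;
    for (int s = 0; s < nScans; s++)
        sum += chanMajor[ch * nScans + s];
    return sum / nScans;
}
```

At 5 S/s for 30 minutes that is only 9000 scans of 50 doubles (about 3.5 MB), so the transposed copy is cheap on any modern PC.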
06-05-2013 09:07 AM
Yes, I do mean to be using LabVIEW. I expected this crowd to have a different perspective (and I could not find an LV blog).
For each channel, I want to scale per six other variables per channel (nominal high, mid, low, and reading high, mid, low). So there are the channels and time, then the calibration values, then the calibrated readings. If I put all of that along with my test information in one large matrix, it would be simpler for me to remember where each item is, but if it is in several matrices then the "active" matrix is smaller while the other information is not in use.
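One way to keep the "active" matrix small is to hold the six calibration values in their own per-channel record, separate from the 2D readings array, rather than packing everything into one 4D matrix. A sketch in C, assuming the six values are meant for a two-segment linear interpolation through the three calibration points (the actual scaling rule isn't stated, so this is a guess at the intent; names are hypothetical):

```c
/* Hypothetical per-channel calibration record: the six values per
   channel live here, separate from the raw 2D readings array. */
typedef struct {
    double nomLo, nomMid, nomHi;  /* nominal low / mid / high */
    double rdLo,  rdMid,  rdHi;   /* raw reading at each nominal point */
} ChanCal;

/* Two-segment linear interpolation through the three calibration
   points (assumed interpretation of the six values). */
double apply_cal(const ChanCal *c, double reading)
{
    if (reading <= c->rdMid)
        return c->nomLo + (reading - c->rdLo) *
               (c->nomMid - c->nomLo) / (c->rdMid - c->rdLo);
    return c->nomMid + (reading - c->rdMid) *
           (c->nomHi - c->nomMid) / (c->rdHi - c->rdMid);
}
```

An array of 50 `ChanCal` records then sits beside the readings matrix, so the calibration data never bloats the array you scan on every sample.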
The NI sales rep was indicating that the computer power is much higher than I am used to. (I have not started programming yet; I am preparing for a huge project on a "new" (2-year-old) PC.) I am trying to understand just how much power I have been missing since my days of GW-BASIC and QBasic, and our current Visual Basic 6.0 running on XP. This matrix question is new to me.
06-07-2013 02:09 AM
This discussion has also been posted in the LabVIEW forum.
I can only add that you should verify how many of your values actually vary over time, and whether it is therefore worth saving a value on every sample. I mean: nominal min/max/average and measurement min/max are probably constant during the process, while you may want to keep a moving average of the actual measurements, or a cumulative average of the measurements up to a given point. This could help reduce memory usage and possibly eliminate a matrix dimension.