01-04-2006 08:14 AM
For a single channel, the maximum record size will be a bit under 128 million samples (2 bytes/sample, plus a little overhead). However, your program will not handle that. I estimate you are making between 8 and 10 copies of your data once you fetch it from the 5124. In addition, the data is being fetched as scaled doubles (8 bytes/sample), which quadruples its size relative to the raw I16 samples. So you are getting a data multiplication factor of somewhere between 32 and 40. Depending on how much RAM your computer has, this could definitely cause problems.
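To make the arithmetic concrete, here is a back-of-the-envelope check (in Python rather than LabVIEW, just for illustration; the 2.5 M-sample record and the 9-copy count are assumptions taken from the estimates in this thread):

```python
# Rough memory estimate for one fetched record. All numbers are
# assumptions from the discussion above, not measured values.
record_len = 2_500_000             # samples in the fetch
raw_i16_bytes = record_len * 2     # record size as raw I16 (2 bytes/sample)
double_bytes = record_len * 8      # same record as scaled doubles (8 bytes/sample)
copies = 9                         # estimated copies made by the diagram

total_bytes = double_bytes * copies
factor = total_bytes // raw_i16_bytes

print(total_bytes // 1_000_000, "MB held in memory")   # 180 MB
print(factor, "x the size of the raw I16 data")        # 36x
```

A 5 MB record balloons to roughly 180 MB once it is held as doubles in nine places at once, which is exactly the multiplication factor described above.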
Not to worry, however; you just need a few simple changes. First, fetch your data as I16 instead of double. This immediately cuts the data size by a factor of four. You will need to apply the scale and offset to the results yourself, but that is not an issue; do the scaling only when and where you actually need scaled values. Next, use the NI-SCOPE measurement functions to compute the amplitude and frequency of the signal. They are far more efficient than the Tone Measurements Express VI in terms of copies, and probably speed as well.
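The "scale on demand" idea looks like this in pseudocode form (Python/NumPy here purely as a sketch; `gain` and `offset` stand in for the scaling coefficients your driver reports, and the function name is made up for illustration):

```python
import numpy as np

def scale_on_demand(raw_i16: np.ndarray, gain: float, offset: float,
                    start: int, count: int) -> np.ndarray:
    """Keep the record stored as I16; convert only the slice of
    interest to scaled doubles when a measurement or display needs it."""
    view = raw_i16[start:start + count]      # a view, not a copy
    return view.astype(np.float64) * gain + offset

# Tiny demo with a made-up gain for a full-scale 16-bit range.
raw = np.array([-32768, 0, 32767], dtype=np.int16)
print(scale_on_demand(raw, gain=1 / 32768, offset=0.0, start=0, count=3))
```

The point is that the bulk of the data stays at 2 bytes/sample, and the 8-byte doubles exist only for the small piece being processed at that moment.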
Finally, decimate your data before you display it. The graph makes at least one extra copy, and usually more. Since your display is on the order of 1000 pixels wide, plotting 2.5 million points is a waste of RAM. You can get decimation code from the tutorial Managing Large Data Sets in LabVIEW. The tutorial will also tell you how to find data copies in the first place and how to avoid them. The techniques in the tutorial are used in the NI-SCOPE Soft Front Panel, which is written in LabVIEW.
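One common decimation scheme keeps the min and max of each bucket of samples so that glitches and peaks still show up on the graph. A conceptual sketch (again Python/NumPy, not LabVIEW; the function name and 1000-pixel target are illustrative):

```python
import numpy as np

def minmax_decimate(data: np.ndarray, target_pixels: int = 1000) -> np.ndarray:
    """Reduce a long record to ~2 points per display pixel, keeping
    each bucket's min and max so peaks survive the decimation."""
    n = len(data)
    bucket = max(1, n // target_pixels)          # samples per pixel bucket
    usable = (n // bucket) * bucket              # drop the ragged tail
    chunks = data[:usable].reshape(-1, bucket)
    out = np.empty(chunks.shape[0] * 2, dtype=data.dtype)
    out[0::2] = chunks.min(axis=1)               # bucket minima
    out[1::2] = chunks.max(axis=1)               # bucket maxima
    return out

sig = np.sin(np.linspace(0, 100, 2_500_000))
print(len(minmax_decimate(sig, 1000)))           # 2000 points instead of 2.5 M
```

Wiring the graph to the 2000-point output instead of the full record means the graph's extra copies are copies of kilobytes, not megabytes.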
Note that dealing with large data sets in LabVIEW is a fairly advanced topic, and your block diagrams will start to get larger. However, with a bit of care, it can be done. Good luck. Let us know if you need more help.