LabVIEW


Signal out of sensor range


dsb@NI wrote:

~40 s * 5000 S/s is only ~200000 samples. I would not expect this data to bog down a modern computer. As @GerdW suggests, share your code (with acquired data in a constant or saved as default in a control).


I didn't manage to save the acquired data in a constant or make it the default value of a control, so I attached it as a TXT/TDMS file instead. This is the data without the code you suggested, because the error that occurs prevented me from doing that.

Message 11 of 21

Hi Michael,

 


@Michael_Jr wrote:
This is the data without the code you suggested, because the error that occurs prevented me from doing that.

Ok.

You set a sample rate of 2 kS/s in your DAQ Assistant, but read 25.6k samples per call. Fortunately you didn't forget to set the timeout to 14 s.
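
For readers wondering why that 14 s timeout matters, a quick back-of-envelope check (plain Python arithmetic, assuming the usual 10 s default read timeout):

```python
samples_to_read = 25_600   # "samples to read" set in the DAQ Assistant
sample_rate = 2_000        # configured sample rate in S/s

read_duration = samples_to_read / sample_rate
print(read_duration)       # 12.8 s per read
# A 12.8 s read would trip the usual 10 s default timeout,
# which is why the explicit 14 s timeout is needed here.
```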

Then you do a lot of conversions with those data:

  • create 2 plots with 25600 samples each
  • filter your samples
  • calculate several values (energies etc.) from the filtered samples
  • convert all those values into DDT wires (for no good reason)
  • build an array of DDT (basically) by using Merge Signals
  • save the bundled data using one more Express VI…

I don't think the plot creation loop (with your "out of bound detection") is the reason for the "out of memory" issue…

When does the error occur? Immediately in the first iteration? Or only after running the loop for some time?

 

Btw. cleaning up your code improves readability:

How do you stop your VI?

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 12 of 21

Thank you for your answer, Gerd!

 

I thought it was cleaned up, but I guess that wasn't enough 😄

 

 


@GerdW wrote:

You set a sample rate of 2 kS/s in your DAQ Assistant, but read 25.6k samples per call. Fortunately you didn't forget to set the timeout to 14 s.

So "samples to read" still plays a role even if the Aquisition mode is set to "continuous samples"?

 

 

 


Then you do a lot of conversions with those data:

  • create 2 plots with 25600 samples each
  • filter your samples
  • calculate several values (energies etc.) from the filtered samples
  • convert all those values into DDT wires (for no good reason)
  • build an array of DDT (basically) by using Merge Signals
  • save the bundled data using one more Express VI…

I'm still a newbie in LabVIEW, so please excuse my mistakes. I tried to change the channel name and, as a side effect, converted all those calculated values into DDT, because somehow the names didn't change in the measurement file when I changed them in the channel properties only.

 

 

 

How do you stop your VI?


By pressing the red Abort Execution button. I have the feeling that's not the proper way to do it…

 

 

 

I don't think the plot creation loop (with your "out of bound detection") is the reason for the "out of memory" issue…

When does the error occur? Immediately in the first iteration? Or only after running the loop for some time?


The position and velocity plots show the first 12-14 s (the new "out of bound detection" plot shows nothing), then the "out of memory" prompt shows up and the VI stops working, pointing to the DAQ Assistant:

 

Possible reason(s):

The application is not able to keep up with the hardware acquisition.
Increasing the buffer size, reading the data more frequently, or specifying a fixed number of samples to read instead of reading all available samples might correct the problem.

Property: RelativeTo
Corresponding Value: Current Read Position
Property: Offset
Corresponding Value: 0

Task Name: _unnamedTask<6>

 

 

 

Message 13 of 21

Hi Michael,

 


@Michael_Jr wrote:

So "samples to read" still plays a role even if the Aquisition mode is set to "continuous samples"?


Sure it plays a role! In continuous mode it defines how many samples each read call returns…
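
To see the same relationship outside of LabVIEW, here is a minimal sketch with the nidaqmx Python API (the channel name "Dev1/ai0" is just a placeholder): in continuous mode the hardware keeps sampling, and the count passed to the read call only decides how many samples you pull from the buffer per call.

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")   # placeholder channel
    # Continuous acquisition: the sample clock runs regardless of read size.
    task.timing.cfg_samp_clk_timing(rate=2000,
                                    sample_mode=AcquisitionType.CONTINUOUS)
    # "samples to read" = how many samples this single call returns.
    chunk = task.read(number_of_samples_per_channel=200)
```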

 


@Michael_Jr wrote:

By pressing the red Abort Execution button. I have the feeling that's not the proper way to do it…


Some say this is like stopping your car by hitting a tree…

What about replacing the FALSE constant with a button?

 


@Michael_Jr wrote:

The position and velocity plots show the first 12-14 s (the new "out of bound detection" plot shows nothing), then the "out of memory" prompt shows up and the VI stops working, pointing to the DAQ Assistant:

 

Possible reason(s):

The application is not able to keep up with the hardware acquisition.
Increasing the buffer size, reading the data more frequently, or specifying a fixed number of samples to read instead of reading all available samples might correct the problem.

Property: RelativeTo
Corresponding Value: Current Read Position
Property: Offset
Corresponding Value: 0

Task Name: _unnamedTask<6>


Does it really say "out of memory"? Really?

Those "12-14s" correspond to your DAQ settings: to read 25.6k samples at 2kS/s takes exactly 12.8s!

 

The error message you posted basically says "the loop takes too long to iterate, so you don't read new samples from the DAQ Assistant fast enough" - or simply: "buffer overflow"!

 

Ideas:

  • Set a reasonable number of samples to read. The usual recommendation is to read about 1/10 of the sample rate, in your case 200 samples (see the sketch below)…
  • Make sure the file writing does not take too much time by using a local and fast hard drive - otherwise you might get the same error (buffer overflow) again…
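
If it helps to see that pattern outside of LabVIEW, here is a minimal sketch with the nidaqmx Python API (device/channel name, file path and iteration count are placeholders): read small chunks of rate/10 samples so each loop iteration finishes well before the DAQ buffer fills.

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType

RATE = 2000          # S/s, as configured in the DAQ Assistant
CHUNK = RATE // 10   # read ~1/10 of the sample rate per iteration (200 samples)

with nidaqmx.Task() as task, open("log.txt", "w") as log:   # placeholder path
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")         # placeholder channel
    task.timing.cfg_samp_clk_timing(rate=RATE,
                                    sample_mode=AcquisitionType.CONTINUOUS)
    for _ in range(100):                # a real stop condition instead of Abort
        chunk = task.read(number_of_samples_per_channel=CHUNK)
        # Keep the per-iteration work (filtering, scaling, file writing)
        # shorter than CHUNK / RATE = 0.1 s, or the buffer will overflow again.
        log.write(",".join(f"{x:.6f}" for x in chunk) + "\n")
```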

 

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 14 of 21

While you implement GerdW's suggestions, I also recommend you reduce the chart history length, which is currently set to 240000 waveforms. This caused 'out of memory' errors on my computer as well, but after I reduced the chart history length to 10 waveforms (considering that each waveform is 12.8 s long, that's still 128 s of data), LabVIEW no longer ran out of memory. While I would expect LabVIEW to handle that chart configuration better, we were asking it to allocate a 240000 * 25600 sample (~6 GB) buffer for each chart.

Doug
NI Sound and Vibration
Message 15 of 21

Sorry for my poor math. The above should have read a (~6 GS) buffer… and that's close to 50 GB.
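
For anyone who wants to check the numbers, the arithmetic (assuming 8-byte DBL samples) works out like this:

```python
history_len = 240_000         # chart history length (waveforms)
samples_per_wf = 25_600       # samples per waveform (12.8 s at 2 kS/s)
bytes_per_sample = 8          # DBL values

samples = history_len * samples_per_wf
print(samples)                           # 6_144_000_000 -> ~6.1 GS per chart
print(samples * bytes_per_sample / 1e9)  # ~49 GB per chart, close to 50 GB
```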

Doug
NI Sound and Vibration
Message 16 of 21

Thank you for your advice, Gerd and Doug!

 

 


@GerdW wrote:

Some say this is like stopping your car by hitting a tree…

What about replacing the FALSE constant by a button?


 Done, what a great comparison 😄

 

 

After implementing both of your suggestions, the "out of memory" prompt is finally gone and the VI runs again.

Your code works great, Doug. Do you maybe have an idea how I can get rid of the "spikes" in the processed plot (upper plot in the picture)?

 

 

plots_unprocessed(bottom)_processed(top).JPG

 

 

Michael

 

Message 17 of 21

Hi Michael,

 


@Michael_Jr wrote:

do you maybe have an idea how I can get rid of the "spikes" in the processed plot (upper plot in the picture)?


As usual, it would help a lot if you attached your code with some useful default data in the graphs/charts…

 

Suggestion:

Right now you analyze/filter the data by checking each sample individually. If you also checked several samples before/after the current one, you could easily filter such "spikes" too! (Those spikes come from the arbitrarily set comparison limit in this filtering: maybe you just need to adjust that limit value?)
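
To illustrate the "check the neighbours" idea outside of LabVIEW (a minimal Python/NumPy sketch, not the original VI; window and threshold are made-up example values): replace each sample that sticks out from the median of its neighbours instead of comparing single samples against a fixed limit.

```python
import numpy as np
from scipy.signal import medfilt

def remove_spikes(x, window=7, threshold=0.5):
    """Replace samples that deviate strongly from the median of their
    neighbours; window (odd) and threshold are example values only."""
    x = np.asarray(x, dtype=float)
    baseline = medfilt(x, kernel_size=window)   # median of surrounding samples
    spikes = np.abs(x - baseline) > threshold   # samples that stick out
    return np.where(spikes, baseline, x)        # keep everything else as-is
```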

 

You can improve your code even further:

(You don't need at least 2 of those 6 Abs() function calls…)

 

  • Cleanup!
  • Think about using a (typedef'd) cluster to bundle related values (like those energies/velocities) instead of 6 scalar indicators and wires in the main VI.
  • Use the default 4-2-2-4 connector pane.
  • Try to stick with the default wire connection pattern at subVIs: upper left/right for references, lower left/right for error cluster…
Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 18 of 21

Now that we know more about your application, I think we can also suggest getting rid of the invalid data instead of even trying to make it look valid. Do you have any reason not to just process the valid subset of the array?
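
In case a concrete (non-LabVIEW) illustration helps, here is a minimal NumPy sketch of that idea: keep only the samples that lie inside the sensor range and process just that subset (the range limits are placeholders).

```python
import numpy as np

def valid_subset(position, lower=-0.1, upper=0.1):
    """Return only the samples inside the sensor range;
    the limits are placeholder values."""
    position = np.asarray(position, dtype=float)
    in_range = (position >= lower) & (position <= upper)
    return position[in_range]    # process/integrate only these samples
```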

Doug
NI Sound and Vibration
Message 19 of 21

@GerdW wrote:

 

Suggestion:

Right now you analyze/filter the data by checking each sample individually. If you also checked several samples before/after the current one, you could easily filter such "spikes" too! (Those spikes come from the arbitrarily set comparison limit in this filtering: maybe you just need to adjust that limit value?)

I thought the "spikes" occur before the "Savitzky-Gokay-Filter" in the postion plot, before integrating them to velocity? Or how can I check several samples before/after the current one? (In the SG-Filter I use 7 sidepoints right now).
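
For context: if the SG filter's "side points" input is the number of samples on each side of the current one, then 7 side points correspond to a 15-sample smoothing window. A rough SciPy equivalent, with an arbitrarily chosen polynomial order and a placeholder data file:

```python
import numpy as np
from scipy.signal import savgol_filter

position = np.loadtxt("position.txt")   # placeholder for the acquired data
side_points = 7                         # as used in the VI
window = 2 * side_points + 1            # 15-sample smoothing window
smoothed = savgol_filter(position, window_length=window, polyorder=2)
# polyorder=2 is just an example; the VI's polynomial order may differ.
```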

 

@Doug: yes, that would indeed be a good solution; I just have no idea how I can analyze only the desired subset (at the time where the peak/change in position occurs…).

Message 20 of 21