06-28-2017 06:08 PM
Hello,
I am trying to perform an FFT calculation over multiple periods, in real time, on an incoming waveform generated from a .mat file. This waveform repeats itself every 0.1 seconds, and I would like to perform an FFT on each period to find the peak magnitude.
My current methodology has been the following:
1) I build a waveform from the .mat file, defining dt between points
2) I index that signal so it can be fed into the buck converter in Multisim
3) I take the output voltage and convert it back into a waveform so it has a time component
4) I convert that signal into an array, divide the array into 10 segments (one for each period), and feed those, along with the signal, into the FFT VI
Step four is where I am really lost. I don't believe my FFT calculation is being done correctly, and furthermore, I always get this error about 0.5 seconds (sim time) into the simulation:
Unable to converge during transient analysis.
Consider increasing the ABSTOL, VNTOL, and RELTOL options.
I am unsure how to fix that as well.
I have only been using LabVIEW for a week! Help!
The .mat file, VI, and circuit have all been included.
Thanks! 🙂
06-29-2017 08:24 AM
There's a lot about your problem I don't understand (personally, I've always considered a Cash Register to be a "buck-converter"), but if your problem is the last step, understanding and debugging the FFT routine, I suggest you write yourself a small LabVIEW program that (a) generates a known waveform with known components (for example, a second of signal made by summing sinusoids at 31, 73, 89, and 101 Hz, with a little Gaussian noise thrown in for good measure), (b) runs this waveform through the FFT computation of your choice, and (c) examines the results, whose values you should "know".
Bob Schor
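Since LabVIEW block diagrams can't be shown in text, here is a minimal sketch of Bob's test idea in Python/NumPy; the 1 kHz sample rate and noise level are illustrative assumptions, not anything from the thread.

```python
import numpy as np

# Build one second of a known test signal: sinusoids at 31, 73, 89,
# and 101 Hz plus a little Gaussian noise (assumed fs = 1 kHz).
fs = 1000                          # sample rate in Hz (illustrative choice)
t = np.arange(fs) / fs             # 1 s of time stamps
rng = np.random.default_rng(0)     # seeded so the result is repeatable
signal = sum(np.sin(2 * np.pi * f * t) for f in (31, 73, 89, 101))
signal += rng.normal(scale=0.1, size=t.size)

# One-sided FFT magnitude spectrum and its frequency axis.
spectrum = np.abs(np.fft.rfft(signal)) / t.size
freqs = np.fft.rfftfreq(t.size, d=1 / fs)

# With 1 s of data the bin spacing is exactly 1 Hz, so the four
# largest peaks should land on the known input frequencies.
peaks = sorted(freqs[np.argsort(spectrum)[-4:]])
print(peaks)   # expect [31.0, 73.0, 89.0, 101.0]
```

Because each sinusoid completes an integer number of cycles in the record, there is no spectral leakage and the peaks sit exactly on their bins, which is what makes this a good self-check before trusting the FFT on real converter output.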
06-29-2017 07:46 PM
Hi Bob,
Thanks for the response. I actually do have the FFT portion working properly now; however, I can't seem to "parse" the signal I am generating. I don't want to do an FFT calculation on the whole signal; rather, I would like to calculate it every 0.1 ms, which means I need to incorporate some sort of buffer. This is easy to do in MATLAB but seems nearly impossible in LabVIEW.
06-29-2017 08:52 PM - edited 06-29-2017 08:55 PM
Ah - reading here makes your problem a little clearer than in your new post.
Would Array Subset work for you? Or perhaps Delete From Array? If you know the rate of your data, you can work out how many points fall in 0.1 ms. If you want to take, for example, 1 s of data and split it into 10000 parts, you could use a While Loop with Delete From Array and an Empty Array check, or a For Loop with Array Subset and a known N.
I feel like the While Loop is probably neater, but you should be careful to ensure that the length input is at least 1.
Edit: I also realise that you could make the auto-indexing output of the While loop conditional, wire the Not of Empty Array to the conditional output, and avoid the trailing DFA.
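For readers who think in text rather than G, the For-Loop-with-Array-Subset approach above can be sketched in Python/NumPy; the 100 kS/s rate and the placeholder sine are assumptions chosen only so the thread's numbers (1 s into 10000 parts of 0.1 ms) work out.

```python
import numpy as np

# Assumed setup: 1 s of data at 100 kS/s, split into 0.1 ms chunks,
# giving 10 samples per chunk and 10000 chunks total.
fs = 100_000
data = np.sin(2 * np.pi * 1000 * np.arange(fs) / fs)   # placeholder signal
chunk_len = int(0.1e-3 * fs)                           # samples per 0.1 ms

# Each loop iteration takes a known-length slice, which is what
# Array Subset does with index i * chunk_len inside a For Loop.
chunks = [data[i * chunk_len:(i + 1) * chunk_len]
          for i in range(len(data) // chunk_len)]
spectra = [np.abs(np.fft.rfft(c)) for c in chunks]
print(len(chunks), chunk_len)   # 10000 10
```

Note that 0.1 ms at this rate is only 10 samples per FFT, which gives very coarse frequency resolution; that is a property of the chunk length, not of the splitting method.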
07-01-2017 05:38 AM
Reshape Array would be a cleaner way to form your 2D array. Then you can just use a For Loop, auto-indexing on that array, to perform your FFTs.
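The reshape-then-loop pattern maps directly onto NumPy, so here is an illustrative Python sketch; the 10 kS/s rate, the 50 Hz test tone, and the 0.1 s period length are assumptions standing in for the poster's actual signal.

```python
import numpy as np

# Assumed setup: 1 s of signal at 10 kS/s that repeats every 0.1 s,
# so each period is 1000 samples and there are 10 periods.
fs = 10_000
period_len = fs // 10                                   # samples per 0.1 s
one_period = np.sin(2 * np.pi * 50 * np.arange(period_len) / fs)
signal = np.tile(one_period, 10)                        # placeholder signal

# Reshape Array equivalent: one period per row, then an FFT per row,
# which is what the auto-indexed For Loop does in LabVIEW.
rows = signal.reshape(10, period_len)
spectra = np.abs(np.fft.rfft(rows, axis=-1))
peak_bins = spectra[:, 1:].argmax(axis=1) + 1           # skip the DC bin
print(rows.shape, peak_bins[0])
```

With 1000-sample rows the bin spacing is fs / period_len = 10 Hz, so the 50 Hz tone peaks in bin 5 of every row; reading that peak magnitude per row is the per-period result the poster is after.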