LabVIEW


High speed AO data not matching corresponding AI data

Solved!

I am building up a random noise signal generator system for vibration table applications. Before hooking it up to the vibe table, I wanted to ensure that my desired output signal matches up with what the shaker will be seeing. I simply connected an NI 9215 AI channel with the output channel on an NI 9263 on my cDAQ -- and voila! ...Of course the signals don't match...

 

At the moment, I'm using DAQmx directly instead of the DAQ Assistant, but I could be swayed towards the latter if anyone has suggestions/preferences. 

In general, my data file is quite large (160k data points) and needs to be 'completed' in the span of 5 sec -- which means 32 kS/s. According to the datasheets, this should be no issue for either of my NI modules; however, when I plot the acquired voltage data, it...

(A.) is different every single time

(B.) is scaled down to about 0.25x the original amplitude

(C.) does not generate a bell curve histogram, as would be expected with random noise data. 
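As an aside, the "bell curve" expectation itself is easy to sanity-check offline, independent of any DAQ hardware. This is an illustrative plain-Python sketch (not from the original post); the sigma and thresholds are just example values:

```python
import random
import statistics

random.seed(42)  # reproducible illustration

# 100k samples of Gaussian ("random") noise: mean 0 V, sigma 1 V
samples = [random.gauss(0.0, 1.0) for _ in range(100_000)]

mean = statistics.fmean(samples)
stdev = statistics.pstdev(samples)

# For a true bell curve, roughly 68% of samples fall within
# one standard deviation of the mean
within_1_sigma = sum(abs(s - mean) <= stdev for s in samples) / len(samples)
```

If the acquired AI data fails a check like this while the source file passes it, the distortion is happening somewhere in the generate/acquire chain rather than in the source data.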

 

It should be noted that I tinkered around with acquiring the AI data via the headphone jack (play waveform function + mutilation of some pc speakers), instead of gathering the data coming out of 9263. That turned out much better, albeit, not perfect. By gathering through the headphone jack, the acquired signal looked like random noise, although it didn't match my original signal. Also, the generated histogram resulted in a bell curve, as hoped. 

The kicker is that upon performing an FFT transformation to get the data in the frequency domain, it was far from meeting my intended PSD envelope. 
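The PSD bookkeeping can also be verified offline. Below is a minimal, illustrative periodogram in plain Python (a naive DFT, not from the original VIs; a real implementation would use an FFT), with a Parseval check that the power in the time record matches the power under the PSD:

```python
import cmath

def dft(x):
    """Naive DFT -- O(N^2), fine for a short sanity check."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def psd(x, fs):
    """Two-sided power spectral density estimate (periodogram), V^2/Hz."""
    n = len(x)
    return [abs(xk) ** 2 / (fs * n) for xk in dft(x)]

fs = 8.0                       # Hz, toy sample rate
x = [1.0, 0.0, -1.0, 0.0] * 2  # a 2 Hz cosine sampled at 8 Hz

p = psd(x, fs)

# Parseval check: mean power in time == total power summed over PSD bins
time_power = sum(v * v for v in x) / len(x)
freq_power = sum(p) * fs / len(x)
```

Running a check like this against the same waveform data the AO task is fed makes it easier to tell whether a PSD mismatch comes from the analysis or from the generation/acquisition.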

 

That being said, I would still ultimately like to have NI 9263 serving as my AO device for driving the vibe table. After messing with it for a few hours, I'm truly stumped. I figure it has to be one of three things: 

- NI 9263 is not outputting correctly

- NI 9215 is not gathering data correctly

- Code structure is wrong (most likely)

 

Any help is appreciated! I apologize if the code is difficult to interpret -- still getting the hang of DAQmx. Also, fair warning, the raw data file is about 4.7 MB (160k data points). 

Message 1 of 9
Solution
Accepted by PFour

Only have a couple minutes to respond now.   I could only look at the screenshot of your code and I see at least the following problems:

 

1.  You aren't waiting for the AO signal to be generated.  As soon as you write the data and auto-start the task, you immediately clear the task. That ends the AO generation.  
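In text form, the intended ordering looks something like this (sketched in Python with nidaqmx-style Task method names purely for illustration; on the LabVIEW diagram the same write -> start -> wait -> stop order is enforced with the error wire):

```python
def run_finite_ao(task, data, timeout=10.0):
    """Correct finite-AO ordering: write the buffer, start the task,
    WAIT for generation to complete, and only then stop/clear the task.
    Clearing immediately after an auto-started write aborts the output
    partway through, which is what the original screenshot was doing."""
    task.write(data, auto_start=False)   # load the buffer; don't start yet
    task.start()                         # generation begins here
    task.wait_until_done(timeout)        # block until every sample is out
    task.stop()                          # now it's safe to stop/clear
```

On real hardware, `task` would be a configured AO task (e.g. an nidaqmx Task with an AO voltage channel and finite sample-clock timing); the channel names and timeout here are placeholders.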

 

 

2. You aren't syncing the AO and AI in any way.

 

3. You'll want to think carefully about your AO task -- Finite vs Continuous, Allow Regeneration or Not, having a way to gently bring your shaker back to 0 when you stop your test waveform from generating.

 

4. A standard AO DAQ device like yours will generate piecewise-constant samples at the sample rate, then transition rapidly (almost instantly) at the AO slew rate to the next sample value, etc.  On a scope, you'll see a raggedy stair-stepping kind of appearance.  This could inject lots of additional high-frequency excitation into your shaker.  You might want to consider something more like an arbitrary waveform generator.
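That zero-order-hold behavior is easy to model (illustrative sketch, not part of the original post): each sample value is held flat for a full sample period, then the output jumps to the next value, and it's those jumps that carry energy above the intended bandwidth.

```python
def zero_order_hold(samples, oversample=8):
    """Model a DAC's piecewise-constant output: each sample value is
    held for the whole sample period, then jumps to the next value."""
    return [v for v in samples for _ in range(oversample)]

steps = zero_order_hold([0.0, 1.0, -0.5], oversample=4)

# The 'analog' trace is flat within each period; the step discontinuities
# at period boundaries are the source of the extra high-frequency content
jumps = [abs(b - a) for a, b in zip(steps, steps[1:])]
```

An arbitrary waveform generator (or an analog reconstruction filter after the DAC) smooths these steps out before they reach the shaker.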

 

 

-Kevin P

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 2 of 9

Thanks for the input Kevin,

 

1. If I ditch the auto-start and place a "Start Task" function after the DAQmx Write, would that solve this problem? Seems to me like this would fix my issue of clearing the task before generation has finished.

 

2. Hm.. This is the first I'm hearing of such a thing. I'll do some research and see if I can figure out how to do this.


3. Each dataset I use will be finite and I will know the exact size/required rate. So wouldn't finite be the best way to go? Still learning what "Regen vs. Non-Regen" even means, but from various other example code/white papers, it seems that Non-Regen is what I want. Again, I'll continue to look into this. As for settling the shaker back to zero, I'm not sure how I could go about this other than ensuring that the last AO voltage value is zero in each dataset. Would this be sufficient, or in your experience do vibe systems require some sort of 'ramp-down' event? 

 

4. As for the arbitrary waveform generator... I had originally looked into using this VI and 'loading' my data points from the .txt file. But I was having issues with getting the file format to be correct, so I moved on to using a simple read-from-spreadsheet.vi.

I see what you mean though with pseudo high-frequency signals possibly coming into play in the actual output signal. I'll take another crack at using the arb waveform generator. 

 

I appreciate the help! 

Message 3 of 9

If you can "Save for previous version" back to LV 2016, I can help with a little tweak that will sync up your AO and AI.  It'll be easier than explaining in text.  There are two basic parts: one comes from correct software dataflow, and the other comes from a shared sample clock on the board.

 

 

-Kevin P

 

Message 4 of 9

Hey Kevin,

UPDATE: I heeded your advice and made a few changes -- things are looking much better (albeit, not perfect). Here's what I changed (see attached photos).

 

So.. I DO want finite sampling, I DO want non-regen, the AO signal is now waiting to complete generation (no longer auto-starting), and I am syncing the AO and AI via the error line. Thumbs up all around!!

 

However, the data is still looking a little bit wonkier than I'm expecting. While it's pretty easy to visually match up the AO and AI time waveforms and see that most of the peaks line up, it appears that I'm still losing some of my data points. This delta is most apparent in the resultant PSD -- I'm clearly losing any filtering effectiveness I previously had with the output data.

 

I had an 'aha' moment and realized that in order to acquire the AI data with any confidence, I should be sampling at least 2x the rate of the output. However, since the AO and AI are now synced, changing this actually had the reverse effect.

I.e. increasing AI sample rate significantly decreases my ability to re-create the AO signal... which is a pretty weird/ unexpected response. 

 

Also, following your advice I've generated a .lvm measurement file from my .txt file data and have attempted to load that into an arb signal generator. This is all fine and dandy, BUT as soon as I try to 'load the data' and press 'ok' on the arb signal generator interface, LabVIEW decides to crash. Too much data for that little function, I suppose? 

 

Best,

 

Austin 

Message 5 of 9

Another update:

A few interesting discoveries.. I CAN increase my sample rate without screwing up the time history data too much if I increase AO and AI together; however, doing so raises the high-end frequency of my PSD (equivalent to raising the cutoff of a low-pass filter). Seems like 32 kHz is the sweet spot for staying between 20 Hz and 2 kHz, which makes sense because I'm doing 160k samples in 5 sec. 
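The arithmetic behind that "sweet spot" checks out (plain Python, just restating the numbers from the thread):

```python
n_samples = 160_000   # points in the data file
duration_s = 5.0      # required test duration, seconds

rate = n_samples / duration_s   # required sample rate, S/s -> 32000.0
nyquist = rate / 2.0            # highest representable frequency, Hz -> 16000.0
df = 1.0 / duration_s           # FFT bin spacing, Hz -> 0.2
```

The 20 Hz - 2 kHz band of interest sits comfortably below the 16 kHz Nyquist limit. Raising both rates with the same 160k-point file also shortens the record, which widens the frequency bins and pushes the PSD's upper edge out, consistent with what was observed.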

 

Another seemingly counter-intuitive response that I've noticed is that by increasing the max/min voltage range of my AO/AI units, I'm actually collecting data that more clearly resembles the original dataset. This is strange because I would imagine that the smaller the window the voltage is oscillating over, the better the precision that can be achieved. I.e. it's easier to go from 0.1 V to -0.1 V than to go from 10 V to -10 V. Either way, that data looks nearly perfect with a [-10, +10] V range!
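The intuition about smaller ranges giving finer resolution is right for converters with programmable gain: the quantization step (1 LSB) scales with the requested range. This sketch is illustrative only; note that per their datasheets the 9263/9215 appear to be fixed ±10 V modules, so on that hardware the requested min/max mainly affects driver-side scaling and clipping rather than the physical converter range, which may explain the observation.

```python
def lsb_volts(v_min, v_max, bits=16):
    """Quantization step (1 LSB) of an ideal converter over a given range."""
    return (v_max - v_min) / 2 ** bits

wide = lsb_volts(-10.0, 10.0)   # ~305 uV per code over +/-10 V
narrow = lsb_volts(-0.1, 0.1)   # ~3 uV per code over +/-0.1 V
```

If the hardware range is fixed, requesting a narrower software range buys no extra resolution, and an output signal that only spans ±0.1 V on a ±10 V, 16-bit device is still resolved to ~305 uV steps.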

My PSD is low by an order of magnitude, but that shouldn't be an issue as I can just adjust the sensitivity on my power amplifiers which are delivering the signal to the shaker table. 

 

I'll continue trying to use the arbitrary signal generator, but I think that my dataset is simply too large for what that little function was designed to handle. 

 

Thanks again Kevin! I'll mark your original response as this thread's solution as you put me on the right track. Cheers!

 

Austin

Message 6 of 9

Hey [again] Kevin,

Briefly bringing this thread back to life to review one of your earlier comments.

I've thought about this concept of 'gently bringing the shaker' to/ from 0 to run my test.

 

At the moment, I'm using the 'Ramp by Samples' VI to precede my primary test with a duplicate of the same dataset, but gradually scaling it up from x0 to x1. I also do the same thing at the end of the test, but in reverse, to ramp down from x1 to x0. 

 

In other words, I went from 5 sec duration to 15 sec duration, by running the dataset 3 times wherein the first and last sets are ramped to/ from 0. In your experience, would you expect this to be sufficient to ensure I don't damage my shaker? 
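For reference, the ramp-in/ramp-out idea amounts to multiplying the edges of the waveform by a linear 0-to-1 envelope (illustrative sketch, not the actual 'Ramp by Samples' VI; the ramp fraction is a placeholder). Applied to a short slice of the existing dataset, this avoids tripling the test duration:

```python
def ramp_envelope(data, ramp_fraction=0.1):
    """Apply linear ramp-in/ramp-out envelopes to the first and last
    portions of a waveform so the drive signal starts and ends at 0 V."""
    n = len(data)
    r = max(1, int(n * ramp_fraction))
    out = list(data)
    for i in range(r):
        gain = i / r               # 0 -> ~1 across the ramp-in
        out[i] *= gain             # ramp-in at the front
        out[n - 1 - i] *= gain     # mirrored ramp-out at the back
    return out

w = ramp_envelope([1.0] * 10, ramp_fraction=0.3)
```

A much shorter envelope than a full duplicate dataset (a small fraction of the record) should be enough, since the goal is only to avoid step commands at the start and end.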

 


3. You'll want to think carefully about your AO task -- Finite vs Continuous, Allow Regeneration or Not, having a way to gently bring your shaker back to 0 when you stop your test waveform from generating.

 


 

Message 7 of 9

I've been meaning to look over the things you posted in msg #5 more carefully but haven't been able to find time lately.   If code has changed since then, please post the latest version after saving back to LV 2016 format and describe any remaining questions/issues.

 

I haven't directly worked with shaker tables, but I would strongly expect that the "ramp in" and "ramp out" can be a much shorter duration, like a fraction of a second.  The main idea is simply to avoid large step function commands from whatever voltage it happened to end on last time to whatever voltage it happens to want to start from this time.  The ramp time should only need to be slow enough to be within the bandwidth of the shaker's response capability.  I'd start with a guess like 100 msec for ramping -- that seems to me very likely sufficient while also not being a burdensome amount of additional time.

 

 

-Kevin P

Message 8 of 9

I haven't typically tried to sync tasks via reference to something like "aiStartTrigger" unless the AI task was configured to wait on a Start Trigger in the first place.   I've seen others' code here doing it that way and they've reported success, so I'll tentatively suppose that your screenshot in msg #5 accomplishes sync sufficiently.

 

The next simple thing you could do is to set your AI task for, say, 5x the sample rate and 5x the # samples.   One key reason your graphs for AO don't match your measurements on AI is that the AO waveform you graph isn't the one you actually generate and measure.  You do a couple more steps of waveform processing before writing the waveform to the AO task.
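With a shared clock and an integer oversampling factor, each AO sample should show up as a run of AI readings that can be collapsed back down for a point-by-point comparison. This toy model (illustrative, ideal wiring and no noise) sketches the idea:

```python
def simulate_oversampled_ai(ao_samples, factor=5):
    """Toy model: the AI task samples 'factor' times per AO sample period,
    so each held AO value appears 'factor' times in the AI record."""
    return [v for v in ao_samples for _ in range(factor)]

def decimate_by_mean(ai_samples, factor=5):
    """Collapse each group of 'factor' AI readings back to one point
    for a sample-by-sample comparison against the AO buffer."""
    return [sum(ai_samples[i:i + factor]) / factor
            for i in range(0, len(ai_samples), factor)]

ao = [0.0, 0.5, -0.25, 1.0]           # what was written to the AO task
ai = simulate_oversampled_ai(ao)      # what a 5x AI task would see, ideally
recovered = decimate_by_mean(ai)      # should reproduce the AO buffer
```

The comparison is only meaningful against the waveform actually written to the AO task, i.e. after any intermediate processing steps, which is the mismatch described above.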

 

 

-Kevin P

Message 9 of 9