
USB 6251 2- channel simultaneous sampling

Thank you very much for the advice, and you are right: I found some answers to my problem by reading posts from Kevin and others. Best regards 

Message 11 of 20

I'll add a couple thoughts directly.  There are ways to get really excellent hw-precise timing sync between your AO and AI tasks.  But it might be overkill and kinda overwhelming if you're new to DAQ devices.  First I'll outline the hardware method:

 

1. This would be simpler with a newer X-series device that directly supports retriggerable finite AI tasks.  With your M-series device, you'll need to use an indirect method with one of your counters as illustrated here.   A little more thorough info can be found here (and you may want to download the legacy examples from the link).

 

2. The concept will work like this: the AO sample clock will pulse every 100 msec.  This clock will be used to keep re-triggering a finite analog acquisition that captures 5000 samples in less than 100 msec.  Let's aim for 50 msec, corresponding to a sample rate of 100 kHz.
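As a quick sanity check on those numbers (pure arithmetic, using the figures from the step above):

```python
# Timing arithmetic for the retriggerable finite AI scheme described above.
ao_clock_period_s = 0.100   # AO sample clock pulses every 100 msec
samples_per_burst = 5000    # finite AI capture per trigger
burst_window_s = 0.050      # aim: finish in 50 msec, well inside the interval

ai_rate_hz = samples_per_burst / burst_window_s
print(ai_rate_hz)           # 100000.0 -> the 100 kHz sample rate mentioned above

# The burst must end before the next retrigger pulse arrives:
assert burst_window_s < ao_clock_period_s
```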

 

3.  Each sample gives you an A/D conversion result for all the channels in your AI task.  You'll have 3 -- ai0, ai1, and one more channel to measure the ao signal.  You have two options for that.  One is to physically wire from your ao terminal to ai2.  Another is to designate a special "internal channel" that's kinda hidden away by default.

    Go to your GUI control for designating your DAQmx channels for your AI task.  Right-click and choose "I/O Name Filtering...".  Then check the box to include Internal Channels.  Now you can see and select one with a name like "_ao0_vs_aognd".   There's a chance this internal channel will carry less noise & interference compared to the signal that has to travel out into the cruel real world.  Of course, you may *prefer* to capture that real-world signal anyway since that's what you're sending to your other equipment.

 

 

On the OTHER hand, maybe your timing and sync needs are less demanding and you'd be fine with a much simpler approach based on software timing and software re-acquiring.  Here's how *that* would go:

 

- The AO task becomes software-timed with no config call to DAQmx Timing.

- You make a software loop that starts every 100 msec.  (I'd use the "Wait Until Next ms Multiple" function that looks like a metronome.)   Establish dataflow to make sure the wait executes before your DAQmx functions (or after, either can work).

- Call DAQmx Write with the next single AO value.

- Then immediately start your finite AI acquisition task.  Read all the data and stop it again.  

- The data should be sent off to a separate loop using a producer / consumer technique based on a queue or channel wire.
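The loop above can be sketched outside LabVIEW too. Here is a rough Python analogue of the software-timed producer/consumer structure; `write_ao` and `read_finite_ai` are placeholder stubs standing in for the DAQmx Write and start/read/stop calls, and the 10 msec loop period is shortened just to keep the demo quick:

```python
import queue
import threading
import time

data_q = queue.Queue()                 # producer/consumer link (queue analogue)

def write_ao(value):                   # stub for DAQmx Write (1 sample)
    pass

def read_finite_ai():                  # stub for a finite AI start/read/stop cycle
    return [0.0] * 5000                # pretend we read 5000 samples

def producer(ao_values, period_s):
    for v in ao_values:                # one iteration per software-timed period
        t0 = time.monotonic()
        write_ao(v)                    # write the next single AO value...
        data_q.put(read_finite_ai())   # ...then immediately acquire and hand off
        # "Wait Until Next ms Multiple" analogue: sleep out the rest of the period
        time.sleep(max(0.0, period_s - (time.monotonic() - t0)))
    data_q.put(None)                   # sentinel: tell the consumer we're done

def consumer():
    chunks = []
    while (chunk := data_q.get()) is not None:
        chunks.append(chunk)           # process/log each burst in this loop
    return chunks

t = threading.Thread(target=producer, args=([0.0, 0.5, 1.0], 0.01))
t.start()
results = consumer()
t.join()
print(len(results))                    # one chunk per AO value -> 3
```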

 

The overhead time related to starting and stopping the AI task repeatedly in the loop can be reduced by taking advantage of the DAQmx task state model.  Namely, before entering the loop, call DAQmx Control Task with a "commit" action.
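For reference, the same state-model trick in the nidaqmx Python API might look roughly like this (an untested hardware-config sketch; the device name `Dev1` and the timing numbers are illustrative assumptions, not from the original posts):

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType, TaskMode

ai = nidaqmx.Task()
ai.ai_channels.add_ai_voltage_chan("Dev1/ai0:2")
ai.timing.cfg_samp_clk_timing(
    rate=100000.0, sample_mode=AcquisitionType.FINITE, samps_per_chan=5000)

# Commit once, before the loop: hardware is reserved and mostly programmed,
# so the repeated start/stop transitions below become much cheaper.
ai.control(TaskMode.TASK_COMMIT)

for _ in range(10):
    ai.start()
    data = ai.read(number_of_samples_per_channel=5000)
    ai.stop()   # the task falls back to the committed state, not unreserved

ai.close()
```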

 

 

-Kevin P

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 12 of 20

Dear Kevin, here is the vi for LabVIEW 16 as requested. 

 

https://forums.ni.com/t5/Multifunction-DAQ/Number-of-sample-and-sampling-rate/m-p/4001236?profile.la...

 

I'm now realizing I'm discussing the same app with you in two different threads.  Let's shift the discussion over to the other one, where I think there's more detail laid out more clearly.

 

To briefly address your latest post here though:

- the 2 bits of code I linked to weren't meant to be exact versions of what you need.  They were just meant to illustrate how to share a sample clock between tasks.  However, by cross-referencing that other thread, I realize that you want *sync* but you don't want the same sample rate between your AO and AI tasks.  Bottom line: there's no point troubleshooting that -200284 error.

- I can't open the code you posted b/c I'm not using LV2019 yet.  Please back-save to LV2016 or earlier (menu choice: File-->Save For Previous Version) and post to that other thread.

-Kevin P

Message 13 of 20

Dear Kevin, here is a vi without sync between Ao and Ai. But it does not scan the entire triangle function. I had to use the on-board clock as the Ai input clock. I saved it for LV 16. 

Yes, I think that your comment about "no sync between Ao and Ai" might be the right way to go. 

I need to run Ao to complete 1 period while measuring Ai0:2 during the same period, but with more samples than Ao. This is what I want to achieve. 

An alternative is to create the original Ao function so that it already includes the steps, meaning a number of samples that corresponds to the sampling of Ai. In other words, I have to create steps for each amplitude value of the initial triangle function. Each step contains the number of samples I need for statistics on each output amplitude, whatever the duration of the triangle function. If I do that, then I need to change READ and select 1 sample on N channels.
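That alternative amounts to repeating each amplitude of the original function inside the AO buffer itself. A minimal sketch in plain Python (the tiny 4-point triangle and 3-sample steps are just for illustration):

```python
def make_stepwise(waveform, samples_per_step):
    """Repeat each amplitude so the AO buffer itself contains the steps."""
    return [v for v in waveform for _ in range(samples_per_step)]

triangle = [0.0, 0.5, 1.0, 0.5]       # a tiny triangle, 4 points
stepped = make_stepwise(triangle, 3)  # hold each amplitude for 3 samples
print(stepped)
# [0.0, 0.0, 0.0, 0.5, 0.5, 0.5, 1.0, 1.0, 1.0, 0.5, 0.5, 0.5]
```

Note that the AO sample rate must then rise by the same factor (here 3x) to keep the overall waveform period unchanged.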

 

Between those 2 methods, which one is the most relevant?  

Message 14 of 20

Inspired by a vi made by someone else (sorry, I do not remember by whom), this is typically what I would like to achieve. 

Except that the recorded data does not follow the waveform of the triangle, because it starts recording at a random amplitude. But one can see that there are many data points for each amplitude. 

I cannot quite master the definition of the READ period. The read data period is 4 msec, but my triangle waveform is intended to cover a period of 1 sec. 

So I would like my data also to have a total period of 1sec, with many data on each step (defined by user). This is my dream with this DAQ !  

So I believed that I should set the clock to a high rate (500,000) and a number of samples per channel of 2000. That is why I got a 4 msec period rather than 1 sec. 

I thought that by defining a high rate, the DAQ would give a small dt for recording, hence leading to a large amount of data for each output amplitude. But this is not the case: the clock is modifying the period of the output waveform.

Message 15 of 20

Well, so, honestly, the fact that your sync and nosync vi's are nearly identical kinda tells me you may not have taken enough time to pause and *think* about the code.  The two approaches should show more substantial differences.

 

The "sync" version actually looks closer to mostly working, so I'll focus on that one.

 

1. Why do you set your AO sample rate to 500 kHz when your comments suggest you want 100 samples per 60 sec?  Set your AO sample rate to 1.666667 Hz instead.  (Frankly, I'd probably define the waveform with either 60 or 120 samples so I could set sampling at 1 or 2 Hz.  Reasons to follow.)

 

2. The ratio of your AI and AO sample rates should reflect the # AI samples you want for each AO value.  Code comments suggest you want 1000, so set the AI rate to be 1000x the AO rate.
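Putting items 1 and 2 together numerically (100 AO samples over 60 sec, and 1000 AI samples per AO value, as in the code comments):

```python
ao_samples = 100                 # points defining the AO waveform
waveform_period_s = 60.0         # one waveform period
ai_samples_per_ao_value = 1000   # AI samples wanted per AO level

ao_rate_hz = ao_samples / waveform_period_s        # ~1.666667 Hz (item 1)
ai_rate_hz = ai_samples_per_ao_value * ao_rate_hz  # 1000x the AO rate (item 2)
print(ao_rate_hz, ai_rate_hz)
```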

 

3. Your attempt to sync AO and AI is a near-miss.  You specified the AO Start Trigger signal as a sample clock for AI.  That'll only pulse 1 time.  What you *should* do is specify the AO Start Trigger signal as a *Start Trigger* for AI.  Leave the 'source' input to DAQmx Timing for AI unwired, and let the board do the default thing of deriving its own sample clock at the rate you requested. 

    However, you will *also* need to start the AI task before the AO task.  Your sequence structure guarantees just the opposite.
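For readers following along outside LabVIEW, item 3 plus the start-order fix might look like this in the nidaqmx Python API. This is an untested hardware-config sketch; the device name `Dev1`, the rates, and the sample counts are illustrative assumptions taken from the numbers discussed above:

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as ao, nidaqmx.Task() as ai:
    ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")
    ao.timing.cfg_samp_clk_timing(
        rate=1.666667, sample_mode=AcquisitionType.FINITE, samps_per_chan=100)
    ao.write([0.0] * 100)   # placeholder: the 100-point triangle waveform goes here

    ai.ai_channels.add_ai_voltage_chan("Dev1/ai0:2")
    # 'source' is left at its default: the board derives its own AI sample
    # clock at the requested rate (1000x the AO rate here).
    ai.timing.cfg_samp_clk_timing(
        rate=1666.667, sample_mode=AcquisitionType.FINITE,
        samps_per_chan=100 * 1000)
    # AO's Start Trigger signal is used as a *Start Trigger* for AI,
    # not as its sample clock (it only pulses once).
    ai.triggers.start_trigger.cfg_dig_edge_start_trig("/Dev1/ao/StartTrigger")

    ai.start()   # AI must be started first: it arms and waits for the trigger
    ao.start()   # starting AO emits ao/StartTrigger, launching both in sync

    data = ai.read(number_of_samples_per_channel=100 * 1000, timeout=70.0)
```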

 

4. Your AI read loop is requesting 1000 samples per iteration.  That's probably a good choice, once you address item #2 above.  Each chunk of data should be all the samples at 1 specific AO level.

    The producer / consumer structure with a queue was also a good idea.

 

Once you address those things, you should get some reasonable initial results.

 

 

-Kevin P

Message 16 of 20

Dear Kevin, many thanks again, especially #2 and #3 that made everything very clear in my mind. 

I will make a short comment about this sentence "you may not have taken enough time to pause and *think* about the code" :

Answer: I agree ...  I have been working on this DAQ every day for 2 weeks now, and if I decided to post on the forum, that's because I want to gain some understanding or method, because something is not clear in my mind despite having read a lot of docs. In order to go straight to clarifying what I don't understand, I made several SMALL vi's and posted them on the forum. Maybe too fast... Those codes are just the necessary blocks for understanding my problem (discretisation of a complex problem). It is a working base. End. 

 

My new code (LV 2016) is attached here, as the result of having followed (I guess ...) all your instructions, and I would like once again to thank you very much. Without your help, it would have taken me many more weeks! 

I hope this new code will be helpful for the many people who want (like me) to concentrate on Physics rather than LabVIEW, but with a minimum of effort to make a program that is not perfect but acceptable, doing almost what it should do! 

Thanks to all your explanations, I was able to make it in reasonable time scale ! 

 

The vi is built for a DAQ USB 6251 16bits with 2 AO and 8 AI in BNC : 

1- generating a waveform on Ao0 (sinus, triangle, ...)

2- measure 3 inputs Ai0, Ai1, Ai2, written as Ai0:2 (because all are voltages here). 

3- the physical output Ao0 is physically connected to Ai0 using BNC cables; the 2 other inputs Ai1 and Ai2 are physically connected to a circuit to measure voltages. It can be adapted to measure current or whatever by selecting the appropriate AI block, then modifying/adding virtual channel(s) accordingly.

4- the output generates #s total samples over 1 period (in seconds). For example, you want to supply a signal for 60 sec with 100 samples (meaning 100 points defining your waveform). This gives a sampling rate Fs = #s/period.

5- the inputs are going to measure #dsi samples for each amplitude of Ao0, allowing statistics (mean + std) to be made later. So I am using a control that specifies how many measurements are needed for each amplitude value of your output waveform. 

6- a while loop allows repeating this measurement (here the value is set to 2, but you can make a control if you need). A for loop might also work, but I personally prefer a while loop because I can control the STOP process, and it is more compatible with more complex programs based on producer-consumer with event cases. With a for loop, the program stays there until the loop ends; a while loop is more flexible to insert as an event case. 

Et voila ! 

Message 17 of 20

It's *closer*, but still needs some tweaks.

 

1.  You need to configure the AI task with a Start Trigger and specify the signal "/Dev1/ao/StartTrigger".  This syncs the *beginning* of the two tasks that run at different rates.

 

2. Your calc for # AI samples looks wrong and the calc for AI sample rate looks overly complicated (and also wrong -- the implied units are samp^2 / sec^2).

 

 

-Kevin P

Message 18 of 20

Had a few spare minutes to try some tweaks myself.  Try it and see, I couldn't test it on any hardware.

 

I generally stuck with doing fairly minimal necessary changes.  What remains isn't perfect but is hopefully a decent working demo.

 

 

-Kevin P

 

Message 19 of 20

Dear Kevin, once again thank you so much for helping me so much with this project. 

Unfortunately I cannot test it until Jan 7th 2020, but I take the opportunity to wish you all the very best for this new year. 

 

1.  You need to configure the AI task with a Start Trigger and specify the signal "/Dev1/ao/StartTrigger".  This syncs the *beginning* of the two tasks that run at different rates.

--> yes, you are right. 

 

2. Your calc for # AI samples looks wrong and the calc for AI sample rate looks overly complicated (and also wrong -- the implied units are samp^2 / sec^2).

 

--> I am not sure it is wrong. Here, I intended to modify the definition of AI samples a little bit, so my AI sampling is different from the usual #s. My AI sampling is defined as the number of samples needed to make statistics for each AO sample value*. I believe that what I did is oriented more toward my application than toward the general case. 

 

* For example, the AO is defined with 100 points. For each point, I need to measure AI 1000 times to get good statistics. This is a way to make the AO output waveform a stepwise function from the AI point of view. This is what I need. The number of samples on each step is user-defined via AI sampling. And the AI rate (Fsi) is calculated to keep the same initial period (1/Fso) as the AO. 
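That bookkeeping can be checked numerically (100 AO points and 1000 AI samples per point, as in the example; the 60 sec period is carried over from earlier posts):

```python
n_ao = 100          # AO waveform points
dsi = 1000          # AI samples per AO point (for statistics)
fso = n_ao / 60.0   # AO rate for a 60 sec waveform period

fsi = dsi * fso     # AI rate chosen to keep the same total period as AO
# Each AO sample lasts 1/fso; the dsi AI samples fill exactly that step:
assert abs(dsi / fsi - 1 / fso) < 1e-12
# And the full AI record spans the same total period as the AO waveform:
assert abs((n_ao * dsi) / fsi - n_ao / fso) < 1e-9
print(fsi)
```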

 

 

 

 

Message 20 of 20