
Create a frequency sweep using the non-regeneration mode of DAQmx Write and read it simultaneously

Hey All, 

I have written code that transmits a signal through an output channel of the board into a water pipe at one end, then receives the signal from the other end of the pipe on an input channel of the same board. The write and read operations in my code are synchronized. Currently I am sending a 1 ms sine burst followed by 1 second of zero voltage, so my DAQmx Write task sends this waveform (sine burst + DC) repeatedly. What I want instead is a frequency sweep on the output channel: the first output signal should be a 1 ms sine burst at 5 kHz followed by 1 s of DC, immediately followed by a 1 ms sine burst at 6 kHz plus 1 s of DC, and so on up to a 1 ms sine burst at 50 kHz plus 1 s of DC. In short, I want to sweep the burst frequency in 1 kHz steps (with a 1 s gap between bursts) while simultaneously performing the read operation. I have attached the VI, along with an image showing the desired signal at the output channel.

How can I continuously update the DAQmx Write task while the VI is running, so that it sends the desired sequence rather than repeating the same signal over and over?
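LabVIEW code is graphical, so as a plain-Python sketch only: the sweep described above can be built one step at a time, so each block can be handed to a DAQmx Write call as it is needed. The function names and the 500 kS/s default update rate are assumptions for illustration, not values from the attached VI.

```python
import math

def sweep_step(freq_hz, fs=500_000, burst_s=0.001, gap_s=1.0, amp=1.0):
    """Build one sweep step: a 1 ms sine burst followed by a zero-volt gap."""
    burst = [amp * math.sin(2 * math.pi * freq_hz * n / fs)
             for n in range(int(fs * burst_s))]
    gap = [0.0] * int(fs * gap_s)
    return burst + gap

def sweep_blocks(start_hz=5_000, stop_hz=50_000, step_hz=1_000, **step_kw):
    """Yield (frequency, samples) pairs for the whole sweep, lazily,
    so only one block needs to exist in memory at a time."""
    for f in range(start_hz, stop_hz + step_hz, step_hz):
        yield f, sweep_step(f, **step_kw)
```

Generating the blocks lazily (one per loop iteration) is exactly the shape a non-regenerating DAQmx Write loop wants: fresh data every iteration instead of one waveform repeated.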

Message 1 of 8

Hello Amartansh13,

 

First, I don't actually see that the input and output clocks are synchronized, or that a start trigger is in use. What you need is the non-regeneration mode for the AO, so that you can update the values during the run. The relevant shipping example in LabVIEW is Voltage (non-regeneration) - Continuous Output.vi. I have attached an example in which you can manually change the AO settings during the run; you can then automate changing the parameters. For logging, a VI from the DAQmx palette is used.
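Since LabVIEW is graphical, here is a hedged plain-Python toy (not NI's API) of what "non-regeneration" means: the device consumes each written sample exactly once, so the host must keep calling Write or the generation underflows.

```python
from collections import deque

class NonRegenBuffer:
    """Toy model of a DAQmx output buffer with regeneration disabled:
    samples are consumed once and never replayed.  If the host stops
    writing, the device runs out of data (a buffer-underflow error on
    real hardware)."""
    def __init__(self):
        self._q = deque()

    def write(self, samples):
        """Host side: one DAQmx Write call appends fresh samples."""
        self._q.extend(samples)

    def next_sample(self):
        """Device side: one sample clock tick consumes one sample."""
        if not self._q:
            raise RuntimeError("buffer underflow: no fresh data")
        return self._q.popleft()
```

With regeneration enabled the device would instead loop over the old buffer contents forever, which is why the original VI kept repeating the same burst.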

Best regards,
Alexander
Message 2 of 8

Dear All,

 

I am using the code from your post to produce a swept sine wave, but it does not work for me. I also modified it, since I don't need the trigger addition, but I got the same error (attached). The waveform settings are in the screenshot along with the error details. I am using an NI USB-6212 DAQ, Windows 10, and LabVIEW 2022 Q3 64-bit. I have read several tutorials on DAQ timing and tried different approaches (e.g., modifying the sample rate, reducing/increasing the number of samples), but my LabVIEW knowledge is limited. Any comment or suggestion will be greatly appreciated.

 

Thanks,

 

 

Message 3 of 8

Can you use File->Save for Previous Version back to, say, LV 2018?  I can't open the actual code and the screencap isn't complete.

 

Of note: the front panel screencap seems to declare that the AO task will get its sample timing from the AI sample clock.  But the block diagram doesn't show the corresponding "sample clock source" terminal anywhere.  Meanwhile, the block diagram also shows that the AI task will get its sample timing from the AO sample clock.

 

If actually implemented this way, you get a circular dependency where both tasks expect the other to supply a sample clock, thus both get stuck and you end up with the timeout error you're seeing.  I recommend you change the "Sample Clock Source" for the AO task to be "Onboard Clock".

 

 

-Kevin P

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 4 of 8

Hi Kevin,

 

Thanks for your quick reply. As far as I know, I haven't set any specific clock source for the AO task; it just shares its rate with the AI task. The AI task, however, does get its sample timing from the AO sample clock "Dev1/a0/SampleClock" on my NI USB-6212. I read that for fast sampling this was a better option than software timing. My objective is to generate a swept sine function from 1 Hz to 4000 Hz (analog output), send it to an RC circuit, and then acquire the response to compute a Bode plot. I have attached a new screenshot along with the VI saved for the 2012 version; that was the newest option it showed me. Please let me know if there is anything else I can do.

 

Thanks,

Message 5 of 8

Still can't open the code -- it's saved as LV 2021, not 2012.

 

The -200284 error you posted refers to acquisition, claiming that the attempt to read the specified # samples timed out before all of them became available.   The unwired default timeout is 10 seconds, while your front panels show either 10k or 100k sampling and either 1k or 10k samples to read respectively -- so only 0.1 sec worth, which should easily be small enough to avoid a timeout.

 

When this kind of error pops up on tasks with an external sample clock, it's often because that sample clock signal is not present at the time it's needed.  But because you're using the same DAQ device for both AO and AI, the signal routes internally and should be *able* to be present.

 

Hmm, let me take a closer look... <time passes>

 

Ok, now I see a little something.  It looks like your loop stops on this error during the second iteration of the loop when i=1.  You write & read the same # of samples each iteration and your AO task is set to disallow regeneration.  The shared sample clock syncs up sampling such that the last AO sample written gets generated at the same time the last AI sample is converted.  The AO task is then all out of available data, but its sample clock is running at 10k or 100k.  It's going to give a buffer underflow error unless you can give it more data before the next sample clock.

But you won't be able to iterate until the AI sample gets transferred across USB and is made available to your Read function. And then you also need to wait on the next iteration for the AO data to transfer across USB to be available to your DAQ device. All that would have to happen in less than 100 microseconds -- and it can't.

So your AO task errors out, causing its sample clock to stop, and *that* leads to the AI timeout error. You gave precedence to the AI error when you merged the errors; that's why you saw the AI error, which is an *effect*, rather than the AO error, which is the *cause*.
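Kevin's timing argument can be checked with a little arithmetic. The 2 ms USB round-trip figure below is an illustrative assumption (real latency varies), but even an optimistic value dwarfs one sample period:

```python
def refill_budget(rate_hz, usb_round_trip_s=0.002):
    """Time available to refill the AO buffer after its last queued
    sample is generated, compared against an assumed USB read+write
    round trip.  Returns (sample period, True if an underflow is
    unavoidable)."""
    sample_period_s = 1.0 / rate_hz          # one AO sample clock tick
    return sample_period_s, usb_round_trip_s > sample_period_s

# At 10 kS/s there are only 100 microseconds between sample clock
# edges; a millisecond-scale USB round trip cannot fit, so the
# non-regenerating AO task underflows and its sample clock stops.
period, underflows = refill_budget(10_000)
```

At the 100 kS/s setting the budget shrinks to 10 microseconds, making the underflow even more certain.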

 

So, a solution?  You could just put a case structure around the AI Read and make a special case for i=0 where you *don't* call it.  You won't lose data, it'll be there in the buffer for you.  You just may need to do one additional Read after terminating the loop to get the final N samples.
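The suggested fix, sketched in plain Python (write_fn and read_fn are stand-ins for DAQmx Write and DAQmx Read; this models only the ordering, not the driver): by skipping the Read on the first iteration, the write side always runs one block ahead, so the output buffer never empties between iterations.

```python
def pipelined_loop(blocks, write_fn, read_fn):
    """Write runs one iteration ahead of read, so the AO buffer always
    holds a spare block and never underflows.  A final read after the
    loop drains the last block still sitting in the buffer."""
    acquired = []
    for i, blk in enumerate(blocks):
        write_fn(blk)
        if i > 0:                   # skip the Read on iteration i = 0
            acquired.append(read_fn())
    acquired.append(read_fn())      # one extra Read gets the final N samples
    return acquired
```

No data is lost: block i written on iteration i is read back on iteration i+1, and the last block is picked up by the extra Read after the loop.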

 

 

-Kevin P

 

 

Message 6 of 8

Hi Kevin,

 

Thanks for your very detailed explanation; I saw exactly what you described when I ran the code with the Highlight Execution button. However, I am not sure how to create this new condition. You are saying to skip the AI Read on the first iteration (i=0), so there will always be at least one extra iteration's worth of data saved in the buffer? What I did was put a case structure around AI Read and add a First Call? node, so the first call skips the Read structure, since that node returns True. But in that case I don't know how to retrieve the data, or what to wire it to: I have a shift register for task in/out and another for error in/out, but for the data I just want to plot it, and I won't have data for the first iteration. For the False case (second iteration and later), it's the same as before, right? Regarding the file version, I just selected Save for Previous Version->LabVIEW Version->12.0; is this right, or am I missing something?

Here I have attached the resulting file. Thanks for your help,

Message 7 of 8

I made a few mods and did a little tidying to illustrate.  This should get you to the point of running without generating DAQmx errors. 

 

 

-Kevin P

Message 8 of 8