
LabVIEW

real-time - how to control speed of digital waveform

Deploying the attached VI to my RT desktop target, I generate a digital waveform on 8 channels of port0 and acquire data on AI [PXIe-6358].
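
For reference, here is roughly what the VI does, written out as text with the nidaqmx Python API instead of LabVIEW; the device name 'Dev1' and the AI channel 'ai0' are placeholders for my actual hardware names:

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType, LineGrouping

RATE = 10_000                      # DO update rate and AI sample rate, Hz
CODES = list(range(256))           # all 2^8 digital codes driven on port0

# Hardware-timed digital generation on 8 lines of port0.
do_task = nidaqmx.Task()
do_task.do_channels.add_do_chan(
    "Dev1/port0/line0:7", line_grouping=LineGrouping.CHAN_FOR_ALL_LINES)
do_task.timing.cfg_samp_clk_timing(
    rate=RATE, sample_mode=AcquisitionType.CONTINUOUS)
do_task.write(CODES, auto_start=False)

# Analog input at the same nominal rate.
ai_task = nidaqmx.Task()
ai_task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
ai_task.timing.cfg_samp_clk_timing(
    rate=RATE, sample_mode=AcquisitionType.CONTINUOUS)
```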

 

The digital output frequency and the analog input sample frequency are both 10000 Hz, so I expect that while one 8-bit digital code is active [valid] for 100 us, exactly one analog sample is acquired. That means one cycle of the while loop should finish in 100 us x 2^8 = 100 us x 256 = 25600 us = 25.6 ms.
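
Spelling out that arithmetic:

```python
sample_period_s = 1 / 10_000        # 100 us per DO update / AI sample
codes_per_cycle = 2 ** 8            # 256 distinct 8-bit codes
expected_loop_s = codes_per_cycle * sample_period_s
print(expected_loop_s)              # 0.0256 s, i.e. 25.6 ms per while-loop cycle
```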

 

However, the while loop actually finishes in 6 ms regardless of the set frequency: it finishes in 6 ms whether I set 5/10/15 kHz or put a 100 us delay into the while loop. I also tried setting the source of the analog input sample clock to 'Dev/do/SampleClock', but nothing has worked so far.
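
What I was trying to achieve with that clock setting is, I believe, something like the following (continuing the sketch above; the fully qualified terminal name '/Dev1/do/SampleClock' with the leading slash is my assumption about the correct spelling):

```python
# Slave the AI sample clock to the DO sample clock so that exactly one
# AI sample is taken for each digital code that is driven out.
ai_task.timing.cfg_samp_clk_timing(
    rate=RATE,
    source="/Dev1/do/SampleClock",
    sample_mode=AcquisitionType.CONTINUOUS,
)
```

My understanding is that the task borrowing the clock has to be started before the task that produces it; otherwise it misses the first clock edges.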

 

How can I force the digital generation to be correlated with the analog acquisition? I need to acquire one data point per channel while the respective digital code is active.
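
In other words, something like this (again continuing the sketch above), where each pass through the 256 codes should yield 256 AI samples, one per code:

```python
ai_task.start()     # armed first, waiting on the shared sample clock
do_task.start()     # starts do/SampleClock, which then drives both tasks

# One full pass through the 256 codes -> 256 AI samples per channel,
# each taken while the corresponding digital code is valid.
data = ai_task.read(number_of_samples_per_channel=256)

do_task.stop()
ai_task.stop()
```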

 

Thanks,

Krivan

Message 1 of 3

I forgot to mention in my previous post that when I run the digital generation, I can measure the correct output frequency with an oscilloscope; however, I am unsure whether my analog acquisition is OK, for the reasons mentioned above.

Message 2 of 3

Hello krivan,

 

Thank you for your post on the forum.

 

Do you want the digital generation to start at the same time as the analog acquisition?
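
If simultaneous starting is all you need, one common approach is to keep the two sample clocks independent but share a start trigger. A rough sketch, re-using the ai_task and do_task names from your post above (the terminal name assumes a device called 'Dev1'):

```python
# Arm the AI task on the DO task's exported start trigger, then start the
# AI task first so it is already waiting when the DO task fires the trigger.
ai_task.triggers.start_trigger.cfg_dig_edge_start_trig("/Dev1/do/StartTrigger")
ai_task.start()
do_task.start()
```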

Shalini B
Applications Engineer
National Instruments UK & Ireland
Message 3 of 3