LabVIEW


MyDAQ logic analyser

Solved!

At my university, students use an NI myDAQ to do measurements.

We use the NI ELVISmx instruments for almost everything except XY-plots.

 

I have a project that could really use a logic analyser, and the myDAQ has 8 digital inputs, but the Digital Reader instrument is pretty useless for this application. So I opened LabVIEW and created the attached VI.

 

However, the problem is the speed and timing. This for loop runs as fast as possible, which seems to be 50-100 Hz. Of course I did not expect megahertz performance from this device, but since the oscilloscope goes up to 20 kHz, I was wondering if there is a way to make it faster and more controlled. (I found there are timed loops, but I have no clue how they work.) Maybe I need to use a more low-level interface to the inputs?

 

logic_analyser.PNG

0 Kudos
Message 1 of 6
(4,221 Views)
Solution
Accepted by topic author Fliebel

Hi Fliebel,

 

the first step should be to read the specs for your DAQ device!

 

NI clearly states: DIO on the myDAQ is software-timed.

This limits you to ~100 Hz, you will not get any faster than this…
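
Software-timed means there is no hardware sample clock behind those DIO lines: every single sample is a separate on-demand driver call, so your loop rate is set by OS and driver overhead, not by the device. Just as a rough sketch of what happens under the hood (DAQmx C API, device name "myDAQ1" assumed here; a DAQmx Read VI inside a LabVIEW loop behaves the same way):

#include <NIDAQmx.h>
#include <stdio.h>

int main(void)
{
    TaskHandle task  = 0;
    uInt8      lines = 0;   /* bit mask of the 8 DIO lines */
    int32      nRead = 0;

    /* on-demand DI task - the myDAQ DIO has no hardware sample clock */
    DAQmxCreateTask("", &task);
    DAQmxCreateDIChan(task, "myDAQ1/port0/line0:7", "", DAQmx_Val_ChanForAllLines);
    DAQmxStartTask(task);

    for (int i = 0; i < 1000; i++) {
        /* every sample is one separate driver call -> loop rate around 100 Hz */
        DAQmxReadDigitalU8(task, 1, 10.0, DAQmx_Val_GroupByChannel, &lines, 1, &nRead, NULL);
        printf("%02X\n", lines);
    }

    DAQmxClearTask(task);
    return 0;
}

For anything faster you need a device with hardware-timed DIO, which the myDAQ simply does not offer.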

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
0 Kudos
Message 2 of 6
(4,205 Views)

Okay. I figured as much. When I picked the DAQ Assistant and tried to get multiple samples, it told me I can only have "on demand" single samples.

 

After some more experimentation I got the timed loop to work. I added indicators for "iteration duration" and "finished late", and every iteration takes about 5 ms.

 

Now my problem is that those signals are outside of the loop, but seem to refer to one iteration. How can I aggregate those results? I would like to show the longest duration and an indicator if ANY iteration took too long.

 

Never mind. I added a "max/min" block and an "or" block with feedback. Seems to work.

0 Kudos
Message 3 of 6
(4,195 Views)

Hi Fliebel,

 

Now my problem is that those signals are outside of the loop

Why are they "outside the loop"? "Iteration duration" and "finished late" should be inside the TWL!?

 

but seem to refer to one iteration.

They refer to the last iteration…

 

How can I aggregate those results? I would like to show the longest duration and an indicator if ANY iteration took too long.

Then you need to program this behaviour! (You know LabVIEW is a programming language!? :D)

Example:

"any iteration too late" := OR all "single iteration too late"

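Or written out as a small text sketch of the pattern (in LabVIEW the two aggregate values would live in shift registers or Feedback Nodes; the variable names here are just placeholders):

int    n            = 100;   /* number of loop iterations */
int    any_late     = 0;     /* aggregate: did ANY iteration finish late? */
double max_duration = 0.0;   /* aggregate: longest iteration duration */

for (int i = 0; i < n; i++) {
    int    finished_late = 0;   /* per-iteration value from the TWL node (placeholder) */
    double duration      = 0.0; /* per-iteration value from the TWL node (placeholder) */
    /* ... the measurement work of this iteration ... */
    any_late = any_late || finished_late;                  /* OR with the previous result */
    if (duration > max_duration) max_duration = duration;  /* running maximum */
}
/* any_late and max_duration now summarize ALL iterations, not just the last one */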
 

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 4 of 6
(4,190 Views)

Well, that's my confusion. They are positioned outside the loop, but they refer to a single iteration. It seems like the equivalent of doing

int i;
for (i = 0; i < 100; i++) {
    /* do stuff */
}
int max_i = i;

But maybe due to the magic of dataflow programming, there are actually a series of values "flowing" through those pipes.

logic_analyser.PNG

0 Kudos
Message 5 of 6
(4,179 Views)

Hi Fliebel,

 

do you notice the TWL terminal inside the loop, the one with "Error" mentioned inside?

Try to resize it, then try to select items other than just "Error"…

 

I think this is explained in the LabVIEW help for the TWL structure!

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
0 Kudos
Message 6 of 6
(4,167 Views)