
Counting AC cycles, then flipping a DO bit just_in_time!

I've read through the recent millisecond-timer post and some of the links therein, and I've had this conversation many times in the past; I understand you cannot do RT in Windows...

 

Still, I wonder where the jitter I'm seeing (in where my DO state change lands, relative to its mark) is really coming from. I believe it's worse on a Windows 10 target than on the Windows 7 system I'm developing on, if that's any hint.

 

I have the counter input of a USB-6009 connected to an isolation transformer, with a full-wave bridge in between driving an open-collector transistor; this lets me count edges at 100 Hz. In my program, I wait for the count to reach 4 in a loop with a 1 ms loop delay. Then I wait some # of milliseconds, then I change the DO state.
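In text form, what my loop does is roughly this (a minimal Python sketch with the hardware stubbed out; `read_counter` and `write_do` are hypothetical stand-ins for the counter read and digital write, not actual DAQmx calls):

```python
import time

def wait_for_count(read_counter, target=4, poll_s=0.001):
    """Poll the counter until the count reaches the target."""
    while read_counter() < target:
        time.sleep(poll_s)              # the 1 ms loop delay

def fire_do(read_counter, write_do, delay_ms):
    """Wait for 4 counted AC edges, delay some # of ms, then flip the DO."""
    wait_for_count(read_counter)
    time.sleep(delay_ms / 1000.0)       # software wait -- one jitter source
    write_do(True)                      # software-timed DO write -- another

# Stand-in hardware for illustration: each call returns one more count.
counts = iter(range(1, 100))
fired = []
fire_do(lambda: next(counts), fired.append, delay_ms=5)   # fired == [True]
```

Both `time.sleep` calls and the final write are at the mercy of the OS scheduler, which is where this thread's question comes in.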

 

I can't tell if the jitter I'm seeing (in the DO state change, relative to the AC signal) is due to the millisecond wait function, the DO update, or both. I'm guessing the DO update happens PDQ, as I happen to be acquiring two AI streams at 5 kHz simultaneously, and I don't see a chunk missing in the AI signals when the DO changes state.

 

Is there any hope of making some realizable kind of LV delay of a few ms more consistent, and consistent across different Windows versions, or am I simply SOL? As Madonna and Pink Floyd would say, "It would be so nice."

 

I could use a D flip-flop to sync the jittery DO to the AC signal - unless they've stopped making those these days because there's no money in it. This is supposed to be a commercial product, so I can't use a BB or a Pi programmed with LV to get around Windows. Any suggestions would be much appreciated, thanks!

Message 1 of 5

Hello,

I would suspect the jitter is coming from your computer or code doing other things in the background. Depending on your LabVIEW code structure, it's possible that other portions of your code are causing the jitter by executing before/after the Wait (ms) function.

 

From your explanation, it seems reasonable that the jitter could be coming from both the Wait and the DO update. To get delays more consistent than software timing allows, you would need to look into a real-time system or an FPGA.

 

As for the difference in jitter you're seeing between Windows 7 and Windows 10: that also seems plausible, since each OS runs different background programs and may schedule the LV code differently.

 

If you'd like to post the code in question, maybe there will be more specific suggestions to minimize the jitter. I don't expect it to disappear entirely, though.

 

Spencer R | NI

Message 2 of 5

1. Jitter source 1 -  I wait for the count to become "4" in a loop, with a 1 ms loop delay.  

Even if the loop itself ran without jitter (and it won't), you'd still have variable jitter of up to 1 msec due to timing quantization.

 

2. Jitter source 2 - Then I wait some # of milliseconds

Not sure if there's timing quantization here, but you will be subject to jitter from the OS servicing your desired wait time.

 

3. Jitter source 3 - then I change the DO state

Immediate IO across USB is subject to latency and jitter.  Nature of the beast.
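The first two sources are easy to see in software. A rough Python illustration (not LabVIEW-specific; the actual numbers depend on OS and load, so treat it as a demonstration of the mechanisms, not a benchmark):

```python
import random
import time

# Source 1: timing quantization. An edge arriving at a random phase of a
# 1 ms polling loop isn't *seen* until the next poll, adding 0..1 ms.
poll_period_ms = 1.0
latencies = [poll_period_ms - (t % poll_period_ms)
             for t in (random.uniform(0, 100) for _ in range(10_000))]
print(f"quantization jitter: up to {max(latencies):.3f} ms")

# Source 2: the OS servicing the wait. A 1 ms sleep never returns early,
# but it can return late, depending on scheduler tick and load.
overshoots_ms = []
for _ in range(50):
    t0 = time.perf_counter()
    time.sleep(0.001)
    overshoots_ms.append((time.perf_counter() - t0) * 1000 - 1.0)
print(f"sleep(1 ms) overshoot: min {min(overshoots_ms):.3f} ms, "
      f"max {max(overshoots_ms):.3f} ms")
```

Both effects stack on top of whatever latency the USB transfer itself adds (source 3).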

 

I would *want* to solve this in hardware with a more capable DAQ board.  An X-series board, even USB, would have counters and DO that are capable of solving this problem in hardware with orders of magnitude less jitter.

 

If stuck with the USB-6009, the way I'd *try* to get more consistent timing is via a dummy AI task. If you can program 10 kHz sampling, for example, there are ways to request the *next* 10 samples. That will make the Read function wait for them, so no explicit 1 msec wait will be necessary; not sure how much you'll gain in practice. Jitter *might* prove to be lower than with the msec timer.
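The reason a hardware-clocked Read can beat the msec timer is that the wait is tied to the device's sample clock, i.e., to an absolute schedule, so late wake-ups don't accumulate. A Python sketch with the device clock simulated by absolute deadlines (no DAQmx calls here, just the pacing idea):

```python
import time

class FakeClockedReader:
    """Simulates a hardware-clocked acquisition: sample k is 'available' at
    start + k/rate, an absolute schedule, so waiting for the next N samples
    cannot drift even when an individual wait returns late."""
    def __init__(self, rate_hz):
        self.rate = rate_hz
        self.start = time.perf_counter()
        self.total = 0

    def read(self, n):
        """Block until the next n samples are 'available', like a DAQ Read."""
        self.total += n
        deadline = self.start + self.total / self.rate   # absolute deadline
        while time.perf_counter() < deadline:
            time.sleep(0)          # crude wait; a real driver blocks properly
        return self.total

rdr = FakeClockedReader(rate_hz=10_000)    # 10 kHz sampling
for _ in range(100):
    rdr.read(10)                           # each read paces ~1 msec
elapsed_ms = (time.perf_counter() - rdr.start) * 1000
print(f"100 paced reads took {elapsed_ms:.1f} ms (ideal 100 ms)")
```

One hundred 10-sample reads land at ~100 ms total no matter how late any individual wake-up was, whereas one hundred chained 1 msec software waits accumulate every overshoot.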

 

 

-Kevin P

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 3 of 5

I "got away" with using a Timed Loop under Windows (XP, maybe?) where I had hardware I could use as the timing source.

 

Ran at 1000 Hz just fine.

 

That may get you close but I prefer Kevin's suggestion to do the work in hardware.

 

Ben

Retired Senior Automation Systems Architect with Data Science Automation | LabVIEW Champion | Knight of NI and Prepper | LinkedIn Profile | YouTube Channel
Message 4 of 5

Thanks, guys, for helping me out with this. I've given up on using the PC, even though it was a neat example of a "software-defined" function; it's back to a chips-on-PCB layout to realize what I want to do.

Message 5 of 5