LabVIEW


Help with phase difference of sine waves

Hi. I am having a little trouble with measuring the phase difference of two sine waves.
The two sine wave signals are: one simulated in LabVIEW (and sent to an ultrasonic transmitter, which transmits the signal to an ultrasonic receiver), and the other acquired from that ultrasonic receiver.

My program appears to be able to measure the two phases and determine the difference; the problem is that the phase of the acquired wave is constantly changing (and quite radically too).

 

Since it is essentially the same wave being sent and received (the received wave is only delayed by the distance between the sensors), shouldn't there be a constant phase difference?

I have attached my program (LabVIEW 8.2, although I have 8.5 available to me as well). Could someone take a quick look at it to see if everything is in order?

 

If anyone can offer any ideas as to why the phase measurement of the second wave is changing so erratically, it would be greatly appreciated.

 

Regards

Message 1 of 9
First, I would split the analog output and analog input into two different loops. You will also need to get rid of the DAQ Assistants and move to more efficient programming. Fixing these two things will help a lot.
Tim
GHSP
Message 2 of 9

Hi. I initially had the inputs and outputs in separate loops; however, when I attempt to bring the signals (the phase values from the Tone Measurements) outside the loops to a common area, where I can subtract them to find the difference, I get no readings. Is there a specific way to send the phase measurement, or any signal, outside a loop?

I am working on this as part of my final-year project in college; however, there is nobody here who is experienced in LabVIEW, and the only instruction I received for sending and acquiring signals was to use DAQ Assistants. Can you point me to some literature on alternatives (the more efficient programming you mentioned)?

 

 

Thanks

Message 3 of 9
Here is a good place to start. Also look in the Example Finder under Hardware Input and Output >> DAQmx.
Now Using LabVIEW 2019SP1 and TestStand 2019
Message 4 of 9

You need to synchronize the output and the input of the DAQ; otherwise you will have a phase difference between the two tasks, and hence not the correct phase difference between transmitter and receiver.

A simple way to do it would be to read the output back via a second channel. Set up your AI task for both channels and use these two signals to extract the phase.

 

Better, but more complicated, would be to use the DAQmx functions instead of the Express VIs and synchronize output and input via a shared sample clock and a common trigger.
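
In case a textual sketch of that idea helps (this is the Python nidaqmx API rather than LabVIEW, and the device name, channels, rate and tone frequency are just placeholders I assumed, not values from your setup):

```python
# Hedged sketch: AI synchronized to AO via shared sample clock + AO start trigger,
# using the nidaqmx Python API. In LabVIEW the same configuration is done with the
# DAQmx Timing and DAQmx Trigger VIs. "Dev1", rates and channel names are assumptions.
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType

rate = 500_000          # samples/s (assumed)
n_samples = 10_000      # samples per buffer (assumed)
f_tone = 40_000         # ultrasonic tone frequency in Hz (assumed)

with nidaqmx.Task() as ao, nidaqmx.Task() as ai:
    ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")
    ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")

    # Both tasks run from the AO sample clock, so AI sample k lines up with AO sample k.
    ao.timing.cfg_samp_clk_timing(rate, sample_mode=AcquisitionType.FINITE,
                                  samps_per_chan=n_samples)
    ai.timing.cfg_samp_clk_timing(rate, source="/Dev1/ao/SampleClock",
                                  sample_mode=AcquisitionType.FINITE,
                                  samps_per_chan=n_samples)

    # AI is armed first and starts on the AO start trigger.
    ai.triggers.start_trigger.cfg_dig_edge_start_trig("/Dev1/ao/StartTrigger")

    t = np.arange(n_samples) / rate
    ao.write(np.sin(2 * np.pi * f_tone * t), auto_start=False)

    ai.start()   # waits for the AO start trigger
    ao.start()   # starting AO releases both tasks together
    data = ai.read(number_of_samples_per_channel=n_samples)
```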

 

Also note that there are better ways to measure the phase difference (you introduce some error/noise by allowing any frequency in the two tone evaluations, when it is physically the same frequency in both signals).
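
If it helps to see that idea outside of LabVIEW, here is a rough NumPy sketch (the frequency, sample rate and noise level are made up): instead of two independent tone detections, project both channels onto the one known tone frequency and compare the angles.

```python
# Minimal sketch (NumPy): phase difference of two channels at one known frequency f0.
# Both channels are demodulated against the same reference, so no per-channel
# frequency estimate is involved.
import numpy as np

def phase_difference(tx, rx, f0, fs):
    """Phase of rx relative to tx at the known tone frequency f0, in radians."""
    t = np.arange(len(tx)) / fs
    ref = np.exp(-2j * np.pi * f0 * t)              # complex reference at f0
    dphi = np.angle(np.sum(rx * ref)) - np.angle(np.sum(tx * ref))
    return (dphi + np.pi) % (2 * np.pi) - np.pi     # wrap to [-pi, pi)

# Synthetic check: 40 kHz tone, receiver lagging by 90 degrees, a little noise.
fs, f0 = 500_000, 40_000
t = np.arange(5_000) / fs
tx = np.sin(2 * np.pi * f0 * t)
rx = np.sin(2 * np.pi * f0 * t - np.pi / 2) + 0.05 * np.random.randn(t.size)
print(np.degrees(phase_difference(tx, rx, f0, fs)))  # approximately -90
```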

 

Felix

Message 5 of 9

Hello JacktheLad,

 

Express VIs are simple to configure and use in your VI; however, each time they are called they introduce more overhead than lower-level VIs. When you are trying to accurately measure phase difference, this overhead, along with your single-loop architecture, could introduce errors.

 

At the moment you can try to improve the current VI by implementing two good practices:

  1. Enforce Dataflow
  2. Execution Control Timing

 

I have wired the error cluster (the green wire) between the Analog Output and the Analog Input.  This ensures that the data is read into your VI only after you have written the output.  Furthermore, without execution timing your while loop will try to run as fast as it can; a 100 ms iteration delay on the while loop will help.

 

A good example of synchronized Analog Input and Output with lower-level VIs can be found in the LabVIEW Example Finder:

 

Run LabVIEW

  1. In the main menu, click 'Help'->'Find Examples'
  2. Under the 'Browse' tab, find 'Hardware Input and Output'->'DAQmx'->'Synchronization'->'Multifunction'->'Multifunction Synch AI-AO.vi'
  3. View the example and try to adapt it for your phase calculations.

Regards,

George T.
Senior Applications Engineer
National Instruments UK and Ireland
Message 6 of 9
Thanks to all for the support. George, I will get to take the program to the lab this Wednesday; in the meantime I will look at the examples from the LabVIEW Example Finder and the program you have altered for me.

Regards.
Message 7 of 9

I recommend measuring both signals.  If you do, you eliminate a source of error.  Another source of error will be the delay between the samples of each channel (if you are not using a simultaneous-sampling DAQ card).  This could be significant if the phase difference you are measuring is very small.
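
To put a rough number on that (the tone frequency and converter rate below are assumed values, not yours): a multiplexed card converts the channels one after the other, and that fixed delay dt shows up as a phase error of 2*pi*f*dt.

```python
# Back-of-envelope estimate of the phase error from non-simultaneous sampling.
# f and dt are assumptions for illustration only.
import math
f = 40_000          # tone frequency in Hz (assumed)
dt = 1 / 250_000    # inter-channel conversion delay in s (assumed aggregate ADC rate)
print(math.degrees(2 * math.pi * f * dt))   # ~57.6 degrees -> far from negligible
```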

Message 8 of 9

Two things:

In most cases you only need to trigger the output, and you can assume (and test) that you only need to measure the source phase once.

 

Take a look at SAM (the sine approximation method) to measure phase. Basically it's a 3- (or 4-) parameter fit:

 

y(t) = a*sin(w*t) + b*cos(w*t) + c

 

a, b, c are the fit parameters (phase = atan(b/a)).

w might be an additional parameter to fit if a larger or constant Doppler shift is expected; otherwise it can be fixed at w_source.

 

The samples of y(t) don't have to be equidistant, but they should cover at least half a period. You will find that the better your input SNR, the fewer points you need.
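
In text form (a NumPy sketch, not the attached VI; the frequencies and noise below are made up), the 3-parameter version with known w is just a linear least-squares solve:

```python
# Sketch of the 3-parameter sine fit: with w known, y = a*sin(w*t) + b*cos(w*t) + c
# is linear in (a, b, c), so ordinary least squares gives phase and amplitude directly.
import numpy as np

def sam_phase(t, y, w):
    """Return (phase, amplitude, offset) of y sampled at times t, tone frequency w in rad/s."""
    A = np.column_stack([np.sin(w * t), np.cos(w * t), np.ones_like(t)])
    a, b, c = np.linalg.lstsq(A, y, rcond=None)[0]
    return np.arctan2(b, a), np.hypot(a, b), c

# Synthetic check: the same 40 kHz tone on two channels, the second shifted by 30 degrees.
fs, f0 = 500_000.0, 40_000.0
w = 2 * np.pi * f0
t = np.arange(2_000) / fs                 # the time stamps need not be equidistant
y1 = 1.0 * np.sin(w * t) + 0.05 * np.random.randn(t.size)
y2 = 0.8 * np.sin(w * t + np.radians(30)) + 0.05 * np.random.randn(t.size)
dphi = sam_phase(t, y2, w)[0] - sam_phase(t, y1, w)[0]
print(np.degrees(dphi))                   # approximately 30
```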

 

I attach a quick and dirty proof of concept for a 6-parameter fit (two tones, or one tone with hum) that I made as a starting point.

   

Greetings from Germany
Henrik

LV since v3.1

“ground” is a convenient fantasy

'˙˙˙˙uıɐƃɐ lɐıp puɐ °06 ǝuoɥd ɹnoʎ uɹnʇ ǝsɐǝld 'ʎɹɐuıƃɐɯı sı pǝlɐıp ǝʌɐɥ noʎ ɹǝqɯnu ǝɥʇ'


Message 9 of 9