Need help measuring stepper motor velocity

I am trying to measure the velocity of a stepper motor, but I am getting incorrect values depending on the dt value chosen for the Derivative x(t) PtByPt VI. Ultimately, my end goal is to 1) measure the velocity and 2) use the velocity measurement as feedback so that I can control the stepper motor's velocity to a target velocity.

 

Below is my hardware setup: 

  • A stepper motor that is connected to a myDAQ. The stepper motor only moves when a pulse train is given, so its movement is discrete. If the pulse train has 10 pulses, then the stepper motor moves by 10 steps. The minimum movement I can do is 1 step, which is around 1 um on average.
  • A laser displacement sensor that is connected to the myDAQ. The sensor outputs an analog voltage that translates to the actual position. Example: a voltage of -0.500 V is a position of -0.500 mm.

Attached are my VI and screenshots for calculating the velocity of a stepper motor that is continuously moving.

 

The stepper motor is given "x steps" after every Y ms interval. In the screenshot, I have set the stepper motor to move every 1000 ms, but this may not always be the case. If I measure the velocity every 1000 ms, I get the correct velocity value. But if my velocity loop is changed to 10 ms or 100 ms, then I get an incorrect velocity value. In addition, changing the dt also affects the final velocity value.

 

What exactly is the dt value? I know it's supposed to be the time between measured position points. But how can I determine the correct dt value when the position sensor is continuously running?

 

Attachments: velocity-test-screenshot-block-diagram.png, velocity-test-screenshot-front.png

 

 

Message 1 of 6

Caveat: I don't really know the abilities and limitations of the myDAQ device.  My primary experience is with the regular line of DAQmx boards.

 

Not sure what your overall goal is here because this is an oddly roundabout way to control stepper motor speed.  For a stepper motor, the (average) speed *is* the frequency of the pulse train used to drive it.  No need to read an analog position signal in a software loop, and then perform a noise-amplifying numerical derivative to calculate a much worse approximation of the speed.
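For a quick sanity check, the arithmetic is trivial. Here's a minimal sketch in Python (the ~1 um/step figure comes from the original post; the pulse frequency is a made-up example):

```python
# Average stepper velocity follows directly from the pulse train --
# no position measurement or numerical derivative needed.
step_size_um = 1.0       # ~1 um per step, per the original post
pulse_freq_hz = 500.0    # example pulse rate (hypothetical)

velocity_um_per_s = step_size_um * pulse_freq_hz
print(f"Average velocity: {velocity_um_per_s} um/s")  # 500.0 um/s
```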

 

If your concern is about dynamically compensating for the speed fluctuations superimposed on the average stepping speed, then *that's* a more difficult problem by orders of magnitude and probably not feasible under Windows.  And probably not advisable elsewhere; better to choose a motor designed for smoother motion in the first place.

 

So what's the big picture here?  Why does the speed matter?  What does the laser displacement sensor measure?  Doesn't the motor cause the displacement change?  Why would your indirect measurement of the motor's movement be more useful than your knowledge of the step rate?

 

 

-Kevin P

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 2 of 6

The laser displacement sensor measures the position of an object. The object is being moved using a stepper motor. I understand that the stepper motor's speed is controlled using the frequency of the pulse train given to it. However, my end goal is to measure/validate the velocity at which the object is moving.

Message 3 of 6

Again, I'm not familiar with the abilities and limitations of the myDAQ device.  Assuming it has some of the capabilities of other NI DAQ boards, here are some thoughts on an approach I'd consider:

 

1. If at all possible, use a hardware clock to do your analog laser signal sampling at a known and fairly high rate.  Let's say maybe 1 kHz or higher.

 

2. Read multiple samples at a time from the task buffer.  A decent rule of thumb to get started is 1/10 sec worth -- here that would be 100 or more samples.

 

3. Do a linear regression on this position data.  The resulting best-fit slope becomes a considerably less noisy velocity measurement.  (For this app, you may not need the original samples any more, just the calculated slope.)

    For the linear regression, position data is the "Y" term.  "X" is derived from the known hardware-clocked sample intervals.  If you read the data as a waveform data type, there's a utility function to generate an "X" array of time from a waveform.  (Or maybe there's a function to do the linear regression directly from your waveform?  I'm not sure.  I'm only slowly working my way over to using waveforms more often after many years of mainly working with arrays.)  A rough sketch of the regression step follows below.
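To make the regression idea concrete, here's a minimal sketch in Python/NumPy rather than G (the sample rate, chunk size, and simulated data are all illustrative assumptions; the actual DAQmx read is omitted):

```python
import numpy as np

fs = 1000.0   # hardware-clocked sample rate in Hz (assumed)
n = 100       # ~1/10 s worth of samples per read

# Stand-in for one chunk from the buffered analog-input task:
# position in mm -- a slow ramp plus sensor noise, simulated here.
rng = np.random.default_rng(0)
t = np.arange(n) / fs                                 # "X": times from the sample clock
pos_mm = 0.005 * t + 0.0005 * rng.standard_normal(n)  # "Y": position data

# Best-fit slope of position vs. time = velocity estimate in mm/s
slope, intercept = np.polyfit(t, pos_mm, 1)
print(f"Estimated velocity: {slope:.4f} mm/s")
```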

 

All this being said, I'm not sure I see why you'd want to adjust the stepper speed based on the laser measurement.  If there's a direct linear relationship between the two, then it should be knowable ahead of time.  If there's a more complicated kinematic set of joints and linkages between them, then you'd need excellent knowledge of that kinematics built into your code to make sense of what to do with your data.  Since I see no sign of such kinematic interpretation, I'm guessing that you're mostly confirming that the stepper drive system is actually functioning and not stalled out.

 

 

-Kevin P

Message 4 of 6

@Kevin

 

Thank you for your insights. However, I am still a little confused. Why do I need to do linear regression? My laser displacement sensor is already sampling at a very high rate (> 1 kHz). Also, at the moment I am NOT adjusting the stepper motor's stepping frequency. All I want to determine is the velocity. As the stepper motor moves, I just want to monitor the velocity. How can I use the Derivative x(t) PtByPt VI to do this? I don't really understand how dt for the Derivative x(t) PtByPt VI is calculated.

Message 5 of 6

You have a subVI that returns a single scalar position value in one loop and writes it to an indicator.  A separate loop reads that scalar value from a local variable and does a point-by-point derivative.  That loop is paced by a msec wait timer.

 

Generally, the "correct" value for dt would *want* to be the time between the 2 samples used to calculate the pt-by-pt derivative.  The way the code is constructed, some of that timing is obscured (to me) by the use of the subvi.  (I'm still on LV 2016 and can't open the code.)  The indirectness caused by using a local variable further obscures any sample timing info.

 

The best thing you *could* do is use the derivative loop's iteration time interval for dt.  You could estimate it to be at least the value wired into the msec wait timer, but it would be better to measure the actual delta time.  Even then, those times don't represent the instant when your values were sampled in the real world.  Due to the indirectness of querying a local variable, the position values have timing that merely means: "at the time I queried the local variable, this value was the one most recently written to it at some point in the past; I have no way of telling how long ago."

   In a general and kinda crude way, you can get some idea of your velocity.  But a better approach could give you better data.
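In rough terms, the loop is doing something like the sketch below. Python is used only to show the arithmetic, and read_position_mm is a hypothetical stand-in for the subVI/local-variable read (which is exactly the part that loses the real sample timing):

```python
import time

def read_position_mm():
    """Hypothetical stand-in for the subVI / local-variable read;
    here it just simulates a steady 0.005 mm/s ramp."""
    return 0.005 * time.perf_counter()

prev_pos = read_position_mm()
prev_t = time.perf_counter()

for _ in range(10):
    time.sleep(0.010)                  # plays the role of the msec wait timer
    pos = read_position_mm()
    now = time.perf_counter()

    dt = now - prev_t                  # *measured* interval, not the wired 10 ms
    velocity = (pos - prev_pos) / dt   # finite-difference velocity in mm/s
    print(f"dt = {dt*1e3:5.1f} ms   v = {velocity:.4f} mm/s")

    prev_pos, prev_t = pos, now
```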

 

RE: doing a linear regression

   There are 2 main elements to my recommendation.  First, it's based on using a hardware clocked task to give you more accurate timing information.  Second, it's based on averaging the derivative across more sample points, which should noticeably reduce the noise in your velocity data.

 

The faster you iterate the pt-by-pt derivative loop, the noisier and more erratic your velocity calculations will get.  There's not much of a way around that due to the simple finite difference math being used.  Doing a linear regression does the math over a much larger set of points and will thus be less erratic numerically.  If you want to update your velocity measurement every 100 msec, you could sample at 1000 Hz and do a linear regression on each set of 100 pts or you could sample at 10000 Hz and do a linear regression on each set of 1000 pts.   The more points you get in each set, the less erratic the calculation.  The key is that you can *put* more points in each set just by making a corresponding increase in the sample rate.
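As a rough illustration of that noise tradeoff, here's a simulated comparison in Python (all rates and noise levels are invented for the demo):

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 1000.0                   # sample rate in Hz
true_v = 0.005                # true velocity in mm/s
t = np.arange(0, 1.0, 1 / fs)
pos = true_v * t + 0.0005 * rng.standard_normal(t.size)  # noisy position, mm

# Point-by-point finite difference: dividing noise by a tiny dt amplifies it.
v_ptbypt = np.diff(pos) * fs
print("finite-difference scatter:", v_ptbypt.std())       # huge vs. 0.005

# Regression slope over 100-sample blocks: one velocity value every 100 ms.
v_blocks = [np.polyfit(t[i:i + 100], pos[i:i + 100], 1)[0]
            for i in range(0, t.size, 100)]
print("block-regression scatter: ", np.std(v_blocks))     # far smaller
```

The same trick scales: at 10 kHz with 1000-point blocks, the regression scatter shrinks further still.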

 

 

-Kevin P

Message 6 of 6