Scan Engine with EtherCAT

Hi guys,

 

I have trouble understanding how the Scan Engine behaves during operation in EtherCAT mode.
I have the following setup communicating over EtherCAT:
- A cRIO-9039 configured as the EtherCAT master

- 3 servo drives from LinMot (E1250-EC-UC) configured as slaves

 

I can access my drives through I/O variables in LabVIEW, which allows me to read the motor position during a movement. I implemented a timed loop that starts reading the motor position from an input variable of all motors the moment I push a trigger button in my VI. The motor movement starts simultaneously with activating the trigger button. To identify possible jitter sources in the position curve received in LabVIEW, I measure the same movement in parallel with the oscilloscopes integrated in the drives. I use the voltage module seen in the project tree to generate a trigger signal: when the trigger button is activated, my VI simply writes the voltage value to the module's I/O variables. My measuring loop is attached to this message, too.

 

Both curves are attached. They show a movement of 40 mm within 30 ms. The green curve is measured with the drive oscilloscopes and serves as the measuring reference, since it starts the moment the trigger signal arrives at the drive. The red curve is measured by LabVIEW. While my program seems to reproduce the curve as it should, I constantly measure a shift of 2 ms.

 

My timed loop runs at 1 ms and so does the Scan Period, which means my EtherCAT fieldbus sends and collects data from the drives every millisecond during the scan procedure. What I don't really understand is why there is a shift of 2 ms instead of 1 ms. I know for sure that it is related to the Scan Period: when I reduced both loop times to 800 µs, the shift became 1.6 ms. At 900 µs it suddenly became only 900 µs.
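Just to restate my numbers, here is a tiny Python snippet (not LabVIEW code, only an illustration of my measurements) that expresses each measured shift as a multiple of the scan period:

```python
# My measured shifts: (scan period in µs, measured shift in µs).
measurements = [(1000, 2000), (800, 1600), (900, 900)]

for period, shift in measurements:
    print(f"scan period {period} µs -> shift = {shift / period:g} period(s)")
```

So at 1 ms and 800 µs the shift is exactly two scan periods, but at 900 µs it is only one period. That is the part I cannot explain.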

 
My initial theory was this: when I activate the trigger, the Scan Engine first passes the trigger value to the VI side, and my loop records the first position point before the actual movement starts, shifting my curve by 1 ms. During the following scan, the trigger value written by my VI is passed through the Scan Engine memory map and the FPGA Scan Interface to my voltage module, which passes the voltage signal to my drives almost immediately, and the movement starts. But during that second scan, another position value from the slaves has already been collected and written into my measuring loop before the movement begins, shifting my curve by another 1 ms. The total shift would therefore be 2 ms; only during the third scan period is the first position point of the actual movement collected, since the drives receive the trigger signal right after the second scan finishes. However, this theory was basically refuted by my measurement at 900 µs loop time, where I observed a shift of only 900 µs.
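For what it's worth, here is a small Python sketch of that theory (purely my own toy model, not how NI documents the Scan Engine): one scan period of output latency for the trigger to reach the module, and one scan period of input latency for the position sample to come back, which would put two pre-movement samples at the start of the logged curve:

```python
SCAN_PERIOD_MS = 1.0
OUT_LATENCY = 1  # scans until a written output reaches the drive (my assumption)
IN_LATENCY = 1   # scans until a sampled input reaches the VI (my assumption)

def simulate(cycles=6):
    trigger_cycle = 0                           # VI writes trigger and starts logging
    motion_cycle = trigger_cycle + OUT_LATENCY  # drive sees the trigger one scan later
    log = []
    for c in range(cycles):
        sampled_at = c - IN_LATENCY  # the sample arriving now was taken one scan ago
        # Crude ramp: position advances by one period's worth of travel per scan
        # once the movement has started.
        pos = max(0, sampled_at - motion_cycle + 1) * SCAN_PERIOD_MS
        log.append(pos)
    return log

print(simulate())  # -> [0.0, 0.0, 1.0, 2.0, 3.0, 4.0]
```

With those assumptions the model reproduces the 2 ms shift (the first two logged samples predate the movement), but it cannot explain why 900 µs gives only one period of shift, which is why I doubt the theory.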

 

Now my question is: how does the Scan Engine pass the trigger value sent from LabVIEW to the VI and to my voltage module? If my theory is not correct, what could be the reason that I measure a shift of 2 ms instead of 1 ms? Am I overcomplicating this?

 

Any advice would be helpful!

Kind regards

Nivogr
