Contact Information
University: University of Leeds (UK)
Team Members: Mazvydas Mark Narvidas (2016)
Project Supervisor: Dr Paul Dean
Email Address: el11mn@leeds.ac.uk
Project Information
Title: LabAtar - Humanoid Robot Teleoperation System
Description: LabAtar is a fully immersive teleoperation robotics project.
This unique robotic control application enables the operator to be present from the robot's point of view (i.e. the user remotely perceives what the robot's cameras observe, and all body movements are mapped directly onto the humanoid, all in real time). Such a futuristic proof of concept opens up possibilities in areas that benefit from closer human-machine interaction, including controlling search-and-rescue robots and interplanetary exploration.
Intro:
Even though we live in a technology age where ingenious innovations like colliding infinitesimally small particles at light speed and reaching speeds of Mach 25 in unpiloted reusable planes are possible, we still rely very heavily on tweaking knobs and pushing buttons when it comes to controlling robots. As technologies progress, new and futuristic control methods become possible, one of which is teleoperation. This approach to robot manipulation is in its infancy and needs an initial boost of development to inspire future undertakings. The LabAtar project therefore focuses heavily on these aspects.
This undertaking is my individual engineering project as part of the MEng Mechatronics and Robotics course at the University of Leeds. Due to module timing arrangements, actual hands-on work began in January and the submission deadline was in early May, giving me about four months to complete this independent development in addition to attending other university modules. For a project of this scale under such tight time constraints, it was crucial to choose a set of tools that would be not only simple to use but also reliable and powerful enough to accommodate all the goals that were set.
Challenge:
By leveraging state-of-the-art technologies available on the market, develop a proof-of-concept control system encompassing all the core functionality of teleoperation robotics. This approach to machine control promises much greater precision, dexterity and reliability. It therefore allows numerous fields to be tackled from a completely new perspective, including disaster relief, search-and-rescue robots and nuclear decommissioning, and even suggests an alternative approach to interplanetary exploration.
Solution:
NI LabVIEW system design software was used to develop a technical solution integrating several diverse, advanced technologies: the Oculus Rift virtual reality headset, the Microsoft Kinect sensor and the NAO humanoid robot. The system included limb-position, head-orientation and relative-location mapping, as well as real-time first-person-view (FPV) video feedback projected to the Oculus Rift head-mounted display. Most importantly, by using the National Instruments unified platform, the solution was capable of seamless scaling: the off-the-shelf humanoid was later replaced by a custom, cost-effective robotics implementation controlled by the NI myRIO embedded hardware device, with the help of the LabVIEW Robotics and LabVIEW FPGA modules.
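To illustrate the limb-position mapping step, here is a minimal sketch in Python (one of the two languages used in the project) of how a joint angle could be computed from three Kinect skeleton points and then forwarded to a robot joint. The function name and the point values are illustrative, not taken from the actual implementation.

```python
# Hypothetical sketch: deriving an elbow angle from Kinect skeleton points
# (shoulder, elbow, wrist). A value like this could be streamed to the
# corresponding joint on the humanoid.
import math

def joint_angle(a, b, c):
    """Angle at point b (radians) formed by segments b->a and b->c."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.acos(dot / (n1 * n2))

# A straight arm: shoulder, elbow and wrist collinear -> angle of pi (180 deg).
angle = joint_angle((0, 0, 0), (0.3, 0, 0), (0.6, 0, 0))
```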
Video 1 University of Leeds footage
Video 2 Technical details of the project
Why NI LabVIEW?
An increasing number of industry professionals are recognising the benefits of graphical system design and exploiting them to their advantage. And it's no secret that National Instruments LabVIEW is at the forefront of this ongoing engineering revolution.
NI LabVIEW is most famous for its platform approach, which allows seamless integration, reduced development times and increased productivity. Choosing LabVIEW was therefore a no-brainer for me: no other solution could have helped me meet my objectives within the time, money and expertise constraints I had.
By using the Queued Message Handler software architecture and modular programming principles, toolkit-like interfaces were built for the Oculus Rift headset and the NAO humanoid robot. This not only made the debugging and testing phases seamless and simple, but also lets the NI community explore and build upon these ideas.
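The Queued Message Handler idea can be sketched outside LabVIEW too: producers enqueue named messages with data, and a handler loop dispatches each one. The message names and handlers below are invented for illustration, not the project's actual commands.

```python
# A minimal, hypothetical sketch of the Queued Message Handler pattern:
# (name, data) messages go onto a queue; a loop dispatches them to handlers
# until an "Exit" message arrives.
from queue import Queue

def run_handler(messages):
    """Dispatch (name, data) messages to handler functions until 'Exit'."""
    q = Queue()
    for m in messages:
        q.put(m)

    log = []
    handlers = {
        "Update Joints": lambda data: log.append(f"joints -> {data}"),
        "Send Video Frame": lambda data: log.append(f"frame {data}"),
    }

    while True:
        name, data = q.get()
        if name == "Exit":
            break
        handlers[name](data)
    return log

# Queue a few messages, then stop the handler loop.
result = run_handler([
    ("Update Joints", [0.1, 0.5]),
    ("Send Video Frame", 42),
    ("Exit", None),
])
```

In LabVIEW the same structure maps onto a message queue wired into a case structure, which is what makes each handler easy to test and swap in isolation.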
Moreover, the project is currently being scaled up to use a custom-built humanoid instead of the NAO. This next project iteration is powered by the NI myRIO embedded hardware device to create a more affordable, customisable and integrated version of the LabAtar (see "What's next?" section below). Again, this has only been possible due to the plug-and-play interface National Instruments provides when it comes to interfacing software to hardware.
I was very lucky to experience the key features of National Instruments products first-hand throughout my project – I had a user interface completed in less than a day, communications over the TCP protocol implemented in hours and integration with the Microsoft Kinect completed in minutes. I cannot name a single other environment that would provide such accelerated productivity, allowing me to spend more time on the subjects that truly matter – innovating and discovering new ideas.
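For a flavour of the TCP link mentioned above, here is a self-contained Python sketch of a host sending a command to a robot-side listener and receiving an acknowledgement. The "MOVE" command format is invented for illustration; the project's actual wire protocol is not documented here.

```python
# Hypothetical sketch of command exchange over TCP, analogous to the link
# between the LabVIEW host and the robot: a listener echoes back an "ACK"
# for each command it receives.
import socket
import threading

def ack_server(sock):
    """Accept one connection, acknowledge one command, then return."""
    conn, _ = sock.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(b"ACK " + data)

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

t = threading.Thread(target=ack_server, args=(server,))
t.start()

client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"MOVE head 0.25")
reply = client.recv(1024)
client.close()
t.join()
server.close()
```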
Figure 2 Front Panel of the main VI
Figure 3 Block-diagram of the main VI
What's next?
Very good question!
A DIY replacement for the NAO humanoid robot is on its way and should be finished within the next 10 weeks. The project has been accepted onto the EPSRC Vacation Bursary scheme, so development will be funded and supervised by research staff at the University of Leeds.
The humanoid robot in development is powered by the myRIO and the LabVIEW Robotics module for real-time simulation of dynamic robot behaviour. Best of all, thanks to the simplicity of LabVIEW FPGA and the integrated Xilinx FPGA chip, all 15 high-torque servos are controlled without any need for dedicated servo controllers.
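The timing behind driving servos directly is simple enough to sketch: each servo expects a pulse of roughly 1–2 ms repeated every 20 ms, with the pulse width encoding the angle. The figures below are the common hobby-servo convention, assumed for illustration rather than measured from this robot, and on the real hardware the FPGA generates these pulses in parallel.

```python
# Sketch of the pulse-width arithmetic behind hobby-servo control, the kind
# of timing an FPGA can generate for many channels at once. The 1-2 ms pulse
# over a 20 ms (50 Hz) frame is the usual convention, assumed here.
def servo_pulse_us(angle_deg, min_us=1000, max_us=2000, max_angle=180):
    """Map a servo angle (0..max_angle degrees) to a pulse width in microseconds."""
    angle_deg = max(0, min(max_angle, angle_deg))
    return min_us + (max_us - min_us) * angle_deg / max_angle

pulse = servo_pulse_us(90)      # mid-travel: 1500.0 us
duty = pulse / 20000 * 100      # duty cycle within a 20 ms frame: 7.5 %
```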
This undertaking is another part of my large-scale LabAtar project vision - bringing affordable and usable teleoperation solutions to the world. Most importantly, such rapid and successful progress has only been possible thanks to the platform-focused, scalable and reliable tools that National Instruments provides!
A sneak peek at its first steps is embedded below.
Figure 4 NI myRIO-powered next iteration of LabAtar
Wow! This looks like a really awesome project. One ethos of robotics is to replace humans with robots in dangerous jobs or tasks, but there is always that feeling of separation, of not connecting to the task. Projects like this bring us a step closer to being there on the scene through the body of a robot, while still distancing ourselves from any real danger. Great job!
Virtual Reality... Wireless sensing... Robotics... all tied together by LabVIEW... to achieve a forward-thinking, noble goal!?
Wow. A truly inspired project.
We all know that there are millions of school children out there who don't think engineering is "cool". They REALLY need to see this project!
Innovation at its finest!
One of the first steps to something truly revolutionary.
Truly impressive.
Brilliant stuff Mark - imaginative and inspiring work. Let me know if you're running demos in elec-eng, would love to see it in action!
Excellent project Mark, congratulations! I love how you integrated three pieces of hardware (Microsoft Kinect, Oculus Rift and NAO humanoid robot) and two programming languages (LabVIEW and Python) into one working system.
Mark, fantastic work, great to see some of our previous students' earlier work with the Kinect still being used (the blue Kinect skeleton brings back some very happy memories!!) and brilliant to see you bringing it up to date with such cutting-edge systems being brought together!!
Best of luck in the competition......
Mark, it is a masterpiece! Let me be the first customer to buy a LabAtar when you start mass production!
This is a wonderful job. Very impressive.