Student Projects


Building an NI LabVIEW Toolkit for a Humanoid Robot for Assisting in Social Skills Improvement for Autistic Children

Contact Information

University: Qatar University

Team Members (with year of graduation): Nour Abdelmajid Musa (2014) and Waed Adel Hakouz (2014)

Faculty Advisers: Dr. Uvais Qidwai and Eng. Mohamed Shakir

Email Address: nm1001762@qu.edu.qa wh095645@qu.edu.qa uqidwai@qu.edu.qa

Submission Language: English

Project Information

Products

     - LabVIEW

Introduction of the Team Members

Nour Musa is a Computer Engineering (CE) senior at Qatar University, expected to graduate in June 2014. She has taken several courses that use the LabVIEW graphical programming environment to apply the concepts they introduce, such as Fundamentals of Electronics, Digital Signal Processing, and Computer Interfacing. She has also worked on several LabVIEW projects, including a Dual Tone Multi Frequency (DTMF) Encoder/Decoder, Remote Client/Server Temperature Measurement and Control, and a Light Direction Indicator using a stepper motor. She participated in the Course Project Award (CPA) competition organized by Qatar University, where her DTMF Encoder/Decoder course project won first prize. Her main interests are network security and robotics.

Wa’ed Hakouz is a CE senior at Qatar University, expected to graduate in June 2014. She has taken several courses that use the LabVIEW graphical programming environment to apply the concepts they introduce, such as Fundamentals of Electronics, Digital Signal Processing, and Computer Interfacing. She has also worked on several LabVIEW projects, including a DTMF Encoder/Decoder, Remote Client/Server Temperature Measurement and Control, and a Light Direction Indicator using a stepper motor. She participated in the CPA competition organized by Qatar University, where her DTMF Encoder/Decoder course project won first prize. Her main interests are network security, robotics, and neural networks.

The Challenge

Humanoid robots can be limited by the method used to control them. The goal of this project is a simple way to control a humanoid robot using a general graphical programming tool, so that developers can easily create applications for these robots and educators can use them with equal ease as a teaching tool.

The Solution

To achieve this goal, two solutions were developed. The first is an intrusive solution, which changes the robot's internal circuitry. The second is a non-intrusive solution, which adds only external components. For both solutions, a LabVIEW toolkit was designed to control the robot's functionalities. The toolkit structure is the same for both solutions, so high-level subVIs let the user choose either solution to access the robot's basic functionalities. Low-level subVIs, however, work only with the intrusive solution; they let the user control each motor on the robot's body individually, enabling functionalities beyond the basic ones built into the robot.

As mentioned earlier, the aim of this project is a simple and general way to control a humanoid robot. LabVIEW makes the solutions both general and simple: it is widely used by developers for a broad range of applications, and LabVIEW toolkits are easy to use and reduce the time spent building application components from scratch.

For both solutions, the humanoid robot used is the RoboSapien [1], one of the simplest humanoid robotic toys commercially available. It ships with an IR remote controller with 21 keys and several built-in functionalities, and is internally driven by seven DC motors controlled by IR signals. The intrusive solution replaces IR control with Bluetooth to overcome the limitations of IR signals; this was done using the VISA blocks and protocols provided by LabVIEW. The non-intrusive solution, by contrast, removes the need for the IR remote and instead sends the remote's signals with a USB-UIRT device; here the LabVIEW USB-UIRT library was used to transmit the correct IR signals from LabVIEW VIs.

Robotic toys have been used to assist in teaching and training autistic children for several years now. However, existing solutions tend to be expensive, hard or impossible to modify, or tied to sophisticated, user-unfriendly control environments. An expensive solution using the NAO humanoid robot was introduced in a paper published in 2012 [2]. Another solution, presented in 2013, is similar to the proposed non-intrusive solution but offers fewer functionalities and introduced only a joystick application [3]. Since IR communication is limited by line of sight, the intrusive solution introduced in this paper uses the Bluetooth communication protocol to overcome that limitation. The non-intrusive solution, on the other hand, keeps IR signals so that none of the robot's circuitry has to be changed. The high-level architectures of both solutions are shown in Figure 1 and Figure 2.


Figure 1.jpg

Figure 1: High Level Architecture for the intrusive solution.


Figure 2.jpg

Figure 2: High Level Architecture for the non-intrusive solution.


LabVIEW

The graphical programming language LabVIEW was used throughout this project, across both solutions, for several reasons. First, LabVIEW is an easier and simpler way of programming compared to text-based languages with no graphical interface, which lets the work focus on developing the application rather than on language syntax. Second, LabVIEW is a powerful tool for real-time applications, being a fast and intuitive programming environment. Third, LabVIEW offers many powerful toolkits for different hardware, making it ideal for developing new applications on a wide variety of devices. Finally, LabVIEW provides a large number of options for customizing user interfaces, which matters in this project because most users will not be familiar with the LabVIEW environment; an easy-to-use interface gives them an application that is simple to work with.

The RoboSapien Toolkit is composed of two sets of subVIs: one set for the intrusive solution and the other for the non-intrusive solution.

Intrusive solution

For this solution, the robot's microcontroller was replaced by a chipKIT Uno32 [4], motor drivers were added to the circuitry, and a Bluetooth module [5] was used for effective wireless communication with the new microcontroller. Some of these changes are shown in Figure 3 and Figure 4. The Uno32 was chosen for its low cost and because it had been given to the university for testing in student applications. To recreate the robot's basic functionalities, the motors were labeled as shown in Figure 5, and a low-level LabVIEW subVI was created for each motor, enabling it to move in two directions. These subVIs, combined, form the structure of the Humanoid Robot Toolkit.


          Figure 3.jpg

Figure 3: The replaced microcontroller

          Figure 4.jpg

Figure 4: Motor driver on the robot body.

          Figure 5.jpg

Figure 5: Motor Distribution on the RoboSapien robot body.

An example of these subVIs is shown in Figure 6; this subVI drives motor 2 to open the right elbow outward. These low-level subVIs were later combined into high-level subVIs representing the robot's functionalities, such as a move-forward subVI. This solution adds functionalities the robot did not originally have, such as moving the right leg forward; it is also ideal for teaching robotics and interfacing, and it gives full control of the robot's motors. The solution is summarized by the tree diagram in Figure 7.


Figure 6.jpg

Figure 6: Right Elbow outward Block Diagram.


Figure 7.JPG

Figure 7: RoboSapien Basic functionalities and actions.


Taking one subVI as an example is sufficient to understand the toolkit's structure, because all of its subVIs follow the same pattern. The overall design sends data serially from a LabVIEW subVI to the Bluetooth module, which is connected to the serial transmission pins (Tx, Rx) of the Uno32; the Uno32 then interprets the data according to the firmware uploaded to it. The firmware simply turns digital ports on or off depending on the bytes received. For example, for the subVI in Figure 6, the LabVIEW program sends 'w', 's', 'v', and then 'x' serially using the VISA protocol; when the Uno32 receives them, it turns port 33 ON, turns ports 31 and 32 ON and OFF respectively, and finally turns port 33 OFF again. The complete firmware is included in Appendix A.
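The host side of this protocol can be sketched in Python for illustration (an assumption: the actual project sends these bytes from a LabVIEW subVI via VISA). The byte sequence 'w', 's', 'v', 'x' comes from the example above; the action name and the pyserial-style send function are hypothetical.

```python
# Hypothetical host-side sketch of the serial protocol described above.
# In the project, a LabVIEW subVI sends these bytes via VISA; here a
# pyserial-style port object stands in for illustration only.

ACTION_SEQUENCES = {
    # Right elbow outward (motor 2): the firmware turns port 33 ON,
    # ports 31/32 ON and OFF, then port 33 OFF again (per the text).
    "right_elbow_outward": b"wsvx",
}

def encode_action(action: str) -> bytes:
    """Return the single-byte command sequence for a named action."""
    if action not in ACTION_SEQUENCES:
        raise ValueError(f"unknown action: {action!r}")
    return ACTION_SEQUENCES[action]

def send_action(port, action: str) -> None:
    """Write the encoded bytes to an open serial port object
    (e.g. a pyserial Serial bound to the Bluetooth module's COM port)."""
    port.write(encode_action(action))
```

Keeping the encoding separate from the transmit call mirrors the toolkit's layering: the same action table serves any transport.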

To understand why ports are driven HIGH or LOW, the connections to those ports must be explained. In this design, four motor drivers were used in total to supply the motors with enough current to operate. The TB6612FNG motor driver was chosen; it drives up to two DC motors. Each motor requires a PWM pin and two direction pins (CW, CCW). To reduce the number of connections, the four direction pins of the two motors on each driver were tied together, so only the motor whose PWM pin is HIGH will run. Thus each motor driver uses 4 microcontroller ports instead of 6, arranged as follows: PWM for the first motor, the two direction pins, then PWM for the second motor.
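The shared-direction-line scheme above can be sketched as a small helper that computes the digital-port levels for one driver. This is a sketch under assumptions: the actual port numbers and polarity come from the firmware in Appendix A; only the layout [PWM motor A, direction 1, direction 2, PWM motor B] follows the text.

```python
# Sketch of the shared-direction-line wiring described above.
# Port layout per driver (an assumption matching the text):
#   [PWM motor A, direction pin 1, direction pin 2, PWM motor B]
# Only the motor whose PWM pin is HIGH actually runs.

def driver_port_levels(base_port: int, motor: int, clockwise: bool):
    """Return a {port: level} map that runs one of the two motors
    on a TB6612FNG whose direction lines are tied together."""
    pwm_a, dir1, dir2, pwm_b = range(base_port, base_port + 4)
    return {
        pwm_a: motor == 0,       # PWM HIGH selects motor A
        dir1: clockwise,         # shared direction lines set CW/CCW
        dir2: not clockwise,
        pwm_b: motor == 1,       # PWM HIGH selects motor B
    }
```

This is why 4 ports per driver suffice: the direction pair is shared, and the two PWM lines act as per-motor enables.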

After building the RoboSapien Toolkit, two applications were developed for it: mimicking human movements, and controlling the robot using a joystick.


Mimicking Human Movements

In this application, the Microsoft Kinect camera captures human movements and the corresponding commands are sent to the robot so that it mimics those movements in real time. Alongside the Kinect camera, the LabVIEW Kinesthesia toolkit was used to obtain joint coordinates, which are then mapped to the appropriate subVI or subVIs of the RoboSapien Toolkit; the subVI then sends the command to the robot over Bluetooth. This application is a very useful educational tool with autistic children, since robotic toys capture their attention faster than a teacher can, allowing the teacher to interact with the child through the robot. The same application was also implemented with the non-intrusive subVIs, with only the subVIs changed; a figure of its front panel is shown only in the non-intrusive solution section to avoid repeating figures.

The Kinesthesia LabVIEW toolkit was a huge benefit: it offers a simple, direct way to use the Microsoft Kinect camera, access all its functionalities, and easily obtain joint coordinates. The remaining work was mapping those coordinates to the intrusive subVIs of the RoboSapien Toolkit.
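The coordinate-to-command mapping can be illustrated with a minimal sketch. The joint names, the 0.10 m threshold, and the command names are illustrative assumptions; in the project, Kinesthesia supplies the actual (x, y, z) joint coordinates and the commands are RoboSapien Toolkit subVIs.

```python
# Hedged sketch of the joint-to-command mapping: compare the right
# hand's height (y coordinate) to the right shoulder's to pick a
# command. Names and the 0.10 threshold are assumptions, not the
# project's actual mapping.

def classify_right_arm(joints: dict) -> str:
    """Map right-hand position relative to the shoulder to a command."""
    hand_y = joints["right_hand"][1]
    shoulder_y = joints["right_shoulder"][1]
    if hand_y > shoulder_y + 0.10:   # hand clearly above the shoulder
        return "right_arm_up"
    if hand_y < shoulder_y - 0.10:   # hand clearly below the shoulder
        return "right_arm_down"
    return "right_arm_hold"
```

A dead band around the shoulder height (the ±0.10 here) keeps sensor jitter from toggling the robot's arm rapidly.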

Controlling the robot using a joystick

Controlling a humanoid robot through a joystick is more interesting and entertaining for kids than the usual remote control. This application therefore introduces another way to control the robot, using the RoboSapien Toolkit in an easy, direct way through a LabVIEW VI. The front panel, shown in Figure 8, consists of an image of the Logitech joystick with Boolean LEDs that light up to show which button is pressed while the VI is running. In the block diagram, shown in Figure 9, each joystick button is mapped to one or more subVIs of the RoboSapien Toolkit, depending on the functionality assigned to that button.
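The button dispatch in the block diagram can be sketched as a lookup table. The button indices and command names below are illustrative assumptions; in the VI, each entry corresponds to one or more toolkit subVIs.

```python
# Minimal sketch of the joystick button dispatch: each button index
# maps to one or more toolkit commands. Button numbers and command
# names are illustrative assumptions.

BUTTON_MAP = {
    0: ["move_forward"],
    1: ["move_backward"],
    2: ["turn_left"],
    3: ["turn_right"],
    4: ["right_arm_up", "left_arm_up"],  # one button, two subVIs
}

def commands_for_buttons(pressed: list) -> list:
    """Flatten the commands for all currently pressed buttons,
    ignoring buttons with no assigned functionality."""
    cmds = []
    for button in pressed:
        cmds.extend(BUTTON_MAP.get(button, []))
    return cmds
```

In the VI, this table is effectively the wiring between each Boolean button indicator and the subVIs it triggers.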


          Figure 8.jpg

Figure 8: Joystick application front panel


          Figure 9.jpg

Figure 9: Joystick Application Block Diagram.


Non-intrusive solution

In this solution, neither the robot nor its internal circuitry is touched; instead, the robot's remote control is replaced with another control method. IR transmission is still used, and the robot's originally programmed IR signals remain unchanged; the difference is that the commands are sent by a USB-UIRT device instead of the remote control. A second set of subVIs was created for this solution, responsible for transmitting the RoboSapien IR commands through the LabVIEW USB-UIRT library. This solution delivers sixty subVIs covering functionalities ranging from hand and leg movements to tilting right and left and much more. To create these subVIs, the robot's coded IR signals were obtained by hacking the communication method used to control it; those IR commands are then transmitted by the subVIs. Figure 10 shows the block diagram of one subVI. This solution requires less knowledge of hardware interfacing and provides the robot's original functionalities through the built toolkit. As examples of its use, two applications were developed: one for mimicking human movements, and a LabVIEW remote control.
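The subVI structure for this solution reduces to looking up a captured IR code and handing it to a transmit call. A minimal sketch follows; the codes below are placeholders, not the real captured RoboSapien codes, and the `transmit` callback stands in for the USB-UIRT library's send call.

```python
# Sketch of the non-intrusive command path. Each subVI wraps one
# pre-captured IR code; the codes here are PLACEHOLDERS, not the
# actual RoboSapien IR codes obtained in the project.

IR_CODES = {
    "right_arm_up": "PLACEHOLDER_IR_CODE_RIGHT_ARM_UP",
    "walk_forward": "PLACEHOLDER_IR_CODE_WALK_FORWARD",
}

def ir_code_for(command: str) -> str:
    """Look up the captured IR code a subVI would transmit."""
    if command not in IR_CODES:
        raise KeyError(f"no captured IR code for {command!r}")
    return IR_CODES[command]

def send_command(command: str, transmit) -> None:
    """Transmit one command; `transmit` stands in for the
    USB-UIRT library's send call."""
    transmit(ir_code_for(command))
```

Each of the sixty subVIs is, in effect, one entry of this table bundled with the transmit call, which is why building applications on top of them is mostly composition.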


Figure 10.jpg

Figure 10: Move right arm up subVI (Non-intrusive)


Mimicking Human movements

This application is similar to the one implemented in the intrusive solution, but it uses the non-intrusive subVIs. The Kinect camera captures the human movements, and the LabVIEW Kinesthesia toolkit provides the joint positions. These joint coordinates are mapped to the appropriate non-intrusive subVIs, which send the proper IR commands through the USB-UIRT device so the robot mimics the movements in real time. Figure 11 shows the front panel of the application VI.


Figure 11.jpg

Figure 11: Mimicking application front panel


LabVIEW RoboSapien Remote Control

This application replaces the physical RoboSapien remote control with a virtual LabVIEW remote, so one can control the robot through the LabVIEW VI as easily as with the original remote. The front panel of this VI resembles the actual RoboSapien remote: it contains the same buttons and performs the same functionalities. Figure 12 shows the front panel of this application VI. The block diagram combines the various non-intrusive subVIs to perform each button's functionality; Figure 13 shows the block diagram for this application VI.


          Figure 12.jpg

Figure 12: RoboSapien Remote Control front panel


          Figure 13.JPG

Figure 13: RoboSapien Remote Control block diagram.

Design Enhancements

One enhancement already completed is moving the whole system, including the RoboSapien Toolkit and all its applications, to an 8-inch tablet running the 32-bit Windows 8 operating system with a customized LabVIEW installation. Because of the tablet's memory limitations, LabVIEW had to be installed in a special way, so a strategy for a minimal LabVIEW installation was devised by eliminating unnecessary components. Figure 14 shows the tablet running one of the applications.


          Figure 14.jpg

Figure 14: Windows 8 tablet.


Game Applications

Using the RoboSapien Toolkit, one can develop an endless number of applications; as mentioned earlier, several applications were already created with it. Another class of applications that can use the toolkit is games, a wide domain whose limits are set only by human creativity and imagination. The next two games give a sense of what kinds of applications can be developed. They were created and tested with autistic children, but like any other games they can also be used as educational games for children in general.

- Get Me Out: this activity was done with a high-functioning child. The child navigates the robot through a maze placed on the floor by moving his or her arms up, down, and sideways in front of the Kinect, which steers the robot through the maze. The same Kinect application program was used here, but with a different movement mapping. The Kids Pix application idea could make the game more interesting: the child would trace the maze with a finger on a tablet, and the robot would follow on the maze placed on the floor. Figure 15 shows the maze used in this game; Figure 16 shows part of the testing video, and the full video is also provided.

          Figure 15.jpg

Figure 15: The Maze (Get Me Out! game).


                         Figure 16.JPG

               Figure 16: Testing video for Get Me Out! game.


- RoboColor: this activity was done with a low-functioning child who, in initial tests, did not understand the mimicking concept, so this game was designed to teach the concept in reverse. The child clicks on the robot picture in the middle, listens to the named color, and clicks on it so that the robot walks toward that color on a poster on the ground. The game can also be played in reverse, using the turn-taking concept, where the child tries to copy the robot. Figures 17 and 18 show the front panel and block diagram of the RoboColor game program; Figure 19 shows part of the testing video, and the full video is also provided.

          Figure 17.jpg

               Figure 17: RoboColor Front Panel.


                         Figure 18.jpg

               Figure 18: RoboColor Block Diagram.


                         Figure 19.jpg

               Figure 19: Testing video for RoboColor game.



Future Work

The most interesting part of this project is the large amount of future work it opens up. Several near-term plans are listed below:


  1. NAO Robot: NAO is a sophisticated humanoid robot with extensive functionalities that allow the development of an endless number of applications. The plan is to develop a LabVIEW toolkit for NAO as a first step and then use this toolkit to create several applications. Figure 20 shows the NAO humanoid robot.
  2. Internet-Based Connectivity: adding this feature will increase the project's flexibility by allowing users, whether teachers or parents, to control the robot and run all the developed activities remotely. The LabVIEW Remote Panel Connection feature will be used for this.


          Figure 20.jpg

Figure 20: NAO Humanoid Robot.

See attached Word document for references and appendices.
