Student Projects


OmnIVOice - Social robot

Contact Information

University: Lodz University of Technology

Michał Maciejewski BSc, Student, Lodz University of Technology, Poland (2014)

Marek Niemiec BSc, Student, Lodz University of Technology, Poland (2013)

Łukasz Ryszka BSc, Student, Lodz University of Technology, Poland (2013)

Faculty Adviser: Grzegorz Granosik, PhD, DSc

Email Address: michalmaciejewski89@gmail.com

Submission Language: English

Project Information

Title: OmnIVOice - social robot

Products

LabVIEW Robotics 2009

LabVIEW 2011 SP1

NI myDAQ

The Challenge

To build an autonomous mobile robot that interacts with humans and operates in confined spaces. A key part of the project was to design a PC application that integrates several hardware and software modules: an ARM microcontroller, motion sensors, a PC-104 embedded computer, complex control algorithms, and a graphical user interface.

The Solution

We used the rich feature set of LabVIEW to develop an application on a PC. The main application efficiently integrates various hardware and software components and controls the behavior of the robot. As a result of the successful design, our OmnIVOice robot can be used for research on human-machine interaction.


Introduction

In recent years, dynamic development of social robots has been observed. In contrast to the widely known and applied industrial robots, which help humans with difficult and dangerous tasks, social robots are designed to assist people in basic daily routines. There are already robots that work with people as personal assistants, guides, or caregivers. Many of these robots are relatively large and therefore unable to work in small, confined environments such as a desk or a table. Moreover, Human-Robot Interaction (HRI) is still a new field of research, and many issues, such as the ability to accumulate knowledge and collaborate autonomously, have not yet been sufficiently addressed.

This text describes the student project "OmnIVOice – social robot". OmnIVOice (Figure 1) is an autonomous mobile robot comprising several interconnected modules. The robot implements basic HRI functions: voice generation, human body recognition, obstacle avoidance, and wireless communication. In the future, the project aims at testing HRI.


Figure 1: Mobile robot OmnIVOice

Our project started with the design of the mechanical part (Figure 2). As assumed at the beginning, the platform should be able to work and cooperate with humans in confined areas.


Figure 2: 3D model of the platform

For that purpose we decided to use omnidirectional kinematics: our robot can move in any direction at any time and is holonomic. To obtain this functionality, we used three motorized 90-degree Swedish wheels (with rollers perpendicular to each main wheel) arranged symmetrically on the circumference of the robot platform. The robot kinematics, i.e. the relation between the angular velocities of the wheels and the linear and angular velocity of the platform, is shown in Figure 3.


Figure 3: Omnidirectional kinematics
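The equation in Figure 3 is not reproduced here, but the standard inverse-kinematic relation for three 90-degree Swedish wheels spaced 120 degrees apart can be sketched as follows. The wheel radius and platform radius below are hypothetical values, not the real platform's dimensions:

```python
import math

# Hypothetical dimensions (metres); the real platform's values may differ.
WHEEL_RADIUS = 0.03      # r: radius of each Swedish wheel
PLATFORM_RADIUS = 0.15   # L: distance from the platform centre to each wheel

# Mounting angles of the three wheels, spaced 120 degrees apart.
WHEEL_ANGLES = [0.0, 2 * math.pi / 3, 4 * math.pi / 3]

def wheel_speeds(vx, vy, omega):
    """Map platform velocity (vx, vy in m/s, omega in rad/s) to the
    angular velocity (rad/s) of each wheel: the inverse kinematics."""
    return [(-math.sin(a) * vx + math.cos(a) * vy + PLATFORM_RADIUS * omega)
            / WHEEL_RADIUS
            for a in WHEEL_ANGLES]
```

A quick sanity check of this relation: pure rotation (vx = vy = 0) drives all three wheels at the same speed, while any pure translation produces wheel speeds that sum to zero.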

We obtained good performance from the robot's kinematics. The robot platform is built of plexiglass and aluminium due to their strength and low weight. Moreover, together with a picture of a lady, it creates a human-friendly image. After the base platform was done, we started the electronic and software part of the project.


Figure 4: System architecture

The OmnIVOice architecture (Figure 4) is organized as follows:

  1. The STM32 microcontroller module is responsible for controlling the motion of the platform. Its application was developed in C++. It sends the angular velocity of each wheel to stepper motor controllers based on L297 and L298 chips. The application also implements obstacle avoidance, a very important feature because the robot must not damage itself or its environment during operation. Additionally, to connect wirelessly with the host computer we used radio frequency (RF) modules based on the RS-232 standard.
  2. The PC-104 module follows an industrial standard and runs Windows XP. We built an application in C++ based on the Windows Application Programming Interface (API) and the Speech Application Programming Interface (SAPI) libraries. It receives the text to be spoken from the main application. Our robot uses the voice of Maja, provided by IVONA Software. As with the previously described module, speech commands are sent wirelessly.
  3. The LabVIEW application installed on the host PC controls the behavior of the robot. The platform can be controlled by a 3-degree-of-freedom (DOF) joystick or by the Microsoft Kinect motion sensor. The application also manages wireless communication with the ARM microcontroller and the PC-104 computer.

Modules 1 and 2 are located on the mobile platform, while the user interface runs on the desktop computer.
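The division of labour between the three modules can be illustrated with a small host-side dispatcher. The message formats below are invented for illustration; the article does not describe the real protocol:

```python
def encode_motion(w1, w2, w3):
    """Frame a wheel-velocity setpoint for the STM32 module
    (hypothetical ASCII format, not the project's actual protocol)."""
    return f"M,{w1:.2f},{w2:.2f},{w3:.2f}\n".encode("ascii")

def encode_speech(text):
    """Frame a text-to-speech request for the PC-104 module
    (hypothetical format)."""
    return ("S," + text + "\n").encode("ascii")

class HostDispatcher:
    """Routes commands from the main application to the right RF link:
    motion setpoints to the STM32, speech text to the PC-104."""
    def __init__(self, motion_port, speech_port):
        self.motion_port = motion_port  # serial link to the STM32
        self.speech_port = speech_port  # serial link to the PC-104

    def drive(self, w1, w2, w3):
        self.motion_port.write(encode_motion(w1, w2, w3))

    def say(self, text):
        self.speech_port.write(encode_speech(text))
```

The `motion_port` and `speech_port` objects stand in for whatever serial handles the host application opens; anything with a `write` method will do.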

Programming With LabVIEW

During the design phase we decided to choose a programming environment that offers intuitive programming, a graphical user interface (GUI) created alongside the code, and many design patterns. Additionally, the environment should guarantee ease of code design thanks to ready-to-use advanced functions and integration with third-party software and hardware. NI LabVIEW was a natural choice for us due to its many advantages. Another important feature of LabVIEW is that it encourages the programmer to create scalable and modular code; as a consequence, each project built in LabVIEW can easily be extended with new features. Since we did not have to create drivers, functions, or the GUI from scratch, we were able to focus completely on implementing the complex algorithm that controls the robot's behavior. Besides NI products, we also considered using C++ or C# to develop the main application, but that approach was more time-consuming and difficult due to issues with event-driven programming.

The code of the main application is based on the event-based design pattern. The Event Structure responds directly to events in the order in which they occur in the system and executes the code written for each event. Since, in the case of an autonomous robot, events may occur simultaneously, the application can react to all of them in the order they occurred without utilizing 100% of the CPU time. The timeout terminal of the Event Structure is set to 100 ms. This ensures that the application responds to the front-panel controls, does not confuse the user, and reduces CPU requirements. The code created this way integrates the STM32 microcontroller, which works synchronously at 1 kHz, with the PC-104 application, which works asynchronously.
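LabVIEW's Event Structure has no direct textual equivalent, but its behaviour, handling queued events in arrival order and running a timeout case when the queue is idle, can be approximated in Python as a sketch. The event names are illustrative only:

```python
import queue

TIMEOUT_S = 0.1  # mirrors the 100 ms timeout terminal of the Event Structure

def run_event_loop(events, handle, on_timeout, max_iterations=100):
    """Process events in the order they occurred; when no event arrives
    within TIMEOUT_S, run the timeout case (e.g. refresh the GUI).
    A "quit" event ends the loop."""
    for _ in range(max_iterations):
        try:
            ev = events.get(timeout=TIMEOUT_S)
        except queue.Empty:
            on_timeout()
            continue
        if ev == "quit":
            return
        handle(ev)
```

Because the loop blocks on the queue instead of polling, it reacts to every event in order while leaving the CPU idle between events, which is the property the text attributes to the Event Structure.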

Additionally, the LabVIEW environment simplified the design of the graphical user interface (Figure 5).


Figure 5: Graphical user interface

This is a consequence of the fact that in LabVIEW the code and the graphical user interface are created at the same time. To sum up, the LabVIEW application is the heart of our project: the hub around which everything else revolves. The application receives information about the current state of the robot, processes these data, and sends back commands with the new velocity of each wheel and the text to be spoken.

Wireless Communication

In order to obtain freedom of movement, we decided to use wireless communication between the host PC and the platform. For that purpose we chose radio frequency modules compatible with the RS-232 standard. The RF module (868 MHz) connected to the host computer is based on a Future Technology Devices International (FTDI) device, a well-known technology for converting RS-232 serial interfaces to USB. This solution saved us time because we could easily and quickly create drivers in LabVIEW based on the appropriate .dll libraries. Furthermore, embedded error-handling code guarantees stable operation of the driver and indicates where an error occurred.
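The article does not specify the serial protocol, so as an illustration only, a minimal framing layer with a checksum, the kind of check that error-handling code on such a link relies on, might look like this:

```python
def frame(payload: bytes) -> bytes:
    """Wrap a payload with a start byte, a length byte, and a simple XOR
    checksum (hypothetical framing; the RF modules' real protocol is
    not documented in the article)."""
    checksum = 0
    for b in payload:
        checksum ^= b
    return bytes([0x7E, len(payload)]) + payload + bytes([checksum])

def unframe(data: bytes) -> bytes:
    """Validate and strip the framing; raise ValueError on a corrupted frame."""
    if len(data) < 3 or data[0] != 0x7E:
        raise ValueError("bad start byte")
    length = data[1]
    payload, checksum = data[2:2 + length], data[2 + length]
    expected = 0
    for b in payload:
        expected ^= b
    if checksum != expected:
        raise ValueError("checksum mismatch")
    return payload
```

A checksum like this lets the receiver detect a corrupted radio frame and report where the error occurred instead of acting on bad data.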

Motion Control

The platform can be controlled in two ways: by joystick or by Kinect. The movements of the 3-DOF joystick match those of the platform, so controlling the platform feels very natural. The main application was also integrated with Microsoft Kinect, a well-known motion-sensing device used in video games. Kinect offers sophisticated vision processing with human recognition algorithms and 3D mapping. These functions extended the motion controller of our robot with new features such as human body tracking and gesture recognition. We used LabVIEW libraries provided by Florian Abry, an Applications Engineer from NI Germany.
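The joystick mapping described above can be sketched as a direct axis-to-velocity scaling with a deadzone. The scale factors and deadzone below are assumed values for illustration:

```python
MAX_LINEAR = 0.5   # m/s, assumed top speed of the platform
MAX_ANGULAR = 1.5  # rad/s, assumed top turn rate
DEADZONE = 0.05    # ignore tiny stick offsets around centre

def joystick_to_velocity(x, y, twist):
    """Map normalized joystick axes in [-1, 1] to (vx, vy, omega).
    The joystick moves like the platform, so the mapping is direct."""
    def shaped(axis, scale):
        return 0.0 if abs(axis) < DEADZONE else axis * scale
    return (shaped(x, MAX_LINEAR),
            shaped(y, MAX_LINEAR),
            shaped(twist, MAX_ANGULAR))
```

The deadzone keeps the robot stationary when the stick rests near centre, while full deflection of any axis commands the corresponding maximum platform velocity.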

Information about the expected behavior of the robot (i.e. its linear and angular velocities) obtained from the aforementioned devices is transformed into the angular velocities of the individual wheels using the vector-matrix equations describing the kinematics of the platform (the so-called inverse kinematics).

Conclusions

While realizing the OmnIVOice project, we observed significant differences in the design and build process of the various modules of our system. The electronic components and the associated software were developed in parallel. The application on the STM32 microcontroller required dedicated drivers for peripheral devices, which had to be created from scratch. Although this process was simplified by libraries with configuration functions, we still had some problems with advanced options. Nevertheless, the STM32 module implements low-level motion control at high frequency. In the case of the PC-104 computer we could mostly employ ready-to-use libraries, though some problems appeared with voice generation. LabVIEW eliminates such problems by delivering ready-to-use functions and drivers as well as a library of examples and a multi-level help system. Moreover, it is worth noting that LabVIEW integrates multiple programming languages: we used .dll libraries, .NET functions, and m-files. Therefore, we were able to focus on implementing the algorithm rather than dealing with problems related to drivers, image processing functions, and so on. We found the code created on the STM32 and PC-104 more difficult to test and develop than the LabVIEW code. With LabVIEW we efficiently developed a modular and scalable high-level application with an intuitive GUI, and the application was finished significantly faster.

The LabVIEW application collects and synchronizes information about the current state of the robot's behavior as well as its environment. Additionally, serial communication (over the RS-232 standard) integrates the different platforms without any problems. We believe that, as a result, the system is greater than a simple combination of modules. This approach, based on scalability and modularity, opens an easy path for further development. The project has been awarded in regional and national robotics competitions.

Contributors