Year Submitted: 2017
University: Cork Institute of Technology
List of Team Members (with year of graduation): Adam Burke (2018), Craig Davitt (2018), Monika Kozakiewicz (2018), Jacek Jankowski (2017)
Faculty Advisers: Donal O’Donovan
Main Contact Email Address: Adam.Burke1@mycit.ie
Title: Autonomous Robotic Tour Guide
Description: The goal of this project is to convert an existing, tele-operated mobile robot platform into one capable of full autonomy, so that it can act as a fully autonomous tour guide for future first-year students. The project uses glyph recognition, obstacle avoidance and GPS waypoint navigation.
Software: LabVIEW 2015,
LabVIEW Real-Time Module,
LabVIEW myRIO 2015 Toolkit,
LabVIEW Robotics Toolkit
Hardware: NI myRIO-1900,
Other Products: Visual Studio 2013,
Pmod GPS: GPS Receiver,
2x Basler acA1300-200uc,
2x Edmund Optics 6mm compact fixed focal length lenses.
Allen-Vanguard Responder.
The aim is to create a fully autonomous robotic tour guide. This robot will guide future first-year students around the college campus, both indoors and outdoors. During the tour it will use cameras to recognise glyphs. When a glyph is recognised, the robot will decide which direction to travel next and, at that point, speak useful information to the students. While in motion, the robot will use a ring of ultrasonic sensors to guide itself down corridors and outdoors, making corrections to avoid oncoming obstacles. This will be implemented using the Advanced Vector Field Histogram in LabVIEW. When the route goes outside, GPS waypoint navigation will guide the robot to the next block on campus.
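The core idea of the Vector Field Histogram can be pictured with a small sketch (shown here in Python for illustration only; the actual implementation is the Advanced VFH in LabVIEW, and the sector count and density threshold below are assumed values, not the project's): ultrasonic readings are binned into angular sectors, and the robot steers toward the clear sector closest to its goal heading.

```python
def pick_heading(densities, target_sector, threshold=0.3):
    """Pick the free sector closest to the target heading.

    densities: obstacle density per angular sector (0.0 = completely clear).
    target_sector: index of the sector pointing toward the goal.
    threshold: densities below this value count as traversable (assumed).
    Returns the chosen sector index, or None if every sector is blocked.
    """
    # Sectors whose obstacle density is low enough to drive through.
    free = [i for i, d in enumerate(densities) if d < threshold]
    if not free:
        return None  # fully blocked: stop (or reverse) rather than steer
    # Of the free sectors, prefer the one needing the smallest turn.
    return min(free, key=lambda i: abs(i - target_sector))
```

In the full Advanced VFH the sector widths, smoothing and hysteresis are more involved, but the decision step is essentially this: mask out occupied directions, then minimise deviation from the goal.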
The Autonomous Robotic Tour Guide works by using two USB 3.0 cameras (generously provided by yourselves) to scan the corridors of the campus for different glyphs, and the robot reacts differently to each one. Each glyph encodes a command such as forward, left, right or stop. When the robot reads a glyph it also stops and uses speech to tell the students which department they are in, along with a piece of useful information. The glyph-recognition algorithm was written in C# using the AForge framework in Visual Studio, and communicates with LabVIEW over UDP. While the robot is in motion, three ultrasonic sensors provide obstacle detection: one at the front, one on the left side and one on the right side of the robot. When an obstacle is seen, the robot makes the appropriate steering corrections to avoid it. When the route goes outside, GPS waypoint navigation takes over. At this point in the project we can display the robot's current GPS location when it is outdoors, and this location updates while the robot is in motion. The GPS fix works off at least four satellites and at most seven. Whether the robot is indoors or outdoors, obstacle detection is always running.
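The C#-to-LabVIEW link can be sketched as a minimal UDP exchange (written in Python purely for illustration; the command vocabulary, message format and any port numbers here are assumptions, not the project's actual protocol):

```python
import socket

# Assumed command vocabulary carried over the glyph link (illustrative).
GLYPH_COMMANDS = {"forward", "left", "right", "stop"}

def send_command(sock, cmd, addr):
    """Recogniser side: send one glyph command as an ASCII datagram."""
    if cmd not in GLYPH_COMMANDS:
        raise ValueError("unknown glyph command: %r" % cmd)
    sock.sendto(cmd.encode("ascii"), addr)

def receive_command(sock):
    """Controller side: block until one command datagram arrives."""
    data, _addr = sock.recvfrom(64)  # commands are short; 64 bytes suffices
    return data.decode("ascii")
```

UDP suits this job because each glyph event is a small, self-contained message and an occasional lost datagram is harmless (the robot simply holds its last command), which is presumably why it was chosen over TCP here.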
The end aim of our project is to have a fully autonomous robot that will act as a tour guide around our college. We saw LabVIEW as the best option for this project for many reasons: LabVIEW has its own robotics toolset, and it can intercommunicate with other languages using either UDP or TCP. LabVIEW also gives us the ability to run different sections of the code in parallel, something not easily done in other languages. The graphical nature of the LabVIEW IDE facilitated the rapid development of a working prototype within a month.
Figure 1 - Autonomous Robotic Tour Guide
The project is currently at an early alpha stage of completion. Very basic autonomous movement has been achieved, but it is a long way from finished. The main flaw at present is the obstacle detection, which will be upgraded to obstacle avoidance in the near future; obstacle avoidance is currently at the research stage. For waypoint navigation, the Pmod GPS receiver currently reports the robot's longitude and latitude, which will eventually enable full waypoint navigation. Glyph recognition is working, but at a very early stage of completion.
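Assuming the Pmod GPS streams standard NMEA 0183 sentences (its usual output), extracting latitude, longitude and the satellite count from a $GPGGA sentence looks roughly like this (Python sketch for illustration; the LabVIEW code would do the equivalent string parsing):

```python
def parse_gga(sentence):
    """Parse an NMEA GGA sentence into (lat, lon, satellites_used).

    Latitude arrives as ddmm.mmmm and longitude as dddmm.mmmm, so the
    minutes field must be divided by 60. Returns None when there is no fix.
    """
    f = sentence.split(",")
    if not f[0].endswith("GGA") or f[6] == "0":
        return None  # wrong sentence type, or fix-quality field says no fix
    lat = float(f[2][:2]) + float(f[2][2:]) / 60.0
    if f[3] == "S":
        lat = -lat
    lon = float(f[4][:3]) + float(f[4][3:]) / 60.0
    if f[5] == "W":
        lon = -lon
    return lat, lon, int(f[7])
```

Field 7 is the number of satellites used in the fix, which matches the four-to-seven satellite range observed on the robot.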
The base of the project was started in September of last year, with one team member working on it for three months. Much of the robot's overall behaviour, such as basic obstacle detection, was completed before the start of this year. A new team of three was then formed to work on glyph recognition, GPS location and more advanced obstacle avoidance, starting in the second week of February this year. The glyph recognition, obstacle avoidance and GPS waypoint navigation are expected to be completed in May.
In the future, the robot will be able to navigate from one waypoint to another along a path set by the user as a series of GPS waypoints. Given its current position and the desired locations, the navigation algorithm will compute the data needed to lead the robot to its destination(s). Because this navigation is based on a GPS signal, it will not work inside buildings; once an entrance to the college is found, glyph recognition will take over. In addition, a second camera will scan both sides of the GRANT Bot for glyphs. The obstacle detection will be upgraded to obstacle avoidance, which will help the robot navigate around the interior of the college. The plan is to create an algorithm using the Advanced Vector Field Histogram in LabVIEW, with a ring of eight to ten HC-SR04 Ultra01+ sensors all connected to the NI myRIO, giving the robot a three-hundred-and-sixty-degree view of its surroundings. The goal of obstacle avoidance is to avoid obstacles while staying on track to the destination.
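Each waypoint-to-waypoint step reduces to computing the distance and initial bearing from the current GPS fix to the next waypoint. A standard haversine-based sketch (Python for illustration; the project's LabVIEW algorithm may differ, and the mean Earth radius is the usual 6,371 km approximation):

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (metres) and initial bearing (degrees, 0 = north)
    from (lat1, lon1) to (lat2, lon2), all in decimal degrees."""
    R = 6371000.0  # mean Earth radius in metres (approximation)
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    # Haversine formula for the central angle between the two fixes.
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    dist = 2 * R * math.asin(math.sqrt(a))
    # Initial bearing, normalised to the compass range [0, 360).
    y = math.sin(dlmb) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlmb)
    bearing = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return dist, bearing
```

The controller would then steer to reduce the difference between this bearing and the robot's heading, advancing to the next waypoint once the distance drops below some arrival radius.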