University: ANNA UNIVERSITY
Team Member(s): K.SUNDER, J.SATHEESH KUMAR, P.PRIYANGA, D.SHITAL PARAKH
Faculty Advisors: Mr. P. MANOJ KUMAR
Email Address: email@example.com
Our approach to building the robot and its controlling software in LabVIEW differs from that of many other projects in several ways. The project builds no special environment for the robot; it must operate in the same real-world environment. The LabVIEW module monitors the computational performance of the components of the control system, which refines the design of a real-time control system for mobile robots based on a special-purpose distributed computation engine. Several peripherals are implemented: a vision camera for pattern recognition, a proximity sensor for metal detection, an infrared sensor for fire detection, and ultrasonic sensors for local obstacle detection. The main objective of this project is to control the mobile robot using LabVIEW Real-Time software interfaced with the robot; because of this interface, a personal computer can serve as the robot's brain. The robot can also perform special tasks such as path tracing and planning, and can be remotely controlled and monitored over TCP/IP. We further suggest this project for industrial AGVs (Automated Guided Vehicles), surveillance vehicles for military applications, home-security robots, and driverless vehicles for children and for handicapped and blind people. Another important feature of this robot is that it is upgradable by implementing new programming concepts.
NI HARDWARE PRODUCTS:
NI USB camera (image acquisition).
NI SOFTWARE PRODUCTS:
NI LabVIEW; NI ELVISmx driver software.
This project describes a navigation system for an autonomous vehicle, for academic purposes, using machine-vision techniques applied to real-time captured images of the track. The experiment consists of the automatic navigation of a remote-control car through a closed circuit. Computer-vision techniques sense the environment through a wireless camera. The received images are captured into the computer through an NI USB camera and processed in a system developed on the LabVIEW platform, taking advantage of the toolkit for acquisition and image processing. Embedded logic-control techniques handle the intermediate control decisions required during the car's navigation.
The building of the robot with the self-navigating capability and with on-board cameras serves as the best solution for the above mentioned challenges.
CONSTRUCTION OF THE BASE
Fabrication of the robot comprises the following parts:
DC GEARED MOTOR
Selecting the appropriate motor to perform specific functions such as turning a wheel, lifting arms, and squeezing claws is an important part of the design process. Fig. 2 shows the assembled view of the DC geared motor. Although some choices are obvious, before you begin to make motor selections you should have determined two pieces of information. First, know what type of robot you intend to build: must it be fast and manoeuvrable, or methodical and precise? Factors such as moments of inertia, friction characteristics of the load, and power needs are relevant. Comparing the different motor properties, such as RPM, torque, speed, power, and weight, will help you focus on the factors most important to your design. In the table, the motors are arranged by their peak power, from most powerful down to weakest. Each motor shown in the table has a unique set of speed/torque characteristics that can be adapted to perform work for the robot. Some are quite powerful and draw high currents when loaded to their limits; others have very high-speed capability; some have integral transmissions that magnify their torque output. The table provides the peak power rating, the torque and current at stall, and the no-load speed and current for the kit motors, typically based on a voltage input of 12 V DC (or as shown in the Notes column).
Table 1 shows the specifications of the DC geared motor. The torque/speed data presented here is based on a single voltage level (12 V). In actual operation the motor's speed/torque curve will shift toward the origin as the voltage applied to the motor is reduced, whether by your joystick setting or by decreasing battery voltage. Lower voltage produces lower speed and torque; conversely, higher voltage yields higher torque and speed, with higher current and more heating of the motors. Most of the motors supplied in the kit were not originally designed to drive robots. For example, the motor shafts on the Globe and Mabuchi motors are designed to provide axial torque only, and cannot withstand any significant side loads imposed on them. When using these motors, take care to fasten them securely and couple the shafts to the rest of the drive train via flexible couplings when used for motive power.
Selecting appropriate ratios for gear, sprockets, chains, etc. to perform within a motor’s power band is the goal. Poor motor performance or premature failure may occur if the employed transmission ratios are not properly chosen or suited to allow motors to operate within their preferred or “normal” torque/speed/current ranges. Using the 2005 Transmissions and sprockets will help toward making these choices easier for you.
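The voltage-scaling behaviour described above can be sketched with a simple linear DC-motor model: torque falls linearly from stall torque to zero at the no-load speed, and both endpoints scale with applied voltage. The numeric ratings below are illustrative placeholders, not values from Table 1.

```python
def motor_torque(speed_rpm, voltage,
                 stall_torque_12v=1.2, no_load_rpm_12v=150.0):
    """Available torque (N*m) at a given speed, for a linear DC-motor
    model. Stall torque and no-load speed both scale linearly with the
    applied voltage relative to the 12 V rating (hypothetical values)."""
    scale = voltage / 12.0
    stall = stall_torque_12v * scale       # stall torque at this voltage
    no_load = no_load_rpm_12v * scale      # no-load speed at this voltage
    torque = stall * (1.0 - speed_rpm / no_load)
    return max(0.0, torque)                # motor cannot produce negative torque here
```

For instance, halving the supply from 12 V to 6 V halves both the stall torque and the no-load speed, which is why the speed/torque line "shifts toward the origin" as the battery sags.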
Table 1 lists the motor's ratings (power supply +12 V, normal-load current, and speed/torque at normal load); the remaining values are not reproduced here.
ULTRA-SONIC RANGE SENSOR
Ultrasonic sensors (also known as transceivers when they both send and receive) work on a principle similar to radar or sonar, which evaluate attributes of a target by interpreting the echoes from radio or sound waves respectively. Ultrasonic sensors generate high-frequency sound waves and evaluate the echo received back by the sensor. The sensor calculates the time interval between sending the signal and receiving the echo to determine the distance to an object. This technology can be used to measure wind speed and direction (anemometry), the fullness of a tank, and speed through air or water. To measure speed or direction, a device uses multiple detectors and calculates the speed from the relative distances to particulates in the air or water. To measure the amount of liquid in a tank, the sensor measures the distance to the surface of the fluid. Further applications include humidifiers, sonar, medical ultrasonography, burglar alarms, and non-destructive testing. Table 2 shows the specifications of the ultrasonic sensor.
Table 2 Ultrasonic Sensor (SRF005) Specifications
Voltage: 5 V power supply
Current: 30 mA typ., 50 mA max.
Range: detects a 3 cm diameter broom handle at > 3 m
Trigger pulse: 10 µs min., TTL level
Echo pulse: positive TTL-level signal, width proportional to range
The SRF005 ultrasonic range sensor detects objects in its path and can be used to calculate the range to the object. It is sensitive enough to detect a 3 cm diameter broom handle at a distance of over 3 m. The 2D design of the sensor is shown in Fig. 3.
The module can be used in two different modes:
This mode measures the reflection time of the ultrasonic wave, from the transmitted pulse to the received echo, to find the distance to the object. The relationship between the distance to the object L and the reflection time T is expressed by the following formula:
L = CT/2, where C is the velocity of sound.
That is, the distance to the object can be ascertained by measuring the time the pulse takes to reach the object and return. Fig. 4 shows the block diagram of the ultrasonic sensor.
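The time-of-flight calculation above is easy to illustrate. The sketch below applies L = CT/2 directly, and also shows the µs-per-cm conversion commonly quoted in the SRF004/SRF005 datasheets for the echo pulse width; the speed of sound (343 m/s at room temperature) is an assumption, not a value from this report.

```python
def distance_m(echo_time_s, speed_of_sound=343.0):
    """Distance L = C*T/2: the pulse travels to the object and back,
    so only half the round-trip time counts toward the range."""
    return speed_of_sound * echo_time_s / 2.0

def srf005_range_cm(pulse_width_us):
    """Convert the sensor's echo pulse width (microseconds) to range
    in cm, using the datasheet's approximate scale factor of 58 us/cm."""
    return pulse_width_us / 58.0
```

For example, a 17.5 ms round trip corresponds to roughly 3 m, which matches the broom-handle range quoted in Table 2.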
PYROELECTRIC INFRARED MOTION SENSOR
The pyroelectric infrared (PIR) sensor module is a compact, complete, and easy-to-use module for human-body detection, incorporating a Fresnel lens and a motion-detection circuit with high sensitivity and low noise. Its output is a standard 5 V active-low signal. The module provides an optimized circuit that detects motion up to 6 meters away and can be used in burglar alarms and access-control systems. Inexpensive and easy to use, it is ideal for alarm systems, motion-activated lighting, holiday props, and robotics applications.
The output can be connected directly to a microcontroller pin to monitor the signal, or to a transistor to drive DC loads such as a bell, buzzer, siren, relay, or opto-coupler (e.g. PC817, MOC3021). Fig. 3 shows the pin details of the PIR sensor module.
GND: connects to ground (Vss)
VCC: connects to Vdd (3 V to 5 V) at ~100 µA
OUT: connects to an I/O pin set to input mode (transistor output)
The PIR sensor requires a warm-up time in order to function properly, due to the settling time involved in 'learning' its environment; this can be anywhere from 10 to 60 seconds. During this time there should be as little motion as possible in the sensor's field of view. Fig. 4 shows the PIR sensor's jumper details.
Retrigger position: output remains HIGH when the sensor is retriggered repeatedly; output is LOW when idle (not triggered).
Single-trigger position: output goes HIGH then LOW when triggered; output is LOW when idle.
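The logical difference between the two jumper positions can be illustrated with a toy simulation. The real module's timing is analogue; this sketch only captures the behaviour described in the text, with motion quantised into 0/1 samples (an assumption for illustration).

```python
def pir_output(motion, mode):
    """Simulate PIR output for a list of 0/1 motion samples.
    mode 'retrigger': output stays HIGH as long as motion keeps arriving.
    mode 'single': one HIGH sample per motion burst (HIGH then LOW)."""
    out, armed = [], True
    for m in motion:
        if mode == "retrigger":
            out.append(1 if m else 0)
        else:  # 'single'
            if m and armed:
                out.append(1)       # rising edge of a new motion burst
                armed = False
            else:
                out.append(0)
                if not m:
                    armed = True    # re-arm once motion stops
    return out
```

With the sample pattern [1, 1, 0, 1], retrigger mode tracks the motion directly, while single-trigger mode emits one pulse per burst.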
VARIOUS CONTROL MODES AVAILABLE FOR THE ROBOT
MANUAL CONTROL THROUGH TARGET PC
The front panel of the system is shown in Fig. 5. The controls available for the robot are FORWARD, BACKWARD, TURN LEFT, and TURN RIGHT. The graphical user interface also gives information about the distance of the robot from the user, obtained from the ultrasonic sensor mounted on the robot: the sensor transmits sound waves and receives the echoes from material beside the robot, and the time interval between the transmitted wave and the received echo determines the distance. The catch with this mode is that a human operator needs to move along with the robot in order to control it.
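The four front-panel buttons map naturally onto a differential-drive command table. The report does not give the actual motor wiring, so the mapping below (+1 forward, -1 reverse, 0 stop, spin-in-place turns) is a hypothetical sketch of how such a dispatch could look.

```python
# Hypothetical mapping from front-panel buttons to (left, right) motor
# directions for a differential-drive base; not taken from the report.
DRIVE_TABLE = {
    "FORWARD":    (+1, +1),
    "BACKWARD":   (-1, -1),
    "TURN LEFT":  (-1, +1),   # spin in place: left wheel back, right forward
    "TURN RIGHT": (+1, -1),
    "STOP":       (0, 0),
}

def drive(command):
    """Return the (left_motor, right_motor) directions for a panel command."""
    return DRIVE_TABLE[command]
```

In the LabVIEW implementation the same table would be a case structure feeding the motor-driver outputs.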
REMOTE SERVER CONTROL THROUGH HOST PC
In this mode the robot is controlled from a computer held by the user at a stationary point, connected to the robot over a Wi-Fi link. The distance over which control is available depends on the medium used, which could be Bluetooth, infrared, or any other available medium. The pictures captured by the robot are transferred to the user over the same medium. The GUI for the remote control server is shown in Fig. 6.
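The project implements the remote link with LabVIEW's TCP/IP functions; as an illustration of the same idea, the sketch below sends one command string from a "host PC" client to a "robot" server over a loopback socket. The port number and command format are assumptions, not details from the report.

```python
import socket
import threading

PORT = 50907  # arbitrary example port, not from the report

def robot_server(ready, received):
    """Robot side: accept one connection and record the command it gets."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind(("127.0.0.1", PORT))
        srv.listen(1)
        ready.set()                       # signal that we are listening
        conn, _addr = srv.accept()
        with conn:
            received.append(conn.recv(64).decode())

def send_command(cmd):
    """Host-PC side: open a connection and send one command string."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect(("127.0.0.1", PORT))
        cli.sendall(cmd.encode())

ready, received = threading.Event(), []
t = threading.Thread(target=robot_server, args=(ready, received))
t.start()
ready.wait()
send_command("FORWARD")
t.join()
```

Over Wi-Fi the loopback address would be replaced by the robot's IP address; the control logic is otherwise unchanged.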
AUTOMATIC CONTROL USING VISION MODULE
The robot can also be controlled by vision. A piece of paper that reads FORWARD is analysed by the robot, and the corresponding control action is taken. The robot's character-recognition feature, using the NI-IMAQdx drivers, is used for this mode. The GUI for vision control is shown in Fig. 7, and the GUI for character recognition in Fig. 8.
NAVIGATING USING OBSTACLE DETECTION
An important feature of the robot is navigating by analysing nearby obstacles. The robot senses the materials beside it and maps its own path to reach its destination; this feature could serve many other applications. The ultrasonic sensors ensure that the robot does not collide with the obstacles it detects.
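A minimal version of this obstacle-avoidance decision can be sketched as a rule over three ultrasonic range readings. The report does not specify the sensor layout or thresholds, so the three-sensor arrangement and the 0.5 m safety distance below are illustrative assumptions.

```python
SAFE_DISTANCE = 0.5  # metres; illustrative threshold, not from the report

def avoid(left, front, right):
    """Choose a drive command from three range readings (metres):
    go forward if the path ahead is clear, otherwise turn toward
    whichever side reports the greater free distance."""
    if front > SAFE_DISTANCE:
        return "FORWARD"
    # Front blocked: turn toward the more open side.
    if left >= right:
        return "TURN LEFT"
    return "TURN RIGHT"
```

Running this rule in a loop against live sensor readings, and feeding its output to the drive commands, yields the simple reactive navigation described above.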