o Default Pin: 1 2 3 4
MOTOR DRIVERS
o Required for ensuring the proper functioning of the motors. The two
driver inputs select the direction of rotation:

Input 1   Input 2   Rotation
0         0         No rotation
0         1         Anti-clockwise rotation
1         0         Clockwise rotation
1         1         No rotation
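The truth table above can be sketched in code. This is a hedged Python illustration, not the project's LabVIEW implementation: the pin numbers and the `set_pin` callback are stand-ins for whatever GPIO interface the actual motor driver board uses.

```python
# Hypothetical sketch of driving one H-bridge motor channel through
# two digital input pins, following the truth table above.
# `set_pin` is a placeholder for the real GPIO-write function.

DIRECTIONS = {
    "stop":          (0, 0),  # 0 0 -> no rotation
    "anticlockwise": (0, 1),  # 0 1 -> anti-clockwise rotation
    "clockwise":     (1, 0),  # 1 0 -> clockwise rotation
    "brake":         (1, 1),  # 1 1 -> no rotation (both inputs high)
}

def drive(direction, set_pin, in1=1, in2=2):
    """Set the two driver inputs for the requested direction."""
    a, b = DIRECTIONS[direction]
    set_pin(in1, a)
    set_pin(in2, b)

# Usage with a dummy pin writer that just records the pin states:
states = {}
drive("clockwise", lambda pin, level: states.__setitem__(pin, level))
```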
INFRARED SENSOR
The specifications of the infrared sensor are as follows:

Range of functioning   15 cm to 50 cm
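One practical consequence of the stated working range is that readings outside 15-50 cm should be treated as unreliable. A minimal Python sketch of such a validity check (the function name is illustrative, only the limits come from the table above):

```python
# Validate an infrared distance reading against the sensor's stated
# working range of 15 cm to 50 cm; anything outside is unreliable.

IR_MIN_CM, IR_MAX_CM = 15, 50

def ir_reading_valid(distance_cm):
    """Return True only if the reading lies in the usable range."""
    return IR_MIN_CM <= distance_cm <= IR_MAX_CM
```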
Basic Outline of the Project
LabVIEW Robotics Module
The new LabVIEW Robotics Module includes all of the software tools
needed to design a sophisticated autonomous or semi-autonomous system.
Robots are very complex mechatronic systems, so to simplify them we
can apply a simple paradigm: sense, think, act. Every autonomous or
semi-autonomous robot must, in some form or another, sense its
environment, make a decision, and act on the environment. The LabVIEW
Robotics Module provides APIs and example programs for each step of
the sense-think-act cycle.
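The sense-think-act cycle can be sketched as a simple control loop. This is a hedged Python illustration of the paradigm, not the Robotics Module's actual G code; the three callables stand in for whatever sensor, decision, and actuation APIs the real system provides.

```python
# Minimal sense-think-act control loop. The `sense`, `think`, and
# `act` callables are placeholders for the robot's real interfaces.

def control_loop(sense, think, act, steps):
    for _ in range(steps):
        observation = sense()          # SENSE: read the environment
        decision = think(observation)  # THINK: decide on an action
        act(decision)                  # ACT: change the environment

# Usage with dummy functions: a fixed 20 cm reading makes the
# "controller" decide to stop on every iteration.
log = []
control_loop(sense=lambda: 20,
             think=lambda d: "stop" if d < 30 else "forward",
             act=log.append,
             steps=3)
```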
SENSE: Retrieve Data from Sensors
o LIDAR: Hokuyo URG series, SICK LMS 2xx series, Velodyne HDL-64E
o Infrared: Sharp IR GP2D12, Sharp IR GP2Y0D02YK
o Sonar: Devantech SRF02 & SRF05, MaxSonar EZ1
o GPS: Garmin GPS series, NavCom SF-2050, u-blox 5 series, Applanix POS LV
o Compass: Devantech CMPS03 (PWM or I2C), PNI Field Force TCM
o Inertial Measurement Unit (IMU): Microstrain 3DMGXx, Crossbow NAV440,
Ocean Server OS4000
o Camera: Axis M1011 IP camera, analog cameras (MoviMED Analog Frame
Grabber for CompactRIO), USB cameras (Windows only), NI Smart Cameras
Process Flow Diagram
In the collision avoidance routine, every time the robot detects an
obstacle it reverses, turns to the right, and then tries to return to
its original path of travel. It keeps performing the collision
avoidance step until it sees a clearance. All movements in this case
are time-dependent.
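The routine above can be sketched as follows. This is a hedged Python sketch, not the project's LabVIEW code: `detect_obstacle` and `move` are placeholders for the real robot interface, and the durations are made-up defaults that reflect the time-dependent movements described above.

```python
# Time-based collision avoidance sketch: while an obstacle is seen,
# reverse and turn right; once clear, resume forward travel.
# `detect_obstacle()` -> bool and `move(direction, seconds)` are
# placeholders for the robot's actual sensing and motion interfaces.

def avoid_collision(detect_obstacle, move, reverse_s=0.5, turn_s=0.3):
    while detect_obstacle():          # repeat until a clearance is seen
        move("reverse", reverse_s)    # back away from the obstacle
        move("right", turn_s)         # turn toward the right
    move("forward", 0.0)              # resume the original heading

# Dry run with canned readings: obstacle seen twice, then clear.
calls = []
readings = iter([True, True, False])
avoid_collision(lambda: next(readings),
                lambda direction, seconds: calls.append(direction))
```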
THINK: Obstacle Avoidance Algorithms
Once you are able to acquire data from sensors and control your mobile
robot's movement, the next step might be to apply an obstacle avoidance
or path planning algorithm. LabVIEW Robotics includes new VIs and
examples for avoiding obstacles based on sensor feedback, and for
calculating the shortest path between a start location and a goal
location.
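To make the shortest-path idea concrete, here is a small Python sketch using breadth-first search on an occupancy grid. LabVIEW Robotics ships its own path-planning VIs; this example only mirrors the concept of routing from a start cell to a goal cell around obstacles, and the grid encoding (1 = obstacle) is an assumption.

```python
from collections import deque

# Breadth-first search for the shortest path on a small occupancy
# grid, where 0 is free space and 1 is an obstacle. Returns the list
# of visited cells from start to goal, or None if unreachable.

def shortest_path(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    queue = deque([(start, [start])])
    seen = {start}
    while queue:
        (r, c), path = queue.popleft()
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(((nr, nc), path + [(nr, nc)]))
    return None  # goal unreachable
```

BFS finds a shortest path on an unweighted grid; a weighted map would call for Dijkstra or A* instead.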
The user can see the simulated robot from a bird's-eye view in the
Main Camera indicator (the large indicator in the middle of the front
panel; can you see the tiny red robot?). The user can see what is in
front of the robot in the Camera on Robot indicator (the top-right
indicator on the front panel), and what the robot sees and interprets
as obstacles in the Laser Range Finder indicator (this indicator,
right below Camera on Robot, is particularly useful for debugging).
DRAWBACKS IN IMAGING ALGORITHM
The algorithm is based on the Vector Field Histogram technique, which
focuses on colour changes in the environment. The major drawback of
this algorithm, experienced while testing the code in simulation, is
that it does not give the expected results when the floor environment
changes.
In our experiment we found that when the bot moves from a brownish-red
carpet onto a wooden floor, it unexpectedly turns around despite not
facing any obstacle within the minimum distance specified in the
algorithm. The basic reason is that with the change in colour, the
values in the histogram change correspondingly, which leads the bot to
believe that an obstacle is in front of it, so it starts to move in
the reverse direction.
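This failure mode can be illustrated with a toy Python example: a detector that flags an "obstacle" whenever the intensity histogram of the current frame differs too much from the previous one. A floor colour change shifts the whole histogram and trips the threshold even though nothing is in the way. All pixel values, bin counts, and thresholds here are made up for illustration and are not the project's actual VFH parameters.

```python
# Toy histogram-difference "obstacle" detector that reproduces the
# false positive described above. A uniform floor-colour change moves
# the whole histogram, so the frame-to-frame difference exceeds the
# threshold with no obstacle present.

def histogram(pixels, bins=4, top=256):
    h = [0] * bins
    for p in pixels:
        h[p * bins // top] += 1
    return h

def obstacle_suspected(prev, curr, threshold=4):
    diff = sum(abs(a - b) for a, b in zip(histogram(prev),
                                          histogram(curr)))
    return diff > threshold

carpet = [60] * 8    # brownish-red carpet: darker pixels
wood = [200] * 8     # wooden floor: lighter pixels, no obstacle
```

With these frames, `obstacle_suspected(carpet, wood)` fires even though the scene is clear, mirroring the bot's unexpected reversal at the carpet-to-wood boundary.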
THANK YOU