
TITLE:

DRUSE
ABSTRACT

 PROBLEM:
In the current scenario, anti-social groups around the globe are constantly posing a
threat to the lives and freedom of humanity. Despite India's consistently peaceful
approach to tackling this problem, it remains one of the main targets of these
groups. To curb this problem, India has to equip its arsenal to face any
confrontation in the near future. As is evident from recent incidents, hostage
situations have become one of the main weapons these militants use to extract their
malicious demands from the government. In such critical circumstances, soldiers are
unable to strategise a rescue mission because they have no idea about the positions
and resources of the militants, or the exact location of the captured hostages.

 SOLUTION:
In dire situations like the one described above, the consequences of sending in a manned
rescue operation without proper intelligence could be fatal. To tackle such a
problem, we propose the development of an unmanned all-terrain amphibious
robot that is also capable of aerial locomotion. The bot would consist of two parts:
a quadcopter and an all-terrain vehicle (UGV). The
quadcopter stealthily carries the vehicle to the required location, drops
the bot there, and then rests in a secure location. The bot
then moves furtively throughout the area and generates a map of the localised
region. The camera attached to the bot provides live video coverage of
the area to the soldiers. When the bot's task is over, it detects the
location of the quadcopter and travels to its position. The UGV attaches itself
to the quadcopter, after which the complete bot returns to the base with
proper intelligence to guide the soldiers in their mission.
QUESTIONNAIRE

 a) What is your motivation behind participation?


Ans) Being responsible citizens of India, we are endowed with all the rights mentioned
in our Constitution, which we use judiciously and equally demand in case of
deprivation. But what we fail to realise is that the CONSTITUTIONAL RIGHTS
are accompanied by FUNDAMENTAL DUTIES. We, as like-minded people, feel
that if nature has bestowed on us some exceptional abilities and we have all the
4 D's, i.e. DETERMINATION to accomplish our goals, DEVOTION in our allegiance
towards our nation, DEDICATION to achieve it despite all odds, and DISCIPLINE to
methodically materialise it, then, being B. Tech. 2nd-year undergraduates, we are
obliged and honoured to use our skills and knowledge for the progress and development
of our country's technology. And we don't just count it as a part of our contribution,
but also as a part of our DUTY towards India.

 b) What is your specialised knowledge and expertise?


Ans) We are well versed in Computer Vision using OpenCV, ROS (Robot Operating
System), Android app development, SLAM (Simultaneous Localization and Mapping),
CAD modelling using SolidWorks, and numerical computation and simulation
using MATLAB, as well as other software such as Multisim, EAGLE, PSIM and
Proteus. We have worked on various projects which have equipped us with knowledge of
RTOS, path planning, microcontroller programming and sensor interfacing using STM32,
TIVA, Arduino and Atmel ATmega microcontrollers. Also, being B. Tech. 2nd-year
undergraduates, we have sound theoretical knowledge of our academic course subjects as
well as their practical applications in real life. In addition, we would like to
state that if our DRUSE project requires any specific knowledge or expertise that we
do not yet possess, we would love to learn, understand, experience and apply it to our
project.
TECHNICAL PROPOSAL:

As per the idea proposed, our robot essentially consists of two parts:
 THE QUADCOPTER:

The primary role of the quadcopter is to provide locomotion in the air. The basic
concept behind the flying mechanism of a quadcopter is as follows:
The quadcopter balances the forces acting on it due to gravity, air drag and other
disturbances by controlling the speed of the propeller motors. The coordinates of the
desired location are fed to the flight control software using a UBLOX GPS module. The
prime functions of the flight controller are take-off, hover and landing. The IMU sensor
is used to detect the yaw, roll and pitch by which the drone is deflected at a particular
altitude. Alternate propellers of the drone rotate in clockwise and anticlockwise
directions (i.e. opposite directions) so that the net angular momentum of the drone is
conserved and the drone holds its heading. The altitude of the drone is measured using a
barometer. If the drone is below or above the specified altitude, it is brought back to
the given altitude by increasing or decreasing the speed of the propeller motors.
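To illustrate the altitude-hold idea above, a minimal sketch (assuming a simple PID loop, not actual flight code) is given below; the gains and the read_baro_altitude()/set_throttle() helpers are hypothetical placeholders.

import time

KP, KI, KD = 0.8, 0.1, 0.3   # assumed PID gains, would need tuning on the real drone
DT = 0.02                    # 50 Hz control loop

def altitude_hold(target_alt, read_baro_altitude, set_throttle, hover_throttle=0.5):
    # read_baro_altitude() -> altitude in metres from the barometer (hypothetical helper)
    # set_throttle(x)      -> collective throttle in [0, 1] sent to the motors (hypothetical helper)
    integral, prev_error = 0.0, 0.0
    while True:
        error = target_alt - read_baro_altitude()
        integral += error * DT
        derivative = (error - prev_error) / DT
        correction = KP * error + KI * integral + KD * derivative
        # Raise the propeller speed when below the set altitude, lower it when above.
        set_throttle(min(max(hover_throttle + correction, 0.0), 1.0))
        prev_error = error
        time.sleep(DT)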
The flight controller is linked to the ODROID companion computer using MAVLINK. The
ODROID is used to perform the task of detecting the exact spot where the UGV has to be
dropped. The ODROID is programmed in such a way that it uses pose estimation and enters a
decision-making loop that allows it to detect the exact landing spot. The precision of the
location provided by the GPS can differ by several metres, which is why we use the ODROID
to detect (using image processing) any deviation from the specified spot.
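A minimal sketch of the kind of image processing meant here is given below: it assumes the landing spot is marked with a bright red marker (an assumption for illustration, not part of the original design) and computes the pixel offset of the marker from the image centre, which the decision-making loop could then convert into a position correction.

import cv2
import numpy as np

def marker_offset(frame):
    # Returns the (dx, dy) pixel offset of the marker centre from the image centre,
    # or None if no marker is visible.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    lower = np.array([0, 120, 120])    # assumed HSV range for a red marker
    upper = np.array([10, 255, 255])
    mask = cv2.inRange(hsv, lower, upper)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4 signature
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
    h, w = frame.shape[:2]
    return cx - w // 2, cy - h // 2

cap = cv2.VideoCapture(0)        # downward-facing camera on the drone
ok, frame = cap.read()
if ok:
    print(marker_offset(frame))
cap.release()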
Once the drone has reached the required location, it has to drop the UGV there. The mechanism
we have adopted for the pick/drop function uses two servo motors attached to the
bottom of the drone. When the drone reaches the spot, it hovers over the region. The ODROID is
programmed so that, while the drone is hovering over the spot, it instructs the servo
motors to open and the package is dropped. When the UGV returns to the drone, the ODROID
detects the presence of the UGV, opens the servo motors, and the UGV is attached to
the drone again.
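A minimal sketch of how the ODROID could command these servos through the flight controller is shown below, assuming the pymavlink library and the standard MAV_CMD_DO_SET_SERVO command; the serial port, servo channel and PWM values are assumptions.

from pymavlink import mavutil

# Connect to the PIXHAWK over the companion-computer serial link (port/baud are assumptions).
master = mavutil.mavlink_connection('/dev/ttyUSB0', baud=57600)
master.wait_heartbeat()

def set_gripper(pwm_us):
    # Drive the gripper servos via MAV_CMD_DO_SET_SERVO.
    master.mav.command_long_send(
        master.target_system, master.target_component,
        mavutil.mavlink.MAV_CMD_DO_SET_SERVO,
        0,        # confirmation
        9,        # servo output channel (assumed AUX output)
        pwm_us,   # PWM pulse width in microseconds
        0, 0, 0, 0, 0)

set_gripper(1900)   # open the gripper -> drop the UGV
set_gripper(1100)   # close the gripper -> hold the UGV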
The cyber-physical architecture of the drone is given below as a block diagram for a better
understanding of the processes involved in the working of the drone.
CONNECTIONS:

PIXHAWK is the flight controller of the quadcopter. Its function is to set the RPM of each
motor in response to the control inputs. Different input and output devices are connected
to the PIXHAWK.
The list is as follows:
o Safety Switch
o Telemetry
o 10S power module
o Buzzer
o GPS module
The safety switch is a push-button switch with a red LED inside it. Its job is to
enable/disable the outputs to the ESCs (and hence, indirectly, the motors). It is switched
ON/OFF by a long press of the push button for 1 second.
It is connected via a 3-pin JST to the corresponding port on the PIXHAWK as shown in the figure.
The LED blink pattern indicates the status of the switch:
Single blink: system is ready, switch status OFF; the PIXHAWK cannot be armed in this
state.
Double blink: switch status ON; the PIXHAWK can be armed.
Solid LED: the copter is armed.

10S POWER MODULE:


The 10S power module powers the PIXHAWK Mini and its accessories and reports
battery voltage and current. These values can be viewed in the
PIXHAWK Mini stored logs or live over a telemetry radio connection. Communication is
over the included 6-pin cable. The on-board switching regulator outputs 5.3 V at up to
2.25 A and supports up to 45 V (10S LiPo) at a maximum of 90 A.
A 6-pin JST runs from the 10S power module to the PIXHAWK power port to power
the PIXHAWK.

TELEMETRY:

The telemetry radio kit is one of the easiest ways to set up a wireless link between your
APM/PIXHAWK and a ground station. It comes as a set of two modules:
one for the ground station, i.e. the PC, connected by USB;
the other for the airframe, connected to the quad by a 6-pin JST.

GPS MODULE:
The GPS module consists of a GPS receiver and a compass. It tracks the position of the quad
from satellite data and gives the quad directions so that it can move autonomously. It has
two JSTs: a 6-pin one connected to the GPS port, and a 4-pin one connected to the I2C port.
 THE AMPHIBIOUS VEHICLE:
After the drone has successfully landed in a secure location, the stealthy and compact
unmanned amphibious bot, which can travel over land as well as through water, detaches
itself from the drone and starts the surveillance of the entire area. Care is taken that the
whole chassis, as well as the components and gadgets installed in it, is waterproof. All
the components, including the single-board computer (Raspberry Pi 3B), motor driver, battery,
motors, propellers, wheels, camera, etc., are installed inside the chassis. The wheels are
designed in such a way that, while travelling in water, they work as side propellers. The
bot is steered using a differential drive. It can give a live video feed through the camera
and the GSM module. We can also control and move the robot from a web browser over the
internet. We will use "Motion" for getting a live video feed from the USB camera and
"Flask" for sending commands from the webpage to the Raspberry Pi in Python to move the
robot; a minimal sketch of such a Flask control server is given below.
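The sketch below is one way such a control server could look; the routes, port and the drive() helper are assumptions for illustration (a possible L293D implementation of drive() is sketched in the Raspberry Pi section that follows).

from flask import Flask

app = Flask(__name__)

def drive(left, right):
    # Placeholder; replace with the L293D GPIO driver sketched in the next section.
    print('drive command:', left, right)

@app.route('/move/<direction>')
def move(direction):
    # Map simple webpage commands to differential-drive wheel speeds in [-1, 1].
    commands = {
        'forward':  (1.0, 1.0),
        'backward': (-1.0, -1.0),
        'left':     (-0.5, 0.5),
        'right':    (0.5, -0.5),
        'stop':     (0.0, 0.0),
    }
    if direction not in commands:
        return 'unknown command', 400
    drive(*commands[direction])
    return 'ok'

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)   # reachable from the webpage on the local network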
RASPBERRY PI:
The Raspberry Pi is a small single-board computer and the main component of
the bot. It provides the signals to the L293D motor driver circuit that drive the bot. A USB
camera is used to record live video, and the Pi has an SD card slot so the video can be saved
to an SD card.
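A minimal sketch of driving the L293D from the Raspberry Pi is given below, assuming the enable pins are driven with PWM and the direction pins with plain GPIO; the BCM pin numbers are assumptions.

import RPi.GPIO as GPIO

LEFT_EN, LEFT_IN1, LEFT_IN2 = 12, 5, 6         # assumed BCM pin numbers
RIGHT_EN, RIGHT_IN1, RIGHT_IN2 = 13, 20, 21

GPIO.setmode(GPIO.BCM)
for pin in (LEFT_EN, LEFT_IN1, LEFT_IN2, RIGHT_EN, RIGHT_IN1, RIGHT_IN2):
    GPIO.setup(pin, GPIO.OUT)

left_pwm = GPIO.PWM(LEFT_EN, 1000)             # 1 kHz PWM on the L293D enable pins
right_pwm = GPIO.PWM(RIGHT_EN, 1000)
left_pwm.start(0)
right_pwm.start(0)

def drive(left, right):
    # left/right in [-1, 1]; the sign sets direction, the magnitude sets speed.
    GPIO.output(LEFT_IN1, left >= 0)
    GPIO.output(LEFT_IN2, left < 0)
    GPIO.output(RIGHT_IN1, right >= 0)
    GPIO.output(RIGHT_IN2, right < 0)
    left_pwm.ChangeDutyCycle(abs(left) * 100)
    right_pwm.ChangeDutyCycle(abs(right) * 100)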
UBLOX GPS:
The GPS module will be connected to the Raspberry Pi through a USB port.
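A minimal sketch of reading the GPS over USB is given below, assuming the module shows up as a serial device and streams standard NMEA sentences; the device path and baud rate are assumptions (pyserial and pynmea2 are used here).

import serial
import pynmea2

# Device path and baud rate are assumptions for a USB GPS on the Raspberry Pi.
with serial.Serial('/dev/ttyACM0', 9600, timeout=1) as port:
    while True:
        line = port.readline().decode('ascii', errors='ignore').strip()
        if line.startswith('$GPGGA'):            # GGA sentence carries the position fix
            fix = pynmea2.parse(line)
            print(fix.latitude, fix.longitude, fix.altitude)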
AUTONOMOUS:
The bot uses OpenCV to process video and LSD-SLAM (Large-Scale Direct Monocular
Simultaneous Localization and Mapping) to generate the map. LSD-SLAM is a novel approach
to real-time monocular SLAM. It is based on a direct approach (i.e. it does not use
keypoints/features) and creates large-scale, semi-dense maps in real time.

Software stack: Raspberry Pi -> Ubuntu MATE -> ROS -> LSD-SLAM

To use LSD-SLAM, we will install Ubuntu MATE on the Raspberry Pi 3. Ubuntu MATE is a free
and open-source Linux distribution and an official derivative of Ubuntu. We will then install
ROS (Robot Operating System) Kinetic on the Raspberry Pi.

We will use GMapping and move_base together to build a map and navigate autonomously in an
unknown environment. Because we cannot start GMapping with a partial map, our map will
always start empty. GMapping also requires continually increasing memory over the life of
the map, so it is not well suited for long-term use. The GMapping algorithm is based on a
Rao-Blackwellised particle filter (RBPF). As proper detection of surrounding obstacles is
required for the accuracy of the algorithm, an RPLIDAR 360° laser scanner, which provides
both distance and bearing-angle measurements to nearby objects, is used. A minimal example
of sending a navigation goal to move_base is sketched below.
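The sketch below is a minimal rospy example (ROS Kinetic) of sending a navigation goal to move_base while GMapping builds the map; the goal coordinates are placeholders.

import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

rospy.init_node('send_nav_goal')
client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
client.wait_for_server()

goal = MoveBaseGoal()
goal.target_pose.header.frame_id = 'map'       # goal expressed in the GMapping map frame
goal.target_pose.header.stamp = rospy.Time.now()
goal.target_pose.pose.position.x = 2.0         # placeholder goal, metres ahead in the map
goal.target_pose.pose.orientation.w = 1.0      # keep the current heading

client.send_goal(goal)
client.wait_for_result()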

[Block diagram: the Raspberry Pi is connected to the GPS module, the USB camera, the ESP8266 Wi-Fi module, and two L293D motor drivers for the drive motors.]

ESP8266 WI-FI MODULE:

We will use this module to get the live video feed over the internet. It has a good working
range of about 4 km. We can also use an alternative technique to control the bot wirelessly,
by creating a webpage using HTML and giving commands to the bot through it, but in that case
we need to be within the 4 km range.
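As one possible way to use the module, the sketch below assumes the ESP8266 runs MicroPython, joins the Wi-Fi network, and relays single-character drive commands received over a TCP socket to the Raspberry Pi over UART; the SSID, password, port and command set are all placeholders.

import network
import socket
from machine import UART

wlan = network.WLAN(network.STA_IF)
wlan.active(True)
wlan.connect('SSID', 'PASSWORD')              # placeholders for the field network
while not wlan.isconnected():
    pass

uart = UART(0, 115200)                        # serial link to the Raspberry Pi

server = socket.socket()
server.bind(('0.0.0.0', 8080))                # placeholder command port
server.listen(1)
while True:
    conn, _ = server.accept()
    cmd = conn.recv(1)                        # e.g. b'F', b'B', b'L', b'R', b'S'
    if cmd:
        uart.write(cmd)
    conn.close()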

POSSIBLE EXTENSIONS: coordination of multiple bots; a manually operated RC mode for the bot.

REFERENCES:
https://www.raspberrypi.org/magpi/self-driving-rc-car/
http://perso.ensta-paristech.fr/~filliat/Courses/2011_projets_C10-2/BRUNEAU_DUBRAY_MURGUET/monoSLAM_bruneau_dubray_murguet_en.html
