
CHAPTER-1

INTRODUCTION

1.1 INFRARED (IR):


Infrared radiation is electromagnetic radiation whose wavelength is longer than that of
visible light (400–700 nm) but shorter than that of terahertz radiation (100 µm – 1 mm)
and microwaves (~30,000 µm). Infrared radiation spans roughly three orders of
magnitude in wavelength, from about 750 nm to 100 µm.

Direct sunlight has a luminous efficacy of about 93 lumens per watt of radiant flux,
which includes infrared (47% share of the spectrum), visible (46%), and ultraviolet
(only 6%) light. Bright sunlight provides a luminance of approximately
100,000 candela per square meter at the Earth's surface.

Infrared imaging is used extensively for both military and civilian purposes. Military
applications include target acquisition, surveillance, night vision, homing and tracking.
Non-military uses include thermal analysis, remote temperature sensing, short-range
wireless communication, spectroscopy, and weather forecasting. Infrared astronomy uses
sensor-equipped telescopes to penetrate dusty regions of space such as molecular clouds,
to detect cool objects such as planets, and to view highly red-shifted objects from the
early days of the universe.

Humans at normal body temperature radiate chiefly at wavelengths around 10 µm
(micrometers).
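
This peak follows from Wien's displacement law, λ_max = b / T, where b ≈ 2898 µm·K. A minimal sketch in Python (the script is illustrative; the constant is standard):

```python
# Wien's displacement law: peak emission wavelength of a blackbody.
WIEN_B_UM_K = 2897.8  # Wien's displacement constant, in um*K

def peak_wavelength_um(temperature_k: float) -> float:
    """Peak blackbody emission wavelength, in micrometers."""
    return WIEN_B_UM_K / temperature_k

# Normal body temperature, 37 degC = 310 K:
print(peak_wavelength_um(310.0))  # ~9.3 um, i.e. around 10 um (LWIR)
```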

At the atomic level, infrared energy elicits vibrational modes in a molecule through a
change in the dipole moment, making it a useful frequency range for the study of these
energy states in molecules of the proper symmetry. Infrared spectroscopy examines the
absorption and transmission of photons in the infrared energy range, based on their
frequency and intensity.

The name means below red (from the Latin infra, "below"), red being the color of the
longest wavelengths of visible light. IR light has a longer wavelength (a lower frequency)
than that of red light, hence the name.

1.2 DIFFERENT REGIONS IN THE INFRARED

Objects generally emit infrared radiation across a spectrum of wavelengths, but only a
specific region of the spectrum is of interest because sensors are usually designed only to
collect radiation within a specific bandwidth. As a result, the infrared band is often
subdivided into smaller sections.

CIE division scheme


The International Commission on Illumination (CIE) recommended the division of
optical radiation into the following three bands:

• IR-A: 700 nm–1400 nm
• IR-B: 1400 nm–3000 nm
• IR-C: 3000 nm–1 mm

A commonly used sub-division scheme is the following (a short classification sketch follows the list):

• Near-infrared (NIR, IR-A DIN): 0.75–1.4 µm in wavelength, defined by the water
absorption, and commonly used in fiber-optic telecommunication because of low
attenuation losses in the SiO2 glass (silica) medium. Image intensifiers are sensitive to
this area of the spectrum; examples include night vision devices such as night vision
goggles.
• Short-wavelength infrared (SWIR, IR-B DIN): 1.4–3 µm. Water absorption increases
significantly at 1,450 nm. The 1,530–1,560 nm range is the dominant spectral
region for long-distance telecommunications.
• Mid-wavelength infrared (MWIR, IR-C DIN), also called intermediate infrared (IIR):
3–8 µm. In guided-missile technology, the 3–5 µm portion of this band is the
atmospheric window in which the homing heads of passive IR "heat-seeking" missiles
are designed to work, homing on to the IR signature of the target aircraft, typically the
jet engine exhaust plume.
• Long-wavelength infrared (LWIR, IR-C DIN): 8–15 µm. This is the "thermal
imaging" region, in which sensors can obtain a completely passive picture of the
outside world based on thermal emissions only, requiring no external light or
thermal source such as the sun, the moon, or an infrared illuminator. Forward-looking
infrared (FLIR) systems use this area of the spectrum. It is sometimes also called the
"far infrared."
• Far infrared (FIR): 15–1,000 µm (see also far infrared laser).
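
To make the boundaries concrete, here is a minimal classification sketch in Python (the function name and band table are our own illustration of the DIN-style scheme above, not part of any standard):

```python
# Illustrative helper: classify a wavelength (in micrometers) into the
# commonly used IR sub-division scheme described above.
IR_BANDS = [
    (0.75, 1.4, "NIR"),     # near-infrared
    (1.4, 3.0, "SWIR"),     # short-wavelength infrared
    (3.0, 8.0, "MWIR"),     # mid-wavelength infrared
    (8.0, 15.0, "LWIR"),    # long-wavelength infrared
    (15.0, 1000.0, "FIR"),  # far infrared
]

def ir_band(wavelength_um: float) -> str:
    """Return the IR band name for a wavelength given in micrometers."""
    for low, high, name in IR_BANDS:
        if low <= wavelength_um < high:
            return name
    raise ValueError("wavelength outside the infrared range")

print(ir_band(10.0))  # LWIR: the thermal-imaging region
```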

NIR and SWIR are sometimes called "reflected infrared," while MWIR and LWIR are
sometimes referred to as "thermal infrared." Due to the nature of the blackbody radiation
curves, typical 'hot' objects, such as exhaust pipes, often appear brighter in the MW band
than the same object viewed in the LW band.
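
This can be checked with Planck's law. The sketch below (illustrative; the temperatures are rough assumptions for an exhaust pipe and its surroundings) compares the hot-to-ambient radiance contrast at 4 µm (MWIR) and 10 µm (LWIR):

```python
import math

# Planck's law: spectral radiance of a blackbody.
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def planck(wavelength_m: float, temperature_k: float) -> float:
    """Spectral radiance in W / (m^2 * sr * m)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = math.exp(H * C / (wavelength_m * KB * temperature_k)) - 1.0
    return a / b

for wl_um in (4.0, 10.0):
    wl_m = wl_um * 1e-6
    contrast = planck(wl_m, 600.0) / planck(wl_m, 300.0)  # hot pipe vs. ambient
    print(f"{wl_um:4.1f} um: hot/ambient radiance ratio ~ {contrast:.0f}")
```

The ratio comes out around 400 at 4 µm but only around 12 at 10 µm, which is why hot exhausts stand out so strongly in the MW band.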

Astronomy division scheme


Astronomers typically divide the infrared spectrum as follows:

• Near: (0.7–1) to 5 µm
• Mid: 5 to (25–40) µm
• Long: (25–40) to (200–350) µm

These divisions are not precise and can vary depending on the publication. The three
regions are used for observation of different temperature ranges, and hence different
environments in space.

Advances in technology affect many aspects of our lives, including the way we travel and
transportation systems in general. More and more vehicles are equipped with state-of-
the-art equipment and sensors, and the road infrastructure is becoming smarter and
provides additional support to road users. First and foremost, these advances should
be used to improve road safety, for example by reducing the number of accidents and
minimizing those that involve fatalities or serious injuries. On top of that, road travel
can be made more enjoyable thanks to intelligent transportation systems that reduce the
effort required of drivers and save time through more efficient operation.

It is not possible to achieve all of this by relying solely on one sensor technology. Multiple
sensors need to be deployed in order to build a complete system in which one sensor's
capabilities can complement another's. This is the approach we took in our work. Starting
with two individual sensors – a near infrared remote identification sensor and a Zigbee
smartdust communication device – each with its own features and limitations, we
investigated how the two can complement each other. For example, the near infrared
identification sensor has very good range and localization features, but it relies on
visual detection.

This makes it prone to errors when objects obstruct the view or in inclement
weather conditions. On the other hand, the Zigbee smartdust sensor cannot easily
determine the location of a target (since it relies on a radio signal, which carries no
information about the detection direction), but the radio signal is free from visual
impairment. By combining the positive characteristics of these two sensors, we have
developed a new system that provides more robust detection and identification of
objects on the road.
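
A minimal sketch of this fusion logic in Python (the Detection type, field names, and fuse function are hypothetical, invented here to illustrate the idea rather than our actual implementation):

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Detection:
    """One report of a tagged roadside object (hypothetical format)."""
    object_id: str
    position_m: Optional[Tuple[float, float]] = None  # (x, y) from the NIR sensor

def fuse(nir: Optional[Detection], zigbee: Optional[Detection]) -> Optional[Detection]:
    """Combine one NIR and one Zigbee report of the same object.

    The NIR sensor localizes well but needs line of sight; the Zigbee
    radio confirms presence even when the view is obstructed.
    """
    if nir and zigbee and nir.object_id == zigbee.object_id:
        return nir                    # both agree: trust the NIR position
    if zigbee:                        # radio only: presence without location
        return Detection(zigbee.object_id, position_m=None)
    return nir                        # NIR only, or no detection at all

# Obstructed view: the NIR sensor misses the sign, but the radio still sees it.
print(fuse(None, Detection("sign-42")))
```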

In this paper, we demonstrate the benefits of this collaboration through two applications:
• A road sign detection application
This application enables road signs on the side of the road to be detected, identified and
localized accurately in advance. As a result, drivers become aware of the road
conditions with sufficient time to react, hence improving road safety.
• A cooperative traffic light application
By knowing how much time is left for a particular colour of the traffic light, drivers can
plan when to change the movement of the vehicle (for example, whether they should
prepare to move again after a stop), and hence save time (a small timing sketch follows
this list).
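
As an illustration of the timing logic (a hypothetical helper, sketched under the assumption that the traffic light broadcasts the time remaining in its current phase):

```python
def coast_or_stop(distance_m: float, speed_mps: float,
                  time_to_green_s: float) -> str:
    """Advise the driver, given the broadcast time until the light turns green.

    If coasting at the current speed reaches the light after it has
    turned green, the driver need not stop at all.
    """
    time_to_light_s = distance_m / speed_mps
    if time_to_light_s >= time_to_green_s:
        return "coast: the light will be green on arrival"
    return "slow down: the light will still be red on arrival"

# 200 m from the light at 10 m/s, green in 15 s -> arrival in 20 s, so coast.
print(coast_or_stop(200.0, 10.0, 15.0))
```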
There are many other applications that can be constructed using this collaboration, but for
now we will focus on the two above. Before going into the details of these two
applications, we introduce each individual sensor and discuss its capabilities.
