
REMOTE SENSING: AN OVERVIEW

Partho Protim Mondal

Institute of Geographic Sciences and Natural Resources Research, CAS

Abstract: Remote sensing is the science (and to some extent art) of obtaining information about an object or
phenomenon through the analysis of data acquired by a device that is not in physical contact with the object under
investigation. It is an outcome of developments in various technological fields from the 1960s onwards, in which satellites
and spacecraft are used to collect information about the Earth's surface with various kinds of sensors. There are two ways
to extract information from remotely sensed data: visual interpretation, which is manual in nature, and digital image
processing using computers, which is a largely automated process. Visual image interpretation is the most intuitive method
of information extraction, relying on the interpreter's ability to relate shapes, colors and patterns in an image to real-world
features, whereas digital image processing relies on computer algorithms that exploit the statistical distribution of image
pixel values and other inherent image properties.
Keywords: satellite, sensor, visual interpretation, digital image processing

1. INTRODUCTION
Remote sensing means acquiring information about a phenomenon, object or surface while at a
distance from it. The term is associated with recent advances in technology in which satellites and aircraft
are used for collecting information about the Earth's surface. In everyday life, when we watch an object and draw
inferences about it, this may be characterized as a form of remote sensing, since we acquire
information about the object without being in contact with it. In much of remote sensing, the process involves an
interaction between incident radiation and the target object. However, remote sensing also involves the sensing
of emitted energy and the use of non-imaging sensors. Detection and discrimination of objects or surface
features means detecting and recording the radiant energy reflected or emitted by objects or surface materials.
Different objects return different amounts and kinds of the energy incident upon them in different bands of the
electromagnetic spectrum. This behaviour depends on the properties of the material (structural, chemical, and physical),
its surface roughness, the angle of incidence, and the intensity and wavelength of the radiant energy. Remote sensing is
essentially a multi-disciplinary science that combines various disciplines such as optics,
spectroscopy, photography, computing, electronics and telecommunications, and satellite launching. All these
technologies are integrated to act as one complete system in itself, known as a Remote Sensing System.
There are seven stages/components in a remote sensing system: (A) Energy Source or Illumination,
(B) Radiation and the Atmosphere, (C) Interaction with the Target, (D) Recording of Energy by the Sensor,
(E) Transmission, Reception, and Processing, (F) Interpretation and Analysis, and (G) Application.
These stages are illustrated in Figure 1.
1.1 TYPES OF REMOTE SENSING
There are two main types of remote sensing: Passive and Active
remote sensing.
Figure 1: Stages of Remote Sensing
1.1.1. Active remote sensing systems emit energy in order to scan objects and areas, whereupon a sensor
detects and measures the radiation that is reflected from the target (Ali, 2010). Radar (microwave) and LiDAR
(visible spectrum) systems are examples of active remote sensing.
1.1.2. Passive remote sensing systems detect natural radiation that is emitted or reflected by the object or
surrounding area being observed. Reflected sunlight is the most common source of such radiation. Examples of
passive remote sensors include film photography, infrared photography and radiometers.
1.2 ORBITS
Two satellite orbits are important for remote sensing observation of the Earth: the geo-stationary orbit
and the polar orbit.
1.2.1 Geo-stationary orbit: a satellite in this orbit keeps pace with the rotation of the Earth, so the platform
remains over the same place and gives continuous, near-hemispheric coverage of the same area day and night.
These satellites are placed in the equatorial plane, orbiting from west to east. Their coverage is limited to about
70° N to 70° S latitude, and one satellite can view one-third of the globe. They are mainly used for communication
and meteorological applications; weather satellites such as Meteosat, INSAT 3D and GOES are normally positioned
in this orbit. The geo-stationary orbit is located at an altitude of about 36,000 km above the equator.
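The quoted altitude can be checked against Kepler's third law. The following sketch, assuming standard values for Earth's gravitational parameter, equatorial radius and sidereal rotation period (none of which are given in the text above), recovers the roughly 36,000 km figure:

```python
import math

# Kepler's third law for a circular orbit: r^3 = GM * T^2 / (4 * pi^2),
# where r is measured from the Earth's centre. A geo-stationary satellite
# must complete one orbit per sidereal day.
GM = 3.986004418e14    # Earth's gravitational parameter, m^3/s^2 (assumed standard value)
T = 86164.1            # sidereal day, s (one rotation of the Earth)
R_EARTH = 6378.137e3   # Earth's equatorial radius, m

r = (GM * T ** 2 / (4 * math.pi ** 2)) ** (1.0 / 3.0)  # orbital radius
altitude_km = (r - R_EARTH) / 1000.0

print(round(altitude_km))  # about 35786 km, commonly rounded to 36,000 km
```

The exact figure of about 35,786 km is why the text's "36,000 km" is a rounded value.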
1.2.2 Polar orbit: satellites in a polar orbit circle the Earth from the North Pole to the South Pole. Polar orbits
have an inclination of approximately 99 degrees to the equator so as to maintain a sun-synchronous overpass, i.e.
the satellite passes over places on Earth having the same latitude at the same local sun-time. This ensures similar
illumination conditions when acquiring images over a particular area over a series of days. The altitude of polar
orbits varies from approximately 650 to 900 km.
1.3 RESOLUTION:
Resolution is defined as the ability of a system to render the smallest possible unit of information. In the case
of a remote sensing sensor, it refers to the sensor's capability to capture detail at the smallest discretely
separable quantity in terms of distance (spatial), wavelength band of EMR (spectral), radiation quantity
(radiometric) or time (temporal).
1.3.1 Spatial Resolution: The detail discernible in an image depends on the spatial resolution of the sensor,
which refers to the size of the smallest feature that can be detected.
1.3.2 Spectral resolution: It is a measure of the width of the wavelength bands (bandwidth) used to generate the
image. The finer the spectral resolution, the narrower the wavelength range for a particular channel or band. Very
high spectral resolution facilitates fine discrimination between different targets based on their spectral response
in each of the narrow bands.
1.3.3 Radiometric Resolution: Radiometric characteristics describe the actual information content of an image.
Radiometric resolution is a measure of an imaging system's ability to discriminate very slight differences in
radiance. The finer the radiometric resolution of a sensor, the more sensitive it is to small differences in
reflected or emitted energy.
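In practice radiometric resolution is commonly quoted as a bit depth, a convention not stated above: an n-bit sensor quantizes the measured radiance into 2^n discrete grey levels. A minimal sketch:

```python
# Sketch: an n-bit sensor records 2**n discrete grey levels, i.e. digital
# numbers (DNs) ranging from 0 to 2**n - 1. The bit depths below are
# illustrative values, not tied to any particular sensor.
levels_for_bits = {bits: 2 ** bits for bits in (6, 8, 12, 16)}
for bits, levels in sorted(levels_for_bits.items()):
    print(f"{bits:2d}-bit sensor -> {levels} grey levels (DN 0..{levels - 1})")
```

An 8-bit sensor can thus distinguish 256 radiance levels, while a 12-bit sensor distinguishes 4,096, making it far more sensitive to slight radiance differences.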
1.3.4 Temporal Resolution: It refers to the frequency of observation, i.e. the number of days between two
consecutive observations of a particular ground target under similar viewing geometry. The absolute temporal
resolution of a remote sensing system, that is, the time taken to image the exact same area at the same viewing
angle a second time, is equal to this revisit period.
1.4 SPECTRAL SIGNATURE IN REMOTE SENSING
Any remotely sensed parameter which, directly or indirectly, characterizes the nature and/or condition
of the object under observation is defined as its signature. In remote sensing we chiefly use the spectral
signature of the object, which can be defined as a unique pattern of wavelengths radiated by the object.
Signatures can be categorized as:
a) Spectral Variation: Variation in reflectivity and emissivity as a function of wavelength.
b) Spatial Variation: Variation of reflectivity and emissivity with spatial position (i.e. shape, texture and size of
the object).
c) Temporal variation: Variation of emissivity and reflectivity with time, e.g. over diurnal and seasonal cycles.
d) Polarization variation: Variations in polarization introduced by the material in the radiation it reflects or emits.
Each of these four features of electromagnetic radiation may be interdependent, i.e. reflectivity may differ
at different times or in different spectral bands. Measuring these variations and correlating them with the
known features of an object provides the signature of the object concerned. Knowledge of the state of
polarization of the reflected radiation, in addition to the spectral signatures of various objects, adds another
dimension to the analysis and interpretation of remote sensing data. These parameters are extremely useful in
discriminating between objects.
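A minimal sketch of how spectral signatures support such discrimination, using made-up four-band reflectance values (the numbers, band choices and class names here are illustrative assumptions, not measured data):

```python
import math

# Reference spectral signatures over four bands (blue, green, red,
# near-infrared). Values are synthetic, chosen only to mimic the broad
# behaviour of each cover type (e.g. vegetation reflects strongly in NIR,
# water absorbs NIR).
references = {
    "vegetation": [0.05, 0.10, 0.06, 0.50],
    "water":      [0.08, 0.06, 0.04, 0.02],
    "bare soil":  [0.15, 0.20, 0.25, 0.30],
}

def closest_signature(measured):
    """Return the reference class whose signature is nearest (Euclidean)."""
    def dist(sig):
        return math.sqrt(sum((m - s) ** 2 for m, s in zip(measured, sig)))
    return min(references, key=lambda name: dist(references[name]))

print(closest_signature([0.06, 0.11, 0.07, 0.45]))  # -> vegetation
```

The measured spectrum's strong near-infrared response places it closest to the vegetation signature, illustrating how spectral variation alone can separate cover types.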
2. VISUAL IMAGE INTERPRETATION
Image interpretation is defined as the act of examining images to identify objects and judge their
significance. An interpreter studies remotely sensed data and attempts, through a logical process, to detect,
identify, measure and evaluate the significance of environmental and cultural objects, patterns and spatial
relationships. It is an information extraction process. The quality of recognition depends on the interpreter's
expertise in image interpretation and visual perception. Recognizing targets is the key to interpretation and
information extraction. Observing the differences between targets and their backgrounds involves comparing
different targets based on any, or all, of the visual elements of tone, shape, size, pattern, texture, shadow,
and association.
3. DIGITAL IMAGE PROCESSING
Digital image processing is a collection of techniques for the manipulation of digital images by computers.
It includes data operations (pre-processing) which normally precede further manipulation and
analysis (classification) of the image data to extract specific information. Image pre-processing includes
geometric and radiometric correction as well as image enhancement processes, which aim to correct distorted or
degraded image data and create a more faithful representation of the original scene.
The overall objective of image classification is to automatically categorize all pixels in an image into
land cover classes or themes. Normally, multispectral data are used to perform the classification, and the
spectral pattern present within the data for each pixel is used as the numerical basis for categorization. That is,
different feature types manifest different combinations of digital numbers (DNs) based on their inherent spectral
reflectance/emittance properties. The traditional methods of classification mainly follow two approaches:
unsupervised and supervised. The unsupervised approach attempts spectral grouping based on the statistical
distribution of DN values; having established these groups, the analyst then tries to associate an information
class with each of them. In the supervised classification process, the image analyst supervises the pixel
categorization by specifying numerical descriptors of the various land cover types present in the scene. To do
this, representative sample sites of known cover types, called training areas or training sites, are used to
compile a numerical interpretation key that describes the spectral attributes of each feature type of interest. In
the supervised approach the user defines useful information categories and then examines their spectral
separability, whereas in the unsupervised approach the user first determines spectrally separable classes and then
defines their informational utility. In recent times more advanced classification approaches such as hybrid,
fuzzy, and object-oriented classification have come into use. An image classification is often followed by a
post-classification accuracy assessment.
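The unsupervised approach can be sketched with a simple one-dimensional k-means over synthetic pixel DN values (the values, cluster count and interpretation below are illustrative assumptions, not real imagery): the algorithm finds spectral clusters, and the analyst would then attach an information class to each.

```python
import random

def kmeans_1d(values, k, iters=20, seed=0):
    """Cluster 1-D DN values into k spectral groups; return sorted centres."""
    rng = random.Random(seed)
    centres = rng.sample(values, k)  # initial centres drawn from the data
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in values:
            # assign each pixel to its nearest cluster centre
            nearest = min(range(k), key=lambda i: abs(v - centres[i]))
            groups[nearest].append(v)
        # recompute each centre as the mean of its assigned pixels
        centres = [sum(g) / len(g) if g else centres[i]
                   for i, g in enumerate(groups)]
    return sorted(centres)

# Synthetic 8-bit DNs: a dark group (water-like) and a bright group (soil-like)
pixels = [12, 15, 18, 20, 22, 160, 170, 175, 180, 190]
print(kmeans_1d(pixels, k=2))  # two spectral cluster centres
```

The two centres that emerge (one dark, one bright) are purely spectral groupings; only the analyst's subsequent labelling turns them into information classes such as "water" or "bare soil", which is exactly the division of labour between the unsupervised algorithm and the interpreter described above.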
REFERENCES
Campbell, J. B., 1996: Introduction to Remote Sensing (2nd Edition). Taylor & Francis.
Sabins, F. F., 2007: Remote Sensing: Principles and Interpretation (3rd Edition). Waveland Press.
Lillesand, T. M. and Kiefer, R. W., 1994: Remote Sensing and Image Interpretation (3rd Edition). John Wiley & Sons.
Anonymous: Fundamentals of Remote Sensing: A Canada Centre for Remote Sensing Tutorial. Natural Resources Canada.
Richards, J. A. and Jia, X., 2006: Remote Sensing Digital Image Analysis: An Introduction. Springer.
http://www.crisp.nus.edu.sg/~research/tutorial/process.htm
http://uotechnology.edu.iq/appsciences/filesPDF/Laser/Lacture/3c/4-Remote_Sensing1.pdf
