A Report
On
Snow Cover Mapping in Indus basin
using Remote Sensing

Submitted By:
Vinay Kumar G
2011A2PS402H

(JULY 2013)
Centre: Roorkee
To: 13/07/13
Submitted By:
Vinay Kumar G
B.E. Civil
Abstract:
The title of this project is Snow Cover Mapping in Indus basin using Remote Sensing. The aim of
the project was to find the snow cover in the Indus basin on various dates and analyse the data. This
report briefly discusses remote sensing and how it works, explains how remotely sensed data are
processed, and describes how watershed delineation and snow mapping of basins are done using
GIS. It then examines the snow cover of the Indus basin over an extended period and concludes by
analysing snow cover data for the Indus basin on 6 separate dates.
__________________
__________________
Signature of Students
Signature of PS Faculty
ACKNOWLEDGEMENT
I would like to thank all those who supported me during the Practice School at NIH. I would like
to thank Dr. D.S. Rathore and Dr. Tanveer Ahmed, who supervised my training here at the National
Institute of Hydrology.
I would also like to thank Dr. S.K. Jain and Dr. V.C. Goyal, who mentored our training at the
National Institute of Hydrology. I would like to thank our instructor Dr. Chandra Shekhar and
our co-instructor Siddharth Arora, who invested their effort and time in me. I would also like to
thank all my friends who helped me during my training program.
TABLE OF CONTENTS

Sr no.  Topic
1.      Introduction
1.1     Remote Sensing
1.2
1.3     Data Acquisition
1.4     Data Analysis
2.1     Data Source
2.2     Software
2.3     Methodology
2.3.1   Watershed Delineation
2.3.2   Elevation Zones
2.3.3   Snow Mapping
2.3.4   Zonal Statistics
3.      Study Area
4.      Results
4.1     Watershed Delineation
4.2     Elevation Zones
4.3     Snow Mapping
        Conclusion
        References
        Bibliography
LIST OF ILLUSTRATIONS
IMAGES
1. Components of remote sensing
2. Differences between Active and Passive remote sensing
3. Electro-magnetic spectrum
4. Energy interactions with atmosphere
5. Types of noise in remotely sensed data
6. Satellite image and its Fourier transformation image
7. Image classification using pixel values (spectral signal)
8. An illustration showing the flow direction evaluation process
9. Raster calculator used to reassign pixel values in flow accumulation raster
10. Model used for snow delineation in ERDAS IMAGINE
11. Indus basin
12. SRTM 250 DEM used for watershed delineation
13. SRTM 250 DEM along with Indus basin watershed
14. FILL raster of Indus basin
15. FLOW DIRECTION raster of Indus basin
16. FLOW ACCUMULATION raster of Indus basin
17. FLOW ACCUMULATION raster along with OUTLET point
18. Indus basin DEM with classified ELEVATION ZONES
19. Snow Cover raster on 09 March 07
20. Snow Cover raster on 18 April 07
21. Snow Cover raster on 04 May 07
22. Snow Cover raster on 05 June 07
23. Snow Cover raster on 07 July 07
24. Snow Cover raster on 08 August 07
TABLES
GRAPHS
1. Spectral reflectance graph of various features
2. Hypsometric graph of elevation zones of Indus basin
3. Snow Cover Comparison graph of elevation zones of Indus basin on all dates.
1. INTRODUCTION
Radiation Principles
Remote sensing can be done using various sensors to obtain information about a terrain
of interest, but electromagnetic energy sensors are the most widely used on both airborne and
spaceborne platforms.
Electromagnetic energy is a form of energy of which visible light is only a part. It is
classified as radio waves, microwaves, infrared, visible, ultraviolet, x-rays, gamma rays and cosmic
rays depending on wavelength. All these radiations travel at the velocity of light and follow wave
theory, being defined by wave characteristics such as wavelength, frequency and energy.
Diffuse reflectors are rough surfaces that reflect uniformly in all directions. Near-diffuse
reflectors reflect in all directions, but not uniformly.
Image 4; Source [4]
When electromagnetic radiation is incident on any object, the object absorbs some of the energy,
transmits some, and reflects some, however small each fraction may be.
EI(λ) = ER(λ) + EA(λ) + ET(λ)
where EI is the incident energy, ER the reflected energy, EA the absorbed energy and ET the
transmitted energy, each a function of wavelength λ.
REFLECTANCE OF RADIATION is the property used to distinguish between features.
It is the percentage of incident energy that a surface reflects back. It is a fixed characteristic of an
object, although the same object may show a different reflectance after a physical or chemical
change. Reflectance is not the same as reflection.
The measured reflectance is not always the same value because of energy interactions
with the atmosphere. Whatever the kind of radiation, to travel from the source to the terrain of
interest and from there to the sensor, it must pass through the atmosphere, which interacts with
the radiation and may alter it. The effect depends on the path length, signal strength, wavelength
and other factors. The changes caused by these interactions are due to
1) SCATTERING
2) ABSORPTION
Wavelength ranges in which the atmosphere is relatively transmissive are called atmospheric
windows.
In the above graph (Graph 1) we can observe characteristic spectral signatures of features such as
water, soil and vegetation. Even though they are termed signatures, these curves cannot be
considered unique because of atmospheric interactions and spatial and temporal effects. As
discussed above, atmospheric interactions affect the signals, so the signals change with the
atmosphere and climate, which is a temporal effect; hence a spectral signature is never truly
unique. Temporal effects appear as climatic changes: clouds can affect the signal, and reflectance
can differ on a rainy day when the soil becomes moist. Spatial effects are factors that cause the
same type of feature, at a given point in time, to vary between geographical locations, such as
climate, soil type and land-use practices at those locations.
Detected energy can be recorded either:
Photographically
Electronically
Photographic processes use chemical reactions on the surface of a photographic film. These
methods are relatively simple and cheap, and they provide a high degree of spatial detail and
geometric integrity. The film acts as both the detecting and the recording medium.
Electronic sensors generate an electrical signal corresponding to the energy variations in the
original scene. They are more advantageous than photographic methods because of their broader
spectral range of sensitivity, improved calibration potential and ability to transmit data
electronically, but they are not as cheap and simple as photographic methods. Electronic sensors
record data on magnetic tapes, which are later converted into photographs as required; here the
film acts only as a recording medium.
Linear or area array photodiode or charge-coupled device (CCD) digitization
Video digitization
Digital image processing involves developing and rectifying images with the aid of a computer.
The fundamental methods of digital image processing are image rectification or restoration,
image enhancement and image classification. Digital analysis mostly depends on the colour and
tone of the individual pixels. Digital images are processed by computers on the basis of
equations, producing further images, tabular values and so on.
Image rectification or restoration:
Image restoration and rectification techniques involve the correction of distortion, degradation
and noise introduced during the imaging process. This involves both radiometric and geometric
corrections. To correct the data, internal and external errors must be detected. Internal errors can
be due to the sensors or technical failures. External errors, such as atmospheric interactions, tend
to distort the signals. These processes are usually called pre-processing of the digital imagery.
Geometric Correction:
Sometimes images are so distorted geometrically that they cannot be used for processing. This
may be caused by factors such as the altitude and velocity of the sensor, curvature of the earth,
the earth's rotation (often the cause of panoramic distortions), atmospheric refraction, relief
displacement, etc. Geometric corrections rectify these distortions to an extent that makes the
images usable again.
Systematic distortions can be easily rectified by developing a mathematical model of the source
of distortion and then applying the corresponding transformations.
Random distortions are usually corrected by geo-referencing Ground Control Points on the map
to their coordinates. A transformation equation is developed by least squares regression and the
image is then resampled.
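The least-squares step for random distortions can be sketched as follows. This is a minimal illustration, not the exact procedure used in the report: the ground control points and the first-order (affine) polynomial model are hypothetical.

```python
import numpy as np

# Hypothetical GCPs: image (col, row) coordinates and their map (x, y) coordinates
img = np.array([[0, 0], [100, 0], [0, 100], [100, 100], [50, 50]], dtype=float)
map_xy = np.array([[500000.0, 3600000.0],
                   [500250.0, 3600010.0],
                   [499990.0, 3599750.0],
                   [500240.0, 3599760.0],
                   [500120.0, 3599880.0]])

# Design matrix for a first-order polynomial: x' = a0 + a1*col + a2*row
A = np.column_stack([np.ones(len(img)), img[:, 0], img[:, 1]])

# One least-squares fit per output coordinate
coeff_x, *_ = np.linalg.lstsq(A, map_xy[:, 0], rcond=None)
coeff_y, *_ = np.linalg.lstsq(A, map_xy[:, 1], rcond=None)

def to_map(col, row):
    """Transform an image pixel position into map coordinates."""
    return (coeff_x[0] + coeff_x[1] * col + coeff_x[2] * row,
            coeff_y[0] + coeff_y[1] * col + coeff_y[2] * row)

print(to_map(50, 50))
```

Higher-order polynomials follow the same pattern with extra columns (col*row, col², ...) in the design matrix; the fitted transform is then used to resample the image.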
Radiometric corrections
Radiometric corrections are required to rectify the errors caused by changes in scene
illumination, viewing geometry, atmospheric conditions, etc. Viewing geometry corrections are
required more in airborne remote sensing than in spaceborne remote sensing.
Noise removal:
Image noise is any unwanted disturbance in image data that is due to limitation in the sensing,
signal digitization, or data recording process. The potential sources of noise range from periodic
drift or malfunction of a detector, to electronic interference between sensor components, to
intermittent hiccups in the data transmission and recording sequence. Noise can either degrade
or totally mask the true radiometric information content of a digital image.
Image Enhancement:
Image enhancement algorithms are applied to an image to increase the interpretability and
appearance of the image data. Enhancement always depends on the requirements of the user;
there is no single ideal enhancement. Image enhancement techniques are used for easier
processing, to decrease the complexity of the image, and to discard unwanted information from
the image.
Image enhancement techniques can be classified as
Point operations: modify the brightness value of each pixel independently
Local operations: modify the brightness value of a pixel based on its neighbouring pixels
Both kinds of operations can be applied to any imagery. Image enhancement is done after the
image restoration process and before the image classification process.
Commonly used image enhancement techniques include canonical components and
intensity-hue-saturation (IHS) colour space transformations.
Gray-level thresholding
It is used to classify an image into two classes:
Ex: one class for all pixels with gray level greater than a user-defined value,
one for all pixels with gray level less than that value.
Thresholding is usually used to develop binary masks; these masks are later used to
operate on each class of the image separately without affecting the other.
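A minimal sketch of the idea, with a hypothetical 3×3 gray-level image and an arbitrary threshold of 100:

```python
import numpy as np

# Hypothetical 8-bit gray-level image
img = np.array([[ 30, 200, 120],
                [250,  10,  90],
                [180,  60, 220]], dtype=np.uint8)

threshold = 100  # user-defined gray level

# Binary mask: 1 where the gray level exceeds the threshold, 0 elsewhere
mask = (img > threshold).astype(np.uint8)

# The mask lets each class be operated on separately,
# e.g. keep only the bright pixels:
bright_only = img * mask
print(mask)
```

Applying `1 - mask` instead would select the other class, so each class can be processed without affecting the other.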
Level slicing
Level slicing is a technique where the DNs distributed along the x-axis of an image
histogram are divided into a series of user defined intervals or slices. All the DNs in
the same interval are assigned a single DN. Each level can also be shown as a single
colour.
Level slicing is used extensively in the display of thermal infrared images in order to
show discrete temperature ranges coded by gray level or colour.
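Level slicing can be sketched with numpy's `digitize`; the DN values and slice boundaries below are illustrative assumptions:

```python
import numpy as np

# Hypothetical thermal-band DNs
dn = np.array([12, 47, 98, 130, 201, 255])

# User-defined slice boundaries: [0,64), [64,128), [128,192), [192,256)
edges = [64, 128, 192]
slices = np.digitize(dn, edges)   # slice index for every DN

# Assign one representative DN (or display colour) per slice
slice_dn = np.array([32, 96, 160, 224])
sliced = slice_dn[slices]
print(sliced)
```

Every DN in the same interval receives the same output value, which is exactly the discrete-range display used for thermal imagery.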
Contrast stretching
Contrast stretching is a technique wherein a particular range of gray levels is
stretched to the complete gray scale.
Ex: consider an image with gray levels varying from 80 to 190. These values are
stretched from 80–190 to 0–255. This increases the interpretability of the image and
all features are more distinguishable.
The stretching can be done on any basis, such as linear stretching or stretching
dependent on frequency, and some values can also be omitted.
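The 80–190 to 0–255 example above can be sketched as a linear stretch (the 2×2 image is a hypothetical stand-in):

```python
import numpy as np

def linear_stretch(img, lo, hi, out_lo=0, out_hi=255):
    """Linearly map gray levels [lo, hi] onto [out_lo, out_hi]."""
    stretched = (img.astype(float) - lo) * (out_hi - out_lo) / (hi - lo) + out_lo
    return np.clip(stretched, out_lo, out_hi).astype(np.uint8)

# Hypothetical image whose gray levels only span 80..190
img = np.array([[80, 135], [190, 100]], dtype=np.uint8)
stretched = linear_stretch(img, 80, 190)
print(stretched)   # 80 -> 0 and 190 -> 255
```

Histogram-equalized or piecewise stretches replace the single linear mapping with one driven by the gray-level frequencies.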
Spatial feature manipulation:
Spatial filtering
Spatial filtering is a local operation. Spatial filters emphasize or deemphasize various
spatial frequencies of an image. Spatial frequency refers to the roughness of the tonal
variations in an image: if the gray levels change abruptly over a small area, the area is
said to have a rough tone, or high spatial frequency, and vice versa.
Low pass filters emphasize low frequency detail and deemphasize high frequency detail,
while high pass filters emphasize high frequency detail and deemphasize low frequency
detail. Low pass filters can be used to reduce random noise.
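A low pass filter can be sketched as a 3×3 moving average; the single-spike "noise" image below is a toy example:

```python
import numpy as np

def mean_filter3(img):
    """3x3 low-pass (moving average) filter; edges handled by edge padding."""
    padded = np.pad(img.astype(float), 1, mode='edge')
    out = np.zeros(img.shape, dtype=float)
    rows, cols = img.shape
    for i in range(rows):
        for j in range(cols):
            out[i, j] = padded[i:i + 3, j:j + 3].mean()
    return out

# A single noisy spike on a flat background is suppressed by the low-pass filter
img = np.zeros((5, 5))
img[2, 2] = 90.0
smooth = mean_filter3(img)
print(smooth[2, 2])   # the spike is spread over the 3x3 window
```

The spike of 90 becomes 10 at its own location (90 averaged over 9 cells), which is why low pass filtering reduces isolated random noise.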
Edge enhancement
Edge enhancement delineates the edges of the shapes and details of an image and hence
making it more conspicuous and easy to interpret. These edges may be enhanced using
linear edge enhancement or non-linear edge enhancement.
Linear edge enhancement is done by applying a directional first-difference algorithm
which approximates the first derivative between two adjacent pixels. Edge smoothness
or roughness depends on the size of the kernel used: the larger the kernel, the smoother
the edge.
Non-linear edge enhancements are performed using non-linear combinations of the
pixels. Many algorithms are applied using kernels of various sizes. The Sobel and
Roberts edge detectors are examples of non-linear edge enhancement operators.
Fourier analysis
Fourier analysis is a mathematical technique for separating an image into its various
spatial frequency components. Fourier magnitude images are symmetric about the
centre, and the intensity at the centre represents the magnitude of the lowest frequency
component.
Image 6; Source [7]
Fourier transforms are mainly used to remove noise from images. They are also used
to apply filters: a low-pass or high-pass filter is applied to the Fourier transform image,
which is then converted back into the original image.
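The transform–filter–inverse-transform cycle can be sketched with numpy's FFT; the checkerboard "noise" scene and the cut-off radius of 3 are illustrative assumptions:

```python
import numpy as np

def fft_lowpass(img, keep_radius):
    """Zero out spatial frequencies beyond keep_radius in the centred spectrum."""
    F = np.fft.fftshift(np.fft.fft2(img))        # centre the zero frequency
    rows, cols = img.shape
    r, c = np.ogrid[:rows, :cols]
    dist = np.hypot(r - rows // 2, c - cols // 2)
    F[dist > keep_radius] = 0                    # low-pass mask in frequency domain
    return np.real(np.fft.ifft2(np.fft.ifftshift(F)))

# High-frequency checkerboard noise on an otherwise flat scene
img = 50.0 + 10.0 * ((np.arange(16)[:, None] + np.arange(16)) % 2)
smooth = fft_lowpass(img, keep_radius=3)
print(img.std(), smooth.std())   # the pixel-to-pixel variation is removed
```

The checkerboard lives at the highest spatial frequency, so it falls outside the kept radius and the filtered image is nearly constant at the scene mean; a high-pass filter would keep the complement of the mask instead.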
Multi-image manipulation:
Spectral ratioing
Ratio images can be obtained by dividing the DNs of one spectral band by the
corresponding values in another spectral band. A great advantage of ratioing is that
variations caused by scene illumination can be avoided and noise can be reduced. By
ratioing the correct pair of bands we can obtain valuable information that is difficult to
interpret from single-band data.
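The illumination-cancelling property can be sketched in two lines; the reflectance values and the 0.5 shading factor are hypothetical:

```python
import numpy as np

# Hypothetical reflectances of the same feature in two bands; the second pixel
# is the same feature under topographic shading (both bands scaled by 0.5)
band_a = np.array([0.40, 0.40 * 0.5])
band_b = np.array([0.10, 0.10 * 0.5])

ratio = band_a / band_b
print(ratio)   # same ratio for sunlit and shaded pixels
```

Because the illumination factor multiplies both bands equally, it cancels in the ratio, leaving the feature-dependent spectral contrast.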
Image classification:
The classification process categorizes all pixels in the image into several classes, features or
themes depending upon the spectral pattern present in each pixel, evaluated numerically. This
process is usually done using multispectral data.
Image classification is of two major types, namely supervised classification and unsupervised
classification.
Supervised classification
With supervised classification, we identify examples of the information classes (i.e., land cover
types) of interest in the image. These are called training sites. The image processing software
is then used to develop a statistical characterization of the reflectance for each information
class. This stage is often called signature analysis and may involve developing a
characterization as simple as the mean or the range of reflectance in each band, or as complex
as detailed analyses of the means, variances and covariances over all bands. Once a statistical
characterization has been achieved for each information class, the image is then classified by
examining the reflectance of each pixel and making a decision about which signature it
resembles the most.
Parallelepiped classification
The parallelepiped classifier uses the class limits stored in each class signature to
determine whether a given pixel falls within a class or not. The class limits specify
the dimensions (in standard deviation units) of each side of a parallelepiped
surrounding the mean of the class in feature space.
If the pixel falls inside the parallelepiped, it is assigned to that class. However, if
the pixel falls within more than one class, it is put in the overlap class (code
255). If the pixel does not fall inside any class, it is assigned to the null class
(code 0).
The parallelepiped classifier is typically used when speed is required. The
drawback is (in many cases) poor accuracy and a large number of pixels
classified as ties (overlap, class 255).
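The decision rule can be sketched as follows. The two-band class signatures and the half-width of 2 standard deviations are hypothetical:

```python
import numpy as np

# Hypothetical class signatures: per-band mean and standard deviation
signatures = {
    1: {"mean": np.array([40.0, 30.0]), "std": np.array([5.0, 4.0])},   # e.g. water
    2: {"mean": np.array([90.0, 80.0]), "std": np.array([6.0, 5.0])},   # e.g. soil
}
K = 2.0  # parallelepiped half-width, in standard-deviation units

def classify(pixel):
    """Assign a pixel to the class(es) whose box contains it."""
    hits = [cid for cid, s in signatures.items()
            if np.all(np.abs(pixel - s["mean"]) <= K * s["std"])]
    if len(hits) == 1:
        return hits[0]
    return 255 if len(hits) > 1 else 0   # overlap class / null class

print(classify(np.array([42.0, 28.0])))   # falls inside the class 1 box
print(classify(np.array([200.0, 5.0])))   # falls inside no box -> null class
```

The test is a simple per-band comparison, which is why the method is fast; overlapping boxes are exactly what produces the code-255 ties described above.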
Unsupervised classification
Unsupervised classification is a method which examines a large number of unknown pixels and
divides them into a number of classes based on natural groupings present in the image values.
Unlike supervised classification, unsupervised classification does not require analyst-specified
training data. The basic premise is that values within a given cover type should be close together
in the measurement space (i.e. have similar gray levels), whereas data in different classes should
be comparatively well separated (i.e. have very different gray levels).
The classes that result from unsupervised classification are spectral classes based on natural
groupings of the image values. The identity of each spectral class is not initially known; the
classified data must be compared with some form of reference data (such as larger scale
imagery, maps, or site visits) to determine the identity and informational value of the spectral
classes. Thus, in the supervised approach we define useful information categories and then
examine their spectral separability; in the unsupervised approach the computer determines
spectrally separable classes, and we then define their informational value.
Unsupervised classification is becoming increasingly popular in agencies involved in long-term
GIS database maintenance. The reason is that there are now systems that use clustering
procedures that are extremely fast and require little in the way of operational parameters. Thus
it is becoming possible to train GIS analysts with only a general familiarity with remote sensing
to undertake classifications that meet typical map accuracy standards. With suitable ground truth
accuracy assessment procedures, this tool can provide a remarkably rapid means of producing
quality land cover data on a continuing basis.
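The clustering idea behind unsupervised classification can be sketched with a minimal k-means on gray levels. The pixel values are hypothetical, and the deterministic initialisation (centres spread across the data range, every cluster assumed non-empty) is a simplification of real clustering procedures such as ISODATA:

```python
import numpy as np

def kmeans1d(values, k, iters=20):
    """Minimal k-means: group gray levels into k spectral classes."""
    # Deterministic initialisation: centres spread across the data range
    centres = np.linspace(values.min(), values.max(), k)
    for _ in range(iters):
        # Assign each value to its nearest centre, then recompute the centres
        labels = np.argmin(np.abs(values[:, None] - centres[None, :]), axis=1)
        centres = np.array([values[labels == c].mean() for c in range(k)])
    return labels, centres

# Gray levels from two well-separated cover types (no training data needed)
pixels = np.array([10.0, 12.0, 11.0, 14.0, 200.0, 198.0, 205.0, 202.0])
labels, centres = kmeans1d(pixels, k=2)
print(centres)   # cluster means settle near the two natural groupings
```

The resulting clusters are spectral classes only; as the text notes, an analyst must still compare them with reference data to decide what land cover each one represents.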
MODIS:
The product provides an 8-day composite of the surface reflectance in bands 1 to 7 at 500 m
resolution for the MODIS sensor. The product is a gridded level-3 product in the sinusoidal
projection system. It is derived by selecting an observation from the MODIS daily L2G products
over an 8-day period; the selection is based on several factors, e.g. maximum observation
coverage, low view angle, absence of cloud or cloud shadow, aerosol loading, etc. The
accompanying data are quality assessment, day of observation, solar azimuth, and view and
zenith angles. The version-five product is a validated Stage 2 product: it has been validated
spatially and temporally through ground truth, and the data are recommended for scientific use.
Data are available in 10° × 10° tiles in HDF-EOS format. The file size in pixels and lines is
2400 × 2400.
ArcCatalog - used to organize and manage your GIS data. It also allows you to preview
datasets and view and manage metadata.
ArcMap - used to view, edit, and analyse spatial data and create maps.
ArcScene - provides the interface for viewing multiple layers of 3D data, visualizing 3D
data over a 2D surface, creating 3D surfaces and analysing 3D surfaces.
for individual bands. Limited support for datum conversion is also available. Spatial subsetting is
done using two diagonally opposite corner points. If no output pixel size is specified, the default
pixel size is taken. Default fill values are taken from the bands of the MODIS data products.
ERDAS IMAGINE:
ERDAS IMAGINE is the raster-centric software GIS professionals use to extract information
from satellite and aerial images. Because it is easy to use and easy to learn, ERDAS IMAGINE
is perfect for beginners and experts alike. The vast array of tools allowing users to analyse data
from almost any source and present it in formats ranging from printed maps to 3D models,
makes ERDAS IMAGINE a comprehensive toolbox for geographic imaging and image
processing needs.
ERDAS IMAGINE is aimed primarily at geospatial raster data processing and allows the user to
prepare, display and enhance digital images for mapping use in geographic information system
(GIS) or in computer-aided design (CADD) software. It is a toolbox allowing the user to perform
numerous operations on an image and generate an answer to specific geographical questions.
By manipulating imagery data values and positions, it is possible to see features that would not
normally be visible and to locate the geo-positions of features. The level of brightness, or
reflectance of light from the surfaces in the image, can be helpful for vegetation analysis,
prospecting for minerals, etc. Other usage examples include linear feature extraction, generation
of processing work flows ("spatial models" in ERDAS IMAGINE), import/export of data in a
wide variety of formats, ortho-rectification, mosaicking of imagery, and stereo and automatic
feature extraction of map data from imagery.
2.3 Methodology
2.3.1 Delineating watershed:
Open a new, blank map document in ArcMap and use the Add Data button to add the digital
elevation model (DEM) you will be using to delineate your watersheds.
1. Fill the DEM:
The Fill tool in the Hydrology toolbox is used to remove any imperfections (sinks) in the
digital elevation model. A sink is a cell that does not have a defined drainage value
associated with it. Drainage values indicate the direction that water will flow out of the cell,
and are assigned when creating a flow direction grid for the landscape. The resulting drainage
network depends on finding the 'flow path' of every cell in the grid, so it is important that the
fill step be performed prior to creating a flow direction grid.
Double-click the Fill tool to open its dialog.
The Input surface raster is the DEM grid.
Leave the Z limit blank and click OK to run the tool. Note that this process is CPU intensive,
and may take quite some time depending on the processing power of your workstation.
Once the fill process is complete, a new grid will be added to the data frame. There should be
a difference in the lowest elevation value between the original DEM and the filled DEM.
Remove the original DEM layer from the map (right-click > Remove).
2. Create Flow Direction:
Double-click the Flow Direction tool to open it. The Input surface raster should be set to the
filled DEM. The Output flow direction raster should once again default to your working
directory. Open the Environment Settings using the Environments button and confirm that the
Raster Analysis > Cell Size is set to the same as your filled DEM.
Click OK to run the tool. This process will take some time to complete, and once it has run a
new flow direction raster will be added.
3. Create Flow Accumulation:
The Flow Accumulation tool calculates the flow into each cell by accumulating the cells that
flow into each downslope cell. In other words, each cell's flow accumulation value is
determined by calculating the number of upstream cells that flow into it.
Double-click the Flow Accumulation tool to open it.
The Input flow direction raster should be set to the flow direction grid created in Step 2.
The Output accumulation raster will default to your working directory.
Accept all other defaults, check the Environment Settings to ensure that the Raster Analysis
> Cell Size property is set to the same as your filled DEM, and click OK to run the tool. This
process may take quite some time to complete.
The new flow accumulation raster will be added to your map. Each cell in the grid contains a
value that represents the number of cells upstream from that particular cell. Cells with higher
flow accumulation values should be located in areas of lower elevation, such as in valleys or
drainage channels.
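The accumulation logic described above can be sketched in miniature. This is a hypothetical one-dimensional transect (where D8 reduces to "drain to the lower neighbour"), not ArcGIS's actual implementation:

```python
import numpy as np

# Toy 1-D transect of a filled DEM: elevations decrease to the right
dem = np.array([9.0, 7.0, 6.0, 4.0, 1.0])

# Flow direction: every cell drains into its lower (right-hand) neighbour
flow_to = np.arange(1, len(dem) + 1)   # cell i drains into cell i+1
flow_to[-1] = -1                       # the last cell is the outlet

# Flow accumulation: number of upstream cells draining through each cell
acc = np.zeros(len(dem), dtype=int)
for i in range(len(dem)):              # cells are already in upstream order
    j = flow_to[i]
    if j != -1:
        acc[j] += acc[i] + 1           # pass down my upstream count plus myself

print(acc)   # accumulation grows downstream
```

The highest value lands at the outlet, which is exactly why outlet (pour) points are placed on high-accumulation cells in the steps below.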
It is very likely that the flow accumulation grid will appear dark and uninformative when
first added to the map. This can be fixed by altering the symbolization of the layer. Use the
Raster Calculator tool (Spatial Analyst > Map Algebra) with the SetNull function to reassign
the pixel values of the flow accumulation raster. The syntax of the function is
SetNull(flow_accumulation_raster < ____, 1). Values lower than the entered threshold become
null and all values higher than the threshold are reset to 1. This simplifies the flow
accumulation raster.
Image 9
Each cell has an outlet point called a pour point that indicates the location where water
would flow out of the cell. Pour points must be located in cells of high cumulative flow or
the watersheds you delineate in the steps below will be very small.
4. Create outlet pour points:
The placement of pour points is an important step in watershed delineation. A pour point
should exist within an area of high flow accumulation because it is used to calculate the total
contributing water flow to that given point. In many cases you will already have a shape file
containing the locations of your pour points, whether they are sampling sites, hydrometric
stations, or another data source. However, it is also possible to create pour points yourself.
The instructions below include both procedures.
Creating pour points through visual inspection:
Open the ArcCatalog window and create a new point shape file. Give it a descriptive name
and apply the appropriate projection information (the coordinate system should be the same
as that of the DEM or Flow Direction grid you will be using). Click OK. The new, empty
point layer will be added to your map.
Zoom in to your area of interest so that you are able to see the individual flow accumulation
cells. Use the Identify tool to inspect cell values. Your chosen pour point cell should be a
natural outlet for the streams flowing above it and must be on the high flow accumulation
path. Your choice essentially determines the end of your catchment; everything upstream
from this point will define a single watershed.
To add a pour point, open the Editor Toolbar (Customize > Toolbars > Editor) and choose
Editor > Start Editing.
If necessary, in the Start Editing dialog, highlight the empty pour point layer and click OK.
The Create Features window will open. Highlight the pour point shape file and then move
your cursor onto your map. Add a pour point by clicking in the centre of the high flow
accumulation cell you have chosen as your outlet point. Try to place points in the centre of
the cells. Also remember to place the points 1 or 2 cells away from stream confluences.
If you are defining only one watershed then save your edits, stop the editing session and
move on to Step 5.
If you are creating more than one watershed, add a pour point for each watershed then save
your edits and exit the editing session. Open the attribute table for the layer by right-clicking
the layer name and selecting Open Attribute Table. Click the Table Options icon and select
Add Field. Create a field of type Integer, precision 0 and call it UNIQUEID. Start another
editing session and enter an ID number for each individual pour point (1, 2, 3, and so on).
Stop editing and choose to save your edits. Watersheds are delineated based on unique
identification numbers, so this step ensures that a separate watershed will be delineated for
each individual pour point.
5. Snap Pour Points:
The Snap Pour Point tool moves each pour point to the cell of highest accumulated flow
within a user-defined search radius. If your pour point is not located on the high flow path,
the tool will move it to the cell within the search radius with the highest accumulated value.
If your pour point is already located on the high flow path, the tool will move the point to a
downstream cell. Care should be taken that the outlet point always stays on the path of flow
accumulation (pixels with value 1, as produced in the raster calculator step). Points that were
imported or placed by visual inspection must remain at the same location, so it is suggested
that the snap radius be set to 0.
6. Delineating Watershed:
Double-click the Watershed tool to open it (ArcToolbox > Spatial Analysis Tools >
Hydrology).
The Input flow direction raster is the flow direction raster created in Step 2, or the enhanced
flow direction layer from the OMNR.
The Input raster or feature pour point data is the raster pour point output from the Snap Pour
Points tool in Step 5.
The Pour point field can be left as default, or optionally you may choose to enter the unique
ID field created in Step 4.
The Output Raster will default to your working directory.
Click OK to run the tool.
When complete, the new watershed raster(s) will be added to your map.
7. Watershed raster to polygon:
You can convert the watershed raster to a polygon shape file for area calculations or to clip
other data sets to the watershed boundary. To do so one can use the raster to polygon tool
(Arc Toolbox > Conversion Tools> From Raster).
Double-click the Raster to Polygon tool to open it.
The Input raster is the watershed raster file created in Step 6 above.
The Output polygon features will default to your working directory.
Leave all other defaults and click OK to run the tool. The new polygon shape file will be
added as a layer to your map.
1. Input all the files (.hdf) that comprise the snow data of the study area in the input field.
2. Select the bands 2, 4, 6 which are required for snow mapping and exclude the rest of the
bands.
3. Specify an output location for the files to be produced.
4. The MODIS Reprojection Tool provides some basic projections. Select a projection
depending on the requirement, enter the parameters required for reprojecting the data files,
and set the Datum according to your requirement.
5. Set the resampling to nearest neighbour or another method depending on the requirements.
6. Now run the program; the files are mosaicked and an output is produced for each band
separately.
After bands 2, 4 and 6 of the MODIS data have been separately mosaicked and reprojected,
they are processed in ERDAS IMAGINE to delineate the snow.
1. Start ERDAS IMAGINE.
2. Go to interpreter tab> utilities> layer stack.
3. Input the band 2 file in the input file field and click the Add button. Repeat for the band 4
and band 6 files. After adding all three band layers, toggle on the Ignore Zero in Stats option.
4. Specify the output location in the output file field and click OK.
5. After the three bands are stacked into a single image, use the Modeler to delineate snow.
10. This data is processed by $n1_PROMPT_USER(2) - $n1_PROMPT_USER(3) and
Image 10
12. This map is then combined with the DEM-derived elevation zones to find various values,
such as the snow cover in each zone.
3. STUDY AREA
The Indus basin is formed by the Indus and its tributaries. The Indus is a major river in Asia
which flows through Pakistan and India, and also has courses through western Tibet.
Originating near Lake Manasarovar, the river runs through Jammu and Kashmir, Himachal
Pradesh and Punjab in India. It flows for over 3180 km and has a total drainage area exceeding
1165000 km2. The annual flow of the river is estimated to be 207 km3. The part of the Indus
basin studied in this report lies between 76°E, 36°N and 81°E, 31°N.
4. RESULTS
4.1 Watershed of Indus basin:
The DEM covering the Indus basin is taken from SRTM 250. The method discussed above was
applied and the Indus basin watershed was delineated. The watershed raster was then converted
to a polygon file and used to extract just the basin from the whole SRTM 250 tile.
Image 12
Image 13
The watershed extends from 76°E, 36°N to 81°E, 31°N. It occupies a total of
178339.1 sq km.
Image 14
This is the DEM of the Indus basin after the Fill tool is applied to remove sink pixels which do
not have any drainage data.
Image 15
The Flow Direction tool is applied on the DEM after the Fill tool. This determines the direction
in which water flows out of each pixel and the direction in which water enters each pixel.
Image 16
The Flow Accumulation tool is applied to the flow direction raster of the Indus basin DEM. The Raster Calculator is then used to make the flow accumulation raster discrete by resetting every pixel to one of two values: pixels with high flow accumulation are set to 1 and the rest to 0. This makes the pour point of the basin easy to identify visually.
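The raster-calculator reclassification can be sketched in NumPy as follows (the threshold here is an assumed value; the report does not state the one used):

```python
import numpy as np

# Toy flow-accumulation raster: each cell holds the number of upstream
# cells that drain through it.
flow_acc = np.array([[  0,   3,   1],
                     [  5, 120,   2],
                     [  1, 450,   0]])

# Raster Calculator step: keep only high-accumulation cells (the stream
# network) by resetting every pixel to 0 or 1 around a chosen threshold.
THRESHOLD = 100  # assumed value for illustration
streams = np.where(flow_acc > THRESHOLD, 1, 0)

print(streams)
```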
Image 17
The green circle on the image indicates the outlet point of the basin, the location to which water from the whole watershed drains. The outlet point always lies on a high flow accumulation pixel and can be thought of as the exit point, or lowest elevation point, of the basin: the outlet is downstream of the whole basin, and the whole basin is upstream of the outlet.
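Numerically, the outlet can be located as the cell with maximum flow accumulation; a toy NumPy sketch:

```python
import numpy as np

# Toy flow-accumulation raster for a small watershed.
flow_acc = np.array([[ 10,  40,   5],
                     [ 60, 300,  20],
                     [  8, 900,   2]])

# The outlet (pour point) lies on the highest flow-accumulation cell,
# since every other cell in the watershed drains through it.
row, col = np.unravel_index(np.argmax(flow_acc), flow_acc.shape)

print(row, col)  # -> 2 1
```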
Image 18
The above image shows the Indus basin classified into elevation zones, each 1000 m wide. The lowest elevation point in the basin is 948 m.
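The 1000 m zone classification can be sketched with NumPy (a toy illustration; the zone boundaries follow the table below, with zone 1 starting at the basin minimum of 948 m):

```python
import numpy as np

# Toy DEM values in metres; the basin spans roughly 948 m to 8572 m.
dem = np.array([950, 1500, 2600, 4586, 8203])

# Zone 1 covers 948-1000 m, zone 2 covers 1001-2000 m, and so on in
# 1000 m steps. np.digitize returns the bin index, so add 1 to get the
# zone number.
bins = [1001, 2001, 3001, 4001, 5001, 6001, 7001, 8001]
zones = np.digitize(dem, bins) + 1

print(zones)  # -> [1 2 3 5 9]
```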
ZONE  PIXELS   AREA(sq km)  MIN ELEV  MAX ELEV  RANGE  MEAN     STD     MEDIAN
1     321      16.98        948       1000      52     989.77   9.43    993
2     37943    2007.18      1001      2000      999    1616.34  264.62  1652
3     138910   7348.33      2001      3000      999    2578.02  278.48  2610
4     442473   23406.82     3001      4000      999    3584.40  279.50  3619
5     1409476  74561.28     4001      5000      999    4564.68  271.91  4586
6     1277074  67557.21     5001      6000      999    5394.74  256.30  5362
7     63025    3334.02      6001      7000      999    6210.16  212.02  6137
8     2012     106.43       7001      7997      996    7255.86  208.96  7199
9     16       0.84         8022      8572      550    8263.37  183.93  8203
TABLE 1
The above table is an analysis of the elevation zones of the Indus basin obtained using the Zonal Statistics tool. The Pixels column is the total number of pixels in the zone; the Area column is the total area of the zone in square kilometres; the Mean, Median and Std columns are the mean, median and standard deviation of the elevation within the zone.
It can be seen from the table that zones 5 and 6 occupy most of the area of the Indus basin.
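The Zonal Statistics computation behind Table 1 can be sketched with NumPy arrays standing in for the DEM and zone rasters (a toy illustration; real rasters would be processed in ArcGIS):

```python
import numpy as np

# Toy DEM and matching elevation-zone raster (same grid, same resolution).
dem   = np.array([ 950,  990, 1500, 1900, 2600, 2700])
zones = np.array([   1,    1,    2,    2,    3,    3])

# Per-zone pixel count, min, max, mean, std and median, as produced by
# the Zonal Statistics tool for Table 1.
for z in np.unique(zones):
    vals = dem[zones == z]
    print(z, vals.size, vals.min(), vals.max(),
          round(vals.mean(), 2), round(vals.std(), 2), np.median(vals))
```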
GRAPH 2
The above graph is a hypsometric graph of the Indus basin, with cumulative area (sq km) on the x-axis and elevation (from 948 m upwards) on the y-axis.
Image 19
Image 20
ZONE   PIXELS   AREA(sq km)  PIXELS WITH SNOW  SNOW COVER(sq km)
1      321      16.98        0                 0
2      37941    2007.07      0                 0
3      138907   7348.18      388               20.525
4      442451   23405.66     96489             5104.26
5      1408970  74534.51     591694            31300.61
6      1275673  67483.1      632157            33441.11
7      62936    3329.31      59093             3126.02
8      2012     106.43       1967              104.05
9      16       0.84         11                0.58
TOTAL                        1381799           73097.17
TABLE 3
Image 21
Image 22
ZONE   PIXELS   AREA(sq km)
1      321      16.98
2      37941    2007.07
3      138907   7348.18
4      442451   23405.66
5      1408970  74534.51
6      1275673  67483.1
7      62936    3329.31
8      2012     106.43
9      16       0.84
Image 23
Image 24
ZONE   PIXELS   AREA(sq km)  PIXELS WITH SNOW  SNOW COVER(sq km)
1      321      16.98        0                 0
2      37942    2007.13      0                 0
3      138910   7348.33      148               7.82
4      442473   23406.82     1668              88.23
5      1409459  74560.38     38349             2028.66
6      1277036  67555.2      198124            10480.76
7      63020    3333.75      42016             2222.64
8      2012     106.43       1521              80.46
9      16       0.84         12                0.63
TOTAL                        281838            14909.23
TABLE 7
Snow mapping of the Indus basin was done for six different dates. The image files obtained after delineating snow with ERDAS IMAGINE are overlaid on the DEM of the Indus basin and on the elevation zones. Zonal statistics of each snow map with respect to the elevation zone map are then calculated and analysed. Care must be taken that all the maps are in the same projection and have the same pixel size (resolution).
The area of snow cover has also been calculated zone-wise for every date, and the data has been analysed in tabular and graphical form.
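The zone-wise snow cover area computation can be sketched as follows. The per-pixel area used here is an assumption inferred from Table 1 (about 16.98/321 ≈ 0.0529 sq km per pixel), not a value stated in the report:

```python
import numpy as np

# Toy snow map (1 = snow) and elevation-zone raster on the same grid.
snow  = np.array([0, 1, 1, 0, 1, 1, 1])
zones = np.array([2, 3, 3, 5, 5, 6, 6])

# Assumed per-pixel area (sq km), inferred from Table 1 for illustration.
PIXEL_AREA_SQKM = 0.0529

# Count snow pixels per zone, then convert the counts to areas.
snow_pixels = np.bincount(zones, weights=snow, minlength=7)
snow_area = snow_pixels * PIXEL_AREA_SQKM

print(snow_pixels[2:])  # -> [0. 2. 0. 1. 2.]
```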
DATE    TOTAL SNOW COVER (sq km)
09-Mar  112520.6
18-Apr  73097.17
04-May  58998.22
05-Jun  44875.86
07-Jul  23754.01
08-Aug  14909.23
TABLE 8
GRAPH 3
The above graph plots the snow cover of each elevation zone against the date (x-axis from 15-Feb to 3-Sep).
5. CONCLUSION
As observed in the graphs and tables discussed above, the area of snow cover decreased drastically during May, June and July, which is consistent with these being the summer months. The snow cover in zone 9 did not follow this trend, because the zone lies at a high altitude of more than 8000 m above mean sea level and covers only a very small area.
REFERENCES:
1. http://www.ngdir.ir/Data_SD/GeoLab/Pics/GeoLabPic_1223_2.jpg
2. http://www.tankonyvtar.hu/en/tartalom/tamop425/0027_DAI6/images/DAI605.png
3. http://maxstudy.org/Chemistry/AP/2000px-EM_spectrum.svg_.png
4. http://www.csc.noaa.gov/products/gulfmex/img/lightdle.gif
5. http://tutor.nmmu.ac.za/uniGISRegisteredArea/Intake13/Remote%20Sensing%20and%20GI
S/reflect.gif
6. http://bfast.r-forge.r-project.org/seasonalbreak_TreeMort.jpg
7. http://www.sc.chula.ac.th/courseware/2309507/images/ch10_9.jpg
8. http://resources.arcgis.com/en/help/main/10.1/index.html#//00v200000005000000
9. http://www.nrcs.usda.gov/wps/portal/nrcs/detail/nh/technical/?cid=nrcs144p2_015680
10. http://modis.gsfc.nasa.gov/
11. https://lpdaac.usgs.gov/products/modis_products_table/mod09a1
12. Remote Sensing and Image Interpretation by T.M. Lillesand and R.W. Kiefer.
13. Remote Sensing and GIS Applications by P.S. Roy and R.S. Dwivedi.
14. http://foeme.files.wordpress.com/2012/12/map-of-the-indus-basin-source-us-senatereport.jpg
Bibliography:
1. Remote Sensing and Image Interpretation by T.M. Lillesand and R.W. Kiefer.
2. Remote Sensing and GIS Applications by P.S. Roy and R.S. Dwivedi.
3. Introductory Digital Image Processing by John R. Jensen.