Why Remote Sensing?
What is Remote Sensing?
Remote Sensing: Basic Principle
An Overview
Stages in Remote Sensing
Types of Remote Sensing
Electromagnetic Spectrum
Spectral Reflectance
Satellite Sensor Resolutions
Understanding Hyperspectral Remote Sensing
A note on Digital Satellite Images
Reading a Satellite Image
Digitally Reading a Satellite Image
Applications
Satellite Sensors
Miscellaneous
Cost Effective
Remote sensing, especially when conducted from space, is an intrinsically expensive activity. Nevertheless, cost-benefit analysis demonstrates its financial effectiveness, and much speculative or developmental remote sensing activity can be justified in this way. It is a cost-effective technique because repeated fieldwork is not required and a large number of users can share and use the same data.
An Overview
There are many interrelated processes involved in the acquisition of remotely sensed images, so an isolated focus on any single component produces a fragmented picture. It is necessary to identify these components to have a proper understanding of remote sensing. There are four essential components. The first component is the physical features, such as buildings, vegetation, soil, water and rocks. Knowledge of the physical features resides within specific disciplines such as geology, forestry, soil science, geography and urban planning. The second component is sensor data, formed as a sensing device views the physical features by recording electromagnetic signals emitted or reflected from the landscape. The effective use of sensor data requires analysis and interpretation to convert the data into useful information. These interpretations create the third component, extracted information, which consists of transformations of sensor data designed to reveal specific kinds of information. A more realistic view recognizes that the same sensor data can be examined from alternative perspectives to yield different interpretations. Therefore a single image can be interpreted to provide information about, for instance, vegetation, soils, rocks or water, depending on the specific data, the information required and the purpose of the analysis. The fourth component is the applications, in which the analyzed remote sensing data are combined with other data to address a specific practical problem, such as land use planning, mineral exploration or vegetation mapping.
interaction with the earth features. The kind of EMR which can be sensed by the device depends upon the amount of EMR and the sensor's capabilities. Data Transmission and Processing: The EMR recorded by the remote sensing device is transmitted to earth receiving and data processing stations. Here the EMR is transformed into interpretable output: digital or analogue images. Image Processing and Analysis: The digital satellite images are processed using specialized software meant for satellite image processing. The image processing and further analysis of satellite data leads to the extraction of the information required by users. Application: The extracted information is utilized to make decisions for solving particular problems. Thus remote sensing is a multi-disciplinary science which combines various disciplines such as optics, photography, computing, electronics, telecommunication and satellite launching.
this range; e.g., the bands of the IRS P6 LISS IV sensor are in the optical range of EMR. Thermal Remote Sensing: Sensors which operate in the thermal range of the electromagnetic spectrum record the energy emitted from earth features in the wavelength ranges of 3000 nm to 5000 nm and 8000 nm to 14000 nm. The former range is related to high-temperature phenomena such as forest fires, the latter to general earth features having lower temperatures. Hence thermal remote sensing is very useful for fire detection and thermal pollution studies; e.g., the last five bands of ASTER and band 6 of Landsat ETM+ operate in the thermal range. Microwave Remote Sensing: A microwave remote sensor records the backscattered microwaves in the wavelength range of 1 mm to 1 m of the electromagnetic spectrum. Most microwave sensors are active sensors having their own source of energy, e.g., RADARSAT. These sensors have an edge over other types of sensors, as they are independent of weather and solar radiation.
Electromagnetic Spectrum
The electromagnetic spectrum (EMS) represents the continuum of electromagnetic radiation (EMR) arranged on the basis of wavelength or frequency. It ranges from the shorter wavelengths (gamma rays and x-rays) to the longer wavelengths (microwaves and radio waves). Most common remote sensing systems operate in one or several of the visible, infrared and microwave portions of the electromagnetic spectrum. Within the infrared portion of the spectrum it should be noted that only thermal infrared energy is directly related to the sensation of heat; the near and mid infrared are not. Before discussing the EMS with reference to remote sensing it is important to understand it fully. The different radiations which constitute the EMS are as follows: Radio waves: These are the longest wavelength (lowest frequency) radiations of the EMS, with wavelengths of more than 100 cm. They pass through Earth's atmosphere easily. Radio signals are used in radios, televisions, aircraft, ships, etc. They are also emitted by stars.
Microwaves: Their wavelength ranges between 1 mm and 1 m. RADAR (Radio Detection And Ranging) is the most common device used in microwave remote sensing. Other applications are in cooking food (microwave ovens), broadcasting transmissions, etc. Infrared Radiation: When we feel hot it is because of infrared (IR) radiation; for common understanding we can call it 'heat'. The wavelength of IR is longer than visible light and shorter than microwaves, ranging approximately from 1 micron to 100 microns. These are very useful radiations for remote sensing. Thermal imaging systems detect objects by recording their temperature (infrared emissions). Visible Radiation: As the name suggests, these are the EMRs which are visible to our eyes as different colours. They range between 400 nm and 700 nm. Most remote sensing systems and cameras record images in this range. Ultraviolet (UV) Radiation: These radiations have wavelengths shorter than the violet end of visible light and longer than X-rays. UV radiation can be divided into near UV (400-200 nm), far UV (200-121 nm) and extreme UV (121-10 nm). X-rays: These are very short wavelength electromagnetic radiations, in the approximate range of 0.01 to 10 nanometers. In the EMS they fall between UV radiation and gamma rays. They are mostly used in the medical sciences. Gamma rays: Gamma rays are the electromagnetic radiations with the shortest wavelengths, in the range of 10^-11 m to 10^-14 m. Their very high energy can cause serious damage to living cells.
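The spectral regions described above can be summarized in a small lookup table. This is an illustrative sketch only: the boundary values follow the approximate figures in the text, and real region limits overlap and vary between sources.

```python
# Approximate EMS regions (name, lower bound in metres, upper bound in metres).
# Boundaries are illustrative, taken from the approximate figures in the text.
REGIONS = [
    ("gamma ray",   1e-14, 1e-11),
    ("x-ray",       1e-11, 1e-8),            # 0.01-10 nm
    ("ultraviolet", 1e-8,  4e-7),            # 10-400 nm
    ("visible",     4e-7,  7e-7),            # 400-700 nm
    ("infrared",    7e-7,  1e-3),            # up to 1 mm, including far IR
    ("microwave",   1e-3,  1.0),             # 1 mm - 1 m
    ("radio",       1.0,   float("inf")),    # > 100 cm
]

def ems_region(wavelength_m):
    """Return the EMS region name for a wavelength given in metres."""
    for name, lo, hi in REGIONS:
        if lo <= wavelength_m < hi:
            return name
    return "unknown"

print(ems_region(5.5e-7))  # green light -> visible
print(ems_region(0.03))    # a typical RADAR wavelength -> microwave
```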
Spectral Reflectance
The reflectance characteristics of earth surface features may be quantified by measuring the portion of incident energy that is reflected. This is measured as a function of wavelength (λ) and is called spectral reflectance, ρλ.
ρλ = ER(λ) / EI(λ), where ER is the reflected energy and EI is the incident energy. A graph of the spectral reflectance of an object as a function of wavelength is termed a spectral reflectance curve. Spectral Reflectance of Vegetation: The spectral characteristics of vegetation vary with wavelength. A compound in leaves called chlorophyll strongly absorbs radiation in the red and blue wavelengths but reflects green wavelengths. The internal structure of healthy leaves acts as a diffuse reflector of near-infrared wavelengths. Measuring and monitoring the near-infrared reflectance is one way that scientists determine how healthy particular vegetation may be. Leaves appear greenest to us in summer and become red or yellow with the decrease in chlorophyll content in autumn. Spectral Reflectance of Water: The majority of the radiation incident upon water is not reflected but is either absorbed or transmitted. Longer visible wavelengths and near-infrared radiation are absorbed more by water than the shorter visible wavelengths. Thus water looks blue or blue-green due to stronger reflectance at these shorter wavelengths, and darker if viewed at red or near-infrared wavelengths. The factors that affect the variability in reflectance of a water body are the depth of the water, the materials within the water and the surface roughness of the water. Spectral Reflectance of Soil: The majority of radiation incident on a soil surface is either reflected or absorbed, and little is transmitted. The characteristics of soil that determine its reflectance properties are its moisture content, texture, structure and iron-oxide content. The soil curve shows fewer peak and valley variations. The presence of moisture in soil decreases its reflectance. By measuring the energy reflected by targets on the earth's surface over a variety of different wavelengths, a spectral signature for each object can be built, and by comparing the response patterns of different features we may be able to distinguish between them.
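The reflectance ratio above can be sketched in a few lines of Python. The band names and energy values here are hypothetical illustrations, not measurements from any real sensor.

```python
# A minimal sketch of the spectral reflectance calculation described above:
# rho_lambda = E_R(lambda) / E_I(lambda), evaluated band by band.

def spectral_reflectance(reflected, incident):
    """Return the reflectance ratio for each wavelength band."""
    return {band: reflected[band] / incident[band] for band in incident}

# Hypothetical incident and reflected energy (arbitrary units) for three bands
# of a healthy-vegetation target: chlorophyll absorbs red, leaves reflect NIR.
incident  = {"green": 100.0, "red": 100.0, "nir": 100.0}
reflected = {"green": 15.0,  "red": 5.0,   "nir": 45.0}

rho = spectral_reflectance(reflected, incident)
print(rho)  # low red reflectance, high near-infrared reflectance
```

Plotting ρλ against wavelength for many narrow bands would give the spectral reflectance curve the text describes.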
Spatial resolution is the measure of the smallest object that can be detected by a satellite sensor. It represents the area covered by a pixel on the ground and is mostly measured in meters. For example, the CARTOSAT-1 sensor has a spatial resolution of 2.5x2.5 m, the IRS P6 LISS IV sensor has a spatial resolution of 5.6x5.6 m for its multispectral bands and LISS III has a spatial resolution of 23.5x23.5 m in its first three bands. The smaller the spatial resolution, the greater the resolving power of the sensor system. That is why one can detect even a car in a satellite image acquired by IKONOS (spatial resolution 1x1 m) but can hardly see even a village in a satellite image acquired by AVHRR (spatial resolution 1.1x1.1 km). Spectral resolution: Spectral resolution refers to the specific wavelength intervals in the electromagnetic spectrum for which a satellite sensor can record data. It can also be defined as the number and dimension of specific wavelength intervals in the electromagnetic spectrum to which a remote sensing instrument is sensitive. For example, band 1 of the Landsat TM sensor records energy between 0.45 and 0.52 µm in the visible part of the spectrum. Spectral channels covering wide intervals of the electromagnetic spectrum are referred to as coarse spectral resolution, and narrow intervals as fine spectral resolution. For instance, the SPOT panchromatic sensor is considered to have coarse spectral resolution because it records EMR between 0.51 and 0.73 µm. On the other hand, band 2 of the ASTER sensor has fine spectral resolution because it records EMR between 0.63 and 0.69 µm. Radiometric resolution: Radiometric resolution is defined as the sensitivity of a remote sensing detector to differences in signal strength as it records the radiant flux reflected or emitted from the terrain. It refers to the dynamic range, or number of possible data-file values, in each band.
This is expressed as the number of bits into which the recorded energy is divided. For instance, ASTER records data in 8 bits for its first nine bands, meaning the data-file values range from 0 to 255 for each pixel, while the radiometric resolution of LISS III is 7-bit, so the data-file values for each pixel range from 0 to 127. Temporal Resolution: The temporal resolution of a satellite system refers to how frequently it records imagery of a particular area. For example, CARTOSAT-1 can acquire images of the same area of the globe every 5 days, while LISS III does it every 24 days. The temporal resolution of a satellite sensor is very helpful in change detection. For instance, agricultural crops have unique crop calendars in
each geographic region. To measure specific agricultural variables it is necessary to acquire remotely sensed data at critical dates in the phenological cycle. Analysis of multiple-date imagery provides information on how the variables are changing through time. Multi-date satellite images are also used to detect change in forest cover.
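The relationship between radiometric resolution (bit depth) and the range of data-file values described above can be sketched as follows: an n-bit detector records 2^n discrete values.

```python
# Sketch: the dynamic range implied by a sensor's radiometric resolution.
# An n-bit detector can record 2**n discrete data-file values, i.e. 0 to 2**n - 1.

def dn_range(bits):
    """Return the (min, max) data-file value for an n-bit sensor."""
    return 0, 2 ** bits - 1

print(dn_range(8))   # 8-bit (e.g. ASTER VNIR/SWIR): (0, 255)
print(dn_range(7))   # 7-bit (e.g. LISS III): (0, 127)
print(dn_range(10))  # 10-bit (e.g. CARTOSAT-1): (0, 1023)
```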
stresses in the plants. The narrow bandwidth allows HSI to discriminate between plant species and vegetation types having very small differences in spectral reflectance. Hyperspectral remote sensing is becoming popular among botanists, plant biochemists, environmentalists, agriculturists and geologists. It is proving to be a good tool for studying plant physiology, canopy biochemistry, plant productivity and biomass, for detecting the health of plants and for vegetation mapping. This recent technique of remote sensing generates a large volume of data and hence requires a lot of space to store it. Hyperspectral image processing is different from multispectral processing and hence requires special tools for processing and analysis. A lot of expertise and skill are also needed to interpret data acquired from HSI instruments correctly and to get the desired results. Some hyperspectral remote sensing systems are as follows: Airborne Visible Infrared Imaging Spectrometer (AVIRIS): AVIRIS acquires images in 224 spectral bands which are 9.6 nm wide, in the 400 nm to 2500 nm region of the electromagnetic spectrum. Compact Airborne Spectrographic Imager (CASI): This imaging spectrometer collects data in 288 bands in the range between 400 nm and 1000 nm; the spectral interval of each band is 1.8 nm. Hyperspectral Mapping (HYMAP) System: This is an across-track hyperspectral imaging instrument. It collects data in 128 bands in the range of 400-2500 nm. Moderate Resolution Imaging Spectrometer (MODIS): This hyperspectral imaging sensor is one of the sensors on the TERRA satellite. It acquires data in 36 spectral bands and its spatial resolution ranges between 250 m and 1 km (to be precise: bands 1 & 2, 250m x 250m; bands 3 to 7, 500m x 500m; and bands 8 to 36, 1km x 1km).
Shape: The shape of ground objects and features is one of the most important elements in identifying them. For example, road and rail both appear as lines, but a road has sharp curves while a railway track has smooth curves. Natural water bodies have irregular shapes while most man-made water bodies have definite shapes (rectangular, circular, etc.). The same is true for natural drainage versus man-made canals. Size: Sizes are important in identifying many objects; e.g., on the basis of size we can differentiate between trees and bushes. Length, width, height (i.e., dimensions) and area provide clues for many objects. We can interpret in terms of absolute and relative sizes. When we talk about the absolute size of a feature we go for exact dimensions, while in relative size we look in terms of smaller or bigger. In a high resolution satellite image one can easily estimate the size of a building by comparing it with the size of a car parked next to it. Tone: This refers to the reflectance of features, which we see in the form of tones of colours in a satellite image. As discussed earlier, different objects appear
different in an image depending upon their spectral signatures. Hence this element gives firm evidence in the identification of many features. On the basis of tone we can differentiate between plant species, ages of plants, shallow & deep water bodies, dry & wet soil, crop types, etc. Texture: The texture of a ground feature refers to how tones vary in the image; in other words, how frequently the tone varies. Textures are often described as coarse or fine/smooth; e.g., young plants generally have a smooth texture while mature ones appear coarse-textured. Crops have a smoother texture than natural vegetation. The scale or resolution of the satellite image should be considered while interpretation is being done on the basis of texture, because low resolution images will show most features as smooth-textured, and beyond a certain scale we cannot differentiate between objects on texture alone. Pattern: The arrangement of objects also helps in image interpretation. Most man-made features show a definite pattern and hence can easily be differentiated from natural objects. For example, plantations have a definite arrangement of trees with a well defined pattern, while natural vegetation will not have a uniform pattern. Shadow: Shadows are both good and bad for image interpretation. They are good for studying relief and identifying hilly regions. Tall objects (like clock towers, overhead water tanks, etc.), which are sometimes difficult to locate, can easily be identified with the help of their shadows. Shadows are bad because they mask most of the features falling in their zone. They particularly create problems in hilly terrain, where hill shadows hide information about vegetation and many other features. Location/site: The site of an object helps in avoiding misinterpreting it as another similar-looking object. For example, the tones of two vegetation species may appear similar in an image, but their geographical location can help to identify them correctly. Association:
While interpreting an image we should always consider how a particular feature is associated with its surroundings. For example, one can identify a village by its small number of settlements, connecting roads, agricultural land in the adjoining area and often a water body. Whether we are experts or novices in image interpretation, we should always consider a number of elements before drawing conclusions about features. Considering only one element may lead to erroneous identification of objects.
Satellite images contain loads of information regarding the land area in their coverage. They are like mines holding tons of metal: you have to explore, process and convert them into finished products to make them usable and to get the desired information. Information extraction from satellite images requires basic knowledge of image interpretation, skills in image processing and compatible software which can convert data into information. We have already discussed the elements of visual image interpretation in detail in the section Reading a Satellite Image. With the advancement of technology, digital image processing has also advanced a lot and is being practiced very effectively for information extraction from satellite images. What we often miss in visual image interpretation due to the limitations of our eyes can be brought out using digital image processing methods. In digital image processing a number of algorithms are used to process satellite images. These algorithms digitally manipulate the raw images and convert them into the desired information. They are mostly used to emphasize and extract features of interest. For example, vegetation indices are applied for deriving valuable information regarding vegetation. Digital image processing, in simple words, is playing with the digital numbers (DN) of pixels. But we have to be masters of this play if we want to get the desired results. It is as easy and as difficult as playing with simple numbers. Image processing techniques are based on our day-to-day addition, subtraction, multiplication and division operators. A good knowledge of statistics is also required, because in many image processing
techniques statistics is very frequently applied. For example, Principal Component Analysis (PCA) of satellite images is a statistics-based process. PCA is widely used to extract useful information from multiple bands while filtering noise from the data. Vegetation Index: Vegetation indices are very frequently and commonly used for vegetation related studies. These indices are (mostly) based on 'ratioing' of the infrared and red bands. This is because vegetation reflects a large amount of EMR in the infrared region while absorbing EMR in the red region. Some of the vegetation indices are the ratio vegetation index (RVI), normalized difference vegetation index (NDVI) and transformed vegetation index (TVI). NDVI is particularly helpful in the analysis of vegetation health and vegetation cover density. Principal Component Analysis (PCA): PCA techniques are used to compress the information from a large number of bands into a smaller number of bands. Suppose we have to extract information from five bands of a Landsat ETM+ image. When we perform PCA, it will analyze all five bands and remove redundant information to provide output in the form of PCA images. The first PCA image will contain most of the information, and the information content will keep on decreasing in the second, third and subsequent images. In other words, PCA compresses multiple-band information into one or two images. Needless to say, this technique concentrates the variance of the data into the first few output images.
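The NDVI ratio and the PCA procedure described above can be sketched with NumPy. The tiny 3x3 "bands" below are made-up digital numbers, purely for illustration.

```python
import numpy as np

# Hypothetical red and near-infrared bands (digital numbers as floats).
red = np.array([[50, 40, 60], [30, 80, 70], [20, 25, 90]], dtype=float)
nir = np.array([[200, 180, 90], [150, 90, 80], [160, 170, 95]], dtype=float)

# NDVI = (NIR - Red) / (NIR + Red); healthy vegetation gives values near +1.
ndvi = (nir - red) / (nir + red)
print(ndvi.round(2))

# PCA: treat each band as a variable and each pixel as an observation,
# then project the pixels onto the eigenvectors of the covariance matrix.
# The first component captures the most variance.
pixels = np.stack([red.ravel(), nir.ravel()], axis=1)   # shape (9, 2)
centered = pixels - pixels.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)                  # ascending eigenvalues
order = np.argsort(eigvals)[::-1]                       # sort by variance, descending
pc_images = (centered @ eigvecs[:, order]).T.reshape(2, 3, 3)
print(pc_images[0])  # first principal-component "image"
```

Real image-processing packages wrap exactly this kind of arithmetic; the point is that both techniques reduce to simple operations on the DN arrays.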
Applications
Applications of Remote Sensing
Forestry & Ecosystem
Forest cover & density mapping
Deforestation mapping
Forest fire mapping
Wetland mapping and monitoring
Biomass estimation
Species inventory
Agriculture
Crop type classification
Crop condition assessment
Crop yield estimation
Mapping of soil characteristics
Soil moisture estimation
Geology
Lithological mapping
Mineral exploration
Environmental geology
Sedimentation mapping and monitoring
Geo-hazard mapping
Glacier mapping
Hydrology
Watershed mapping & management
Flood delineation and mapping
Ground water targeting
Urban Planning
Land parcel mapping
Infrastructure mapping
Land use change detection
Future urban expansion planning
Ocean applications
Storm forecasting
Water quality monitoring
Aquaculture inventory and monitoring
Navigation routing
Coastal vegetation mapping
Oil spill detection
Urban areas behave as heat islands due to very high emission of thermal radiation. Their temperature remains higher than that of surrounding non-urban areas, basically due to high energy consumption, the very large surface area covered by asphalt and concrete, fewer soil surfaces and low vegetation cover. Energy is mostly consumed in the form of electricity, and in the process a large amount of heat is released, while concrete and asphalt surfaces (like buildings, roads, pavements, etc.) absorb solar radiation and later release thermal emissions. Vehicular and industrial pollution in cities also contributes to heat emissions and so to increasing urban ambient temperature. Thermal imaging devices help in detecting the relative warmness or coolness of urban islands. High energy consumption areas and hot spots can be mapped using thermal remote sensing images. As we all know, vegetation reduces ambient temperature significantly, so thermal imaging can help in identifying urban areas where plantation is required to cool down the hot spots.
Satellite Sensors
Popular Remote Sensing Systems
LANDSAT: Landsat satellite sensors are among the most popular remote sensing systems, and the imagery acquired from them is widely used across the globe. NASA's Landsat satellite programme was started in 1972; it was formerly known as the ERTS (Earth Resource Technology Satellite) programme. The first satellite in the Landsat series, Landsat-1 (formerly ERTS-1), was launched on July 23, 1972. Since then five different types of sensors have been included in various combinations in the Landsat missions from Landsat-1 through Landsat-7. These sensors are the Return Beam Vidicon (RBV), the Multispectral Scanner (MSS), the Thematic Mapper (TM), the Enhanced Thematic Mapper (ETM) and the Enhanced Thematic Mapper Plus (ETM+). Landsat ETM (or Landsat 6) was launched in 1993 but could not achieve orbit. Six years later, in 1999, Landsat ETM+ (or Landsat 7) was launched, and it is the most recent in the series. Landsat ETM+ contains four bands in the visible-near infrared (VIS-NIR) region with 30mx30m spatial resolution, two bands in the short wave infrared (SWIR) region with the same resolution, one in the thermal infrared (TIR) region with a spatial resolution of 60mx60m and one panchromatic band with 15mx15m resolution. Its revisit period is 16 days. SPOT: SPOT (Système Pour l'Observation de la Terre) was developed by the French Centre National d'Études Spatiales with Belgium and Sweden. The first satellite of the SPOT mission, SPOT-1, was launched in 1986. It was followed by SPOT-2 (in 1990), SPOT-3 (in 1993), SPOT-4 (in 1998) and SPOT-5 (in 2002). There are two imaging systems on SPOT-5: HRVIR and Vegetation. The HRVIR records data in three bands in the VIS-NIR region with 10mx10m spatial resolution, one band in the SWIR region with 20mx20m spatial resolution and one panchromatic band with 5mx5m resolution. The Vegetation instrument is primarily designed for vegetation monitoring and related studies. It acquires images in three bands in the VIS-NIR region and one band in the SWIR region (all with 1000mx1000m spatial resolution).
Advanced Very High Resolution Radiometer (AVHRR): Several generations of satellites have been flown in the NOAA-AVHRR series; NOAA-15 is the most recent in the series. The AVHRR sensor contains five spectral channels: two in the VIS-NIR region and three in the TIR. One thermal band, covering the wavelength range 3.55-3.93 µm, is meant for fire detection. The spatial resolution of AVHRR is 1100mx1100m. NOAA-AVHRR mainly serves for global vegetation mapping, monitoring land cover changes and agriculture related studies, with daily coverage.
Indian Remote Sensing (IRS) Satellites: The Indian Remote Sensing programme began with the launch of IRS-1A in 1988. After that, IRS-1B (1991), IRS-1C (1995) and IRS-1D (1997) were launched. IRS-1D carries three sensors: LISS III, with three bands of 23.5mx23.5m spatial resolution in the VIS-NIR range and one band in the SWIR region with 70.5x70.5 m resolution; a panchromatic sensor with 5.8mx5.8m resolution; and a Wide Field Sensor (WiFS) with 188mx188m resolution. WiFS is extensively used for vegetation related studies. ISRO's IRS-P6 (RESOURCESAT-1), launched in 2003, is a very advanced remote sensing system. It carries the high resolution LISS IV camera (three spectral bands in the VIS-NIR region) with a spatial resolution of 5.8mx5.8m, which has the capability to provide stereoscopic imagery. The IRS-P6 LISS III camera acquires images in the VIS-NIR (3 spectral bands) and SWIR (one spectral band) regions with a spatial resolution of 23.5mx23.5m. IRS-P6 AWiFS (Advanced Wide Field Sensor) operates in the VIS-NIR (3 spectral bands) and SWIR (one spectral band) regions with a spatial resolution of 56mx56m.
CARTOSAT is one of the most advanced remote sensing systems launched by the Indian Space Research Organization (ISRO) so far. In the history of remote sensing it is a milestone achieved by India. The main purpose of this mission is to provide large-scale, high resolution satellite imagery which can be used efficiently for cartographic purposes. These images are particularly useful for urban planners in urban mapping and planning. To date two satellites have been launched in the CARTOSAT series: CARTOSAT-1 and CARTOSAT-2. CARTOSAT-1: This is the first in the CARTOSAT series and was launched in May 2005 by the satellite launch vehicle PSLV-C6. It operates in panchromatic mode (black & white) in the EMR range of 0.50 micron to 0.85 micron. It carries two cameras to generate stereoscopic images. It provides a spatial resolution of 2.5mx2.5m and a radiometric resolution of 10 bits. Its revisit period is 5 days. The stereoscopic capability of CARTOSAT-1 gives it an edge over many other remote sensing systems internationally. We can generate Digital Elevation Models (DEMs) and 3-D maps using its stereo images. Apart from cartographic applications, CARTOSAT-1 images are useful in resource
management, and pre- & post-disaster planning & management. CARTOSAT-2: The second satellite of the CARTOSAT series was launched in January 2007. It also operates in panchromatic mode, with a spectral band of 0.50-0.85 micron. It is a very high resolution sensor system with less than one meter spatial resolution. Its revisit period is 4 days. It will be highly useful for urban and rural planning and in cadastral level studies. It has already started acquiring images successfully, and ISRO recently released some sample images acquired by CARTOSAT-2.
Miscellaneous
Before Selecting a Satellite Image!!
The choice of satellite imagery very much depends on the purpose of the user, in other words on what kind of information we want. There are certain things which should always be kept in mind while procuring a satellite image:
Time of the year
Cloud coverage
Spatial resolution
Spectral resolution
Financial constraints
Time of the Year: Time is a very important factor in deciding whether to pick a satellite image or not. For example, if you want to study deciduous vegetation you must not go for an image acquired by the sensor in autumn. Or if you want to map water holes and have procured summer-time imagery, it will be disastrous. One should not conclude that autumn-time images are useless; on the contrary, they are very useful if evergreen vegetation is to be studied and mapped. In fact these images are helpful in differentiating deciduous and evergreen vegetation types. Cloud Cover: Cloud is also an important factor, as clouds may mask the ground features partially or completely. That's why rainy season images are not preferred (unless one wants to do some specific study for this season). If you are lucky you can get a cloud-free image in the rainy season too. Sometimes winter and summer time images may also contain clouds. So always see a preview of the images before procuring them, and avoid images having more than 10% cloud coverage. Clouds are the main constraint in various types of land use/land cover studies for North-East India and the Andaman & Nicobar Islands, as most of the time these areas remain covered with clouds. Spatial & Spectral Resolution
These resolutions are important in the sense of what kind of study you want to do. If you want satellite images for cartographic purposes then high resolution panchromatic images are good (e.g., CARTOSAT-1 & 2 images, IKONOS panchromatic images, etc.). On the other hand, for forest mapping, multispectral images are required, which may be lower in spatial resolution, like Landsat ETM+ images or IRS-1D LISS III images. In some cases, when built-up areas & vegetation types are to be mapped together, multispectral images with high spatial resolution are used (e.g., IRS-P6 MX LISS IV images). Financial Constraints: If you have selected a cloud-free satellite image for the right season with the right resolution, then you must look at your pocket as well: whether you have sufficient money or not. Going for high resolution images is definitely going to cost much more. So where a Landsat ETM+ or IRS-1D image can work, don't go for IKONOS multispectral images. Also, processed images cost more. If you have good image processing software and a skilled team then don't buy processed images, as the processing can be done in your lab itself. If you can spend some extra pennies, then procuring georeferenced images from your vendor (like NRSA, Space Imaging, etc.) can be a good idea, as it will save your time and you also get data with acceptable accuracy.
1903: Airplane invented by the Wright brothers.
1909: Photography from airplanes.
1910s: Aerial photo reconnaissance: World War I.
1920s: Civilian use of aerial photography and photogrammetry.
1934: American Society of Photogrammetry founded.
1935: Radar invented by Robert Watson-Watt.
1939-45: Advances in photo reconnaissance and applications of the non-visible portion of EMR: World War II.
1942: Kodak patents first false colour infrared film.
1956: Colwell's research on disease detection with IR photography.
1960: Term 'Remote Sensing' coined by Office of Naval Research personnel.
1972: ERTS-1 launched (renamed Landsat-1).
1975: ERTS-2 launched (renamed Landsat-2).
1978: Landsat-3 launched.
1980s: Development of hyperspectral sensors.
1982: Landsat-4 (TM & MSS) launched.
1984: Landsat-5 (TM) launched.
1986: SPOT-1 launched.
1995: IRS-1C launched.
1999: Landsat-7 (ETM+) launched.
1999: IKONOS launched.
1999: NASA's Terra EOS launched.
2002: ENVISAT launched.
2003: ISRO's RESOURCESAT-1 (IRS P6) launched.
2005: ISRO's CARTOSAT-1 launched.
2007: ISRO's CARTOSAT-2 launched.