
Image Processing and Analysis

Image Processing and Analysis can be defined as the "act of examining images for the purpose of identifying objects and judging their significance". Image analysts study the remotely sensed data and attempt, through a logical process, to detect, identify, classify, measure and evaluate the significance of physical and cultural objects, their patterns and spatial relationships.

Digital Data

In the most generalized way, a digital image is an array of numbers depicting the spatial distribution of a certain field parameter, such as reflectivity of EM radiation, emissivity, temperature, or some geophysical or topographical elevation. A digital image consists of discrete picture elements called pixels. Associated with each pixel is a number, its DN (Digital Number), that depicts the average radiance of a relatively small area within a scene; the range of DN values is normally 0 to 255. The size of this area affects the reproduction of detail within the scene: as the pixel size is reduced, more scene detail is preserved in the digital representation.

Remote sensing images are recorded in digital form and then processed by computers to produce images for interpretation purposes. Images are available in two forms - photographic film form and digital form. Variations in the scene characteristics are represented as variations in brightness on photographic films: a part of the scene reflecting more energy will appear bright, while a different part of the same scene reflecting less energy will appear black.

Data Formats for Digital Satellite Imagery

Digital data from the various satellite systems are supplied to the user in the form of computer-readable tapes or CD-ROM.
No worldwide standard for the storage and transfer of remotely sensed data has been agreed upon, though the CEOS (Committee on Earth Observation Satellites) format is becoming accepted as the standard. Digital remote sensing data are often organised using one of three common formats. Consider, for instance, an image consisting of four spectral channels, which can be visualised as four superimposed images, with corresponding pixels in one band registering exactly to those in the other bands. The common formats are:

- Band Interleaved by Pixel (BIP)
- Band Interleaved by Line (BIL)
- Band Sequential (BSQ)

Digital image analysis is usually conducted using raster data structures - each image is treated as an array of values. This offers advantages for the manipulation of pixel values by an image processing system, as it is easy to find and locate pixels and their values. Disadvantages become apparent when one needs to represent the array of pixels as discrete patches or regions, whereas vector data structures use polygonal patches and their boundaries as the fundamental units for analysis and manipulation. The vector format is, however, not appropriate for digital analysis of remotely sensed data.

Image Resolution

Resolution can be defined as "the ability of an imaging system to record fine details in a distinguishable manner". A working knowledge of resolution is essential for understanding both practical and conceptual details of remote sensing. Along with the actual positioning of spectral bands, the resolutions are of paramount importance in determining the suitability of remotely sensed data for a given application. The major characteristics of an imaging remote sensing instrument operating in the visible and infrared spectral region are described in the following terms:

- Spectral resolution
- Radiometric resolution
- Spatial resolution
- Temporal resolution

Spectral Resolution refers to the width of the spectral bands. Different materials on the earth's surface exhibit different spectral reflectances and emissivities; these spectral characteristics define the spectral position and spectral sensitivity needed to distinguish materials. There is a tradeoff between spectral resolution and signal-to-noise ratio. The use of well-chosen and sufficiently numerous spectral bands is therefore a necessity if different targets are to be successfully identified on remotely sensed images.

Radiometric Resolution or radiometric sensitivity refers to the number of digital levels used to express the data collected by the sensor. It is commonly expressed as the number of bits (binary digits) needed to store the maximum level. For example, Landsat TM data are quantised to 256 levels (equivalent to 8 bits). Here also there is a tradeoff between radiometric resolution and signal-to-noise ratio: there is no point in having a step size less than the noise level in the data. A low-quality instrument with a high noise level would necessarily, therefore, have a lower radiometric resolution than a high-quality, high signal-to-noise-ratio instrument. Higher radiometric resolution may also conflict with data storage and transmission rates.

Spatial Resolution of an imaging system is defined through various criteria: the geometric properties of the imaging system, the ability to distinguish between point targets, the ability to measure the periodicity of repetitive targets, and the ability to measure the spectral properties of small targets. The most commonly quoted quantity is the instantaneous field of view (IFOV), which is the angle subtended by the geometrical projection of a single detector element onto the Earth's surface.
It may also be given as the distance, D, measured along the ground, in which case the IFOV is clearly dependent on sensor height, from the relation D = hβ, where h is the height and β is the angular IFOV in radians. An alternative measure of the IFOV is based on the PSF (point spread function), e.g., the width of the PSF at half its maximum value.

A problem with the IFOV definition, however, is that it is a purely geometric definition and does not take into account the spectral properties of the target. The effective resolution element (ERE) has been defined as "the size of an area for which a single radiance value can be assigned with reasonable assurance that the response is within 5% of the value representing the actual relative radiance". Being based on actual image data, this quantity may be more useful in some situations than the IFOV.

Other methods of defining the spatial resolving power of a sensor are based on the ability of the device to distinguish between specified targets. One of these concerns the ratio of the modulation of the image to that of the real target. Modulation, M, is defined as:

M = (Emax - Emin) / (Emax + Emin)

where Emax and Emin are the maximum and minimum radiance values recorded over the image.

Temporal Resolution refers to the frequency with which images of a given geographic location can be acquired. Satellites not only offer the best chances of frequent data coverage but also of regular coverage. The temporal resolution is determined by the orbital characteristics and the swath width, the width of the imaged area. Swath width is given by 2h tan(FOV/2), where h is the altitude of the sensor and FOV is the angular field of view of the sensor.

How to Improve Your Image

Analysis of remotely sensed data is done using various image processing techniques and methods, which include:

- Analog image processing
- Digital image processing

Visual or Analog processing techniques are applied to hard copy data such as photographs or printouts. Image analysis by visual techniques adopts certain elements of interpretation, listed below. The use of these fundamental elements depends not only on the area being studied, but on the knowledge the analyst has of the study area. For example, the texture of an object is very useful in distinguishing objects that may appear the same when judged solely on tone: water and tree canopy may have the same mean brightness values, but their textures are quite different. Association is a very powerful image analysis tool when coupled with general knowledge of the site. Thus we are adept at applying collateral data and personal knowledge to the task of image processing. The combination of the multi-concept of examining remotely sensed data (multispectral, multitemporal, multiscale and in conjunction with multidisciplinary knowledge) allows us to reach a verdict not only as to what an object is, but also as to its importance. Apart from these, analog image processing techniques also include optical photogrammetric techniques allowing precise measurement of the height, width, location, etc. of an object.

Elements of Image Interpretation

- Primary Elements: Black and White Tone, Color, Stereoscopic Parallax
- Spatial Arrangement of Tone and Color: Size, Shape, Texture, Pattern
- Based on Analysis of Primary Elements: Height, Shadow
- Contextual Elements: Site, Association

Digital Image Processing is a collection of techniques for the manipulation of digital images by computers. The raw data received from the imaging sensors on satellite platforms contain flaws and deficiencies. To overcome these and recover the original character of the data, the image needs to undergo several steps of processing. These will vary from image to image depending on the type of image format, the initial condition of the image, the information of interest and the composition of the image scene. Digital Image Processing involves three general steps:

- Pre-processing
- Display and enhancement
- Information extraction

Pre-processing consists of those operations that prepare data for subsequent analysis, attempting to correct or compensate for systematic errors. The digital imagery is subjected to several corrections, such as geometric, radiometric and atmospheric, though all of these corrections may not necessarily be applied in every case. These errors are systematic and can be removed before they reach the user. The investigator should decide which pre-processing techniques are relevant on the basis of the nature of the information to be extracted from the remotely sensed data. After pre-processing is complete, the analyst may use feature extraction to reduce the dimensionality of the data. Feature extraction is thus the process of isolating the most useful components of the data for further study while discarding the less useful aspects (errors, noise etc.). Feature extraction reduces the number of variables that must be examined, thereby saving time and resources.

Image Enhancement operations are carried out to improve the interpretability of the image by increasing the apparent contrast among various features in the scene. The enhancement techniques depend mainly upon two factors:

- The digital data (i.e. the spectral bands and resolution)
- The objectives of interpretation

As an image enhancement technique often drastically alters the original numeric data, it is normally used only for visual (manual) interpretation and not for further numeric analysis. Common enhancements include image reduction, image rectification, image magnification, transect extraction, contrast adjustments, band ratioing, spatial filtering, Fourier transformations, principal component analysis and texture transformation.

Information Extraction is the last step toward the final output of the image analysis. After pre-processing and image enhancement, the remotely sensed data is subjected to quantitative analysis to assign individual pixels to specific classes. Classification of the image uses areas of known identity to classify the remainder of the image, consisting of those pixels of unknown identity. After classification is complete, it is necessary to evaluate its accuracy by comparing the categories on the classified images with areas of known identity on the ground. The final result of the analysis consists of maps (or images), data and a report. These three components provide the user with full information concerning the source data, the method of analysis, and the outcome and its reliability.
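The accuracy evaluation described above - comparing classified pixels against areas of known identity - is usually summarised in a confusion matrix. A minimal sketch, with entirely hypothetical class labels and counts:

```python
import numpy as np

def confusion_matrix(truth, predicted, n_classes):
    """Cross-tabulate ground-truth class labels against classified labels."""
    m = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(truth.ravel(), predicted.ravel()):
        m[t, p] += 1
    return m

def overall_accuracy(matrix):
    """Fraction of pixels whose classified label matches the ground truth."""
    return matrix.trace() / matrix.sum()

# Hypothetical 3-class check: 0 = water, 1 = vegetation, 2 = soil.
truth = np.array([0, 0, 1, 1, 2, 2, 2, 1])
pred = np.array([0, 1, 1, 1, 2, 2, 0, 1])
m = confusion_matrix(truth, pred, 3)
print(m)
print(overall_accuracy(m))  # 6 of 8 pixels correct -> 0.75
```

The diagonal holds correctly classified pixels; off-diagonal entries show which classes are being confused with which.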
Pre-processing of Remotely Sensed Images

When remotely sensed data is received from the imaging sensors on satellite platforms it contains flaws and deficiencies. Pre-processing refers to those operations that are preliminary to the main analysis. Pre-processing includes a wide range of operations, from the very simple to extremes of abstractness and complexity. These are categorized as follows:

1. Feature Extraction
2. Radiometric Corrections
3. Geometric Corrections
4. Atmospheric Correction

These techniques involve the removal of unwanted and distracting elements such as image/system noise, atmospheric interference and sensor motion from the image data, which arise from limitations in the sensing of the signal, in digitization, or in the data recording or transmission process. Data from which these effects have been removed are said to be "restored" to their correct or original condition, although we can, of course, never know what the correct values might be, and must always remember that attempts to correct data may themselves introduce errors. Thus image restoration includes the efforts to correct for both radiometric and geometric errors.

Feature Extraction

Feature Extraction does not mean the extraction of geographical features visible on the image, but rather of "statistical" characteristics of the image data: individual bands, or combinations of band values, that carry information concerning systematic variation within the scene. Thus, in multispectral data it helps in portraying the essential elements of the image. It also reduces the number of spectral bands that have to be analyzed. After feature extraction is complete, the analyst can work with the desired channels or bands, which in turn are more potent carriers of information. Finally, such pre-processing increases the speed and reduces the cost of analysis.
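One standard statistical route to the band combinations described above is principal component analysis. A minimal sketch on synthetic correlated bands (the data and component count are illustrative, not from any particular sensor):

```python
import numpy as np

def pca_features(pixels, n_components):
    """Project pixels (n_samples x n_bands) onto the top principal components.

    The components are eigenvectors of the band covariance matrix, ordered by
    the variance they explain, so a few of them capture most of the
    systematic variation across correlated bands.
    """
    centered = pixels - pixels.mean(axis=0)
    cov = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1]       # sort to descending variance
    return centered @ eigvecs[:, order[:n_components]]

# Synthetic 4-band data: the bands are strongly correlated, so one or two
# components carry nearly all the variation.
rng = np.random.default_rng(0)
base = rng.normal(size=(100, 1))
bands = np.hstack([base * g + rng.normal(scale=0.05, size=(100, 1))
                   for g in (1.0, 0.9, 1.1, 0.8)])
features = pca_features(bands, 2)
print(features.shape)  # (100, 2)
```

The first component holds the most variance, so reducing four bands to one or two features loses little information here.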

Radiometric Corrections

When image data is recorded by the sensors it contains errors in the measured brightness values of the pixels. These errors are referred to as radiometric errors and can result from:

1. The instruments used to record the data
2. The effect of the atmosphere

Radiometric processing influences the brightness values of an image to correct for sensor malfunctions or to adjust the values to compensate for atmospheric degradation. Radiometric distortion can be of two types:

1. The relative distribution of brightness over an image in a given band can be different from that in the ground scene.
2. The relative brightness of a single pixel from band to band can be distorted compared with the spectral reflectance character of the corresponding region on the ground.

The following methods outline the basis of the cosmetic operations for the removal of such defects:

Line Dropouts

A string of adjacent pixels in a scan line may contain spurious DN. This can occur when a detector malfunctions permanently or temporarily, for example when a detector is overloaded by receiving sudden high radiance, creating a line or partial line of data with meaningless DN. Line dropouts are usually corrected either by replacing the defective line with a duplicate of the preceding or subsequent line, or by taking the average of the two. If the spurious pixel at sample x, line y has a value DN(x,y), the algorithms are simply:

DN(x,y) = DN(x,y-1)
DN(x,y) = (DN(x,y-1) + DN(x,y+1)) / 2

De-Striping

Banding or striping occurs if one or more detectors go out of adjustment in a given band. The systematic horizontal banding pattern seen on images produced by electro-mechanical scanners such as Landsat's MSS and TM results in a repeated pattern of lines with consistently high or low DN. Two reasons can be put forward in favor of applying a de-striping correction:

1. The visual appearance and interpretability of the image are thereby improved.
2. Equal pixel values in the image are more likely to represent areas of equal ground-leaving radiance, other things being equal.

The two different methods of de-striping are as follows:

The first method entails the construction of histograms for each detector of the problem band, i.e., histograms generated from the six detectors: these are calculated for lines 1, 7, 13, ..., lines 2, 8, 14, ..., etc. Then the means and standard deviations are calculated for each of the six histograms. Assuming the proportions of pixels representing different soils, water, vegetation, cloud, etc. are the same for each detector, the means and standard deviations of the six histograms should be the same. Striped detectors, however, are characterised by distinct histograms. De-striping then requires equalisation of the means and standard deviations of the six detectors by forcing them to equal selected values - usually the mean and standard deviation of the whole image.

The process of histogram matching is also utilised before mosaicking image data of adjacent scenes (recorded at different times) so as to accommodate differences in illumination levels, angles etc. A further application is resolution merging, in which a low spatial resolution image is sharpened by merging it with a high spatial resolution image.

The second method is non-linear in the sense that the relationship between the radiance rin (received at the detector) and rout (output by the sensor) is not describable in terms of a single linear segment.
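The dropout-averaging formula and the first (linear) de-striping method above can be sketched as follows, assuming the image is a NumPy array and that, as for MSS/TM, detectors cycle line by line:

```python
import numpy as np

def fix_line_dropout(img, bad_line):
    """Replace a dropped scan line with the average of its neighbours:
    DN(x, y) = (DN(x, y-1) + DN(x, y+1)) / 2."""
    out = img.astype(float).copy()
    out[bad_line] = (out[bad_line - 1] + out[bad_line + 1]) / 2.0
    return out

def destripe(img, n_detectors=6):
    """Linear de-striping: force the lines of each detector (every n-th scan
    line) to the mean and standard deviation of the whole image."""
    out = img.astype(float).copy()
    target_mean, target_std = out.mean(), out.std()
    for d in range(n_detectors):
        lines = out[d::n_detectors]
        m, s = lines.mean(), lines.std()
        if s > 0:
            out[d::n_detectors] = (lines - m) / s * target_std + target_mean
    return out

# Illustrative striped scene: every 6th line (one bad detector) reads 25 DN high.
rng = np.random.default_rng(1)
scene = rng.normal(100, 10, size=(60, 40))
scene[0::6] += 25.0
clean = destripe(scene)
print(abs(clean[0::6].mean() - clean.mean()) < 1e-6)  # True: stripe removed
```

After de-striping, each detector's lines share the image-wide mean and standard deviation, so the systematic banding disappears while scene detail within each line is preserved.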

Random Noise

Odd pixels with spurious DN crop up frequently in images; if they are particularly distracting, they can be suppressed by spatial filtering. By definition, these defects can be identified by their marked difference in DN from adjacent pixels in the affected band. Noisy pixels can be replaced by substituting an average value of the neighborhood DN. Moving windows of 3 x 3 or 5 x 5 pixels are typically used in such procedures.

Geometric Corrections

Raw digital images often contain serious geometric distortions that arise from earth curvature, platform motion, relief displacement and non-linearities in the scanning motion. The distortions involved are of two types:

1. Non-systematic Distortions
2. Systematic Distortions

Rectification is the process of projecting image data onto a plane and making it conform to a map projection system. Registration is the process of making image data conform to another image; a map coordinate system is not necessarily involved. Rectification involves rearrangement of the input pixels onto a new grid which conforms to the desired map projection and coordinate system. Rectification and registration therefore involve similar sets of procedures for both types of distortion.

Non-Systematic Distortions

These distortions are caused by variations in spacecraft variables. They can be evaluated as follows:

Distortion Evaluated from Tracking Data (Fig. 1): The amount of earth rotation during the roughly 26 seconds required to scan an image results in distortion. The correction for this distortion can be done by scanning 16 successive groups of lines, offset towards the west to compensate for the earth rotation, which produces the parallelogram outline of the restored image. This holds for TM images. (Fig. 2)

Distortion Evaluated from Ground Control: Caused during the spacecraft's scan of the ground:
- Altitude Variation (Fig. 3)
- Attitude Variation - pitch, roll and yaw (Fig. 4)

Correction Process for Non-systematic Distortions

1. Locating Ground Control Points: This process employs the identification of geographic features on the image, called ground control points (GCPs), whose positions are known, such as intersections of streams, highways, airport runways etc. The longitude and latitude of GCPs can be determined from accurate base maps; where maps are lacking, GPS is used to determine latitude and longitude from navigation satellites. Thus a GCP is located in the field and its position determined using GPS. Accurate GCPs are essential for accurate rectification. GCPs should be:
- Reliably matched between source and reference (e.g., coastline features, road intersections, etc.)
- Widely dispersed throughout the source image

2. Resampling Methods: The locations of output pixels derived from the ground control points (GCPs) are used to establish the geometry of the output image and its relationship to the input image. Differences between the actual GCP locations and their positions in the image are used to determine the geometric transformation required to restore the image. This transformation can be done by different resampling methods, in which the original pixels are resampled to match the new geometric coordinates. Each resampling

method employs a different strategy to estimate values on the output grid from the known values of the input grid.

Nearest Neighbor: The simplest strategy is to assign each corrected pixel the value of the nearest uncorrected pixel. It has the advantages of simplicity and the ability to preserve the original values in the altered scene, but it may create noticeable errors, which may be severe in linear features where the realignment of pixels is obvious. (Fig. 5)

Bilinear Interpolation: Each output pixel value is calculated as a weighted average of the four nearest input pixels. The output image has a natural look because each output value is based on several input values. Some changes do occur when bilinear interpolation creates new pixel values (Fig. 6):
- Brightness values in the input image are lost
- As the output image is resampled by averaging over areas, the spatial resolution of the image is decreased

Cubic Convolution: This is the most sophisticated and complex method of resampling. Cubic convolution uses a weighted average of values within a neighborhood of 25 adjacent pixels. The images produced by this method are generally more attractive, but are more drastically altered than with nearest neighbor or bilinear interpolation. (Fig. 7)

Image Correction Using Mapping Polynomials: Polynomial equations are used to convert the source coordinates to rectified coordinates, using 1st- and 2nd-order transformations. The coefficients of the polynomials, ai and bi, are calculated by the least squares regression method, which relates any point in the map to its corresponding point in the image:

x0 = b1 + b2·xi + b3·yi
y0 = a1 + a2·xi + a3·yi

where (xi, yi) are the input coordinates and (x0, y0) are the output coordinates.
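The 1st-order mapping polynomials and nearest-neighbour resampling can be sketched as follows; the GCP coordinates are invented for illustration (a pure shift of +2, +3), and the fit uses ordinary least squares:

```python
import numpy as np

def fit_first_order(src_xy, ref_xy):
    """Fit x0 = b1 + b2*x + b3*y and y0 = a1 + a2*x + a3*y to GCP pairs by
    least squares; returns both coefficient triples and the RMS error."""
    src = np.asarray(src_xy, float)
    ref = np.asarray(ref_xy, float)
    A = np.column_stack([np.ones(len(src)), src[:, 0], src[:, 1]])
    bx, *_ = np.linalg.lstsq(A, ref[:, 0], rcond=None)
    by, *_ = np.linalg.lstsq(A, ref[:, 1], rcond=None)
    pred = np.column_stack([A @ bx, A @ by])
    rms = np.sqrt(np.mean(np.sum((pred - ref) ** 2, axis=1)))
    return bx, by, rms

def resample_nearest(img, inverse_map, out_shape):
    """Nearest-neighbour resampling: each output pixel takes the value of the
    nearest input pixel under an inverse mapping (output -> input coords)."""
    rows, cols = np.indices(out_shape)
    in_r, in_c = inverse_map(rows, cols)
    in_r = np.clip(np.rint(in_r).astype(int), 0, img.shape[0] - 1)
    in_c = np.clip(np.rint(in_c).astype(int), 0, img.shape[1] - 1)
    return img[in_r, in_c]

# Four GCPs related by a shift of (+2, +3); the fit recovers it exactly.
src = [(0, 0), (10, 0), (0, 10), (10, 10)]
ref = [(2, 3), (12, 3), (2, 13), (12, 13)]
bx, by, rms = fit_first_order(src, ref)
print(bx.round(6).tolist(), by.round(6).tolist(), round(float(rms), 6))
```

With more GCPs than coefficients the fit is over-determined, and the reported RMS error is exactly the quantity used to judge the chosen order of transformation.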
Initially a few GCPs are required to calculate the transformation matrix and the inverse transformation that converts the reference coordinates of the GCPs back to the source coordinate system. This enables determination of the RMS error for the chosen transformation. The best order of transformation can be obtained by a trial-and-error process, comparing the RMS errors from the least squares computation.

Systematic Distortions

Geometric systematic distortions are those effects that are constant and can be predicted in advance. They include:

Scan Skew: Caused by the forward motion of the spacecraft during the time of each mirror sweep. In this case the ground swath scanned is not normal to the ground track. (Fig. 8)

Known Mirror Velocity Variation: The known mirror velocity variations are used to correct the minor distortion due to the velocity of the scan mirror not being constant from start to finish of each scan line. (Fig. 9)

Cross Track Distortion: These occur in all unrestored images acquired by cross-track scanners. They result from sampling pixels along a scan line at constant time intervals. The width of a pixel is proportional to the tangent of the scan angle and is therefore wider at either margin of the scan line, which compresses the pixels. This distortion is restored using trigonometric functions. (Fig. 10)

Systematic distortions are well understood and easily corrected by applying formulas derived by modelling the sources of distortion mathematically.

Atmospheric Corrections

The output from an instrument on a satellite depends on the intensity and spectral distribution of the energy received at the satellite. This energy/radiation has traveled some distance through the atmosphere and has accordingly suffered both attenuation and augmentation in the course of the journey. The problem arises when one is unable to regenerate the correct radiation properties of the target body on the earth's surface from the data generated by the remote sensing system.

Effect of the Atmosphere on Radiation (Radiative Transfer Theory)

Fig. 11 shows the effect of the atmosphere in determining the various paths for energy to illuminate a pixel and reach the sensor. The path radiation comes from the sun to the ground pixel and is then reflected to the sensor. In this process, absorption by atmospheric molecules takes place, converting incoming energy into heat; in particular, molecules of oxygen, carbon dioxide, ozone and water attenuate the radiation very strongly at certain wavelengths. Scattering by atmospheric particles is also a dominant mechanism leading to radiometric distortion in image data.

Radiative transfer theory is used to make quantitative calculations of the difference between the satellite-received radiance and the earth-leaving radiance. Radiation traveling in a certain direction is specified by the angle φ between that direction and the vertical axis z, and a differential equation is set up for a small horizontal element of the transmitting medium (the atmosphere) with thickness dz. The resulting differential equation is called the radiative transfer equation. The equation will be different for different wavelengths of electromagnetic radiation because of the differing relative importance of the various physical processes at different wavelengths.
Need for Atmospheric Correction

When an image is to be utilized, it is frequently necessary to make corrections in brightness and geometry for accuracy during interpretation, and some applications may require corrections to evaluate the image accurately. The various reasons for which correction should be done include:

- Deriving ratios of two bands of a multispectral image: since the effect of atmospheric scattering depends on the wavelength, the two channels will be unequally affected and the computed ratio will not accurately reflect the true ratio of radiance leaving the earth's surface
- When land surface reflectance or sea surface temperature is to be determined
- When two images taken at different times need to be compared or mosaicked

Correction Methods

Rectifying the image data for the degrading effects of the atmosphere entails modeling the scattering and absorption processes that take place. There are a number of ways of correcting the image data for atmospheric effects:

- Ignore the atmosphere
- Collect ground truth measurements of target temperature, reflectance etc., and calibrate these quantities measured on the ground against the radiance values recorded by the sensor
- Model the absorption and scattering effects from measurements of the composition and temperature profile of the atmosphere
- Utilize information about the atmosphere inherent in the remotely sensed data, i.e. use the image to correct itself

Correcting for Atmospheric Scattering

This correction is done when two bands of an image are subjected to ratio analysis. Atmospheric scattering affects short wavelengths most, causing haze and reducing the contrast ratio of images. Two techniques are used, illustrated here with TM bands 1 and 7, where TM 1 has the highest haze component and TM 7 (infrared) has the least. Both techniques are DN-value dependent: as TM band 7 is essentially free from scattering effects, its shadow pixels have DN values of either 0 or 1.

1. In TM 7 the shadows have DN values of 0 and 1. For each such pixel the DN in TM 7 is plotted against the DN in TM 1 and a straight line is fitted through the plot using least squares techniques. If there were no haze in TM 1 the line would pass through the origin; because there is haze, the intercept is offset along the band 1 axis. Haze has an additive effect on scene brightness. Therefore, to correct the haze effect on TM 1, the value of the intercept offset is subtracted from the DN of each band 1 pixel for the entire image. (Fig. 12)

2. The second technique also uses the areas with DN of 0 or 1 in TM 7. The histogram of TM 7 has pixels at 0, whereas the histogram of TM 1 lacks pixels in the range from 0 to approximately 20 because of light scattered into the detector by the atmosphere; this offset of the TM 1 histogram is subtracted from all the DNs in band 1 to remove the effects of atmospheric scattering. (Fig. 13)

The amount of atmospheric correction depends upon:
- The wavelength of the bands
- The atmospheric conditions

Short wavelengths suffer more severe scattering; humid, smoggy and dusty atmospheres cause more scattering than clear and dry atmospheres.

Implementing the Models

Documented information on the atmospheric conditions is used to estimate atmospheric effects using computer codes implementing standard atmospheric models. LOWTRAN, MODTRAN and HITRAN are some standard models; given the type of sensor, target altitude and look angle, the atmospheric correction can be computed.
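The histogram-offset (dark-object subtraction) idea behind the scattering corrections above can be sketched as follows. The scene values are synthetic, not real TM data; the only assumption carried over is that pixels dark in the scatter-free IR band should also be dark in the visible band, so any floor in their visible DNs is additive haze:

```python
import numpy as np

def haze_offset(band_vis, band_ir, dark_max=1):
    """Estimate the additive haze component in a short-wavelength band: over
    pixels that are shadow-dark (DN <= dark_max) in the scatter-free IR band,
    the minimum visible-band DN approximates the haze offset."""
    dark = band_ir <= dark_max
    return int(band_vis[dark].min())

def correct_haze(band_vis, band_ir, dark_max=1):
    """Subtract the estimated offset from every band pixel, clipping at 0."""
    offset = haze_offset(band_vis, band_ir, dark_max)
    return np.clip(band_vis.astype(int) - offset, 0, 255), offset

# Illustrative scene: true radiances 0-199 plus a uniform additive haze of
# 20 DN in the visible band; the IR band is haze-free and dark wherever the
# scene is dark.
true_vis = np.arange(2500).reshape(50, 50) % 200
band_ir = true_vis // 4
hazy_vis = true_vis + 20
corrected, offset = correct_haze(hazy_vis, band_ir)
print(offset)  # 20
```

Because the haze is additive, subtracting the single estimated offset restores the original radiance pattern across the whole band.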
Image Enhancement Techniques

Image enhancement techniques are employed to make satellite imagery more informative and to help achieve the goal of image interpretation. The term enhancement is used to mean the alteration of the appearance of an image in such a way that the information contained in the image is more readily interpreted visually in terms of a particular need. The image enhancement techniques are applied either to single-band images or separately to the individual bands of a multiband image set. These techniques can be categorized into two groups:

- Spectral Enhancement Techniques
- Multi-Spectral Enhancement Techniques

Spectral Enhancement Techniques

Density Slicing

Density slicing is the mapping of a range of contiguous grey levels of a single-band image to a point in the RGB color cube. The DNs of a given band are "sliced" into distinct classes. For example, for band 4 of an 8-bit TM image, we might divide the continuous 0-255 range into the discrete intervals 0-63, 64-127, 128-191 and 192-255. These four classes are displayed as four different grey levels. This kind of density slicing is often used in displaying temperature maps.

Contrast Stretching

The operating or dynamic ranges of remote sensors are often designed with a variety of eventual data applications in mind. Landsat TM images, for example, can end up being used to study deserts, ice sheets, oceans, forests etc., requiring relatively low-gain sensors to cope with the widely varying radiances upwelling from dark, bright, hot and cold targets. Consequently, for any particular area being imaged it is unlikely that the full dynamic range of the sensor will be used, and the corresponding image is dull and lacking in contrast, or overly bright. But by remapping the DN distribution to the full display capabilities of an image processing system, we can recover a much better image. Contrast stretching falls into three categories:

Linear Contrast Stretch

This technique involves the translation of the image pixel values from the observed range, DNmin to DNmax, to the full range of the display device (generally 0-255, the range of values representable on an 8-bit display device). The technique can be applied to a single-band, grey-scale image, where the image data are mapped to the display via all three color LUTs. It is not necessary to stretch between DNmax and DNmin: inflection points for a linear contrast stretch can be chosen from, for instance, the 5th and 95th percentiles of the histogram, or ±2 standard deviations from the mean, or to cover the class of land cover of interest (e.g. water at the expense of land, or vice versa). It is also straightforward to have more than two inflection points in a linear stretch, yielding a piecewise linear stretch.
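Both the density slicing and percentile-based linear stretch described above reduce to a few array operations; a minimal sketch, with the slice boundaries and percentiles from the examples in the text:

```python
import numpy as np

def density_slice(band, bounds=(64, 128, 192)):
    """Slice contiguous DN ranges into classes: with the default bounds,
    0-63 -> class 0, 64-127 -> 1, 128-191 -> 2, 192-255 -> 3."""
    return np.digitize(band, bins=bounds)

def linear_stretch(band, low_pct=5, high_pct=95):
    """Linear contrast stretch with inflection points at the chosen
    percentiles, remapped onto the full 0-255 display range."""
    lo, hi = np.percentile(band, [low_pct, high_pct])
    out = (band.astype(float) - lo) / (hi - lo) * 255.0
    return np.clip(out, 0, 255).astype(np.uint8)

# A dull band occupying only DNs 100-150 fills 0-255 after stretching.
dull = np.linspace(100, 150, 1000).reshape(40, 25)
stretched = linear_stretch(dull)
print(stretched.min(), stretched.max())  # 0 255

sliced = density_slice(np.array([[0, 63, 64], [127, 128, 255]]))
print(sliced.tolist())  # [[0, 0, 1], [1, 2, 3]]
```

Values below the low percentile clip to 0 and values above the high percentile clip to 255, which is exactly the trade-off of placing the inflection points inside the observed DN range.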
Histogram Equalisation
The underlying principle of histogram equalisation is straightforward: it is assumed that each level in the displayed image should contain an approximately equal number of pixel values, so that the histogram of the displayed values is almost uniform (though not all 256 classes are necessarily occupied). The objective of histogram equalisation is to spread the range of pixel values present in the input image over the full range of the display device.

Gaussian Stretch
This method of contrast enhancement is based upon the histogram of the pixel values. It is called a Gaussian stretch because it involves the fitting of the observed histogram to a normal or Gaussian histogram, defined as follows:

F(x) = (a/pi)^0.5 exp(-ax^2)

Multi-Spectral Enhancement Techniques

Image Arithmetic Operations
The operations of addition, subtraction, multiplication and division are performed on two or more co-registered images of the same geographical area. These techniques are applied to images from separate spectral bands of a single multispectral data set, or to individual bands from image data sets that have been collected at different dates. More complicated algebra is sometimes encountered in the derivation of sea-surface temperature from multispectral thermal infrared data (the so-called split-window and multichannel techniques).

Addition of images is generally carried out so as to give an output dynamic range that equals that of the input images. Band subtraction is carried out on co-registered scenes of the same area acquired at different times, for change detection. Multiplication of images normally involves the use of a single "real" image and a binary image made up of ones and zeros.
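Band subtraction for change detection, as described above, amounts to differencing two co-registered dates and re-centring the result so it remains displayable. A minimal sketch assuming NumPy; `change_image` and the offset convention are illustrative choices of this example, not from the text.

```python
import numpy as np

def change_image(date2, date1, offset=127):
    """Subtract co-registered scenes from two dates; an offset keeps the
    result displayable in 0-255 (unchanged pixels map near the offset)."""
    diff = date2.astype(int) - date1.astype(int) + offset
    return np.clip(diff, 0, 255).astype(np.uint8)

before = np.array([[100, 100], [100, 100]], dtype=np.uint8)
after = np.array([[100, 180], [20, 100]], dtype=np.uint8)
change = change_image(after, before)  # brightening > 127, darkening < 127
```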

Band Ratioing, or division of images, is probably the arithmetic operation most widely applied to images in geological, ecological and agricultural applications of remote sensing. Ratio images are enhancements resulting from the division of the DN values of one spectral band by the corresponding DN values of another band. One motivation for this is to iron out differences in scene illumination due to cloud or topographic shadow. Ratio images also bring out spectral variation in different target materials. Multiple ratio images can be used to drive the red, green and blue monitor guns for color composites. Interpretation of ratio images must consider that they are "intensity blind", i.e., dissimilar materials with different absolute reflectances but similar relative reflectances in the two or more utilised bands will look the same in the output image.
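A hedged sketch of band ratioing with NumPy. The zero-denominator guard and the function name `band_ratio` are implementation choices of this example; the two sample pixels mimic the illumination effect described above (a shaded pixel is darker in both bands but keeps a similar ratio).

```python
import numpy as np

def band_ratio(band_a, band_b):
    """Divide one band's DNs by the corresponding DNs of another band,
    guarding against division by zero (zero denominators yield 0)."""
    a = band_a.astype(float)
    b = band_b.astype(float)
    return np.divide(a, b, out=np.zeros_like(a), where=b != 0)

# A shadowed pixel has lower DNs in both bands, yet the same ratio:
sunlit = band_ratio(np.array([80.0]), np.array([40.0]))
shaded = band_ratio(np.array([40.0]), np.array([20.0]))
```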
Principal Component Analysis

Spectrally adjacent bands in a multispectral remotely sensed image are often highly correlated. Multiband visible/near-infrared images of vegetated areas will show negative correlations between the near-infrared and visible red bands, and positive correlations among the visible bands, because the spectral characteristics of vegetation are such that as the vigour or greenness of the vegetation increases, the red reflectance diminishes and the near-infrared reflectance increases. The presence of correlations among the bands of a multispectral image thus implies that there is redundancy in the data, and Principal Component Analysis aims at removing this redundancy.

Principal Components Analysis (PCA) is related to another statistical technique called factor analysis and can be used to transform a set of image bands such that the new bands (called principal components) are uncorrelated with one another and are ordered in terms of the amount of image variation they explain. The components are thus a statistical abstraction of the variability inherent in the original band set. To transform the original data onto the new principal component axes, transformation coefficients (eigenvalues and eigenvectors) are obtained and then applied in a linear fashion to the original pixel values. This linear transformation is derived from the covariance matrix of the original data set. The transformation coefficients describe the lengths and directions of the principal axes. Such transformations are generally applied either as an enhancement operation or prior to classification of the data. In the context of PCA, information means variance or scatter about the mean. Multispectral data generally have a dimensionality that is less than the number of spectral bands. The purpose of PCA is to define that dimensionality and to fix the coefficients that specify the set of axes which point in the directions of greatest variability.
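The PCA transform described above (covariance matrix, eigenvectors, linear rotation) can be sketched as follows, assuming NumPy. `principal_components` is an illustrative name, and in real use an image cube would first be reshaped to (bands, pixels).

```python
import numpy as np

def principal_components(bands):
    """Rotate co-registered bands onto uncorrelated principal axes.

    bands: array of shape (n_bands, n_pixels).  Returns the transformed
    data, ordered by decreasing explained variance."""
    data = bands - bands.mean(axis=1, keepdims=True)
    cov = np.cov(data)                      # band-to-band covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigh: for symmetric matrices
    order = np.argsort(eigvals)[::-1]       # largest variance first
    return eigvecs[:, order].T @ data       # linear transform of the DNs

bands = np.array([[1.0, 2.0, 3.0, 4.0],   # two perfectly correlated bands:
                  [2.0, 4.0, 6.0, 8.0]])  # all variance fits in one axis
pcs = principal_components(bands)          # second component is ~zero
```

With perfectly correlated input bands, the entire scatter lies along the first principal axis, illustrating the redundancy-removal argument made above.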
The principal component bands are often more interpretable than the source data.

Decorrelation Stretch
Principal components can be stretched and transformed back into RGB colours, a process known as decorrelation stretching. If the data are transformed into principal components space and stretched within this space, then the three stretches applied to the bands making up the RGB color composite image are at right angles to each other. In RGB space the three color components are likely to be correlated, so the effects of stretching are not independent for each color. The result of a decorrelation stretch is generally an improvement in the range of intensities and saturations for each color, with the hue remaining unaltered. Decorrelation stretch, like principal component analysis, can be based on either the covariance matrix or the correlation matrix. The result of a decorrelation stretch is also a function of the nature of the image to which it is applied. The method seems to work best on images of semi-arid areas, and least well where the area covered by the image includes both land and sea.

Canonical Components
PCA is appropriate when little prior information about the scene is available. Canonical component analysis, also referred to as multiple discriminant analysis, may be appropriate when information about particular features of interest is available. Canonical component axes are located so as to maximize the separability of different user-defined feature types.

Hue, Saturation and Intensity (HIS) Transform
Hues generated by mixing red, green and blue light are characterised by coordinates on the red, green and blue axes of the color cube.
In the hue-saturation-intensity hexcone model, hue is the dominant wavelength of the perceived color, represented by angular position around the top of a hexcone; saturation, or purity, is given by distance from the central vertical axis of the hexcone; and intensity, or value, is represented by distance above the apex of the hexcone. Hue is what we perceive as color. Saturation is the degree of purity of the color, and may be considered as the amount of white mixed in with the color. It is often useful to convert from RGB color cube coordinates to HIS hexcone coordinates, and vice versa.
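Python's standard `colorsys` module implements the closely related hue-saturation-value hexcone model, which can stand in for the RGB-to-HIS conversion described above (note that "value" is only one of several possible definitions of the intensity axis):

```python
import colorsys

# Convert an RGB triple (components in the 0-1 range) to hue, saturation
# and value, then back again.
r, g, b = 1.0, 0.0, 0.0                 # pure red
h, s, v = colorsys.rgb_to_hsv(r, g, b)  # hue 0 (red), fully saturated, full value
back = colorsys.hsv_to_rgb(h, s, v)     # round-trips to the original RGB
```

Stretching the saturation or intensity component in hexcone space and converting back to RGB is one common enhancement use of this transform.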

The hue, saturation and intensity transform is useful in two ways: first as a method of image enhancement, and secondly as a means of combining co-registered images from different sources. The advantage of the HIS system is that it is a more precise representation of human color vision than the RGB system. This transformation has been quite useful for geological applications.

Fourier Transformation
The Fourier Transform operates on a single-band image. Its purpose is to break down the image into its scale components, which are defined to be sinusoidal waves with varying amplitudes, frequencies and directions. The coordinates of the two-dimensional space are expressed in terms of frequency (cycles per basic interval). The function of the Fourier Transform is to convert a single-band image from its spatial-domain representation to the equivalent frequency-domain representation, and vice versa. The idea underlying the Fourier Transform is that the grey-scale values forming a single-band image can be viewed as a three-dimensional intensity surface, with the rows and columns defining two axes and the grey-level value at each pixel giving the third (z) dimension. The Fourier Transform thus provides details of:
The frequency of each of the scale components of the image
The proportion of information associated with each frequency component

Spatial Processing

Spatial Filtering
Spatial filtering can be described as selectively emphasizing or suppressing information at different spatial scales over an image. Filtering techniques can be implemented through the Fourier transform in the frequency domain, or in the spatial domain by convolution.

Convolution Filters
One class of filtering methods is based upon the transformation of the image into its scale or spatial frequency components using the Fourier transform. The spatial domain filters, or convolution filters, are generally classed as either high-pass (sharpening) or low-pass (smoothing) filters.
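A minimal sketch of the frequency-domain representation using NumPy's FFT routines; `amplitude_spectrum` is an illustrative name. A constant image concentrates all of its information in the zero-frequency (DC) component at the centre of the shifted spectrum:

```python
import numpy as np

def amplitude_spectrum(band):
    """Break a single-band image into its frequency components and return
    the amplitude spectrum, shifted so low frequencies sit at the centre."""
    return np.abs(np.fft.fftshift(np.fft.fft2(band)))

flat = np.ones((8, 8))           # constant image: no spatial variation at all
spec = amplitude_spectrum(flat)  # single spike at the centre of the spectrum
```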
Low-Pass (Smoothing) Filters
Low-pass filters reveal the underlying two-dimensional waveform with a long wavelength, or low-frequency image contrast, at the expense of higher spatial frequencies. Low-frequency information allows the identification of the background pattern, and produces an output image in which the detail has been smoothed or removed from the original. A two-dimensional moving-average filter is defined in terms of its dimensions, which must be odd, positive and integral (but not necessarily equal), and its coefficients; the kernel is in effect a description of the PSF weights. The output DN is found by dividing the sum of the products of corresponding convolution kernel and image elements, often by the number of kernel elements. A similar effect is given by a median filter: choosing the median value from the moving window does a better job of suppressing noise and preserving edges than the mean filter. Adaptive filters have kernel coefficients calculated for each window position, based on the mean and variance of the original DNs in the underlying image.

High-Pass (Sharpening) Filters
Simply subtracting the low-frequency image resulting from a low-pass filter from the original image can enhance high spatial frequencies. High-frequency information allows us either to isolate or to amplify the local detail. If the high-frequency detail is amplified by adding back to the image some multiple of the high-frequency component extracted by the filter, then the result is a sharper, de-blurred image. High-pass convolution filters can be designed by representing a PSF with a positive centre weight and negative surrounding weights. A typical 3x3 Laplacian filter has a kernel with a high central value, 0 at each corner, and -1 at the centre of each edge. Such filters can be biased in certain directions for the enhancement of edges. High-pass filtering can also be performed simply on the basis of the mathematical concept of derivatives, i.e., gradients in DN throughout the image. Since images are not continuous functions, calculus is dispensed with and derivatives are instead estimated from the differences in the DN of adjacent pixels in the x, y or diagonal directions. Directional first differencing aims at emphasising edges in the image.

Frequency Domain Filters
The Fourier transform of an image, as expressed by the amplitude spectrum, is a breakdown of the image into its frequency or scale components. Filtering of these components uses frequency domain filters, which operate on the amplitude spectrum of an image and remove, attenuate or amplify the amplitudes in specified wavebands. The frequency domain can be represented as a two-dimensional scatter plot known as a Fourier spectrum, in which lower frequencies fall at the centre and progressively higher frequencies are plotted outward. Filtering in the frequency domain consists of 3 steps:
Fourier transform the original image and compute the Fourier spectrum
Select an appropriate filter transfer function (equivalent to the OTF of an optical system) and multiply it by the elements of the Fourier spectrum
Perform an inverse Fourier transform to return to the spatial domain for display purposes
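The spatial-domain convolution filtering described above can be sketched directly. Assuming NumPy; the kernels follow the text (a 3x3 moving average for low-pass, and a Laplacian with 0 at the corners, -1 at the edge midpoints and a high central value for high-pass), while `convolve` and its no-padding convention are choices of this example:

```python
import numpy as np

MEAN_3X3 = np.full((3, 3), 1.0 / 9.0)     # low-pass: moving average
LAPLACIAN = np.array([[0.0, -1.0, 0.0],   # high-pass: 0 at corners,
                      [-1.0, 4.0, -1.0],  # -1 at edge midpoints,
                      [0.0, -1.0, 0.0]])  # high central value

def convolve(band, kernel):
    """Spatial-domain convolution; no padding, so the output shrinks by
    the kernel radius on each side."""
    kh, kw = kernel.shape
    h, w = band.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (band[i:i + kh, j:j + kw] * kernel).sum()
    return out

step = np.array([[0.0, 0.0, 10.0, 10.0]] * 3)  # a vertical brightness edge
smoothed = convolve(step, MEAN_3X3)            # edge blurred by averaging
edges = convolve(step, LAPLACIAN)              # responds only at the edge
```

Because the Laplacian weights sum to zero, flat regions produce zero output and only the brightness edge responds, which is exactly the high-pass behaviour described above.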
Image Classification
Image classification has formed an important part of the fields of Remote Sensing, Image Analysis and Pattern Recognition. In some instances, the classification itself may form the object of the analysis. Digital image classification is the process of sorting all the pixels in an image into a finite number of individual classes. The classification process is based on the following:
Patterns of their DNs, usually in multichannel data (Spectral Classification)
Spatial relationships with neighbouring pixels
Relationships between the data acquired on different dates

Pattern Recognition, Spectral Classification, Textural Analysis and Change Detection are different forms of classification that are focused on 3 main objectives:
1. Detection of different kinds of features in an image
2. Discrimination of distinctive shapes and spatial patterns
3. Identification of temporal changes in an image

Fundamentally, spectral classification forms the basis for objectively mapping the areas of the image that have similar spectral reflectance/emissivity characteristics. Depending on the type of information required, spectral classes may be associated with identified features in the image (supervised classification) or may be chosen statistically (unsupervised classification). Classification can also be seen as a means of compressing image data, by reducing the large range of DNs in several spectral bands to a few classes in a single image. Classification reduces this large spectral space into relatively few regions, and obviously results in loss of numerical information from the original image. There is no theoretical limit to the dimensionality used for the classification, though obviously the more bands involved, the more computationally intensive the process becomes. It is often wise to remove redundant bands before classification.

Classification generally comprises four steps:
Pre-processing, e.g., atmospheric correction, noise suppression, band ratioing, Principal Component Analysis, etc.
Training - selection of the particular features which best describe the pattern
Decision - choice of a suitable method for comparing the image patterns with the target patterns
Assessing the accuracy of the classification

The informational data are classified into two systems:
Supervised
Unsupervised

Supervised Classification
In this system the categorization of the data is supervised by specifying to the computer algorithm numerical descriptors of the various class types. There are three basic steps involved in a typical supervised classification.

Training Stage: The analyst identifies the training areas and develops a numerical description of the spectral attributes of each class or land cover type. During the training stage the location, size, shape and orientation of the training areas are determined for each class.

Classification Stage: Each pixel is categorised into the land cover class it most closely resembles. If the pixel is not sufficiently similar to any training data, it is labeled as unknown. Numerical approaches to spectral pattern recognition have been classified into various categories.

1. Measurements on Scatter Diagram
Each pixel value is plotted on a graph as a scatter diagram indicating the category of the class. In this case the two-dimensional digital values attributed to each pixel are plotted on the graph.

2. Minimum Distance to Mean Classifier (Centroid Classifier)
This is one of the simplest classification strategies. First the mean vector for each category is determined from the average DN in each band for each class. An unknown pixel can then be classified by computing the distance from its spectral position to each of the means, and assigning it to the class with the closest mean. One limitation of this technique is that it overlooks the different degrees of variation within classes.

3. Parallelepiped Classifier
For each class, estimates of the maximum and minimum DN in each band are determined. Parallelepipeds are then constructed so as to enclose the scatter in each theme, and each pixel is tested to see if it falls inside any of the parallelepipeds. This approach has limitations:
A pixel may fall outside all the parallelepipeds and remain unclassified.
Theme data may be so strongly correlated that a pixel vector plotting at some distance from the theme scatter may still fall within the decision box and be classified erroneously.
Parallelepipeds may overlap, in which case the decision becomes more complicated.

4. Gaussian Maximum Likelihood Classifier
This method determines the variance and covariance of each theme, providing a probability function. This is then used to classify an unknown pixel by calculating, for each class, the probability that the pixel lies in that class. The pixel is then assigned to the most likely class or, if its probability values fail to reach a defined threshold for any of the classes, labeled as unclassified. Reducing the data dimensionality beforehand is one approach to speeding the process up.
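A hedged sketch of the minimum distance to mean classifier described above, assuming NumPy; `minimum_distance_classify`, the class means and the pixel values are invented for the example.

```python
import numpy as np

def minimum_distance_classify(pixels, class_means):
    """Assign each pixel vector to the class with the nearest mean.

    pixels: (n_pixels, n_bands); class_means: (n_classes, n_bands)."""
    # Euclidean spectral distance from every pixel to every class mean.
    dists = np.linalg.norm(pixels[:, None, :] - class_means[None, :, :], axis=2)
    return dists.argmin(axis=1)            # index of the closest mean

means = np.array([[10.0, 10.0],    # class 0 mean (two bands)
                  [200.0, 50.0]])  # class 1 mean
pixels = np.array([[12.0, 9.0], [190.0, 60.0]])
labels = minimum_distance_classify(pixels, means)
```

A distance threshold could be added so that pixels far from every mean are labeled as unknown, but that refinement is left out of this sketch.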

Unsupervised Classification
This system of classification does not utilize training data as the basis of classification. It involves algorithms that examine the unknown pixels in the image and aggregate them into a number of classes based on the natural groupings or clusters present in the image. The classes that result from this type of classification are spectral classes. Unsupervised classification is the identification, labeling and mapping of these natural classes. This method is usually used when little is known about the data before classification. There are several mathematical strategies for representing the clusters of data in spectral space.

1. Sequential Clustering
In this method the pixels are analysed one at a time, pixel by pixel and line by line. The spectral distance between each analysed pixel and the previously defined cluster means is calculated. If the distance is greater than some threshold value, the pixel begins a new cluster; otherwise it contributes to the nearest existing cluster, in which case that cluster's mean is recalculated. Clusters are merged if too many of them are formed, by adjusting the threshold value of the cluster means.

2. Statistical Clustering
Whereas sequential clustering overlooks the spatial relationship between adjacent pixels, statistical clustering uses 3x3 windows in which all pixels have similar vectors in spectral space. The process has two steps:
o Testing for homogeneity within the window of pixels under consideration
o Cluster merging and deletion
Here the windows are moved one at a time through the image, avoiding overlap. The mean and standard deviation are calculated for each band of the window. The smaller the standard deviation for a given band, the greater the homogeneity of the window. These values are then compared with user-specified parameters delineating the upper and lower limits of the standard deviation. If the window passes the homogeneity test, it forms a cluster.
Clusters are created until their number exceeds a user-defined maximum, at which point some are merged or deleted according to their weighting and spectral distances.

3. ISODATA Clustering (Iterative Self-Organising Data Analysis Techniques)
This method repeatedly performs an entire classification and recalculates the statistics. The procedure begins with a set of arbitrarily defined cluster means, usually located evenly through the spectral space. After each iteration new means are calculated and the process is repeated until there is little change between iterations. This method produces good results for data that are not normally distributed, and is also not biased by any section of the image.

4. RGB Clustering
This is a quick method for 3-band, 8-bit data. The algorithm plots all pixels in spectral space and then divides this space into 32 x 32 x 32 clusters. A cluster is required to have a minimum number of pixels to become a class. RGB clustering is not biased to any part of the data.
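The sequential clustering procedure described above (a pixel farther than a threshold from every existing mean starts a new cluster; otherwise it joins the nearest cluster, whose mean is recalculated) can be sketched as follows, assuming NumPy; `sequential_cluster` and the threshold value are illustrative:

```python
import numpy as np

def sequential_cluster(pixels, threshold):
    """One-pass clustering of pixel vectors by spectral distance."""
    means, counts = [], []
    for px in np.asarray(pixels, dtype=float):
        if means:
            dists = [np.linalg.norm(px - m) for m in means]
            nearest = int(np.argmin(dists))
            if dists[nearest] <= threshold:
                counts[nearest] += 1
                # incremental recalculation of the running cluster mean
                means[nearest] += (px - means[nearest]) / counts[nearest]
                continue
        means.append(px.copy())  # too far from every mean: new cluster
        counts.append(1)
    return means

pixels = [[10.0, 10.0], [12.0, 11.0], [200.0, 190.0]]
clusters = sequential_cluster(pixels, threshold=20.0)  # two natural groups
```

The merging step mentioned above (re-running with an adjusted threshold when too many clusters form) is omitted from this sketch.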
