REMOTE SENSING AND GIS BOOKS PDF

Other details on classification follow. The next two sections of this chapter deal with change analysis and modeling.

The author finishes up concerning file exchange. Reading the above chapter points out a major omission in the book. This book is not intended for students; no commercial promotion is made.

Chapter 12, dealing with remote sensing applications, is the longest chapter in the book. It is divided into eight sections and case studies, each with different authors. Aronoff himself starts with the agriculture section; Wulder et al. follow, and this section is in turn followed by two case studies. Here, and elsewhere in this chapter, valuable material important to managers is included.

Geologic applications are handled by Berger and Fortin, who describe the key geologic structures as detected on remotely sensed imagery and present a case study.

Next, Gallo gives a quick overview. Madry discusses archaeological applications such as archaeological site discovery. The next two sections, military applications (Aronoff and Swann) and intelligence analysis (Last), inevitably overlap in the material they deal with. The former section provides a table.

The last application section (Hipple and Haithcoat) concerns urban infrastructure and business geographics. The concluding chapter, on remote sensing and the organization (Merchant), addresses the implementation phase and recommends determining organizational requirements.

Concerning human resources, he considers the frequency with which remotely sensed data are used. This suggestion is made so that proper oversight is maintained. Developing partnerships or working as part of a consortium is also put forward as a means of meeting these needs, and he concludes with recommendations.

With Appendix A already noted above, Appendix B provides further reference characteristics, and Appendix C lists remote sensing and related resources. As an edited work, some unevenness and overlap expectedly occurs. Since the parameters of hydrological models, such as those presented in the last two sections, depend on catchment characteristics, land use changes will influence the parameter values and thus also the runoff values.

This phenomenon is demonstrated in the literature (Schultz). On the basis of land use changes observed in consecutive imagery from satellites or air photography, it is possible to extrapolate such tendencies into the future. Remote sensing applications to hydrology (Gert A. Schultz): since no remote sensing (RS) devices have been developed that allow the measurement of river runoff directly, information from RS sources is used to compute runoff values indirectly. This is done with the aid of hydrological models, where RS data are used in two different ways: as model input and for model parameter estimation. Three types of models are discussed, the parameters of which are estimated, at least partially, with the aid of RS information.

A mathematical model is demonstrated which reconstructs monthly river runoff volumes on the basis of IR data obtained by the Meteosat geostationary satellite; a major model parameter is estimated from these data. The third model discussed is a water balance model which computes all relevant variables of the water balance equation, including runoff, on a daily basis.

Parameters used in the model components for interception, evapotranspiration and soil storage are estimated with the aid of RS information originating from Landsat and NOAA data. Examples of the performance of all three models are presented. Input to hydrological models computing runoff is usually either rainfall or snowmelt or both. An example of model input estimation on the basis of satellite data is presented, as well as the use of ground-based weather radar rainfall measurements for real-time flood forecasting.
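
By way of illustration only, the following minimal sketch mimics the kind of daily water-balance bookkeeping such models perform; the storage capacity, rainfall series and evapotranspiration series are invented and are not parameters of the models described here.

    # Minimal daily water-balance sketch: a single soil-moisture "bucket".
    # Rainfall fills the bucket, evapotranspiration drains it, and any
    # overflow beyond the storage capacity is treated as runoff.
    # All numbers are illustrative, not calibrated model parameters.

    capacity_mm = 120.0          # hypothetical soil storage capacity
    storage_mm = 60.0            # initial soil moisture

    rain_mm = [0.0, 12.0, 35.0, 5.0, 0.0, 48.0, 2.0]   # daily rainfall
    et_mm   = [3.0, 2.5, 2.0, 3.5, 4.0, 1.5, 3.0]      # daily evapotranspiration

    runoff = []
    for p, et in zip(rain_mm, et_mm):
        storage_mm = max(storage_mm + p - et, 0.0)      # update storage, no negative moisture
        excess = max(storage_mm - capacity_mm, 0.0)     # overflow becomes runoff
        storage_mm -= excess
        runoff.append(excess)

    print("daily runoff (mm):", [round(r, 1) for r in runoff])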

An example of snowmelt runoff modelling is mentioned, followed by a brief discussion of future perspectives of runoff computations with the aid of RS data. Application of GIS techniques for the quantification of land degradation caused by water erosion: land decay processes caused by erosion are very serious because of their long-term impact on the quality of soils, surface waters, the environment and living standards.

This article presents the results of a spatial erosion modelling study. A new methodology is hereby proposed to estimate the soil losses caused by water erosion. The input data used are four site plans: a topographical survey, a land use layout, a soils layout, and an erosion control plan map.

Upon processing, seven information layers resulted; these are included in the calculation model and represent the factors triggering and maintaining the erosion process.
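
The abstract does not name the calculation, but factor-based erosion models such as the USLE combine co-registered raster layers cell by cell. The sketch below assumes such a multiplicative, USLE-style model; the factor grids are invented purely to show the mechanics.

    import numpy as np

    # Hypothetical USLE-style combination of co-registered factor rasters.
    # Each 2-D array stands for one information layer (rainfall erosivity,
    # soil erodibility, slope length/steepness, cover, support practice).
    # The values are invented; a real study would derive them from the
    # site plans and surveys described in the text.

    shape = (3, 3)
    R  = np.full(shape, 80.0)            # rainfall erosivity
    K  = np.array([[0.20, 0.30, 0.25],
                   [0.30, 0.35, 0.30],
                   [0.20, 0.25, 0.30]])  # soil erodibility
    LS = np.array([[0.5, 1.2, 2.0],
                   [0.8, 1.5, 2.5],
                   [0.4, 0.9, 1.8]])     # slope length and steepness
    C  = np.full(shape, 0.3)             # cover management
    P  = np.full(shape, 1.0)             # support practice

    soil_loss = R * K * LS * C * P       # cell-by-cell annual soil loss estimate
    print("mean annual soil loss (illustrative):", soil_loss.mean().round(2))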

The scope of the erosion modelling is to determine the actual and potential erosion in the hydrographical basin considered for the study. Results show an average annual soil loss much above the tolerable limit in our country. An in-house software application for erosion simulation, written in Fortran, was run under the Geo-Graph software used for the entire project development. Remote Sensing Applications in Hydrology.

Jessica Black: Remote sensing is the science of observing and analyzing an area, object, or phenomenon based on data collected by sensors physically separated from the target.

The sensors are mounted on platforms such as ground-based, airplane, or satellite platforms. Physical radiation transfer is the foundation for remote sensing. Typically, the observation is of radiation reflected or emitted from the object of interest. There are two types of remote sensing: active and passive. In active remote sensing, the object is illuminated by radiation of a known wavelength or frequency; the backscatter from the object is recorded by the sensor.

Both feet and meters are standard units for real-world coordinates. The units of the geographic reference system are degrees of latitude and longitude. But the distance represented by a degree depends upon its location on the globe; latitude-longitude is a global reference system, not a rectangular coordinate system. Thus, a coverage can be digitised in meters, but not in degrees. The advantages and disadvantages of digitising a coverage in either digitiser units or in real-world units are outlined in Table 1.

Table 1. Digitiser units versus real-world units: (i) digitiser units make it easy to create edit plots at the scale of the source map, whereas with real-world units maps need to be plotted at a precise scale to overlay edit plots; (ii) with digitiser units the digitising staff has less to learn and understand, whereas with real-world units the staff should understand transformation and projection concepts; (iii) coverages in digitiser units are not spatially referenced and cannot be displayed simultaneously, whereas real-world units allow multiple coverages to be shown, such as background or adjacent coverages.

This is most commonly done by creating an edit plot and overlaying it on a light table. If a map is digitised in real-world units, it may have been stretched and scaled so that it will no longer register accurately with the source map even if the file was digitised accurately.

If coverages are to be digitised in real-world units, the digitising staff should understand how to project and transform coverages. This naturally requires some knowledge of projection concepts. The ability to display adjacent and background coverage can be quite helpful, particularly for updating maps.

This is not possible when maps are digitised in digitiser units. Coverages should not be edited, cleaned, built, or buffered, nor should any spatial analysis be performed, when they are stored in reference units (latitude-longitude). The algorithms that perform snapping functions using a measurement of length or area are based upon Cartesian coordinates. The length of a line of latitude between two meridians varies with latitude, and area is confusing when measured in square degrees.
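
The practical consequence of working in degrees can be seen with a small calculation: on a spherical earth the ground distance spanned by one degree of longitude shrinks with the cosine of latitude, so lengths and areas expressed in degrees are not comparable from place to place. The sketch below uses the usual mean-earth-radius approximation.

    import math

    EARTH_RADIUS_KM = 6371.0   # mean spherical earth radius (approximation)

    def km_per_degree_longitude(lat_deg: float) -> float:
        """Ground distance spanned by one degree of longitude at a given latitude."""
        return (math.pi / 180.0) * EARTH_RADIUS_KM * math.cos(math.radians(lat_deg))

    for lat in (0, 30, 60):
        print(f"1 deg longitude at {lat:2d} N ~ {km_per_degree_longitude(lat):6.1f} km")
    # ~111 km at the equator, ~96 km at 30 degrees, ~56 km at 60 degrees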

Today maps are not just made using GIS; the infrastructure of utilities in the streets of our towns will be held in a GIS. Your taxis and emergency services may be guided to their destination using satellite-linked spatial systems; foresters and farmers will be monitoring their standing crops with spatial information systems; and natural resources management and environmental management strategies may be compared by integrating satellite data into the analysis.

To use GIS technology for the above application areas, a huge GIS database of geographical features has to be created. Creating such a database is a complex operation which may involve data capture, verification and structuring processes. Because raw geographical data are available in many different analogue or digital forms, such as maps, aerial photographs, and satellite images, a spatial database can be built in several, not mutually exclusive, ways: acquiring data in digital form from a data supplier, digitising existing analogue data, carrying out field surveys of geographic entities, and interpolating from point observations to continuous surfaces.

Images derived from optical and digital remote sensing systems mounted in aircraft and satellites provide a wealth of spatial information and are a major data input to GIS.

Remote sensing data are a major source of data for the mapping of resources such as geology, forestry, water resources, land use and land cover. Integration of the two technologies, remote sensing and GIS, can be used to develop decision support systems for a planner or decision maker. Remotely sensed images can be used in two ways: as a source of spatial data within GIS, and by applying the functionality of GIS to the processing of remotely sensed data in both pictorial and digital modes.

Since digital remote sensing images are collected in a raster format, digital images are inherently compatible spatially with other sources of information in a raster domain. Because of this, "raw" images can be directly and easily included as layers in a raster-based GIS. Similarly, such image processing procedures as automated land cover classification result in the creation of interpreted or derived data files in a raster format.

These derived data are again inherently compatible with the other sources of data represented in a raster format. Remote sensing images need not be digital in format to be of any value in a GIS environment.


Visual interpretation of hardcopy images is used extensively to locate specific features and conditions, which are subsequently geocoded for inclusion in a GIS. At the same time, the information resident in a GIS can also be used to aid in a visual or digital image interpretation process. For example, GIS information on elevation, slope, and aspect might be used to aid in the classification of forest types appearing in images acquired over areas of high relief.

Thus, the interaction between remote sensing and GIS techniques is two-way in nature. Remote sensing images, including the information extracted from such images, along with GPS data, have become primary data sources for modern GIS.

Similarly, these technologies are assisting us in modeling and understanding biophysical processes at all scales. They are also permitting us to develop and communicate cause-and-effect "what-if" scenarios in a spatial context in ways never before possible. The importance of remote sensing, GIS, GPS, and related information technologies in the professional careers of today's students involved in measuring, studying, and managing earth resources cannot be overstated. Hence, in recent years, remote sensing has become a powerful source of spatial data as an input for GIS, through which a detailed map can be generated with the help of other collateral data derived from several other sources.

There are two methods of extracting data for GIS from remote sensing data: visual interpretation of satellite imagery in pictorial format, and computer processing of remotely sensed digital data. The output of either of these analysis methods can be considered an input to GIS for any kind of application.

(Figure: a flowchart linking contour processing and terrain analysis (hydrogeomorphology) with image manipulation, image ratioing, and soils classification.)

In the present context, the definition of remote sensing is restricted to mean the process of acquiring information about any object without physically contacting it in any way, regardless of whether the observer is immediately adjacent to the object or millions of miles away.

It is further required that such sensing may be achieved in the absence of any matter in the intervening space between the object and the observer.

Consequently, the information about the object, area or phenomenon must be available in a form that can be conveyed by a carrier through a vacuum. The information carrier, or communication link, is electromagnetic energy. Remote sensing data basically consist of wavelength and intensity information, acquired by collecting the electromagnetic radiation leaving the object at specific wavelengths and measuring its intensity.

Sensors mounted on aircraft or satellite platforms measure the amounts of energy reflected from or emitted by the earth's surface. These measurements are made at a large number of points distributed either along a one-dimensional profile on the ground below the platform or over a two-dimensional area on either side of the ground track of the platform. The sensors scan the ground below the satellite or aircraft platform and as the platform moves forward, an image of the earth's surface is built up.

Each scan line of a remotely sensed image is a digital or numerical record of radiance measurements made at regular intervals along the line. A set of consecutive scan lines forms an image (Mather). Two-dimensional image data can be collected by means of two types of imaging sensors, namely nadir-looking or side-looking sensors.

As the platform moves forward, an image of the swath region is built up. In the case of a nadir-looking sensor, the ground area to either side of the satellite or aircraft platform is imaged, whereas an area of the earth's surface lying to one side of the satellite track is imaged by means of a side-looking sensor.

Such materials may be vegetation, exposed soil and rock, or water. These materials are not themselves detected directly by remote sensing; their nature is inferred from the measurements made. The characteristic of digital image data is that they can be adjusted so as to provide an estimate of physical properties of the targets, such as radiance or reflectivity (Mather). Remote sensing systems fall into two broad categories: active sensing systems and passive sensing systems.

An active sensing system generates and uses its own energy to illuminate the target and records the reflected energy, which carries the information content. Synthetic aperture radar (SAR) is one of the best examples of an active sensing system. These systems do not rely on the detection of solar or terrestrial emissions, since the solar irradiance in the microwave region is negligible. The operating principles of active remote sensing and the general details of the latest imaging radar systems are described in the following chapter.

The second type of remote sensing system is the passive system, which depends mainly on solar radiation and operates in the visible and infrared regions of the electromagnetic spectrum. The nature and properties of target materials can be inferred from incident electromagnetic energy that is reflected, scattered or emitted by these materials on the earth's surface and recorded by the passive sensor (for example, a camera without flash). A remote sensing system that uses electromagnetic energy can be termed electromagnetic remote sensing.

These elements are described in detail later in this chapter. The data analysis process involves examining the data using various viewing instruments; analysing pictorial data in this way is called visual image interpretation. The use of computers to analyse digital data is known as digital image processing. The analysis of data by visual image interpretation involves the use of the fundamental picture elements, namely tone, texture, pattern, size and shape, in order to detect and identify various objects.

Aerial or satellite imagery is viewed through stereoscopic instruments to obtain three-dimensional images. There are many photogrammetric instruments today for visual interpretation and for transferring the details onto base maps.

If the data are available in digital form, they can be analysed on interactive computer systems to extract statistical data, or classified to obtain thematic information about resources. The scene is interactively analysed by computer, comparing pixels with the actual "signature" of each object collected through field visits. This kind of classification is quite accurate and depends on the dispersion of training data sets over the area of the scene.
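
One simple way to implement the signature comparison described above is a minimum-distance-to-means classifier: each pixel is assigned to the training class whose mean spectrum is nearest. The band values below are invented and serve only as a sketch, not as the specific procedure used in any particular system.

    import numpy as np

    # Minimum-distance-to-means classification sketch.
    # Each training "signature" is the mean spectrum of a class collected
    # in the field; each pixel is assigned to the nearest class mean.
    # The 4-band spectra are invented for illustration only.

    signatures = {
        "water":      np.array([0.05, 0.04, 0.03, 0.02]),
        "vegetation": np.array([0.04, 0.08, 0.06, 0.45]),
        "bare soil":  np.array([0.15, 0.18, 0.22, 0.30]),
    }

    pixels = np.array([
        [0.05, 0.07, 0.05, 0.40],   # looks like vegetation
        [0.06, 0.05, 0.04, 0.03],   # looks like water
    ])

    for pix in pixels:
        distances = {name: np.linalg.norm(pix - mean) for name, mean in signatures.items()}
        label = min(distances, key=distances.get)
        print(label, {k: round(v, 3) for k, v in distances.items()})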

Computer analysis of remotely sensed images offers several advantages, such as quick processing of large volumes of data and special image processing possibilities, including geometrical and other corrections, scale changing, band ratioing, contrast enhancement, edge enhancement, and feature enhancement.
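
Two of the operations just listed, band ratioing and linear contrast enhancement, can be sketched directly on arrays of digital numbers; the values and the 2-98 percentile stretch limits below are illustrative choices, not prescriptions.

    import numpy as np

    # Band ratioing and a simple linear contrast stretch on toy digital numbers.
    red = np.array([[40.0, 60.0], [35.0, 80.0]])
    nir = np.array([[90.0, 70.0], [120.0, 85.0]])

    ratio = nir / np.maximum(red, 1e-6)        # band ratio, guarding against divide-by-zero
    print("NIR/Red ratio:\n", np.round(ratio, 2))

    def linear_stretch(band, low_pct=2, high_pct=98):
        """Stretch the band so the chosen percentiles map onto 0..255."""
        lo, hi = np.percentile(band, [low_pct, high_pct])
        stretched = (band - lo) / (hi - lo) * 255.0
        return np.clip(stretched, 0, 255).astype(np.uint8)

    print("stretched red band:\n", linear_stretch(red))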

Besides the standard peripherals, the digital system consists of a high-resolution display terminal for interaction with the resource scientist and output devices such as plotters. Sample reference ground data, called training data, are collected and used conjunctively in order to obtain more accurate thematic information by means of image classification. These thematic information layers are then used as input data for a GIS.

Reference data, also called ground truth or field check data, are an essential part of remote sensing data processing. Reference data may serve any or all of several purposes (Lillesand).

Collection of reference data consists of either time-critical or time-stable measurements. Time-critical measurements are those made where ground conditions, such as vegetation conditions or water pollutants, change rapidly with time. Time-stable measurements, like the geology of the area of interest, involve materials under observation that do not change with time.

The rate of transfer of radiant energy is called the flux and has watts as the units of power. Density implies distribution over the surface on which the radiant energy falls. If radiant energy falls upon a surface, then the term irradiance (E) is used in place of radiant flux density. If the flow of energy is away from the surface, as in the case of thermal energy emitted by the earth or incoming solar energy which is reflected by the earth, then the term radiant exitance or radiant emittance, measured in units of W m-2, is used (Mather). Radiance (L) is defined as the radiant flux density transmitted from a small area on the earth's surface and viewed through a unit solid angle.

It is measured in watts per square metre per steradian (W m-2 sr-1). The concepts of the radian and steradian are illustrated in the accompanying figure. Another important term encountered in remote sensing technology is 'reflectance', denoted by ρ. It is defined as the ratio of the radiant emittance of an object to the irradiance falling upon it. When remotely sensed images collected over a period of time are to be compared, it is most appropriate to convert the radiance values recorded by the sensor into reflectance in order to eliminate the effects of variable irradiance over the seasons of the year.
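
A common way of carrying out the radiance-to-reflectance conversion mentioned here is to assume a Lambertian (perfectly diffuse) surface, in which case reflectance is π times the measured radiance divided by the irradiance reaching the target. The Lambertian assumption and the numbers below are illustrative choices, not taken from the text.

    import math

    def reflectance_from_radiance(radiance_w_m2_sr, irradiance_w_m2):
        """Apparent reflectance assuming a Lambertian (perfectly diffuse) surface."""
        return math.pi * radiance_w_m2_sr / irradiance_w_m2

    # Illustrative values for the same target imaged in two seasons:
    summer = reflectance_from_radiance(radiance_w_m2_sr=45.0, irradiance_w_m2=950.0)
    winter = reflectance_from_radiance(radiance_w_m2_sr=22.0, irradiance_w_m2=460.0)
    print(round(summer, 3), round(winter, 3))   # similar reflectance despite different irradiance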

There are 2π radians (360 degrees) in a circle and 4π steradians in a sphere; the solid angle need not refer to a uniform shape (Mather). The reflectance characteristic of earth surface features may be quantified by measuring the portion of incident energy that is reflected; it is a dimensionless quantity. The quantities described above are very often used to refer to particular narrow wavebands rather than to the whole spectrum.

The terms are then preceded by the word 'spectral', as in 'spectral radiance for a given waveband is the radiant flux density in the waveband per unit solid angle per unit wavelength' (Curran). The sun's light is the form of electromagnetic radiation most familiar to human beings. The light reflected by physical objects travels in a straight line to the observer's eye.

On reaching the retina, it generates electrical signals which are transmitted to the brain by the optic nerve. These signals are used by the brain to construct an image of the viewer's surroundings. This is the process of vision, and it is closely analogous to the process of remote sensing; indeed, vision itself is a form of remote sensing.

The set of all electromagnetic waves is called the electromagnetic spectrum, which includes the range from long radio waves, through the microwave and infrared wavelengths, to visible light waves and beyond them to the ultraviolet and to the short-wave X-rays and gamma rays.

To be precise, in some situations electromagnetic energy behaves like waves, while in others it displays the properties of particles. The nature of electromagnetic energy was a controversial question for many years.

Wave theory proved sufficient to explain the nature of visible light, even though originally it was not realised that light is a form of electromagnetic energy.

The electromagnetic wave theory formulated by Maxwell succeeded in characterising the electric and magnetic fields and their relation to charges and currents, and in expressing these relationships in a set of partial differential equations now known generally as Maxwell's equations.

Maxwell demonstrated that it was possible to have wave-like configurations of electric and magnetic fields.

Maxwell's equations explain a great variety of phenomena relating to propagation, dispersion, reflection, refraction, and interference of electromagnetic waves; but they do not explain the interaction of electromagnetic energy with matter on an atomic and molecular level.

In 1900 Planck found that, in order to obtain the correct distribution of energy emitted by a black body, he could not assume that the constituent oscillators gain and lose energy continuously. He was instead forced to assume that a particular oscillator of frequency ν is able to exist only in discrete states whose energies are separated by the interval hν, where h is known as Planck's constant. Planck's ideas were applied and extended shortly afterwards.

However, it was Schrödinger who, in 1926, formulated wave mechanics in terms of a wave equation. The Schrödinger wave equation for atomic- and molecular-scale problems is not really derivable, and should be regarded as the counterpart of Newton's laws of motion for macroscopic bodies.

It is used and accepted not because its derivation demonstrates its validity, but because, when properly applied, it yields correct results consistent with observation and experiment.

The Schrödinger wave equation directly yields the allowed energy levels of an atomic or molecular system. Based on the historical development of understanding the nature of electromagnetic energy, it is presently possible to furnish a consistent and unambiguous theoretical explanation for all optical phenomena using a combination of Maxwell's electromagnetic wave theory and modern quantum theory. Maxwell's theory deals primarily with the propagation and macroscopic optical effects of electromagnetic energy, while quantum theory is concerned with the atomic and molecular absorption and emission aspects of radiation.

Maxwell's Theory. The four differential equations that form the basis of electromagnetic theory are generally referred to as "Maxwell's equations," and they are expressed in mathematical terms. The electric and magnetic fields may exist in regions where no electric charges are present.

When the fields at one point in space vary with time, then some variation of the fields must occur at every other point in space at some other time, and consequently, changes in the fields propagate throughout space.

The propagation of such a disturbance is called an electromagnetic wave. According to Maxwell the electromagnetic state at a pOint in a vacuum can be specified by two vectors: E, the electric field in volts per meter and H, the magnetic field in ampere turns per meter.

These vector quantities are completely independent of each other in the static case, and are determined by the distribution of all charges and currents in space.

In the dynamic case, however, the fields are not independent, but rather their space and time derivatives are interrelated as expressed by the curl (∇×) equations.

These four equations are "Maxwell's equations" for a vacuum. Both fields satisfy the same partial differential equation, the wave equation. The major implication of this equation is that changes in the fields E and H propagate through space at a speed equal to the constant value c, known as the speed of light, with a measured value of about 2.998 x 10^8 m per second.
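
The constant c that emerges from Maxwell's equations can be checked numerically from the vacuum permeability and permittivity, using the standard relation c = 1/sqrt(μ0 ε0) (a relation not spelled out in the text):

    import math

    mu_0 = 4.0e-7 * math.pi        # vacuum permeability, H/m
    epsilon_0 = 8.8541878128e-12   # vacuum permittivity, F/m

    c = 1.0 / math.sqrt(mu_0 * epsilon_0)
    print(f"c = {c:.4e} m/s")      # ~2.9979e8 m/s, the measured speed of light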

This is the fundamental solution to the wave equation, representing a plane harmonic wave, for which the solution is the same as that for the magnetic field. It can be shown that the magnetic and electric components are perpendicular to each other and that these plane waves are both perpendicular to the direction of propagation.

In summary, it can be seen that all electromagnetic radiation is energy in transit. It consists of inseparable oscillating electric and magnetic fields that are always mutually perpendicular to each other and to the direction of propagation, the rate of propagation being constant in a vacuum. The discrete packets in which this energy is carried are called quanta or photons. The dilemma of the simultaneous wave and particle nature of electromagnetic energy may be conceptually resolved by considering that energy is not supplied continuously throughout a wave, but rather is carried by photons.

The classical wave theory does not give the intensity of energy at a point in space, but gives the probability of finding a photon at that point. Thus the classical concept of a wave yields to the idea that a wave simply describes the probability path for the motion of the individual photons. The particular importance of the quantum approach for remote sensing is that it provides the concept of discrete energy levels in materials. The values and arrangement of these levels are different for different materials.

Information about a given material is thus available in electromagnetic radiation as a consequence of transitions between these energy levels. A transition to a higher energy level is caused by the absorption of energy, while a transition from a higher to a lower energy level is accompanied by the emission of energy.

The amounts of energy either absorbed or emitted correspond precisely to the energy difference between the two levels involved in the transition. Because the energy levels are different for each material, the amount of energy a particular substance can absorb or emit is different for that material from any other materials. Consequently, the position and intensities of the bands in the spectrum of a given material are characteristic to that material.

The wavelength, denoted by λ, is the distance between adjacent intensity maxima (for example, successive crests) of the electromagnetic wave, and consequently it may be expressed in any unit of length. The frequency, denoted by ν, is the number of maxima of the electromagnetic wave that pass a fixed point in a given time.

Frequency is commonly expressed in reciprocal centimetres, also called wave numbers (cm-1), or in cycles per second (cps), also called Hertz (Hz). The wavelengths may assume any value, although for most practical purposes the spectrum is usually presented between 10^-16 and 10^7 m, or from the cosmic-ray to the audio range. However, wavelengths as long as 10^11 m have been detected by sensitive magnetometers. No matter what the wavelength of the electromagnetic radiation, it is all generated by electrically charged matter.
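
The quantities discussed here are tied together by the standard relations c = λν and E = hν; the short sketch below converts a visible-band wavelength into frequency, wavenumber and photon energy.

    # Wavelength <-> frequency <-> wavenumber <-> photon energy conversions.
    C = 2.998e8          # speed of light, m/s
    H = 6.626e-34        # Planck's constant, J s

    wavelength_m = 0.55e-6                           # 0.55 micrometres, green light
    frequency_hz = C / wavelength_m                  # nu = c / lambda
    wavenumber_cm1 = 1.0 / (wavelength_m * 100.0)    # reciprocal centimetres
    photon_energy_j = H * frequency_hz               # E = h * nu

    print(f"frequency    : {frequency_hz:.3e} Hz")
    print(f"wavenumber   : {wavenumber_cm1:.0f} cm^-1")
    print(f"photon energy: {photon_energy_j:.3e} J")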

However, there is no universal radiation generator that provides a useful intensity of radiation at all wavelengths for practical purposes, and there is no universal wavelength resolving instrument or universal detector.

Consequently, the spectrum has been divided into regions that bear names related to the sources that produce it (such as the "ray" regions), or as extensions from the visible range (such as the ultraviolet and the infrared regions), or according to the way in which wavelengths in a range are used (such as radio and television). The extents of the wavelength ranges corresponding to these names were set arbitrarily, and the decision as to where the divisions should lie was made mostly on the basis of the limits imposed by the range of the human eye (the visible), the properties of optical materials, and the response limits of various sources and detectors.

In brief, the electromagnetic spectrum is the continuum of energy that ranges from metres to nanometres in wavelength, travels at the speed of light, and propagates through a vacuum such as outer space (Sabins). All matter radiates a range of electromagnetic energy, with the peak intensity shifting toward progressively shorter wavelengths as the temperature of the matter increases. In general, the spectrum runs from short-wavelength, high-frequency cosmic waves to long-wavelength, low-frequency radio waves.

The wavelengths of greatest interest in remote sensing are those of visible and near-infrared radiation. Spectral Wave Bands: visible light is electromagnetic radiation with wavelengths between about 0.4 and 0.7 μm.

The eye is not uniformly sensitive to light within this range and has its peak sensitivity at about 0.55 μm. This peak in the response function of the human eye corresponds closely to the peak in the sun's radiation emittance distribution. Electromagnetic radiation with wavelengths shorter than those of visible light lies in the ultraviolet region.

Because of the effects of scattering and absorption, none of these bands is used in satellite remote sensing. The infrared waveband extends from the red end of the visible spectrum (about 0.7 μm) towards longer wavelengths, the short-wavelength or near-IR portion lying closest to the visible. Infrared radiation with a wavelength up to 3 μm is reflected by the surface of the earth.

Beyond a wavelength of 3 μm, IR radiation emitted by the earth's surface can be sensed in the form of heat. The region of the spectrum composed of electromagnetic radiation with wavelengths between 1 mm and about 100 cm is called the microwave band, and radiation at these wavelengths can penetrate clouds. The microwave band is thus a valuable region for remote sensing.

Beyond the microwave region is the radio band of very long wavelengths, used in certain radar applications. The electromagnetic wavebands and their utility in remote sensing are described in Table 2. All stars and planets emit radiation.

Our chief star, the sun, is an almost spherical body with a diameter of about 1.39 million km. The continuous conversion of hydrogen to helium, which are the main constituents of the sun, generates the energy that is radiated from the outer layers.


Table 2 summarises the wavebands: X-ray, not employed in remote sensing; ultraviolet (including the photographic UV band), detectable with film and photodetectors, but atmospheric scattering is severe; visible, which includes the reflected-energy peak of the earth; infrared, in which the atmospheric transmission windows are separated, the reflected-IR band being imaged by optical-mechanical scanners and special vidicon systems but not by film; microwave, in which images may be acquired in the active or passive mode; and radar, the active form of microwave remote sensing, in which some classified radars with very long wavelengths operate.


This radiation includes the energy reflected by clouds and the atmosphere. If the sun were a perfect emitter, it would be an example of an ideal black body.

A black body transforms heat energy into radiant energy at the maximum possible rate, consistent with Planck's law, which defines the spectral exitance of a black body (Henderson); in plain form, M(λ) = C1 / {λ^5 [exp(C2 / λT) - 1]}, where C1 and C2 are the radiation constants and T is the absolute temperature. The wavelength at which the maximum spectral exitance is achieved is reduced as the temperature increases; this is expressed by Wien's displacement law, which gives the wavelength of maximum spectral exitance λm in the form λm = A/T, where A is a constant of about 2898 μm K. The solar radiation maximum occurs at about 0.5 μm.
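
Wien's displacement law as written above can be evaluated directly; taking round-figure temperatures of about 6000 K for the sun and 300 K for the earth's surface (typical textbook values, not figures quoted here) gives peaks near 0.48 μm and 9.7 μm respectively.

    WIEN_CONSTANT_UM_K = 2898.0   # micrometre-kelvin

    def peak_wavelength_um(temperature_k: float) -> float:
        """Wavelength of maximum spectral exitance for a black body (Wien's law)."""
        return WIEN_CONSTANT_UM_K / temperature_k

    print(f"Sun   (~6000 K): peak at {peak_wavelength_um(6000):.2f} um")  # ~0.48 um, visible
    print(f"Earth (~300 K) : peak at {peak_wavelength_um(300):.2f} um")   # ~9.7 um, thermal IR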

Wavelength-dependent mechanisms of atmospheric absorption alter the solar irradiance that actually reaches the surface of the earth. Generally, the selection of wavebands for use depends on (a) the characteristics of the radiation source, (b) the effects of atmospheric absorption and scattering, and (c) the nature of the target. Passage through the atmosphere will alter the speed, frequency, intensity, spectral distribution, and direction of the radiation.

As a result, atmospheric scattering and absorption occur (Curran). These effects are most severe at visible and infrared wavelengths, the range most crucial in remote sensing. During the transmission of energy through the atmosphere, light interacts with gases and particulate matter in a process called atmospheric scattering.

The two major processes in scattering are selective scattering and non-selective scattering. Rayleigh, Mie and Raman scattering are of the selective type. Non-selective scattering is independent of wavelength; it is produced by particles whose radii exceed 10 μm, such as water droplets and ice fragments present in clouds.
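
Rayleigh scattering, by contrast, is strongly wavelength-selective: its intensity varies roughly as the inverse fourth power of wavelength (a standard result not stated explicitly here), so blue light is scattered several times more strongly than red.

    # Relative Rayleigh scattering strength, proportional to 1/wavelength**4.
    wavelengths_um = {"blue": 0.45, "green": 0.55, "red": 0.65}

    reference = wavelengths_um["red"] ** -4
    for colour, wl in wavelengths_um.items():
        relative = (wl ** -4) / reference
        print(f"{colour:5s} ({wl} um): {relative:.2f} x red")
    # blue is scattered roughly four times more strongly than red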

This type of scattering reduces the contrast of the image. While passing through the atmosphere, electromagnetic radiation is scattered and absorbed by gases and particulates. Besides the major gaseous components like molecular nitrogen and oxygen, other constituents like water vapour, methane, hydrogen, helium and nitrogen compounds play an important role in modifying the incident radiation and reflected radiation.

This causes a reduction in the image contrast and introduces radiometric errors. Regions of the electromagnetic spectrum in which the atmosphere is transparent are called atmospheric windows. The atmosphere is practically transparent in the visible region of the electromagnetic spectrum, and therefore many of the satellite-based remote sensing sensors are designed to collect data in this region. Commonly used atmospheric windows lie in the visible and near-infrared, the thermal infrared, and the microwave regions.

The characteristics of all four types of scattering, in the order of their importance in remote sensing, are given in Table 2. Of these, Mie scattering is caused by spherical particles of about the same size as the wavelength of the radiation (a physical scattering process, important under overcast skies) and affects all visible wavelengths.

Therefore, the remaining signal can be interpreted in terms of suspensions only after a careful correction for the atmospheric contribution. For this reason the varying optical parameters of the atmosphere must enter the radiative transfer calculations (Fischer). Before we study the effects of solar radiation and atmospheric properties, we shall consider the main quantities which determine the spectral upward radiance.

The source of the shortwave radiation field in the atmosphere is the sun, emitting in a broad spectral range. The extraterrestrial irradiance at the top of the atmosphere, the solar constant, depends on the black-body emission of the sun's photosphere and on the scattering and absorption processes in the sun's chromosphere. Important Fraunhofer lines caused by the strong absorption in the sun's chromosphere show some prominent drops in the spectral distribution of the solar radiation.

The Chappuis band of ozone in the visible spectrum is the only ozone band used to detect the oceanic constituents from space.

The transmission of chlorophyll fluorescence to the top of the atmosphere is hindered by absorption by water vapour and molecular oxygen in their vibration bands. In order to treat the selective gaseous absorption in the radiative transfer calculations, the transmission functions of O2 and H2O are computed from absorption line parameters using Lorentz's theory of collision broadening.

The contribution from resonance broadening is negligible in the spectral region considered. Also the Doppler line broadening, which is small when compared with Lorentz line widths, is neglected, since most of the absorption takes place in the atmosphere below 40 km (Barrow). (A figure in the original shows the reduction in the solar flux due to absorption and scattering by a clear mid-latitude summer atmosphere.)

Response studies for the temperature and pressure dependence of the transmission function have been performed and show only a weak influence for the temperature effect. The pressure impact is not negligible and has to be accounted for. Air molecules are small compared to the wavelength of the incoming sunlight. Hence, the extinction through molecular scattering can be determined with Rayleigh theory.

Since molecular scattering within the atmosphere depends mainly on pressure, the necessary scattering coefficient can be estimated from climatological measurements.

Atmospheric spectral turbidity variations are caused by variations in aerosol concentration, composition and size distribution. The vertical distribution of the aerosols is taken from Adler and Ken. The phase functions of aerosols are nearly wavelength-independent within the visible and near-infrared.

For the radiative transfer calculations the scattering functions are estimated by Mie theory. The range of atmospheric turbidity values used to study the effects of aerosol scattering on the measured spectral radiances corresponds to horizontal visibilities at the surface between 6 and 88 km. The two atmospheric effects, scattering and absorption, can be expressed mathematically (Lillesand and Kiefer). The amount of irradiance depends on seasonal changes, solar elevation angle, and the distance between the earth and the sun.

Applying the principle of conservation of energy, the relationship can be expressed as EI(λ) = ER(λ) + EA(λ) + ET(λ), where EI(λ) is the incident energy and ER(λ), EA(λ) and ET(λ) are the reflected, absorbed and transmitted energy, respectively. In remote sensing, the amount of reflected energy ER(λ) is of most interest, so it is convenient to rearrange these terms as ER(λ) = EI(λ) - EA(λ) - ET(λ). From this mathematical equation, two important points can be drawn. Simply, it can be understood that the measure of how much electromagnetic radiation is reflected off a surface is called its reflectance, and that this reflectance lies between 0 and 1, a value of 1.0 meaning that all the incident energy is reflected. The reflectance characteristics are quantified by the spectral reflectance ρ(λ), defined as ER(λ)/EI(λ).

According to Kirchhoff's law, the absorptance can be taken as the emissivity ε. The classical example of a near-perfect reflector is a snow-white object, while a black body, such as lamp smoke, is an example of a near-perfect absorber. Therefore it can be seen that the reflectance varies from 0 (black body) to 1 (white body). When we divide both sides of the balance equation by the incident energy, we get the proportions of energy reflected, absorbed and transmitted, which vary for different features of the earth depending on the material type.
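
The balance EI(λ) = ER(λ) + EA(λ) + ET(λ) means that, once divided by the incident energy, the reflected, absorbed and transmitted fractions must sum to one; the sketch below simply checks that bookkeeping for invented values.

    # Energy balance of incident radiation: reflected + absorbed + transmitted.
    incident = 100.0                  # arbitrary units of incident energy
    reflected, absorbed, transmitted = 35.0, 50.0, 15.0

    rho   = reflected / incident      # reflectance
    alpha = absorbed / incident       # absorptance (equals emissivity by Kirchhoff's law)
    tau   = transmitted / incident    # transmittance

    assert abs(rho + alpha + tau - 1.0) < 1e-9   # fractions must sum to 1
    print(f"reflectance={rho}, absorptance={alpha}, transmittance={tau}")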

These differences provide a clue to differentiate between features of an image. Thus two features which are indistinguishable in one spectral range, may exhibit a marked contrast in another wavelength band. Because many remote sensing systems operate in the wavelength regions in which reflected energy predominates, the reflectance properties of terrestrial features are very important.

The manner of interaction is described by the spectral response of the target. The spectral reflectance curves describe the spectral response of a target in a particular wavelength region of the electromagnetic spectrum, which, in turn, depends upon certain factors, namely the orientation of the sun (solar azimuth), the height of the sun in the sky (solar elevation angle), and the direction in which the sensor is pointing relative to nadir (the look angle).

Elevation is measured upwards from the horizontal plane, and azimuth is measured clockwise from north. The zenith angle is measured from the surface normal and equals 90 degrees minus the elevation angle. The spectral reflectance curve for vigorous vegetation manifests the "peak-and-valley" configuration. The valleys in the visible portion of the spectrum are indicative of pigments in plant leaves.

Dips in reflectance in the middle-infrared correspond to water absorption in the leaf. The soil curve shows a more regular variation of reflectance. Factors that evidently affect soil reflectance are moisture content, soil texture, surface roughness, and the presence of organic matter. The term spectral signature can also be used for spectral reflectance curves.

Spectral signature is a set of characteristics by which a material or an object may be identified on any satellite image or photograph within a given range of wavelengths. The characteristic spectral reflectance curve for water shows generally low reflectance; however, the spectral reflectance of water is significantly affected by the presence of dissolved and suspended organic and inorganic material and by the depth of the water body. Experimental studies in the field and in the laboratory, as well as experience with multispectral remote sensing, have shown that specific targets are characterised by an individual spectral response.

Indeed, the successful development of remote sensing of the environment over the past decade bears witness to its validity. In the remaining part of this section, typical and representative spectral reflectance curves for characteristic types of surface materials are considered. Imagine a beach on a beautiful tropical island.

The solid lines in the figure represent the incident rays, dashed lines 1, 2, and 3 represent rays reflected from the surface without ever penetrating a sand grain, and lines 4 and 5 are volume rays (after Vincent and Hunt). The reflected rays are called specular rays by Vincent and Hunt, and surface-scattered rays by Salisbury and Wald; these rays result from first-surface reflection from all grains encountered.

For a given reflecting surface, all specular rays are reflected in the same direction, such that the angle of reflection (the angle between the reflected rays and the normal, or perpendicular, to the reflecting surface) equals the angle of incidence (the angle between the incident rays and the surface normal).

The measure of how much electromagnetic radiation is reflected off a surface is called its reflectance, which is a number between 0 and 1. In the case of first-surface reflection, this measure is called the specular reflectance, designated here as rs(λ). The λ in parentheses indicates that specular reflectance is a function of wavelength.

The reason that rs(λ) is a function of wavelength is that the complex index of refraction of the reflecting surface material is itself dependent on wavelength.


The term complex means that there is a real and an imaginary part to the index of refraction. Every material has a complex index of refraction, though for some materials at some wavelengths only the real part may be nonzero. For instance, if the specular reflectances of three grains at a particular wavelength of electromagnetic radiation differ, the specular reflectance of the surface they form is taken as their average.

The specular reflectance of the beach surface, RS(λ), is the average of the specular reflectances of all the individual grains. Rays of electromagnetic radiation that have been transmitted through some portion of one or more grains are called volume rays; these are shown as dashed lines 4 and 5 in the figure. The equation for the volume reflectance, rv(λ), of a sand grain is complicated because it depends on both the transmittance of the grain and the interface reflectance between the top of that grain and the underlying grain(s).

The average rv(λ) for all the grains in the beach from which electromagnetic radiation is reflected is defined as the volume reflectance of the beach, RV(λ).

The total reflectance of the beach, RT(λ), is the sum of the specular and volume reflectances: RT(λ) = RS(λ) + RV(λ). Three important observations can be summarised from the above discussion of the beach example. Note that when we use the terms transparent or opaque to describe optical behaviour, we must designate both a wavelength region and the material, because the complex index of refraction of any material is generally not constant over a large range of wavelengths.

To consider the effect on reflectance of mixing several minerals together, let us take the simpler case of a particulate medium consisting of several mineral constituents, with air filling the interstices between particles. It is possible to estimate the spectral reflectance of a mixed-mineral particulate sample by using a linear combination of the reflectance spectra of its mineral constituents, weighted by the percentage of area on the sample's surface that is covered by each mineral constituent.

The following equation demonstrates this estimation for the total spectral reflectance of a mixed particulate sample at wavelength λ: RT(λ) = Σ fi Ri(λ), where fi is the fraction of the surface area covered by constituent i and Ri(λ) is its reflectance spectrum.
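
A minimal numerical version of this areal-weighted mixing is given below; the two constituent spectra and their fractions are invented purely to show the linear combination at work.

    import numpy as np

    # Linear spectral mixing: R_total(lambda) = sum_i f_i * R_i(lambda),
    # where f_i is the areal fraction covered by constituent i.
    wavelengths_um = np.array([0.5, 1.0, 1.5, 2.0, 2.5])

    quartz_like = np.array([0.55, 0.60, 0.62, 0.58, 0.50])   # invented spectrum
    clay_like   = np.array([0.30, 0.40, 0.45, 0.35, 0.20])   # invented spectrum
    fractions   = {"quartz-like": 0.7, "clay-like": 0.3}     # areal fractions, sum to 1

    mixed = fractions["quartz-like"] * quartz_like + fractions["clay-like"] * clay_like
    for wl, r in zip(wavelengths_um, mixed):
        print(f"{wl} um: R_total = {r:.3f}")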

Thus far we have talked about volume reflectance and specular reflectance on the basis of whether electromagnetic rays did or did not penetrate one or more grains in a soil or rock surface.

Now we need to define some reflectance terms that relate to the manner in which the soil or rock surface is illuminated, as well as how the reflected energy from its surface is measured. The most fundamental term for reflectance used in this book is the spectral hemispherical reflectance, or diffuse reflectance. As the angle a off the optical axis decreases, the intensity of energy increases, and vice versa.

This common effect in photographic images is called vignetting. The microwave band is a valuable region for remote sensing in view of two distinctive features: (i) microwaves are capable of penetrating the atmosphere under almost all conditions; depending on the wavelengths involved, microwave energy can 'see through' haze, light rain, snow, clouds, and smoke; (ii) microwave reflections or emissions from earth materials bear no direct relationship to their counterparts in the visible or thermal portions of the spectrum.

Surfaces that appear rough in the visible may be smooth in the microwave. Remote sensing techniques in the microwave region of the electromagnetic spectrum can be classified into two categories (Reeves): active systems provide their own illumination, whereas passive systems record the energy of thermal origin emitted from materials.

Active microwave sensing systems are of two types: imaging sensors and non-imaging sensors. Radar is an acronym derived from Radio Detection and Ranging. Imaging radars are divided into two categories: the first is real aperture systems, and the second is synthetic aperture systems.

In the real aperture system, resolution is determined by the actual beam width and antenna size. The synthetic aperture system utilises signal processing techniques to achieve a narrow beam width in the along-track direction, which provides better resolution.

Non-imaging remote sensing radars are either scatterometers or altimeters. Any calibrated radar that measures the scattering properties of a surface is called a scatterometer. Passive microwave sensors, called radiometers, measure the emissive properties of the earth's surface.

A radar altimeter sends out pulses of microwave signals and records the signal scattered back from the earth's surface. The height of the surface can be measured from the time delay of the return signals. A wind scatterometer can be used to measure wind speed and direction over the ocean surface. It sends out pulses of microwaves along several directions and records the magnitude of the signals that are backscattered from the ocean surface.

The magnitude of the backscattered signals is related to the ocean surface roughness, which, in turn, is dependent on the sea-surface wind conditions, so that the wind speed and direction can be derived. Imaging radars are side-looking rather than nadir-looking instruments, and the imaging geometry is complicated by foreshortening, to the extent that the top of a mountain can appear closer to the sensor than its foot, and by shadow, the far side of a mountain or hill being invisible to the side-looking radar sensor.

A microwave radiometer is a passive device which records the natural microwave emission from the earth. It can be used to measure the total water content of the atmosphere within its field of view.

The application potential of radar remote sensing for various disciplines, such as soil moisture studies, agriculture, geology, hydrology, and oceanography, has been demonstrated through various ground-based, aircraft and spacecraft experiments. This chapter provides the principles of radar remote sensing.

The microwave portion of the spectrum includes wavelengths within the approximate range of 1 mm to 1 m. In active microwave remote sensing, the radar antenna transmits short bursts (pulses) of energy to the target, and the echoes from these targets carry information about the position (range) and quality of the illuminated objects.

The radar equation relates the influence of the system and terrain parameters to the power received by the antenna (Reeves); in its standard form, Pr = Pt G^2 λ^2 σ / [(4π)^3 R^4], where Pt is the transmitted power, G the antenna gain, λ the wavelength, σ the radar cross-section, and R the range. The power received from a resolution cell is the combined power obtained by adding the contributions from the individual scatterers within it. The backscattering coefficient, according to Elachi, is defined as the ratio of the energy received by the sensor to the energy that the sensor would have received if the surface had scattered the energy incident on it in an isotropic fashion.

This is expressed in decibels (dB). The backscattering coefficient describes the terrain's contribution to the radar image tone, and represents the radar cross-section per unit area of the resolution cell.
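
Because backscattering coefficients span several orders of magnitude, the decibel value is obtained as 10 log10 of the linear coefficient; values above 0 dB then correspond to returns stronger than an isotropic scatterer and values below 0 dB to weaker returns.

    import math

    def sigma0_to_db(sigma0: float) -> float:
        """Convert a linear backscattering coefficient to decibels."""
        return 10.0 * math.log10(sigma0)

    for sigma0 in (2.0, 1.0, 0.1, 0.01):
        print(f"sigma0 = {sigma0:5.2f}  ->  {sigma0_to_db(sigma0):6.1f} dB")
    # values above 0 dB indicate a return stronger than an isotropic scatterer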

It is a result of the sensor-target interaction. The backscattering coefficient can be a positive number (energy focused back toward the sensor) or a negative number (energy scattered away from the back direction). The factors involved can be grouped into sensor parameters, namely (1) wavelength and frequency, (2) polarization, (3) angle of incidence, and (4) flight parameters such as flight direction and altitude; and target parameters, namely (1) electrical characteristics, (2) surface inhomogeneities and microrelief (resonant components), (3) mesorelief (surface roughness), and (4) sub-surface structures. Radar returns are thus governed by the physical and electrical properties of the target. The electromagnetic property of materials is expressed by the complex relative permittivity (dielectric constant). For a conducting medium, the amplitude of a wave propagating in it is attenuated exponentially with distance.

Therefore strong backscatter is observed only in the nadir direction. Rough surfaces tend to reradiate uniformly in all directions (diffuse scattering), so they give relatively strong radar returns in all directions.

For a smooth surface, where the surface roughness scale is much shorter than the wavelength, incident energy is reflected specularly, as illustrated in the figure. As the roughness scale approaches the same dimension as the wavelength, the scattered energy is dispersed, and when the roughness scale exceeds the wavelength of the incident energy, scattering is nearly uniform over the hemisphere.

A more exact classification of surface roughness, taking surface slopes into account, is given by Fung, who distinguishes slightly rough surfaces, smooth undulating surfaces, and two-scale composite rough surfaces. Surface scattering normally occurs at the air-ground interface, whereas volume scattering is caused by dielectric discontinuities within a volume.

The surface scattering mechanism is an important component of the radar scattering process. In general, surface scattering occurs at the air-ground interface. For a perfectly smooth surface, the incident wave will excite the atomic oscillators in the dielectric medium at a relative phase such that the reradiated field consists of two plane waves, namely a reflected wave and a refracted (transmitted) wave.

For a rough surface, energy is scattered in all directions, depending upon the roughness as well as the dielectric properties of the surface.

The roughness can be statistically characterised by its standard deviation relative to the mean flat surface. The surface correlation length is the separation after which two points are statistically independent, that is, the length beyond which the autocorrelation function is less than 1/e. Descriptions of models like the point scattering model, facet model, and Bragg model are beyond the scope of this handbook.

Although a radar signal does not detect colour or temperature information, it does detect surface roughness and electrical conductivity, the latter depending on soil moisture conditions. Hence the wavelength, depression angle, and polarisation of the signal are important properties. There are two categories of side-looking airborne radar (SLAR) systems, namely real aperture and synthetic aperture. The latter is the focus here, but the real aperture SLAR systems may be briefly considered so as to understand why synthetic aperture radar (SAR) systems have been developed.

This pulse moves out radially from the antenna and results in a beam being formed which is vertically wide but horizontally narrow. The time taken by a pulse to reach an object and return to the antenna is measured. From this time measurement it is possible to determine the slant-range distance between the antenna and the object. An image product is generated in a film recorder by using the signal to control the intensity of the beam on a single-line cathode ray tube (CRT) and recording this display; the film is advanced at a rate proportional to the aircraft's motion.

In this way the combined response of many radar pulses is used to generate an image in which each line is the tonal representation of the strength of the signals returned to the radar antenna from a single pulse (Lillesand and Kiefer). The ground resolution cell size of a SLAR system mainly depends on pulse length and antenna beam width. The pulse length is defined as the length of time that the antenna emits its energy; it determines the spatial resolution in the direction of propagation.

Resolution in this direction is called range resolution; the other is azimuth resolution. The resolution of a radar system is therefore measured in two directions: along the track (azimuth resolution) and across the track (range resolution). The effective resolution is the minimum separation distance that can be determined between two targets with echoes of similar strength.

The resolution in the range direction is governed by the pulse length: the slant-range resolution is approximately the speed of light multiplied by the pulse duration, divided by two. In the azimuthal direction, however, the resolution R is determined by the angular beam width of the antenna and the slant-range distance, R being approximately the beam width multiplied by the slant range. This is because the radar beam 'fans out' with increasing distance from the aircraft.

This results in a deterioration of the azimuthal resolution with increasing range, so objects which are separable close to the flight line are not distinguishable farther away. At spacecraft altitudes, the azimuthal resolution of a real aperture system becomes too coarse. To obtain a fine range resolution, shorter pulses would need to be transmitted, and these would require a high peak power.
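
Using the real-aperture relations given above (slant-range resolution of c times the pulse length divided by two, and azimuth resolution of beam width times slant range, with beam width approximated by λ/D for an antenna of length D), the sketch below shows how azimuth resolution degrades with range while range resolution does not; the pulse length, wavelength and antenna size are assumed values.

    # Real-aperture SLAR resolution sketch (illustrative parameters).
    C = 3.0e8            # speed of light, m/s
    pulse_s = 0.1e-6     # pulse length tau: 0.1 microseconds (assumed)
    wavelength_m = 0.03  # radar wavelength (assumed)
    antenna_m = 3.0      # physical antenna length (assumed)

    range_resolution_m = C * pulse_s / 2.0           # independent of distance
    beam_width_rad = wavelength_m / antenna_m        # approximate angular beam width

    for slant_range_km in (10, 50, 200):
        azimuth_resolution_m = beam_width_rad * slant_range_km * 1000.0
        print(f"slant range {slant_range_km:3d} km: "
              f"range res = {range_resolution_m:.0f} m, "
              f"azimuth res = {azimuth_resolution_m:.0f} m")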

Radar systems in which the beam width is controlled by the physical antenna length are called brute-force, real aperture, or non-coherent radars. The microwave energy scattered back to the spacecraft is measured.

The SAR makes use of the radar principle to form an image by utilising the time delay of the backscattered signals. In real aperture radar imaging, the ground resolution is limited by the size of the microwave beam sent out from the antenna.

Finer details on the ground can be resolved by using a narrower beam. The beam width is inversely proportional to the size of the antenna, that is, the longer the antenna, the narrower the beam.

It is not feasible for a spacecraft to carry the very long antenna that would be required for high-resolution imaging of the earth's surface.

The antenna's footprint sweeps out a strip parallel to the direction of the satellite's ground track. With a SAR system there is the advantage that the azimuth or along-track resolution is improved by making it independent of the range. This is because with a SAR system a physically short antenna is made to behave as if it were much longer.

This aperture synthesis is achieved by recording not only the strength of a returned signal from an object on the ground, but also the signal's frequency. With this extra information the beam width can be effectively narrowed when the Doppler shift is used (Lillesand and Kiefer). It is therefore possible for a single moving antenna to successively occupy the element positions X1 to Xn of an array of length L.
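
The standard textbook result for a focused SAR, not stated explicitly here, is that the achievable azimuth resolution is roughly half the physical antenna length, independent of range; contrasting it with the real-aperture case makes the gain clear. The wavelength, antenna length and slant range below are assumed values.

    # Azimuth resolution: real aperture vs synthetic aperture (standard results).
    wavelength_m = 0.03     # assumed radar wavelength
    antenna_m = 10.0        # assumed physical antenna length
    slant_range_m = 800e3   # satellite-like slant range

    real_aperture_res = (wavelength_m / antenna_m) * slant_range_m   # beam width x range
    synthetic_aperture_res = antenna_m / 2.0                         # independent of range

    print(f"real aperture     : {real_aperture_res:.0f} m")   # ~2400 m at 800 km
    print(f"synthetic aperture: {synthetic_aperture_res:.1f} m")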

