

SATELLITE IMAGE CLASSIFICATION

USING FUZZY-LOGIC

Project work carried out by Jagriti Pande, 4th-year student of GCEW, at DTRL, DRDO, under the guidance of Mr Pratik Chaturvedi, Scientist 'C'.


CONTENTS

Acknowledgement

Declaration

Certificate

COMPANY INTRODUCTION

Genesis and growth

Vision

Mission

Establishments and labs

Technology cluster

Academics

Major products and technologies

Popular science and technology series

REMOTE SENSING

Technical definition

Basic principle

Stages of remote sensing

Emission of electromagnetic radiation

Interaction with atmosphere

Atmospheric Scattering

Atmospheric Absorption

Radiation


Target Interaction

Transmission of energy from surface to remote sensor

Types of Remote sensing

Active vs passive

Types of satellite and their orbits

Geostationary

Polar

Land observation satellite

LANDSAT

SPOT

IRS

MOS

Digital Image Processing

Introduction

Digital image

Classes of image

Gray

Binary

Indexed

RGB

Steps in digital image processing

Image Classification

Pre-processing

Training

Decision


Assessment

Techniques used- MATLAB

Overview of MATLAB

Features

Language

Toolbox description

What is Fuzzy logic

Foundation of fuzzy logic

Fuzzy set

Membership function in fuzzy logic

Logical operation

If then Rules

Why use Fuzzy Logic

Project details

Input image description

Using fuzzy

Algorithm

Conclusion

ACKNOWLEDGEMENT


I express my deep gratitude to Mr Pratik Chaturvedi, Scientist 'C', my mentor at DTRL, DRDO, who extended all possible help and support to me. Without his guidance it would not have been possible for me to carry out this project.

It is my profound pleasure to express my sincere thanks to Director, DTRL for allowing me to do the project work in this esteemed organization.

I would also like to thank my professors at college for their valuable support. I also wish to express my profound appreciation to my fellow trainees for their valuable suggestions,advice, constructive and healthy criticism during work.

JAGRITI PANDE

4th year, ECE branch

G.C.E.W

Maharshi Dayanand University (MDU)


DECLARATION BY THE CANDIDATE

I, Jagriti Pande, hereby declare that the project work entitled "SATELLITE IMAGE CLASSIFICATION USING FUZZY LOGIC" is an authentic work carried out by me at the Defence Terrain Research Laboratory, Defence Research & Development Organisation, Metcalfe House, New Delhi, under the guidance of Mr Pratik Chaturvedi, Scientist 'C', DRDO.

CERTIFICATE


It is hereby certified that the project entitled "SATELLITE IMAGE CLASSIFICATION USING FUZZY LOGIC", submitted by Ms Jagriti Pande, 7th-semester student of Gurgaon College of Engineering, MDU, has been carried out under my guidance. It is an original work carried out by her. I grant her this certificate on her successful completion of the project during the period 13th June 2011 to 13th August 2011.

Pratik Chaturvedi

Scientist 'C'

DTRL, DRDO

COMPANY INTRODUCTION


The Defence Research & Development Organisation (DRDO) works under the Department of Defence Research and Development of the Ministry of Defence. DRDO is dedicated to enhancing self-reliance in defence systems, and undertakes design and development leading to the production of world-class weapon systems and equipment in accordance with the expressed needs and qualitative requirements laid down by the three services. DRDO works in various areas of military technology, including aeronautics, armaments, combat vehicles, electronics, instrumentation, engineering systems, missiles, materials, naval systems, advanced computing, simulation and life sciences. While striving to deliver cutting-edge weapons technology, DRDO provides ample spin-off benefits to society at large, thereby contributing to nation building.

GENESIS AND GROWTH

DRDO was formed in 1958 from the amalgamation of the then already functioning Technical Development Establishment (TDEs) of the Indian Army and the Directorate of Technical Development & Production (DTDP) with the Defence Science Organisation (DSO). DRDO was then a small organisation with 10 establishments or laboratories. Over the years, it has grown multi-directionally in terms of the variety of subject disciplines, number of laboratories, achievements and stature.

Today, DRDO is a network of more than 50 laboratories which are deeply engaged in developing defence technologies covering various disciplines, like aeronautics, armaments, electronics, combat vehicles, engineering systems, instrumentation, missiles, advanced computing and simulation, special materials, naval systems, life sciences, training, information systems and agriculture.

Presently, the Organisation is backed by over 5,000 scientists and about 25,000 other scientific, technical and supporting personnel. Several major projects for the development of missiles, armaments, light combat aircraft, radars, electronic warfare systems, etc. are in hand, and significant achievements have already been made in several such technologies.

VISION

Make India prosperous by establishing a world-class science and technology base, and provide our Defence Services a decisive edge by equipping them with internationally competitive systems and solutions.

MISSION


Design, develop and lead to production state-of-the-art sensors, weapon systems, platforms and allied equipment for our Defence Services.

Provide technological solutions to the Defence Services to optimise combat effectiveness and to promote well-being of the troops.

Develop infrastructure and committed quality manpower and build strong technology base.

ESTABLISHMENTS/LABS OF DRDO:

ADE- Aeronautical Development Establishment, Bangalore

ARDE- Armament R&D Establishment, Pune

ASL- Advanced Systems Laboratory, Hyderabad

CAIR- Centre for Artificial Intelligence and Robotics, Bangalore

CFEES- Centre for Fire, Explosives & Environmental Safety, Delhi

CVRDE- Combat Vehicles R&D Establishment, Chennai/Avadi

DARE- Defence Avionics Research Establishment, Bangalore

DARL- Defence Agricultural Research Laboratory, Pithoragarh

DEBEL- Defence Bioengineering and Electrochemical Laboratory, Bangalore

DEAL- Defence Electronics Applications Laboratory, Dehradun

DESIDOC- Defence Scientific Information & Documentation Centre, Delhi

DFRL- Defence Food Research Laboratory, Mysore

DIPR- Defence Institute of Psychological Research, Delhi

DIPAS - Defence Institute of Physiology & Allied Sciences, Delhi

DLJ - Defence Laboratory, Jodhpur

DLRL - Defence Electronics Research Laboratory, Hyderabad

DMRL- Defence Metallurgical Research Laboratory, Hyderabad

DMSRDE- Defence Materials & Stores R & D Establishment, Kanpur

DTRL - Defence Terrain Research Laboratory, Delhi

DRDL - Defence Research & Development Laboratory, Hyderabad

DRDE - Defence Research & Development Establishment, Gwalior

DRL- Defence Research Laboratory, Tezpur

FRL - Field Research Laboratory, C/o 56 A.P.O.

GTRE - Gas Turbine Research Establishment, Bangalore

HEMRL - High Energy Materials Research Laboratory, Pune

IAT - Institute of Armament Technology, Pune

INMAS - Institute of Nuclear Medicine & Allied Research, Delhi


IRDE - Instruments R&D Establishment, Dehradun

ISSA - Institute of Systems Studies and Analyses, Delhi

ITM-  Institute of Technology Management, Mussoorie

LASTEC- Laser Science & Technology Centre, Delhi

LRDE - Electronics and Radar Development Establishment, Bangalore

MTRDC - Microwave Tube R&D Centre, Bangalore

NMRL - Naval Materials Research Laboratory, Ambernath

NPOL - Naval Physical & Oceanographic Laboratory, Kochi

NSTL - Naval Science & Technological Laboratory, Visakhapatnam

PXE - Proof & Experimental Establishment, Chandipur

RCI - Research Centre Imarat, Hyderabad

R&DE - Research & Development Engineers, Pune

SASE - Snow & Avalanche Study Establishment, Chandigarh

SAG - Scientific Analysis Group, Delhi

SSPL - Solid State Physics Laboratory, Delhi

TBRL - Terminal Ballistics Research Laboratory, Chandigarh

VRDE - Vehicles Research & Development Establishment, Ahmednagar

LABS AND ESTABLISHMENTS IN DELHI:

Centre for Fire, Explosive and Environment Safety (CFEES), Delhi
Defence Institute of Physiology & Allied Sciences (DIPAS), Delhi
Defence Institute of Psychological Research (DIPR), Delhi
Defence Scientific Information & Documentation Centre (DESIDOC), Delhi
Defence Terrain Research Laboratory (DTRL), Delhi
Institute of Nuclear Medicine & Allied Sciences (INMAS), Delhi
Institute of Systems Studies & Analyses (ISSA), Delhi
Laser Science & Technology Centre (LASTEC), Delhi
Scientific Analysis Group (SAG), Delhi
Solid State Physics Laboratory (SSPL), Delhi

TECHNOLOGY CLUSTER :

Aeronautics

Aeronautical Development Establishment (ADE), Bangalore


Aerial Delivery Research & Development Establishment (ADRDE), Agra
Centre for Air Borne Systems (CABS), Bangalore
Defence Avionics Research Establishment (DARE), Bangalore
Gas Turbine Research Establishment (GTRE), Bangalore
Centre for Military Airworthiness & Certification (CEMILAC), Bangalore

Armaments

Armament Research & Development Establishment (ARDE), Pune
Centre for Fire, Explosive and Environment Safety (CFEES), Delhi
High Energy Materials Research Laboratory (HEMRL), Pune
Proof & Experimental Establishment (PXE), Balasore
Terminal Ballistics Research Laboratory (TBRL), Chandigarh

Combat Vehicles & Engineering

Combat Vehicles Research & Development Establishment (CVRDE), Chennai
Vehicle Research & Development Establishment (VRDE), Ahmednagar
Research & Development Establishment (R&DE), Pune
Snow & Avalanche Study Establishment (SASE), Chandigarh

Electronics & Computer Sciences

Advanced Numerical Research & Analysis Group (ANURAG), Hyderabad
Centre for Artificial Intelligence & Robotics (CAIR), Bangalore
Defence Electronics Application Laboratory (DEAL), Dehradun
Defence Electronics Research Laboratory (DLRL), Hyderabad
Defence Terrain Research Laboratory (DTRL), Delhi
Defence Scientific Information & Documentation Centre (DESIDOC), Delhi
Instruments Research & Development Establishment (IRDE), Dehradun
Laser Science & Technology Centre (LASTEC), Delhi
Electronics & Radar Development Establishment (LRDE), Bangalore
Microwave Tube Research & Development Centre (MTRDC), Bangalore
Solid State Physics Laboratory (SSPL), Delhi
Scientific Analysis Group (SAG), Delhi

Human Resource Development

Defence Institute of Advanced Technology (Deemed University), Pune
Institute of Technology Management (ITM), Mussoorie

Life Sciences

Defence Agricultural Research Laboratory (DARL), Pithoragarh
Defence Bio-Engineering & Electro Medical Laboratory (DEBEL), Bangalore


Defence Food Research Laboratory (DFRL), Mysore
Defence Institute of High Altitude Research (DIHAR), Leh
Defence Institute of Physiology & Allied Sciences (DIPAS), Delhi
Defence Institute of Psychological Research (DIPR), Delhi
Defence Research Laboratory (DRL), Tezpur
Institute of Nuclear Medicine & Allied Sciences (INMAS), Delhi
Defence Research & Development Establishment (DRDE), Gwalior

Materials

Defence Laboratory (DLJ), Jodhpur
Defence Metallurgical Research Laboratory (DMRL), Hyderabad
Defence Materials & Stores Research & Development Establishment (DMSRDE), Kanpur

Missiles

Defence Research & Development Laboratory (DRDL), Hyderabad
Institute of Systems Studies & Analyses (ISSA), Delhi
Integrated Test Range (ITR), Balasore
Research Centre Imarat (RCI), Hyderabad

Naval

Naval Materials Research Laboratory (NMRL), Ambernath
Naval Physical & Oceanographic Laboratory (NPOL), Kochi
Naval Science & Technological Laboratory (NSTL), Visakhapatnam

ACADEMICS

DRDO has constituted four research boards to nurture and harness talent in academic institutions, universities, R&D centres and industry. The organisation provides the necessary facilities for promoting basic research and for catalysing cross-fertilisation of ideas with R&D agencies in other sectors, thereby expanding and enriching the knowledge base in the respective areas. The boards provide grants-in-aid for collaborative, defence-related, futuristic frontline research with application in the new world-class systems to be developed by DRDO.

EXAMPLES:


A centre of excellence in Computational Fluid Dynamics (CFD) has been set up at IISc, Bangalore, which is anticipated to give a boost to the design of aeronautical systems within the country. Another centre of excellence, in aerospace system design and engineering, is being set up at IIT, Mumbai.

A Centre for Composite Structure Technology is proposed to be set up at the National Aerospace Laboratory,Bangalore. 

Grants have also supported a hypermedia digital library at IIT Kharagpur, the development of audio-visual training aids for aircrew, indoctrination in air sickness and positive-pressure breathing at the Institute of Aviation Medicine, Bangalore, and the development of a rarefied gas dynamics facility at IIT, Chennai.

The Aeronautics Research and Development Board (AR&DB) has approved projects in the field of aeronautics and related areas. The Armament Research Board (ARMREB) has approved projects in the fields of high-energy materials, sensors, ballistics and other armament-related fields. Under the Naval Research Board (NRB), projects are being pursued in five technology areas. Under the Life Sciences Research Board (LSRB), projects have been supported in the areas of biological and biomedical sciences, psychology, physiology, bioengineering, specialised high-altitude agriculture, and food science and technology.

MAJOR PRODUCTS AND TECHNOLOGIES :

Air-Borne Telemetry Receiving System
All Electric Type Weapon Control System for ICV
Antenna Systems
Bhima
Biomedical Devices for Internal Use (Implants)
Biomedical Devices for External Use
Briefcase SATCOM Terminal
Code Programme
Diagnostic Products for Infection Imaging
EOCM-Class Laser System
Explosive Reactive Armour (ERA)
FSAPDS Ammunition
Indigenous X-Ray Industrial Tomography System
Integrated Weapon System Simulation
Kaveri Engine
Lakshya
Laser Warning Sensors
Light Combat Aircraft
Manipulator Arm
MBT Arjun


Missiles (Agni, Prithvi, Nag, Trishul, Akash)
MMIC
Model-Based Data Fusion
Naval Weapon Systems
Nishant
Palmtop Green Microchip Laser Module
Pan/Tilt Platform for Vision Systems
Pinaka
Radiation Protection Products
Rajendra Radar
Rapid Quantification & Detection Techniques for Pesticides in Fruits & Vegetables
Recovery Parachute System
Sangraha
Sanyukta
Special Materials
Technology for Dengue Control
Technology for Titanium Sponge Production

Figure: DRDO products and platforms - BFSR-SR, HAL Dhruv, AAD missile, Arjun MBT, Nag missile, Akash missile, HAL Tejas, INS Shivalik, HAL HJT-36, Sukhoi Su-30MKI, T-72 Ajeya, Prithvi missile, Rustom (prototype and RC model), Prithvi Air Defence, Shaurya missile, a model of the BEL Weapon Locating Radar, and the thermonuclear device tested during Pokhran-II.

POPULAR SCIENCE AND TECHNOLOGY SERIES (PST)

Battle with Barnacles
Composite Materials
Computer & its Defence Application
Electronic Warfare
Guided Missiles
Kindling Creativity
LASER & its Application
Night Vision Devices
Radiation
Satellites
Super Vision
Toxicology & Human Life
The Living Desert
Sagar Mein Sangram
Computer Aur Raksha Anuprayog

REMOTE SENSING

Remote sensing is also called Earth observation. It refers to obtaining information about objects or areas at the Earth's surface without being in direct contact with the object or area.


EXAMPLE OF REMOTE SENSING IN DAY TO DAY LIFE :

Reading a newspaper, watching cars driving in front of us and watching a computer screen are all examples of remote sensing.

Applied remote sensing involves detecting and measuring electromagnetic energy (in the form of photons) emanating from distant objects made of various materials, by which means the user can identify and categorise these objects, usually presented or depicted as images.

Remote sensing can present data as maps and graphs, or even as digital numbers that can be input to computer-based analysis. Hence remote sensing is a tool for gathering information.

APPLICATIONS OF REMOTE SENSING

1. Mapping land use and cover.

2. Agriculture and soil mapping.

3. Forestry, city planning, surveying, etc.


For example, when the eye views a computer screen:

1. Light comes from the screen.

2. The screen acts as the source of information.

3. Radiated light from the source travels across a distance.

4. Sensors (the eyes) capture the radiated light.

5. The sensors send signals to the processor.

6. The processor stores and interprets this information.

TECHNICAL DEFINITION

Remote sensing is a technology for sampling electromagnetic radiation, comprising a signal emanating from a source target, that is used to acquire and interpret non-immediate geospatial data from which to extract information about features, objects and classes on the Earth's land surface, oceans and atmosphere.


BASIC PRINCIPLE OF REMOTE SENSING

Detection and discrimination of objects or surface features rests on detecting and recording the radiant energy reflected or emitted by objects or surface materials. Different objects return different amounts of energy in different bands of the electromagnetic spectrum incident upon them. This depends on the properties of the material (structural, chemical, physical), its surface roughness, the angle of incidence, and the intensity and wavelength of the radiant energy.

Stages in Remote Sensing

1. Emission of electromagnetic radiation, or EMR (sun/self-emission).
2. Transmission of energy from the source to the surface of the earth, together with absorption and scattering.
3. Interaction of EMR with the earth's surface: reflection and emission.
4. Transmission of energy from the surface to the remote sensor.
5. Sensor data output.
6. Data transmission, processing and analysis.

Figure: The remote sensing process

The basic strategy for sensing electromagnetic radiation is clear. Everything in nature has its own unique distribution of reflected, emitted and absorbed radiation. These spectral characteristics, if ingeniously exploited, can be used to distinguish one thing from another or to obtain information about shape, size and other physical and chemical properties.
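The idea of distinguishing materials by their spectral characteristics can be sketched as a minimal nearest-signature classifier. (Python is used here for illustration; the signature values are hypothetical, and the project itself uses MATLAB and fuzzy logic rather than this simple rule.)

```python
import math

# Hypothetical mean spectral signatures (digital numbers in the green, red
# and near-infrared bands) for three land-cover classes. The values are
# illustrative only -- in practice they come from training samples.
SIGNATURES = {
    "water":      (40.0, 30.0, 15.0),   # water absorbs strongly in the NIR
    "vegetation": (50.0, 35.0, 120.0),  # vegetation reflects strongly in the NIR
    "built-up":   (90.0, 95.0, 100.0),  # fairly flat response across bands
}

def classify_pixel(pixel):
    """Assign the pixel to the class with the nearest signature (Euclidean)."""
    return min(SIGNATURES, key=lambda c: math.dist(pixel, SIGNATURES[c]))

print(classify_pixel((48.0, 33.0, 115.0)))  # nearest to the vegetation signature
```

A full classifier would apply this decision (or a fuzzy membership rule) to every pixel of the image.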

1. EMISSION OF ELECTROMAGNETIC RADIATION, OR ILLUMINATION:


The first requirement for remote sensing is to have an energy source to illuminate the target (unless the sensed energy is being emitted by the target). This energy is in the form of electromagnetic radiation.

Principal Divisions of the Electromagnetic Spectrum

1. Gamma rays (wavelength < 0.03 nanometres)
Entirely absorbed by the earth's atmosphere. Not available for remote sensing.

2. X-rays (0.03-30 nm)
Entirely absorbed. Not available for remote sensing.

3. Ultraviolet (0.03-0.4 µm)
Absorbed by ozone.

4. Photographic UV (0.3-0.4 µm)
Available for remote sensing. Can be captured with photographic plates.

5. Visible spectrum (0.4-0.7 µm)
Violet 0.4-0.446 µm, Blue 0.446-0.5 µm, Green 0.5-0.578 µm, Yellow 0.578-0.592 µm, Orange 0.592-0.62 µm, Red 0.62-0.7 µm.
This is the light which our eyes can detect, and the only portion of the spectrum that can be associated with the concept of colour. Blue, green and red are the three primary colours of the visible spectrum. They are defined as such because no single primary colour can be created from the other two, but all other colours can be formed by combining the three in various proportions. The colour of an object is defined by the colour of the light it reflects.

6. Infrared (IR) spectrum (0.7-100 µm)
Wavelengths longer than the red portion of the visible spectrum are designated as the infrared spectrum. The infrared region can be divided into two categories based on their radiation properties: reflected IR (0.7-3.0 µm), which is used for remote sensing, and thermal IR (3-35 µm), which is the radiation emitted from the earth's surface in the form of heat.

7. Microwave region (1 mm - 1 m)
This is the longest wavelength range used in remote sensing. The shortest wavelengths in this range have properties similar to the thermal infrared region. The main advantage of this region is its ability to penetrate through clouds.

8. Radio waves (> 1 m)
This is the longest portion of the spectrum, mostly used for commercial broadcast and meteorology.

INTERACTION OF ELECTROMAGNETIC WAVES WITH ATMOSPHERE

The sun is the source of radiation, and electromagnetic radiation (EMR) from the sun that is reflected by the earth and detected by the satellite or aircraft-borne sensor must pass through the atmosphere twice: once on its journey from the sun to the earth, and again after being reflected by the surface of the earth back to the sensor. Interactions of the direct solar radiation and reflected radiation from the target with the atmospheric constituents interfere with the process of remote sensing and are called "atmospheric effects".

The solar energy is subjected to modification by several physical processes as it passes through the atmosphere, viz.

1) Scattering
2) Absorption

The interaction of EMR with the atmosphere is important to remote sensing for two main reasons :

1. Information carried by EMR reflected/emitted by the earth’s surface is modified while traversing through the atmosphere.

2. Second, the interaction of EMR with the atmosphere can be used to obtain useful information about the atmosphere itself.

CONSEQUENCES

The atmospheric constituents scatter and absorb the radiation, modulating the radiation reflected from the target by attenuating it, changing its spatial distribution, and introducing into the field of view radiation from sunlight scattered in the atmosphere and some of the energy reflected from nearby ground areas. Both scattering and absorption vary in their effect from one part of the spectrum to another.

ATMOSPHERIC SCATTERING

Scattering is the redirection of EMR by particles suspended in the atmosphere or by large molecules of atmospheric gases.

EFFECTS OF SCATTERING:

Scattering not only reduces the image contrast but also changes the spectral signature of ground objects as seen by the sensor.


FACTORS AFFECTING SCATTERING

The size of the particles, their abundance, the wavelength of radiation, the depth of the atmosphere through which the energy is travelling, and the concentration of the particles. The concentration of particulate matter varies both in time and over season; thus the effects of scattering are uneven spatially and vary from time to time and from season to season.

Theoretically, scattering can be divided into three categories, depending upon the wavelength of the radiation being scattered and the size of the particles causing the scattering. These are:

SELECTIVE SCATTERING :

Rayleigh Scattering

Rayleigh scattering predominates where electromagnetic radiation interacts with particles that are smaller than the wavelength of the incoming light.

The effect of Rayleigh scattering is inversely proportional to the fourth power of the wavelength: shorter wavelengths are scattered more than longer wavelengths. In the absence of these particles and scattering, the sky would appear black.

In the context of remote sensing, Rayleigh scattering is the most important type of scattering. It causes a distortion of the spectral characteristics of the reflected light when compared to measurements taken on the ground.

As sunlight passes through the atmosphere, the shorter (i.e. blue) wavelengths of the visible spectrum are scattered more than the others during the day. At sunrise or sunset, the light has to travel farther through the atmosphere than at midday, and scattering of the shorter wavelengths is more complete; this leaves a greater proportion of the longer wavelengths to penetrate the atmosphere.
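The inverse fourth-power law gives a quick way to compare scattering at two wavelengths; a short sketch (nominal blue and red band centres assumed, in micrometres):

```python
# Relative Rayleigh scattering strength is proportional to 1 / wavelength**4.
def rayleigh_relative(wavelength_um):
    return wavelength_um ** -4

blue, red = 0.45, 0.65  # assumed nominal wavelengths for blue and red light
ratio = rayleigh_relative(blue) / rayleigh_relative(red)
print(f"Blue light is scattered about {ratio:.1f}x more strongly than red")
```

The ratio works out to roughly (0.65/0.45)^4, which is why the clear sky looks blue and the low sun looks red.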

Mie Scattering


Scattering process    Wavelength dependence    Particle size    Kinds of particles
----------------------------------------------------------------------------------
1. Selective
   Rayleigh           λ^-4                     < 1 µm           Air molecules
   Mie                λ^0 to λ^-4              0.1 to 10 µm     Smoke, haze
2. Non-selective      λ^0                      > 10 µm          Dust, fog, clouds


Mie scattering occurs when the wavelength of the incoming radiation is similar in size to the atmospheric particles. It is caused by aerosols: a mixture of gases, water vapour and dust. It is generally restricted to the lower atmosphere, where the larger particles are abundant, and dominates under overcast cloud conditions. It influences the entire spectral region from the ultraviolet to the near-infrared.

NON SELECTIVE SCATTERING

This type of scattering occurs when the particle size is much larger than the wavelength of the incoming radiation.

The particles responsible for this effect are water droplets and larger dust particles. The scattering is independent of wavelength; all wavelengths are scattered equally.

The most common example of non-selective scattering is the appearance of clouds as white. As clouds consist of water droplets and all wavelengths are scattered in equal amounts, clouds appear white.

ATMOSPHERIC ABSORPTION

The gas molecules present in the atmosphere strongly absorb the EMR passing through the atmosphere in certain spectral bands. Three gases are responsible for most of the absorption of solar radiation, viz. ozone, carbon dioxide and water vapour.

Ozone absorbs the high-energy, short-wavelength portions of the ultraviolet spectrum (< 0.24 µm), thereby preventing the transmission of this radiation to the lower atmosphere.

Carbon dioxide is important in remote sensing as it effectively absorbs radiation in the mid- and far-infrared regions of the spectrum. It absorbs strongly in the region from about 13 to 17.5 µm, whereas the two most important regions of water vapour absorption are the bands 5.5-7.0 µm and above 27 µm. Absorption reduces the amount of light that reaches our eye, making the scene look relatively duller.

CONCLUSIONS

Because these gases absorb electromagnetic energy in very specific regions of the spectrum, they influence where (in the spectrum) we can "look" for remote sensing purposes. Those areas of the spectrum which are not severely influenced by atmospheric absorption, and thus are useful to remote sensors, are called atmospheric windows. By comparing the characteristics of the two most common energy/radiation sources (the sun and the earth) with the atmospheric windows available to us, we can define those wavelengths that we can use most effectively for remote sensing. The visible portion of the spectrum, to which our eyes are most sensitive, corresponds to both an atmospheric window and the peak energy level of the sun. Note also that heat energy emitted by the Earth corresponds to a window around 10 µm in the thermal IR portion of the spectrum, while the large window at wavelengths beyond 1 mm is associated with the microwave region.
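As a rough illustration, the usable spectral regions can be encoded as a simple lookup (band boundaries simplified from the divisions listed earlier; Python used for illustration only):

```python
# Approximate spectral regions used in remote sensing, in micrometres.
# Boundaries are simplified, assumed values; real atmospheric windows are
# narrower and broken up by the absorption bands described in the text.
REGIONS = [
    (0.3, 0.4, "photographic ultraviolet"),
    (0.4, 0.7, "visible"),
    (0.7, 100.0, "infrared"),
    (1000.0, 1_000_000.0, "microwave"),  # 1 mm to 1 m
]

def spectral_region(wavelength_um):
    """Return the name of the region containing the given wavelength."""
    for low, high, name in REGIONS:
        if low <= wavelength_um < high:
            return name
    return "outside usable windows"

print(spectral_region(0.55))  # green light falls in the visible region
print(spectral_region(10.0))  # the thermal IR window around 10 µm
```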

RADIATION: TARGET INTERACTIONS :

Radiation that is not absorbed or scattered in the atmosphere can reach and interact with the Earth's surface. There are three forms of interaction that can take place when energy strikes, or is incident (I) upon, the surface. These are:

Absorption (A)
Transmission (T)
Reflection (R)

ABSORPTION (A) :

Occurs when radiation (energy) is absorbed into the target.

TRANSMISSION(T) :

Occurs when radiation passes through a target.

REFLECTION(R) :

Reflection (R) occurs when radiation "bounces" off the target and is redirected. In remote sensing, we are most interested in measuring the radiation reflected from targets. There are two types of reflection, representing the two extreme ends of the way in which energy is reflected from a target:

Specular reflection: when a surface is smooth we get specular, or mirror-like, reflection, where all (or almost all) of the energy is directed away from the surface in a single direction.

Diffuse reflection: diffuse reflection occurs when the surface is rough and the energy is reflected almost uniformly in all directions.

Most earth surface features lie somewhere between perfectly specular or perfectly diffuse reflectors. Whether a particular target reflects specularly or diffusely, or somewhere in between, depends on the surface roughness of the feature in comparison to the wavelength of the incoming radiation.
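The three interactions partition the incident energy, so at any given wavelength I = A + T + R. A minimal sketch of this energy balance (the numbers are made-up, illustrative fractions, not measured values):

```python
# Incident energy (I) is partitioned into absorption (A), transmission (T)
# and reflection (R): I = A + T + R, so R = I - A - T.
def reflected(incident, absorbed, transmitted):
    r = incident - absorbed - transmitted
    if r < 0:
        raise ValueError("A + T cannot exceed the incident energy")
    return r

i, a, t = 100.0, 55.0, 20.0  # illustrative values in arbitrary energy units
r = reflected(i, a, t)
print(f"Reflected: {r} ({r / i:.0%} of the incident energy)")
```

It is this reflected fraction R, measured band by band, that forms the spectral signature a sensor records.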


If the wavelengths are much smaller than the surface variations or the particle sizes that make up the surface, diffuse reflection will dominate. For example, fine-grained sand would appear fairly smooth to long wavelength microwaves but would appear quite rough to the visible wavelengths.  
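The smooth/rough distinction is often quantified with the Rayleigh roughness criterion, a standard rule of thumb from remote sensing texts (it is not stated explicitly above): a surface acts as a specular reflector when its height variation h is less than wavelength / (8 cos θ), where θ is the incidence angle. A sketch with assumed numbers:

```python
import math

# Rayleigh roughness criterion (standard rule of thumb, assumed here):
# a surface is "smooth" (specular) when h < wavelength / (8 * cos(theta)).
def is_smooth(h, wavelength, incidence_deg=0.0):
    return h < wavelength / (8 * math.cos(math.radians(incidence_deg)))

sand_relief = 0.001  # ~1 mm grain-scale height variation (assumed), in metres
print(is_smooth(sand_relief, wavelength=0.03))       # 3 cm microwave: smooth
print(is_smooth(sand_relief, wavelength=0.0000005))  # visible light: rough
```

This reproduces the sand example: the same surface is specular to long microwaves but diffuse to visible light.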

TRANSMISSION OF ENERGY FROM THE SURFACE TO THE REMOTE SENSOR

TYPES OF REMOTE SENSING

1. Passive
2. Active

TYPES OF REMOTE SENSORS

Passive vs. Active Sensing

The sun provides a very convenient source of energy for remote sensing. The sun's energy is either reflected, as it is for visible wavelengths, or absorbed and then re-emitted, as it is for thermal infrared wavelengths. Remote sensing systems which measure energy that is naturally available are called passive sensors. Passive sensors can only be used to detect energy when the naturally occurring energy is available. For all reflected energy, this can only take place during the time when the sun is illuminating the Earth. There is no reflected energy available from the sun at night. Energy that is naturally emitted (such as thermal infrared) can be detected day or night, as long as the amount of energy is large enough to be recorded.

Active sensors, on the other hand, provide their own energy source for illumination. The sensor emits radiation which is directed toward the target to be investigated. The radiation reflected from that target is detected and measured by the sensor. Advantages for active sensors include the ability to obtain measurements anytime, regardless of the time of day or season. Active sensors can be used for examining wavelengths that are not sufficiently provided by the sun, such as microwaves, or to better control the way a target is illuminated. However, active systems require the generation of a fairly large amount of energy to adequately illuminate targets. Some examples of active sensors are a laser fluorosensor and a synthetic aperture radar (SAR).


Figure: Passive remote sensing


In order for a sensor to collect and record energy reflected or emitted from a target or surface, it must reside on a stable platform removed from the target or surface being observed. Platforms for remote sensors may be situated on the ground, on an aircraft or balloon (or some other platform within the Earth's atmosphere), or on a spacecraft or satellite outside of the Earth's atmosphere.

Ground-based sensors are often used to record detailed information about the surface, which is compared with information collected from aircraft or satellite sensors. Sensors may be placed on a ladder, scaffolding, a tall building, a cherry-picker, a crane, etc.

Aerial platforms are primarily stable-wing aircraft, although helicopters are occasionally used. Aircraft are often used to collect very detailed images and facilitate the collection of data over virtually any portion of the Earth's surface at any time.

In space, remote sensing is sometimes conducted from the space shuttle or, more commonly, from satellites.

TYPES OF SATELLITES AND THEIR ORBITS

The path followed by a satellite is referred to as its orbit. Satellite orbits are matched to the capability and objective of the sensor(s) they carry.

1. GEOSTATIONARY SATELLITES

2. POLAR SATELLITES

1. GEOSTATIONARY:

ACTIVE REMOTE SENSING

Satellites at very high altitudes, which view the same portion of the Earth's surface at all times have geostationary orbits. These geostationary satellites, at altitudes of approximately 36,000 kilometres, revolve at speeds which match the rotation of the Earth so they seem stationary, relative to the Earth's surface. The satellites observe and collect information continuously over specific areas. Weather and communications satellites commonly have these types of orbits. Due to their high altitude, some geostationary weather satellites can monitor weather and cloud patterns covering an entire hemisphere of the Earth.

2. POLAR:

Many remote sensing platforms are designed to follow an orbit (basically north-south) which, in conjunction with the Earth's rotation (west-east), allows them to cover most of the Earth's surface over a certain period of time. These are near-polar orbits, so named for the inclination of the orbit relative to a line running between the North and South poles. Many of these satellite orbits are also sun-synchronous, such that they cover each area of the world at a constant local time of day, called local sun time. At any given latitude, the position of the sun in the sky as the satellite passes overhead will be the same within the same season. This ensures consistent illumination conditions when acquiring images in a specific season over successive years, or over a particular area over a series of days.

This is an important factor for monitoring changes between images or for mosaicking adjacent images together, as they do not have to be corrected for different illumination conditions.

Most of the remote sensing satellite platforms today are in near-polar orbits, which means that the satellite travels northwards on one side of the Earth and then toward the southern pole on the second half of its orbit. These are called ascending and descending passes, respectively. If the orbit is also sun-synchronous, the ascending pass is most likely on the shadowed side of the Earth while the descending pass is on the sunlit side. Sensors recording reflected solar energy only image the surface on a descending pass, when solar illumination is available. Active sensors which provide their own illumination, or passive sensors that record emitted (e.g. thermal) radiation, can also image the surface on ascending passes.

Land Observation Satellites

1.LANDSAT:

A number of sensors have been on board the Landsat series of satellites, including the Return Beam Vidicon (RBV) camera systems, the MultiSpectral Scanner (MSS) systems, and the Thematic Mapper (TM). The most popular instrument in the early days of Landsat was the MultiSpectral Scanner (MSS) and later the Thematic Mapper (TM). Each of these sensors collected data over a swath width of 185 km, with a full scene being defined as 185 km x 185 km.

MSS Bands

Channel (Landsat 1,2,3)   Channel (Landsat 4,5)   Wavelength Range (μm)
MSS 4                     MSS 1                   0.5 - 0.6 (green)
MSS 5                     MSS 2                   0.6 - 0.7 (red)
MSS 6                     MSS 3                   0.7 - 0.8 (near infrared)
MSS 7                     MSS 4                   0.8 - 1.1 (near infrared)

SPOT

SPOT (Système Pour l'Observation de la Terre) is a series of Earth observation imaging satellites designed and launched by CNES (Centre National d'Études Spatiales) of France, with support from Sweden and Belgium. The SPOT satellites each have twin high resolution visible (HRV) imaging systems, which can be operated independently and simultaneously.

IRS

The Indian Remote Sensing (IRS) satellite series combines features from both the Landsat MSS/TM sensors and the SPOT HRV sensor. The third satellite in the series, IRS-1C, launched in December 1995, has three sensors: a single-channel panchromatic (PAN) high resolution camera, a medium resolution four-channel Linear Imaging Self-scanning Sensor (LISS-III), and a coarse resolution two-channel Wide Field Sensor (WiFS).

MOS

The first Marine Observation Satellite (MOS-1) was launched by Japan in February 1987 and was followed by its successor, MOS-1b, in February 1990. These satellites carry three different sensors: a four-channel Multispectral Electronic Self-Scanning Radiometer (MESSR), a four-channel Visible and Thermal Infrared Radiometer (VTIR), and a two-channel Microwave Scanning Radiometer (MSR) operating in the microwave portion of the spectrum.

DIGITAL IMAGE PROCESSING

INTRODUCTION

Vision is the most advanced of our senses, so images play an important role in human perception. Pictures or images are the most common and convenient means of conveying information. About 75% of the information received by humans is in pictorial form. However, unlike humans, who are limited to the visual band of the electromagnetic (EM) spectrum, imaging machines cover almost the entire EM spectrum, ranging from gamma to radio waves. They can also operate on images generated by sources such as ultrasound, electron microscopy, and computer-generated images.

In the present context, we consider the analysis of pictures that employ an overhead perspective, including radiation not visible to the human eye. Thus our discussion will focus on the analysis of remotely sensed images. These images are represented in digital form. When represented as numbers, brightness can be added, subtracted, multiplied, divided and subjected to statistical manipulations that are not possible if an image is presented only as a photograph.

DIGITAL IMAGE

An image may be defined as a two-dimensional function, f(x, y), where x and y are spatial coordinates, and the amplitude of f at any pair of coordinates (x, y) is called the intensity or gray level of the monochrome image at that point. Such an image is a continuous image. Converting such an image to digital form requires that the coordinates, as well as the amplitude, be digitized. Digitizing the coordinate values is called sampling; digitizing the amplitude values is called quantization. Thus, when x, y and the amplitude values of f are all finite, discrete quantities, we call the image a digital image. The result of sampling and quantization is a matrix of real numbers.

A digital image is composed of a finite number of elements, each of which has a particular location and value. These elements are referred to as picture elements, image elements, pels and pixels. Pixel is the term used most widely to denote the elements of a digital image. Pixels are located at the intersection of each row i and column j in each of the K bands of imagery. Associated with each pixel is a number known as the Digital Number (DN) or Brightness Value (BV) that depicts the average radiance of a relatively small area within a scene. A small number indicates low average radiance from the area and a high number is an indicator of high radiant properties of the area. The size of this area affects the reproduction of detail within the scene. As pixel size is reduced, more scene detail is preserved in the digital representation.
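The sampling-and-quantization step described above can be sketched in plain Python (used here only to illustrate the idea; the continuous scene function and grid sizes below are made up for the example, not taken from the report):

```python
def sample_and_quantize(f, width, height, levels):
    """Sample a continuous scene f(x, y) on a width x height grid and
    quantize each amplitude into one of `levels` integer gray levels (DNs).
    Assumes f returns values in [0, 1]."""
    image = []
    for i in range(height):              # row index -> spatial y coordinate
        row = []
        for j in range(width):           # column index -> spatial x coordinate
            value = f(j / width, i / height)           # sampling
            dn = min(int(value * levels), levels - 1)  # quantization
            row.append(dn)
        image.append(row)
    return image

# A made-up "scene": brightness rises smoothly from left to right.
scene = lambda x, y: x
img = sample_and_quantize(scene, 4, 2, 256)   # a 2 x 4 matrix of DNs
```

Shrinking the grid spacing (larger width and height) is exactly the "smaller pixel size, more scene detail" trade-off described above.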

Figure 1 : Structure of a Digital Image and Multispectral Image

A color image is formed by a combination of individual images. For example, in the RGB color system a color image consists of three individual monochrome images, referred to as the red (R), green (G) and blue (B) primary images.

CLASSES OF IMAGES

Although we work with integer coordinates, the values (intensities) of pixels are not restricted to be integers in MATLAB. The table below lists the various classes supported by MATLAB and the Image Processing Toolbox for representing pixel values.

NAME     DESCRIPTION
double   Double-precision floating-point numbers; approximate range ±10^308 (8 bytes per element).
single   Single-precision floating-point numbers; approximate range ±10^38 (4 bytes per element).
uint8    Unsigned 8-bit integers in the range [0, 255] (1 byte per element).
uint16   Unsigned 16-bit integers in the range [0, 65535] (2 bytes per element).
uint32   Unsigned 32-bit integers in the range [0, 4294967295] (4 bytes per element).
int8     Signed 8-bit integers in the range [-128, 127] (1 byte per element).
int16    Signed 16-bit integers in the range [-32768, 32767] (2 bytes per element).
char     Characters (2 bytes per element).
logical  Values are 0 or 1 (1 byte per element).

TYPES OF IMAGES

Gray-scale images
Binary images
Indexed images
RGB images

Most monochrome image processing operations are carried out using binary or gray scale images.

GRAY SCALE IMAGES

A gray-scale image is a data matrix whose values represent shades of gray. When the elements of a gray-scale image are of class uint8 or uint16, they have integer values in the range [0, 255] or [0, 65535] respectively. Values of double and single gray-scale images are normally scaled to the range [0, 1], although other ranges can also be used.

BINARY IMAGES

This image has a very specific meaning in MATLAB. A binary image is a logical array of 0s and 1s. Thus an array of 0s and 1s whose values are of a numeric data class, say uint8, is not considered a binary image in MATLAB.

A numeric array is converted to binary using the function logical. Thus, if A is a numeric array, we can create a logical array B using the statement

B = logical(A)

If A contains elements other than 0s and 1s, the logical function converts all non-zero quantities to logical 1s and all entries with value 0 to logical 0s. The use of relational and logical operators can also result in logical arrays. To test whether an array is of class logical we use the islogical function:

islogical(C)

If C is a logical array, the function returns a 1; otherwise it returns a 0. Logical arrays can be converted to numeric arrays using the general class-conversion syntax B = class_name(A), where class_name is im2uint8, im2uint16, im2double, im2single or mat2gray. The toolbox function mat2gray converts an image to an array of class double scaled to the range [0, 1]. The calling syntax is

g = mat2gray(A, [Amin, Amax])

where image g has values in the range 0 (black) to 1 (white). The parameters Amin and Amax are such that values less than Amin in A become 0 in g, and values greater than Amax in A become 1 in g. The syntax

g = mat2gray(A)

sets the values of Amin and Amax to the actual minimum and maximum values in A. This syntax is a very useful tool because it scales the entire range of values in the input to the range [0, 1], independently of the class of the input.
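The scaling-and-clipping behaviour just described can be sketched in plain Python (an illustrative re-implementation of the idea, not the toolbox code; the sketch assumes Amax > Amin):

```python
def mat2gray(a, limits=None):
    """Scale a 2-D list of numbers to the range [0, 1], clipping values
    below Amin to 0 and above Amax to 1, in the spirit of MATLAB's mat2gray.
    If no limits are given, use the actual min and max of the data."""
    flat = [v for row in a for v in row]
    amin, amax = limits if limits else (min(flat), max(flat))
    span = amax - amin                   # assumed non-zero for this sketch
    return [[min(max((v - amin) / span, 0.0), 1.0) for v in row] for row in a]

A = [[10, 20],
     [30, 40]]
g = mat2gray(A)              # scales 10..40 onto 0..1
h = mat2gray(A, (20, 30))    # values outside [20, 30] are clipped to 0 or 1
```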

INDEXED IMAGES

An indexed image has two components: a data matrix of integers, X, and a colormap matrix, map. The map is an m×3 array of class double containing floating-point values in the range [0, 1]. The length of the map is equal to the number of colors it defines. Each row of map specifies the red, green and blue components of a single color (if the three columns of the map are equal, the map becomes a gray-scale map). An indexed image uses 'direct mapping' of pixel intensity values to colormap values. The color of each pixel is determined by using the corresponding value of the integer matrix X as an index (hence the name indexed image) into map. To display an indexed image we write

>> imshow(X, map)

Or, alternatively,

>> image(X)
>> colormap(map)

A colormap is stored with an indexed image and is automatically loaded with the image when the imread function is used to load the image.
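The 'direct mapping' from the index matrix into the colormap can be sketched in plain Python (illustrative only; the sketch mimics MATLAB's 1-based indexing, and the tiny map and image are made up for the example):

```python
def indexed_to_rgb(X, colormap):
    """Expand an indexed image X into an RGB image by looking up each
    integer pixel value in the m x 3 colormap (1-based, as in MATLAB)."""
    return [[colormap[idx - 1] for idx in row] for row in X]

# A 2-entry gray-scale map: index 1 -> black, index 2 -> white.
cmap = [(0.0, 0.0, 0.0),
        (1.0, 1.0, 1.0)]
X = [[1, 2],
     [2, 1]]
rgb = indexed_to_rgb(X, cmap)   # each pixel becomes an (R, G, B) triplet
```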

RGB IMAGES

RGB images are also called true-color images. An RGB image is an M x N x 3 array of color pixels, where each color pixel is a triplet corresponding to the red, green and blue components of the image at a specific spatial location. The three images forming an RGB color image are referred to as the red, green and blue component images. The data class of the component images determines their range of values. If an RGB image is of class double, the range of values is [0, 1]. Let fR, fG, fB represent three RGB component images. An RGB image is formed from these images by using the cat (concatenate) operator to stack the images: rgb_image = cat(3, fR, fG, fB)

If all the component images are identical, the result is a gray-scale image. Let rgb_image denote an RGB image. The following commands extract the three component images:

>> fR = rgb_image(:, :, 1);
>> fG = rgb_image(:, :, 2);
>> fB = rgb_image(:, :, 3);
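The stacking and extraction operations above can be sketched in plain Python (an illustration of the concept, not MATLAB code; the tiny 1x2 component images are made up for the example):

```python
def cat3(fR, fG, fB):
    """Stack three single-channel images into an M x N image of (R, G, B)
    triplets, in the spirit of MATLAB's cat(3, fR, fG, fB)."""
    return [[(r, g, b) for r, g, b in zip(rr, gr, br)]
            for rr, gr, br in zip(fR, fG, fB)]

def channel(rgb, k):
    """Extract component k (0=R, 1=G, 2=B), like rgb_image(:, :, k+1)."""
    return [[px[k] for px in row] for row in rgb]

fR = [[1, 2]]
fG = [[3, 4]]
fB = [[5, 6]]
rgb = cat3(fR, fG, fB)          # one row of two (R, G, B) pixels
red = channel(rgb, 0)           # recovers the original red component
```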

Thus an image is characterized by both a class and a type. For instance, 'uint8 gray-scale image' simply refers to a gray-scale image whose pixels are of class uint8.

The output of a remote sensing system is an image representing the scene being observed. Many further steps of digital image processing and modeling are required in order to extract useful information from the image.

STEPS IN PROCESSING DIGITAL IMAGE

Image correction/restoration: Image data recorded by sensors on a satellite or aircraft contain errors related to the geometry and brightness values of the pixels. These errors are corrected or rectified. Rectification is the process of geometrically correcting an image so that it can be represented on a planar surface. That is, it is the process by which the geometry of an image is made planimetric.

Image enhancement: Image enhancement is the modification of an image, by changing the pixel brightness values, to improve its visual impact. Image enhancement techniques derive the new brightness value for a pixel either from its existing value or from the brightness values of a set of surrounding pixels. Image enhancement methods are applied separately to each band of a multispectral image. Digital techniques have been found to be more satisfactory than photographic techniques for image enhancement, because of the precision and wide variety of digital processes.

Image transformation: The multi-spectral character of image data allows it to be spectrally transformed into a new set of image components or bands, with the purpose of making some information more evident, or of preserving the essential information content of the image (for a given application) with a reduced number of transformed dimensions. The pixel values of the new components are related to the original set of spectral bands via a linear operation.

Image classification: The overall objective of image classification procedures is to automatically categorize all pixels in an image into land cover classes or themes. A pixel is characterized by its spectral signature, which is determined by the relative reflectance in different wavelength bands. Multi-spectral classification is an information extraction process that analyses these spectral signatures and assigns the pixels to classes based on similar signatures.

IMAGE CLASSIFICATION

INTRODUCTION

Image classification is formally defined as the process whereby a received pattern/signal is assigned to one of a prescribed number of classes. The overall objective of image classification procedures is to automatically categorize all pixels in an image into land cover classes or themes. Normally multi-spectral data are used to perform the classification, and the spectral pattern present within the data for each pixel is used as the numerical basis for categorization. That is, different feature types manifest different combinations of DNs (Digital Numbers) based on their inherent spectral reflectance and emittance properties. Spectral pattern refers to the set of radiance measurements obtained in the various wavelength bands for each pixel. Spectral pattern recognition refers to the family of classification procedures that utilizes this pixel-by-pixel spectral information.

METHODS OF IMAGE CLASSIFICATION

A neural network should be trained before it can be put to use. Training involves feeding the neural network with training samples and allowing it to learn by adjusting the weights of synapses and various other parameters. Neural networks can be broadly classified into two categories based on the type of learning:

Supervised

Unsupervised

SUPERVISED

The supervised classification methods are based on user-defined classes and corresponding representative sample sets. The sample sets are specified by training raster data sets, which must be created prior to entering the Automatic Classification process.

In supervised classification, spectral signatures are developed from specified locations in the image. These specified locations are given the generic name 'training sites' and are defined by the user. The training data consists of pairs of input objects and desired output.

Supervised classification requires the analyst to select training areas where he/she knows what is on the ground and then digitize a polygon within that area. The computer then creates a mean spectral signature. In this way, the analyst is "supervising" the categorization of a set of specific classes. The numerical information in all spectral bands for the pixels comprising these areas is used to "train" the computer to recognize spectrally similar areas for each class. The computer uses a special program or algorithm (of which there are several variations) to determine the numerical "signatures" for each training class. Once the computer has determined the signatures for each class, each pixel in the image is compared to these signatures and labelled as the class it most closely "resembles" digitally. Thus, in a supervised classification we first identify the information classes, which are then used to determine the spectral classes that represent them.

Similarly, all the pixels are analyzed and corresponding spectral signatures are created. The result is information; in this case, a land cover map.

Common Supervised classifiers:

Parallelepiped

Minimum distance to mean

Maximum likelihood

Some advanced techniques:

Neural networks

Contextual classifiers

Linear regression
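The minimum-distance-to-mean classifier from the list above can be sketched in plain Python (an illustration of the idea; the band values and class mean signatures below are made up, not from the report):

```python
import math

def min_distance_classify(pixel, class_means):
    """Assign a pixel (its spectral signature, one value per band) to the
    class whose mean signature is closest in Euclidean distance."""
    def dist(mean):
        return math.sqrt(sum((p - m) ** 2 for p, m in zip(pixel, mean)))
    return min(class_means, key=lambda name: dist(class_means[name]))

# Hypothetical mean signatures (green, red, NIR bands) from training sites.
means = {"water":      (30, 20, 10),
         "vegetation": (40, 30, 120),
         "urban":      (90, 95, 100)}

label = min_distance_classify((35, 25, 110), means)
```

Running this over every pixel of a multispectral image is the pixel-by-pixel spectral pattern recognition described earlier.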

UNSUPERVISED

Rather than defining training sets and carving out pieces of n-dimensional space, we define no classes beforehand and instead use statistical approaches to divide the n-dimensional space into clusters with the best separation using clustering algorithms. After that, we assign class names to those clusters. It is distinguished from supervised learning by the fact that there is no a priori output.

The analyst requests the computer to examine the image and extract a number of spectrally distinct clusters.

The result of unsupervised classification is not yet informative until the analyst determines the ground cover for each of the clusters. This is as shown in the following figure.

Common Unsupervised classification Methods are

1. Simple One-Pass Clustering
2. K-Means
3. Fuzzy
4. Minimum Distribution Angle
5. Isodata Classification
6. Self-Organization
7. Adaptive Resonance
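The K-Means method from the list above can be sketched for a single spectral band in plain Python (a toy illustration with made-up pixel values and starting centers, not a production clustering routine):

```python
def kmeans_1d(values, centers, iterations=10):
    """Cluster scalar pixel values around k centers by alternating between
    assigning each value to its nearest center and re-computing each center
    as the mean of its assigned members."""
    for _ in range(iterations):
        clusters = [[] for _ in centers]
        for v in values:
            nearest = min(range(len(centers)), key=lambda i: abs(v - centers[i]))
            clusters[nearest].append(v)
        # Empty clusters keep their previous center.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

# Two obvious brightness clusters: dark pixels near 10, bright near 90.
pixels = [8, 10, 12, 88, 90, 92]
centers = kmeans_1d(pixels, [0.0, 100.0])
```

The analyst would then inspect the resulting clusters and assign a ground-cover name to each, as described above.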

STEPS IN IMAGE CLASSIFICATION

Preprocessing: e.g. atmospheric correction, noise suppression, band ratioing, principal component analysis, etc.

Training: selection of the particular features which best describe the pattern.

Decision: choice of a suitable method for comparing the image patterns with the target patterns.

Assessing the accuracy of the classification.

TECHNIQUES USED: In our project we are using the MATLAB Fuzzy Logic Toolbox.

Overview of the MATLAB Environment

MATLAB is a high-level technical computing language and interactive environment for algorithm development, data visualization, data analysis, and numeric computation. MATLAB can solve technical computing problems faster than traditional programming languages such as C, C++, and Fortran.

RANGE OF APPLICATIONS

=> Signal and image processing

=> Communications

=> Control design

=> Test and measurement

=> Financial modeling and analysis

=> Computational biology

FEATURES OF MATLAB :

High-level language for technical computing

Development environment for managing code, files, and data

Interactive tools for iterative exploration, design, and problem solving

Mathematical functions for linear algebra, statistics, Fourier analysis, filtering, optimization, and numerical integration

2-D and 3-D graphics functions for visualizing data

Tools for building custom graphical user interfaces

Functions for integrating MATLAB based algorithms with external applications and languages, such as C, C++, Fortran, Java™, COM, and Microsoft® Excel®

The Language: The MATLAB language is a high-level matrix/array language with control flow statements, functions, data structures, input/output, and object-oriented programming features. It allows both "programming in the small", to rapidly create quick programs you do not intend to reuse, and "programming in the large", to create complex application programs intended for reuse.

Graphics: MATLAB has extensive facilities for displaying vectors and matrices as graphs, as well as annotating and printing these graphs. It includes high-level functions for two-dimensional and three-dimensional data visualization, image processing, animation, and presentation graphics. It also includes low-level functions that allow you to fully customize the appearance of graphics as well as to build complete graphical user interfaces for your MATLAB applications.

Fuzzy Logic Toolbox Description

=>Fuzzy Logic Toolbox software is a collection of functions built on the MATLAB

technical computing environment.

=>Provides tools to create and edit fuzzy inference systems within the framework of MATLAB.

=>Fuzzy systems may be integrated into simulations with Simulink software.

=>Stand-alone C programs can be built that call on fuzzy systems built with MATLAB. The toolbox relies heavily on graphical user interface (GUI) tools, although the work may be accomplished entirely from the command line if you prefer.

The toolbox provides three categories of tools:

Command line functions

Graphical interactive tools

Simulink blocks and examples

Command line functions: functions that can be called from the command line or from your own applications. Many of these functions are files containing a series of MATLAB statements that implement specialized fuzzy logic algorithms.

You can view the MATLAB code for any of these functions using

type function_name

You can change the way any toolbox function works by copying and renaming the file and then modifying your copy. You can also extend the toolbox by adding your own files.

Graphical interactive tools: the toolbox provides a number of interactive tools that give access to many of the functions through a GUI. Together, the GUI-based tools provide an environment for fuzzy inference system design, analysis, and implementation.

Simulink blocks and examples: a set of blocks for use with Simulink, specifically designed for high-speed fuzzy logic inference in the Simulink environment.

HOW IS THE TOOLBOX HELPFUL:

Most of human reasoning and concept formation is linked to the use of fuzzy rules. By providing a systematic framework for computing with fuzzy rules, the toolbox greatly amplifies the power of human reasoning. Further amplification results from the use of MATLAB and graphical user interfaces, areas in which The MathWorks™ has unparalleled expertise.

What Is Fuzzy Logic?

Fuzzy logic has two different meanings :

In a narrow sense, fuzzy logic is a logical system, which is an extension of multivalued logic.

In a wider sense, fuzzy logic (FL) is almost synonymous with the theory of fuzzy sets, a theory which relates to classes of objects with unsharp boundaries in which membership is a matter of degree. Even in its narrower definition, fuzzy logic differs both in concept and substance from traditional multivalued logical systems.

The basic concept underlying FL is that of a linguistic variable, that is, a variable whose values are words rather than numbers. In effect, much of FL may be viewed as a methodology for computing with words rather than numbers. Although words are inherently less precise than numbers, their use is closer to human intuition. Furthermore, computing with words exploits the tolerance for imprecision and thereby lowers the cost of solution.

What Can Fuzzy Logic Toolbox Software Do?

=> Create and edit fuzzy inference systems with Fuzzy Logic Toolbox software. They can be created using graphical tools or command-line functions, or can be generated automatically using either clustering or adaptive neuro-fuzzy techniques.

If you have access to Simulink software, you can easily test your fuzzy system in a block diagram simulation environment.

The toolbox also allows you to run stand-alone C programs directly. This is made possible by a stand-alone Fuzzy Inference Engine that reads the fuzzy systems saved from a MATLAB session. The stand-alone engine can be customized to build fuzzy inference into your own code. All provided code is ANSI compliant.

Because of the integrated nature of the MATLAB environment, we can create our own tools to customize the toolbox or harness it with another toolbox, such as the Control System Toolbox, Neural Network Toolbox, or Optimization Toolbox software.

Foundations of Fuzzy Logic

1. Fuzzy Sets:

Fuzzy logic starts with the concept of a fuzzy set. A fuzzy set is a set without a crisp, clearly defined boundary. It can contain elements with only a partial degree of membership.

Reasoning in fuzzy logic is just a matter of generalizing the familiar yes-no (Boolean) logic. If you give true the numerical value of 1 and false the numerical value of 0, fuzzy logic also permits in-between values like 0.2 and 0.7453.

For instance:

Q: Is Saturday a weekend day? A: 1 (yes, or true) Q: Is Tuesday a weekend day? A: 0 (no, or false) Q: Is Friday a weekend day? A: 0.8 (for the most part yes, but not completely) Q: Is Sunday a weekend day? A: 0.95 (yes, but not quite as much as Saturday).

The following plot on the left shows the truth values for weekend-ness if you are forced to respond with an absolute yes or no response. On the right, is a plot that shows the truth value for weekend-ness if you are allowed to respond with fuzzy in-between values.

Now consider a continuous-scale time plot of weekend-ness, shown in the following plots.

By making the plot continuous, you are defining the degree to which any given instant belongs in the weekend rather than an entire day. In the plot on the left, notice that at midnight on Friday, just as the second hand sweeps past 12, the weekend-ness truth value jumps discontinuously from 0 to 1. This is one way to define the weekend, and while it may be useful to an accountant, it may not really connect with your own real-world experience of weekend-ness.

2.Membership Functions in Fuzzy Logic Toolbox Software

A membership function (MF) is a curve that defines how each point in the input space is mapped to a membership value (or degree of membership) between 0 and 1. The input space is sometimes referred to as the universe of discourse.

The only condition a membership function must really satisfy is that it must vary between 0 and 1. The function itself can be an arbitrary curve whose shape we can define so that it suits us from the point of view of simplicity, convenience, speed, and efficiency. It is defined for a fuzzy set, not a classical set.

A classical set might be expressed as

A = {x | x > 6}

A fuzzy set is an extension of a classical set. If X is the universe of discourse and its elements are denoted by x, then a fuzzy set A in X is defined as a set of ordered pairs.

A = {(x, µA(x)) | x ∈ X}

µA(x) is called the membership function (or MF) of x in A. The membership function maps each element of X to a membership value between 0 and 1.

The toolbox includes 11 built-in membership function types. These 11 functions are, in turn, built from several basic functions:

piece-wise linear functions

the Gaussian distribution function

the sigmoid curve

quadratic and cubic polynomial curves

By convention, all membership functions have the letters mf at the end of their names.

The simplest membership functions are formed using straight lines. Of these, the simplest is the triangular membership function, which has the function name trimf. It is a collection of three points forming a triangle. The trapezoidal membership function, trapmf, has a flat top and really is just a truncated triangle curve. These straight-line membership functions have the advantage of simplicity.

Two membership functions are built on the Gaussian distribution curve: a simple Gaussian curve and a two-sided composite of two different Gaussian curves. The two functions are gaussmf and gauss2mf.

The generalized bell membership function is specified by three parameters and has the function name gbellmf. The bell membership function has one more parameter than the Gaussian membership function, so it can approach a non-fuzzy set if the free parameter is tuned. Because of their smoothness and concise notation, Gaussian and bell membership functions are popular methods for specifying fuzzy sets. Both of these curves have the advantage of being smooth and nonzero at all points.

Although the Gaussian membership functions and bell membership functions achieve smoothness, they are unable to specify asymmetric membership functions, which are important in certain applications. Next, you define the sigmoidal membership function, which is either open left or right. Asymmetric and closed (i.e. not open to the left or right) membership functions can be synthesized using two sigmoidal functions, so in addition to the basic sigmf, you also have the difference between two sigmoidal functions, dsigmf, and the product of two sigmoidal functions psigmf.

Polynomial based curves account for several of the membership functions in the toolbox. Three related membership functions are the Z, S, and Pi curves, all named because of their shape. The function zmf is the asymmetrical polynomial curve open to the left, smf is the mirror-image function that opens to the right, and pimf is zero on both extremes with a rise in the middle.

There is a very wide selection to choose from when you're selecting a membership function. You can also create your own membership functions with the toolbox. However, if a list based on expanded membership functions seems too complicated, just remember that you could probably get along very well with just one or two types of membership functions, for example the triangle and trapezoid functions. The selection is wide for those who want to explore the possibilities, but expansive membership functions are not necessary for good fuzzy inference systems. Finally, remember that more details are available on all these functions in the reference section.

3.Logical Operations

The most important thing to realize about fuzzy logical reasoning is the fact that it is a superset of standard Boolean logic. In other words, if you keep the fuzzy values at their extremes of 1 (completely true) and 0 (completely false), standard logical operations will hold. As an example, consider the following standard truth tables.

Now, because in fuzzy logic the truth of any statement is a matter of degree, how can these truth tables be altered? The input values can be real numbers between 0 and 1. What function preserves the results of the AND truth table (for example) and also extends to all real numbers between 0 and 1?

One answer is the min operation. That is, resolve the statement A AND B, where A and B are limited to the range (0,1), by using the function min(A,B). Using the same reasoning, you can replace the OR operation with the max function, so that A OR B becomes equivalent to max(A,B). Finally, the operation NOT A becomes equivalent to the operation 1 − A. Notice that the previous truth table is completely unchanged by this substitution.
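These substitutions can be sketched in a few lines of code. Python is used here as a neutral illustration (the project itself uses MATLAB), and the function names are chosen for the example:

```python
def fuzzy_and(a, b):
    # Fuzzy AND: min(A, B); reduces to Boolean AND at the extremes 0 and 1.
    return min(a, b)

def fuzzy_or(a, b):
    # Fuzzy OR: max(A, B); reduces to Boolean OR at the extremes 0 and 1.
    return max(a, b)

def fuzzy_not(a):
    # Fuzzy NOT: 1 - A; reduces to Boolean NOT at the extremes 0 and 1.
    return 1 - a

# At 0 and 1 the classical truth tables are recovered:
print(fuzzy_and(1, 0), fuzzy_or(1, 0), fuzzy_not(1))  # 0 1 0
# In between, truth is a matter of degree:
print(fuzzy_and(0.7, 0.4), fuzzy_or(0.7, 0.4))        # 0.4 0.7
```

At the extremes these functions reproduce the Boolean truth tables exactly, which is precisely what makes fuzzy logic a superset of Boolean logic.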

Moreover, because there is a function behind the truth table rather than just the truth table itself, you can now consider values other than 1 and 0.

The next figure uses a graph to show the same information. In this figure, the truth table is converted to a plot of two fuzzy sets applied together to create one fuzzy set. The upper part of the figure displays plots corresponding to the preceding two-valued truth tables, while the lower part of the figure displays how the operations work over a continuously varying range of truth values A and B according to the fuzzy operations you have defined.

Given these three functions, you can resolve any construction using fuzzy sets and the fuzzy logical operations AND, OR, and NOT.

If-Then Rules

Fuzzy sets and fuzzy operators are the subjects and verbs of fuzzy logic; if-then rule statements are used to formulate the conditional statements that comprise fuzzy logic.

A single fuzzy if-then rule assumes the form

if x is A then y is B

where A and B are linguistic values defined by fuzzy sets on the ranges (universes of discourse) X and Y, respectively. The if-part of the rule "x is A" is called the antecedent or premise, while the then-part of the rule "y is B" is called the consequent or conclusion. An example of such a rule might be

If service is good then tip is average

The concept good is represented as a number between 0 and 1, and so the antecedent is an interpretation that returns a single number between 0 and 1. Conversely, average is represented as a fuzzy set, and so the consequent is an assignment that assigns the entire fuzzy set B to the output variable y. In the if-then rule, the word "is" gets used in two entirely different ways depending on whether it appears in the antecedent or the consequent. In MATLAB terms, this usage is the distinction between a relational test using "==" and a variable assignment using the "=" symbol. A less confusing way of writing the rule would be

If service == good then tip = average

In general, the input to an if-then rule is the current value for the input variable (in this case, service) and the output is an entire fuzzy set (in this case, average). This set will later be defuzzified, assigning one value to the output.

Interpreting an if-then rule involves two distinct parts:

1. Evaluating the antecedent (which involves fuzzifying the input and applying any necessary fuzzy operators).

2. Applying that result to the consequent (known as implication). In the case of two-valued or binary logic, if-then rules do not present much difficulty: if the premise is true, then the conclusion is true. But for multivalued logic, if the antecedent is true to some degree of membership, then the consequent is also true to that same degree.

Thus:

in binary logic: p → q (p and q are either both true or both false)
in fuzzy logic: 0.5 p → 0.5 q (partial antecedents provide partial implication)

The antecedent of a rule can have multiple parts.

If sky is gray and wind is strong and barometer is falling, then ...

in which case all parts of the antecedent are calculated simultaneously and resolved to a single number using the logical operators described in the preceding section. The consequent of a rule can also have multiple parts.

If temperature is cold then hot water valve is open and cold water valve is shut

in which case all consequents are affected equally by the result of the antecedent. How is the consequent affected by the antecedent? The consequent specifies that a fuzzy set be assigned to the output. The implication function then modifies that fuzzy set to the degree specified by the antecedent. The most common ways to modify the output fuzzy set are truncation using the min function (where the fuzzy set is clipped) or scaling using the prod function (where the output fuzzy set is squashed). Both are supported by the toolbox; truncation is used for the examples in this section.
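As a sketch of the two implication methods (Python, with a made-up discretised consequent set; the values are illustrative, not taken from the project):

```python
# The consequent fuzzy set, sampled at a few points (a small triangle).
consequent = [0.0, 0.5, 1.0, 0.5, 0.0]
firing_strength = 0.6  # degree to which the antecedent is true

# min implication: the consequent set is truncated (clipped) at the strength.
truncated = [min(firing_strength, m) for m in consequent]
# prod implication: the consequent set is scaled (squashed) by the strength.
scaled = [firing_strength * m for m in consequent]

print(truncated)  # [0.0, 0.5, 0.6, 0.5, 0.0]
print(scaled)     # [0.0, 0.3, 0.6, 0.3, 0.0]
```

Truncation flattens the top of the set; scaling preserves its shape while shrinking it. Both methods reduce the consequent set exactly to the degree the antecedent fired.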

Fuzzy logic is a convenient way to map an input space to an output space. Mapping input to output is the starting point for everything. Consider the following examples:

With information about how good your service was at a restaurant, a fuzzy logic system can tell you what the tip should be.

With your specification of how hot you want the water, a fuzzy logic system can adjust the faucet valve to the right setting.

With information about how fast the car is going and how hard the motor is working, a fuzzy logic system can shift gears for you.

A graphical example of an input-output map is shown in the following figure.

To determine the appropriate amount of tip requires mapping inputs to the appropriate outputs. Between the input and the output, the preceding figure shows a black box that can contain any number of things: fuzzy systems, linear systems, expert systems, neural networks, differential equations, interpolated multidimensional lookup tables, or even a spiritual advisor, just to name a few of the possible options. Clearly the list could go on and on.

Of the dozens of ways to make the black box work, it turns out that fuzzy is often the very best way, because it is cheaper to build, easier to understand, and reusable.

Why Use Fuzzy Logic?

Here is a list of general observations about fuzzy logic:

Fuzzy logic is conceptually easy to understand.

The mathematical concepts behind fuzzy reasoning are very simple. Fuzzy logic is a more intuitive approach without the far-reaching complexity.

Fuzzy logic is flexible.

With any given system, it is easy to layer on more functionality without starting again from scratch.

Fuzzy logic is tolerant of imprecise data.

Everything is imprecise if you look closely enough, but more than that, most things are imprecise even on careful inspection. Fuzzy reasoning builds this understanding into the process rather than tacking it onto the end.

Fuzzy logic can model nonlinear functions of arbitrary complexity.

You can create a fuzzy system to match any set of input-output data. This process is made particularly easy by adaptive techniques like Adaptive Neuro-Fuzzy Inference Systems (ANFIS), which are available in Fuzzy Logic Toolbox software.

Fuzzy logic can be built on top of the experience of experts.

In direct contrast to neural networks, which take training data and generate opaque, impenetrable models, fuzzy logic lets you rely on the experience of people who already understand your system.

Fuzzy logic can be blended with conventional control techniques.

Fuzzy systems don't necessarily replace conventional control methods. In many cases fuzzy systems augment them and simplify their implementation.

Fuzzy logic is based on natural language.

The basis for fuzzy logic is the basis for human communication. This observation underpins many of the other statements about fuzzy logic. Because fuzzy logic is built on the structures of qualitative description used in everyday language, fuzzy logic is easy to use.

PROJECT DETAILS:

INPUT IMAGE DESCRIPTION: The input satellite image came in 7 bands, which were reduced to the following 3 bands:

Red, Green, Near-infrared (NIR)

Using the Fuzzy Logic Toolbox, the image is classified into 5 classes, each described by membership functions. The classes are:

Vegetation, Rocky, Urban, Waterbodies, Open land

USING FUZZY: The aim of this method is to classify each mixed pixel into a specific category with the help of fuzzy logic.

ALGORITHM:

STEP 1: Read and display the image.

In the MATLAB Command Window, the following instructions are written first:

>> clear all — clears all variables from the workspace.
>> clc — clears the Command Window screen.
After this, the image is read using the imread instruction and saved in a variable, say, a:

>> a = imread('file path containing image');

The prompt sign >> indicates the beginning of an instruction. Now this image is in 7 bands.

STEP 2: Extract the image in the red, green and NIR bands respectively:

>> i1 = a(:,:,1);
>> i2 = a(:,:,2);
>> i3 = a(:,:,3);

This stores the 3 band images in variables i1, i2 and i3. Now open each image by writing:

>> imtool(i1); >> imtool(i2); >> imtool(i3)

STEP 3: Concatenate the 3 band images into one using:

>> figure, imshow(cat(3, i1, i2, i3))
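The same extract-and-stack step can be sketched outside MATLAB (Python/NumPy here; the array dimensions are stand-ins, not the project's actual image size):

```python
import numpy as np

# Stand-in for the multiband image returned by imread (height x width x bands).
img = np.arange(4 * 5 * 3, dtype=np.uint8).reshape(4, 5, 3)

# Band extraction, the counterpart of i1 = a(:,:,1) etc. (NumPy is 0-indexed):
red, green, nir = img[:, :, 0], img[:, :, 1], img[:, :, 2]

# Concatenation along the band axis, the counterpart of cat(3, i1, i2, i3):
stacked = np.dstack((red, green, nir))

print(stacked.shape)            # (4, 5, 3)
print((stacked == img).all())   # True
```

Extracting all three bands and stacking them back reproduces the original array, which is exactly what the MATLAB steps above do before classification begins.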

Red band image

Green band image

NIR band image

Concatenated image in the red, green and NIR bands

We now work on this concatenated image.

With the help of the imtool instruction we can obtain the training sets/sites of the image. These training sites are the RGB values of the image, which define its colour and are used at the time of classification. From each training set, the minimum and maximum values of all three bands are taken and applied in the FIS Editor.

STEP 4: FIS EDITOR

To open the FIS Editor, type the instruction fuzzy at the command prompt; the window that opens is shown below:

Mamdani's fuzzy inference method is the most commonly seen fuzzy methodology, and it expects the output membership functions to be fuzzy sets. After the aggregation process, there is a fuzzy set for each output variable that needs defuzzification. A Sugeno-type system can be used to model any inference system in which the output membership functions are either linear or constant.

Here the Mamdani-type inference system is used.

In the FIS Editor we increase the number of inputs from one to three; the combined inputs will give the final image. This is done by going to the Edit menu and choosing Add Variable => Input.

MEMBERSHIP FUNCTION EDITOR:

a) By default there are three membership functions: mf1, mf2 and mf3. To make them five, go to EDIT, click ADD MFs and enter the number of membership functions to be added. Next, name these membership functions in the Name field; add the minimum, average and maximum values of each membership function from the training sets in the Params field (both in the right-hand panel); and change Range and Display Range from [0 1] to [0 255] in the left-hand panel. The range here is 0-255 because we have the DN values of an 8-bit image. Repeat these steps for the remaining 2 inputs. A triangular membership function is used for each class.
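A triangular membership function with parameters [a b c] (the minimum, average and maximum values entered in Params) is zero outside [a, c], rises linearly to 1 at the peak b, and falls back to zero at c, matching the toolbox's trimf. A minimal sketch (Python here as a neutral illustration, using the vegetation red-band parameters listed in the next section):

```python
def trimf(x, a, b, c):
    # Triangular membership function: 0 outside [a, c], 1 at the peak b,
    # linear in between (the shape of MATLAB's trimf).
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Degree to which a red-band DN value belongs to 'vegetation' [9 21 33]:
print(trimf(21, 9, 21, 33))  # 1.0 (at the peak)
print(trimf(15, 9, 21, 33))  # 0.5 (halfway up the rising edge)
print(trimf(40, 9, 21, 33))  # 0.0 (outside the support)
```

Each DN value therefore gets a membership degree in [0, 1] for every class, which is the fuzzification step of the inference system.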

Three inputs and their combined output

MEMBERSHIP FUNCTIONS FOR RED BAND:

Vegetation: [9 21 33]
Waterbodies: [19 22 25]
Rocky: [25 52 83]
Urban: [82 121 160]
Open_land: [101 140 179]

MEMBERSHIP FUNCTIONS FOR GREEN BAND:

Vegetation: [25 34 43]
Waterbodies: [25 32 40]
Rocky: [22 46 70]
Urban: [82 108 135]
Open_land: [70 106 143]

MEMBERSHIP FUNCTIONS FOR NEAR INFRARED:

Vegetation: [53 82 111]
Waterbodies: [1 13 25]
Rocky: [74 123 172]
Urban: [102 157 213]
Open_land: [173 203 234]

MEMBERSHIP FUNCTIONS FOR OUTPUT:

Vegetation: [0 1 2]
Rocky: [2 3 4]
Urban: [4 5 6]
Waterbodies: [6 7 8]
Open_land: [8 9 10]

RULE EDITOR: Now it is time to define the rules in the Rule Editor. Based on the descriptions of the input (red, green and nir) and output variables, the rules are constructed in the Rule Editor. For example, we define the rule: IF (red is mf1) AND (green is mf1) AND (nir is mf1) THEN class is mf1. The rules are entered in verbose format, the inputs are connected with the AND operator, and basic if-then rules are used. At this point the fuzzy inference system is completely defined: the variables, membership functions and rules necessary to calculate classes are all in place. The Rule Editor is shown below:

This fuzzy inference system is exported and saved to a file named final.fis with the following details:

[System]

Name='rules'

Type='mamdani'

Version=2.0

NumInputs=3

NumOutputs=1

NumRules=5

AndMethod='min'

OrMethod='max'

ImpMethod='min'

AggMethod='max'

DefuzzMethod='centroid'

[Input1]

Name='red'

Range=[0 255]

NumMFs=5

MF1='vegetation':'trimf',[9 21 33]

MF2='rocky':'trimf',[25 52 83]

MF3='urban':'trimf',[82 121 160]

MF4='waterbodies':'trimf',[19 22 25]

MF5='openland':'trimf',[70 106 143]

[Input2]

Name='green'

Range=[0 255]

NumMFs=5

MF1='vegetation':'trimf',[25 34 43]

MF2='rocky':'trimf',[22 46 70]

MF3='urban':'trimf',[82 108 135]

MF4='waterbodies':'trimf',[25 32 40]

MF5='openland':'trimf',[70 106 143]

[Input3]

Name='nir'

Range=[0 255]

NumMFs=5

MF1='vegetation':'trimf',[53 82 111]

MF2='rocky':'trimf',[74 123 172]

MF3='urban':'trimf',[102 157 213]

MF4='waterbodies':'trimf',[1 13 25]

MF5='openland':'trimf',[173 203 234]

[Output1]

Name='output1'

Range=[0 10]

NumMFs=5

MF1='vegetation':'trimf',[0 1 2]

MF2='rocky':'trimf',[2 3 4]

MF3='urban':'trimf',[4 5 6]

MF4='waterbodies':'trimf',[6 7 8]

MF5='openland':'trimf',[8 9 10]

[Rules]

1 1 1, 1 (1) : 1

2 2 2, 3 (1) : 1

3 3 3, 4 (1) : 1

4 4 4, 2 (1) : 1

5 5 5, 5 (1) : 1
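Putting the listing together, the Mamdani pipeline defined in final.fis (trimf fuzzification, AndMethod min, ImpMethod min, AggMethod max, centroid defuzzification) can be sketched for a single pixel. This is an illustrative Python re-implementation, not the toolbox's evalfis: it pairs each class's input membership functions with the same-named output membership function (as the Rule Editor example suggests), and its results are approximate because the output range is discretised:

```python
def trimf(x, a, b, c):
    # Triangular membership function (the shape of MATLAB's trimf).
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# One row per rule: (red MF, green MF, nir MF, output MF), with the
# parameters taken from the [Input*] and [Output1] sections above.
RULES = [
    ([9, 21, 33],    [25, 34, 43],   [53, 82, 111],   [0, 1, 2]),   # vegetation
    ([25, 52, 83],   [22, 46, 70],   [74, 123, 172],  [2, 3, 4]),   # rocky
    ([82, 121, 160], [82, 108, 135], [102, 157, 213], [4, 5, 6]),   # urban
    ([19, 22, 25],   [25, 32, 40],   [1, 13, 25],     [6, 7, 8]),   # waterbodies
    ([70, 106, 143], [70, 106, 143], [173, 203, 234], [8, 9, 10]),  # openland
]

def evaluate(red, green, nir, steps=1000):
    """Mamdani inference for one pixel: returns the defuzzified score in [0, 10]."""
    # Antecedent: AND the three band memberships together with min.
    strengths = [min(trimf(red, *r), trimf(green, *g), trimf(nir, *n))
                 for r, g, n, _ in RULES]
    # min implication and max aggregation over a discretised output range,
    # followed by centroid defuzzification.
    num = den = 0.0
    for k in range(steps + 1):
        z = 10.0 * k / steps
        agg = max(min(s, trimf(z, *rule[3])) for s, rule in zip(strengths, RULES))
        num += z * agg
        den += agg
    return num / den if den > 0 else 5.0  # fall back to the midpoint if no rule fires

# A pixel sitting on the vegetation peaks of all three bands fires only the
# vegetation rule, so the centroid lands near 1, inside the [0, 2) bin:
print(evaluate(red=21, green=34, nir=82))  # ≈ 1.0
```

The defuzzified score then falls into one of the five output bins ([0 1 2], [2 3 4], ...), which is exactly what the thresholding in the classification code below relies on.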

Coding: For coding, go to FILE, then New, then click M-File; the Editor window opens. The following code is written:

clear all
clc
b = imread('C:\Users\vatsal\Desktop\image.tif');
y = readfis('final.fis');
I = b(:,:,1:3);
a = I;
x = size(I);
for i = 1:x(1)
    for j = 1:x(2)
        red   = double(I(i,j,1));
        green = double(I(i,j,2));
        nir   = double(I(i,j,3));
        q(i,j) = evalfis([red green nir], y);
        if (q(i,j) >= 0 && q(i,j) < 2)        % vegetation
            a(i,j,1) = 0;   a(i,j,2) = 255; a(i,j,3) = 0;
        elseif (q(i,j) >= 2 && q(i,j) < 4)    % rocky
            a(i,j,1) = 80;  a(i,j,2) = 0;   a(i,j,3) = 80;
        elseif (q(i,j) >= 4 && q(i,j) < 6)    % urban
            a(i,j,1) = 0;   a(i,j,2) = 0;   a(i,j,3) = 255;
        elseif (q(i,j) >= 6 && q(i,j) < 8)    % water
            a(i,j,1) = 0;   a(i,j,2) = 0;   a(i,j,3) = 0;
        elseif (q(i,j) >= 8 && q(i,j) < 10)   % openland
            a(i,j,1) = 255; a(i,j,2) = 0;   a(i,j,3) = 0;
        else                                  % unclassified
            a(i,j,1) = 255; a(i,j,2) = 255; a(i,j,3) = 255;
        end
    end
end
imtool(a)

Now run this code. The final, classified image is obtained, as shown below:

Vegetation: [0 255 0], represented by the colour green
Rocky: [80 0 80], represented by the colour purple
Urban: [0 0 255], represented by the colour blue
Waterbodies: [0 0 0], represented by the colour black
Openland: [255 0 0], represented by the colour red
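The elseif ladder in the M-file amounts to a simple lookup from the defuzzified score to a class label and display colour. A sketch of that mapping (Python; the classify name is hypothetical):

```python
def classify(score):
    """Map a defuzzified output score in [0, 10] to (class, RGB colour),
    mirroring the elseif ladder in the M-file; anything outside the
    expected bins is painted white, as in the original code."""
    bins = [
        (0, 2,  'vegetation',  (0, 255, 0)),    # green
        (2, 4,  'rocky',       (80, 0, 80)),    # purple
        (4, 6,  'urban',       (0, 0, 255)),    # blue
        (6, 8,  'waterbodies', (0, 0, 0)),      # black
        (8, 10, 'openland',    (255, 0, 0)),    # red
    ]
    for lo, hi, label, colour in bins:
        if lo <= score < hi:
            return label, colour
    return 'unclassified', (255, 255, 255)      # white

print(classify(1.0))   # ('vegetation', (0, 255, 0))
print(classify(7.2))   # ('waterbodies', (0, 0, 0))
```

Applying this lookup to every pixel's score produces the colour-coded classification map shown above.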

Conclusion:

The satellite image was successfully classified using the concept of fuzzy logic.

REFERENCES:

www.google.com

rst.gsfc.nasa.gov/

en.wikipedia.org/wiki/Remote_sensing

http://www.ccrs.nrcan.gc.ca/resource/index_e.php#tutor

http://drdo.gov.in/drdo/English/index.jsp?pg=homebody.jsp

MATLAB software and help

Digital Image Processing Using MATLAB by Rafael C. Gonzalez, Richard E. Woods and Steven L. Eddins
