IMAGE PROCESSING


(SESSION 2012-2013)
MAJOR PROJECT REPORT ON IMAGE PROCESSING
SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENT FOR THE DEGREE OF BACHELOR OF COMPUTER APPLICATION
Submitted To:

MAKHANLAL CHATURVEDI NATIONAL UNIVERSITY OF COMMUNICATION & JOURNALISM, BHOPAL (M.P.)

Guided By: Mr. RAVI SIR
Head of Dept.: Mr. ABHISHEKH VERMA
Submitted By: AMIT PATEL, ARVIND PATEL, BRIJKISHOR PATEL, NEELESH PATEL

(SESSION 2012-2013)
DEPARTMENT OF COMPUTER APPLICATION

CERTIFICATE

This is to certify that the project synopsis on Image Processing, submitted by AMIT PATEL (016294), ARVIND PATEL, BRIJKISHOR PATEL and NEELESH PATEL to Extol College, Bhopal, in partial fulfillment of the requirements for the award of the degree of Bachelor of Computer Application, is a satisfactory account of their project work and is recommended for the award of the degree.

Guided By: Mr. Ravi Sir
Head of Dept.: Mr. Abhisekh Varma

ACKNOWLEDGEMENT

It is a privilege to express our deep sense of gratitude to our guide Mr. Ravi Sir, Department of Computer Application, EXTOL COLLEGE, Bhopal, M.P., for his constant encouragement, valuable guidance and benevolent help, which were of the greatest support in bringing this work to its present shape. This work is the result of the inspiration, support, guidance, motivation, cooperation and facilities that were extended to us at all levels. The discussions with him regarding various issues of our project have been very beneficial and gave us a new direction of thinking; they played a vital role in the progress of our work at many critical points during our endeavour. We are highly indebted to Mr. Abhisekh Varma, Head of the Department of Computer Application, EXTOL COLLEGE, Bhopal, for providing us all the necessary facilities and guidance. We are thankful to our faculty for their valuable lectures on UML design and related topics, which helped us in designing this project. We would also like to acknowledge those who, from behind the scenes, contributed their ideas and energy.

AMIT PATEL

ARVIND PATEL BRIJKISHOR PATEL

NEELESH PATEL

DECLARATION

We hereby certify that the work presented in the project entitled IMAGE PROCESSING by Amit Patel, Arvind Patel, Brijkishor Patel and Neelesh Patel, in partial fulfillment of the requirements for the award of the degree of B.C.A., submitted in the Department of Computer Application at EXTOL COLLEGE under M.C.N.U.C & J, Bhopal, is an authentic record of our own work carried out under the supervision of Mr. Ravi Sir.

Project Guide: Mr. Ravi Sir

TABLE OF CONTENTS

Abstract
1 Introduction
  1.1 Problem Statement
  1.2 Aim
  1.3 Study of Current System
  1.4 Proposed System
  1.5 Feasibility Study
  1.6 Document Conventions
  1.7 Intended Audience
2 Technology Survey
  2.1 Introduction to Technology/Language
  2.2 Features of the Technology Related to the Project
3 System Conception
4 Analysis and Modeling
  4.1 Analysis
    4.1.1 Domain Analysis
    4.1.2 Application Analysis
    4.1.3 Cost and Benefit Analysis
  4.2 UML Modeling
    4.2.1 Sequence Diagram
    4.2.2 Use Case Diagram
    4.2.3 Collaboration Diagram
    4.2.4 Activity Diagram
  4.3 Data Modeling
    4.3.1 Data Flow Diagram
    4.3.2 ER Diagram
    4.3.3 Normalization
5 System Requirement
  5.1 Infrastructure Requirements
  5.2 Hardware & Software
  5.3 Other Nonfunctional Requirements
  5.4 Performance Requirements
  5.5 Security Requirements
6 System Design
  6.1 Reusability Plan
  6.2 Sub Systems
  6.3 Modules Specification
  6.4 Class Diagram
  6.5 Object Diagram
  6.6 Algorithm / Core Logic
7 Coding and Snapshot
  7.1 Coding
  7.2 Snapshot
8 Testing
  8.1 Unit Testing
  8.2 Black Box Testing
  8.3 White Box Testing
  8.4 Alpha Testing
  8.5 Beta Testing
9 Conclusion
10 References
  10.1 Books
  10.2 URL Links

ABSTRACT

Image processing tools are being rapidly developed for different operating system platforms. These tools are usually large, are not completely portable across platforms and lack the ability to be efficiently fielded on the Internet. The purpose of this project is to research current image processing tools and create a simple, easy and flexible image processing widget based on the Java Advanced Imaging (JAI) API. This widget addresses the general problems mentioned above that are associated with most image processing tools. Some popular image processing applications are discussed, including their strengths, weaknesses and popularity. This is followed by a more detailed discussion of creating a Java widget based on JAI. The features that make this widget easy to use, both for the average user and for any software developer wanting to expand it, are discussed, along with its further advantages and disadvantages.

1 INTRODUCTION

The brief definition of image processing is the ability to retrieve information from images. This is achieved by first transforming the image into a data set; mathematical operations can then be performed on this new format. Different kinds of information are retrieved by the different operations performed on the data set. It is important to note that the reverse, constructing an image from data, is also image processing. A simple example: how do we find the differences between two images? We execute an arithmetic subtraction operation on both images. As the name implies, subtraction leaves the difference, so the resulting image contains the differences between the two images. Image processing is used in many different fields: in medicine (ultrasound and X-ray machines), astronomy (the Hubble telescope taking photographs in X-rays, gamma rays and infrared) and the military (image maps used in ground-hugging missiles), to name just a few. Image processing also appears in everything from everyday items (digital cameras) to mission-critical systems.
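To make the subtraction example concrete, here is a minimal Java sketch, not part of the project code (the class name and output file are invented), that computes the per-pixel absolute difference of two images of the same size using the standard BufferedImage and ImageIO classes:

import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

public class ImageDifference {
    // Returns an image whose pixels are the absolute per-channel
    // difference of the two input images (assumed to be the same size).
    static BufferedImage subtract(BufferedImage a, BufferedImage b) {
        BufferedImage out = new BufferedImage(a.getWidth(), a.getHeight(),
                BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < a.getHeight(); y++) {
            for (int x = 0; x < a.getWidth(); x++) {
                int p = a.getRGB(x, y), q = b.getRGB(x, y);
                int r  = Math.abs(((p >> 16) & 0xFF) - ((q >> 16) & 0xFF));
                int g  = Math.abs(((p >> 8) & 0xFF) - ((q >> 8) & 0xFF));
                int bl = Math.abs((p & 0xFF) - (q & 0xFF));
                out.setRGB(x, y, (r << 16) | (g << 8) | bl);
            }
        }
        return out;
    }

    public static void main(String[] args) throws Exception {
        BufferedImage a = ImageIO.read(new File(args[0]));
        BufferedImage b = ImageIO.read(new File(args[1]));
        ImageIO.write(subtract(a, b), "png", new File("difference.png"));
    }
}

Identical regions come out black; anything that changed between the two inputs shows up as non-zero pixels.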

1.1 PROBLEM STATEMENT

Image processing is the ability to extract information from images; this is the shortest and simplest definition. To extract information of any kind from an image, the image first has to be transformed into a data set, to which mathematical operations can then be applied. This report starts with an introduction to image processing, followed by descriptions of current image processing applications and libraries. There are numerous programs that handle image processing; Photoshop, for example, was not designed for image processing functions per se, but rather for creative and artistic needs. Portability comes at a cost, namely processing speed: JAI is slower than other graphics libraries because it runs on a Java Virtual Machine (JVM). Despite this, JAI is still popular and has other advantages. JAI is probably the most viable graphics library that can be deployed over the Internet, since the required runtime is already in the JVM of the client's browser, and JAI inherits Java's highly modular programming structure, allowing programmers to build highly modular programs. The problem description, with possible enhancements, is followed by discussions of the areas that were researched for this project: why these areas were deemed important, sample scenarios and some sample solutions. The degree of success achieved by this project is then discussed and further improvements are suggested. This is followed by a partial code listing in the appendix; all the interfaces used in the implementation are given, but no source code is included. The primary goal of this project is to create a component that allows fast and easy access to image processing tools while being lightweight and portable across multiple operating system platforms.

1.2 AIM

The brief definition of image processing is the ability to retrieve information from images. This is achieved by first transforming the image into a data set; mathematical operations can then be performed on this new format. Different kinds of information are retrieved by the different operations performed on the data set. It is important to note that the reverse, constructing an image from data, is also image processing. A simple example: how do we find the differences between two images? We execute an arithmetic subtraction operation on both images. As the name implies, subtraction leaves the difference, so the resulting image contains the differences between the two images. Image processing is used in many different fields: in medicine (ultrasound and X-ray machines), astronomy (the Hubble telescope taking photographs in X-rays, gamma rays and infrared) and the military (image maps used in ground-hugging missiles), these being just a few of the fields in which image processing is widely used. Image processing is also used in everything from everyday items (digital cameras) to mission-critical systems.

1.3 STUDY OF CURRENT SYSTEM

Today, computers and computer-generated images touch many aspects of daily life. Computer imagery is found on television, in newspapers (for example in weather reports) and in all kinds of medical investigation and surgical procedures. A well-constructed graph can present complex statistics in a form that is easier to understand and interpret; in the media, such graphs are used to illustrate papers, reports, theses and other presentation material. Many powerful tools have been developed to visualize data. Computer-generated imagery can be categorized into several different types: 2D, 3D and animated graphics. As technology has improved, 3D computer graphics have become more common, but 2D computer graphics are still widely used. Computer graphics has emerged as a sub-field of computer science which studies methods for digitally synthesizing and manipulating visual content. Over the past decade, other specialized fields have developed, such as information visualization and scientific visualization, which are more concerned with the visualization of three-dimensional phenomena.

1.4 PROPOSED SYSTEM

One very important feature that was not in the specification was implemented: an on-the-fly check of the modified image. This involves double-clicking any tool on the drawing area, which, if connected correctly, opens the properties window with the modified image up to that point. Further, modifier tools with custom properties were made to show real-time image modification in response to changes the user makes to their settings. Thus, less time is wasted in running the whole design.

1.5 FEASIBILITY STUDY

A feasibility study is about the viability of a system. The proposed system has to be examined for its technical, economical and operational feasibility. This system is for applying effects to image files. Many alternatives are considered and the one which best suits our requirements is chosen. The following points should be kept in mind when choosing an alternative:
Greater speed of processing
Effective procedures eliminating errors
Better accuracy
Fast retrieval of data
Efficient way to store data

These alternatives are taken into account and a better system is designed. The system is then thoroughly scrutinized to make sure of its practicability.

1.6 DOCUMENT CONVENTIONS

JAI's easy-to-use programming model simplifies the tasks required to create imaging software, thereby reducing the time needed to develop applications. Because it is built on the network-centric Java platform, developers can use it to build collaborative applications for high-end image processing and visualization over the network. It offers the first Java-based, open-specification, cross-platform, extensible imaging API, enabling developers to focus on creating the right applications regardless of the underlying computing platforms.

1.7 INTENDED AUDIENCE

This project is intended for people who work with images and want to create an image with various kinds of effects applied to it. Image processing is the solution to that problem: it provides a way to retrieve information from images and to apply various effects to an image (gray filter, red filter, rotate, zoom, shear, flip, image morphing) to create a new image. The bottom line is to make sure that the information is appropriate for your needs; the preface will often answer these questions, and modern retrieval and search systems also sometimes indicate the targeted audience.

2 TECHNOLOGY SURVEY

2.1 Introduction to technology/language

Java is a purely object-oriented programming language. Java is platform independent, provides automatic memory management, and is robust (fault tolerant) and secure. Java is a multithreaded language, able to execute multiple tasks at a time, and is widely used for games and software development because of its reliability. Java was developed in 1991 by James Gosling; its initial name was Oak. The first version of Java, JDK 1.0, was released in 1996. The version of Java used in this project is JDK 1.6.0.

Sun Microsystems released the first public implementation as Java 1.0 in 1996. It promised "Write Once, Run Anywhere" (WORA), providing no-cost run-times on popular platforms. Fairly secure and featuring configurable security, it allowed network- and file-access restrictions. Major web browsers soon incorporated the ability to run Java applets within web pages, and Java quickly became popular. With the advent of Java 2 (released initially as J2SE 1.2 in December 1998), new versions had multiple configurations built for different types of platforms: J2EE targeted enterprise applications, the greatly stripped-down J2ME targeted mobile applications (Mobile Java), and J2SE designated the Standard Edition. In 2006, for marketing purposes, Sun renamed new J2 versions as Java EE, Java ME and Java SE, respectively.

In 1997, Sun Microsystems approached the ISO/IEC JTC1 standards body and later Ecma International to formalize Java, but it soon withdrew from the process. Java remains a de facto standard, controlled through the Java Community Process. At one time, Sun made most of its Java implementations available without charge, despite their proprietary software status; Sun generated revenue from Java through the selling of licenses for specialized products such as the Java Enterprise System. Sun distinguishes between its Software Development Kit (SDK) and Runtime Environment (JRE), a subset of the SDK; the primary distinction is that the JRE lacks the compiler, utility programs and header files.

On November 13, 2006, Sun released much of Java as free and open source software (FOSS) under the terms of the GNU General Public License (GPL). On May 8, 2007, Sun finished the process, making all of Java's core code available under free software/open-source distribution terms, aside from a small portion of code to which Sun did not hold the copyright. Sun's vice-president Rich Green said that Sun's ideal role with regard to Java was as an "evangelist." Following Oracle Corporation's acquisition of Sun Microsystems in 2009-2010, Oracle has described itself as the "steward of Java technology with a relentless commitment to fostering a community of participation and transparency". According to Oracle, over three billion devices now run Java. On April 2, 2010, James Gosling resigned from Oracle.

Principles

There were five primary goals in the creation of the Java language:
1. It should be "simple, object-oriented and familiar".
2. It should be "robust and secure".
3. It should be "architecture-neutral and portable".
4. It should execute with "high performance".
5. It should be "interpreted, threaded, and dynamic".

Versions

Major release versions of Java, along with their release dates:

Java Platform

One characteristic of Java is portability, which means that computer programs written in the Java language must run similarly on any hardware/operating-system platform. This is achieved by compiling the Java language code to an intermediate representation called Java bytecode, instead of directly to platform-specific machine code. Java bytecode instructions are analogous to machine code, but are intended to be interpreted by a virtual machine (VM) written specifically for the host hardware. End users commonly use a Java Runtime Environment (JRE) installed on their own machine for standalone Java applications, or in a web browser for Java applets. Standardized libraries provide a generic way to access host-specific features such as graphics, threading and networking. A major benefit of using bytecode is porting; however, the overhead of interpretation means that interpreted programs almost always run more slowly than programs compiled to native executables, so just-in-time compilers that compile bytecode to machine code during runtime were introduced from an early stage.

Implementations

Expandability is another area of importance in building our component. Expandability involves the addition of extra image-processing functions to the component, and again there should be a way to achieve this relatively quickly and easily. For example, if it were decided to add a threshold function, there should be a clear and precise way of adding the necessary button(s) and corresponding code with a minimal amount of work. This area was of particular interest because our component is based on the JAI API, and any API changes with time: there could be anything from slight updates to major overhauls of the whole API, and whole sets of new functions could be added in a newer version. Therefore, the initial conclusion was that the part of our component that communicates with JAI had to be made extremely flexible so that more functions could be added later. The same problem can be looked at from a different perspective: how would a programmer be able to add more image-processing tools without disturbing any of the existing tools or the core component? The approach taken is to keep the image-processing tools separate from the core component; that is, all image-processing tools are loaded by the core component at runtime. Thus, in contrast to extensibility, expandability does not require the addition or modification of the core component or any existing tools: the programmer is only expanding our component. In solving this problem what has been achieved is the complete separation of the core component from the JAI tools. This means the core component can exist without any JAI tools; of course the user could not do any image processing, as there would be no JAI-based tools, but the user could still experiment with the component by adding and connecting the basic tools (image input, viewers and connectors). This is another example of the highly modular design of our component. The following method was designed for modularising the JAI tools of our component. JAI tools are divided into logical categories; for example, the category Arithmetic would contain the arithmetic operations Add, Subtract, Multiply and Divide. Each such category is packaged as a module of JAI tools, and each module contains a configuration file describing the JAI tools within it so that the module can be loaded into our component.
Addition of JAI tools to an existing module is not expected (though possible), as modules can be made by third parties who may want to retain control over their modules and thus disallow any changes to them. The general procedure used to create a module of JAI tools is as follows:
1) Determine the category and the tools that will be contained in the module.
2) Extend the class provided by the core component to implement the JAI tools. As stated earlier, the tools on the design area have certain formatting and properties; extending the class gives the new JAI tool the same formatting and properties. The JAI tool itself only has to implement a few methods.
3) Any access to resources, such as extra images, should be handled from within the JAI tool itself. Again, the class supplied by the core component handles the setting of tool icons and other general settings.
4) Create the configuration file. This is a text file and should be named after the module, followed by a .config extension. The first line should contain the string that will become the category name. The following lines should contain the full path to the JAI tools contained within the module. The order in which the JAI tools are listed is the order in which they appear.
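As an illustration of the configuration file format described above, a module named Arithmetic might ship with a file Arithmetic.config like the following; the package and class names are invented for this example and are not taken from the project sources:

Arithmetic
imageprocessor.modules.arithmetic.AddTool
imageprocessor.modules.arithmetic.SubtractTool
imageprocessor.modules.arithmetic.MultiplyTool
imageprocessor.modules.arithmetic.DivideTool

A minimal sketch, again only an assumption about how the core component's loader might look, of reading such a file and instantiating the listed tool classes reflectively:

import java.io.BufferedReader;
import java.io.FileReader;
import java.util.ArrayList;
import java.util.List;

class ModuleLoader {
    // Reads a module's .config file: the first line is the category name,
    // each following line is the fully qualified class name of a JAI tool.
    static List<Object> loadModule(String configPath) throws Exception {
        List<Object> tools = new ArrayList<Object>();
        BufferedReader in = new BufferedReader(new FileReader(configPath));
        try {
            String category = in.readLine();            // e.g. "Arithmetic"
            System.out.println("Loading category: " + category);
            String line;
            while ((line = in.readLine()) != null) {
                if (line.trim().length() == 0) continue;
                // Instantiate each listed tool class by its fully qualified name.
                tools.add(Class.forName(line.trim()).newInstance());
            }
        } finally {
            in.close();
        }
        return tools;
    }
}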

2.2 Features of the technology related to the project

In our project we made use of the Java language. Java is a purely object-oriented programming language. OOP is an object-based approach to developing a software solution for a given problem; in OOP everything is modeled in the form of objects. OOP has five major features:

Inheritance
Abstraction
Encapsulation
Polymorphism
Data hiding / Security

Inheritance

In object-oriented programming (OOP), inheritance is a way to reuse the code of existing objects, to establish a subtype from an existing object, or both, depending upon programming language support. In classical inheritance, where objects are defined by classes, classes can inherit attributes and behavior (i.e., previously coded algorithms associated with a class) from pre-existing classes called base classes, superclasses, parent classes or ancestor classes. The new classes are known as derived classes, subclasses or child classes. The relationships of classes through inheritance give rise to a hierarchy. In prototype-based programming, objects can be defined directly from other objects without the need to define any classes, in which case this feature is called differential inheritance. Complex inheritance, or inheritance used within an insufficiently mature design, may lead to the yo-yo problem.
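A minimal Java illustration of classical inheritance; the class names are invented for the example and are not part of the project code:

// Base class: common behaviour shared by all tools.
class Tool {
    public String describe() {
        return "generic tool";
    }
}

// Derived class: inherits describe() from Tool and overrides it.
class GrayscaleTool extends Tool {
    @Override
    public String describe() {
        return "grayscale " + super.describe();
    }
}

class InheritanceDemo {
    public static void main(String[] args) {
        Tool t = new GrayscaleTool();      // the subtype is used through the base type
        System.out.println(t.describe());  // prints "grayscale generic tool"
    }
}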

Abstraction

Abstraction is the process by which data and programs are defined with a representation similar in form to their meaning (semantics), while hiding away the implementation details. Abstraction tries to reduce and factor out details so that the programmer can focus on a few concepts at a time. A system can have several abstraction layers whereby different meanings and amounts of detail are exposed to the programmer. For example, low-level abstraction layers expose details of the hardware where the program runs, while high-level layers deal with the business logic of the program.

Encapsulation

Encapsulation is the bundling of an object's data with the operations that work on that data. Accessors hide how the data is derived; information hiding prevents external objects from using the derived data altogether. In the strictest OO sense of the term, encapsulation is gathering all operations on the object's state into the object's interface, and only those operations; this is one sense in which one can talk about "enforcing encapsulation". The other is that methods can only operate on the state of their own object, which implies that other objects in the system employ information hiding.

Data hiding / Security

Information hiding is the principle of segregating the design decisions in a computer program that are most likely to change, thus protecting other parts of the program from extensive modification if a design decision is changed. The protection involves providing a stable interface which shields the remainder of the program from the implementation (the details that are most likely to change). Put another way, information hiding is the ability to prevent certain aspects of a class or software component from being accessible to its clients, through an explicit exporting policy and through reliance on the short form as the primary vehicle for class documentation.

3 SYSTEM CONCEPTION

Software specifications are nearly always very abstract. The specification does not talk about implementing data structures, still less about the implementation language. However, this specification requires a component, similar to a re-usable Unix widget, for the Java Advanced Imaging (JAI) Application Programming Interface (API). Thus, we already know which language this component will be implemented in and hence have a good idea of the final product. In Java, such components are implemented as a Java Bean ("bean" for short); hence the title of this specification. Despite the fact that we already have prior knowledge of the final product, the specification is kept as abstract as possible.

4 ANALYSIS AND MODELING

4.1 Analysis

Analysis is a detailed study of the various operations performed by a system and of their relationships, both within and outside the system. Data is collected on the available files, decisions and transactions handled by the present system, and all the logical aspects of the system are covered in this phase. Analysis is the most important phase in the development of a system: one has to study the existing system in detail and collect the necessary information regarding the system to be designed. In this phase flowcharts and DFDs are made indicating the data flow in the system; only then can a correct system be built. Analysis is conducted with the following objectives in mind:

Identify the customer's needs.
Evaluate the system concept for feasibility.
Perform economic and technical analysis.
Allocate functions to hardware, software, databases and other system elements.

4.1.1 Domain Analysis

Expandability is another area of importance in building our component. Expandability involves the addition of extra image-processing functions to our component, and again there should be a way to achieve this relatively quickly and easily. For example, if it were decided to add a threshold function, there should be a clear and precise way of adding the necessary button(s) and corresponding code with a minimal amount of work.

4.1.2 Application Analysis

By following this procedure, extra features can be added to the core component. This highly modular design also provides additional advantages, such as allowing multiple programmers to work in parallel. In all, three files are modified. When a function to be added is finalised and added to the appropriate interface, other programmers can simultaneously write the code for the other two files. These programmers are able to work independently of each other, as all they need is the interface. Furthermore, while there would be some final integration testing of the new component, much testing can be carried out individually (including compilation) while the other part is still being written. This saves time, reduces the requirement that all programmers be familiar with the whole component (familiarity with the sub-component is enough) and thus increases efficiency in adding extensibility to our component.

4.1.3 Cost and Benefit Analysis

The above-mentioned on-the-fly check could be extended to show changes to images that lie higher up the hierarchy, such as the final output. A user would then be able to see real-time changes to, say, the final output by changing settings on other tools, which would further boost productivity. Scripting ability could also be added to the component; this would be the major change if a future version of this product were developed. The specification and the resulting design had no requirement for scripting capability to be built into the component, so this change is expected to require significantly more work than a simple extension or the addition of a feature. It would also allow the component to be used in a non-graphical sense, to create a design from within a script and obtain the final output. The parts that were not finished, saving a design and opening a design, are expected to be implemented in the next upgrade of the product or in a new release. Other standard features can be added to this component, including undo and redo functions and the ability to handle multiple designs.

4.2 UML Modeling

4.2.1 Sequence Diagram

[Sequence diagram: participants are the user, sign-up validation, image-and-operation selection, and save-and-exit. Messages: 1. login(); 2. valid name -> Home(); 3. invalid login name; 4. invalid data() on image(); 5. get back(); 6. request to home(); 7. response(); 8. save work().]

4.2.2 USE CASE DIAGRAM

4.2.4 Activity Diagram

[Activity diagram: enter login name -> gather info -> verify (no: back to login; yes: Home) -> browse image -> select operation -> confirm save (negative response: back; positive response: save & back to Home) -> exit.]

4.3 Data Modeling

4.3.1 Data Flow Diagram

[Data flow diagram: the user logs in with a name and is validated; a validated user selects an image and is allowed to perform operations (delete old image, create new image, Save as) before exiting.]

4.3.2 ER Diagram

[ER diagram: from the front view / main panel the user enters a user name; a valid name leads to the home page, an invalid name is rejected. The home page menus cover file (open, save, close, exit), edit (undo, redo, compare), filter (invert, contrast, gray, blur), transformation (rotate, resize, shear, flip) and an image-difference view of IMAGE1 and IMAGE2.]

4.3.3 NORMALIZATION

In image processing, normalization is a process that changes the range of pixel intensity values. Applications include photographs with poor contrast due to glare, for example. Normalization is sometimes called contrast stretching; in more general fields of data processing, such as digital signal processing, it is referred to as dynamic range expansion. The purpose of dynamic range expansion in the various applications is usually to bring the image, or other type of signal, into a range that is more familiar or normal to the senses, hence the term normalization. Often, the motivation is to achieve consistency in dynamic range for a set of data, signals or images, to avoid mental distraction or fatigue; for example, a newspaper will strive to make all of the images in an issue share a similar range of grayscale. Normalization is a linear process. If the intensity range of the image is 50 to 180 and the desired range is 0 to 255, the process entails subtracting 50 from each pixel intensity, making the range 0 to 130, and then multiplying each pixel intensity by 255/130, making the range 0 to 255. Auto-normalization in image processing software typically normalizes to the full dynamic range of the number system specified in the image file format. The normalization process will produce iris regions which have the same constant dimensions, so that two photographs of the same iris under different conditions will have characteristic features at the same spatial location.
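The linear stretch described above can be sketched in Java as follows; this is an illustrative example only, assuming an 8-bit grayscale (TYPE_BYTE_GRAY) input image, and the class name is invented:

import java.awt.image.BufferedImage;
import java.awt.image.Raster;
import java.awt.image.WritableRaster;

class Normalize {
    // Stretches the intensity range [min, max] found in the image to [0, 255].
    static BufferedImage stretch(BufferedImage src) {
        Raster in = src.getRaster();
        int w = src.getWidth(), h = src.getHeight();
        int min = 255, max = 0;
        // First pass: find the actual intensity range, e.g. 50..180.
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++) {
                int v = in.getSample(x, y, 0);
                if (v < min) min = v;
                if (v > max) max = v;
            }
        if (max == min) return src;   // flat image, nothing to stretch
        BufferedImage dst = new BufferedImage(w, h, BufferedImage.TYPE_BYTE_GRAY);
        WritableRaster out = dst.getRaster();
        double scale = 255.0 / (max - min);   // e.g. 255/130 for the range 50..180
        // Second pass: subtract the minimum and scale up to the full range.
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++)
                out.setSample(x, y, 0,
                        (int) Math.round((in.getSample(x, y, 0) - min) * scale));
        return dst;
    }
}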

5 System Requirement

5.1 Infrastructure Requirements

5.2 Hardware & Software

Hardware requirements: the minimum system requirement to install and use the software is a JVM (Java Virtual Machine) with its JRE (Java Runtime Environment). The minimum configuration is:
RAM: 256 MB (recommended)
Processor: Pentium III 450 MHz
Operating System: Windows 2000 or Windows XP
Hard Disk Space: 2 GB (including 500 MB of free space on disk)

Software requirements:
Front End: NetBeans IDE 6.5+
Back End: NetBeans + JCreator (Java graphics), AWT & Swing

5.3 Other Nonfunctional Requirements

5.4 Performance Requirements

It is the process of assessing the development organization's ability to construct a proposed system. A test is made to see whether reliable hardware and software, and technical resources capable of meeting the needs of the proposed system, can be acquired or developed by the organization in the required time. While assessing technical feasibility, the issues considered include system performance, system interfaces, development processes, risks, failure immunity and security.

5.5 Security Requirements

Image compression usually considers the minimization of storage space as its main objective. It is desirable, however, to code images so that we have the ability to process the resulting representation directly. In this project we explore an approach to document image compression that is efficient in both space (storage requirement) and time (processing flexibility). Image processing allows you to modify the appearance of an image by applying various types of filters, scaling options or transformations. The simplest type of processing is linear scaling: one pixel from the source image is multiplied by a scale factor, then an offset term is added, and the original pixel value is replaced with the resulting value. This process is repeated for each pixel in the image. Images account for a significant and growing fraction of Web downloads. The traditional approach to transporting images uses TCP, which provides a generic reliable, in-order byte-stream abstraction, but which is overly restrictive for image data. We analyze the progression of image quality at the receiver over time and show that the in-order delivery abstraction provided by a TCP-based approach prevents the receiver application from processing and rendering portions of an image when they actually arrive. The end result is that an image is rendered in bursts interspersed with long idle times rather than smoothly.
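The linear scaling described above (multiply each source pixel by a scale factor, then add an offset) maps directly onto the standard java.awt.image.RescaleOp class. A minimal sketch, assuming a plain RGB source image; the scale factor and offset values here are arbitrary choices for illustration:

import java.awt.image.BufferedImage;
import java.awt.image.RescaleOp;
import java.io.File;
import javax.imageio.ImageIO;

class LinearScaleDemo {
    public static void main(String[] args) throws Exception {
        BufferedImage src = ImageIO.read(new File(args[0]));
        // newPixel = oldPixel * 1.2 + 15, applied to every band and clamped to 0..255
        RescaleOp op = new RescaleOp(1.2f, 15f, null);
        BufferedImage dst = op.filter(src, null);
        ImageIO.write(dst, "png", new File("rescaled.png"));
    }
}

A scale factor above 1 increases contrast and a positive offset brightens the image; values below 1 or negative offsets do the opposite.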

6 SYSTEM DESIGN

6.1 Reusability Plan

Graphic design in Photoshop is based on toolboxes and palettes. As is almost standard in graphic design applications, Photoshop uses floating toolboxes and palettes. This means that a particular toolbox or palette can be dragged to whatever position the user wants and will be visible at all times, that is, it stays on top of the application. This allows fast access to frequently used tools and customized familiarity: a different configuration can easily be changed into a particular configuration just by dragging and moving the toolboxes and palettes around. It also means that most users who have been using other image processing applications should find at least some familiarity when they switch over to Photoshop.

6.2 Sub Systems

It is the process of assessing the development organization's ability to construct a proposed system. A test is made to see whether reliable hardware and software, and technical resources capable of meeting the needs of the proposed system, can be acquired or developed by the organization in the required time. While assessing technical feasibility, the issues considered include system performance, system interfaces, development processes, risks, failure immunity and security.

6.3 Modules Specification

The above UML diagram shows how a module is designed. A module consists of multiple tools. Each tool must implement the ModifierInterface, which is used by clipboard operations such as cut and copy. All modifier tools are extensions of the AbstractCanvasButton class and only implement the methods that need to differ from the parent class.
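The report does not list the actual interfaces, so the following is only a sketch of what a modifier tool built on ModifierInterface and AbstractCanvasButton might look like; the method names, the stand-in base class body and the InvertTool example are assumptions, not the project's real code:

import java.awt.image.BufferedImage;

// Hypothetical shape of the interface used by clipboard operations.
interface ModifierInterface {
    BufferedImage apply(BufferedImage input);
    String getToolName();
}

// Hypothetical base class supplied by the core component; in the real project
// it would handle icons, formatting and placement on the design area.
abstract class AbstractCanvasButton {
    public void setToolIcon(String iconPath) { }
}

// A modifier tool only overrides what differs from the parent class.
class InvertTool extends AbstractCanvasButton implements ModifierInterface {
    public String getToolName() { return "Invert"; }

    public BufferedImage apply(BufferedImage input) {
        BufferedImage out = new BufferedImage(input.getWidth(), input.getHeight(),
                BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < input.getHeight(); y++)
            for (int x = 0; x < input.getWidth(); x++)
                out.setRGB(x, y, ~input.getRGB(x, y) & 0xFFFFFF);   // invert R, G and B
        return out;
    }
}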

6.4 Class Diagram

[Class diagram: operations + GET() and + POST(); attribute uname = rupesh.]

6.5 Object Diagram

6.6 ALGORITHM / CORE LOGIC

This section discusses the theory behind the most commonly used image processing algorithms: 1) filtering, 2) convolution and 3) edge detection.

1) Filtering

A median filter is a non-linear digital filter which is able to preserve sharp signal changes and is very effective in removing impulse noise (salt and pepper noise). Impulse noise has a gray level with a higher or lower value that differs from the neighbouring points. Linear filters have no ability to remove this type of noise without affecting the distinguishing characteristics of the signal, whereas median filters have remarkable advantages over linear filters for this particular type of noise. The median filter is therefore very widely used in digital signal and image/video processing applications. A standard median operation is implemented by sliding a window of odd size (e.g. a 3x3 window) over an image. At each window position the sampled values of the signal or image are sorted, and the median value of the samples replaces the sample in the center of the window.
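A compact sketch of the 3x3 median operation described above, for an 8-bit grayscale image; it is illustrative only (the class name is invented) and border pixels are simply copied:

import java.awt.image.BufferedImage;
import java.util.Arrays;

class MedianFilter {
    // Applies a 3x3 median filter to a TYPE_BYTE_GRAY image.
    static BufferedImage apply(BufferedImage src) {
        int w = src.getWidth(), h = src.getHeight();
        BufferedImage dst = new BufferedImage(w, h, BufferedImage.TYPE_BYTE_GRAY);
        int[] window = new int[9];
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                if (x == 0 || y == 0 || x == w - 1 || y == h - 1) {
                    // Border pixels are copied unchanged in this simple sketch.
                    dst.getRaster().setSample(x, y, 0, src.getRaster().getSample(x, y, 0));
                    continue;
                }
                int k = 0;
                for (int dy = -1; dy <= 1; dy++)
                    for (int dx = -1; dx <= 1; dx++)
                        window[k++] = src.getRaster().getSample(x + dx, y + dy, 0);
                Arrays.sort(window);
                dst.getRaster().setSample(x, y, 0, window[4]);  // median of the 9 samples
            }
        }
        return dst;
    }
}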

2) Convolution

Convolution is a simple mathematical operation which is fundamental to many common image processing operators. Convolution is a way of multiplying together two arrays of numbers, generally of different sizes, to produce a third array of numbers. In image processing, convolution is used to implement operators whose output pixel values are simple linear combinations of certain input pixel values. Convolution belongs to a class of algorithms called spatial filters. Spatial filters use a wide variety of masks (kernels) to calculate different results, depending on the desired function. 2D convolution is the most important to modern image processing. The basic idea is to scan a window of some finite size over an image; the output pixel value is the weighted sum of the input pixels within the window, where the weights are the values of the filter assigned to every pixel of the window.
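The standard java.awt.image.ConvolveOp class implements exactly this weighted-sum operation. The sketch below applies a common 3x3 sharpening kernel; the kernel values are one conventional choice, not taken from the project code:

import java.awt.image.BufferedImage;
import java.awt.image.ConvolveOp;
import java.awt.image.Kernel;
import java.io.File;
import javax.imageio.ImageIO;

class ConvolutionDemo {
    public static void main(String[] args) throws Exception {
        BufferedImage src = ImageIO.read(new File(args[0]));
        // 3x3 sharpening kernel: centre weight 5, four direct neighbours -1.
        float[] weights = {
             0f, -1f,  0f,
            -1f,  5f, -1f,
             0f, -1f,  0f
        };
        ConvolveOp op = new ConvolveOp(new Kernel(3, 3, weights),
                ConvolveOp.EDGE_NO_OP, null);
        BufferedImage dst = op.filter(src, null);
        ImageIO.write(dst, "png", new File("sharpened.png"));
    }
}

Swapping in a different kernel (a box blur, an emboss mask, or a Sobel gradient kernel for edge detection) changes the operator without changing any other code.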

3) Edge detection

Edges are places in the image with strong intensity contrast. Edges often occur at image locations representing object boundaries, so edge detection is extensively used in image segmentation when we want to divide the image into areas corresponding to different objects. Representing an image by its edges has the further advantage that the amount of data is reduced significantly while most of the image information is retained. Edges can be detected by applying a high-pass frequency filter in the Fourier domain or by convolving the image with an appropriate kernel in the spatial domain. In practice, edge detection is performed in the spatial domain, because it is computationally less expensive and often yields better results. Since edges correspond to strong illumination gradients, the derivatives of the image are used for calculating the edges.

7 Coding and Snapshot

7.1 CODING

package imageprocessor;

import java.awt.*;
import java.awt.event.*;
import java.awt.image.*;
import java.io.*;
import javax.swing.*;

class frontview extends JFrame {

    public frontview() {
        setSize(800, 800);
        setTitle("IMAGE PROCESSING");
        Container contentPane = getContentPane();

JPanel panel=new JPanel();

panel.setBackground(Color.pink);

Image image2=Toolkit.getDefaultToolkit().getImage("S.gif");

textlabel11=new JLabel(); textlabel9=new JLabel(); textlabel8=new JLabel(); textlabel7=new JLabel(); textlabel6=new JLabel(); textlabel5=new JLabel(); textlabel4=new JLabel(); textlabel3=new JLabel();sunlabel2=new JLabel();sunlabel=new JLabel();welllabel=new JLabel(); label1=new JLabel(); label2=new JLabel(); textlabel1=new JLabel(); textlabel2=new JLabel(); extralabel1=new JLabel(); logolabel=new JLabel();linelabel1=new JLabel();linelabel2=new JLabel();linelabel3=new JLabel();Image image9=Toolkit.getDefaultToolkit().getImage("Signup.gif");buttonsign=new JButton(new ImageIcon(image9));Image image8=Toolkit.getDefaultToolkit().getImage("W1.gif");Image image7=Toolkit.getDefaultToolkit().getImage("sun2.gif");Image image6=Toolkit.getDefaultToolkit().getImage("w.gif");Image image5=Toolkit.getDefaultToolkit().getImage("line.gif");Image image3=Toolkit.getDefaultToolkit().getImage("a.gif");Image image4=Toolkit.getDefaultToolkit().getImage("cist.jpg");

sunlabel2.setIcon(new ImageIcon(image8)); sunlabel.setIcon(new ImageIcon(image7));welllabel.setIcon(new ImageIcon(image6));linelabel1.setIcon(new ImageIcon(image5));linelabel2.setIcon(new ImageIcon(image5));linelabel3.setIcon(new ImageIcon(image5));

logolabel.setIcon(new ImageIcon(image4));extralabel1.setIcon(new ImageIcon(image3));label2.setIcon(new ImageIcon(image2));label1.setIcon(new ImageIcon(image2)); textlabel8.setText("RUPESH BHADE "); textlabel7.setText("PRAGATI MULEY"); textlabel6.setText("Head Of Dept. "); textlabel11.setText("Prof.P.N.HARDAHA"); textlabel3.setText(" Guided By:"); textlabel4.setText("Prof.VIKASH JAIN");

textlabel5.setText(" Submitted By:");textlabel1.setText("CORPORTAE COLLAGE");textlabel2.setText("IMAGE PROCESSING"); logolabel.setBounds(260,130,300,300); buttonsign.setBounds(310,480,170,50); label1.setBounds(-35,10,95,65);label2.setBounds(695,10,95,65);linelabel1.setBounds(85,60,200,40);linelabel2.setBounds(285,60,200,40);linelabel3.setBounds(20,60,222,40);welllabel.setBounds(250,385,300,120);sunlabel.setBounds(55,120,200,160);sunlabel2.setBounds(-10,150,300,380);textlabel3.setBounds(10,400,120,80);textlabel4.setBounds(20,430,250,80);textlabel11.setBounds(20,480,250,80);textlabel5.setBounds(540,400,150,80);textlabel6.setBounds(20,455,250,80);textlabel7.setBounds(540,450,230,80);textlabel8.setBounds(540,470,230,80);textlabel9.setBounds(540,490,180,80);

textlabel1.setFont(new Font("Serif",Font.CENTER_BASELINE,40));textlabel2.setFont(new Font("Serif",Font.CENTER_BASELINE,30));textlabel3.setFont(new Font("Serif",Font.CENTER_BASELINE,20));textlabel4.setFont(new Font("Serif",Font.CENTER_BASELINE,18));textlabel11.setFont(new Font("Serif",Font.CENTER_BASELINE,18));textlabel5.setFont(new Font("Serif",Font.CENTER_BASELINE,20));textlabel6.setFont(new Font("Serif",Font.CENTER_BASELINE,18)); textlabel7.setFont(new Font("Serif",Font.CENTER_BASELINE,14)); textlabel8.setFont(new Font("Serif",Font.CENTER_BASELINE,14)); textlabel9.setFont(new Font("Serif",Font.CENTER_BASELINE,18));

textlabel1.setBounds(90,10,660,60);textlabel2.setBounds(150,85,700,60);

extralabel1.setBounds(60,10,670,60);

contentPane.add(textlabel1); contentPane.add(buttonsign); contentPane.add(textlabel2);contentPane.add(extralabel1);contentPane.add(label1);contentPane.add(label2); contentPane.add(logolabel); contentPane.add(linelabel1); contentPane.add(linelabel2);contentPane.add(linelabel3);contentPane.add(welllabel);contentPane.add(sunlabel);contentPane.add(textlabel3);contentPane.add(textlabel4);contentPane.add(textlabel5);contentPane.add(textlabel6);contentPane.add(textlabel7);contentPane.add(textlabel8);contentPane.add(textlabel9);contentPane.add(textlabel11);contentPane.add(panel);

buttonsign.addActionListener(new ActionListener() {
    public void actionPerformed(ActionEvent e) {
        setVisible(false);
        Pass app = new Pass();
    }
});

}

private JLabel label1;
private JLabel label2;
private JLabel textlabel1;
private JLabel extralabel1;
private JLabel logolabel;
private JLabel linelabel1;
private JLabel linelabel2;
private JLabel linelabel3;
private JLabel textlabel2;
private JLabel textlabel3;
private JLabel textlabel4;
private JLabel textlabel5;
private JLabel textlabel6;
private JLabel textlabel7;
private JLabel textlabel8;
private JLabel textlabel9;
private JLabel textlabel10;
private JLabel textlabel11;
private JLabel welllabel;
private JLabel sunlabel;
private JLabel sunlabel2;

public JButton buttonsign;
}

package imageprocessor;

import java.awt.*;

import java.awt.event.*;

import javax.swing.*;

public class Pass extends JFrame{

private Color colorValues[] = { Color.black, Color.blue, Color.red, Color.green };

private JLabel logo_label;

private JLabel user_name_label;

private JLabel message_label;

private JLabel house_label;

private JLabel sun_label;

private JLabel mom;

private JPanel contentPane;

private JTextField userfield;

private JButton cancel_button;

public Pass()

{

super();

initializeComponent();

this.setVisible(true);}

private void initializeComponent()

{logo_label = new JLabel();

user_name_label = new JLabel();

message_label = new JLabel();

house_label = new JLabel();

sun_label = new JLabel();

mom = new JLabel();

Icon bug1 = new ImageIcon( "ma33.gif" );

Icon bug2 = new ImageIcon( "ma44.gif" ); Icon bug7 = new ImageIcon("truba.jpg");

Icon house_icon = new ImageIcon( "house.gif" );

Icon sun_icon = new ImageIcon( "sun2.gif" );

Icon m = new ImageIcon( "W1.gif" );

userfield = new JTextField();

contentPane = (JPanel)this.getContentPane();

logo_label.setText("");

Icon bug6 = new ImageIcon( "truba.jpg" );

logo_label = new JLabel( "",bug7,SwingConstants.CENTER );

logo_label.setToolTipText( "" );

house_label = new JLabel( "",house_icon,SwingConstants.CENTER );

house_label.setToolTipText( "" );

sun_label = new JLabel( "",sun_icon,SwingConstants.CENTER );

sun_label.setToolTipText( "" );

mom = new JLabel( "",m,SwingConstants.CENTER );

mom.setToolTipText( "" );

Icon cancel = new ImageIcon( "cancel.gif" );

cancel_button = new JButton("",cancel);

user_name_label.setText("User Name");

message_label.setText(" Please Enter the User Name And press Enter key ");user_name_label.setForeground( colorValues[ 3] );

message_label.setForeground( colorValues[ 3] );

userfield.addActionListener(new ActionListener() {

public void actionPerformed(ActionEvent e){

String s="";

if( e.getSource() == userfield )

{s = e.getActionCommand();

if((s.equals("rupesh")))

{ setVisible(true);

TopFrame app = new TopFrame(); } else { message1();

}

}

}

});

cancel_button.addActionListener(new ActionListener() {
    public void actionPerformed(ActionEvent e) {
        setVisible(false);
        /* frontview app = new frontview(); */
    }
});

contentPane.setLayout(null);
contentPane.setBackground(new Color(120, 17, 117));
addComponent(contentPane, logo_label, 250, 350, 175, 26);
addComponent(contentPane, user_name_label, 250, 350, 175, 26);
addComponent(contentPane, message_label, 250, 300, 475, 26);
addComponent(contentPane, house_label, 50, 20, 175, 126);
addComponent(contentPane, sun_label, 550, 20, 175, 126);
addComponent(contentPane, mom, -20, 320, 375, 366);
addComponent(contentPane, userfield, 380, 350, 175, 26);
addComponent(contentPane, cancel_button, 380, 450, 165, 36);

setSize(800, 1200);
setVisible(true);
}

private void addComponent(Container container, Component c, int x, int y, int width, int height) {
    c.setBounds(x, y, width, height);
    container.add(c);
}

private void message1() {
    JOptionPane.showMessageDialog(null, "Invalid user name" + "\nPlease Try Again");
}

public static void main(String[] args) {
    try {
        UIManager.setLookAndFeel("com.sun.java.swing.plaf.windows.WindowsLookAndFeel");
    } catch (Exception ex) {
        System.out.println("Failed loading L&F: ");
        System.out.println(ex);
    }
    new Pass();
}
}

/* TopFrame.java */

package imageprocessor;

import java.awt.*;
import java.awt.event.*;
import java.awt.image.*;
import com.sun.image.codec.jpeg.*;
import javax.swing.*;
import java.io.*;

import imageprocessor.filefilter.*;
import imageprocessor.filter.advancedfilter.*;
import imageprocessor.*;

public class TopFrame extends JFrame implements ActionListener,MouseListener,MouseMotionListener{

/** Creates a new instance of TopFrame */ private JMenuBar itsMenuBar;

public JFileChooser aFileChooser; public static String color,fn,dn; private Container itsContentPane; private JToolBar itsToolBar; private MyPanel itsPanel; private Image Original;

static Image itsImage; private double itsScale; /* For scaling the image */ private int itsRotation; /* For Rotating the image */ private double itsShearX; /* For X component of Shear */ private double itsShearY; /* For Y component of Shear */

private JMenu itsFileMenu; private JMenuItem itsFileOpen; private JMenuItem itsFileSave; private JMenuItem itsFileClose; private JMenuItem itsFileExit;

private JMenu itsFilterMenu; private JMenuItem itsFilterGray; private JMenuItem itsFilterInvert; private JMenuItem itsFilterContrast; private JMenuItem itsFilterBrightness; private JMenuItem itsFilterBlack_white; private JMenuItem noise_reduction; private JMenu itsFilterSel_color_inversion; private JMenuItem red; private JMenuItem green; private JMenuItem blue;

private JMenu itsAdvFilterMenu; private JMenuItem itsAdvFilterBlur; private JMenuItem itsAdvFilterSharpen; private JMenuItem smoothing; private JMenuItem itsAdvFilterEdge; private JMenuItem itsAdvFilterEmboss; private JMenuItem itsAdvFilterCrop;

private JMenu itsAbout;

private JMenu itsEditMenu; private JMenuItem itsEditUndo; private JMenuItem itsEditRedo;

private JMenu itsSizeMenu; private JMenuItem itsSize25; private JMenuItem itsSize50; private JMenuItem itsSize100; private JMenuItem itsSize200; private JMenuItem itsSize400;

private JMenu itsLookMenu; private JMenuItem itsLookWindows; private JMenuItem itsLookMetal; private JMenuItem itsLookMotif;

private JMenu itsRotateMenu; private JMenuItem itsRotate90; private JMenuItem itsRotate180; private JMenuItem itsRotate270; private JMenuItem itsRotate360;

private JMenu itsShearMenu; private JMenuItem itsShearLeft; private JMenuItem itsShearRight; private JMenuItem itsShearUp; private JMenuItem itsShearDown;

private JMenu itsTransformMenu; private JMenuItem itsTransformMove; private JMenuItem Splitting;

private JMenu flipping; private JMenuItem horizontal; private JMenuItem vertical;

private JMenu itsOptionsMenu; private JMenuItem itsProperties; private JMenuItem itsOptionsBackground; private JMenuItem itsRestoreOriginal;

private JMenu itsFaceRecognition;private JMenuItem itsOpenWindow;

private JScrollBar itsHorizontalScrollBar; private JScrollBar itsVerticalScrollBar;

private boolean itsMoveFlag; /* To indicate that move is selected or not */ private int itsXpos; /* X position of image */ private int itsYpos; /* Y position of image */

private Toolkit itsToolKit;

public TopFrame() { setTitle("IMAGE PROCESSING (doveloped by Rupesh bhade & Pragati Muley)");

itsToolKit=Toolkit.getDefaultToolkit(); Dimension aDimension=itsToolKit.getScreenSize(); setSize((int)aDimension.getWidth(),(int)aDimension.getHeight());

/* To make the Photo Editor window fit * according to current screen resolution */

Container itsContentPane=this.getContentPane(); itsContentPane.setLayout(new BorderLayout());

itsImage=itsToolKit.getImage(""); /* Load the default image */ Original=itsImage; //original image itsScale = 1.0; /* Default value for scale */ itsRotation = 0; /* Default value for Rotation */ itsShearX=0.0; /* Default value for Shear X */ itsShearY=0.0; /* Default value for Shear Y */ itsMoveFlag = false; /* Default value for move flag */ itsXpos = 0; /* Default value for X pos */ itsYpos = 0; /* Default value for Y pos */

itsPanel=new MyPanel(itsImage); /* Create the central panel */

itsContentPane.add(itsPanel,BorderLayout.CENTER);

itsHorizontalScrollBar=new JScrollBar(JScrollBar.HORIZONTAL,0,50,0,100);

itsVerticalScrollBar=new JScrollBar(JScrollBar.VERTICAL,0,50,0,100);

itsMenuBar=new JMenuBar(); setJMenuBar(itsMenuBar); /* Adds the Menu Bar */

itsFileMenu = new JMenu("File"); itsFileMenu.setMnemonic(KeyEvent.VK_F); itsFileSave = new JMenuItem("Save As...",KeyEvent.VK_V); itsFileSave.addActionListener(this); itsFileOpen = new JMenuItem("Open",KeyEvent.VK_O); itsFileOpen.addActionListener(this); itsFileClose = new JMenuItem("Close",KeyEvent.VK_C); itsFileClose.addActionListener(this); itsFileExit = new JMenuItem("Exit",KeyEvent.VK_X); itsFileExit.addActionListener(this);

itsAbout = new JMenu("About"); itsAbout.setMnemonic(KeyEvent.VK_A); itsAbout.addActionListener(this);

itsFilterMenu = new JMenu("FIlters"); itsFilterMenu.setMnemonic(KeyEvent.VK_I); itsFilterGray = new JMenuItem("GrayScale",KeyEvent.VK_G); itsFilterGray.addActionListener(this); itsFilterInvert = new JMenuItem("Invert",KeyEvent.VK_I); itsFilterInvert.addActionListener(this); itsFilterContrast = new JMenuItem("Contrast",KeyEvent.VK_C); itsFilterContrast.addActionListener(this); itsFilterBrightness = new JMenuItem("Brightness",KeyEvent.VK_B); itsFilterBrightness.addActionListener(this); itsFilterBlack_white = new JMenuItem("Black & White",KeyEvent.VK_W); itsFilterBlack_white.addActionListener(this); noise_reduction = new JMenuItem("Noise Reduction",KeyEvent.VK_N); noise_reduction.addActionListener(this); itsFilterSel_color_inversion = new JMenu("Selective Color Filtering"); itsFilterSel_color_inversion.setMnemonic(KeyEvent.VK_S); red = new JMenuItem("Red",KeyEvent.VK_R); red.addActionListener(this); green = new JMenuItem("Green",KeyEvent.VK_G); green.addActionListener(this); blue = new JMenuItem("Yellow",KeyEvent.VK_B); blue.addActionListener(this);

itsAdvFilterMenu = new JMenu("AdvancedFIlters"); itsAdvFilterMenu.setMnemonic(KeyEvent.VK_A); itsAdvFilterBlur = new JMenuItem("Blur",KeyEvent.VK_B); itsAdvFilterBlur.addActionListener(this); itsAdvFilterSharpen = new JMenuItem("Sharpen",KeyEvent.VK_S); itsAdvFilterSharpen.addActionListener(this); smoothing= new JMenuItem("Smoothing",KeyEvent.VK_S); smoothing.addActionListener(this); itsAdvFilterEdge = new JMenuItem("Edge",KeyEvent.VK_E); itsAdvFilterEdge.addActionListener(this); itsAdvFilterEmboss = new JMenuItem("Emboss",KeyEvent.VK_M); itsAdvFilterEmboss.addActionListener(this); itsAdvFilterCrop = new JMenuItem("Crop",KeyEvent.VK_C); itsAdvFilterCrop.addActionListener(this);

itsEditMenu = new JMenu("Edit"); itsEditMenu.setMnemonic(KeyEvent.VK_E); itsEditUndo = new JMenuItem("Undo",KeyEvent.VK_U); itsEditUndo.addActionListener(this); itsEditRedo = new JMenuItem("Redo",KeyEvent.VK_R); itsEditRedo.addActionListener(this);

itsTransformMenu = new JMenu("Transform"); itsTransformMenu.setMnemonic(KeyEvent.VK_T); Splitting = new JMenuItem("Splitting"); Splitting.addActionListener(this); Splitting.setMnemonic(KeyEvent.VK_S); itsTransformMove = new JMenuItem("Move"); itsTransformMove.addActionListener(this); itsTransformMove.setMnemonic(KeyEvent.VK_M);

flipping = new JMenu("Flipping"); flipping.setMnemonic(KeyEvent.VK_F); horizontal = new JMenuItem("Horizontally",KeyEvent.VK_H); horizontal.addActionListener(this); vertical = new JMenuItem("Vertically",KeyEvent.VK_V); vertical.addActionListener(this);

itsSizeMenu = new JMenu("Size"); itsSizeMenu.setMnemonic(KeyEvent.VK_Z); itsSize25 = new JMenuItem("25%"); itsSize25.addActionListener(this); itsSize50 = new JMenuItem("50%"); itsSize50.addActionListener(this); itsSize100 = new JMenuItem("100%"); itsSize100.addActionListener(this); itsSize200 = new JMenuItem("200%"); itsSize200.addActionListener(this); itsSize400 = new JMenuItem("400%"); itsSize400.addActionListener(this);

itsLookMenu = new JMenu("Look"); itsLookMenu.setMnemonic(KeyEvent.VK_L); itsLookWindows = new JMenuItem("Windows",KeyEvent.VK_W); itsLookWindows.addActionListener(this); itsLookMetal = new JMenuItem("Metal",KeyEvent.VK_M); itsLookMetal.addActionListener(this); itsLookMotif = new JMenuItem("Motif",KeyEvent.VK_O); itsLookMotif.addActionListener(this);

itsRotateMenu = new JMenu("Rotate"); itsRotateMenu.setMnemonic(KeyEvent.VK_R); itsRotate90 = new JMenuItem("90"); itsRotate90.addActionListener(this); itsRotate180 = new JMenuItem("180"); itsRotate180.addActionListener(this); itsRotate270 = new JMenuItem("270"); itsRotate270.addActionListener(this); itsRotate360 = new JMenuItem("360"); itsRotate360.addActionListener(this);

itsShearMenu = new JMenu("Shear"); itsShearMenu.setMnemonic(KeyEvent.VK_H); itsShearLeft = new JMenuItem("Left"); itsShearLeft.addActionListener(this); itsShearRight = new JMenuItem("Right"); itsShearRight.addActionListener(this); itsShearUp = new JMenuItem("Up"); itsShearUp.addActionListener(this); itsShearDown = new JMenuItem("Down"); itsShearDown.addActionListener(this);

itsOptionsMenu = new JMenu("Options"); itsOptionsMenu.setMnemonic(KeyEvent.VK_O); itsProperties = new JMenuItem("Properties",KeyEvent.VK_P); itsProperties.addActionListener(this); itsRestoreOriginal = new JMenuItem("Restore",KeyEvent.VK_R); itsRestoreOriginal.addActionListener(this); itsOptionsBackground = new JMenuItem("Background"); itsOptionsBackground.setMnemonic(KeyEvent.VK_B); itsOptionsBackground.addActionListener(this); itsFaceRecognition = new JMenu("FaceDifference"); itsFaceRecognition.setMnemonic(KeyEvent.VK_F); itsOpenWindow =new JMenuItem("OpenWindow"); itsOpenWindow.setMnemonic(KeyEvent.VK_O); itsOpenWindow.addActionListener(this);

itsMenuBar.add(itsFileMenu); itsMenuBar.add(itsEditMenu); itsMenuBar.add(itsTransformMenu); itsMenuBar.add(itsFilterMenu); itsMenuBar.add(itsAdvFilterMenu); itsMenuBar.add(itsOptionsMenu); itsMenuBar.add(itsFaceRecognition);

itsMenuBar.add(itsAbout); itsFaceRecognition.add(itsOpenWindow); itsFileMenu.add(itsFileOpen); itsFileMenu.add(itsFileSave); itsFileMenu.add(itsFileClose); itsFileMenu.add(itsFileExit);

itsFilterMenu.add(itsFilterGray); itsFilterMenu.add(itsFilterInvert); itsFilterMenu.add(itsFilterContrast); itsFilterMenu.add(itsFilterBrightness); itsFilterMenu.add(itsFilterBlack_white); itsFilterMenu.add(noise_reduction); itsFilterMenu.add(itsFilterSel_color_inversion); itsFilterSel_color_inversion.add(red); itsFilterSel_color_inversion.add(green); itsFilterSel_color_inversion.add(blue);

itsAdvFilterMenu.add(itsAdvFilterBlur); itsAdvFilterMenu.add(itsAdvFilterSharpen); itsAdvFilterMenu.add(smoothing); itsAdvFilterMenu.add(itsAdvFilterEdge); itsAdvFilterMenu.add(itsAdvFilterEmboss); itsAdvFilterMenu.add(itsAdvFilterCrop);

itsEditMenu.add(itsEditUndo); itsEditMenu.add(itsEditRedo);

itsSizeMenu.add(itsSize25); itsSizeMenu.add(itsSize50); itsSizeMenu.add(itsSize100); itsSizeMenu.add(itsSize200); itsSizeMenu.add(itsSize400);

itsLookMenu.add(itsLookWindows); itsLookMenu.add(itsLookMetal); itsLookMenu.add(itsLookMotif);

itsRotateMenu.add(itsRotate90); itsRotateMenu.add(itsRotate180); itsRotateMenu.add(itsRotate270); itsRotateMenu.add(itsRotate360);

itsShearMenu.add(itsShearLeft); itsShearMenu.add(itsShearRight); itsShearMenu.add(itsShearUp); itsShearMenu.add(itsShearDown);

itsTransformMenu.add(itsTransformMove); itsTransformMenu.add( Splitting); itsTransformMenu.add(itsSizeMenu); itsTransformMenu.add(itsRotateMenu); itsTransformMenu.add(itsShearMenu); itsTransformMenu.add(flipping); flipping.add( horizontal); flipping.add( vertical);

itsOptionsMenu.add(itsRestoreOriginal); itsOptionsMenu.add(itsOptionsBackground); itsOptionsMenu.add(itsLookMenu); itsOptionsMenu.add(itsProperties);

itsPanel.addMouseListener(this); itsPanel.addMouseMotionListener(this);

/* Terminates the Image Processor application when the window is closed */
this.addWindowListener(new WindowAdapter() { public void windowClosing(WindowEvent e) { System.exit(0); } });

/* Default look and feel: Metal */
try { UIManager.setLookAndFeel("javax.swing.plaf.metal.MetalLookAndFeel"); SwingUtilities.updateComponentTreeUI(this); } catch (UnsupportedLookAndFeelException exc) { System.out.println("UnsupportedLookAndFeelException Error:" + exc); } catch (IllegalAccessException exc) { System.out.println("IllegalAccessException Error:" + exc); } catch (ClassNotFoundException exc) { System.out.println("ClassNotFoundException Error:" + exc); } catch (InstantiationException exc) { System.out.println("InstantiationException Error:" + exc); }

setVisible(true); }

public void actionPerformed(ActionEvent theActionEvent) { String action=theActionEvent.getActionCommand(); if(action.equals("OpenWindow")) { imageviewer image1=new imageviewer(); image1.show(); } else if(action.equals("Open")) { aFileChooser=new JFileChooser("F:\\My Pitcure\\Quotes"); aFileChooser.setFileSelectionMode(JFileChooser.FILES_ONLY); /* Select files only */ aFileChooser.addChoosableFileFilter(new imageprocessor.ImageFilter()); /* Filters only image files; the fully qualified name resolves package-name ambiguity */ int aValue=aFileChooser.showOpenDialog(this); if(aValue==JFileChooser.APPROVE_OPTION) /* If OK is pressed */ { color=" 24 bit RGB "; itsImage=itsToolKit.getImage(aFileChooser.getSelectedFile().getPath());

dn=aFileChooser.getSelectedFile().getPath(); fn=aFileChooser.getSelectedFile().getName();

Original=itsImage; itsScale = 1.0; /* Default value for scale */ itsRotation = 0; /* Default value for Rotation */ itsShearX=0.0; /* Default value for Shear X */ itsShearY=0.0; /* Default value for Shear Y */ itsXpos = 0; /* Default value for X pos */ itsYpos = 0; /* Default value for Y pos */

itsPanel.setImage(itsImage); itsPanel.setScale(itsScale); itsPanel.setRotation(itsRotation); itsPanel.setShearX(itsShearX); itsPanel.setShearY(itsShearY); itsPanel.setXpos(itsXpos); itsPanel.setYpos(itsYpos);

} }

else if(action.equals("Properties")) { PropertiesFrame pf = new PropertiesFrame(this); pf.setSize(300,375); Dimension pfSize = pf.getSize(); Dimension frmSize = getSize(); Point loc = getLocation(); pf.setLocation((frmSize.width - pfSize.width) / 2 + loc.x, (frmSize.height - pfSize.height) / 2 + loc.y); pf.setModal(true); pf.show();

}

else if(action.equals("Close")) { setVisible(false);

TopFrame myFrame = new TopFrame(); /* Create the Frame */

}

else if(action.equals("Exit")) { System.exit(0); /* Close the Application */ }

else if(action.equals("Undo")) { itsImage=Original; itsPanel.setImage(itsImage); itsPanel.repaint(); }

else if(action.equals("Splitting")) { int iw = (int)(itsPanel.getImageWidth()); int ih = (int)(itsPanel.getImageHeight());

Splitting aFilter = new Splitting(); aFilter.Splitting(itsImage,iw,ih);

}

else if(action.equals("GrayScale")) /* Gray Scale Filter */ { int iw = (int)(itsPanel.getImageWidth()); int ih = (int)(itsPanel.getImageHeight());

BasicFilter aFilter = new BasicFilter(itsImage,action,this,iw,ih); itsImage = aFilter.ApplyFilter(); itsPanel.setImage(itsImage); itsPanel.repaint(); }

else if(action.equals("Brightness")) { int iw = (int)(itsPanel.getImageWidth()); int ih = (int)(itsPanel.getImageHeight());

NewBrightness aFilter = new NewBrightness(); itsImage=aFilter.NewBrightness(itsImage,iw,ih); itsPanel.setImage(itsImage); itsPanel.repaint(); }

else if(action.equals("Noise Reduction"))

{ int iw = (int)(itsPanel.getImageWidth()); int ih = (int)(itsPanel.getImageHeight());

Noise_reduction aFilter = new Noise_reduction(); itsImage=aFilter.Noise_reduction(itsImage,iw,ih); itsPanel.setImage(itsImage); itsPanel.repaint(); }

else if(action.equals("Restore")) { itsImage=Original; itsPanel.setImage(itsImage); itsPanel.repaint(); }

else if(action.equals("Invert")) /* Invert Fileter */ { int iw = (int)(itsPanel.getImageWidth()); int ih = (int)(itsPanel.getImageHeight()); BasicFilter aFilter = new BasicFilter(itsImage,action,this,iw,ih); itsImage = aFilter.ApplyFilter(); itsPanel.setImage(itsImage); itsPanel.repaint(); }

else if(action.equals("Contrast")) /* Contrast Filter */ { int iw = (int)(itsPanel.getImageWidth()); int ih = (int)(itsPanel.getImageHeight());

BasicFilter aFilter = new BasicFilter(itsImage,action,this,iw,ih); itsImage = aFilter.ApplyFilter(); itsPanel.setImage(itsImage); itsPanel.repaint(); }

else if(action.equals("Black & White")) { int iw = (int)(itsPanel.getImageWidth()); int ih = (int)(itsPanel.getImageHeight());

BW aFilter = new BW(); itsImage=aFilter.BW(itsImage,iw,ih); itsPanel.setImage(itsImage); itsPanel.repaint();

}

else if(action.equals("Yellow")) { int iw = (int)(itsPanel.getImageWidth()); int ih = (int)(itsPanel.getImageHeight());

NewBlueColorFilter aFilter = new NewBlueColorFilter(); itsImage=aFilter.NewBlueColorFilter(itsImage,iw,ih); itsPanel.setImage(itsImage); itsPanel.repaint(); }

else if(action.equals("Green")) { int iw = (int)(itsPanel.getImageWidth()); int ih = (int)(itsPanel.getImageHeight());

NewRedColorFilter aFilter = new NewRedColorFilter(); itsImage=aFilter.NewRedColorFilter(itsImage,iw,ih); itsPanel.setImage(itsImage); itsPanel.repaint(); } else if(action.equals("Red")) { int iw = (int)(itsPanel.getImageWidth()); int ih = (int)(itsPanel.getImageHeight());

NewGreenColorFilter aFilter = new NewGreenColorFilter(); itsImage=aFilter.NewGreenColorFilter(itsImage,iw,ih); itsPanel.setImage(itsImage); itsPanel.repaint(); }

else if(action.equals("Blur")) /* Blur Filter */ { int iw = (int)(itsPanel.getImageWidth()); int ih = (int)(itsPanel.getImageHeight()); BlurFilter aFilter = new BlurFilter(itsImage,iw,ih); itsImage = aFilter.FilterImage(); itsPanel.setImage(itsImage); itsPanel.repaint(); }

else if(action.equals("Sharpen")) /* Sharpen Filter */ { int iw = (int)(itsPanel.getImageWidth()); int ih = (int)(itsPanel.getImageHeight()); SharpenFilter aFilter = new SharpenFilter(itsImage,iw,ih); itsImage = aFilter.FilterImage(); itsPanel.setImage(itsImage); itsPanel.repaint(); }

else if(action.equals("Smoothing")) { int iw = (int)(itsPanel.getImageWidth()); int ih = (int)(itsPanel.getImageHeight());

smoothing aFilter = new smoothing(); itsImage=aFilter.smoothing(itsImage,iw,ih); itsPanel.setImage(itsImage); itsPanel.repaint(); }

else if(action.equals("Edge")) /* Edge Filter */ { int iw = (int)(itsPanel.getImageWidth()); int ih = (int)(itsPanel.getImageHeight()); EdgeFilter aFilter = new EdgeFilter(itsImage,iw,ih); itsImage = aFilter.FilterImage(); itsPanel.setImage(itsImage); itsPanel.repaint(); }

else if(action.equals("Emboss")) /* Edge Filter */ { int iw = (int)(itsPanel.getImageWidth()); int ih = (int)(itsPanel.getImageHeight()); EmbossFilter aFilter = new EmbossFilter(itsImage,iw,ih); itsImage = aFilter.FilterImage(); itsPanel.setImage(itsImage); itsPanel.repaint(); }

else if(action.equals("Crop")) /* Crop Filter */

{

}

else if(action.equals("About")) { Frame1_AboutBox dlg = new Frame1_AboutBox(this); Dimension dlgSize = dlg.getPreferredSize(); Dimension frmSize = getSize(); Point loc = getLocation(); dlg.setLocation((frmSize.width - dlgSize.width) / 2 + loc.x, (frmSize.height - dlgSize.height) / 2 + loc.y); dlg.setModal(true); dlg.show(); }

else if(action.equals("Save As...")) { FileDialog fd = new FileDialog(this, "Save As...", FileDialog.SAVE); fd.setDirectory("C:\\My Documents\\My Pictures"); fd.setVisible(true); String afn = fd.getFile(); String adn = fd.getDirectory(); if( afn != null ) { adn = adn.concat(afn); int w=itsImage.getWidth(null); int h=itsImage.getHeight(null); BufferedImage bi=new BufferedImage(w,h,BufferedImage.TYPE_INT_RGB); Graphics2D big = bi.createGraphics(); big.drawImage(itsImage,0,0,null);

/* save(f, bi); */
try { File file = new File(adn+".jpg"); FileOutputStream out = new FileOutputStream(file); JPEGImageEncoder encoder = JPEGCodec.createJPEGEncoder(out); JPEGEncodeParam param = encoder.getDefaultJPEGEncodeParam(bi); param.setQuality(1.0f,false); encoder.setJPEGEncodeParam(param); encoder.encode(bi); } catch(Exception e) { System.out.println("save failed for "+adn+".jpg: "+e); } } /* saveJPG(adn); */ }
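/* Note (illustrative, not part of the original code): JPEGImageEncoder and JPEGCodec come from the
   internal com.sun.image.codec.jpeg package, which is not part of the public Java API and may be
   missing on newer JDKs. A portable alternative for the same save step would be the standard
   javax.imageio API, e.g.
       javax.imageio.ImageIO.write(bi, "jpg", new File(adn + ".jpg"));
*/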

else if(action.equals("25%")) { itsScale = 0.25; itsPanel.setScale(itsScale); itsPanel.repaint(); }

else if(action.equals("50%")) { itsScale = 0.5; itsPanel.setScale(itsScale); itsPanel.repaint();

}

else if(action.equals("100%")) { itsScale = 1.0; itsPanel.setScale(itsScale); itsPanel.repaint(); }

else if(action.equals("200%")) { itsScale = 2.0; itsPanel.setScale(itsScale); itsPanel.repaint(); }

else if(action.equals("400%")) { itsScale = 4.0; itsPanel.setScale(itsScale); itsPanel.repaint(); }

else if(action.equals("90")) { itsRotation = (itsRotation+90)%360; itsPanel.setRotation(itsRotation); itsPanel.repaint(); }

else if(action.equals("180")) { itsRotation = (itsRotation+180)%360; itsPanel.setRotation(itsRotation); itsPanel.repaint(); }

else if(action.equals("270")) { itsRotation = (itsRotation+270)%360; itsPanel.setRotation(itsRotation); itsPanel.repaint(); }

else if(action.equals("360")) { itsRotation = (itsRotation+360)%360; itsPanel.setRotation(itsRotation); itsPanel.repaint(); }

else if(action.equals("Left")) { itsShearX-=0.5; itsPanel.setShearX(itsShearX); itsPanel.repaint();

}

else if(action.equals("Right")) { itsShearX+=0.5; itsPanel.setShearX(itsShearX); itsPanel.repaint();

}

else if(action.equals("Up")) { itsShearY-=0.5; itsPanel.setShearY(itsShearY); itsPanel.repaint();

}

else if(action.equals("Down")) { itsShearY+=0.5; itsPanel.setShearY(itsShearY); itsPanel.repaint();

}

else if(action.equals("Move")) { itsMoveFlag = true;

}

else if(action.equals("Horizontally")) { int iw = (int)(itsPanel.getImageWidth()); int ih = (int)(itsPanel.getImageHeight());

Vflipping aFilter = new Vflipping(); itsImage=aFilter.Vflipping(itsImage,iw,ih); itsPanel.setImage(itsImage); itsPanel.repaint(); } else if(action.equals("Vertically")) { int iw = (int)(itsPanel.getImageWidth()); int ih = (int)(itsPanel.getImageHeight());

Hflipping aFilter = new Hflipping(); itsImage=aFilter.Hflipping(itsImage,iw,ih); itsPanel.setImage(itsImage); itsPanel.repaint(); }

else if(action.equals("Background")) { JColorChooser aColorChooser=new JColorChooser(); Color aColor = JColorChooser.showDialog(this,"Choose the Background Color",Color.white); itsPanel.setBgColor(aColor); repaint();

}

else if(action.equals("Windows")) { try { UIManager.setLookAndFeel("com.sun.java.swing.plaf.windows.WindowsLookAndFeel"); SwingUtilities.updateComponentTreeUI(this);

/* For Windows Look And Feel */ } catch (UnsupportedLookAndFeelException exc) { System.out.println("UnsupportedLookAndFeelException Error:" + exc); } catch (IllegalAccessException exc) { System.out.println("IllegalAccessException Error:" + exc); } catch (ClassNotFoundException exc) { System.out.println("ClassNotFoundException Error:" + exc); } catch (InstantiationException exc) { System.out.println("InstantiationException Error:" + exc); } }

else if(action.equals("Metal")) { try { UIManager.setLookAndFeel("javax.swing.plaf.metal.MetalLookAndFeel"); SwingUtilities.updateComponentTreeUI(this);

/* For Metal Look And Feel */ } catch (UnsupportedLookAndFeelException exc) { System.out.println("UnsupportedLookAndFeelException Error:" + exc); } catch (IllegalAccessException exc) { System.out.println("IllegalAccessException Error:" + exc); } catch (ClassNotFoundException exc) { System.out.println("ClassNotFoundException Error:" + exc); } catch (InstantiationException exc) { System.out.println("InstantiationException Error:" + exc); } }

else if(action.equals("Motif")) { try { UIManager.setLookAndFeel("com.sun.java.swing.plaf.motif.MotifLookAndFeel"); SwingUtilities.updateComponentTreeUI(this);

/* For Motif Look And Feel */ } catch (UnsupportedLookAndFeelException exc) { System.out.println("UnsupportedLookAndFeelException Error:" + exc); } catch (IllegalAccessException exc) { System.out.println("IllegalAccessException Error:" + exc); } catch (ClassNotFoundException exc) { System.out.println("ClassNotFoundException Error:" + exc); } catch (InstantiationException exc) { System.out.println("InstantiationException Error:" + exc); } } }

public void mouseClicked(MouseEvent theMouseEvent) {

}

public void mouseEntered(MouseEvent theMouseEvent) {

}

public void mouseExited(MouseEvent theMouseEvent) { itsMoveFlag=false; }

public void mousePressed(MouseEvent theMouseEvent) {

}

public void mouseReleased(MouseEvent theMouseEvent) { itsMoveFlag = false; }

public void mouseDragged(MouseEvent theMouseEvent) { if(itsMoveFlag == true) { itsXpos = theMouseEvent.getX(); itsYpos = theMouseEvent.getY();

itsPanel.setXpos(itsXpos); itsPanel.setYpos(itsYpos); itsPanel.repaint(); }

}

public void mouseMoved(MouseEvent theMouseEvent) {

}}
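The listing above only stores the scale, rotation, shear and position values on itsPanel; the panel's own painting code is not reproduced here. As a rough illustration of how such values are typically applied when drawing the image, the following sketch uses java.awt.geom.AffineTransform. The class, field and method names are assumptions made for this illustration and are not taken from the project's actual panel source.

import java.awt.Graphics;
import java.awt.Graphics2D;
import java.awt.Image;
import java.awt.geom.AffineTransform;
import javax.swing.JPanel;

/* Illustrative panel showing how scale, rotation and shear settings can be applied
   when drawing an image; not the project's actual panel implementation. */
class DemoImagePanel extends JPanel {
    private Image image;
    private double scale = 1.0;      // e.g. 0.25, 0.5, 1.0, 2.0, 4.0 from the Size menu
    private int rotation = 0;        // degrees: 90, 180, 270, 360 from the Rotate menu
    private double shearX = 0.0, shearY = 0.0;
    private int xpos = 0, ypos = 0;  // updated while dragging with the Move tool

    void setImage(Image img)            { image = img; repaint(); }
    void setScale(double s)             { scale = s; }
    void setRotation(int r)             { rotation = r; }
    void setShear(double sx, double sy) { shearX = sx; shearY = sy; }
    void setPos(int x, int y)           { xpos = x; ypos = y; }

    @Override
    protected void paintComponent(Graphics g) {
        super.paintComponent(g);
        if (image == null) return;
        Graphics2D g2 = (Graphics2D) g;
        AffineTransform at = new AffineTransform();
        at.translate(xpos, ypos);             // Move
        at.rotate(Math.toRadians(rotation));  // Rotate 90/180/270/360
        at.scale(scale, scale);               // 25% .. 400%
        at.shear(shearX, shearY);             // Shear left/right/up/down
        g2.drawImage(image, at, this);
    }
}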

7.2 Snapshots

(Screenshots are not reproduced in this text version; their captions are listed below.)

Main front pass window
Main frame
Transform operation
Filter operation
Advanced filter operation
Gray filter operation
Invert filter operation
Advanced filtering: emboss operation
Image difference

8 Testing

8.1 Software Testing

Software testing is the process used to assess the quality of computer software. It is an empirical, technical investigation conducted to provide stakeholders with information about the quality of the product or service under test, with respect to the context in which it is intended to operate. This includes, but is not limited to, executing a program or application with the intent of finding software bugs. Quality is not an absolute; it is value to some person. With that in mind, testing can never completely establish the correctness of arbitrary software; it furnishes a criticism or comparison of the state and behavior of the product against a specification. Software testing should also be distinguished from the separate discipline of Software Quality Assurance (SQA), which encompasses all business process areas, not just testing.

Over its existence, computer software has continued to grow in complexity and size. Every software product has a target audience; video game software, for example, has a completely different audience from banking software. Therefore, when an organization develops or otherwise invests in a software product, it must assess whether the product will be acceptable to its end users, its target audience, its purchasers and other stakeholders. Software testing is the process of attempting to make this assessment.

Software testing methods are traditionally divided into black box testing and white box testing. The two approaches describe the point of view a test engineer takes when designing test cases.

8.2 Black Box Testing

Black box testing treats the software as a black box, without any knowledge of its internal behavior. It aims to test the functionality according to the requirements: the tester supplies input data and sees only the output from the test object. This level of testing usually requires thorough test cases to be provided to the tester, who can then simply verify that, for a given input, the output value (or behavior) matches the expected value specified in the test case. Black box testing methods include equivalence partitioning, boundary value analysis, all-pairs testing, fuzz testing, model-based testing and the traceability matrix.

8.3 White Box Testing

In white box testing, by contrast, the tester has access to the internal data structures, code and algorithms. White box testing methods include creating tests to satisfy some code coverage criterion; for example, the test designer can create tests that cause every statement in the program to be executed at least once. Other examples of white box testing are mutation testing and fault injection. White box testing includes all static testing. White box methods can also be used to evaluate the completeness of a test suite created with black box methods; this allows the software team to examine parts of the system that are rarely tested and to ensure that the most important function points have been covered. Two common forms of code coverage are function coverage, which reports which functions were executed, and statement coverage, which reports the number of lines executed to complete the test. Both return a coverage metric measured as a percentage.

Testing can be done at the following levels. Unit testing tests the minimal software component, or module: each unit (basic component) of the software is tested to verify that the detailed design for the unit has been correctly implemented. In an object-oriented environment this is usually the class level, and the minimal unit tests include the constructors and destructors (a minimal unit-test sketch for this project is given at the end of this chapter). Integration testing exposes defects in the interfaces and interaction between integrated components (modules): progressively larger groups of tested software components, corresponding to elements of the architectural design, are integrated and tested until the software works as a system. System testing tests a completely integrated system to verify that it meets its requirements, and system integration testing verifies that the system is integrated with any external or third-party systems defined in the system requirements. Before shipping the final version of the software, alpha and beta testing are often performed as well:

8.4 Alpha Testing

Alpha testing is simulated or actual operational testing by potential users/customers or by an independent test team at the developers' site. Alpha testing is often employed for off-the-shelf software as a form of internal acceptance testing, before the software goes to beta testing.

8.5 Beta Testing

Beta testing comes after alpha testing. Versions of the software, known as beta versions, are released to a limited audience outside the programming team so that further testing can ensure the product has few remaining faults or bugs. Sometimes beta versions are made available to the general public to widen the feedback field to the maximum number of future users. Finally, acceptance testing can be conducted by the end user, customer or client to validate whether or not to accept the product; acceptance testing may be performed as part of the hand-off process between any two phases of development.
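As an illustration of unit testing at the class level, the following is a minimal JUnit 4 sketch of a test for a grayscale operation like the one applied from the Filter menu. The toGray() helper is an illustrative stand-in written for this sketch, not the project's actual BasicFilter code; the assertion simply checks the black box expectation that a grayscaled pixel has equal red, green and blue channels.

import java.awt.image.BufferedImage;
import org.junit.Assert;
import org.junit.Test;

/* Minimal unit-test sketch; toGray() is a stand-in for the project's gray-scale filter. */
public class GrayscaleFilterTest {

    /* Simple average-based grayscale conversion used here as the unit under test. */
    private static BufferedImage toGray(BufferedImage in) {
        BufferedImage out = new BufferedImage(in.getWidth(), in.getHeight(), BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < in.getHeight(); y++) {
            for (int x = 0; x < in.getWidth(); x++) {
                int rgb = in.getRGB(x, y);
                int r = (rgb >> 16) & 0xFF, g = (rgb >> 8) & 0xFF, b = rgb & 0xFF;
                int avg = (r + g + b) / 3;
                out.setRGB(x, y, (avg << 16) | (avg << 8) | avg);
            }
        }
        return out;
    }

    @Test
    public void grayPixelHasEqualChannels() {
        BufferedImage in = new BufferedImage(1, 1, BufferedImage.TYPE_INT_RGB);
        in.setRGB(0, 0, 0xFF0000);   // pure red input pixel

        BufferedImage out = toGray(in);

        int rgb = out.getRGB(0, 0);
        int r = (rgb >> 16) & 0xFF, g = (rgb >> 8) & 0xFF, b = rgb & 0xFF;

        // Black box expectation: after grayscaling, all three channels are equal.
        Assert.assertEquals(r, g);
        Assert.assertEquals(g, b);
    }
}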

9 Conclusion

A final product was created to the specification and design given above in this document, and it met all of the requirements in that specification. However, only a few (six) JAI operations were implemented for the final testing of the product. These six operations were divided into two categories and were therefore placed in two different modules.

The component was tested with no modules, with one module and with two modules; each time, the component loaded correctly with the available tools (if any). Each tool button on each toolbar was tested individually. The drawing area was tested with a variable number of tools and connectors on it, and the mouse operations (drag, move and click) were checked for all objects on the drawing area. Clipboard functions were tested for modifier tools on the drawing area, and the property-setting functions of all six modifier tools were checked. All of these tests passed. The component was also tested with invalid modules and behaved correctly, discarding the module when an error state was reached. Finally, a simple design was created on the drawing area; it executed correctly and gave the expected output. More complex designs were also created and likewise gave correct output. It should be noted that the save and open functions were not completed for this component, although the parts of both functions that were completed worked correctly.

One important feature that was not in the specification was implemented: an on-the-fly check of the modified image. Double-clicking any tool on the drawing area that was connected correctly opened the properties window showing the image as modified up to that point, and modifier tools with custom properties were made to show real-time image modification in response to changes the user makes to their settings, so less time was wasted in running the whole design. Other features that were implemented but were not in the specification are arrow connectors showing the direction of a connection, special highlighting of connectors that are connected correctly, and tool tips for all tools in the component.

10 References

10.1 Books

1. W. D. Hillis, The Connection Machine. MIT Press, Cambridge,