7/28/2019 LastBSCS FINAL PROJECT PROPOSAL(ER) SRS1 Correction.docx
BSCS FINAL PROJECT PROPOSAL
Emotion Recognition by Facial
Expressions
Term: Spring 2013
Presented by:
Registration No: Name:
L1F09BSCS2090 HASSAN SHABBIR
L1F09BSCS2019 WALEED KHALID
Faculty of Information Technology
University of Central Punjab
ER System
Project Title
Emotion Recognition by Facial Expressions
Project Advisor: Prof. Yasir Danial Khan
Particulars of the students:
S.No  Registration# (e.g. L1F00MSCS0101)  Name in Full (Use Block Letters)  CGPA  Signatures
1     L1F09BSCS2090                       HASSAN SHABBIR                    2.49
2     L1F09BSCS2019                       WALEED KHALID                     2.23
Advisor’s Consent
I, Prof./Dr./Mr./Ms. ________________________________________________, am willing to
guide these students in all phases of the above-mentioned project / thesis as advisor. I have
carefully seen the title and description of the project / thesis and believe that it is of an
appropriate difficulty level for the number of students named above.
Note: The advisor cannot be changed without prior permission of the
Manager Projects, and the duration for completion of the
Research Project / Thesis is approximately one year from the date of
registration of the Research Project / Thesis.
Signatures and Date
Advisor
REFERENCE 1
I have carefully read the project proposal and feel that the proposed project is a useful one
and of a sufficient difficulty level to justify a one-year workload for the above-mentioned students.
Recommended Signatures and Date
Yes No
REFERENCE 2
I have carefully read the project proposal and feel that the proposed project is a useful one
and of a sufficient difficulty level to justify a one-year workload for the above-mentioned students.
Recommended Signatures and Date
Yes No
Signature Reference – 1    Signature Reference – 2
Abstract
Emotions play a vital role in people’s everyday lives. An emotion is a mental state that does not
arise through free will and is often accompanied by physiological changes. Monitoring these
changes is therefore important, as they are manifestations of emotional change and can help
identify matters of concern at an early stage, before they become serious. Emotion recognition
has become an important subject in human-machine interaction. Various methods have been
used in the past to detect and evaluate human emotions; the most commonly used techniques
rely on textual information, speech, body gestures, and physiological signals. In this project
we will develop an emotion recognition system based on information provided by
facial expressions. The four basic emotions observed in this project are happy
(excited), sad, angry, and neutral (relaxed). The data has been collected from healthy
individuals, both male and female. [1][2]
Clear Statement of the Problem
We propose to design a real-time monitoring system capable of evaluating four
basic emotions: happy, sad, angry, and neutral. One aim is to introduce affect
recognition and its possible applications to the information security community; the
implementation of such systems would represent the realization of an important goal for
the security industry, namely the automation of real-time prediction of human behavior and
intention. [2] The system also has commercial uses: if a company produces an advertisement
intended to be funny or sad and is unsure whether it is up to the mark, the system can help
determine whether the advertisement is ready for market by arranging a test run in which it
is shown to different sets of people.

Objectives
Computers and robots are being used widely for the betterment of our daily
lives, so it is important for them to have an artificial mind that
enables them to communicate with human beings using both logical and
emotional information. The main objective of this project is therefore to design a real-time
monitoring system capable of evaluating four basic emotions: happy, sad, angry,
and neutral. [2] The basic idea of emotion recognition using facial expressions is to
segment facial images into various regions of interest. The regions commonly taken into
account include the movements of the cheeks, chin, eyes, eyebrows, and mouth. Different
classification techniques are then applied to differentiate between types of emotions.
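As an illustration of the region-of-interest idea, the sketch below derives approximate eye and mouth sub-regions from a detected face bounding box. The fractional coordinates are assumptions based on typical facial proportions, not values taken from this proposal:

```python
# Hypothetical sketch: split a detected face bounding box into
# regions of interest (eyes, mouth) using assumed facial proportions.

def face_rois(x, y, w, h):
    """Given a face bounding box (x, y, width, height), return
    approximate ROIs as (x, y, w, h) tuples. The fractions are
    rough assumptions, not measured values."""
    return {
        # eyes: roughly the band between 20% and 50% of face height
        "eyes": (x, y + int(0.2 * h), w, int(0.3 * h)),
        # mouth: central band in the lower third of the face
        "mouth": (x + int(0.25 * w), y + int(0.65 * h),
                  int(0.5 * w), int(0.25 * h)),
    }

rois = face_rois(100, 100, 200, 200)
print(rois["eyes"])   # (100, 140, 200, 60)
print(rois["mouth"])  # (150, 230, 100, 50)
```

In practice the face box would come from a face detector (for example a Haar cascade or a CNN-based detector), and each ROI would feed a feature extractor before classification.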
Motivation
Over the past decades, human-computer interaction with computer vision has been an
important field of computer science. It is concerned with direct communication
between computers and human beings, and much research has been
conducted on improving and developing this interaction. Researchers have worked
in this area for different reasons. One of the significant factors that has contributed
to developing the interaction between computers and humans is the study of a
computer’s ability to distinguish emotions for
humans, and the motivation for this project has stemmed from several sources.
Firstly, there is the desire to work and create content with image processing; secondly,
we have always been keen to do new things and push technology to set new
foundations. We want to bring image processing and social interaction together to
create a much more appealing and interactive social medium.
Sometimes we are unable to determine the mood of the other person during a simple
chat where a camera is not available. This system will help infer the mood and respond by
changing the background or sending the emotions automatically (for example, if the user
is angry, the background turns red). This software can also support other software.
Introduction and Background
Facial emotion recognition plays a vital role in developing multi-cultural visual
communication systems for emotion translation between cultures. Since
computers and robots are being used widely for the betterment of our daily lives,
it is important for them to have an artificial mind that would
enable them to communicate with human beings using both logical and emotional
information. The basic idea of emotion recognition using facial expressions is to
segment facial images into various regions of interest. The regions commonly taken into
account include the movements of the cheeks, chin, wrinkles, eyes, eyebrows, and mouth.
Different classification techniques are then applied to differentiate between
types of emotions, based on the analysis and evaluation of facial expressions. The proposed
scheme uses external stimuli to excite specific emotions in human subjects, whose
facial expressions are analyzed by segmenting and localizing the individual frames
into regions of interest and extracting selected facial features such as eye opening and
mouth opening. The system is developed to analyze the test data, based on the provided
training data, and recognize the emotional states. Our system should recognize seven
emotions (anger, contempt, disgust, fear, joy, sadness, and surprise) and should also be
able to detect emotions from video, images, or real-time facial input.
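The train-then-classify step described above can be sketched with a toy nearest-centroid classifier. The feature values (eye opening, mouth opening) and emotion labels below are invented for illustration; a real system would extract them from the segmented face images:

```python
# Hypothetical sketch: classify an emotional state from hand-crafted
# facial features (eye opening, mouth opening) using nearest-centroid
# classification over labeled training data. All numbers are made up.
import math

# training data: emotion label -> list of (eye_opening, mouth_opening)
TRAINING = {
    "joy":      [(0.8, 0.9), (0.7, 0.8)],
    "sadness":  [(0.3, 0.2), (0.4, 0.3)],
    "surprise": [(1.0, 1.0), (0.9, 0.95)],
}

def centroids(training):
    """Average the feature vectors of each emotion class."""
    out = {}
    for label, samples in training.items():
        n = len(samples)
        out[label] = tuple(sum(v[i] for v in samples) / n for i in range(2))
    return out

def classify(features, training=TRAINING):
    """Return the emotion whose class centroid is closest to `features`."""
    cents = centroids(training)
    return min(cents, key=lambda lab: math.dist(features, cents[lab]))

print(classify((0.35, 0.25)))  # sadness
```

A real classifier would use many more features and a stronger model (e.g., an SVM or a neural network), but the structure is the same: labeled training data in, a predicted emotional state out.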
When we smile, frown, or grimace, thousands of tiny facial muscles are at work.
Emotion-recognition software (ERS) creates a 3-D face map, pinpointing trigger
areas such as the eye and mouth corners.
A face-tracking algorithm then matches the movements to basic expression patterns
corresponding to anger, sadness, fear, surprise, disgust, and happiness, or a mixture of
them. [3]
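The matching step can be illustrated with a toy version of this idea: compare a vector of facial-muscle movements against basic expression templates and rank the matches, so that a mixture of expressions shows up as several high-scoring patterns. The pattern templates and movement values below are invented for illustration:

```python
# Hypothetical sketch: match a movement vector (brow_raise,
# lip_corner_pull, jaw_drop) against basic expression templates.
# All template values are made up for illustration.
import math

PATTERNS = {
    "happiness": (0.1, 0.9, 0.2),
    "surprise":  (0.9, 0.1, 0.9),
    "anger":     (-0.8, -0.3, 0.1),
}

def match_expression(movement):
    """Score each basic pattern by inverse Euclidean distance and
    return (label, score) pairs sorted with the best match first."""
    scores = {
        name: 1.0 / (1.0 + math.dist(movement, pattern))
        for name, pattern in PATTERNS.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

ranked = match_expression((0.2, 0.8, 0.3))
print(ranked[0][0])  # happiness
```

Because every pattern receives a score, the full ranking can express a mixture (for example, a movement vector roughly equidistant from the happiness and surprise templates).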
Related Work
There is a vast body of literature on emotion. Emotion recognition has been
investigated using speech and body gestures, but the results are generally obtained
with acted-emotion databases, because these contain strong emotional expressions.
Recent findings suggest that emotions are intricately linked to other functions
such as attention, perception, memory, decision making, and learning. In this project,
we concentrate on the expressive nature of emotion, especially as expressed on the
face.
Project Plan / Schedule
Resources Required
1) Internet Connectivity
2) PC
3) Webcam
References
[1] M. Tauseef Quazi, "Emotion Recognition by Using Smart Sensors," ACM Digital Library,
School of Engineering and Technology, Massey University, New Zealand, pp. 5-10, Feb. 2012.
[2] Joseph Burlington, Proceedings of the 2nd Annual Conference on Information Security
Curriculum Development, Georgia Southern University, Statesboro, GA; ACM, New York,
USA, pp. 95-99, 2005.
[3] Nicole Martinelli, "Emotion-Recognition Knows What Makes You Smile," pp. 40-41, 2007.
[4] M. T. Quazi, "Human Emotion Recognition Using Smart Sensors," Electronics, Information
and Communication Systems (EICS) Seminar, Massey University, Palmerston North,
New Zealand, 21 March 2011.