Interactive Screen

DESCRIPTION

Interactive Screen is a graduation project for the year 2010 at Ain Shams University. It is an extension of the Interactive Wall 2009 project. Interactive Screen won a competition called MIE (Made In Egypt), organized by IEEE, taking the top rating. For any details please contact: Hajer.Mohammed@gmail.com, hadeel.m.yusef@gmail.com, yasmeen.abdel.naby@gmail.com, algorithmist.abubakr@gmail.com

TRANSCRIPT


Faculty of Computer and Information Sciences, Ain Shams University

Supervisors:
Professor Dr. Mohammed Roushdy, Faculty of Computer and Information Sciences
Dr. Haythem El-Messiry, Faculty of Computer and Information Sciences
T.A. Ahmad Salah, Faculty of Computer and Information Sciences

Sponsors:

Teamwork:

Abu-Bakr Taha Abdel Khalek

Hadeel Mahmoud Mohammed

Hager Abdel Motaal Mohammed

Mahmoud Fayez El-Khateeb

Yasmeen Abdel Naby Aly

Agenda:
1. Introduction
2. Interactive Screen vs. Other Systems
3. Market Research & Customer Needs
4. Physical Environment
5. System Framework
6. System Modules
7. Applications
8. Limitations
9. Future Work
10. Final Demo
11. References

Overview:
Projector and two-camera system.
HCI (Human Computer Interaction) system.
Interacts with hand gestures (shapes).
Extension of Interactive Wall '09.

Introduction: Problem Definition: Humans interact naturally with one another using motions; it can be annoying and impractical to use hardware equipment to interact with someone or something.

Introduction: Motivation:

1. "Interactive Wall 2009".
2. Multi-touch technology.
3. Large touch-screen size at an appropriate cost.
4. Flexibility.

The motivation is a future without annoying input devices, and to be proud of being a part of accomplishing such a dream.

The goal is to develop the Interactive Screen system to be able to handle more features and gestures and to get over its limitations, which will make the user satisfied with its usability and flexibility of use.

Time Plan:

Milestones:
• Segmentation modules: May 2010
• Multi-hand tracking: April 2010
• Automatic hand detection: April 2010
• Z-Depth module: April 2010
• Dynamic gesture module: May 2010

Physical Environment:

• Simple components construct a new environment for interactive screens that overcomes the limitations of other systems.

• Environment limitations: traditional environment vs. the proposed physical environment.

Physical Environment:

Other alternative solutions vs. the proposed solution.

Interactive Screen vs. Other Systems

Microsoft Surface.

DiamondTouch.

Touch screens.

• Cost.
• No need to touch the screen.
• Gesture recognition.
• Dynamic gesture recognition.
• Bare hands.
• No sensors; pure image processing.

Market Research:
• In 1991, the first smart whiteboard.
• Over 1.6 million smart whiteboards have been installed throughout the world.
• Surveys indicate that interactive whiteboards benefit student engagement, learner motivation, and knowledge retention.

Framework:

System controlling pipeline: Input → Calibration → Segmentation → Hand Detection → Multi-hand Tracking → Touch Detection → Gesture Recognition → Event → Interface.

Calibration: applies a geometric calibration to the input, using the four calibration points acquired by the configuration module.
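
A minimal sketch of this step, assuming OpenCV in Python and treating the four configured points as the corners of a planar homography; all coordinate values below are illustrative, not the project's configuration:

    import cv2
    import numpy as np

    # Four calibration points (illustrative values) captured by the
    # configuration module: the projected screen's corners in the camera view.
    src = np.float32([[112, 64], [538, 71], [551, 410], [98, 418]])
    # Target rectangle: the projector's native resolution.
    dst = np.float32([[0, 0], [640, 0], [640, 480], [0, 480]])

    H = cv2.getPerspectiveTransform(src, dst)

    def calibrate(frame):
        """Warp a raw camera frame into projector/screen coordinates."""
        return cv2.warpPerspective(frame, H, (640, 480))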


Segmentation: its main task is to generate a binary image from the captured image that represents the foreground. Experimental results cover both simple and complex backgrounds.
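
One plain way to obtain such a foreground mask is background subtraction, as surveyed in [5]. The sketch below assumes a static camera and a reference frame grabbed at startup; the threshold value is illustrative:

    import cv2
    import numpy as np

    background = None  # grayscale reference frame captured at startup

    def segment(frame, threshold=30):
        """Produce a binary foreground mask by differencing the current
        frame against a stored background frame."""
        global background
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        gray = cv2.GaussianBlur(gray, (5, 5), 0)
        if background is None:
            background = gray            # first frame becomes the model
            return np.zeros_like(gray)
        diff = cv2.absdiff(background, gray)
        _, mask = cv2.threshold(diff, threshold, 255, cv2.THRESH_BINARY)
        # Morphological opening removes isolated noise pixels.
        return cv2.morphologyEx(mask, cv2.MORPH_OPEN,
                                np.ones((3, 3), np.uint8))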


Hand Detection: responsible for detecting the hand automatically at any position, given a certain entry gesture (open hand).
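
Since the references include template matching with fast normalized cross-correlation [2], a plausible sketch of the open-hand search uses OpenCV's matchTemplate; the template file name and score threshold are assumptions:

    import cv2

    # Hypothetical template image of the 'open hand' entry gesture.
    open_hand = cv2.imread("open_hand.png", cv2.IMREAD_GRAYSCALE)

    def detect_hand(mask, score_threshold=0.8):
        """Scan the foreground mask for the open-hand gesture using
        normalized cross-correlation; return the hand centre, or None."""
        result = cv2.matchTemplate(mask, open_hand, cv2.TM_CCORR_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(result)
        if max_val < score_threshold:
            return None                  # no hand detected yet
        h, w = open_hand.shape
        x, y = max_loc
        return (x + w // 2, y + h // 2)  # centre in screen coordinates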


Multi-hand Tracking: responsible for keeping track of the user's hands and knowing their actual positions at all times.
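
A minimal stand-in for this module (not the team's actual tracker) associates each new detection with the nearest centroid from the previous frame; the max_jump parameter is an assumed bound on per-frame hand motion:

    import math

    class HandTracker:
        """Nearest-centroid association of hand detections across frames."""

        def __init__(self, max_jump=80):
            self.tracks = {}          # track id -> last known (x, y)
            self.next_id = 0
            self.max_jump = max_jump  # max pixels a hand moves per frame

        def update(self, detections):
            assigned = {}
            for point in detections:
                # Closest surviving track, if any, within max_jump pixels.
                best = min(self.tracks.items(),
                           key=lambda kv: math.dist(kv[1], point),
                           default=None)
                if best and math.dist(best[1], point) < self.max_jump:
                    assigned[best[0]] = point
                    del self.tracks[best[0]]  # one match per track
                else:
                    assigned[self.next_id] = point  # a new hand appeared
                    self.next_id += 1
            self.tracks = assigned
            return self.tracks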


Touch Detection: the main task is to decide whether or not the user touched the screen.
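
With the projector-and-two-cameras setup from the overview, one plausible test (a sketch only; the project's own Z-Depth module may work differently) is disparity based: after both views are warped into screen coordinates, a fingertip resting on the screen plane projects to nearly the same point in both views, while a hovering hand does not:

    def is_touching(point_cam1, point_cam2, disparity_threshold=4):
        """Decide touch from the residual disparity between the same
        fingertip located in the two calibrated camera views. The
        threshold (in pixels) is illustrative, not a tuned value."""
        dx = abs(point_cam1[0] - point_cam2[0])
        dy = abs(point_cam1[1] - point_cam2[1])
        return dx + dy <= disparity_threshold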


Gesture Recognition: responsible for recognizing the shape of the hand. Gestures are either static or dynamic.
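
A rough illustrative classifier for the static gestures counts extended fingers from convexity defects of the hand contour and maps the count onto the gesture names listed below; the depth threshold and the mapping are assumptions, not the team's classifier:

    import cv2

    def classify_static_gesture(mask):
        """Classify a binary hand mask by counting extended fingers."""
        # [-2] picks the contour list in both OpenCV 3 and 4.
        contours = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                    cv2.CHAIN_APPROX_SIMPLE)[-2]
        if not contours:
            return None
        hand = max(contours, key=cv2.contourArea)  # largest blob = hand
        hull = cv2.convexHull(hand, returnPoints=False)
        defects = cv2.convexityDefects(hand, hull)
        if defects is None:
            return "closed hand"
        # Defects deeper than ~20 px are valleys between extended fingers;
        # OpenCV stores the depth as a fixed-point value scaled by 256.
        valleys = sum(1 for i in range(defects.shape[0])
                      if defects[i, 0, 3] / 256.0 > 20)
        fingers = valleys + 1
        if fingers >= 5:
            return "open hand"
        if fingers == 1:
            return "main finger"
        return "fingers"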


Recognized static gestures: main finger, main fingers, open hand, closed hand, fingers.

Applications:

Smart whiteboard application:
• In classrooms.
• In meeting rooms.

Limitations:
• The user's top clothing must be a non-skin color.
• The user must wear long sleeves.
• The user must enter the system with a certain gesture.

Future Work:
1. Multi-user system.
2. Body tracking.
3. Detecting any shape of hand.

Demo

References

[1] Mennat-Allah Mostafa Mohammad, Nada Sherif Abd El Galeel, Rana Mohammad Ali Roshdy, Sarah Ismail Ibrahim, Multi Touch Interactive Surface, Faculty of Computer and Information Sciences, Ain Shams University, Cairo, Egypt, 2009.

[2] Kai Briechle, Uwe D. Hanebeck, Template Matching Using Fast Normalized Cross Correlation, Institute of Automatic Control Engineering, Technische Universität München, Munich, Germany, 2001.

[3] Rafael C. Gonzalez, Richard E. Woods, Digital Image Processing, Third Edition, Pearson, 2008.

[4] Gary Bradski, Adrian Kaehler, Learning OpenCV, O'Reilly Media, 2008.

[5] Alan M. McIvor, Background Subtraction Techniques, in Proc. of Image and Vision Computing, Auckland, New Zealand, 2000.

[6] Francesca Gasparini, Raimondo Schettini, Skin Segmentation Using Multiple Thresholding, Milan, Italy, 2007.

[7] Hideki Koike, Masataka Toyoura, Kenji Oka, Yoichi Sato, 3-D Interaction with Wall-Sized Display, IEEE Computer Society, 2008.

[8] Mahdi Mirzabaki, A New Method for Depth Detection Using Interpolation Functions Using a Single Camera, International Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. 35, Part 3, pp. 724-726, 2004.

[9] Patrick Horain, Mayank Bomb, 3D Model Based Gesture Acquisition Using a Single Camera, Proceedings of the Sixth IEEE Workshop on Applications of Computer Vision, 2002.

[10] Z. Černeková, N. Nikolaidis, I. Pitas, Single Camera Pointing Gesture Recognition Using Spatial Features and Support Vector Machines, EUSIPCO, Poznań, 2007.
