


iSphere: A free-hand 3D modeling interface

Mohamed Imran, Noor Mohamed

Sign Language Synthesis and Interaction, 66123 Saarbrücken, Germany

http://slsi.dfki.de

Abstract. Integrating a high-level 3D modeling interface with free-hand sketching reduces the human cognitive load of 3D creation. Modern CAD-based interfaces constrain interaction to low-level commands, thereby creating a psychological gap. Here we present iSphere, a device with 24 degrees of freedom that bypasses the mental load of low-level commands. iSphere is an intuitive device embedded with 12 capacitive sensors that enables object design using a top-down approach. The interface uses simple Push and Pull commands to interact with the user. We believe that iSphere can save a great deal of time by bypassing traditional mouse-and-keyboard modeling. Experimental results indicate that novices in 3D modeling learn faster with the help of iSphere. iSphere is designed to minimize the barrier between the human's cognitive model of what they want to accomplish and the computer's understanding of the user's task. As iSphere lacks reliability in its input mechanisms, this paper also suggests improved algorithms for sensor control and novel methods for creating 3D models.

Keywords: Degrees of freedom, Human-computer interaction, User interface, Input device, Proximity sensing

1 Introduction

For designers, a quick method to demonstrate an idea or principle is free-hand sketching. A sketch expresses an idea directly and interactively. Freehand sketching is a common way to model 2D objects, but for 3D modeling it becomes complex and unintuitive to visualize. Recent developments in computer-aided design (CAD) address this problem, but only partially: modeling a 3D object still involves a series of low-level commands and mode-switching operations. This introduces a well-known problem for novices, as it takes months of learning commands to become an expert who can create 3D models intuitively.

Past studies show that traditional CAD systems push designers toward a bottom-up approach [1]. This approach tends to create a complex design cycle, as users need to remember low-level commands while modeling. These mode-switching operations disrupt the human thought process [2], since additional mental load is imposed to build extra connections between representations [3]. CAD systems eliminate direct interaction, creating a gap between realistic interaction and low-level commands. We therefore asked whether there is a better way to develop


an input interface to manipulate 3D objects effectively and reduce cognitive load. This goal led us to develop an intuitive 3D user input interface: iSphere.

The aim of this research was to develop a high-level modeling system that reduces human cognitive load in 3D creation. It opens a new dimension for interacting with 3D models intuitively: iSphere projects the designer's idea into a 3D model instantly by manipulating 3D objects spatially. Put simply, it is an input device that lets us focus on what is in our mind while building 3D models. As shown in Fig. 1, iSphere models an object through this spatial method.

Fig. 1. iSphere: a free-hand 3D modeling interface

We tested our hypothesis by conducting a study comparing the performance of command-based interfaces against iSphere. We claim that our approach to developing a new 3D input interface improves on previous work [5-8], as it offers a novel kind of '3D input interface'.

2 Interactive Techniques

iSphere plays a physical role in controlling and shaping 3D models. It gives a tangible modeling experience compared with routine modeling via mouse and keyboard, which involves tedious mode-switching activities. iSphere enables a rich play-and-build environment that makes 3D modeling feel like 'modeling a piece of clay'. In contrast, routine modeling is disruptive, as users stumble in their thought process trying to remember all the low-level commands. iSphere offers bi-manual interaction, namely Pull and Push actions. Bi-manual interaction is used to avoid heavy mental activity and to shape 3D models interactively. It saves designers time by creating live 3D models on the fly


instead of forming shapes with low-level machine commands. The well-known Click and Select actions require an intense amount of mode switching. Hence, command-based manipulation is a non-interactive form of modeling that offers little feedback.

Objects modeled using 2D representations fail to capture all spatial characteristics. To avoid this drawback, designers must couple their mental view and the visual representation simultaneously, which is non-intuitive and demands more mental activity. 2D modeling thus limits design outcomes at an early stage of modeling, whereas iSphere avoids this by enhancing 3D modeling in addition to the 2D representation.

Fig. 2. Interactive methods using iSphere

Fig. 2 shows how iSphere handles Z-axis manipulation through different hand actions over its surface. From the designer's point of view, iSphere is a dummy object for manipulating 3D geometry, but it cleverly maps the gestures. Playing with a physical object enables real interaction.

3 Play and Build

The interactive iSphere follows top-down modeling, which gives users the freedom to play around and build 3D scenes. Human hand actions are mapped to modeling commands. The input interface is a 6" × 6" × 6" dodecahedron (with 12 faces) that acts as the only physical medium. Each face of the dodecahedron has a capacitive electrode that detects the motion of human hands. The iSphere software architecture maps


the input onto a high-level object, which is the 3D model on the screen. iSphere is proactive, as every facet is individually controlled, giving 24 degrees of freedom for rich manipulation.
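The face-to-freedom arithmetic above can be sketched as follows. This is our own illustration, not the authors' code: the names and the neutral-displacement representation are assumptions, but the counting (12 faces, each with a Push and a Pull direction, hence 24 degrees of freedom) comes from the text.

```python
# Hypothetical sketch (names are ours, not from the iSphere paper):
# each of the 12 dodecahedron faces carries one capacitive electrode,
# and each face supports two directed actions (Push toward the centre,
# Pull away from it), giving 12 * 2 = 24 degrees of freedom.

FACE_COUNT = 12
ACTIONS = ("push", "pull")

def degrees_of_freedom(face_count: int = FACE_COUNT) -> int:
    """One degree of freedom per (face, action) pair."""
    return face_count * len(ACTIONS)

def face_action_table(face_count: int = FACE_COUNT) -> dict:
    """Enumerate every controllable (face, action) pair."""
    return {(face, act): 0.0          # 0.0 = neutral displacement, in units
            for face in range(face_count)
            for act in ACTIONS}

state = face_action_table()
state[(3, "push")] = 1.5              # e.g. dent face 3 inward by 1.5 units
```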

The device works as a hand-gesture detector by capturing different human hand actions such as Pull and Push.

A Push action is triggered when the hands are less than 1 inch from iSphere. It is also called a denting action. Push corresponds to a press-button-like command in routine modeling; this action is critical for precise modeling of objects. A Pull action occurs when the hands are more than one inch away from iSphere. Pull defines the sensitivity range of the device, as the capacitive sensors operate from 1 to 6 inches. Up to 6 inches, all gestures are detected; internal calibration is done in units, and iSphere measures up to 8 units. Interpreting hand actions as high-level modeling is a combinational action that aggregates trivial modeling commands.
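The thresholds described above can be summarized in a small sketch. The function names and the linear inch-to-unit conversion are our assumptions; only the 1-inch Push boundary, the 6-inch sensing range, and the 8-unit calibration come from the text.

```python
# Sketch of the distance thresholds described above (names and the
# linear inch-to-unit mapping are our reading of the text, not the
# authors' code): hands closer than 1 inch trigger Push, hands between
# 1 and 6 inches trigger Pull, and beyond 6 inches nothing is sensed.
SENSING_RANGE_IN = 6.0   # capacitive sensing limit, inches
PUSH_THRESHOLD_IN = 1.0  # closer than this counts as a Push (denting)
UNITS_FULL_SCALE = 8.0   # internal calibration: 6 inches maps to 8 units

def classify(distance_in: float) -> str:
    """Map a hand distance (inches) onto a gesture label."""
    if distance_in < PUSH_THRESHOLD_IN:
        return "push"
    if distance_in <= SENSING_RANGE_IN:
        return "pull"
    return "idle"            # out of the sensors' reach

def to_units(distance_in: float) -> float:
    """Convert inches to the device's internal calibration units."""
    clipped = min(distance_in, SENSING_RANGE_IN)
    return clipped * UNITS_FULL_SCALE / SENSING_RANGE_IN
```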

Fig. 3. iSphere: As a Gesture detector

Fig. 3 shows that the Push action comprises two further actions: Nearby and Touch. The Nearby action is the first level of transition from Pull to Push; it indicates the current state of the push. The Touch action is used for smoothing: consider it a tuning method, as it gently smooths the model.
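The Pull/Nearby/Push/Touch progression can be sketched as a simple distance classifier. The 0.5-inch split between Nearby and the full Push is purely illustrative, since the text fixes only the 1-inch Push boundary; treating Touch as physical contact is also our assumption.

```python
# Minimal sketch of the states in Fig. 3, under our own assumptions:
# Nearby sits just inside the 1-inch Push boundary as the transition
# from Pull to Push, and Touch (contact with the face) is the fine
# smoothing action. The 0.5-inch split is illustrative only.
def push_substate(distance_in: float) -> str:
    if distance_in <= 0.0:
        return "touch"    # contact: gentle smoothing / tuning
    if distance_in < 0.5:
        return "push"     # denting action proper (split point assumed)
    if distance_in < 1.0:
        return "nearby"   # first level of transition from Pull to Push
    return "pull"
```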

As an example, Fig. 5 gives the possible states observed in iSphere while modeling an apple from the pre-loaded spherical object shown in Fig. 4.


Fig. 4. Making an apple using iSphere

Fig. 5. State diagram for making an apple using iSphere


4 Implementation

4.1 Hardware Implementation

The hardware gives a vivid picture of iSphere, as it essentially captures all human hand actions. Constructing the right shape for iSphere was a challenge; we built a foldable dodecahedron from acrylic, whose faces form pentagons. Every face is capable of sensing hands up to eight units above the surface. Capacitive sensors mounted on the iSphere measure the physical actions, and shunt-mode operation is used to detect motion. Briefly, shunt-mode capacitive sensing works between a transmit and a receive electrode: the performer's hand movement changes the electric field between the two electrodes, and the current measured at the receive electrode correlates with the change in the electric field. In shunt mode the distance to the second electrode is known [9] (in our case less than six inches). The capacitive sensors detect the proximity of hands in twelve different directions, corresponding to the twelve faces.
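The current-to-proximity relationship can be illustrated with a toy model. Every constant here is invented for the sketch; the report only states the qualitative correlation between the received current and the hand's position.

```python
# Illustrative model of shunt-mode sensing (constants are invented for
# the sketch; the paper states only the qualitative relationship): a
# hand between the transmit and receive electrodes shunts field lines
# away, so the current at the receive electrode drops, and the drop
# correlates with hand proximity.
I_BASELINE = 100.0   # receive-electrode current with no hand, arbitrary units
RANGE_IN = 6.0       # sensing range in inches

def proximity_from_current(i_received: float) -> float:
    """Estimate hand distance (inches) from the measured current.

    Assumes, for illustration, a linear drop: full baseline current
    means no hand (>= 6 in), zero current means contact (0 in).
    """
    frac = max(0.0, min(i_received / I_BASELINE, 1.0))
    return frac * RANGE_IN
```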

Fig. 6. iSphere: Hardware Implementation

Fig. 6 shows that the received signal from the capacitive sensors is level-adjusted by a signal-conditioning circuit (amplifier, switch, and low-pass filter). Finally, the digital input is received by a PIC microcontroller, which passes it on to the software module.
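The low-pass stage of the conditioning chain can be sketched with a first-order IIR filter. This is a stand-in of our own: the report does not give the actual circuit values, so the smoothing factor here is illustrative.

```python
# Sketch of the low-pass stage in the signal-conditioning chain. A
# first-order IIR filter is our stand-in; the report does not give the
# actual circuit values, so alpha is illustrative.
def low_pass(samples, alpha=0.2):
    """First-order low-pass: y[n] = y[n-1] + alpha * (x[n] - y[n-1])."""
    y, out = 0.0, []
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out

# A noisy step input settles toward its true level after filtering.
smoothed = low_pass([0, 10, 10, 10, 10, 10, 10, 10])
```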


Fig. 7. Software implementation

4.2 Software Implementation

As shown in Fig. 7, the iSphere hardware is connected to the microcontroller through a serial interface (RS-232). The software maps the input signals onto a meta-sphere and then onto a target 3D object. We use the Alias Wavefront Maya 6.0 C++ API (Application Programming Interface) to implement the iSphere plug-in. 3D manipulation is realized with MEL (Maya Embedded Language), which modifies functions by drawing relationships from the data. The system architecture is flexible for future upgrades, and new functions can easily be added. Currently, iSphere manipulates 3D mesh-based models in Alias Wavefront Maya, 3DS Max, or Rhino.
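The plug-in side of this pipeline can be sketched in outline. The frame format, function names, and neutral point below are all our assumptions; the real plug-in uses the Maya C++ API and MEL rather than Python.

```python
# Hypothetical sketch of the plug-in side of Fig. 7 (frame format and
# names are ours; the real plug-in uses the Maya C++ API and MEL).
# The microcontroller is assumed to send one byte per face over RS-232,
# and each reading displaces the mesh region under that face along the
# face normal.
FACE_COUNT = 12

def parse_frame(frame: bytes) -> list:
    """Split a 12-byte serial frame into per-face proximity readings."""
    if len(frame) != FACE_COUNT:
        raise ValueError("expected one byte per dodecahedron face")
    return list(frame)

def face_offsets(readings, scale=0.05, neutral=128):
    """Map raw readings to signed push/pull offsets per face.

    Values above `neutral` pull the surface out, values below push it
    in (the neutral point and scale are illustrative).
    """
    return [(r - neutral) * scale for r in readings]

offsets = face_offsets(parse_frame(bytes([128] * 11 + [168])))
# only the last face is displaced: (168 - 128) * 0.05 = 2.0
```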

5 Experiment

An experiment was designed to capture the potential problems that may arise in evaluation. The aim of our experiment is to observe how experts and novices accomplish a task using iSphere. We claim that both experts and novices will take the same time to model a shape using iSphere. Our hypothesis seeks a method to expose and eliminate the gap between novices and experts through an intuitive modeling interface.

5.1 Study Design

To conduct the experiment, six volunteers were chosen: two had intermediate experience and four had no prior experience (novices). The median of their ages was 22. The KLM-GOMS [10] metric was used to calculate the performance of routine tasks in Maya using keyboard and mouse. The final evaluation compares the routine-task times given by KLM-GOMS


against the times taken by novices to accomplish the same tasks using iSphere. Two groups were formed, novices and experts, to model a 3D shape using iSphere. Before the study, we explained how to use iSphere with a demonstration. The pre-experiment session took 30 minutes.

5.2 Experimental Method

The set-up included a desktop computer with the pre-loaded software interface so that subjects could directly start performing the given tasks. A standard LCD monitor was used to view the shape of the modified object. As LCD screens are poor at wide viewing angles, subjects were placed at a defined distance from the screen. The experiment was arranged to maximize the proximity sensing so that the full capacitive sensing range could be achieved. The iSphere was placed on a soft-foam base to cushion the user's hands while modeling. Rendering was done in shadowing mode to improve the 3D visualization.

5.3 Experimental Task

To cover all cases, four tasks were designed: 1. Pull; 2. Push; 3. Making an apple; 4. Making any object within 5 minutes. Initially, the screen was loaded with the default 3D sphere, and the task was to perform a Pull action up to 3 units, followed by a Push action. The third task was to make an apple, which involves sequences of Push and Pull actions. The final task was free-hand modeling, where the subjects were asked to model any shape in their mind within 5 minutes.

5.4 Experimental Analysis

The analysis is based on the time taken to accomplish a task using mouse and keyboard, as calculated by KLM-GOMS, versus the time taken by a novice to model the same shape using iSphere.

As seen in Fig. 8, much of the subjects' time was spent on the mental-preparation operator, which we attribute to users trying to recall the learnt low-level commands. The table shows that moving the cursor took about 1 to 1.5 seconds. A click with the mouse took little time within the overall span, but clicking was the most repeated activity during the experiment: each click took around 0.2 seconds, and 15 clicks were involved in total. All operations together (mental preparation, clicking the mouse, and so on) cost 10 seconds for the first task, the Pull action, and 20 seconds for the Push task.
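The KLM-GOMS estimate works by summing standard operator times, which can be sketched as follows. The operator values are the classic Card, Moran, and Newell defaults; the operator sequence for the Pull task is our illustration, not the exact sequence from Fig. 8.

```python
# KLM-GOMS estimates task time by summing standard operator times. The
# values below are the classic Card/Moran/Newell defaults (K ~0.2 s for
# a click, P ~1.1 s for pointing, M ~1.35 s for mental preparation);
# the operator sequence for the Pull task is our own illustration.
OPERATOR_TIME = {
    "K": 0.2,    # keystroke / mouse click
    "P": 1.1,    # point with the mouse
    "M": 1.35,   # mental preparation
}

def klm_estimate(sequence: str) -> float:
    """Total predicted time (seconds) for a string of KLM operators."""
    return sum(OPERATOR_TIME[op] for op in sequence)

# e.g. think, point, click, repeated three times
pull_task = "MPK" * 3
estimate = klm_estimate(pull_task)   # 3 * (1.35 + 1.1 + 0.2) = 7.95 s
```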

5.5 Experimental results

Using iSphere, all subjects learnt to model an object by controlling the different facets. Fig. 9 shows that the time taken by a novice to perform the Push and Pull tasks was


Fig. 8. GOMS Analysis

Fig. 9. iSphere vs. routine tasks


about 8.6 seconds and 12.5 seconds respectively. On average, 25% of the time was saved during the Pull task and 75% during the Push task. This makes a strong case for iSphere, as it saves considerable time compared with the routine mouse-and-keyboard tasks for modeling a 3D object. The results show that iSphere gives users more reliability and freedom to model using several combinations of selection, direction, and commands. They also show that the subjects finished two of the tests in a shorter span of time than the intermediate users.

Thus, our results clarify three aspects: iSphere is direct, as its control points manipulate the surface directly; it takes less time than routine modeling; and it is intuitive, as less mental preparation is involved.

6 Discussion

We found that a high-level 3D modeling interface can reduce low-level manipulation, since modeling is done intuitively. The experimental results indicate that modeling 3D objects using iSphere narrows the gap between novices and experts. As claimed, the paper suggests that a free-hand 3D input modeling interface is a novel development in 3D input hardware.

The data show that iSphere has the capability to enhance the 3D modeling experience using natural human hand gestures. It moves the 3D modeling environment from non-interactive to interactive, leading 3D designers into a new paradigm: from abstract commands to natural hand interaction. The thought process becomes more intuitive and direct, so learning to realize complex 3D objects becomes easier for novices.

Although iSphere is intuitive and direct, it cannot model objects accurately; it lacks fidelity when 3D objects need to be modeled in a minimal span of time. One possible way to improve speed and accuracy is to increase the sensitivity of the capacitive sensors, but confirmatory studies in sensor research are still needed. Readers may criticize that iSphere works only for specialized modes. Nevertheless, it is important to understand that each method performs better in a certain mode; this opens unanswered questions and future directions concerning robust mapping into shapes and improved algorithms for sensing control.

Our research implies that an intuitive free-hand modeling interface like iSphere presents an effective way to express ideas directly without intense mental activity. It aims to improve the interaction between users and computers. The whole design minimizes the barrier between the human's cognitive model of what they want to accomplish and the computer's understanding of the user's task.

References

1. Lee, C.-H. J., Hu, Y., and Selker, T. iSphere: A free-hand 3D modeling interface.

2. Bowman, D. A., Kruijff, E., and LaViola, J. J. 3D User Interfaces: Theory and Practice.

3. Bloom, et al. (translated by Shibuya, Fujita, and Kajita). Educational Assessment Method Handbook: Formative Assessment and Comprehensive Assessment of Subject Learning. Daiichi-hoki Shuppan, 1972.

4. Pashler, H. Dual-task interference in simple tasks: Data and theory. Psychological Bulletin, 116(2), 1994, 220-244. doi:10.1037/0033-2909.116.2.220. PMID 7972591; Mayer, R. and Moreno, R. Nine Ways to Reduce Cognitive Load in Multimedia Learning.

5. Aish, R. 3D input for CAAD systems. Computer-Aided Design, 1979, 66-70.

6. Ishii, H. and Ullmer, B. Tangible Bits: Towards Seamless Interfaces between People, Bits, and Atoms. Proc. of CHI '97, ACM Press, 1997, 234-241.

7. Murakami, T. and Nakajima, N. Direct and Intuitive Input Device for 3-D Shape Deformation. Proc. of CHI '94, ACM Press, 1994, 465-470.

8. Rekimoto, J. SmartSkin: An Infrastructure for Freehand Manipulation on Interactive Surfaces. Proc. of CHI 2002, ACM Press, 2002, 113-120.

9. Capacitive Sensors: Technical Notes. http://sensorwiki.org/doku.php/sensors/capacitive

10. John, B. and Kieras, D. The GOMS Family of User Interface Analysis Techniques: Comparison and Contrast. ACM Transactions on Computer-Human Interaction, 3(4), 1996, 320-351.