Exploring a Modular Approach to Redesigning Interfaces for Physical Interactive Devices

Michael D. Jones
Casey Walker
Zann Anderson
Candice Lusk
Andrew Bryce
Brigham Young University
Provo, UT 84602 USA
[email protected]

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s). Copyright is held by the author/owner(s).
TEI 2017, March 20–23, 2017, Yokohama, Japan.
ACM ISBN 978-1-4503-4676-4/17/03.
http://dx.doi.org/10.1145/3024969.3025075

Abstract
Tools for creating graphical user interfaces (GUIs) hide implementation details associated with changing the position, size or shape of a widget. For example, a GUI designer can change the position of a button without explicitly reimplementing the code that determines if the button has been clicked. We seek to provide a similar experience for moving tangible widgets on physical interactive devices (PIDs). One reason that GUI design tools can hide implementation details is that widgets are modules. In this work in progress, we describe a system for leveraging modularity to hide implementation details associated with changing the position, size and shape of widgets in PID interfaces. We have used the system to design, redesign and fabricate 10 interfaces for 3 example applications. Fundamentally, working with atoms to make PID interfaces is different than working with pixels to make GUI interfaces, but modularity in PID widgets appears to be a promising way to hide implementation details in PID interface redesign and fabrication.

Author Keywords
tangible interaction; interface redesign; 3D printing

ACM Classification Keywords
H.5.2. [Information Interfaces and Presentation (e.g. HCI)]: Miscellaneous

Introduction
We address the problem of simplifying the redesign and fabrication of tangible widget positions, sizes and shapes for PIDs. This problem is important to designers who create functional prototypes of PIDs because simplifying the physical interface redesign process allows a designer to explore more interface variations in less time.

Other widget sets for PIDs have been proposed [10, 8, 6], but none of them simplify redesign. Prototyping methods for quickly iterating PID interface designs have also been proposed [3, 1] but lack fidelity in functionality, shape, and placement. Previous work on modular elements in PID design [2] applied modularity to electronics design but did not explore modularity in widget repositioning. We seek a process that has higher fidelity in physical interaction while retaining fluidity. Our work also diverges from previous work by focusing only on the placement of interactive elements into PIDs, rather than programming or coupling of sensors with custom hardware.

Designers of 2D GUIs have long had interface design tools that support rapid interface redesign. Widgets used in these tools are software objects that can be automatically re-instantiated with a new position, size or shape without rebuilding the entire interface.

The question that drives this work is: what is possible if one utilizes aspects of GUI design in PID design? Or, what can one learn from tools for redesigning GUIs to create better tools for redesigning PID interfaces?

In this work in progress, we consider modularity in widgets for PIDs. A key aspect of GUI widgets is that they are modular. Modularity simplifies design changes by reducing the impact of each individual change on the widget itself and on the interface as a whole. Changing one part of a widget, such as the color, does not change other parts, such as the code needed to handle an input event. Moreover, changes to one widget do not affect other widgets. In the context of PIDs, modularity might similarly contain changes to only those parts directly impacted by the change.

We have built a tool called Morphi for exploring modularity in the context of redesigning and re-fabricating PID interfaces. We asked 2 students to craft several interfaces for 3 devices with Morphi. In this small study, users were able to create and fabricate new interfaces without re-fabricating the entire device. Algorithms that support re-instantiating widgets correctly managed the implementation details associated with the changes.

We will bring a total of 10 interface variations for 3 different functional PIDs to the conference. These objects and interfaces are shown in Figures 2–7. Participants will be able to interact with the devices, and we will switch interfaces to demonstrate the process.

Morphi
Morphi is a tool for iterating through tangible UI designs in a PID when the PID housing is a hollow shell that contains sensors on a prototyping board. Morphi contains three parts: reconfigurable widgets, automatic instantiation and removable panels.

Widgets are buttons, sliders and knobs that transfer motion from interactive surface elements to embedded sensors along an internal mechanism. The interactive surface element is the piece that the user sees, touches and moves to create input. Inside the PID, an embedded sensor converts that motion into an electrical signal. The internal mechanism transfers motion from the interactive surface element to the internal sensor. Internal mechanisms contain several moving parts. The geometry of these parts depends on the relative positions of the surface elements and the embedded sensor.

A widget instantiation algorithm computes the geometry of the internal mechanism from the placement of the interactive surface element and the internal sensor. Generating the geometry of the internal mechanisms manually is tedious and must be precisely correct, but the geometry is easy to compute algorithmically.
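To make the idea concrete, the following sketch (in Python, purely illustrative; Morphi itself was built with the Rhinoceros 5 C# developer tools) shows the kind of computation involved: given the position of an interactive surface element and the position of its sensor, the length and direction of a straight connecting rod, and the hole to cut in the housing panel, follow directly. The rod radius and clearance values are assumptions for illustration only.

# Minimal sketch (not Morphi's implementation): derive the geometry of a
# straight internal mechanism from the placement of an interactive surface
# element and its embedded sensor. Names and tolerances are illustrative.
import math

def subtract(a, b):
    return tuple(ai - bi for ai, bi in zip(a, b))

def length(v):
    return math.sqrt(sum(c * c for c in v))

def normalize(v):
    n = length(v)
    return tuple(c / n for c in v)

def instantiate_rod(surface_point, sensor_point, clearance=0.5):
    """Return the rod length, its unit direction, and the panel hole
    needed to pass a rod of the assumed radius through the housing."""
    axis = subtract(sensor_point, surface_point)
    return {
        "length": length(axis),
        "direction": normalize(axis),
        "hole_center": surface_point,
        "hole_radius": 2.0 + clearance,  # assumed 2 mm rod radius + clearance
    }

# Example: a button cap on the housing connected to a sensor 30 mm below it.
print(instantiate_rod((10.0, 5.0, 40.0), (12.0, 5.0, 10.0)))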

Removable panels are pieces of the PID housing that contain interactive surface elements. When the position, size, type or shape of an interactive surface element changes, the panels contain those changes to just a part of the housing, so that only a few panels, rather than the entire housing, need to be re-printed.

To use Morphi, a designer marks interactive surface element locations on a 3D CAD model of the PID housing. Morphi places the interactive surface elements as marked and computes the internal mechanisms. The parts are 3D printed for assembly.

From the designer's perspective, the significance of this process is that Morphi abstracts the internal mechanisms, allowing the designer to focus on form and aesthetics. This simplifies the placement and repositioning of widgets, making it more like placement and repositioning of widgets in GUIs.

The novelty of Morphi is that it decouples interactive surface elements from the embedded electronics. This means that the UI can be modified without modifying the embedded electronics. The contribution of Morphi is that it is a step toward a tangible UI design process that has the flexibility of a graphical UI design process.

Widgets
Each widget in Morphi is split into three modules. This allows the widgets to be customized, since two of the modules can be easily changed.

The modules in a widget are tangible interactive surfaces, internal mechanisms, and embedded electronics. The tangible interactive surface element is the part the user sees, touches, and interacts with. The internal mechanism transfers user-generated motion from the tangible interactive surface element to the embedded electronics. The embedded electronics are off-the-shelf components that convert motion into a signal which is processed by a computer program. Embedded electronics are connected and placed on a breadboard inside the device housing.
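This three-module decomposition can be summarized with a small sketch (hypothetical names, not Morphi's data model). Only the surface element, and occasionally the electronics placement, is edited by the designer; the internal mechanism is always regenerated.

# Sketch of the three-module widget decomposition; names are hypothetical.
from dataclasses import dataclass

@dataclass
class SurfaceElement:          # what the user sees, touches and moves
    kind: str                  # "button", "knob", or "slider"
    position: tuple            # (x, y, z) on the housing surface
    size: float

@dataclass
class InternalMechanism:       # generated by the instantiation algorithm
    geometry: object           # e.g. a mesh or parametric description

@dataclass
class EmbeddedElectronics:     # off-the-shelf part on the breadboard
    part: str                  # e.g. "pushbutton", "potentiometer"
    position: tuple

@dataclass
class Widget:
    surface: SurfaceElement
    mechanism: InternalMechanism
    electronics: EmbeddedElectronics

# Example: a knob whose dial sits on the housing at (25, 0, 40).
w = Widget(SurfaceElement("knob", (25.0, 0.0, 40.0), 12.0),
           InternalMechanism(None),
           EmbeddedElectronics("potentiometer", (20.0, 5.0, 8.0)))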

The designer changes the position, size and shape of the tangible interactive surface elements, and Morphi's instantiation algorithm computes the internal mechanisms needed to connect the surface elements to the electronics. This is similar to hiding implementation details in GUI interface redesign.

Figure 1 shows the modular button, knob, and slider widget designs, from left to right.

Figure 1: Modular button, knob and slider widgets for use in PIDs. Widgets are shown in blue, connected to black hexagon-shaped housing panels and silver electronic components. Widgets can be automatically instantiated to fit different placements on the housing surface.

The elements of the widgets are each rendered in shades of blue, with the housing panel shown in black. The electronic components are silver or gray. The interactive surface element for the button, shown on the left, is the blue hemisphere above the black housing panel. Interactive surface elements for the knob and slider are obscured by the housing panel. The long blue vertical shafts in each widget are the internal mechanisms, and the silver components at the bottom of each widget (mostly obscured for the button, but otherwise visible) are the electronic components.

A button consists of a detachable cap, a friction-fit collar, a rod with a ball joint on one end, and a bracket. The user creates motion by pressing on the detachable cap. The motion is transferred through the rod to the ball joint. The bracket attaches the ball joint to an electronic button, on which the ball sits. The electronic button is pressed when the ball moves.

The detachable cap is the interactive surface element for the button widget. Morphi generates the collar, rod and ball joint, and bracket based on the position of the cap.

The knob consists of a detachable dial and a flexible rod with rigid ends, as shown in the center of Figure 1. The user twists the dial and the flexible rod transfers that rotation to a potentiometer knob. Morphi determines the length and curvature of the rod based on the positions of the dial and the electronics.
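As an illustration of the kind of computation involved, and not Morphi's actual code, the sketch below approximates the rod as a circular arc whose end tangents follow the dial axis and the potentiometer axis; the chord length and the angle between the two axes then determine the rod length. The circular-arc assumption is ours.

# Sketch (illustrative only): approximate the flexible rod as a circular arc
# whose end tangents follow the dial axis and the potentiometer axis.
import math

def angle_between(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return math.acos(max(-1.0, min(1.0, dot / (nu * nv))))

def flexible_rod_length(dial_pos, dial_axis, pot_pos, pot_axis):
    chord = math.sqrt(sum((a - b) ** 2 for a, b in zip(dial_pos, pot_pos)))
    theta = angle_between(dial_axis, pot_axis)   # total bend of the rod
    if theta < 1e-6:                             # axes aligned: straight rod
        return chord
    # Arc length of a circular arc with chord c and central angle theta.
    return chord * theta / (2.0 * math.sin(theta / 2.0))

# Dial on a slanted housing face, potentiometer mounted vertically 50 mm away.
print(flexible_rod_length((0, 0, 50), (0, 0.5, -1), (10, 0, 0), (0, 0, -1)))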

The rod transfers torque while remaining flexible. The rod consists of a flexible inner core surrounded by two helices. The helices are made from rigid material and are joined by small rigid rods that pass straight through the center of the core. The flexible core, helices, and connecting rods are printed as a single unit.

The slider consists of a detachable handle and three joints: a ball joint near the handle, a prismatic joint along the shaft, and a knuckle joint, as shown on the right side of Figure 1. The user moves the handle to create linear motion on the housing surface. The joints translate the linear motion of the handle into rotational motion of the potentiometer knob. Software translates knob rotation back into linear motion to match the motion of the slider.

Given the position of the handle on the surface and the length and orientation of the track, Morphi computes the length range of the prismatic joint and adds the joints necessary to connect the prismatic joint to the handle and the knob.
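The software side of this translation can be sketched as a simple calibration-based mapping. The function names and the linear interpolation are our assumptions for illustration, not Morphi's code.

# Sketch: recover the slider's linear position from the potentiometer
# reading. Readings at the two ends of the track are recorded once during
# calibration; intermediate readings are interpolated. If the linkage makes
# the relationship noticeably nonlinear, the joint geometry would be needed.
def slider_position(reading, reading_at_min, reading_at_max, track_length_mm):
    t = (reading - reading_at_min) / float(reading_at_max - reading_at_min)
    t = max(0.0, min(1.0, t))          # clamp to the physical track
    return t * track_length_mm

# Example with a 10-bit ADC: readings 180 and 870 at the track ends, 60 mm track.
print(slider_position(525, 180, 870, 60.0))   # -> 30.0 mm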

As the handle moves along the surface, the distance and angle between the handle and the knob change.

A ball joint under the housing panel allows a member of the prismatic joint to pivot relative to the slider handle. At the other end, a knuckle joint attaches the lower end of the prismatic joint to a shackle.

Instantiating Widgets

Figure 2: Partially disassembled speaker (see also Figure 4), which contains a faceplate printed in a flexible black material.

Figure 3: An alarm clock design (see Figure 5) showing internal mechanisms, printed in blue and black, that connect interactive surface elements to electronics. The removed facet contains a knob that was connected to the flexible blue rod right of center.

Given the locations of widget interactive surface elements on the housing surface, the instantiation algorithm computes the internal mechanisms needed to connect the interactive surface elements to the electronics, and cuts holes in the housing to allow the mechanisms to pass through the housing.

If the mechanisms for two or more widgets collide inside the housing, the algorithm detects and reports, but does not resolve, the collision. Collisions between internal mechanisms are an example of how building PID interfaces with atoms is different than building GUI interfaces with pixels.
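A coarse version of such a collision check can be sketched as follows (illustrative only; Morphi's detection may work differently): each mechanism is approximated as a thick straight rod, and a pair of rods is reported whenever they come closer than the sum of their radii.

# Sketch of a coarse collision check between internal mechanisms, each
# approximated as a capsule (a straight rod with a radius). Rods are sampled
# along their axes and flagged when any two samples are too close.
import math

def sample_segment(p, q, n=32):
    return [tuple(pi + (qi - pi) * i / (n - 1) for pi, qi in zip(p, q))
            for i in range(n)]

def distance(a, b):
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def report_collisions(rods):
    """rods: list of (name, start_point, end_point, radius)."""
    collisions = []
    for i in range(len(rods)):
        for j in range(i + 1, len(rods)):
            ni, pi, qi, ri = rods[i]
            nj, pj, qj, rj = rods[j]
            closest = min(distance(a, b)
                          for a in sample_segment(pi, qi)
                          for b in sample_segment(pj, qj))
            if closest < ri + rj:
                collisions.append((ni, nj, closest))
    return collisions   # reported to the designer, never resolved automatically

rods = [("button_A", (0, 0, 40), (0, 0, 5), 2.0),
        ("knob_B",   (-3, -10, 40), (3, 10, 5), 2.0)]
print(report_collisions(rods))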

Connecting mechanisms are not printed in place and require assembly. To aid in assembly, the algorithm adds raised 3D-printed labels to the different mechanisms and then generates a text document pairing each widget with the right labeled mechanisms.
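A minimal sketch of generating such a pairing document follows; the label scheme and document wording are hypothetical, not Morphi's output format.

# Sketch: generate the plain-text assembly guide that pairs each widget with
# its labeled, printed internal mechanism parts.
def assembly_guide(widgets):
    """widgets: list of (widget_name, [part_labels])."""
    lines = ["Assembly guide: match the raised label on each printed part",
             "to the widget listed below.", ""]
    for name, labels in widgets:
        lines.append("%s: %s" % (name, ", ".join(labels)))
    return "\n".join(lines)

print(assembly_guide([
    ("volume knob",  ["K1-rod", "K1-dial"]),
    ("power button", ["B1-rod", "B1-collar", "B1-bracket"]),
]))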

Panels
Unlike moving GUI widgets on a 2D screen, which requires only recoloring pixels, moving widgets on a PID involves moving atoms. Placing widgets in panels contains the widgets in different modules, which reduces the amount of housing that needs to be reprinted when a widget is changed.

It is up to the designer to split the housing into panels based on their plans for the interface.

Figure 4: Three versions of an interface for a Bluetooth speaker realized using the faceplate panel strategy.

Figure 5: An alarm clock with a sports ball theme. The designer placed interactive surface elements in interchangeable panels, which facilitates rapid exploration of interface layouts.

As a starting point, we explored three strategies for creating these panels: faceplates, facets, and self-contained panels.

A faceplate panel contains all of the widgets for the interface in one piece. Facet panels partition the interface surface area into different panels that can be replaced individually. The self-contained approach places the interface and the electronics into a single removable component which is replaced when the interface changes. Each kind of panel is represented in one of the three example applications described next.

Example Applications
Two undergraduate design students used Morphi to construct three iterations of interfaces for each of two PIDs and four interfaces for a third device. In some cases, electronics had to be moved in order to avoid internal mechanism collisions. The example applications were a Bluetooth speaker, an alarm clock, and a volume controller mounted on a car gear shifter.

Parts for the examples were printed using a Stratasys Objet260 Connex3 multi-material 3D printer and programmed using a RedBearLab Blend Micro Arduino processor connected via Bluetooth to a desktop PC. Morphi was implemented using the Rhinoceros 5 C# developer tools.

Figure 6: The self-contained interface panel used in the shifter (see Figure 7). The blue frame holds the circuitry, and the grey rods left of center are part of the mechanism that connects the interactive surface elements to the electronics.

The Bluetooth speaker, shown in Figures 2 and 4, is an example of the faceplate panel strategy. The alarm clock uses the facet panel strategy and is shown in Figures 3 and 5. Finally, the volume controller for a car radio uses the self-contained approach to modularizing components and is shown in Figures 6 and 7.

Figure 7: A music controller intended to be mounted on a gear shifter in a car with a manual transmission, with two variations on the interface included on the left.

Discussion
We have built Morphi to explore modularity for reducing the effort needed to change the position, size or shape of a PID widget. Much work remains to make moving a widget in a PID as easy as moving a button in a GUI.

We have assumed that a 3D printed object can be modified only by printing and using new pieces. Recent work in reshaping 3D printed objects by removing and adding material [5, 9] may eventually support reshaping interactive surface elements and PID housings.

Also, we have not explored how the modular process in Morphi can be integrated with a physically-based sculpting process. For example, it may make sense to sculpt the PID housing as in [4, 7] but to place the widgets in Morphi. In this process, a designer can explore functional versions of several interface layouts defined in software.

Acknowledgements
This work is funded by NSF Grant IIS-1406578.

References
[1] Avrahami, D., and Hudson, S. E. Forming interactivity: A tool for rapid prototyping of physical interactive products. In Proceedings of the 4th Conference on Designing Interactive Systems: Processes, Practices, Methods, and Techniques, DIS '02, ACM (New York, NY, USA, 2002), 141–146.
[2] Greenberg, S., and Fitchett, C. Phidgets: Easy development of physical interfaces through physical widgets. In Proceedings of the 14th Annual ACM Symposium on User Interface Software and Technology, UIST '01, ACM (New York, NY, USA, 2001), 209–218.
[3] Hudson, S. E., and Mankoff, J. Rapid construction of functioning physical interfaces from cardboard, thumbtacks, tin foil and masking tape. In Proceedings of the 19th Annual ACM Symposium on User Interface Software and Technology, UIST '06, ACM (New York, NY, USA, 2006), 289–298.
[4] Jones, M. D., Seppi, K., and Olsen, D. R. What you sculpt is what you get: Modeling physical interactive devices with clay and 3d printed widgets. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, CHI '16, ACM (New York, NY, USA, 2016), 876–886.
[5] Peng, H., Wu, R., Marschner, S., and Guimbretiere, F. On-the-fly print: Incremental printing while modelling. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, CHI '16, ACM (New York, NY, USA, 2016), 887–896.
[6] Savage, V., Chang, C., and Hartmann, B. Sauron: Embedded single-camera sensing of printed physical user interfaces. In Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology, UIST '13, ACM (New York, NY, USA, 2013), 447–456.
[7] Savage, V., Follmer, S., Li, J., and Hartmann, B. Makers' marks: Physical markup for designing and fabricating functional objects. In Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology, UIST '15, ACM (New York, NY, USA, 2015), 103–108.
[8] Savage, V., Head, A., Hartmann, B., Goldman, D. B., Mysore, G., and Li, W. Lamello: Passive acoustic sensing for tangible input components. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, CHI '15, ACM (New York, NY, USA, 2015), 1277–1280.
[9] Teibrich, A., Mueller, S., Guimbretiere, F., Kovacs, R., Neubert, S., and Baudisch, P. Patching physical objects. In Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology, UIST '15, ACM (New York, NY, USA, 2015), 83–91.
[10] Vazquez, M., Brockmeyer, E., Desai, R., Harrison, C., and Hudson, S. E. 3d printing pneumatic device controls with variable activation force capabilities. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, CHI '15, ACM (New York, NY, USA, 2015), 1295–1304.
