The House of Olbrich - An Augmented Reality Tour through Architectural History

Jens Keil∗  Michael Zöllner†  Mario Becker‡  Folker Wientapper§  Timo Engelke¶  Harald Wuest‖

Fraunhofer IGD, Darmstadt, Germany

Figure 1: Restoring the past: Augmented Reality overlay superimposed on House of Olbrich

ABSTRACT

With "House of Olbrich" we present an iPhone Augmented Reality (AR) app that visualizes the compelling history of Darmstadt's unique Jugendstil (Art Nouveau) quarter with video see-through Augmented Reality. We propose methods for enabling high-performance computer vision algorithms to deploy sophisticated AR visuals on current-generation smartphones by outsourcing resource-intensive tasks to the cloud. This allows us to apply 3D feature recognition methods even in complex outdoor tracking situations, where lighting conditions change and tracked objects are often occluded. By taking a snapshot of the building, the user learns about the architect, design and history of the building. Historical media, like old photographs and blueprints, are superimposed on the building's front, depicting the eventful history of the famous House of Olbrich, which was destroyed during World War II and has only been rudimentarily restored.

Augmented Reality technology allows tourists to jump back in time visually using their smartphones: mixing realities enhances the user's experience and draws their attention to the impressive historical architecture of the Art Nouveau. In addition, we simplify interaction by augmenting snapshots: tourists can view and read information in a relaxed posture without having to hold up their mobiles all the time.

∗e-mail: [email protected]  †e-mail: [email protected]  ‡e-mail: [email protected]  §e-mail: [email protected]  ¶e-mail: [email protected]  ‖e-mail: [email protected]

Index Terms: K.3.1 [Computing Milieux]: Computers and Education—Computer Uses in Education; H.4.m [Information Systems Applications]: Miscellaneous

1 INTRODUCTION

With the House of Olbrich, we present a mobile Augmented Reality history and architecture guide for tourism at cultural heritage sites. It focuses on Darmstadt's Mathildenhoehe - the famous Artists' Colony - with the leading architect's residence, the Josef Maria Olbrich House, as its main example. The mobile application (app) uses AR technology to explain history and architecture visually at the real building in an outdoor environment. With recent smartphone generations, AR is finally becoming a mainstream medium, ready for mass markets without the need for special or custom-crafted hardware. But the current hype around mobile AR has also raised high expectations regarding user experience, visual quality, and 3D computer vision tracking that works in every situation and not only on posters and markers in controlled environments. Thus, high-quality AR for end users is still a challenge even after years of research. Hence, we present a robust and maintainable platform for advanced AR visuals on smartphones in 3D environments. Our goal is to offer a robust AR application for iPhone users with sophisticated AR visuals and a great user experience.

Established, so-called AR browsers like Wikitude [13], Layar [8] or acrossair [1] started to present AR to a broad audience in 2009, when they enabled simplified inertial-sensor-based AR, especially on the iPhone 3GS, which featured the necessary technical capabilities. However, these systems offer only simplified AR views, depicting location-based information as small icons or bubble overlays and relying on often inaccurate GPS, compass and accelerometer data instead of applying computer vision (CV) techniques.

In contrast, our visualization is based on working with original and historical material - pictures, drawings and blueprints - as overlays in order to digitally reconstruct the building's facade and shape in a stylized manner, hence employing AR views beyond mere icons and text overlays. Instead of 3D mesh reconstructions, which often suffer from poor visual quality or are too detailed to run on mobiles, our approach works with original and historical material. This not only saves the cost and time of an otherwise intensive modeling process; it also provides historically correct representations and gives users access to a wealth of existing historical media that is often not published or presented.

Our concept of Snapshot Augmented Reality [16], which outsources processing-intensive tracking techniques to cloud services, has been modified and extended with 3D feature tracking methods. Thereby we establish a robust and reliable outdoor 3D feature tracking approach that is able to overcome, e.g., changing lighting and weather conditions. Such an approach could not run on today's mobile phones directly, due to the limitations of processing power, battery life, sensor accuracy (GPS, compass) and heterogeneous software platforms.

2 RELATED WORK

Exploring history in compelling ways with augmented reality has always been a challenging area - particularly outdoors. Early scientific projects in particular proposed their own hardware setups with custom-made solutions (e.g. [11], [5] or [15]), because off-the-shelf hardware was not available. Although recent smartphones are well prepared for AR technology (being equipped with components such as cameras and inertial sensors), robust markerless tracking solutions for outdoor situations remain critical due to still-limited hardware power.

Mobile AR systems have a recent history (see [10]), and so does their usage for heritage applications. Archeoguide [11] proposed a system for AR-enhanced exploration of heritage sites. The system visualized missing and reconstructed artifacts and damaged areas of the ancient Greek site of Olympia and overlaid information about ancient Olympic sports events. Users walked around freely, equipped with a Head-Mounted Display (HMD) and a (heavy) portable computer in a backpack. The project resulted in a prototype that never saw real practical use, though. Because the technology was young, performance was very low, and vision-based tracking as well as real-time 3D graphics were at an early stage.

iTacitus [18] presented a framework for a mobile AR guide for heritage sites. The work already introduced concepts for using historical media as augmentation layers for historical sights. However, the system ran on a UMPC that (although more mobile than other setups) was still too heavy and too large to carry along, and the cost for mass usage was too high. iTacitus utilized 2D feature-tracking methods that also worked outdoors at predefined spots, but not very robustly, and users had to stick to predefined view angles.

The Westwood Experience project [14] introduced AR and other Mixed Reality techniques for site-based historical and fictional storytelling. At certain spots users could use Nokia's N900 to get AR views of a building, to receive additional location-based information or to listen to audio narrations. The evaluation showed that AR helped to enhance the users' immersion in historical facts and the feeling of being part of the story. But users disliked the linear pathways and found the fictional story boring and distracting from the history.

3 THE MOBILE APPLICATION

3.1 Application and UI Design

Heritage sites change over time. Often, only ruins or, at worst, nothing at all is left of a historic place. Architect Olbrich crafted his house around 1901 as part of the unique Artists' Colony, Darmstadt's Jugendstil (Art Nouveau) quarter. The house was and is famous for its artistic Art Nouveau elements. However, having been destroyed and rebuilt, today's remaining Art Nouveau features include only a bluish fascia of glazed tiles surrounding the building's front. The historic half-hipped roof, the stylized overhang and the former entrance are only rudimentarily present.

Our application is centered around exposing these architectural changes. The central objective is to visually restore the building's historic appearance with Snapshot Augmented Reality. Based on the architect's drawings, a simplified 3D model of the building was reconstructed. The model is not only used during the server-based tracking process for the 3D feature matching, but also for the visualization of the historic building in AR.

Figure 2: The snapshot process: The user points his smartphone towards the building (left). The received augmented picture can be viewed directly inside the app; interactive yellow hotspots lead to more detailed information (right).

Starting from the app's main menu, the user can choose between two different tours: the augmented reality tour and a classic information tour. Selecting the AR function from the menu opens the AR photography tour. The user sees the live camera image and points his smartphone towards the building. As soon as he has found the desired view, he takes a photo by touching the shutter release button, which starts the AR process. The picture is sent to the server for tracking. After 2 to 3 seconds, the smartphone receives the augmented result from the server. The augmented picture reveals how the building looked before it was destroyed (shown in Figure 2). The different elements that have changed or are missing nowadays are emphasized by interactive yellow hotspots. Touching a hotspot opens a corresponding info panel with detailed information and historical media.
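
The following is a minimal sketch, in the front-end's own JavaScript/jQuery, of how such a server response could be turned into tappable hotspots on top of the returned snapshot. The JSON field names, element IDs and the showInfoPanel helper are illustrative assumptions, not the actual protocol of the app.

    // Sketch: place tappable hotspot markers over the augmented snapshot.
    // Assumed response format: { image: "<url or data URI>",
    //   hotspots: [{ id: "roof", x: 120, y: 80, title: "Half-hipped roof" }, ...] }
    function showAugmentedResult(result) {
      var $view = $('#ar-result').empty().css('position', 'relative');

      // Show the augmented picture returned by the server.
      $('<img>', { src: result.image, alt: 'Augmented snapshot' }).appendTo($view);

      // Overlay one absolutely positioned, touchable marker per hotspot.
      $.each(result.hotspots, function (i, spot) {
        $('<div class="hotspot"></div>')
          .css({ position: 'absolute', left: spot.x + 'px', top: spot.y + 'px' })
          .bind('click', function () { showInfoPanel(spot.id); })
          .appendTo($view);
      });
    }

    // Hypothetical helper: reveal the pre-authored HTML info panel for a hotspot.
    function showInfoPanel(id) {
      $('.info-panel').hide();
      $('#info-' + id).show();
    }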

Hence, information is delivered in two steps: first, purely visually through the AR overlay with a 3D reconstruction of the historic building, and in a second step with more detailed textual information and featured media content. With a back button the user returns to the augmented image to select another hotspot, to take a new picture or to return to the main menu.

Instead of generating a semi-realistic model, we decided to keep the look and feel of the architect's drawings. On the one hand, early tests showed that the model was perceived as not fitting in unless it was rendered truly photorealistically, accounting for the changing ambient and lighting conditions. On the other hand, by keeping it stylized like a sketch, it is much easier to grasp the residence's different facade features than it would be with a more photorealistic reconstruction. This is also very close to the drawings one knows from an instruction guide or manual, where drawings are used to communicate the essential parts of a product or machine. A sketch-like appearance also clarifies that the reconstruction is an approximation and an interpretation.

Our application's information architecture and structure is quite linear and was kept as simple as possible. Besides the AR tour, the classical tour features additional textual background information on the architect and offers a 3D model view. Users can interact with the model directly with their fingers, e.g. rotating it around its center by touch. Taking a closer look by zooming in or out is possible with the two-finger pinch gesture. Similar to the AR view, interactive hotspots mark interesting areas, and touching them opens the panel with detailed information.
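
As a sketch of this interaction, the following JavaScript maps raw touch events to a rotation and a scale that are applied to the model view via a WebKit CSS transform, as the front-end could do it; the "#model" element ID, the sensitivity factor and the scale limits are assumed values.

    // Sketch: one-finger drag rotates the model, two-finger pinch scales it.
    // "#model" is a hypothetical container rendered with WebKit 3D CSS.
    var rotation = 0, scale = 1, lastX = 0, lastDist = 0;
    var model = document.getElementById('model');

    function applyTransform() {
      model.style.webkitTransform =
        'rotateY(' + rotation + 'deg) scale3d(' + scale + ',' + scale + ',' + scale + ')';
    }

    function touchDistance(t) {       // distance between the first two touch points
      var dx = t[0].pageX - t[1].pageX, dy = t[0].pageY - t[1].pageY;
      return Math.sqrt(dx * dx + dy * dy);
    }

    model.addEventListener('touchstart', function (e) {
      if (e.touches.length === 1) lastX = e.touches[0].pageX;
      if (e.touches.length === 2) lastDist = touchDistance(e.touches);
    }, false);

    model.addEventListener('touchmove', function (e) {
      e.preventDefault();                      // keep the page from scrolling
      if (e.touches.length === 1) {            // drag: rotate around the center
        rotation += (e.touches[0].pageX - lastX) * 0.5;
        lastX = e.touches[0].pageX;
      } else if (e.touches.length === 2) {     // pinch: zoom in or out
        var d = touchDistance(e.touches);
        scale = Math.max(0.5, Math.min(4, scale * d / lastDist));
        lastDist = d;
      }
      applyTransform();
    }, false);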

Even the cloud-based tracking is not fail-safe. The standalone 3D model therefore also acts as a fallback, showing the building from the user's current position as approximated via GPS and compass readings. Thereby the 3D model view is turned into a "locative VR" once a tracking failure occurs: the model reacts to user movements and changes its orientation and scale according to the user's position, so that it matches the real building the user sees while walking around. Hence the real and the virtual building stay aligned and the latter presents the same facade. Although the user leaves the AR view, the 3D model presents the same information, only without the video see-through effect.
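
A minimal sketch of this "locative VR" orientation step is given below: the bearing from the user's GPS position to the building is combined with the compass heading to set the model's yaw. The building coordinates are only approximate illustration values, and the sign convention depends on how the model is authored, so this is not the app's actual implementation.

    // Sketch of the "locative VR" fallback: orient the 3D model from GPS and
    // compass readings when visual tracking fails.
    var BUILDING = { lat: 49.8779, lon: 8.6655 };   // approximate, for illustration only

    function toRad(d) { return d * Math.PI / 180; }

    // Initial bearing from the user's position to the building (degrees from north).
    function bearingTo(lat, lon) {
      var dLon = toRad(BUILDING.lon - lon);
      var y = Math.sin(dLon) * Math.cos(toRad(BUILDING.lat));
      var x = Math.cos(toRad(lat)) * Math.sin(toRad(BUILDING.lat)) -
              Math.sin(toRad(lat)) * Math.cos(toRad(BUILDING.lat)) * Math.cos(dLon);
      return (Math.atan2(y, x) * 180 / Math.PI + 360) % 360;
    }

    // Called whenever new GPS/compass readings arrive from the native wrapper.
    function updateLocativeView(lat, lon, heading) {
      var model = document.getElementById('model');      // hypothetical element
      // Turn the model so the facade the user is currently facing is shown.
      var yaw = bearingTo(lat, lon) - heading;
      model.style.webkitTransform = 'rotateY(' + yaw + 'deg)';
    }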

The system also uses the user's GPS position to check whether he is close to the building or too far away. If the user is too far away, the AR feature is not disabled; the system rather states that tracking might not work correctly. By not disabling the function, users can still use the service and, for example, superimpose printed pictures of the building, which will be processed even without standing in front of the real one.
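
A small sketch of such a proximity check follows; the 150 m threshold and the warning text are assumed values, and BUILDING refers to the constant introduced in the previous sketch.

    // Sketch: warn, but do not disable AR, if the user is far from the building.
    var MAX_DISTANCE_M = 150;    // assumed threshold

    function distanceMeters(lat1, lon1, lat2, lon2) {
      var R = 6371000, toRad = function (d) { return d * Math.PI / 180; };
      var dLat = toRad(lat2 - lat1), dLon = toRad(lon2 - lon1);
      var a = Math.sin(dLat / 2) * Math.sin(dLat / 2) +
              Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) *
              Math.sin(dLon / 2) * Math.sin(dLon / 2);
      return 2 * R * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a));
    }

    function checkProximity(userLat, userLon) {
      var d = distanceMeters(userLat, userLon, BUILDING.lat, BUILDING.lon);
      if (d > MAX_DISTANCE_M) {
        // AR stays enabled; the user is merely told that tracking may fail,
        // e.g. when photographing a printed picture of the building instead.
        alert('You seem to be far from the House of Olbrich - tracking may not work.');
      }
    }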

Figure 3: Virtual 3D model of the building based on blueprints (left). The information panel with text and historical footage (right).

3.2 Technical Structure and Architecture

A multitude of different platforms and operating systems for mobile devices and smartphones has evolved during the last years, which also increases the workload of developing and deploying an application for each of them. Therefore, we decided to use a wrapper concept for our application, similar to apps built with PhoneGap [9]. Tools like this run so-called web apps that are written in HTML and JavaScript, but with the look and feel of native ones and with access to the majority of device functions and (hardware) components. Likewise, our client application consists of two parts: the natively written wrapper application and the front-end, which is implemented in HTML5, CSS and JavaScript. It employs the jQuery [7] and jQTouch [6] JavaScript libraries to appear as a native application in terms of layout and style. Our work was tested and is running on an iPhone 4. The native app can access hardware components and take advantage of the iPhone's operating system API. A JavaScript interface connects the native app with the HTML front-end and gives the latter access to the camera and sensor readings.
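
The paper does not detail how this JavaScript interface is realized; a common iOS pattern of the time routes calls through a custom URL scheme intercepted by the native web view, with results injected back as a JavaScript call. The sketch below shows only the JavaScript side of such a pattern; the scheme name, methods and callback convention are all hypothetical.

    // Sketch of a possible JavaScript <-> native bridge (illustrative only).
    var NativeBridge = {
      callbacks: {},
      nextId: 0,

      // Ask the wrapper for a device function, e.g. 'takeSnapshot' or 'getLocation'.
      call: function (method, params, callback) {
        var id = 'cb' + (this.nextId++);
        this.callbacks[id] = callback;
        // Loading an app-specific URL is intercepted by the native web view.
        window.location = 'olbrich://' + method + '?id=' + id + '&' + $.param(params || {});
      },

      // The native side injects: NativeBridge.receive('cb0', {...});
      receive: function (id, result) {
        if (this.callbacks[id]) { this.callbacks[id](result); delete this.callbacks[id]; }
      }
    };

    // Example: request the current GPS reading from the wrapper.
    NativeBridge.call('getLocation', {}, function (pos) {
      checkProximity(pos.lat, pos.lon);
    });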

With our proposed client structure we can easily add other buildings or set up other applications without the need to recompile or change the native implementation. The wrapper app consists of several layers, starting with a camera view that presents the camera's video image in the background. It is followed by a web-view layer that uses the iPhone's WebKit render engine to display and process the front-end's HTML files. By employing WebKit's 3D CSS we are also able to render the 3D model view (cf. Figure 3, left) without the need for an extra 3D render engine.
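
To illustrate the idea of rendering a model with WebKit's 3D CSS instead of a dedicated 3D engine, the sketch below assembles a simplified "facade box" from positioned div planes textured with drawings; sizes, image paths and the container handling are illustrative assumptions, not the app's actual model description.

    // Sketch: a simplified facade box built from <div> planes and shown with
    // WebKit 3D CSS, so no extra 3D render engine is needed.
    function buildFacadeBox(container, width, depth) {
      container.style.webkitPerspective = '800px';

      var stage = document.createElement('div');
      stage.style.webkitTransformStyle = 'preserve-3d';
      stage.style.webkitTransform = 'rotateY(20deg)';
      container.appendChild(stage);

      // Four vertical planes textured with the architect's drawings.
      var faces = [
        { img: 'front.png', t: 'translateZ(' + (depth / 2) + 'px)' },
        { img: 'back.png',  t: 'rotateY(180deg) translateZ(' + (depth / 2) + 'px)' },
        { img: 'left.png',  t: 'rotateY(-90deg) translateZ(' + (width / 2) + 'px)' },
        { img: 'right.png', t: 'rotateY(90deg) translateZ(' + (width / 2) + 'px)' }
      ];
      faces.forEach(function (f) {
        var el = document.createElement('div');
        el.style.position = 'absolute';
        el.style.width = width + 'px';
        el.style.height = '300px';
        el.style.background = 'url(' + f.img + ') no-repeat';
        el.style.webkitTransform = f.t;
        stage.appendChild(el);
      });
    }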

We have also implemented a layer with a lightweight version of our InstantVision system [2]. Whenever the user takes a picture, it grabs the snapshot image and sends it to the server, packed with the phone's current GPS position, a timestamp, the phone's heading and acceleration data. Although we are currently not employing CV methods in our mobile application on the phone directly, the lightweight InstantVision system will enable this in future work.
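
The sketch below only illustrates the payload of this upload as it could look if issued from the front-end's JavaScript; in the app the native InstantVision layer performs this step, and the endpoint URL, field names and the fallback helper are assumptions.

    // Sketch of the snapshot upload: photo plus GPS, timestamp, heading and
    // acceleration data, posted to the App Engine front-end.
    function sendSnapshot(jpegBase64, sensors) {
      $.ajax({
        url: 'https://example-appengine-frontend.appspot.com/track',  // placeholder URL
        type: 'POST',
        data: {
          image: jpegBase64,             // camera snapshot, base64-encoded JPEG
          lat: sensors.lat,              // GPS position
          lon: sensors.lon,
          timestamp: sensors.timestamp,  // time the picture was taken
          heading: sensors.heading,      // compass heading in degrees
          accel: sensors.accel.join(',') // acceleration vector x,y,z
        },
        dataType: 'json',                // the App Engine converts XML to JSON for us
        success: function (result) {
          showAugmentedResult(result);   // see the hotspot sketch in Section 3.1
        },
        error: function () {
          showLocativeFallback();        // hypothetical: switch to the 3D model view
        }
      });
    }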

4 SERVER ARCHITECTURE

Figure 4: Server architecture - iPhone clients connect via client ports to the Google App Engine front-end, which queues requests and relays the tracking data to one or more InstantVision AR engine servers holding the 3D model, POIs and reference images.

The architecture of our cloud back end is outlined in Figure 4. It consists of the Google App Engine, the client - usually the user's mobile device - and, hidden behind the curtain, one or more AR engine servers.

The application on the client device comes with HTML-based content that can be used without a network connection; it already provides functions such as browsing information about the scene, viewing images, etc. The AR functionality becomes available with a network connection. The user can take an image of the scene, which is sent to the App Engine and from there to one of the AR engines. The AR engine processes the image, adds 3D augmentations to it and sends the result back to the App Engine, which passes it on to the client. The client application can display the result and add other content to it.

4.1 App Engine

The Google App Engine [4] is used as the main application server of our system. The clients connect to this service, and the data is relayed to the AR Engine for processing. Using the App Engine allows us to hide all technical components of the system from the outside world, and we are able to alter everything behind it to fit our needs. Furthermore, the App Engine will allow us to delegate requests from clients to different servers depending on the client's application in the future. To support high demand we can instantiate multiple AR engines. These can process data from different clients independently and thus reduce response times for the clients.

The data sent around is simply packed into an HTTP POST request with attached form data: the image taken by the user plus some numerical data, such as GPS or compass readings, sent as text. For technical reasons the results from the AR engine are XML-encoded strings, while the iPhone app prefers the JSON format, so the App Engine also does the data conversion for us.

4.2 AR Engine

This component is responsible for augmenting images with a 3D model of the scene. We use our InstantVision software package [2], which provides a set of tracking methods that can be combined into an optimized object recognition and tracking solution.

In this application we use a structure-from-motion approach to record the target scene. As a result we get a 3D feature point cloud, which also contains image patches around the tracked features. This point cloud is aligned with the 3D model of the scene to transform it into the model coordinate system. We then use the image patches to generate a randomized tree that can be used for initialization. The whole process is described in detail in [12]. We also use a subset of the recorded images to generate reference images, which can be used as an alternative initialization method.

To improve reliability, the scene is recorded under different lighting conditions and with several tracking methods. The tracking system tries to initialize by cascading through the reconstructed point clouds, randomized trees and reference images until it finds a valid pose. After successful initialization, a few tracking iterations with KLT are performed to increase precision. This is a bit brute-force, but usually one of the models will work, and response time is a less critical issue in this single-image recognition procedure. We found the method to be quite robust against changes in lighting conditions, which allows us to do the outdoor tracking.
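
The control flow of this cascade can be summarized as in the sketch below. It only expresses the order of attempts described above; all objects and method names (pointClouds, randomizedTrees, referenceImages, klt, estimatePose, refine) are placeholders for InstantVision components, not its actual API.

    // Control-flow sketch of the initialization cascade (placeholder API).
    function initializePose(image, models) {
      var candidates = []
        .concat(models.pointClouds)       // scene recordings under different lighting
        .concat(models.randomizedTrees)   // patch classifiers for wide-baseline matching
        .concat(models.referenceImages);  // keyframe-based initialization as fallback

      for (var i = 0; i < candidates.length; i++) {
        var pose = candidates[i].estimatePose(image);   // hypothetical call
        if (pose && pose.valid) {
          // Refine the rough pose with a few KLT tracking iterations.
          for (var k = 0; k < 3; k++) {
            pose = models.klt.refine(image, pose);      // hypothetical call
          }
          return pose;                                  // first valid pose wins
        }
      }
      return null;   // no model matched; report a tracking failure to the client
    }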

The AR Engine also processes a list of "hotspot points". These points have been selected on the 3D model of the scene; they are projected into the image, checked for visibility and sent back to the application, which uses them to show the touchable hotspots on top of the augmented image.
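
A minimal sketch of this projection step follows, using a pinhole camera model and checking only whether the projected point falls inside the image (self-occlusion against the 3D model is not shown). The matrix layout and parameter names are assumptions.

    // Sketch: project 3D hotspot points into the snapshot with a pinhole model.
    // pose is a 3x4 camera pose [R|t] (world -> camera), K the 3x3 intrinsics;
    // both are given row-major as flat arrays.
    function projectHotspots(hotspots3d, pose, K, imgW, imgH) {
      var visible = [];
      hotspots3d.forEach(function (p) {
        // Transform the world point into camera coordinates: Xc = R * X + t.
        var xc = pose[0] * p.x + pose[1] * p.y + pose[2]  * p.z + pose[3];
        var yc = pose[4] * p.x + pose[5] * p.y + pose[6]  * p.z + pose[7];
        var zc = pose[8] * p.x + pose[9] * p.y + pose[10] * p.z + pose[11];
        if (zc <= 0) return;                     // behind the camera

        // Apply the intrinsics and dehomogenize to pixel coordinates.
        var u = (K[0] * xc + K[1] * yc + K[2] * zc) / zc;
        var v = (K[3] * xc + K[4] * yc + K[5] * zc) / zc;

        if (u >= 0 && u < imgW && v >= 0 && v < imgH) {
          visible.push({ id: p.id, x: Math.round(u), y: Math.round(v) });
        }
      });
      return visible;   // sent back to the client and rendered as touchable markers
    }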

5 FUTURE WORK

As a next step, we would like to add further buildings from Darmstadt's Art Nouveau quarter in order to present a complete guide that conveys information on every historically important building in the same manner. We would also like to add personalization features. The user should be able to save the augmented photos on his smartphone, and we have already thought about generating a KML file for the Google Maps and Google Earth applications. By doing so, the user would not only have the geo-located pictures but also a path trace including his waypoints around the area, where the map markers would link to the captured AR photos. We also intend to expand our application to handle multiple scenes. Ideally, the user should be able to walk around the whole city and be guided to other historically interesting spots. In this case we could use GPS data from the client to identify the scene of interest and use multiple server-based instances of the AR engine to process the data.
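
Since this KML export is only planned, the following is just a sketch of what such a file could look like for geo-tagged AR photos; the photo objects and their fields are assumptions, while the KML elements used are standard.

    // Sketch: build a minimal KML document from geo-tagged AR snapshots so they
    // can be opened in Google Earth or Google Maps.
    function buildKml(photos) {
      var placemarks = photos.map(function (p) {
        return '  <Placemark>\n' +
               '    <name>' + p.title + '</name>\n' +
               '    <description><![CDATA[<img src="' + p.url + '"/>]]></description>\n' +
               '    <Point><coordinates>' + p.lon + ',' + p.lat + ',0</coordinates></Point>\n' +
               '  </Placemark>';
      }).join('\n');

      return '<?xml version="1.0" encoding="UTF-8"?>\n' +
             '<kml xmlns="http://www.opengis.net/kml/2.2">\n' +
             '<Document>\n' + placemarks + '\n</Document>\n</kml>';
    }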

With the lightweight vision system on the mobile client, we are also planning to manipulate the video image by applying our Reality Filtering concept [17] and to experiment with stylization (see e.g. [3]) in general, which will further increase the visual quality of the AR views. This goes along with the intention to test and apply video streaming techniques instead of processing only single snapshots. We would also like to move the tracking, or at least parts of the frame-to-frame tracking, onto the mobile device.

6 CONCLUSION

In this paper, we have presented an iPhone Augmented Reality (AR) app that visualizes the compelling history of the House of Olbrich, which is part of Darmstadt's unique Jugendstil (Art Nouveau) quarter. By employing video see-through Augmented Reality, tourists jump back in time visually with their smartphones. AR views present the building - destroyed during World War II and only partly restored nowadays - in its original historical appearance. We illustrated the concept of Snapshot AR, where tourists take a photo of the residence to learn about the architect, the design and the history of the building. Historical media, like old photographs and blueprints, are superimposed on the building's front, depicting its eventful history in a sketch-like manner.

We proposed methods for enabling high-performance computer vision algorithms on current-generation smartphones by outsourcing resource-intensive tasks to the cloud. By doing so, we could present a reliable and robust AR application that applies 3D feature recognition methods even in complex outdoor tracking situations, where lighting conditions change and tracked objects are often occluded.

REFERENCES

[1] acrossair. http://www.acrossair.com.
[2] M. Becker, G. Bleser, A. Pagani, D. Stricker, and H. Wuest. An architecture for prototyping and application development of visual tracking systems. 2007.
[3] J. Fischer and D. Bartz. Stylized augmented reality for improved immersion. In Proceedings of the 2005 IEEE Conference on Virtual Reality, VR '05. IEEE Computer Society, 2005.
[4] Google App Engine. http://code.google.com/appengine/.
[5] T. Höllerer, S. Feiner, T. Terauchi, G. Rashid, and D. Hallaway. Exploring MARS: Developing indoor and outdoor user interfaces to a mobile augmented reality system. Computers and Graphics, 1999.
[6] jQTouch. http://jqtouch.com/.
[7] jQuery. http://jquery.com/.
[8] Layar. http://www.layar.com.
[9] PhoneGap. http://www.phonegap.com/.
[10] D. Schmalstieg and D. Wagner. Experiences with handheld augmented reality. In Mixed and Augmented Reality, 2007. ISMAR 2007. 6th IEEE and ACM International Symposium on, pages 3-18, Nov. 2007.
[11] V. Vlahakis, J. Karigiannis, and N. Ioannidis. Augmented reality touring of archaeological sites with the Archeoguide system. 2003.
[12] F. Wientapper, H. Wuest, and A. Kuijper. Composing the feature map retrieval process for robust and ready-to-use monocular tracking. Computers & Graphics, in press, 2011.
[13] Wikitude. http://www.wikitude.com.
[14] J. Wither, R. Allen, V. Samanta, J. Hemanus, Y.-T. Tsai, R. Azuma, W. Carter, R. Hinman, and T. Korah. The Westwood Experience: Connecting story to locations via mixed reality. In Mixed and Augmented Reality - Arts, Media, and Humanities (ISMAR-AMH), 2010 IEEE International Symposium on, pages 39-46, Oct. 2010.
[15] J. Wither, S. DiVerdi, and T. Höllerer. Using aerial photographs for improved mobile AR annotation. In International Symposium on Mixed and Augmented Reality, pages 159-162, 2006.
[16] M. Zöllner, M. Becker, and J. Keil. Snapshot augmented reality - augmented photography. 2010.
[17] M. Zöllner, A. Pagani, Y. Pastarmov, H. Wuest, and D. Stricker. Reality Filtering: A visual time machine in augmented reality. 2008.
[18] M. Zöllner, D. Stricker, G. Bleser, and Y. Pastarmov. iTACITUS - novel interaction and tracking paradigms for mobile AR. In Proceedings of the 7th International Symposium on Virtual Reality, Archaeology and Cultural Heritage (VAST), VAST '07, pages 110-117, 2007.
