

CR-PLAY Project no. 661089 Deliverable 4.5 Final Prototypes for mixed pipeline


CR-PLAY “Capture-Reconstruct-Play: an innovative mixed pipeline for

videogames development”

Grant Agreement ICT-611089-CR-PLAY

Start Date 01/11/2013

End Date 31/10/2016

Deliverable 4.5 Final implementation of mixed pipeline for

videogame development and report on technical validation


Document Information

Deliverable number: 4.5
Deliverable title: Final Prototypes of mixed pipeline for videogame development
Deliverable due date: 31/07/2016
Actual date of delivery: 31/07/2016
Main author(s): Ivan Orvieto, Matteo Romagnoli (TL)
Main contributor(s): George Drettakis, Sebastien Bonopera (INRIA), Gabriel Brostow (UCL)
Version: 1.0

Versions Information

Version  Date        Description of changes
0.1      01/07/2016  Structure and contributors
0.2      16/07/2016  Contributions integrated
0.3      23/07/2016  Pre-final version for internal review
1.0      31/07/2016  Final version

Dissemination Level

PU  Public                                                                          X
PP  Restricted to other programme participants (including the Commission Services)
RE  Restricted to a group specified by the consortium (including the Commission Services)
CO  Confidential, only for members of the consortium (including the Commission Services)

Deliverable Nature

R  Report
P  Prototype                                                                        X
D  Demonstrator
O  Other

CR-PLAY Project Information

The CR-PLAY project is funded by the European Commission, Directorate General for Communications Networks, Content and Technology, under the FP7-ICT programme. The CR-PLAY Consortium consists of:

No.  Participant Organisation Name                                      Short Name  Country
1    Testaluna S.R.L. (Coordinator)                                     TL          Italy
2    Institut National de Recherche en Informatique et en Automatique   INRIA       France
3    University College London                                          UCL         UK
4    Technische Universitaet Darmstadt                                  TUD         Germany
5    Miniclip UK Limited                                                MC          UK
6    University of Patras                                               UPAT        Greece
7    Cursor Oy                                                          CUR         Finland


Summary

This is the fifth and final deliverable of Work Package 4: “Design and Development (Requirements, Functional Specifications, and Prototypes)”. The work package is led by TL, with involvement of all other partners. Its objective is to gather end-user needs, turn them into functional specifications, and create the prototypes of the mixed pipeline for videogame development. This WP puts into practice the user-centred design approach adopted by CR-PLAY, ensuring that the technologies developed result in tools that are effective and usable for professional and semi-professional use.

This deliverable, “Final implementation of mixed pipeline for videogame development and report on technical validation”, describes the results of Task 4.5 “Final Prototype” and Task 4.6 “Technical Validation of Mixed pipeline for videogames development”. The document accompanies the Final prototype software package, explaining the main features developed, starting from the High-Fidelity prototype (D4.4).

The structure of this deliverable is as follows:

Section 1 describes the process of analysis-implementation-optimization-validation of the mobile porting of the technology.

Section 2 presents the integration of the features from WP2 - Image Based Rendering.

Section 3 draws conclusions and describes the status of the technology at M33.

The updated User Manual and Technical Manual are included as Annexes at the end of the document, as well as some examples of games that can be created with CR-PLAY technology (as presented in D4.4).


Table of Contents

Summary
Abbreviations and Acronyms
Introduction
1. Mobile Porting
   1.1 Analysis and Technical activities
   1.2 Optimization
   1.3 Performance measures
2. Image Based Rendering
   2.1 Removing Shadows and Relighting
   2.2 Multiview in-painting
3. Conclusion
References
Annex A – Final Prototype: User Manual 1.1
Annex B – Final Prototype: Technical manual
Annex C – Gameplay examples
Annex D – Improvements on Multi-View Intrinsic Images of Outdoors Scenes


Abbreviations and Acronyms

IBR: Image Based Rendering

IO: Input/Output

API: Application Program Interface

WP: Work Package

CPU: Central Processing Unit

SP: superpixel

POT: Power Of Two

NPOT: Non Power Of Two


Introduction

In the first two years of the project, we created an innovative and solid technology. Though still at the prototype stage, it has been used to create several game prototypes for the Windows platform. Among the various possibilities for improvement, the final objective of bringing the results to market (albeit after further research and development) guided us toward a very ambitious goal for the third year: allowing games created with CR-PLAY technology to run on mobile devices as well.

Consequently, starting from the results of Task 4.4 (which led to the release of the High-fidelity prototype at M22), work in Task 4.5 focused on two main goals: porting the technology to mobile and integrating features coming from WP2.

Based on the results of the High-fidelity prototype evaluation (MS14), the feedback coming from game developers was analysed and prioritized, resulting in the definition of the roadmap towards the creation of the Final prototype (MS12). The main goal was the delivery of the final version of the pipeline, based on Capture and Reconstruction, IBR and VBR, that allows game developers to create game prototypes deployable both on Windows desktops and mobile platforms.

To achieve that, a thorough technical analysis (Task 4.6) highlighted the weaknesses and bottlenecks of the High-fidelity prototype of the pipeline. This focused the interventions on memory-use reduction and performance improvements, both critical factors on mobile devices.

As a parallel activity, two main outcomes from WP2 have been integrated as well. In particular, Section 2 of this deliverable describes Removing Shadows and Relighting, and Multiview Inpainting.

Major improvements in rendering quality and processing speed for the reconstruction tool are also coming from WP1. Given the modularity of the CR-PLAY technology, they will be immediately available to game developers as soon as the reconstruction tool is updated.


1. Mobile Porting

Given the importance of the mobile market for game developers, restricting the deployment of games made with CR-PLAY technology to desktop platforms would have prevented an effective exploitation strategy for the project outcomes.

For this reason, during the third year of the project, porting to mobile has been one of the most natural objectives among the improvements implemented and integrated into the CR-PLAY Mixed Pipeline for videogame development in WP4.

In this section, we describe the process that enabled CR-PLAY technology to deploy scenes and games on mobile devices, the optimizations that allow them to run at a proper framerate, and measurements of the performance of the current version of the mobile port.

Android was chosen as the mobile target platform for the project. The rationale behind this choice is the availability of affordable high-performance devices, and a distribution system that makes it easy for project partners and game developers to run and share games created with the technology. Deployment on iOS devices is of course not prevented by this choice and, given the preparatory work already done, would require only a small development effort to enable.

1.1 Analysis and Technical activities

The first step towards mobile porting was a detailed analysis of the IBR plugin used by the CR-PLAY Mixed Pipeline. This analysis, focused on the architectural strategies and code patterns used, highlighted that further modularization was feasible by separating editor and runtime functionalities into two different plugins. This was advisable because mobile devices are used as deployment platforms, not for game development, so only the runtime plugin needed a proper platform port.

Once the analysis was done, several approaches for the porting of IBR features were investigated (to different extents). Some of them are described below.

The first approach to porting the IBR features onto mobile was to translate the entire codebase into C# and integrate it inside Unity, to obtain all the multiplatform capabilities of the game engine. Despite the intrinsic feasibility of this operation, further analysis showed that the performance drop would be too substantial. Moreover, the complete port would have required a large investment of effort, with no guarantee of success.

The second approach we considered was to directly include the C++ code inside the IL2CPP scripting pipeline, letting Unity [UNITY] handle the marshaling/unmarshaling of data structures and values passed between unmanaged and managed code. This approach looked like the most convenient solution, but unfortunately the IL2CPP scripting engine provided by Unity is still under development for the Android platform, and injecting native code into the current system would not have guaranteed good results in terms of either stability or performance.

After trying some other routes, the WP4 team concluded that the most promising strategy to port the IBR code to mobile was to build a version of the Runtime Plugin for the Android platform, and to include it in the Unity project as an external plugin.

One point to be confirmed was the full functionality and performance of Eigen, the mathematical library used to solve the Cholesky Factorization (described in D4.4). To verify this, we prepared a test application implementing a low-level camera using the mathematical features provided by Eigen, and we scripted specific behaviors to stress the library itself. Then we compared the numerical results with the


same behaviors implemented with the default camera provided by Unity. The good results convinced us of the feasibility of the mobile port, in terms of both stability and performance.
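As a rough illustration of this kind of numerical validation (a pure-Python stand-in, not the actual Eigen-based test application), one can factor a small symmetric positive-definite system with Cholesky and check the solution against a known reference:

```python
import math

def cholesky(A):
    """Return lower-triangular L with A = L * L^T (A must be SPD)."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(A[i][i] - s)
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]
    return L

def solve_cholesky(A, b):
    """Solve A x = b via forward/back substitution on the Cholesky factor."""
    L = cholesky(A)
    n = len(b)
    # forward substitution: L y = b
    y = [0.0] * n
    for i in range(n):
        y[i] = (b[i] - sum(L[i][k] * y[k] for k in range(i))) / L[i][i]
    # back substitution: L^T x = y
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (y[i] - sum(L[k][i] * x[k] for k in range(i + 1, n))) / L[i][i]
    return x

# SPD test system with known solution x_ref = (1, 2, 3)
A = [[4.0, 1.0, 0.0],
     [1.0, 3.0, 1.0],
     [0.0, 1.0, 2.0]]
x_ref = [1.0, 2.0, 3.0]
b = [sum(A[i][j] * x_ref[j] for j in range(3)) for i in range(3)]
x = solve_cholesky(A, b)
assert all(abs(a - r) < 1e-9 for a, r in zip(x, x_ref))
```

In the actual test application, the same principle applies: the port is trusted only if the numbers produced on-device agree with a reference implementation.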

Going into more detail on the activities performed, we extended the CMake [CMAKE] build chain that generates Microsoft Visual Studio solutions so that it builds both Monolithic and Split plugins for standalone deployment, and a runtime plugin for Android deployment using NVIDIA CodeWorks for Android [CODEWORKS] (useful for debugging and testing during development). Moreover, we created build scripts that allow the direct creation of the Android runtime plugin using the Android NDK [ANDROIDNDK], which will be used by game developers. To achieve this, part of the code needed to be refactored and tested, especially the parts that use third-party libraries.

Figure 1 – Code building schematic

Once the build chain was ready to generate the Android library, and Eigen ran smoothly in mobile platform tests, we moved to testing in Unity by building a test example with a small reconstructed scene (15 images) for the Android deployment target. Despite the extensive multiplatform capabilities of Unity, switching between standalone and mobile targets is never straightforward. The main sources of difficulty are the inherent differences between desktop and mobile operating systems and, in our specific case, the different filesystem and file-access strategies. Without going too deeply into the errors raised by the Android platform, in essence the problem was that the Android device prevented the CR-PLAY mobile app from accessing its filesystem to load the data needed by the IBR rendering algorithm. To overcome this, we developed a dedicated file loader that uses the low-level input/output APIs provided by C# and bypasses the normal IO layer provided by Unity. The file loader keeps a dictionary that associates a key, the filename, with the raw file content in bytes, loading the data from the filesystem asynchronously. When the app reaches a point where a certain file is needed, it waits until the file is correctly loaded, or proceeds immediately if the file is already in memory.

Writing a dedicated system for low-level file loading and in-memory reading provided the flexibility needed to run the CR-PLAY technology on all Android mobile platforms supported by Unity.
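The loader described above can be sketched as follows; the class and method names are hypothetical, and Python threading stands in for the C# implementation:

```python
import threading

class AsyncFileCache:
    """Hypothetical sketch of the dedicated loader: filenames are mapped to
    raw byte content, loaded on a background thread; get() blocks until the
    requested file is in memory."""

    def __init__(self):
        self._data = {}
        self._events = {}
        self._lock = threading.Lock()

    def preload(self, filename, read_bytes):
        """Start loading `filename` asynchronously; `read_bytes` is the
        low-level read function supplied by the platform layer."""
        with self._lock:
            if filename in self._events:
                return
            done = threading.Event()
            self._events[filename] = done

        def worker():
            content = read_bytes(filename)
            with self._lock:
                self._data[filename] = content
            done.set()

        threading.Thread(target=worker, daemon=True).start()

    def get(self, filename, timeout=None):
        """Return the raw bytes, waiting until the async load completes."""
        self._events[filename].wait(timeout)
        with self._lock:
            return self._data[filename]

# Usage: simulate the IBR data load without touching the real filesystem.
fake_fs = {"scene/depth_000.bin": b"\x00\x01\x02"}
cache = AsyncFileCache()
cache.preload("scene/depth_000.bin", lambda name: fake_fs[name])
assert cache.get("scene/depth_000.bin") == b"\x00\x01\x02"
```

The key design choice mirrored here is that loading is keyed by filename and decoupled from consumption, so the renderer never stalls on files that were already fetched.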


1.2 Optimization

1.2.1 Performance

Even though the technical activities described above allowed the IBR algorithm to run on Android devices, performance was still the main issue preventing the use of the technology for developing mobile games.

The first optimization steps were taken on both Behaviour (C# Unity scripts) and Shader code, by reducing some oversized data structures and fixing lingering bugs. While useful, these small interventions did not bring significant performance gains, but they improved the order and tidiness of the code, giving us a stronger and more stable codebase for performance profiling.

The second step was the performance analysis using Unity’s profiler for mobile devices, to isolate the most expensive parts of the IBR rendering. Unsurprisingly, it confirmed that the warp and the hole-filling (Poisson passes) are the heaviest steps in terms of computational load. The warp step in particular takes most of the CPU time during rendering.

Figure 2 – A snapshot of the Unity Profiler in action

The heaviest part of the warping step is the non-linear system solver used to compute the deformation of a single superpixel (SP), depending on the position of the virtual point of view. The size of this system of equations depends on the number of vertices randomly sampled inside the SP when the warping mesh and the sparse matrices are created. We therefore made it possible, during the preparation of the IBR data, to tweak the number of vertices and generate lighter meshes and sparser matrices for solving the system at runtime. Before proceeding with other optimization steps, we ran dedicated tests to make sure that modifying the number of sampled vertices did not adversely affect the final quality of the IBR rendering. This optimization can have a substantial impact on performance, especially if the rendered scene has many SPs selected for warping. For this reason, we exposed the parameter in the Unity Editor so that game developers can adjust its value during development, depending on the scene to be rendered.
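As a back-of-the-envelope illustration of why the vertex count matters (the real solver operates on sparse matrices, so this dense cubic bound deliberately overestimates the cost, and the superpixel count used below is a hypothetical scene size):

```python
def dense_solve_cost(num_superpixels, vertices_per_sp):
    """Rough upper bound on per-frame solver work: each superpixel's warp
    solves a system with ~2 unknowns per sampled vertex (x and y offsets);
    a dense factorization costs O(n^3) in the system size n."""
    n = 2 * vertices_per_sp
    return num_superpixels * n ** 3

# Relative effect of the exposed vertex-count parameter, using the two
# values that appear in the performance tables (10 vs 3 sampled vertices).
heavy = dense_solve_cost(500, 10)   # 500 SPs is a hypothetical scene size
light = dense_solve_cost(500, 3)
assert heavy > light
assert heavy // light == (20 ** 3) // (6 ** 3)
```

Even with sparse solvers the trend holds: fewer sampled vertices per SP means smaller systems and cheaper warps, which is why the parameter was worth exposing in the editor.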


An additional optimization involved reducing the number of Poisson steps during hole-filling. The current number of steps is set to 5, but several tests showed that reducing it to 2 does not significantly decrease the final rendering quality, and lets the mobile app gain a few more frames per second. Since this setting is highly dependent on the rendered scene, the number of Poisson steps has also been exposed to developers via the Unity Editor, allowing them to customize the value during development.
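The trade-off can be illustrated with a toy 1-D diffusion stand-in for the hole-filling passes (this is not the actual shader code, just a sketch of why a couple of passes already recover most of the missing values):

```python
def fill_holes(values, mask, passes):
    """Toy 1-D stand-in for the Poisson hole-filling passes: unknown samples
    (mask[i] == False) are repeatedly replaced by the average of their
    neighbours; known samples stay fixed.  More passes diffuse information
    further into the hole, at a linear cost in rendering time."""
    vals = list(values)
    for _ in range(passes):
        new = list(vals)
        for i in range(1, len(vals) - 1):
            if not mask[i]:
                new[i] = 0.5 * (vals[i - 1] + vals[i + 1])
        vals = new
    return vals

# A hole of two samples between known values of 1.0:
signal = [1.0, 0.0, 0.0, 1.0]
known = [True, False, False, True]
after2 = fill_holes(signal, known, 2)
after5 = fill_holes(signal, known, 5)
# 2 passes already bring the hole reasonably close to the surrounding
# values; the extra passes refine it only slightly.
assert all(v > 0.4 for v in after2[1:3])
assert all(v > 0.7 for v in after5[1:3])
```

This diminishing-returns behaviour is what the tests on real scenes confirmed: going from 5 passes down to 2 costs little visible quality while saving two full-screen passes per frame.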

All the previous optimizations had positive effects and let the app gain several frames per second, depending on the reconstructed scene. However, to reach an acceptable frame rate on mobile, we needed a more drastic approach: distributing the rendering over two different frames. The advantage of distributed rendering is simple: the computational and rendering loads are split between two frames rather than falling entirely on a single frame.

To implement the distributed rendering, we split the IBR algorithm steps into two groups: the first group includes camera selection, warping and blending, and is performed on odd frames; the second group, including hole filling and rendering, is performed on even frames. Splitting the load over two frames provides a considerable performance improvement, as shown by the data presented in Section 1.3.
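The two-frame scheduling can be sketched as follows (class and method names are hypothetical simplifications of the actual Unity implementation):

```python
class DistributedIBRRenderer:
    """Sketch of the two-frame split: odd frames run camera selection,
    warping and blending; even frames run hole filling and final rendering,
    so each frame carries roughly half the per-frame load."""

    def __init__(self):
        self.frame = 0
        self.log = []

    def tick(self):
        self.frame += 1
        if self.frame % 2 == 1:          # odd frame: first group of steps
            self.log.append("select+warp+blend")
        else:                            # even frame: second group of steps
            self.log.append("holefill+render")

r = DistributedIBRRenderer()
for _ in range(4):
    r.tick()
assert r.log == ["select+warp+blend", "holefill+render"] * 2
```

The cost of this scheme is one frame of latency between warping and presentation, which is the lag discussed below.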

Figure 3 - IBR scene running with 82 images on Android (Google Tango).

Despite the lag introduced by the distributed rendering, tests did not highlight any visible artifacts, even when the camera moves quickly during specific gameplay situations.


1.2.2 Memory footprint

Besides computational performance, another important parameter to consider, especially when developing for mobile platforms, is the runtime memory footprint. Memory use is proportional to the number of images (input cameras) composing a specific scene: for example, with no optimization the memory footprint is about 300MB for a scene of 15 images, and 600MB for a scene of 82 images.

The optimization we adopted to decrease memory use is to compress the input images.

Texture compression on mobile devices is effective when texture sizes are powers of two (POT). To maintain the visual quality of the rendering while still using such compression, we added an optional step in the dataset-importing phase that resizes the original images to POT dimensions, and we modified the warping and blending shaders accordingly, to account for the padding rows and columns introduced around the images.

Figure 4 - On the left, a Non-Power-Of-Two (NPOT) input image, 1024 x 576. On the right, the same image resized to be Power Of Two (POT), 1024 x 1024.
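The resizing step and the shader-side compensation can be sketched as follows; `pot_pad` is a hypothetical helper returning both the padded texture size and the UV scale factors the shaders need so that sampling ignores the padding:

```python
def next_pot(n):
    """Smallest power of two greater than or equal to n."""
    p = 1
    while p < n:
        p *= 2
    return p

def pot_pad(width, height):
    """Return the padded (POT) texture size plus the UV scale factors the
    warping/blending shaders use to skip the padding rows and columns
    added around the original image."""
    pw, ph = next_pot(width), next_pot(height)
    return (pw, ph), (width / pw, height / ph)

# The example from Figure 4: a 1024 x 576 input becomes 1024 x 1024,
# with the valid region covering 56.25% of the texture height.
size, uv_scale = pot_pad(1024, 576)
assert size == (1024, 1024)
assert uv_scale == (1.0, 0.5625)
```

At sampling time a shader would multiply its normalized UV coordinates by these scale factors, so the padded region is never read.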


Figure 5 – The improved importing tool with POT options

To fit the POT constraint, resized images are bigger than the originals, but thanks to the high compression ratios achievable on modern video cards, the compressed images are smaller: in the example above, the original images are 1.7MB each, whereas the compressed POT versions are 0.5MB each. Thanks to this optimization, we are now able to decrease the runtime memory footprint to about 30% of its original size.


1.3 Performance measures

In this section, we provide numerical evidence of the good results achieved in porting the IBR technology to mobile. Tests were performed on two high-performance tablets:

Google Tango
- NVIDIA Tegra K1, 2.3 GHz
- Screen resolution: 1920 x 1200px
- RAM: 4GB

Samsung Note 4 (910f)
- Quad-core 2.7GHz Krait 450 (Adreno 420)
- Screen resolution: 2560 x 1440px
- RAM: 3GB

The scene used for performance tests is one of the scenes used for the game IBR Fighting League, composed of 82 images.

The following tables show how the two devices perform on the same scene under different configurations.

Google Tango

Configuration                             Avg FPS (Distributed Rendering)   Avg FPS (Simple Rendering)
WARP vertices = 10, Poisson passes = 5                  47                             24
WARP vertices = 10, Poisson passes = 2                  49                             27
WARP vertices = 3,  Poisson passes = 5                  50                             25
WARP vertices = 3,  Poisson passes = 2                  50                             28

Table 1 - Performance on Google Tango. Higher numbers are better.

Samsung Note 4

Configuration                             FPS (Distributed Rendering)   FPS (Simple Rendering)
WARP vertices = 10, Poisson passes = 5                31                         20
WARP vertices = 10, Poisson passes = 2                35                         22
WARP vertices = 3,  Poisson passes = 5                33                         21
WARP vertices = 3,  Poisson passes = 2                36                         23

Table 2 - Performance on Samsung Galaxy Note 4. Higher numbers are better.

As anticipated in the previous section, the most important impact on overall performance comes from the distributed rendering, which provides an improvement in frames per second of about 45% on the Google Tango and about 30% on the Samsung Note 4. The other optimizations bring smaller improvements, but since their impact depends on the specific scene, it is important to give game developers sliders to tweak them based on their specific needs.


2. Image Based Rendering

2.1 Removing Shadows and Relighting

In Deliverable 2.1, we already presented an algorithm to extract intrinsic images and relight outdoor scenes using image-based rendering. In the long term, this algorithm should allow us to change the sunlight and to merge multiple scenes captured at different times of day, while ensuring the coherence of their shadows. In addition, we can use information from intrinsic images for other tasks, such as refining the reconstructed mesh using detected shadows, or generating a texture map free of illumination effects.

This relighting mechanism is divided into two steps: we first extract intrinsic images for all the pictures of a dataset, and then relight the scene to simulate a change of sun direction. The first step is a pre-process that can be launched from the Unity Editor, or run automatically after reconstruction (depending on user settings). The second step will be available both as a preprocessing tool and as a runtime tool, so that game developers can change the sunlight during the game.

Figure 6 - Example of preprocessing tool integration into Unity Editor

The previously mentioned algorithm is described in detail in the paper “MultiView Intrinsic Images of Outdoors Scenes with an Application to Relighting”. Although successful, our first implementation of this algorithm required hours to generate intrinsic images on typical datasets. Consequently, we worked on a more robust and efficient implementation. Our new implementation simplifies several aspects of the algorithm, in particular unifying the coarse and fine shadow classifiers, which were quite distinct in the original algorithm (see Annex D for additional details). In addition, we accelerated processing by parallelizing critical steps. For a scene captured with around thirty pictures, we are now able to decompose all images in less than twenty minutes.

Finally, we also extended our automatic algorithm to support user input. In practice, users can visualize the output of the automatic shadow classifier and annotate erroneous shadows with coarse scribbles. This user interface, which can be launched from the Unity Editor, displays each image overlaid with its detected shadows. Users can mark pixels that should be in shadow (using the left mouse button) or pixels that should be in light (using the right mouse button). The image pixels are automatically grouped into superpixels (closed regions of nearby similar pixels) based on color similarity and spatial proximity, which greatly facilitates the annotation of precise object and shadow boundaries. In addition, the scribbles drawn in one image are projected onto all other images using the 3D points computed during the reconstruction step. This projection mechanism allows the quick annotation of large datasets. The shadow classifier then updates its prediction to best satisfy the user-provided annotations, which effectively propagates the corrections to other parts of the images.
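The projection mechanism can be sketched with a simple pinhole-camera model; all camera values below are hypothetical stand-ins for the calibrated cameras recovered during reconstruction:

```python
def project(point3d, camera):
    """Project a world-space 3D point with a pinhole camera.  `camera` is
    (fx, fy, cx, cy, R, t), with R a 3x3 rotation (list of rows) and t a
    translation vector."""
    fx, fy, cx, cy, R, t = camera
    # world -> camera space
    xc = [sum(R[i][j] * point3d[j] for j in range(3)) + t[i] for i in range(3)]
    # perspective divide + intrinsics
    return (fx * xc[0] / xc[2] + cx, fy * xc[1] / xc[2] + cy)

def propagate_scribble(points_3d, target_camera):
    """Carry a scribble into another view by projecting the 3D points that
    its annotated pixels correspond to."""
    return [project(p, target_camera) for p in points_3d]

# Identity-rotation camera at the origin, looking down +Z (hypothetical).
I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
cam = (500.0, 500.0, 512.0, 384.0, I3, [0.0, 0.0, 0.0])
uv = propagate_scribble([(0.0, 0.0, 2.0), (1.0, 0.0, 2.0)], cam)
assert uv[0] == (512.0, 384.0)
assert uv[1] == (762.0, 384.0)
```

In the actual tool, the projected pixel positions are then snapped to the superpixels of the target view, which is what makes the propagated annotations robust to small projection errors.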


Figure 7 - Example of user corrections. Estimated shadows are colored in red (upper left). Added user scribbles are colored in green.

Note that this interface can be relaunched whenever the user wants to correct the result. Only a fraction of the intrinsic-image algorithm needs to be re-executed to take new scribbles into account and update the solution. Naturally, better extraction of the intrinsic images yields better inputs for the subsequent image-relighting step.

Concerning relighting, we are developing a new strategy that consists of refining the reconstructed 3D mesh using intrinsic images. Now that we know the shadow boundaries in the images and the sun direction, we can update the 3D mesh to ensure that its virtual shadow agrees with the extracted shadows. The main intuition of the algorithm is to dilate or erode the multiview 3D reconstruction to align it with the shadow volume obtained by extruding the shadow boundaries towards the sun. We now describe this new algorithm in more detail.

First, we store all the 3D points computed during the reconstruction step in a uniform 3D grid, a data structure better suited to erosion and dilation operations. Each grid cell contains a counter initialized to 0 if the cell is empty, and to 1 if the cell contains one or more 3D points. The resolution of this grid can be adjusted by the user depending on the desired accuracy of the reconstruction. Then, we complete missing geometry using shadow information. For each image, we first recover the 3D position of each shadow pixel by projecting it onto the 3D mesh obtained from multi-view stereo. From each such 3D point, we cast a ray toward the sun. If this ray hits an already-filled cell of the grid, we increment its counter. If the ray doesn't hit anything, we search along it for the cell closest to an already-filled cell, and increment its counter as well as the counters of all the cells between the ray and that closest filled cell. This process effectively fills in holes in geometry that should cast a shadow. However, it does not remove extraneous geometry that casts false shadows; this is why we also cast a ray for each point in light and decrement the counters of every cell it traverses. After the algorithm completes, each grid cell stores an occupancy value reflecting the number of shadow pixels that “voted” for it, which provides a notion of confidence in our reconstruction. While we initially plan to simply consider all nonzero cells as occupied, the confidence value may prove useful for more advanced processing. Finally, we extract a 3D mesh from the grid using a standard Marching Cubes algorithm.
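A heavily simplified sketch of the voting scheme follows, with the sun direction placed along the +z axis so that rays reduce to stepping through grid cells (the real algorithm traverses the grid along an arbitrary sun direction and includes the gap-filling fallback for missed rays, which is omitted here):

```python
def vote_shadow_grid(filled, shadow_pts, light_pts, ray_len=8):
    """Toy version of the grid-voting step.  `filled` is the set of cells
    seeded by the 3D reconstruction (counter = 1).  Shadow points up-vote
    the occluder cell their sun ray hits; light points down-vote every
    cell their ray traverses.  Cells with a positive counter are kept."""
    votes = {c: 1 for c in filled}
    for (x, y, z) in shadow_pts:
        # step toward the sun (+z in this simplified setup)
        for dz in range(1, ray_len):
            cell = (x, y, z + dz)
            if cell in votes:
                votes[cell] += 1   # ray hit an already-filled cell
                break
    for (x, y, z) in light_pts:
        for dz in range(1, ray_len):
            cell = (x, y, z + dz)
            votes[cell] = votes.get(cell, 0) - 1
    return {c for c, v in votes.items() if v > 0}

# One occluder cell above a shadowed ground point survives; the cell
# wrongly casting shadow on a lit point is voted away.
occupied = vote_shadow_grid(
    filled={(0, 0, 3), (5, 5, 3)},
    shadow_pts=[(0, 0, 0)],
    light_pts=[(5, 5, 0), (5, 5, 0)],
)
assert (0, 0, 3) in occupied
assert (5, 5, 3) not in occupied
```

The positive counter acts as the occupancy confidence described above: geometry confirmed by shadow evidence accumulates votes, while geometry contradicted by lit pixels is carved away before Marching Cubes runs.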


CR-PLAY Project no. 661089 Deliverable 4.5 Final Prototypes for mixed pipeline


2.2 Multiview inpainting

This section of the deliverable describes the integration of the multiview inpainting algorithm into the mixed pipeline. This algorithm exploits information from the multiview 3D reconstruction to drive an image inpainting technique that allows semi-automatic editing of image datasets. The complete algorithm will be described in detail in the next deliverable. This new algorithm allows us to remove elements from captured pictures, such as people, cars and billboards. Further, our initial prototype shows that the removed objects can be rendered at different locations and even moved at runtime within the 3D scene.

To perform such editing, an interface is provided to let users select objects in each image of a dataset. This interface is similar to the shadow annotation interface described in the previous section of this document. By reusing the same interface, we seek to minimize the learning time required for CR-PLAY users.

Figure 8 - Generation of masks used by the inpainting tools. In the interface (left), user scribbles are colored in green, whereas blue regions correspond to scribbles propagated from other views.

Once again, the images are segmented into groups of pixels to form superpixels. Users mark the superpixels of the targeted object with the left mouse button; the right mouse button unselects superpixels. A pixel-based brush is also available to edit borders that are not well captured by the superpixels. This interface relies on a reprojection mechanism similar to that of the shadow annotation interface to propagate edits from one image to all others. Small modifications were made to avoid propagating information from 3D points near object borders. The segmentation into superpixels is also adapted to this new context: superpixels are smaller, allowing more precise selections.
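The propagation step can be illustrated abstractly. In the sketch below, the function and data-structure names are hypothetical; each 3D point from the reconstruction carries its observations (which pixel it projects to in which image), and a selection made in one view is transferred to the superpixels covered by the same 3D points in the other views. The actual tool additionally filters out 3D points near object borders, which is omitted here.

```python
def propagate_selection(selected_spx, observations, superpixel_label):
    """Propagate user-selected superpixels from one image to the others.

    selected_spx: set of (image_id, label) pairs marked by the user
    observations: per-3D-point lists of observations [(image_id, (x, y)), ...]
    superpixel_label: dict mapping (image_id, (x, y)) -> superpixel label
    Returns the enlarged selection including propagated superpixels.
    """
    out = set(selected_spx)
    for obs in observations:
        labels = [(img, superpixel_label[(img, px)]) for img, px in obs]
        # If the 3D point projects into a selected superpixel in ANY view,
        # select the corresponding superpixels in ALL other views.
        if any(l in selected_spx for l in labels):
            out.update(labels)
    return out
```

Pixels not covered by any reconstructed 3D point keep their manual selection only, which is why the pixel-based brush remains useful for borders.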

Like the other preprocessing tools, the multiview inpainting tools are available through the Unity Editor among the CR-PLAY functionalities.


3. Conclusion

The final prototype of the CR-PLAY pipeline is the most advanced release of the technology, allowing the creation of Unity scenes starting from images and videos of real locations. Such scenes are usable for making videogames and, more generally, for every kind of application where real-time navigation and interaction in a photorealistic 3D environment is required.

Starting from the Windows-only release (D4.4), a great amount of effort in WP4 was dedicated to making the port to mobile possible. This was not guaranteed at the beginning of the activities, given the limitations in processing power and memory of existing mobile devices. Several analysis, rapid-prototyping and validation loops took place in the first months of the third year to find the right approach for mobile platforms.

The visual quality and performance now achieved on mobile devices repay all the effort spent in making it happen.

Future adopters of the CR-PLAY technology will be able to deploy their applications both on Windows platforms and on mobile devices, with a simple change of the deploy configuration in Unity. This will allow them to target two very important markets for games and apps in general.

Further improvements are expected from WP1, WP2 and WP3; some of them will be made available to game developers (also in view of the upcoming workshop), while others will require additional effort to be integrated.

The CR-PLAY technology is now mature enough to play two important roles in the remaining duration of the project: i) to be used for evaluation activities with game developers in the context of Task 5.3 and in the workshop that will be held in September 2016 in Patras, and ii) as a technological foundation to support exploitation activities during the project (the strategy will be detailed in D6.3), and after its end.




Annex A – Final Prototype: User Manual 1.1

1. CR-PLAY Technology in a nutshell

CR-PLAY is an innovative mixed pipeline for videogame development that allows the integration of Image Based Rendering, Video Based Rendering and 3D Textured Objects generated from images, together with traditional 3D contents, in the videogame development process. The pipeline is composed of three main phases:

Capture: the User takes pictures of the scene/object to insert in the game.

Reconstruct: the User uses specific tools to create IBR Assets, VBR Assets and 3D Textured Objects generated from images.

Edit and Play: reconstructed assets are integrated with traditional contents and the game is created and deployed.

Each asset created with the CR-PLAY mixed pipeline has specific characteristics that can be exploited in the creation of a game.

IBR Assets are detailed representations of outdoor and indoor environments, created from a set of input images. They are mainly suited for advanced backdrops thanks to their intrinsic parallax and the possibility of simulating occlusion on traditional 3D objects.

Figure 9 – Images taken during the Capture phase (above) and example of IBR scene (derived from the images) with occlusion from different views (below)

VBR Assets allow the detailed simulation of dynamic elements (moving and animated objects) and represent them in advanced video-textures, able to reproduce specific video sequences according to external gameplay events.


Figure 10 – Example of VBR assets (moving candles) in the Unity Editor

3D Textured Models are traditional polygon models with high-resolution photorealistic textures, captured and created from a large number of images. They can be used as traditional assets and provide full interaction with the 3D world.

Figure 11 - 3D Textured Model (left) with different optimization levels (right)

Figure 12 introduces the main steps characterizing the use of the CR-PLAY technology (the next sections of the document provide a detailed explanation of each step).


Figure 12: CR-PLAY Use Case schema


2. CR-PLAY System Requirements

The features listed in this section describe the target developer machine, which allows the CR-PLAY mixed pipeline to work smoothly. Less powerful machines can give unexpected errors and hinder the user experience, especially in configurations with less video memory than suggested.

Processor: Intel i5 4590 3.3 GHz or higher

Memory: RAM 8GB or higher

Video card: ATI Radeon R9 200, 3GB VRAM or NVIDIA equivalent or higher

CR-PLAY requires Unity 5.x.

3. Installation of the CR-PLAY mixed pipeline

In this section the User installs the Reconstruction Tool and all the Unity components needed to use the CR-PLAY technology to create games.

3.1 Setup of Unity project and installation of CR-PLAY packages

3.1.1 The User creates a batch file in the project's root folder in order to start the CR-PLAY Unity project in DX11 mode.

Example of batch command:

"<path_to_Unity_Editor>/Unity.exe" -force-d3d11 -projectPath %~dp0

By running the batch command Unity will start in DX11 mode.

3.1.2 The User opens Unity 5 using the batch file created (double-click on it), sets building parameters and game resolution, and creates the "IbrProxy" layer.

Unity is ready to be used.

3.1.3 The User downloads the CR-PLAY Reconstruction Tool Unity Package from http://www.cr-play.eu/download_sw/ and installs it in the Unity project (username and password required).

Unity is ready to use Reconstruction Tool features.

If the download is not available due to server or credentials problems, the User can contact [email protected] to get assistance [Error 1].

3.1.4 The User downloads the CR-PLAY Unity Package http://www.cr-play.eu/download_sw/ and installs it in the Unity project (username and password required).

Unity is ready to use IBR features.

If the CR-PLAY Unity package cannot be installed, Unity provides a very detailed output log that can be used to diagnose specific issues. If the User cannot fix the issue, he/she can contact [email protected] to get assistance [Error 2].


4. Use of CR-PLAY mixed pipeline for videogame creation

This section explains how to use the CR-PLAY technology to create games. The various phases are described: Capture, Reconstruct, Edit and Play, and deployment on Windows and Android platforms.

4.1 CAPTURE the scene

4.1.1 The User takes pictures of the scene following the provided capture guidelines.

See Technical Manual: Guidelines for capturing datasets

The scene has been captured.

4.1.2 User copies the pictures from the camera memory to the computer where the Reconstruction Tool is installed.

Pictures are available for Reconstruction.

4.2 RECONSTRUCT the scene

4.2.1 User opens Unity, clicks on the "CR-PLAY/Reconstruction Tool" menu item, sets the "Image folder" and creates the batch file.

See Technical Manual: How to use the Reconstruction Tool in Unity

The batch script is present in the temporary folder.

If the batch file is not present, Unity may show errors in the console that the User can use to fix specific issues. He/she can contact [email protected] to get assistance [Error 3].

4.2.2 User runs the batch script from the command prompt and waits for the process to finish correctly.

After a while the dataset is generated in the output folder. NOTE: this can take up to several hours, so please plan ahead.

If the reconstruction process fails, the User can contact [email protected] to get assistance [Error 4].

4.3 (EDIT and) PLAY the scene - IBR

4.3.1 User moves the "<ibr_dataset>.zip" file generated in the Reconstruction Tool output folder to the "Assets/IBRDatasets" folder.

IBR scene is ready to be imported.

4.3.2 User selects the "CR-PLAY/Import IBR" menu item.

See Technical Manual: How to use CR-PLAY mixed pipeline in Unity

ImportDataset window opens.


4.3.3 User checks the <ibr_dataset> item, clicks on the "Import" button and waits until the import is completed.

It can take some time depending on the machine and on the number of images in the dataset.

See Technical Manual: How to use CR-PLAY mixed pipeline in Unity

IBR dataset is imported in the project.

4.3.4 User instantiates the IBRscene prefab in the scene and selects it in order to open its properties in the Inspector window.

Unity is ready to show the IBR scene in its scene view.

4.3.5 User sets the text field "Dataset Name Folder" to <ibr_dataset>, checks the "VSFM Coord Sys" and clicks the "Load IBR Scene" button to load the IBR in the Unity scene.

See Technical Manual: How to use CR-PLAY mixed pipeline in Unity

IBR scene is loaded in the Unity scene.

Unity can raise an error in case the "Dataset Name Folder" is not set to a valid folder or if the Import operation (step 0) has not yet been done. For any other case, the User can contact [email protected] to get assistance [Error 5].

Figure 13 - The IBR scene is loaded in Unity

4.3.6 [Only the first time a specific dataset is imported or modified] User clicks on "Prepare IBR Scene” in order to create superpixel intermediate textures and all needed data.

See Technical Manual: How to use CR-PLAY mixed pipeline in Unity


Intermediate superpixel textures and other data are generated and placed in "Assets/Resources/", “Assets/IBRAssets/” and “Assets/StreamingAssets” folders. This operation can take time depending on the machine.

Unity can crash if, due to the high number of input images, the texture generation uses all the available video memory. In this case the User can contact [email protected] to get assistance [Error 6].

4.3.7 User PLAYS the Unity scene.

Unity scene is played and the IBR scene is shown after a loading phase.

The Unity scene can give errors depending on the specific gameplay code. Unity provides a very detailed output log that the User can use to fix specific issues. If the User cannot fix the issue and it is related to CR-PLAY, he/she can contact [email protected] to get assistance [Error 7].

Figure 14 - The IBR scene is played in Unity

4.4 (EDIT and) PLAY the scene – VBR

In case you plan to use animated content in your game that could be done with the VBR technology, please contact [email protected] to receive assistance.

4.4.1 User moves the <vbr_dataset>.zip in the "Assets/VBRDatasets" folder.

VBR scene is ready to be imported.

4.4.2 User moves the focus on Unity 5 and selects the "CR-PLAY/VBR Import" menu item.

ImportDataset window opens.

4.4.3 User checks the <vbr_dataset> item, clicks on the "Import" button and waits until the import is completed.

VBR dataset is imported in "Assets/VBRAssets" folder.


4.4.4 User instantiates the VBRObject prefab in the scene and selects its child "VideoTexture" in order to open its properties in the Inspector window.

Unity is ready to show VBR object in its scene view.

4.4.5 User sets the textfield "Dataset Name" to <vbr_dataset> and clicks the "Load Dataset" button to load the VBR video textures.

Video textures are loaded in the scene.

Unity can raise an error in case the "Dataset Name" is not set to a valid folder or if the Import operation (step 0) has not yet been done. For any other case, the User can contact [email protected] to get assistance [Error 8].

4.4.6 User sets playback parameters depending on his/her needs.

VBRObject is ready to be played.

4.4.7 User PLAYS the Unity Scene.

Unity scene is played and VBR object is shown.

The Unity scene can give errors depending on the specific gameplay code. Unity provides a very detailed output log that the User can use to fix specific issues. If the User cannot fix the issue and it is related to CR-PLAY, he/she can contact [email protected] to get assistance [Error 9].


4.5 Deploy the game on Windows

4.5.1 The User clicks the "Edit/Project Settings/Graphics" menu item and adds the shaders in the "Assets/Shaders" folder to the "Always Included Shaders" list.

The IBR shaders will be linked into the deployed package.

4.5.2 The User clicks on "File/Build Settings" menu item.

The Build Settings window appears.

4.5.3 The User clicks the "Build" button, selects the final executable location and starts building.

Building process is started. NOTE: Only a Windows build target is supported for now.

The building process can fail while compiling the scripts due to code errors. Unity provides a very detailed output log that the User can use to fix specific issues. If the User cannot fix the issue and it is related to CR-PLAY, he/she can contact [email protected] to get assistance [Error 10].

4.5.4 User creates a batch file in order to start the created exe in DX11 mode.

The batch file is created.

4.5.5 User double-clicks on the batch file and starts the game.

The game runs showing CR-PLAY technology.

The deployed game can crash for several reasons. Unity deployed applications provide a very detailed output log that the User can use to fix specific issues. If the User cannot fix the issue and it is related to CR-PLAY, he/she can contact [email protected] to get assistance [Error 11].

4.6 Deploy the game on Android

4.6.1 The User clicks the "Edit/Project Settings/Graphics" menu item and adds the shaders in the "Assets/Shaders" folder to the "Always Included Shaders" list.

The IBR shaders will be linked into the deployed package.

4.6.2 The User clicks on "File/Build Settings" menu item.

The Build Settings window appears.

4.6.3 The User selects the "Android" item and then clicks "Switch Platform".

The project is now ready to be compiled for Android.


4.6.4 The User connects an Android Device to his/her workstation.

4.6.5 The User clicks "Build" and selects the final executable location.

Building process is started.

The building process can fail while compiling the scripts due to code errors. Unity provides a very detailed output log that the User can use to fix specific issues. If the User cannot fix the issue and it is related to CR-PLAY, he/she can contact [email protected] to get assistance [Error 10].

4.6.5.1 The User navigates to the folder containing the executable location.

4.6.5.2 The User opens another Explorer Window and navigates into a folder of the connected Android Device.

4.6.5.3 The User copies the APK file and pastes it into the previously opened folder of the device.

4.6.5.4 The User launches a File Manager (on the connected device) in order to open the folder which contains the APK.

4.6.5.5 The User taps the APK and taps “Install”. After the installation, the system allows the User to launch the game by tapping “Open”.

The game runs showing CR-PLAY technology.

4.6.6 ALTERNATIVE METHOD: The User clicks the "Build And Run" button, selects the final executable location and starts building.

Building process is started.

The building process can fail while compiling the scripts due to code errors. Unity provides a very detailed output log that the User can use to fix specific issues. If the User cannot fix the issue and it is related to CR-PLAY, he/she can contact [email protected] to get assistance [Error 10].

4.6.6.1 The User waits for the compilation and deployment processes to finish.

The Android Game is deployed on the device.

The deployed game can crash for several reasons. Unity deployed applications provide a very detailed output log that the User can use to fix specific issues. If the User cannot fix the issue and it is related to CR-PLAY, he/she can contact [email protected] to get assistance [Error 11].

4.6.6.2 The game is automatically launched by Unity.

The game runs showing CR-PLAY technology.


Annex B – Final Prototype: Technical manual

How to use the Reconstruction Tool in Unity

Installation

To install the Reconstruction Tool in Unity, import the Unity package cr-play-reconstruction-tool.unitypackage, which automatically installs all needed resources in the selected Unity project.

Usage

From Unity click on the “CR-PLAY / Reconstruction Tool” menu item.

The Reconstruction Tool window will appear presenting the options to the user.

The user sets the "Image directory" field to the path of the folder containing the captured images, and sets the "Temporary directory" and "Output directory" fields to the path of the temporary folder and to the folder where the output archive will be copied, respectively.

The user sets the "Scene scale" field to set the final size of the reconstruction. The scale value is the distance in meters between the cameras that took the first two pictures of the dataset. Modifying this value allows the user to scale the resulting reconstruction up or down. Example: setting the scale value to 2 produces a reconstruction where the cameras that took the first two pictures are 2 meters apart.

Other fields can be left as they are.

Once all needed parameters are set, the user can proceed by clicking the “3D Textured Object” button to create a traditional 3D textured model (typically for an isolated object for which photos have been taken all around) or by clicking the “IBR Dataset” button to generate the IBR dataset (typically for backdrops).

Once one of the buttons is clicked, a batch script is created inside the “Temporary Directory”. To run the reconstruction process the batch file needs to be executed from the command prompt.

The table below describes all the parameters available through the Reconstruction Tool interface.

FIELD NAME | DESCRIPTION

Image directory | Path of the folder that contains the captured images.

Temporary directory | Path of the temporary folder used by the different steps of the reconstruction pipeline.

Output directory | Path of the folder that will contain the output archive.

Scene scale | Scaling factor applied during the reconstruction pipeline (i.e., the distance in meters between the cameras that took the first two pictures of the captured image set).

MVE scale * | Scale used for the whole reconstruction.

Max. image width/height * | Maximum pixel size the captured images will be resized to; 0 means no resizing. Use only even values, since otherwise some of the IBR steps won't work properly.

meshclean options: Component size * | Minimum number of vertices per component in the mesh reconstruction step.

meshclean options: Threshold * | Threshold on geometry confidence: a value N between 0 and the number of images. 3D points are taken into account for surface reconstruction only if visible from at least N cameras.

Depth extraction options: min MVS threshold * | The minimum MVS point confidence threshold, useful for back-projection. Try 0.0 if black depth maps are generated.

Superpixel options: Threads * | Number of threads used to compute superpixels.

Superpixel options: Num superpixels * | Desired number of superpixels for the oversegmentation step.

Superpixel options: Compactness weight * | Compactness factor for the oversegmentation step.

DepthSynth options: Samples * | Number of additional depth samples to be found in the depth-map images.

(*) Advanced parameter: please do not modify unless you know what you are doing.


How to use CR-PLAY mixed pipeline in Unity

Installation

To install the CR-PLAY Mixed Pipeline in Unity, import the Unity package cr-play.unitypackage, which automatically installs all needed resources in the selected Unity project. This package contains both IBR and VBR scripts and resources.

Import IBR(VBR) datasets

Before working with IBR(VBR), datasets must be imported into the Unity project using the dedicated tools provided. The import operation automatically takes the dataset zip produced by the Reconstruction Tool and imports the needed assets into the Unity project folders.

Copy and paste the dataset file produced by the Reconstruction Tool into the Assets/IBRDatasets(VBRDatasets) folder (create this folder if needed). [The default datasets folder can be changed via the CR-PLAY/Options menu.]

Select the CR-PLAY/Import IBR(Import VBR) menu to open the Import window. Check the dataset you want to import and click Import to import the IBR(VBR) datasets.


IBR(VBR) datasets are automatically imported into the Assets/StreamingAssets/<dataset_name> and Assets/IBRAssets(VBRAssets)/<dataset_name> folders, depending on the asset nature. [StreamingAssets is a special Unity folder that is automatically copied into the deployment package; IBR assets need to be copied into the deployment package because the IBR C++ Plugin uses them during rendering.]

Setup IBR scene

To set up the IBR scene it is necessary to load the pre-configured IBRScene prefab (Assets/Prefabs/) in the current scene and initialize the associated behaviours so that the IBR scene loads a particular dataset. Drag and drop the IBRScene prefab into the scene view (or the hierarchy view). Attach the IBRRenderer behaviour to the Main Camera by dragging and dropping the IBRRenderer.cs script onto the Main Camera game object. Select the IBRScene game object in the Hierarchy View to open its Inspector View. Go to the IBRSceneLoader behaviour and set the "Dataset Name Folder" field to the <dataset_name> to be loaded.


To load the IBR scene in Unity, click on the "Load IBR scene" button in the IBRSceneLoader behaviour.

To generate the superpixel textures and all the needed data, click on the "Prepare IBR scene" button in the IBRSceneLoader behaviour; this step may take some time depending on the number of input images. [This operation needs to be performed only the first time a dataset is loaded, and every time the dataset is changed.]


Use of proxies for Physics Collisions and Depth Occlusion on IBR scene

Thanks to the point cloud generated by the Reconstruction Tool, it is possible to have an overall representation of the theoretical virtual spatial occupation of the IBR scene and to use 3D models as proxies to simulate specific behaviours, such as physics collisions and depth occlusion. Physics collisions can be simulated by placing an invisible collision mesh where the point cloud indicates that a specific portion of the IBR scene will be rendered. The image below shows an example collision mesh in green.

To do this, the user places the appropriate mesh in the scene, adds a MeshCollider component, and sets an invisible material on its MeshRenderer. Depth occlusion simulates depth information in an IBR scene by using 3D depth proxies placed in the 3D world using the spatial reference provided by the point cloud. The procedure to set up a depth occlusion proxy is more complex, so let us consider the example of a 3D character model placed behind the column of a small temple rendered using IBR.


In order to simulate the IBR depth information, a 3D proxy cylinder is placed where the column will be rendered (using the point cloud as a spatial reference).

The next step is to ignore the geometric information of the cylinder by applying the DepthMask material that can be found in folder Assets/Materials.


In addition, the User must change the Render Queue position of the 3D model that needs to be masked. This can be done by adding the SetRenderQueue script behaviour to the target model. The script can be found in the Assets/Scripts/Utils folder.

As a result, the proxy model pulls the background (the IBR scene) in front of the 3D target model according to the depth information provided by the proxy.
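The depth-masking trick boils down to draw order: the mask proxy must render before the model it occludes. The sketch below illustrates only the ordering logic in the abstract; the queue values are illustrative (Unity's default opaque geometry queue is 2000, but the actual values used by the DepthMask material and the SetRenderQueue script may differ).

```python
# Unity draws materials in ascending render-queue order. The depth-mask
# material writes only to the depth buffer (no color), so a model drawn
# later that lies behind the mask fails the depth test, and the IBR
# background already on screen remains visible there.
GEOMETRY_QUEUE = 2000  # Unity's default queue for opaque geometry

materials = [
    ("character_model", GEOMETRY_QUEUE + 10),  # moved later via SetRenderQueue (hypothetical offset)
    ("depth_mask_proxy", GEOMETRY_QUEUE),      # invisible cylinder over the column
]


def draw_order(mats):
    """Sort materials into the order the renderer would draw them."""
    return [name for name, queue in sorted(mats, key=lambda m: m[1])]
```

With these values, draw_order(materials) puts the depth-mask proxy before the character, which is exactly the ordering the SetRenderQueue step establishes.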


The same 3D proxy model can be used for both the Physics Collision and Depth Occlusion purposes.

Setup of the VBR scene

To set up the VBR scene it is necessary to load the pre-configured VBRObject prefab (Assets/Prefabs/) in the current scene and initialize the associated behaviours so that the VBR object loads a particular dataset. Drag and drop the VBRObject prefab into the scene view (or the hierarchy view). Select the VBRObject/VideoTexture game object in the Hierarchy View to open its Inspector View.

In the VBRBehaviour component, set the "Dataset Name" field to the <dataset_name> to load and the "Video Framerate" field to the frame rate of the video textures.

To load the video textures, click the "Load Dataset" button in the VBRBehaviour component.


Guidelines for capturing datasets

Camera settings

- If your camera has a resolution setting, choose the highest resolution possible. If needed, the algorithms will downscale the images later.
- Set a fixed aperture, either using aperture priority or manual mode (consult the camera manual if necessary).
- Take images that are well exposed (neither too dark nor too bright). This might require you to change the exposure time or ISO speed if you are in manual mode. In aperture priority mode, check that the camera selected sensible values for these settings (consult the camera manual if necessary). Make sure that the exposure time DOES NOT CHANGE from one picture to another.
- If the depth of field is too small, try setting a larger f-number (smaller aperture), but be aware that this might darken the image if the exposure time is not increased accordingly.
- Focus! If the scene or object is out of focus, reconstruction will fail. It is OK to refocus for each of the images.
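The aperture/exposure trade-off mentioned above can be sanity-checked numerically: exposure is proportional to t/N² (t = shutter time, N = f-number), so stopping down by one stop requires doubling the shutter time. A minimal sketch (purely illustrative; the function name is ours and not part of the pipeline):

```python
def compensated_exposure_time(t_old, n_old, n_new):
    """Exposure is proportional to t / N^2 (t: shutter time, N: f-number).

    Returns the shutter time that keeps the exposure constant after
    changing the aperture from f/n_old to f/n_new.
    """
    return t_old * (n_new / n_old) ** 2

# Stopping down from f/4 to f/8 (two stops) needs 4x the shutter time:
# e.g. 1/100 s becomes 1/25 s.
```

For example, going from f/4 at 1/100 s to f/8 requires 1/25 s to keep the same exposure, which is why a smaller aperture can darken the image if the exposure time is left unchanged.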

Camera placement

Take pictures with sufficient overlap. If in doubt, just capture images at small intervals. This takes more time but ensures that everything has been seen with sufficient overlap. Each point on the surface should be visible in at least four images.


Do not take panoramas. The reconstruction will not work if you just rotate the camera while standing still. The algorithms need to observe the surface from different viewpoints. The further away from the object, the larger the translation between shots can be.
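The overlap recommendation can be turned into a rough spacing rule. For a camera moving sideways, parallel to a roughly planar surface, the image footprint width is 2·d·tan(FOV/2); keeping about 75% overlap between consecutive shots means each surface point is seen in roughly four images. A back-of-the-envelope sketch (the function name is ours, not part of the pipeline):

```python
import math

def max_camera_step(distance, horizontal_fov_deg, overlap=0.75):
    """Largest sideways translation between consecutive shots that keeps
    the given overlap fraction between their footprints.

    Assumes the camera moves parallel to a roughly planar surface at the
    given distance. An overlap of ~0.75 means each surface point appears
    in about four consecutive images.
    """
    footprint = 2.0 * distance * math.tan(math.radians(horizontal_fov_deg) / 2.0)
    return (1.0 - overlap) * footprint
```

With a 60-degree horizontal field of view at 10 m from the surface, the footprint is about 11.5 m, so steps of up to roughly 2.9 m keep 75% overlap; this also illustrates why shots can be spaced further apart when the camera is further from the object.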


It makes sense to take “connecting shots”. For example if you take images from farther away and then move closer to capture some details, the algorithm might not be able to connect the details to the large scale surface even though they overlap. Try to take some additional shots while moving towards the details.



It is allowed (and can actually be quite beneficial) to take images from different heights. So, if you have the chance to climb on a nearby wall, do not hesitate to do so.

Scene content

The reconstruction is based on the assumption of a static scene. Try to avoid moving parts such as leaves, shadows, or pedestrians as much as possible. A certain amount of movement is tolerable but those parts will be reconstructed less accurately or not at all.

Surfaces with texture variation work well. If your object does not display a lot of texture, it can help if at least some other textured surfaces are visible in the images. In the case of blank walls, it will help if you add a poster for example.

Transparent and mirror-like (reflective) objects do not work. It is possible to have a car in a scene as long as the light is not too bright (for example on a cloudy day or in shadow) and the car does not cover a large part of the image.

Best practices (if possible) and additional information

- Try to do (at least) two (semi-)circles with different radii around the object. Don't forget to add transition pictures if you want a smoother experience in specific regions.
- Surface points that are observed under grazing angles in all images are hard to reconstruct.
- Reconstruction might work better on overcast days (even illumination, no moving shadows, ...).
- Take overview images as well as close-ups for specific details (but remember the connecting shots).
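The two-circle strategy can be planned before going on site. The sketch below (purely illustrative; not part of the CR-PLAY tools) generates camera positions and yaw angles on a (semi-)circle around the object; running it twice with different radii gives the two capture passes:

```python
import math

def arc_positions(center, radius, n_shots, start_deg=0.0, end_deg=180.0):
    """Camera positions on a (semi-)circle around `center` in the ground
    plane, each paired with the yaw angle (degrees, 0 = facing +z) that
    points the camera at the object. Requires n_shots >= 2.
    """
    cx, cz = center
    positions = []
    for i in range(n_shots):
        a = math.radians(start_deg + (end_deg - start_deg) * i / (n_shots - 1))
        x, z = cx + radius * math.cos(a), cz + radius * math.sin(a)
        yaw = math.degrees(math.atan2(cx - x, cz - z))  # aim at the centre
        positions.append((x, z, yaw))
    return positions
```

Choosing `n_shots` from the overlap rule above (arc length divided by the maximum step) keeps both passes consistent with the overlap guidelines.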


Annex C – Gameplay examples

Fighting Games

The first usage example is a fighting game with left-right movement. The game camera can move left and right, but also rotate a bit around the Y-axis. If more rows of cameras are captured, zooming in and out is possible as well.

Figure 15 - (example from Mortal Kombat X): fighters are facing each other moving from left to right and vice-versa. Camera does the same movement, keeping them in the centre of the screen.

Figure 16 - (example from Mortal Kombat X): while doing special moves, camera rotates around Y-axis to emphasize the particular moment of gameplay.


For such a scenario, the capture of the environment should be done by moving along a line, taking pictures from left to right (or vice versa), with sufficient overlap of the environment between two consecutive photos.

To obtain the camera rotation shown in Figure 16, the same capture line should be followed, but with the camera kept rotated (as it is intended to be in the game).

Capturing multiple lines at different distances from the environment allows for a zoom in/out effect in the game.

Figure 17 – A snapshot from Unity showing a scene being prepared for a fighting game. Photos were taken from two parallel lines, with cameras facing forward and slightly left-right.


2D (2.5D) Platformers

The second example is a 2D platformer. The idea is to capture a scene from a frontal point of view and create the gameplay taking advantage of the captured area. Platforms, obstacles, etc. are created in 3D and integrated with the captured scene. Imagine a Rayman level (where movement is possible in four directions: left, right, up and down) where the environment is made with the CR-PLAY technology.

The camera can also rotate a bit to obtain the 2.5D effect.

Figure 18 - mockup inspired by RAYMAN

In this type of game, the character can jump, run, hit objects, etc., using 3D objects added to the scene by the game developer. The captured environment is bigger than a single screen, so the camera pans to keep the character in the centre of the screen, maximizing the 3D effect.
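The panning behaviour described above amounts to clamping the camera centre so that the view never leaves the captured background. A minimal sketch (illustrative only; the names are ours, not from the CR-PLAY prototype):

```python
def clamp_camera(target_x, target_y, half_w, half_h, area):
    """Keep a side-scroller camera centred on the character while never
    showing anything outside the captured area.

    `area` is (min_x, min_y, max_x, max_y) of the captured background;
    `half_w`/`half_h` are half the camera's visible width/height.
    """
    min_x, min_y, max_x, max_y = area
    cam_x = min(max(target_x, min_x + half_w), max_x - half_w)
    cam_y = min(max(target_y, min_y + half_h), max_y - half_h)
    return cam_x, cam_y
```

Near the edges of the captured area the camera stops following the character, exactly as in classic platformers, so the player never sees past the reconstructed scene.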

The capture of the environment should cover the entire game area, keeping in mind the different camera viewpoints needed in the game (i.e. just frontal, or also slightly rotated).


Angry Birds-like games

Similar to 2D platformers, with a different gameplay.

Figure 19 - snapshot from Angry Birds 2

Imagine substituting the background in Figure 19 with a scene captured from a real environment. Of course, platforms, birds and pigs will be created using traditional 3D assets.

The camera will pan in four directions, emphasizing the 3D feeling of the scene.


FPS

The gameplay is an FPS in which the player's rotation is limited to a certain range. The movement of the character is also restricted, to avoid the camera moving outside the captured field.

Figure 20 - snapshot from Silver Arrow, game prototype made with low-fidelity prototype of CR-PLAY Technology – Video is available at https://youtu.be/mKXsoB3OpKU

Capture cameras should be placed in accordance with the intended rotation of the game camera and the character's space of movement.


Rail shooters

Similar to an FPS, but the camera movement is controlled by the game, typically along a rail. The player moves the crosshair with the mouse to hit targets.

Figure 21 - snapshot from Rage HD

The design of the rail along which the camera moves should be carefully thought out in advance and used as a reference when capturing the environment.


Point&Click Adventures

The player uses the mouse to interact with objects or parts of the screen, which can be either 3D or captured. The captured scene allows for camera movements managed by the game (i.e. controlled pan/rotate).

Figure 22 - snapshot from Syberia II


Multidirectional shooters

Imagine a Wild West scenario where outlaws appear in windows, from behind corners, etc. The player can move the camera left/right/up/down and control the crosshair with the mouse to shoot them down.

Figure 23 - mockup image showing a possible scenario for multidirectional shooter


Urban sports

Games such as basketball and squash can be set in a captured urban scenario, with some 3D elements such as the basket, the player, etc. The camera can move to some extent.

Figure 24 - snapshots from SuperHoops, iOS game


Annex D – Improvements on Multi-View Intrinsic Images of Outdoors Scenes

