
Making VR with Unreal Engine

Luis Cataldi - Epic Games


Recent Developments

At Epic, we drive engine development by creating content. We use these projects as real-world test cases to drive our framework and engine optimizations.


New: Forward Shading Renderer with MSAA

Supported forward rendering features include:

• Full support for stationary lights, including dynamic shadows from movable objects that blend together with precomputed environment shadows
• Multiple reflection captures blended together with parallax correction
• Planar reflections of a partial scene, composited into reflection captures
• D-Buffer decals
• Precomputed lighting and skylights
• Unshadowed movable lights
• Capsule shadows
• Instanced stereo compatibility

The forward renderer can be faster than the deferred renderer for some content. Most of the performance improvement comes from features that can be disabled per material. By default, only the nearest reflection capture is applied without parallax correction unless the material opts in to High Quality Reflections, height fog is computed per vertex, and planar reflections are applied only to materials that enable them.
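The renderer choice is a project-level setting rather than a runtime switch. As a minimal sketch (assuming UE4 4.14+; the helper name is ours, not an engine API), the corresponding console variable can at least be read at runtime:

```cpp
// Sketch: read the forward-shading console variable at runtime. The setting
// itself lives in DefaultEngine.ini under [/Script/Engine.RendererSettings]
// (r.ForwardShading=1, r.MSAACount=4) and requires an editor restart.
#include "HAL/IConsoleManager.h"

bool IsForwardShadingEnabled()
{
    static const auto* ForwardCVar =
        IConsoleManager::Get().FindTConsoleVariableDataInt(TEXT("r.ForwardShading"));
    return ForwardCVar && ForwardCVar->GetValueOnAnyThread() != 0;
}
```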

Forward Rendering
"Forward rendering is the standard, out-of-the-box rendering technique that most engines use. You supply the graphics card the geometry, it projects it and breaks it down into vertices, and then those are transformed and split into fragments, or pixels, that get the final rendering treatment before they are passed onto the screen."

Deferred Rendering
"In deferred rendering, as the name implies, the rendering is deferred a little bit until all of the geometries have passed down the pipe; the final image is then produced by applying shading at the end."

Lighting Performance
"Lighting is one of the main reasons for going one route versus the other. In a standard forward rendering pipeline, the lighting calculations have to be performed on every vertex and on every fragment in the visible scene, for every light in the scene. Deferred rendering is a very interesting approach that reduces the object count, and in particular the total fragment count; it performs the lighting calculations on the pixels on the screen, thereby using the resolution size instead of the total fragment count."

Source: gamedevelopment.tutsplus.com/articles/forward-rendering-vs-deferred-rendering--gamedev-12342

There are advantages to both!

The UE4 deferred renderer is a full-featured workhorse, but it takes a bit of skill to fully leverage, and its Temporal Anti-Aliasing limits how sharp your image can be.

The new UE4 forward renderer is a specialized renderer with fewer features but a faster baseline, and Multisample Anti-Aliasing (MSAA) is the sharpest anti-aliasing solution.

Instanced Stereo Rendering

Instanced stereo rendering lets us use a single draw call to draw both the left and right eyes, saving CPU (and some GPU) time.

• Currently works on PC, and coming soon to PS4 and mobile platforms
• Enable it in your project's settings

Instanced stereo is available now for the deferred (desktop) renderer and the PS4. We're also implementing multi-view extension support for the forward (mobile) renderer, and improving PS4 support to better utilize the hardware.

Hidden and Visible Area Meshes

We draw meshes to cull out geometry that you can't see, and only apply post processing where you can. For the deferred renderer, the visible area mesh is the bigger optimization! This is specific per platform.

• The Hidden Area Mask uses a mesh to early-out on pixels that aren't visible in the final image.
• The Visible Area Mask uses a mesh to constrain our post processing to the visible pixels only.

Camera Refactor

As of 4.11, we've completely rewritten the camera system in order to make development much easier!

• Camera Components now move exactly as the real HMD moves
• You can attach components (meshes, UI, etc.) directly to the Camera Component!

Platform Support

As of 4.12, we support the following platforms out of the box:

• Oculus Rift
• Steam VR (including the HTC Vive)
• PSVR
• OSVR (preview)
• Samsung Gear VR
• Google Daydream
• Leap Motion

Create once, and deploy anywhere.

Desktop / Console: Oculus Rift, HTC Vive / Steam VR, PSVR, OSVR
Mobile: Samsung Gear VR, Google Daydream

All of these platforms go through UE4's common VR interfaces, so you can make your content once, and deploy it anywhere:

• Unified camera system
• Motion controller system
• Optimized rendering paths
• Low-latency optimizations

VR Project Template

We've added a new project template designed for virtual reality on desktop and console. The template can be selected from the Blueprint tab as a new project is created.

The motion controller template provides examples for object interaction and manipulation, as well as point-based teleportation.

The VR Editor

New: Contact Shadows

Contact shadows allow for highly detailed dynamic shadows on objects. The Contact Shadows feature adds a short ray cast in screen space against the depth buffer to determine whether a pixel is occluded from a given light. This helps provide sharp, detailed shadows at the contact point of geometry.

New: Automatic LOD Generation

Unreal Engine now automatically reduces the polygon count of your static meshes to create LODs!

Automatic LOD generation uses quadric mesh simplification. The mesh simplifier calculates the amount of visual difference that collapsing an edge (by merging two vertices) would generate. It then picks the edge with the least visual impact and collapses it, choosing the best position for the newly merged vertex and removing any triangles that collapsed along with the edge. It continues collapsing edges this way until it reaches the requested target triangle count.

More recent developments:

• New: Improved Per-Pixel Translucent Lighting
• New: Reflection Capture Quality Improvements
• New: Full Resolution Skin Shading
• UE4.11: Realistic Eye Shading
• UE4.11: Realistic Hair Shading
• UE4.11: Realistic Cloth Shading
• UE4.12: Cinematic Cameras and Cinematic Viewports

What's Next:

• All New Audio Engine for Unreal
• All New Animation Tools
• All New Mesh and Authoring Tools
• And much more!

How can we learn to harness the power of Unreal Engine?

VR Learning Resources for Unreal Engine:

Videos:
• 2015 UE4 - VR and Unreal Engine
• Making Bullet Train and Going off the Rails in VR
• VR Bow and Arrow Tutorial w/ Ryan Brucks - Training Livestream
• Training Livestream - Sam and Wes' VR Stream: Cameras, Multiplayer, Tips and Tricks!
• Creating Interactions in VR with Motion Controllers 1-3
• Setting Up VR Motion Controllers
• VR Networking and 3D Menus
• Up and Running with Gear VR
• Developing for VR
• Integrating the Oculus Rift into UE4

Presentations:
• UE4 VR - Niklas Smedberg
• Lessons from Integrating the Oculus Rift into UE4
• Going Off The Rails: The Making of Bullet Train

Links:
• Sam Deiter - 10 VR tips for Unreal Engine
• Tom Looman's - Getting Started with VR in Unreal Engine 4

The Unreal Engine Release Notes:
• Unreal Engine 4.14 Release Notes
• Unreal Engine 4.13 Release Notes
• Unreal Engine 4.12 Release Notes
• Unreal Engine 4.11 Release Notes
• Unreal Engine 4.10 Release Notes
• Unreal Engine 4.9 Release Notes
• Unreal Engine 4.8 Release Notes
• Unreal Engine 4.7 Release Notes
• Unreal Engine 4.6 Release Notes
• Unreal Engine 4.5 Release Notes
• Unreal Engine 4.4 Release Notes
• Unreal Engine 4.3 Release Notes
• Unreal Engine 4.2 Release Notes
• Unreal Engine 4.1 Release Notes

Mitchell McCaffrey’s - Mitch VR Labs

• Mitch's VR Lab - an Introduction

• Mitch's VR Lab - Look Based interaction

• Mitch's VR Lab - Simple Teleportation Mechanic

• Mitch's VR Lab - Introduction to SteamVR

• Mitch's VR Lab - Simple Head IK

• Mitch’s UE4 Forum Post

Education Community VR for UE4:

Free UE4 Community Blueprints:
• Communication Training - Zak Parrish
• Blueprints Compendium - VOLUME II
• BP_Compendium.pdf
• Network Compendium

Free UE4 Community VR Learning Channels:
• Unreal Engine VR Curriculum

Free UE4 Community ArchViz Learning Channels:
• Architectural Visualization Tutorials

Paid eLearning Courses:
• Unreal Engine 4: The Complete Beginner's Course
• Learn to Code in C++ by Developing Your First Game
• Complete Introduction to Unreal 4
• An Introduction to Creating Realistic Materials in UE4
• Master Blueprints in Unreal Engine 4 - Endless Runner
• Create a Helicopter Game Control System in Unreal Engine 4
• Unreal Essential Training - Lynda.com
• Unreal: Learn Lighting - Lynda.com
• 3dmotive - Unreal Engine courses
• Pluralsight - Unreal Engine courses

Much more coming soon.

Improved metadata support:
● Skill level
● Engine version
● Sitemap filters
● Checkpoints

Learning Resources for Unreal Engine:

• Getting the most value from the UE4 Launcher Learn Tab
• Getting the most value from Content Examples

Design Considerations for VR

One of the biggest issues for working in VR is motion/simulation sickness. How is it caused?

Sensory conflict theory holds that sickness occurs when a user's perception of self-motion is based on incongruent sensory inputs from the visual system, vestibular system, and non-vestibular proprioceptors, particularly when these inputs are at odds with the user's expectation based on prior experience.

en.wikipedia.org/wiki/Virtual_reality_sickness

Five typical causes of motion/simulation sickness in VR:

1. Non-forward movements - avoid unnatural movements
2. Awareness of vection - when a large part of the visual field moves, the viewer feels as if they have moved and that the world is stationary
3. The feeling of acceleration
4. Too much camera yaw
5. The lack of a static reference frame - adding one helps

Things we CAN DO in Unreal Engine to improve VR games and experiences

You MUST maintain framerate

For the VR experience to feel smooth, your game needs to run at 75 Hz (Oculus DK2) or even 90 Hz (HTC Vive and Oculus CV1), depending on the device. To see the current framerate, type "stat fps" or "stat unit" (for a more detailed breakdown) into your console when running the game.

Here is where VR Instanced Stereo can help:

"Basically, we're utilizing hardware instancing to draw both eyes simultaneously with a single draw call and pass through the render loop. This cuts down render thread CPU time significantly and also improves GPU performance. Bullet Train was seeing ~15-20% CPU improvement on the render thread and ~7-10% improvement on the GPU." - Ryan Vance

To enable this feature in 4.11 and above, go to your Project Settings and look for "Instanced Stereo" under the Rendering category.

Things to keep at the front of your mind:

• Check your performance constantly to ensure that you are hitting your VR performance targets.
• Maintain a very simple approach to making your content.
• Minimize complex shaders as much as possible.
• Add detail to the mesh, within reason, in lieu of relying on complex shaders for surface details.
• LODs and aggressive culling are a must to ensure that you are hitting your VR performance targets.

Known issues and possible workarounds:

Screen Space Reflections (SSR)
SSR will work in VR but may not give you the results that you want. Instead, consider working with reflection probes.

Normal Mapping Issues
When viewing normal maps on objects in VR, you will notice that they do not have the impact that they might once have had. This is because normal mapping does not account for a binocular display or motion parallax. Because of this, normal maps have a tendency to look flat when viewed with a VR device.

Parallax Mapping
Parallax mapping takes normal mapping to the next level by accounting for depth cues that normal mapping does not. A parallax shader can better display depth information, making objects appear to have more visually rich detail: no matter what angle you look from, a parallax map will always correct itself to show the appropriate depth information from that viewpoint. The best uses of a parallax map are cobblestone pathways and fine detail on surfaces.

Tessellation Shader Displacement
Tessellation shader displacement displaces 3D geometry in real time by adding detail that is not modeled into the object. Tessellation shaders do a great job of displaying this information because they actually create the missing detail, generating more vertices and displacing them in 3D space.

Launching VR Preview:

Testing your VR headset is straightforward: simply select "VR Preview" from the Play dropdown button. By default, head tracking will work without changes to your existing project or template.

GPU Profiling:

To capture a single frame with GPU timings, press Ctrl+Shift+, (comma) or type "profilegpu" in the console. This command collects accurate GPU timings; in VR you will find that certain processes are a heavy burden on the framerate (Ambient Occlusion is one common example).

See the GPU Profiling and the Performance and Profiling documentation.
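These console commands can also be issued from code, which is handy for binding a profiling toggle to a debug key. A small sketch (the function name is ours; ConsoleCommand is a stock APlayerController member):

```cpp
#include "GameFramework/PlayerController.h"

// Sketch: trigger the same profiling tools from a debug key binding.
void AVRPlayerController::OnProfileKeyPressed()
{
    ConsoleCommand(TEXT("stat unit"));   // CPU / GPU / frame time overlay
    // ConsoleCommand(TEXT("profilegpu")); // capture one frame of GPU timings
}
```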

Disable Heavy Post-Processing:

Due to the demanding requirements of VR, many of the advanced post-processing features that are normally used should be disabled. This needs to be done per level; a C++ sketch follows below.

• Add a Post Process (PP) volume to your level if there is not already one there.
• Select the PP volume and enable the Unbound option so that its settings are applied to the entire level.
• Expand each of the Post Process Volume's settings and disable any undesired active PP features by enabling the property (click on it) and setting the value from its default, usually 1.0, to 0.
• Consider first disabling the biggest offenders to VR performance, like Lens Flares, Screen Space Reflections, Screen Space Ambient Occlusion, and anything else that might be impacting performance.
• While some of these features are already disabled by settings in your .INI files, disabling them in the volume ensures that performance will not be affected if the .INI is changed by mistake.
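The same trimming can be scripted. A sketch, assuming you have the level's post-process volume at hand (the property names are the stock FPostProcessSettings overrides; the function itself is ours):

```cpp
#include "Engine/PostProcessVolume.h"

// Sketch: zero out some of the heaviest post effects on an unbound volume.
void DisableHeavyPostFX(APostProcessVolume* Volume)
{
    if (!Volume) return;

    Volume->bUnbound = true; // apply the settings to the whole level

    FPostProcessSettings& S = Volume->Settings;
    S.bOverride_LensFlareIntensity = true;
    S.LensFlareIntensity = 0.f;
    S.bOverride_ScreenSpaceReflectionIntensity = true;
    S.ScreenSpaceReflectionIntensity = 0.f;
    S.bOverride_AmbientOcclusionIntensity = true;
    S.AmbientOcclusionIntensity = 0.f;
}
```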

UE4 – Lighting for VR

• Dimmer lights & colors can help reduce simulation sickness.

• Use Static Lighting over Stationary or Dynamic.

• Make sure your Stationary / Dynamic Lights do not overlap.

• Baked lights are the best option for VR environments.

• If using Dynamic Shadows, only have one shadowing light.

• Use Stat LightRendering to see current lighting cost.

• Profile, Profile, Profile to ensure you are maintaining performance goals.

Fake Shadows Wherever You Can!!

Using cheats like a fake blob shadow drop to simulate dynamic shadows is a good option for keeping a VR project running at frame rate.

(Blob shadow example image by Eric Chadwick.)

UE4 – Effects for VR

• Mesh-based VFX work best for VR.
• Camera-facing particles do not hold up well in VR on their own, due to the stereoscopic view.
• The Dither Temporal AA material function can make opacity-masked objects look like translucent ones.
• Local-space rotation does not look correct in VR.

UE4 – Environments for VR

• Use reflection probes instead of screen space reflections.
• Again... textured blob shadows are a cheap alternative to dynamic shadows.
• The Merge Actor tool can help cut down on static mesh draw calls without having to do work outside of UE4.

Some very important things we all need to know about Unreal Engine.

The Unreal Engine Framework

• GameInstance
• GameMode
• Pawn Class
• HUD Class
• PlayerController Class
• GameState Class
• PlayerState Class

GameMode: The GameMode is the definition of the game.

● It should include things like the game rules and win conditions.
● It also holds important information about the Pawn, PlayerController, GameState, and PlayerState classes the game uses. A minimal C++ sketch of this wiring follows.
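In C++, that wiring is a handful of assignments in the GameMode constructor. A sketch with illustrative class names (the VR* classes are hypothetical and mirror the Blueprints built later in this talk):

```cpp
#include "GameFramework/GameMode.h"
#include "VRGameMode.generated.h"

UCLASS()
class AVRGameMode : public AGameMode
{
    GENERATED_BODY()

public:
    AVRGameMode()
    {
        // The GameMode declares which framework classes this game uses.
        DefaultPawnClass      = AVRPawn::StaticClass();
        PlayerControllerClass = AVRPlayerController::StaticClass();
        GameStateClass        = AVRGameState::StaticClass();
        PlayerStateClass      = AVRPlayerState::StaticClass();
        HUDClass              = AVRHUD::StaticClass();
    }
};
```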


Pawn: The Pawn class is the base class of all Actors that can be controlled by players or AI.

● The Pawn represents the physical location, rotation, etc. of a player or entity within the game.
● A Character is a special type of Pawn that has the ability to walk around.


PlayerController: A PlayerController is the interface between the Pawn and the human player controlling it.

● The PlayerController decides what to do and then issues commands to the Pawn (e.g. "start crouching", "jump").
● Putting input handling or other functionality into the PlayerController is often necessary.
● The PlayerController persists throughout the game, while the Pawn can be transient.


GameInstance: The GameInstance is a class whose state persists across switching of levels, game modes, pawns, etc., whereas classes like GameMode or PlayerController are reset and the data stored in them is removed.


PlayerState: A PlayerState is the state of a participant in the game, such as a human player or a bot that is simulating a player. Non-player AI that exists as part of the game would not have a PlayerState.


GameState: The GameState contains the state of the game, which could include things like the list of connected players, the score, where the pieces are in a chess game, or the list of missions you have completed in an open-world game.


HUD: The HUD is the base object for displaying elements overlaid on the screen. Every human-controlled player in the game has their own instance of the AHUD class, which draws to their individual Viewport.

Base building blocks in the Unreal Engine:

• Object - the base building block in the Unreal Engine
• Actor - any object that can be placed into a level
• Pawn - a subclass of Actor that serves as an in-game avatar
• Character - a subclass of Pawn that is intended to be used as a player character

Common Pawn subclasses and their typical components:

• Character: CharacterMovementComponent, CapsuleComponent, SkeletalMeshComponent, etc.
• DefaultPawn: DefaultPawnMovementComponent, StaticMeshComponent, CollisionComponent, etc.
• WheeledVehicle: VehicleMovementComponent, SkeletalMeshComponent, PhysicsHandle, etc.
• SpectatorPawn

A Controller possesses a Pawn in a 1-to-1 relationship.

How about programming interactivity for VR?

Programming VR Interaction with Blueprints

Blueprints in Unreal Engine is a complete visual scripting system based on the concept of using a node-based interface to create interactions from within Unreal Editor.

Learning Blueprints through Content Examples

Hey!! We need AUDIO for VR too!!

UE4 – Audio for VR: Ambient Sound Actors

The Ambient Sound Actor can be used for many purposes, such as ambient looping sounds and non-looping sounds. Generally, the Ambient Sound Actor conforms to the real world: the closer you are to a sound, the louder it will appear.

UE4 – Audio for VR: Sound Properties

You can assign a sound asset from the Details panel by selecting an asset from the Sound settings drop-down menu, or by highlighting a sound asset in the Content Browser and clicking the button.

UE4 – Audio for VR: Attenuation Properties

Attenuation is the ability of a sound to decrease in volume as the player moves away from it. It is advisable to use Sound Attenuation objects whenever possible, if for no other reason than to give broad control over the settings for many Actors. A sketch of the equivalent code call follows.
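From code, the equivalent of placing an attenuated sound is a single GameplayStatics call. A sketch (the function and asset names here are placeholders; SpawnSoundAtLocation is a stock UGameplayStatics call):

```cpp
#include "Kismet/GameplayStatics.h"

// Sketch: play a positional sound that reuses a shared Sound Attenuation asset.
void PlayWindAt(UObject* WorldContext, USoundBase* WindCue,
                USoundAttenuation* SharedAttenuation, const FVector& Location)
{
    UGameplayStatics::SpawnSoundAtLocation(
        WorldContext, WindCue, Location, FRotator::ZeroRotator,
        /*Volume*/ 1.f, /*Pitch*/ 1.f, /*StartTime*/ 0.f,
        SharedAttenuation); // one attenuation object shared by many sounds
}
```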

UE4 – Audio for VR: New: Stereo Spatialization

3D spatialization is now possible for stereo audio assets. The 3D stereo spread parameter defines the distance in game units between the left and right channels, along a vector perpendicular to the listener-emitter vector.

UE4 – Audio for VR: Audio Volumes

Audio Volumes allow you to control and apply various sounds in your level, as well as provide an avenue to create compartmentalized audio zones where you can control what is heard inside and outside of the volume.

Additional toolsets in Unreal Engine to enhance VR:

• A complete, state-of-the-art suite of AI tools
• A complete set of tools for animation and animation retargeting

So What's Next?

Let's build a new VR project, VR Pawn, and VR PlayerController!

Begin a new project with the Third Person Template:
✓ Desktop/Console
✓ Maximum Quality
✓ With Starter Content

Bring in new assets:
✓ Infinity Blade Grass Lands
✓ Infinity Blade Ice Lands
✓ Audio

Navigate to the Infinity Blade Grass Lands folder:
✓ Open the ElvenRuins map

➔ Establish a Profiler workflow with camera bookmarks
➔ Evaluate the profile data
➔ Address performance issues on a per-case basis

* Look out for post-process related issues
* Look out for lighting and shadow related issues
* Look out for issues related to reflections
* Look out for issues related to transparency

** Consider the tradeoffs of keeping or changing these elements in the level

Working with the GPU Profiler

Utilize BumpOffset material nodes to enhance the VR experience.

** When viewing normal maps on objects in VR, you will notice that they do not have the same impact. This is because normal mapping does not account for a binocular display or motion parallax; normal maps will often look flat when viewed with a VR device. That does not mean you should not or will not need to use normal maps; it just means you need to evaluate more closely whether the data you are trying to convey in the normal map would be better made out of geometry. There are different techniques that can be used in place of normal maps, and this is a good place to explore BumpOffset.

The BumpOffset ReferencePlane:
✓ Specifies the approximate height in texture space at which to apply the effect. A value of 0 will appear to distort the texture completely off the surface, whereas a value of 0.5 (the default) means that some of the surface will pop off while some areas will be sunken in.

Understand the different parts of the Blueprint editor:
✓ Components
✓ Menu Bar
✓ Details
✓ Viewport
✓ Construction Script
✓ Graph Editor
✓ My Blueprint: Variables, Functions, Components
✓ Debug
✓ Compiler Results

Build the VR-Pawn-BP:
✓ Scale the Capsule Component height to 120
✓ Add a Spring Arm Component
✓ Add a Camera Component
✓ Nest the Camera Component under the Spring Arm
✓ Zero out the Target Arm Length of the Spring Arm
✓ Move the Spring Arm up 90 units in Z
✓ Toggle on the Use Pawn Control Rotation tick box

Build the VR-Pawn-BP event graph (a C++ sketch of the whole pawn follows below):
✓ Move to the Event Graph
✓ Get the InputAxis for LookUp
✓ Connect it to Add Controller Pitch Input
✓ Get the InputAxis for Turn
✓ Connect it to Add Controller Yaw Input
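For reference, here is roughly what that pawn looks like in C++. This is a sketch of the Blueprint described above, not code from the course; the component defaults follow the checklist:

```cpp
#include "GameFramework/Pawn.h"
#include "Components/CapsuleComponent.h"
#include "Components/InputComponent.h"
#include "GameFramework/SpringArmComponent.h"
#include "Camera/CameraComponent.h"
#include "VRPawn.generated.h"

UCLASS()
class AVRPawn : public APawn
{
    GENERATED_BODY()

public:
    AVRPawn()
    {
        Capsule = CreateDefaultSubobject<UCapsuleComponent>(TEXT("Capsule"));
        Capsule->SetCapsuleHalfHeight(120.f); // "scale the capsule height to 120"
        RootComponent = Capsule;

        SpringArm = CreateDefaultSubobject<USpringArmComponent>(TEXT("SpringArm"));
        SpringArm->SetupAttachment(RootComponent);
        SpringArm->TargetArmLength = 0.f;                        // zero out the arm length
        SpringArm->SetRelativeLocation(FVector(0.f, 0.f, 90.f)); // up 90 units in Z
        SpringArm->bUsePawnControlRotation = true;               // the tick box above

        Camera = CreateDefaultSubobject<UCameraComponent>(TEXT("Camera"));
        Camera->SetupAttachment(SpringArm); // nest the camera under the spring arm
    }

    virtual void SetupPlayerInputComponent(UInputComponent* Input) override
    {
        Super::SetupPlayerInputComponent(Input);
        // LookUp -> pitch, Turn -> yaw, exactly as wired in the event graph.
        Input->BindAxis("LookUp", this, &AVRPawn::AddControllerPitchInput);
        Input->BindAxis("Turn",   this, &AVRPawn::AddControllerYawInput);
    }

private:
    UPROPERTY() UCapsuleComponent*   Capsule;
    UPROPERTY() USpringArmComponent* SpringArm;
    UPROPERTY() UCameraComponent*    Camera;
};
```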

Create a new GameMode Override:
✓ Call the new GameMode Override something like VRGameMode
✓ Save it to something like the Blueprints folder

Change the Default Pawn Class to VR-Pawn-BP:
✓ Assign VR-Pawn-BP to the Default Pawn Class on the GameMode Override

Create a new PlayerController Class Blueprint in the GameMode Override:
✓ Call the new PlayerController something like VRPlayerController
✓ Save it in the Blueprints folder

Navigate to the Project Settings:
✓ Add some inputs

Create two new Action Mappings:
✓ Create a Teleport Action Mapping and assign it the Middle Mouse Button and the Gamepad Right Shoulder button
✓ Create a Glide Action Mapping and assign it the Right Mouse Button and the Gamepad Left Shoulder button

Back in the VRPlayerController, build the Glide line trace Blueprint graph:
✓ In the Event Graph of the VRPlayerController, bring in the input event for Glide
✓ Get Player Camera Manager
✓ Pull out GetActorLocation
✓ Pull out GetActorForwardVector
✓ Multiply the GetActorForwardVector by a float, and promote the multiplier to a GlideDistance variable set to something like 1500 units
✓ Create a LineTraceByChannel
✓ Connect the execution pin from the Glide input
✓ Connect the Start to GetActorLocation
✓ Connect the End to the addition of GetActorLocation and the multiplication result above

Create two new Vector variables (a C++ sketch of the trace follows below):
✓ Create a StartTrace Vector variable
✓ Create an EndTrace Vector variable

Create a BreakHitResult and connect the new variables:
✓ Set EndTrace to the TraceEnd of the BreakHitResult
✓ Wire through the Set StartTrace, but feed it from a GetPlayerPawn into a GetActorLocation node
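A rough C++ equivalent of that graph (a sketch; the member names mirror the variables created above):

```cpp
// Sketch: trace forward from the camera and record the glide endpoints.
void AVRPlayerController::UpdateGlideTrace()
{
    if (!PlayerCameraManager) return; // stock APlayerController member

    const FVector Start = PlayerCameraManager->GetActorLocation();
    const FVector End   =
        Start + PlayerCameraManager->GetActorForwardVector() * GlideDistance; // ~1500

    FHitResult Hit;
    GetWorld()->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility);

    if (APawn* P = GetPawn())
    {
        StartTrace = P->GetActorLocation(); // as in the graph: pawn location, not camera
    }
    EndTrace = Hit.TraceEnd; // the Break Hit Result's TraceEnd pin
}
```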

Create a Timeline called something like GlideCurve:
✓ Make a float curve called something like GlideFloat
✓ Add 2 keys
✓ Set the first key to 0 and 0
✓ Set the second key to something like 0.9 and 0.9
✓ Set the Use Last Keyframe tick box

Use the Timeline component's SetPlayRate node to adjust the rate:
✓ Wire the SetPlayRate through to the StartTrace
✓ Compute the NewRate as 1 divided by a new variable called RateOfGlide
✓ Set RateOfGlide to something like 20
✓ Wire this through to the PlayFromStart of the GlideCurve Timeline

Add a Lerp (Vector) and a SetActorLocation node (see the sketch after this list):
✓ Connect the GlideFloat output from the Timeline to the Lerp Alpha
✓ Connect StartTrace to A
✓ Connect EndTrace to B
✓ Wire the Timeline to a SetActorLocation brought in from the GetPlayerPawn node
✓ Connect the Lerp output to the Location X and Y, but take the Z location from the PlayerPawn
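The timeline's float output then drives the interpolation. A sketch of the update function the timeline would call (hypothetical names):

```cpp
// Sketch: called by the GlideCurve timeline with its float output as Alpha.
void AVRPlayerController::OnGlideUpdate(float Alpha)
{
    APawn* P = GetPawn();
    if (!P) return;

    FVector NewLocation = FMath::Lerp(StartTrace, EndTrace, Alpha);
    NewLocation.Z = P->GetActorLocation().Z; // keep Z from the pawn, as in the graph
    P->SetActorLocation(NewLocation);
}
```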

Compile and test the Glide input:
✓ Compile
✓ Test
✓ Tweak

Duplicate the trace with inputs for the Teleport:
✓ Copy the first part of the graph
✓ Paste it below

For the Teleport input, change the GlideDistance to a new variable called TeleportDistance:
✓ Delete the GlideDistance from the Teleport trace
✓ Add a new variable called TeleportDistance
✓ Set it to a distance of 10000

For the Teleport input, add a dot-product test, a CanTeleport Boolean, and Project Point to Navigation (sketched in C++ below):
✓ Create a new variable called CanTeleport of type Boolean
✓ Set it by dragging out from the Impact Normal of the Hit Result, setting Z to 1, dragging out a GreaterThan node set to something like 0.9, and wiring it to the CanTeleport node
✓ Wire that through to a Branch node to test True
✓ Wire all of that into the graph
✓ Drag out from Location and create a ProjectPointToNavigation connected to the SetEndTrace
✓ Drag out from it, AddVector, and connect it to the SetActorLocation after adding 90 to the Z axis to offset the player height
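In C++ the whole test reads roughly as below. A sketch: member names are ours, and the nav call shown is the UNavigationSystemV1 form used by newer engine versions (4.x-era code went through UNavigationSystem instead):

```cpp
#include "NavigationSystem.h"

// Sketch: teleport only onto near-horizontal, navigable ground.
void AVRPlayerController::TryTeleport(const FHitResult& Hit)
{
    bCanTeleport = Hit.ImpactNormal.Z > 0.9f; // the dot-product / GreaterThan test
    if (!bCanTeleport) return;

    UNavigationSystemV1* Nav = UNavigationSystemV1::GetCurrent(GetWorld());
    FNavLocation Projected;
    if (Nav && Nav->ProjectPointToNavigation(Hit.Location, Projected))
    {
        if (APawn* P = GetPawn())
        {
            // Add 90 units in Z to offset the player height.
            P->SetActorLocation(Projected.Location + FVector(0.f, 0.f, 90.f));
        }
    }
}
```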

Let's add a Teleport Preview Sphere:
✓ Create a new Actor Class Blueprint

Make a new Material:
✓ Create a new Material called something like VR-MAT
✓ Change it to a Translucent Blend Mode with the Unlit Shading Model
✓ Add a white Constant3Vector for the Emissive Color
✓ Wire a Scalar value, a Fresnel node, and a Power node into the Opacity
✓ Set the Scalar value to something like 4.5
✓ Apply it to the VR-Preview sphere

In the VRPlayerController, let's set up the VR-Preview:
✓ Duplicate the Teleport trace graph
✓ Change the Teleport input to an Event Tick input

Add a SpawnActor from Class to the new VR-Preview graph:
✓ Use Event Begin Play to call the SpawnActorFromClass node
✓ Assign the VR-Preview-BP to it
✓ Wire the HitResult's Location into the Spawn Transform
✓ Promote the Return Value to a variable called something like VRPreview

Call the PreviewSphere to toggle its visibility:
✓ Use the Branch statement from CanTeleport to set up the visibility toggle for the VR-Preview sphere component
✓ Wire the output to the SetActorLocation node with the VR-Preview-BP as the target to be relocated
✓ Make sure to offset the sphere up in Z using the AddVector node

Let's lay in some localized VR audio:
✓ Navigate to the Audio folder
✓ Find the Bird and Wind sounds
✓ Create a Sound Cue and drag in the 7 bird sounds
✓ With the bird sounds selected, choose Random from the palette to auto-connect them
✓ Drag in a wind loop sound
✓ Add a Looping palette node
✓ Wire it all through
✓ Save the Cue
✓ Create another mix Cue for the top of the hill with howling winds
✓ Set Override Attenuation on the new Cue
✓ Scale up the Radius
✓ Increase the Falloff Distance values

Let's build some interactable Blueprints:
✓ Make a new Actor Class Blueprint
✓ Call it something like Idol-BP

Assemble the new Idol Blueprint:
✓ Bring in an angel statue Static Mesh component
✓ Bring in a torch Static Mesh component
✓ Bring in a fire Particle System component
✓ Bring in a sphere mesh component
✓ Duplicate the VR-MAT material and tint the color to be slightly yellowish in hue

Right-click to create a new Blueprint Interface (a C++ equivalent is sketched below):
✓ Call the new Blueprint Interface something like VR-Interact-BPI
✓ Make a new BPI function called OnLookAt
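The same interface declared in C++, for comparison. This is purely illustrative; the course builds it as a Blueprint Interface asset, so the file below is hypothetical:

```cpp
#include "UObject/Interface.h"
#include "VRInteract.generated.h"

UINTERFACE(BlueprintType)
class UVRInteract : public UInterface
{
    GENERATED_BODY()
};

class IVRInteract
{
    GENERATED_BODY()

public:
    // Implemented by anything (like Idol-BP) that reacts to being looked at.
    UFUNCTION(BlueprintNativeEvent, BlueprintCallable, Category = "VR")
    void OnLookAt();
};
```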

Let's add an Interact Action Mapping input:
✓ Call it something like Interact
✓ Assign it to the F key on the keyboard
✓ Assign it to the Gamepad Face Button Bottom

Add the Interact Action Mapping to the VRPlayerController Blueprint:
✓ Duplicate the LineTrace and wire in the Interact ActionMapping

Wire the OnLookAt BPI into the VRPlayerController (see the C++ sketch below):
✓ Add an IsValid to make sure you are hitting an object from the HitResult's Hit Actor
✓ Wire that through to the OnLookAt node from the BPI
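In C++ the IsValid check plus interface call looks like this sketch (names as in the interface above; Execute_OnLookAt is the generated static wrapper used to call a BlueprintNativeEvent interface function):

```cpp
// Sketch: fire OnLookAt on whatever valid, interface-implementing actor the trace hit.
void AVRPlayerController::OnInteractHit(const FHitResult& Hit)
{
    AActor* Target = Hit.GetActor();
    if (IsValid(Target) &&
        Target->GetClass()->ImplementsInterface(UVRInteract::StaticClass()))
    {
        IVRInteract::Execute_OnLookAt(Target);
    }
}
```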

Let's replace the existing angel idol with our new Idol-BP:
✓ Select the Idol-BP in the Content Browser
✓ Select the angel mesh in the level
✓ Right-click and use the Replace Selected Actors With utility to swap it for the Idol-BP
✓ Zero out the rotation in Z to get it facing the right way

Modify the Interact-MAT to prepare it for interaction:
✓ Add a Scalar node called something like MatPower
✓ Multiply it with the Fresnel node
✓ Pipe the output to the Power node
✓ This will make the sphere around the idol go transparent
✓ We will drive the MatPower value from the Idol-BP to toggle the visibility of the sphere

Back in the Idol-BP, set up the torch and sphere preview (sketched in C++ below):
✓ Call the BPI event OnLookAt
✓ Wire in a 0.2 Delay
✓ Wire in a Create Dynamic Material Instance node with the sphere as the target
✓ Wire in a Set Scalar Parameter Value with the Return Value as the target, MatPower as the Parameter Name, and the Value set to 0.8
✓ Add another Delay of about 1.3 duration
✓ Reset the Scalar Parameter Value to 0 for the MatPower parameter name
✓ Wire in a Set Visibility for the Particle System to turn it on after the sphere material blinks out
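A sketch of that reveal in C++ (Sphere and Fire are assumed component members of the idol; the Blueprint uses Delay nodes, replaced here by a timer):

```cpp
#include "Materials/MaterialInstanceDynamic.h"
#include "TimerManager.h"

// Sketch: fade the preview sphere out via its MatPower parameter,
// then restore it and light the torch particles ~1.3s later.
void AIdol::OnLookAt_Implementation()
{
    UMaterialInstanceDynamic* MID = Sphere->CreateAndSetMaterialInstanceDynamic(0);
    if (!MID) return;

    MID->SetScalarParameterValue(TEXT("MatPower"), 0.8f); // sphere goes transparent

    FTimerHandle Handle;
    GetWorldTimerManager().SetTimer(Handle, [this, MID]()
    {
        MID->SetScalarParameterValue(TEXT("MatPower"), 0.f);
        Fire->SetVisibility(true); // torch fire appears after the blink
    }, 1.3f, /*bLoop*/ false);
}
```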

Place more idols around the level, both Idol-BP and Static Mesh idols:
✓ This way, the VR player can explore the level looking for interactable objects

Let's keep track of all the idols we interact with to affect gameplay:
✓ Add an Integer variable in the Idol-BP called IdolValue
✓ Add an Integer variable in the VR-Pawn-BP called IdolCount
✓ Create a new GameState in the GameMode Override called VRGameState

In the VRGameState (a C++ sketch follows below):
✓ Create a Custom Event
✓ Call it something like LevelTest
✓ Cast to the VR-Pawn-BP
✓ Use it to get IdolCount
✓ Test whether IdolCount is equal to or greater than X
✓ Branch to test if True
✓ Create a Boolean variable called LevelClear, set to False by default
✓ If the Branch statement tests True, set LevelClear to True
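Sketched in C++, the LevelTest event is a short comparison (names from the checklist; RequiredIdolCount stands in for the "X" above):

```cpp
#include "Kismet/GameplayStatics.h"

// Sketch: mark the level clear once the pawn has counted enough idols.
void AVRGameState::LevelTest()
{
    if (const AVRPawn* P = Cast<AVRPawn>(UGameplayStatics::GetPlayerPawn(this, 0)))
    {
        bLevelClear = (P->IdolCount >= RequiredIdolCount);
    }
}
```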

In the Idol-BP, insert this increment logic:
✓ After the OnLookAt event and the Delay...
✓ Cast to the VR-Pawn-BP
✓ Get IdolCount
✓ Add IdolValue to IdolCount
✓ Set IdolCount
✓ Wire that through to the DynamicMaterialInstance

** You can add a PrintString after the Set IdolCount to see the value increasing on screen

Let's set up a simple Sequencer sequence to play once we find all the idols:
✓ In the top menu's Cinematics dropdown, choose Add Level Sequence
✓ Once the Sequencer window opens, add a Camera Cuts track, an Events track, and a Fade track
✓ Use the camera icon to add a new Cinematic Camera
✓ Keyframe the camera movement as desired
✓ Add a set of doors as Actors to Sequencer
✓ Use the Rotation transform to keyframe the doors opening as the camera animates into place during the sequence
✓ You can add a Fade track to fade in and out with ease

Back at the Idol-BP, add a Level Sequence Actor variable:
✓ Add a Level Sequence Actor
✓ Set it to Public

Let's modify the torch flame and change the color for effect:
✓ Change the initial color
✓ Change the color over life
✓ Add a light to the smoke
✓ Remove the light from the Trans_Square

Duplicate the Idol-BP to Idol-Trans-BP:
✓ Change the Particle System to the blue fire

Modify the Idol-Trans-BP Event Graph to open a new level:
✓ Use the same Event OnLookAt from the BPI to run a Delay
✓ Set the visibility on for the blue flame Particle System
✓ Test to ensure that LevelClear is True
✓ Delay again
✓ Then Open Level
✓ Promote Level Name to a new variable and make it Public

Assign the next level in the Idol-Trans-BP (a C++ sketch of the open-level step follows):
✓ Assign a NextLevel name in the public variable of the placed idol in the Details panel
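The final open-level step maps to a single GameplayStatics call. A sketch (LevelName is the promoted public variable; bLevelClear mirrors the GameState flag):

```cpp
#include "Kismet/GameplayStatics.h"

// Sketch: after the delays, travel to the level named on this idol.
void AIdolTrans::OpenNextLevel()
{
    if (bLevelClear && LevelName != NAME_None)
    {
        UGameplayStatics::OpenLevel(this, LevelName);
    }
}
```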

Setting up an optional fade effect for the Teleport (an FTimeline-based C++ sketch follows below):
• Add a Sphere Static Mesh component to the VR-Pawn
• Turn collision OFF on the sphere or it will block the LineTrace
✓ If you want to fade to white, make the Vector3 node color white; if you want to fade to black, make it black
✓ Make the material Two Sided
✓ Call the parameter "Alpha"
✓ Create a Custom Event in the VR-Pawn and set up the blink graph
✓ Make a Timeline to drive the fade
✓ Set the out value to 0.09 and set Use Last Keyframe
✓ Drive the Scalar Parameter of the Alpha with the Timeline
✓ ** Make sure to connect the Switch on Timeline Direction on Finished
✓ Wire the FadeSphere Custom Event into the Teleport graph before the SetActorLocation
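In C++ the blink graph maps onto an FTimeline driven from Tick. A sketch (FadeCurve and FadeSphereMID are assumed members; the Blueprint's Timeline node does this bookkeeping for you):

```cpp
#include "Components/TimelineComponent.h"

// Sketch: bind a float curve to the fade timeline and drive the
// fade sphere material's "Alpha" parameter with it.
void AVRPawn::BeginPlay()
{
    Super::BeginPlay();

    FOnTimelineFloat Progress;
    Progress.BindUFunction(this, FName("OnFadeUpdate"));
    FadeTimeline.AddInterpFloat(FadeCurve, Progress); // FadeCurve: UCurveFloat*
}

void AVRPawn::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);
    FadeTimeline.TickTimeline(DeltaSeconds); // FTimeline member, advanced manually
}

void AVRPawn::OnFadeUpdate(float Value) // must be a UFUNCTION to bind by name
{
    FadeSphereMID->SetScalarParameterValue(TEXT("Alpha"), Value);
}
```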


Give it a try, it’s a lot of fun.  

luis.cataldi@epicgames.com
