High Dynamic Range and other Fun Shader Tricks

Simon Green

Demo Group Motto

“If you can’t make it good, make it big. If you can’t make it big, make it shiny.”

Overview

The OpenGL vertex program and texture shader extensions enable hardware acceleration of effects that were previously only possible in offline rendering. Three case studies:

- High Dynamic Range Images
- Real-time Caustics
- Procedural Terrain

Case Study 1: High Dynamic Range

Summary: displaying HDR images in realtime.

Definition of dynamic range: the ratio of the maximum intensity in an image to the minimum detectable intensity.

- Most imagery used in computer graphics today is stored in 8 bits per component
- Low Dynamic Range: 0 = black, 255 = white
- Light in the real world is not constrained in this way!
- The dynamic range between bright sunshine and shadow can easily be 10,000 to 1

Exposure time = 0.270 s

Exposure time = 0.133 s

Exposure time = 0.066 s

Exposure time = 0.033 s

What is High Dynamic Range?

- The human visual system adapts automatically to changes in brightness
- In photography, shutter speed and lens aperture are used to control the amount of light that reaches the film
- HDR imagery attempts to capture the full dynamic range of light in real world scenes
- Measures radiance = amount of energy per unit time per unit solid angle per unit area, W / (sr·m²)

8 bits is not enough!

Why Do We Need HDR?

- It effectively allows us to change the exposure after we've taken/rendered the picture
- Dynamic adaptation effects, e.g. moving from a bright outdoor environment to indoors
- Allows physically plausible image-based lighting
- BRDFs may need high dynamic range
- Enables realistic optical effects: glows around bright light sources, more accurate motion blurs

Creating HDR Images from Photographs

"Recovering High Dynamic Range Radiance Maps from Photographs", Debevec, Malik, Siggraph 1997Using several images of the same scene taken with different exposures:

Calculates the non-linear response curve of cameraRecovers the actual radiance at each pixel

Environment maps can be captured by either:Photographing a mirrored sphere (“lightprobe”)Combining 2 or more 180 degree fisheye images

Displaying HDR Images

To display an HDR image at a given exposure, we use the following equation:

Z = f(Et)

where:
Z = pixel value
E = irradiance value
t = exposure time
f = camera response curve
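
As a concrete illustration of Z = f(Et), here is a minimal CPU sketch that re-exposes a recovered radiance value. The gamma 2.2 curve standing in for f is an assumption; a real camera curve would come from the Debevec/Malik fit.

    /* Sketch: map recovered irradiance E through an assumed response
       curve at exposure time t to get a displayable 8-bit pixel value. */
    #include <math.h>

    unsigned char expose(float E, float t)
    {
        float x = E * t;                     /* irradiance times exposure time */
        float f = powf(x, 1.0f / 2.2f);      /* assumed response: gamma 2.2    */
        if (f > 1.0f) f = 1.0f;              /* clamp to displayable white     */
        return (unsigned char)(f * 255.0f);  /* quantize to pixel value Z      */
    }

Changing the exposure then amounts to re-running this with a different t, without re-capturing the image.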

Displaying HDR Images using Graphics Hardware

Previous work: "Real-Time High Dynamic Range Imagery", Cohen, Tchou, Hawkins, Debevec, Eurographics 2001.

- Split the HDR image into several 8-bit textures, displayed by recombining them using multitexturing and register combiners on NVIDIA TNT2 and above
- Hard because combiners treat texture values as fixed-point numbers between 0 and 1; the largest number you can multiply by is 4
- Requires different combiner setups for different exposure ranges, so exposure can only be changed on a per-primitive basis

Representing HDR Imagery in OpenGL

- GeForce3/4 support a 16-bit format known as HILO
- Stores 2 16-bit components: (HI, LO, 1)
- Filtered by hardware at 16-bit precision
- The signed version is intended for storing high-precision normal maps
- We can also use this format to store high(er) dynamic range imagery
- Remap floating point HDR data to a gamma encoded 16-bit fixed-point range [0, 65535]
- Unfortunately there are only two components, so we need two HILO textures to store RGB (see the sketch below)
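
A minimal sketch of packing float RGB radiance into two HILO16 textures with gamma 2.2 encoding. The GL_HILO16_NV / GL_HILO_NV tokens come from NV_texture_shader (via glext.h); the maxval scale factor and the (R,G)/(B,0) split across the two textures are assumptions consistent with the two-pass display described later.

    /* Sketch: encode float RGB radiance into two 16-bit HILO textures.
       Texture A holds (R,G); texture B holds (B, unused). */
    #include <GL/gl.h>
    #include <math.h>
    #include <stdlib.h>

    static unsigned short encode16(float v, float maxval)
    {
        float x = v / maxval;                 /* normalize to [0,1] */
        if (x < 0.0f) x = 0.0f;
        if (x > 1.0f) x = 1.0f;
        x = powf(x, 1.0f / 2.2f);             /* gamma encode       */
        return (unsigned short)(x * 65535.0f);
    }

    void upload_hdr_hilo(const float *rgb, int w, int h, float maxval,
                         GLuint texRG, GLuint texB)
    {
        unsigned short *rg = malloc(w * h * 2 * sizeof *rg);
        unsigned short *b  = malloc(w * h * 2 * sizeof *b);
        for (int i = 0; i < w * h; i++) {
            rg[i*2+0] = encode16(rgb[i*3+0], maxval);   /* HI = red   */
            rg[i*2+1] = encode16(rgb[i*3+1], maxval);   /* LO = green */
            b[i*2+0]  = encode16(rgb[i*3+2], maxval);   /* HI = blue  */
            b[i*2+1]  = 0;
        }
        glBindTexture(GL_TEXTURE_2D, texRG);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_HILO16_NV, w, h, 0,
                     GL_HILO_NV, GL_UNSIGNED_SHORT, rg);
        glBindTexture(GL_TEXTURE_2D, texB);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_HILO16_NV, w, h, 0,
                     GL_HILO_NV, GL_UNSIGNED_SHORT, b);
        free(rg); free(b);
    }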

Displaying HDR Images using the OpenGL Texture Shader Extension

- To display the image, we need to multiply the HDR radiance values by the exposure factor, then re-map them to the displayable [0, 255] range
- This can be achieved using the GL_DOT_PRODUCT_TEXTURE_2D operation of the OpenGL texture shader extension
- Exposure is sent as texture coordinates; the dot product performs the multiply for both channels
- We create a 2D texture that maps the result back to displayable values

Displaying HDR Images using OpenGL Texture Shaders

NVParse code:

    !!TS1.0
    texture_cube_map();
    dot_product_2d_1of2(tex0);
    dot_product_2d_2of2(tex0);

Pseudo code:

    0: hilo = texture_cube_map(hdr_texture, s0, t0, r0)
    1: dot1 = s1*hi + t1*lo + r1*1.0  // = r_exposure*r + 0 + r_bias
    2: dot2 = s2*hi + t2*lo + r2*1.0  // = 0 + g_exposure*g + g_bias
       color = texture_2d(lut_texture, dot1, dot2)

Displaying HDR Images using OpenGL Texture Shaders

Requires 2 passes to render RGB, using glColorMask to mask off the color channels.

First pass renders R and G:

    texcoord1 = (r_exposure, 0.0, r_bias)
    texcoord2 = (0.0, g_exposure, g_bias)

Second pass renders B:

    texcoord1 = (0.0, 0.0, 0.0)
    texcoord2 = (b_exposure, 0.0, b_bias)

Exposure = 0.25

Exposure = 0.0625

Exposure = 0.015625

HDR Effects

- HDR Fresnel
- Glow
- Automatic exposure
- Vignette

HDR Fresnel

- Surfaces more tangent to the viewer reflect more
- Reflectivity can vary by a factor of 20 or more
- An HDR environment map produces more accurate results
- Calculate per-vertex in a vertex program
- Approximate the Fresnel function as (1 - V.N)^p
- Send down exposure as a texture coordinate (see the sketch below)
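
A minimal sketch of the (1 - V.N)^p approximation named above, written in C as it might run on the CPU or be transliterated into a vertex program; the exponent value is an assumption.

    #include <math.h>

    /* Approximate Fresnel term: reflectivity rises toward grazing angles.
       V = normalized vector from surface point to the eye,
       N = normalized surface normal, p = sharpness exponent (e.g. 4). */
    float fresnel_approx(const float V[3], const float N[3], float p)
    {
        float VdotN = V[0]*N[0] + V[1]*N[1] + V[2]*N[2];
        if (VdotN < 0.0f) VdotN = 0.0f;      /* back-facing guard */
        return powf(1.0f - VdotN, p);        /* (1 - V.N)^p       */
    }

The result scales the HDR reflection fetched from the environment map before the exposure remap is applied.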

Image-Space Glow

Also known as:

- Glare
- Specular bloom
- Flare

- Blur an image of the bright parts of the scene
- Can use hardware mipmap generation and LOD bias to calculate the box filtering
- Ideally the convolution should be done with HDR values
- A real Gaussian blur would be smoother
- Blend back on top of the original image
- The glow reaches around the object
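
A sketch of the blur-and-blend step, assuming the SGIS_generate_mipmap and EXT_texture_lod_bias extensions; glowTex, the bias of 3.0 and drawFullScreenQuad() are assumptions.

    void glow_pass(GLuint glowTex, int texW, int texH)
    {
        /* Copy the bright-pass image into a texture whose mipmap chain
           is built by the hardware, giving a cheap box-filtered blur. */
        glBindTexture(GL_TEXTURE_2D, glowTex);
        glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP_SGIS, GL_TRUE);
        glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, texW, texH);

        /* A positive LOD bias forces sampling from coarser, blurrier levels */
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                        GL_LINEAR_MIPMAP_LINEAR);
        glTexEnvf(GL_TEXTURE_FILTER_CONTROL_EXT, GL_TEXTURE_LOD_BIAS_EXT, 3.0f);

        /* Blend the blurred image back on top of the original */
        glEnable(GL_BLEND);
        glBlendFunc(GL_ONE, GL_ONE);   /* glow = original + blurred */
        drawFullScreenQuad();          /* hypothetical helper       */
        glDisable(GL_BLEND);
    }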

Original image

Blurred version

Glow = original + blurred

Image Based Lighting

- Lighting synthetic objects with "real" light
- An environment map represents all light arriving at a point for each incoming direction
- By convolving (blurring) an environment map with the diffuse reflection function (N.L) we can create a diffuse reflection map
- Indexed by surface normal N, this gives the sum of N.L for all light sources in the hemisphere
- Very slow to create
- Low frequency, so the cube map can be small, e.g. 32x32x6
- HDRShop will do this for you (a brute-force sketch follows)
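
For intuition, a brute-force sketch of the diffuse convolution over a set of environment samples. The dir/rad arrays and the cosine-weight normalization are assumptions; tools like HDRShop do this far more efficiently.

    /* Diffuse convolution: for a given normal N, sum the radiance of every
       environment sample weighted by max(N.L, 0) over the hemisphere. */
    void diffuse_convolve(const float (*dir)[3],   /* unit sample directions  */
                          const float (*rad)[3],   /* HDR radiance per sample */
                          int n, const float N[3], float out[3])
    {
        float wsum = 0.0f;
        out[0] = out[1] = out[2] = 0.0f;
        for (int i = 0; i < n; i++) {
            float w = N[0]*dir[i][0] + N[1]*dir[i][1] + N[2]*dir[i][2];
            if (w <= 0.0f) continue;          /* below the hemisphere */
            out[0] += w * rad[i][0];
            out[1] += w * rad[i][1];
            out[2] += w * rad[i][2];
            wsum   += w;
        }
        if (wsum > 0.0f) {                    /* normalize the cosine lobe */
            out[0] /= wsum; out[1] /= wsum; out[2] /= wsum;
        }
    }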

References"Recovering High Dynamic Range Radiance Maps from Photographs", Debevec, Malik, Siggraph 1997"Real-time High Dynamic Range Texture Mapping", Cohen, Tchou, Hawkins, Debevec, Eurographics Rendering Workshop 2001“Illumination and Reflection Maps: Simulated Objects in Simulated and Real Environments”, Gene S. Miller and C. Robert Hoffman, Siggraph1984 Course Notes for Advanced Computer Graphics Animation"Real Pixels", Greg Ward, Graphics Gems II P.80-83http://www.debevec.org/

Case Study 2: Real-time Caustics

Summary: simulating refractive caustics in realtime using OpenGL and vertex programs. Inspired by Jos Stam's work at A/W.

What are caustics?

- Light patterns seen on the bottom of swimming pools
- Caused by the focusing of reflected or refracted light
- Traditionally calculated offline using photon mapping etc.
- Usually approximated in realtime using pre-calculated textures

Step 1: Generate Water Surface

- Drawn as a triangle mesh
- Displaced using 4 octaves of procedural noise
- Each octave translates at a speed proportional to its frequency
- Calculated on the CPU (see the sketch below)
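
A minimal CPU sketch of the displacement sum, assuming some 2D noise function noise2 (for example Perlin noise); the lacunarity and gain values are assumptions.

    #include <math.h>

    extern float noise2(float x, float y);   /* assumed 2D noise in [-1,1] */

    /* Height of the water surface at (x, z) at time t: four noise octaves,
       each translating at a speed proportional to its frequency. */
    float water_height(float x, float z, float t)
    {
        float h = 0.0f, freq = 1.0f, amp = 0.5f;
        for (int octave = 0; octave < 4; octave++) {
            h    += amp * noise2(x * freq + t * freq, z * freq);
            freq *= 2.0f;     /* next octave: double frequency */
            amp  *= 0.5f;     /* ...and half amplitude         */
        }
        return h;
    }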

Step 2: Refract Light Ray

Using a vertex program:

- Calculate the light ray from the local light source to the surface vertex
- Calculate the refraction of the ray about the vertex normal
- Determine the intersection between the refracted ray and the floor plane: y = y0 + yd * t = 0, so t = -y0 / yd
- Set the vertex position to the intersection

This gives the refracted mesh on the bottom of the pool (see the sketch below).
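
A CPU-side sketch of the two steps, using the standard Snell's-law refraction formula and the y = 0 floor plane from the slide; the helper names are assumptions.

    #include <math.h>

    /* Refract unit incident ray I about unit normal N with relative index
       of refraction eta. Returns 0 on total internal reflection. */
    int refract3(const float I[3], const float N[3], float eta, float T[3])
    {
        float ci = -(I[0]*N[0] + I[1]*N[1] + I[2]*N[2]);   /* cos(theta_i) */
        float k  = 1.0f - eta*eta*(1.0f - ci*ci);
        if (k < 0.0f) return 0;
        float a = eta*ci - sqrtf(k);
        for (int i = 0; i < 3; i++) T[i] = eta*I[i] + a*N[i];
        return 1;
    }

    /* Intersect ray origin O + t*D with the floor plane y = 0:
       t = -Oy / Dy, then P = O + t*D. */
    void hit_floor(const float O[3], const float D[3], float P[3])
    {
        float t = -O[1] / D[1];
        for (int i = 0; i < 3; i++) P[i] = O[i] + t*D[i];
    }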

Step 3: Simulate Light Focusing

- Use additive blending
- But we want intensity to be inversely proportional to the area of each triangle
- Assuming the same amount of light hits each triangle on the surface, smaller triangles = more focused, therefore brighter
- We could send all three triangle vertices to calculate the area, but that would be slow
- Trick: use texture LOD as a measure of projected area

Step 3: Simulate Light Focusing (cont.)

Create a texture with just two mipmap levels:

- 2x2: all black
- 1x1: all white

- Apply the texture to the refracted mesh, setting texture coordinates so that pixels map roughly to texels
- With tri-linear filtering, this produces shades of gray depending on how much the texture is minified
- Unfortunately this is view dependent
- Solution: render the caustics from above using an orthographic projection
- Copy to a texture (see the sketch below)
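
A sketch of creating the two-level LOD-measurement texture; lodTex is an assumption, the rest follows the slide.

    /* Two-level mipmap: level 0 is a 2x2 black image, level 1 is a 1x1
       white image. Trilinear minification then fades toward white as the
       refracted triangles shrink (i.e. as light is focused). */
    void make_lod_texture(GLuint lodTex)
    {
        static const GLubyte black2x2[2*2*3] = { 0 };
        static const GLubyte white1x1[1*1*3] = { 255, 255, 255 };

        glBindTexture(GL_TEXTURE_2D, lodTex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 2, 2, 0,
                     GL_RGB, GL_UNSIGNED_BYTE, black2x2);
        glTexImage2D(GL_TEXTURE_2D, 1, GL_RGB, 1, 1, 0,
                     GL_RGB, GL_UNSIGNED_BYTE, white1x1);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                        GL_LINEAR_MIPMAP_LINEAR);        /* trilinear */
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    }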

Step 4: Final Surface Refraction

- Calculate the refraction of the view vector about the surface normal
- Intersect the refracted ray with the floor
- Calculate texture coordinates for the caustic texture
- Also calculate the reflected ray, used to index into an environment cubemap
- Attenuate the reflection using the Fresnel approximation
- Result: convincing refractive caustics in real-time
- Can also do the refraction three times with different indices of refraction to simulate refractive dispersion (aka "chromatic aberration"), as sketched below
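
A sketch of the dispersion variant, reusing refract3 and hit_floor from the earlier sketch; the per-channel indices of refraction and the sample_caustic lookup are assumptions.

    extern float sample_caustic(float x, float z, int channel); /* hypothetical */

    /* Refractive dispersion: refract once per color channel with slightly
       different indices of refraction, then sample the caustic lookup
       separately for R, G and B. */
    void disperse(const float viewDir[3], const float normal[3],
                  const float vertexPos[3], float rgb[3])
    {
        const float eta[3] = { 0.74f, 0.75f, 0.76f };  /* assumed R,G,B etas */
        float T[3], P[3];
        for (int c = 0; c < 3; c++) {
            rgb[c] = 0.0f;
            if (!refract3(viewDir, normal, eta[c], T))
                continue;                   /* total internal reflection */
            hit_floor(vertexPos, T, P);     /* where this channel lands  */
            rgb[c] = sample_caustic(P[0], P[2], c);
        }
    }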

References

Jos Stam's "Periodic Caustic Textures":
http://www.dgp.toronto.edu/people/stam/reality/Research/PeriodicCaustics/index.html

Case Study 3: Procedural Terrain

Summary: generate procedural terrain using vertex programs, register combiners and 3D textures.

Advantages of Procedural Modeling:

- Small storage requirements
- Non-repeating
- Parameterized

Disadvantages of Procedural Modeling:

- Computation time
- Harder to control
- Not really practical on current hardware

Step 1: Noise in Vertex Program

- Displace a triangle mesh using procedural noise
- The geometry doesn't move, just the displacement
- Similar to Perlin noise
- Uses a permutation table stored in constant memory
- Generates a repeatable random value using recursive lookups into the table based on vertex position
- Interpolates between 4 neighbors to produce a smooth result
- Requires 42 vertex program instructions! (a CPU sketch follows)
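
For reference, a CPU sketch of permutation-table noise matching the description above (a value-noise variant; the table contents and hash scheme are assumptions, and the actual vertex program may differ).

    #include <math.h>

    /* 2D value noise from a permutation table: hash the integer lattice
       corners through the table, then bilinearly blend the 4 neighbors. */
    static const unsigned char perm[256] = { 151, 160, 137, 91, /* ...fill
        with a random permutation of 0..255... */ };

    static float hash01(int x, int y)
    {
        int h = perm[(perm[x & 255] + y) & 255];   /* recursive table lookup */
        return h / 255.0f;                         /* map to [0,1]           */
    }

    float value_noise2(float x, float y)
    {
        int   ix = (int)floorf(x), iy = (int)floorf(y);
        float fx = x - ix, fy = y - iy;
        float a = hash01(ix,     iy);
        float b = hash01(ix + 1, iy);
        float c = hash01(ix,     iy + 1);
        float d = hash01(ix + 1, iy + 1);
        float top = a + fx * (b - a);              /* interpolate in x */
        float bot = c + fx * (d - c);
        return top + fy * (bot - top);             /* then in y        */
    }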

Step 2: Ridged Multi-fractal Function

Ken Musgrave's trick to make noise look more like terrain.

Ridges:

- Take the absolute value of signed noise
- Subtract it from 1
- Square the result to produce sharper ridges

"Multi-fractal":

- Scale each octave by the previous result
- Valleys are smooth, peaks are rough

We only have room for 2 octaves (see the sketch below).
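
A sketch of the ridged multifractal built from the value noise above; signed noise is derived as 2n - 1, and the per-octave weighting follows the scheme summarized on the slide.

    /* Ridged multifractal height: ridge(n) = (1 - |n|)^2 on signed noise,
       with each octave scaled by the previous octave's result. */
    float ridged_mf(float x, float y)
    {
        float height = 0.0f, freq = 1.0f, weight = 1.0f;
        for (int octave = 0; octave < 2; octave++) {      /* room for 2 */
            float n = 2.0f * value_noise2(x * freq, y * freq) - 1.0f;
            float ridge = 1.0f - fabsf(n);                /* abs, invert  */
            ridge *= ridge;                               /* sharpen      */
            height += ridge * weight;
            weight  = ridge;                              /* multifractal */
            freq   *= 2.0f;
        }
        return height;
    }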

Ridged Multi-fractal Function

Step 3: Lighting

- Hard to calculate normals in a vertex program
- Calculate in image space using register combiners instead
- Render the terrain height field from above, with color = height
- Copy to a texture
- Bind it to three texture units with offset texture coordinates
- Calculate an approximate normal in the register combiners: ( h(x,y)-h(x+1,y), h(x,y)-h(x,y+1), 1 )
- Calculate diffuse lighting as N.L (see the sketch below)
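
The same finite-difference normal written out in C for clarity; the real version runs per-pixel in the register combiners, and the height() lookup and light vector L are assumptions.

    #include <math.h>

    extern float height(int x, int y);   /* assumed height-field lookup */

    /* Approximate the normal by forward differences, then diffuse N.L. */
    float diffuse(int x, int y, const float L[3])
    {
        float N[3] = {
            height(x, y) - height(x + 1, y),   /* -dh/dx */
            height(x, y) - height(x, y + 1),   /* -dh/dy */
            1.0f
        };
        float len = sqrtf(N[0]*N[0] + N[1]*N[1] + N[2]*N[2]);
        float ndl = (N[0]*L[0] + N[1]*L[1] + N[2]*L[2]) / len;
        return ndl > 0.0f ? ndl : 0.0f;        /* clamp like the combiners */
    }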

Step 4: Texturing

Use a 3D texture:

- A different terrain type in each 2D slice
- The R texture coordinate determines the terrain type, computed in the vertex program based on height (see the sketch below)
- Unfortunately 3D textures are mip-mapped in 3D!
- From a distance, all layers blend to a single image
- Slices can be duplicated to help avoid the blending

Add reflective lakes:

- Render the scene upside down to a texture
- Displace it in image space using GL_OFFSET_PROJECTIVE_TEXTURE_2D_NV
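
A sketch of deriving the R coordinate from terrain height; numSlices, the height range, and the half-slice centering are all assumptions.

    /* Map terrain height to the R coordinate of the 3D texture, centering
       each height band on one of the numSlices 2D layers. */
    float terrain_r(float h, float hMin, float hMax, int numSlices)
    {
        float t = (h - hMin) / (hMax - hMin);        /* normalize to [0,1] */
        if (t < 0.0f) t = 0.0f;
        if (t > 1.0f) t = 1.0f;
        /* offset by half a slice so t = 0 samples the center of slice 0 */
        return (t * (numSlices - 1) + 0.5f) / numSlices;
    }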

Terrain Textures

The Future

Hardware is getting:

- Faster
- More programmable
- Higher precision

Today's off-line rendering effects will be real-time tomorrow. Start thinking about it now!

Questions?

E-mail: [email protected]