Efficient Cloth Modeling and Rendering — graphics.stanford.edu/~lensch/papers/egwr01-cloth



Efficient Cloth Modeling and Rendering

Katja Daubert, Hendrik P. A. Lensch, Wolfgang Heidrich, and Hans-Peter Seidel

Max-Planck-Institut für Informatik · The University of British Columbia¹

Abstract. Realistic modeling and high-performance rendering of cloth and clothing is a challenging problem. Often these materials are seen at distances where individual stitches and knits can be made out and need to be accounted for. Modeling of the geometry at this level of detail fails due to sheer complexity, while simple texture mapping techniques do not produce the desired quality. In this paper, we describe an efficient and realistic approach that takes into account view-dependent effects such as small displacements causing occlusion and shadows, as well as illumination effects. The method is efficient in terms of memory consumption, and uses a combination of hardware and software rendering to achieve high performance. It is conceivable that future graphics hardware will be flexible enough for full hardware rendering of the proposed method.

1 Introduction

Fig. 1. Woolen sweater rendered using our approach (knit and perl loops).

One of the challenges of modeling and rendering realistic cloth or clothing is that individual stitches or knits can often be resolved from normal viewing distances. Especially with coarsely woven or knitted fabric, the surface cannot be assumed to be flat, since occlusion and self-shadowing effects become significant at grazing angles. This rules out simple texture mapping schemes as well as bump mapping. Similarly, modeling all the geometric detail is prohibitive both in terms of the memory requirements and rendering time. On the other hand, it is probably possible to compose a complex fabric surface from copies of individual weaving or knitting patterns unless the viewer gets close enough to the fabric to notice the periodicity. This leads to approaches like virtual ray-tracing [5], which are more feasible in terms of memory consumption, but still result in long rendering times.

In this paper we present a fast and memory-efficient method for modeling and rendering fabrics that is based on replicating weaving or knitting patterns. While the rendering part currently makes use of a combination of hardware and software rendering, it is conceivable that future graphics hardware will be flexible enough for full hardware rendering.

Our method assumes we have one or a small number of stitch types, which are repeated over the garment. Using a geometric model of a single stitch, we first compute

¹ Part of this work was done during a research visit of the first author to The University of British Columbia.


the lighting (including indirect lighting and shadows) using the methods described in [3]. By sampling the stitch regularly within a plane we then generate a view-dependent texture with per-pixel normals and material properties.

Before we cover the details of this representation in Section 3, we will briefly summarize related work in Section 2. We then describe acquisition and fitting of data from modeled micro-geometry in Sections 4 and 5. After discussing the rendering algorithm in Section 6 we finally present our results in Section 7.

2 Related Work

In order to efficiently render replicating patterns such as cloth without explicitly representing the geometry at the finest level, we can choose between several different representations. The first possibility is to compose global patterns of parts with precomputed illumination, such as light fields [13] and Lumigraphs [6]. However, these approaches assume fixed illumination conditions, and expanding them to arbitrary illumination yields an 8-dimensional function (which has been called the reflectance field [4]) that is too large to store for practical purposes.

Another possibility is to model the patterns as volumes [7, 14] or simple geometry (for example, height fields) with a spatially varying BRDF. Hardware accelerated methods for rendering shadowing and indirect illumination in height fields have been proposed recently [8, 16], as well as hardware algorithms for rendering arbitrary uniform [9, 10] and space-variant materials [11]. However, the combination of space-variant materials with bump or displacement maps is well beyond the capabilities of current graphics hardware. This would require an excessive number of rendering passes, which is neither practical in terms of performance nor in terms of numerical precision.

For high-performance rendering we therefore need to come up with more efficient representations that allow us to simulate view-dependent geometric effects (shadowing and occlusion) as well as illumination effects (specularity and interreflection) for space-variant materials in a way that is efficient both in terms of memory and rendering time.

In work parallel to ours, Xu et al. [18] developed the lumislice, a rendering method for textiles that is more tailored for high-quality, off-line rendering, whereas our method uses more precomputation to achieve near-interactive performance. In fact, the lumislice could be used as a way to precompute the data structures we use.

The method we propose is most closely related to bidirectional texture functions [2] and virtual ray-tracing [5]. As we will discuss below, our representation is, however, more compact and is easy to filter for correct anti-aliasing. Our approach is also related to image-based rendering with controllable illumination, as described by Wong et al. [17]. Again, our representation is more compact, easier to filter, and lends itself to partial use of graphics hardware. Future hardware is likely to have enough flexibility to eliminate the remaining software steps, making the method suitable for interactive applications.

3 Data Representation

Our representation of cloth detail is based on the composition of repeating patterns (individual weaves or knits) for which efficient data structures are used. In order to capture the variation of the optical properties across the material, we employ a spatially varying BRDF representation. The two spatial dimensions are point sampled into a 2D array. For each entry we store different parameters for a Lafortune reflection model [12], a


lookup table, as well as the normal and tangent.

An entry's BRDF f_r(l, v) for the light direction l and the viewing direction v is given by the following equation:

    f_r(l, v) = T(v) · f_l(l, v)                                                      (1)

where f_l(l, v) denotes the Lafortune model and T(v) is the lookup table².

The Lafortune model itself consists of a diffuse part ρ_d and a sum of lobes³:

    f_l(l, v) = ρ_d + Σ_i [ C_{x,i} l'_x v'_x + C_{y,i} l'_y v'_y + C_{z,i} l'_z v'_z ]^{N_i}    (2)

Each lobe's shape and size is defined by its four parameters C_x, C_y, C_z, and N. Since f_l is wavelength dependent, we represent every parameter as a three-dimensional vector, one dimension per color channel. Before evaluating the lobe we transform the light and viewing direction into the local coordinate system given by the sampling point's average normal and tangent, yielding l' and v'. In order to account for area foreshortening we multiply by l'_z.

The lookup table T(v) stores color and alpha values for each of the original viewing directions. It therefore closely resembles the directional part of a light field. Values for directions not stored in the lookup table are obtained by interpolation. Although general view-dependent reflection behavior, including highlights etc., could be described by a simple Lafortune BRDF, we introduce the lookup table to take more complex properties like shadowing and masking (occlusion) into account that are caused by the complex geometry of the underlying cloth model.
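To make Equations (1) and (2) concrete, the following is a minimal pure-Python sketch of the per-channel evaluation. The function and parameter names (`clamped_pow`, `lafortune`, `brdf_entry`, the lobe tuples) are our own illustration, not code from the paper.

```python
def clamped_pow(x, n):
    # The bracket operator in Eq. (2): returns 0 for negative bases (footnote 3).
    return 0.0 if x <= 0.0 else x ** n

def lafortune(l, v, rho_d, lobes):
    """Evaluate Eq. (2) for one color channel.
    l, v  -- light/view directions in the sample's local frame (unit 3-vectors)
    lobes -- list of (Cx, Cy, Cz, N) tuples, one per lobe
    """
    f = rho_d
    for cx, cy, cz, n in lobes:
        f += clamped_pow(cx * l[0] * v[0] + cy * l[1] * v[1] + cz * l[2] * v[2], n)
    return f

def brdf_entry(l, v, rho_d, lobes, lookup):
    # Eq. (1): modulate the Lafortune model by the view-dependent lookup value,
    # then apply the foreshortening factor l_z.
    return lookup(v) * lafortune(l, v, rho_d, lobes) * l[2]

# Example: one near-specular lobe and a constant ("white") lookup table.
l = (0.0, 0.0, 1.0)          # light from straight above
v = (0.0, 0.0, 1.0)          # viewed from straight above
val = brdf_entry(l, v, 0.2, [(-0.5, -0.5, 0.8, 10.0)], lambda v: 1.0)
```

For head-on light and view the lobe term reduces to C_z^N, so the sketch returns ρ_d + 0.8¹⁰ here.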

Like in redistribution bump mapping [1], this approach aims at simulating the occlusion effects that occur in bump maps at grazing angles. In contrast to redistribution bump mapping, however, we only need to store a single color value per viewing direction, rather than a complete normal distribution. Figure 5 demonstrates the effect of the modulation with the lookup table. The same data, acquired from the stitch model shown in the middle, was used to fit a BRDF model without a lookup table, only consisting of several cosine lobes (displayed on the left cloth in Figure 5), and a model with an additional lookup table (cf. Figure 5 on the right). Both images were rendered using the same settings for light and viewing direction. Generally, without a lookup table, the BRDF tends to blur over the single knits. Also, the BRDF without the lookup table clearly is not able to capture the color shifts to red at grazing angles, which are nicely visible on the right cloth.

The alpha value stored in the lookup table is used to evaluate the transparency. It is not considered in the multiplication with f_l, but used as described in Section 6 to determine if there is a hole in the model at a certain point for a given viewing direction. The alpha values are interpolated similarly to the color values.

4 Data Acquisition

After discussing the data structure we use for representing the detail of the fabrics, we now describe how to obtain the necessary data from a given 3D model.

² Both T(v) and f_l are defined for each color channel, so · denotes the component-wise multiplication of the color channels.

³ The operator [x]^N is defined to return zero if x < 0.


We model the base geometry of our knits and weaves using implicit surfaces, the skeletons of which are simple Bézier curves. By applying the Marching Cubes algorithm we generate triangle meshes, which are the input for our acquisition algorithm.

Now we can obtain the required data. As mentioned in Section 3, the spatial variations of the fabric pattern are stored as a 2D array of BRDF models. Apart from radiance samples L(l, v) for all combinations of viewing and light directions, we also need an average normal, an average tangent, and an alpha value for each viewing direction for each of these entries.

We use an extension of Heidrich et al.'s algorithm [8] to triangle meshes [3], which allows us to compute the direct and indirect illumination of a triangle mesh for a given viewing and light direction per vertex in hardware (for details see [3]). In order to account for masking and parts of the repeated geometry being visible through holes, we paste together multiple copies of the geometry.

Fig. 2. Computing the sampling locations for the radiance values. Left: top view; middle: projection; right: resulting sampling locations, discarding samples at holes.

Now we need to collect the radiance data for each sampling point. We obtain the 2D sampling locations by first defining a set of evenly spaced sampling points on the top face of the model's bounding box, as can be seen on the left in Figure 2. Then we project these points according to the current viewing direction (see Figure 2 in the middle) and collect the radiance samples from the surface visible through these 2D projections (see Figure 2 right), similarly to obtaining a light field.

Note that, due to parallax effects, for each entry we combine radiance samples from a number of different points on the actual geometry. Like in [17], we will use this information from different surface points to fit a BRDF for the given sampling location.

for each viewing direction v {
    ComputeSamplingPoints();
    RepeatScene(vertex color = normals);
    StoreNormals();
    StoreAlpha();
    for each light direction l {
        ComputeLighting();
        RepeatScene(vertex color = lighting);
        StoreRadiance();
    }
}
AverageNormals();

Fig. 3. Pseudocode for the acquisition procedure.

As the stitch geometry can have holes, there might be no surface visible at a sampling point for a certain viewing direction. We store this information as a boolean transparency in the alpha channel for that sample. Multiple levels of transparency values can be obtained by super-sampling, i.e. considering the neighboring pixels.
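A minimal sketch of that super-sampling step, assuming a per-view boolean hit mask (True where a surface is visible); the `hits` grid and the function name are hypothetical:

```python
def alpha_from_hits(hits, x, y):
    """Fractional alpha for sample (x, y) by super-sampling the 3x3
    neighborhood of a boolean hit mask. Border samples use only the
    neighbors that exist."""
    h, w = len(hits), len(hits[0])
    vals = [hits[j][i]
            for j in range(max(0, y - 1), min(h, y + 2))
            for i in range(max(0, x - 1), min(w, x + 2))]
    return sum(vals) / len(vals)

# A hole (False) surrounded by surface samples yields a partial alpha.
mask = [[True,  True,  True],
        [True,  False, True],
        [True,  True,  True]]
a = alpha_from_hits(mask, 1, 1)   # 8 of the 9 samples hit the surface
```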

In order to compute the normals, we display the scene once for each viewing direction with the normals coded as color values. An average normal is computed by adding the normals separately for each sampling point and averaging them at the end. We can construct a tangent from the normal and the bi-normal, which in turn we define as the vector perpendicular to both the normal and a fixed coordinate axis. Figure 3 shows how the steps are put together in the acquisition algorithm.
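The tangent-frame construction can be sketched as follows; we pick the x-axis as the fixed reference axis purely for illustration, and the function names are ours:

```python
import math

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    s = math.sqrt(v[0] ** 2 + v[1] ** 2 + v[2] ** 2)
    return (v[0] / s, v[1] / s, v[2] / s)

def frame_from_normal(n, axis=(1.0, 0.0, 0.0)):
    """Build (tangent, bi-normal) for an averaged normal n.
    The bi-normal is perpendicular to both n and the fixed axis;
    the tangent then completes an orthonormal frame."""
    b = normalize(cross(axis, n))
    t = normalize(cross(b, n))
    return t, b

t, b = frame_from_normal((0.0, 0.0, 1.0))
```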


5 Fitting Process

Once we have acquired all the necessary data, we use it to find an optimal set of parameters for the Lafortune model for each entry in the array of BRDFs. This fitting procedure can be divided into two major steps which are applied alternately. At first, the parameters of the lobes are fit. Then, in the second step, the entries of the lookup table are updated. Now the lobes are fit again, and so on.

Given the set of all radiance samples and the corresponding viewing and light directions acquired for one sampling point, the fitting of the parameters of the Lafortune model f_l requires a non-linear optimization method. As proposed in [12], we applied the Levenberg-Marquardt algorithm [15] for this task.

The optimization is initiated with an average gray BRDF with a moderate specular highlight and slightly anisotropic lobes, e.g. C_x slightly smaller than C_y for the first lobe and C_y slightly smaller than C_x for the second, if two lobes are fit. For the first fitting of the BRDF the lookup table T(v) is ignored, i.e. all its entries are set to white.

After fitting the lobe parameters, we need to adapt the sampling point's lookup table T(v). Each entry of the table is fit separately. This time only those radiance samples of the sampling point that correspond to the viewing direction of the current entry are considered. The optimal color for one entry minimizes the following set of equations:

    [ L(l_1, v), L(l_2, v), ..., L(l_n, v) ]^T ≈ T(v) · [ f_l(l_1, v), f_l(l_2, v), ..., f_l(l_n, v) ]^T    (3)

where L(l_1, v), ..., L(l_n, v) are the radiance samples of the sampling point with the common viewing direction v and the distinct light directions l_1, ..., l_n. The currently estimated lobes are evaluated for every light direction, yielding f_l(l_i, v). Treating the color channels separately, Equation 3 can be rewritten by replacing the column vector on its left side by L(v) and the vector on its right side by f(v), yielding L(v) ≈ T(v) · f(v). The least squares solution to this equation is given by

    T(v) = ⟨f(v), L(v)⟩ / ⟨f(v), f(v)⟩                                                (4)

where ⟨·, ·⟩ denotes the dot product. This is done separately for every color channel and easily extends to additional spectral components.
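Equation (4) is a simple projection per color channel. A pure-Python sketch (the function name is ours):

```python
def fit_lookup_entry(L, f):
    """Least-squares color for one lookup-table entry, Eq. (4):
    T(v) = <f, L> / <f, f>, computed for one color channel.
    L -- radiance samples for one viewing direction, one per light direction
    f -- Lafortune model evaluated for the same light directions
    """
    return sum(fi * li for fi, li in zip(f, L)) / sum(fi * fi for fi in f)

# If every radiance sample is exactly twice the model prediction,
# the optimal lookup entry is 2.
t = fit_lookup_entry([0.2, 0.6, 1.0], [0.1, 0.3, 0.5])
```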

To further improve the result we alternately repeat the steps of fitting the lobes and fitting the lookup table. The iteration stops as soon as the average difference between the previous lookup table's entries and the new lookup table's entries is below a certain threshold.

In addition to the color, each entry in the lookup table also contains an alpha value indicating the opacity of the sample point. This value is fixed for every viewing direction and is not affected by the fitting process. Instead it is determined through ray-casting during the data acquisition phase.

Currently, we also derive the normal and tangent at each sample point directly from the geometric model. However, the result of the fitting process could probably be further improved by also computing a new normal and tangent to best fit the input data.

5.1 Mip-Map Fitting

The same fitting we have done for every single sample point can also be performed for groups of sample points. Let a sample point be a texel in a texture. Collecting all


radiance samples for four neighboring sample points, averaging the normals, and fitting the lobes and the entries of the lookup table then yields the BRDF corresponding to a texel on the next higher mip-map level.

By grouping even more sample points, further mip-map levels can be generated. The overall effort per level stays the same, since the same number of radiance samples is involved at each level.
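The grouping step can be sketched as follows, assuming each texel carries the list of its radiance samples (the data layout and function name are our own):

```python
def next_mip_level(samples):
    """Pool the radiance-sample lists of 2x2 neighboring texels.
    samples -- 2D grid (list of rows) of per-texel sample lists.
    Each coarser texel is fit from the union of its four children's
    samples, so the total sample count per level stays constant."""
    h, w = len(samples), len(samples[0])
    return [[samples[2 * j][2 * i] + samples[2 * j][2 * i + 1] +
             samples[2 * j + 1][2 * i] + samples[2 * j + 1][2 * i + 1]
             for i in range(w // 2)]
            for j in range(h // 2)]

level0 = [[[1], [2]], [[3], [4]]]   # 2x2 texels, one sample each
level1 = next_mip_level(level0)     # 1x1 texel holding all four samples
```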

6 Rendering

After the fitting process has been completed for all sampling points we are ready to apply our representation of fabric patterns to a geometric model. We assume the given model has per-vertex normals and valid texture coordinates in [0, n]², where n is the number of times the pattern is to be repeated across the whole cloth geometry. Furthermore, we assume the fabric patterns are stored in a 2D array, the dimensions of which correspond to the pattern's spatial resolution (res_x, res_y). Our rendering algorithm then consists of four steps:

1. Interpolate per-pixel normals
2. Compute indices into the pattern array, yielding a BRDF f_r
3. Evaluate f_r with light and view mapped into the geometry's local coordinate system
4. Write the result to the framebuffer

The goal of Step 1 is to estimate a normal for each visible point on the object. We do this by color coding the normals at the vertices and rendering the scene using Gouraud shading. Each framebuffer value with a non-zero alpha value now codes a normal.

The next step is to find out which BRDF we need to evaluate in order to obtain the color for each pixel. In order to do this we first generate a texture with the resolution (res_x, res_y) in which the red and green channel of each pixel encode its position. Note that this texture has to be generated only once and can be reused for other views and light directions. Using hardware texture mapping with the above mentioned texture coordinates, the texture is replicated n times across the object. Now the red and green channel of each pixel in the framebuffer hold the correct indices into the 2D array of BRDFs for this specific fabric pattern.
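A sketch of the one-time index texture and the per-pixel lookup it enables; the normalization of the channels to [0, 1] and all names are our assumptions:

```python
def make_index_texture(res_x, res_y):
    """Texture whose red/green channels encode each texel's position
    in the BRDF array, normalized to [0, 1]. Generated once, then
    replicated n times over the surface by hardware texture repeat."""
    return [[(x / (res_x - 1), y / (res_y - 1), 0.0)
             for x in range(res_x)]
            for y in range(res_y)]

tex = make_index_texture(32, 32)
# After rasterization, a framebuffer pixel's red/green channels are
# decoded back into indices into the 32x32 BRDF array:
r, g, _ = tex[5][12]
ix, iy = round(r * 31), round(g * 31)
```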

Once we know which BRDF to evaluate, we map the light and viewing direction into the geometry's local coordinate system, using the normals obtained in Step 1 and a tangent constructed as described in Section 4. Note that two mappings need to take place: this one, which maps the world view and light to the cloth geometry's local coordinate system (yielding l and v), and another when evaluating the BRDF, which transforms these values to the pattern's local coordinate system (yielding l' and v').

The software evaluation of the BRDF model (see Section 3) returns three colors and an alpha value from the lookup table, which we then write to the framebuffer.

The presented rendering technique utilizes hardware as far as possible. However, the BRDF model is still evaluated in software, although mapping this onto hardware should be feasible with the next generation of graphics cards.

6.1 Mip-Mapping

As described in Section 5.1, we can generate several mip-map levels of BRDFs. We will now explain how to enhance the above algorithm to correctly use different mip-map levels, thereby exploiting OpenGL mip-mapping.


First we modify Step 2 and now generate one texture per mip-map level. Each texture's resolution corresponds to the BRDF's spatial resolution at this level. As before, the red and green channel code the pixel's location in the texture. Additionally, we now use each pixel's blue channel to code the mip-map level of the corresponding texture. For example, if we have 5 levels, all pixels' blue values are 0 in texture 0, 0.25 in texture 1, 0.5 in texture 2, and so on.

If we set up OpenGL mip-mapping with these textures specified for the correct levels, the blue channel of each pixel will tell us which texture to use, while the red and green channel still code the indices into the array of BRDFs at this level.

Blending between two mip-map levels is also possible. As we do not want to blend the texture coordinates in the red and green channels, however, we need two passes to do so. The first pass is the same as before. However, in the second pass we set up the mip-map technique to linearly interpolate between two levels. We avoid overwriting the values in the red and green channels by using a color mask. Now the value b of the blue channel codes between which levels to blend (in the above example between levels ⌊b / 0.25⌋ and ⌈b / 0.25⌉) and also tells us the blending factor (here (b − ⌊b / 0.25⌋ · 0.25) / 0.25).
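Decoding the blue channel can be sketched as follows, assuming a spacing of 0.25 between the levels' blue values (our reading of the garbled constants) and hypothetical names:

```python
import math

LEVEL_STEP = 0.25   # assumed blue-channel spacing between mip levels

def blend_levels(blue):
    """Decode the blue channel written in the second render pass:
    the two mip levels to blend and the blending factor between them."""
    lo = math.floor(blue / LEVEL_STEP)
    hi = math.ceil(blue / LEVEL_STEP)
    factor = (blue - lo * LEVEL_STEP) / LEVEL_STEP
    return lo, hi, factor

lo, hi, f = blend_levels(0.35)   # between levels 1 and 2, 40% toward level 2
```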

7 Results and Applications

We implemented our algorithms on a PC with an AMD Athlon 1 GHz processor and a GeForce2 GTS graphics card. To generate the images in this paper we applied the acquired fabric patterns to cloth models we modeled with the 3D Studio Max plug-ins Garment Maker and Stitch. Our geometric models for the knit or weave patterns consist of 1300–23000 vertices and 2400–31000 triangles. The computation times of the acquisition process depend on the number of triangles, as well as the sampling density for the viewing and light directions, but generally vary from 15 minutes to about 45 minutes. We typically used 32 × 32 or 64 × 64 viewing and light directions, uniformly distributed over the hemisphere, generating up to 4096 radiance samples per sampling point on the lowest level. We found a spatial resolution of 32 × 32 samples to be sufficient for our detail geometry, which results in 6 mip-map levels and 1365 BRDF entries. The parameter fitting of a BRDF array of this size takes about 2.5 hours. In our implementation each BRDF in the array (including all the mip-map levels) has the same number of lobes. We found that generally one or two lobes are sufficient to yield visually pleasing results. The threshold mentioned in Section 5 was set to 0.1, and we noted that convergence was usually achieved after 2 iterations. Once all parameters have been fit, we need only 4 MB to store the complete data structure for one type of fabric, including all mip-map levels and the lookup tables with 64 entries per point.

The rendering times, e.g. for Figure 1, are about 1 frame per second at our test image resolution. The bulk of this time is spent on reading back the framebuffer contents in order to evaluate the BRDF for every pixel. We therefore expect that with the advent of more flexible hardware, which will allow us to implement the rendering part of this paper without such a software component, the proposed method will become feasible for interactive applications.

The dress in Figure 4(a) displays a fabric pattern computed with our method. In Figure 4(b) we compare the results of a mip-mapped BRDF to a single-level one. As expected, the mip-mapping nicely gets rid of the severe aliasing clearly visible in the non-mip-mapped left half of the table. Figure 5 illustrates how even complex BRDFs with color shifts can be captured using our model. Figure 1 and Figure 6 show different fabric patterns displayed on the same cloth geometry.


8 Conclusions

In this paper we have presented a memory-efficient representation for modeling and rendering fabrics that is based on replicating individual weaving or knitting patterns. We have demonstrated how our representation can be generated by fitting samples from a global illumination simulation to it. In a similar fashion it should be possible to acquire a fitted representation from measured image data. Our model is capable of capturing color variations due to self-shadowing and self-occlusion, as well as transparency. In addition, it naturally lends itself to mip-mapping, thereby solving the filtering problem.

Furthermore, we presented an efficient rendering algorithm which can be used to apply our model to any geometry, achieving near-interactive frame rates with a combination of hardware and software rendering. With the increasing flexibility of upcoming generations of graphics boards, we expect to be able to implement the rendering algorithm completely in hardware soon. This would make the approach suitable for fully interactive and even real-time applications.

References

1. B. Becker and N. Max. Smooth Transitions between Bump Rendering Algorithms. In SIGGRAPH '93 Proceedings, pages 183–190, August 1993.
2. K. Dana, B. van Ginneken, S. Nayar, and J. Koenderink. Reflectance and Texture of Real World Surfaces. ACM Transactions on Graphics, 18(1):1–34, January 1999.
3. K. Daubert, W. Heidrich, J. Kautz, J.-M. Dischler, and H.-P. Seidel. Efficient Light Transport Using Precomputed Visibility. Technical Report MPI-I-2001-4-003, Max-Planck-Institut für Informatik, 2001.
4. P. Debevec, T. Hawkins, C. Tchou, H.-P. Duiker, W. Sarokin, and M. Sagar. Acquiring the Reflectance Field of a Human Face. In SIGGRAPH 2000 Proceedings, pages 145–156, July 2000.
5. J.-M. Dischler. Efficiently Rendering Macro Geometric Surface Structures with Bi-Directional Texture Functions. In Proc. of Eurographics Workshop on Rendering, pages 169–180, June 1998.
6. S. Gortler, R. Grzeszczuk, R. Szeliski, and M. Cohen. The Lumigraph. In SIGGRAPH '96 Proceedings, pages 43–54, August 1996.
7. E. Gröller, R. Rau, and W. Straßer. Modeling Textiles as Three Dimensional Textures. In Proc. of Eurographics Workshop on Rendering, pages 205–214, June 1996.
8. W. Heidrich, K. Daubert, J. Kautz, and H.-P. Seidel. Illuminating Micro Geometry Based on Precomputed Visibility. In SIGGRAPH '00 Proceedings, pages 455–464, July 2000.
9. W. Heidrich and H.-P. Seidel. Realistic, Hardware-accelerated Shading and Lighting. In SIGGRAPH '99 Proceedings, August 1999.
10. J. Kautz and M. McCool. Interactive Rendering with Arbitrary BRDFs using Separable Approximations. In Proc. of Eurographics Workshop on Rendering, pages 247–260, June 1999.
11. J. Kautz and H.-P. Seidel. Towards Interactive Bump Mapping with Anisotropic Shift-Variant BRDFs. In 2000 Eurographics/SIGGRAPH Workshop on Graphics Hardware, pages 51–58, August 2000.
12. E. Lafortune, S. Foo, K. Torrance, and D. Greenberg. Non-Linear Approximation of Reflectance Functions. In SIGGRAPH '97 Proceedings, pages 117–126, August 1997.
13. M. Levoy and P. Hanrahan. Light Field Rendering. In SIGGRAPH '96 Proceedings, pages 31–42, August 1996.
14. F. Neyret. Modeling, Animating, and Rendering Complex Scenes Using Volumetric Textures. IEEE Transactions on Visualization and Computer Graphics, 4(1), January–March 1998.
15. W. Press, S. Teukolsky, W. Vetterling, and B. Flannery. Numerical Recipes in C: The Art of Scientific Computing (2nd ed.). Cambridge University Press, 1992. ISBN 0-521-43108-5.
16. P. Sloan and M. Cohen. Hardware Accelerated Horizon Mapping. In Proc. of Eurographics Workshop on Rendering, pages 281–286, June 2000.
17. T.-T. Wong, P.-A. Heng, S.-H. Or, and W.-Y. Ng. Image-based Rendering with Controllable Illumination. In Proc. of Eurographics Workshop on Rendering, pages 13–22, 1997.
18. Y.-Q. Xu, Y. Chen, S. Lin, H. Zhong, E. Wu, B. Guo, and H.-Y. Shum. Photo-realistic Rendering of Knitwear Using the Lumislice. In Computer Graphics (SIGGRAPH '01 Proceedings), 2001. To be published.



Fig. 4. (a) A dress rendered with BRDFs consisting of only one lobe. (b) Left: aliasing artifacts are clearly visible if no mip-mapping is used. Right: using several mip-mapping layers.

Fig. 5. The fabric patterns displayed on the models (left and right) were both computed from the micro-geometry in the middle. In contrast to the right BRDF model, the left one does not include a lookup table. Clearly this BRDF is not able to capture the color shift to red for grazing angles, nicely displayed on the right.

Fig. 6. Different fabric patterns on the same model. Left: plain knit; middle: loops with different colors; right: perl loops.