

2008 Fulldome Summit, Chicago, Illinois, July 3rd, 2008

Cosmix: a community-building project

Claudia D. Cumbie-Jones MFA (a), Lance Ford Jones MFA (b), David Steiling PhD (c)
Ringling College of Art and Design, 2700 N. Tamiami Trail, Sarasota, FL, USA

Jeff Rodgers BS (d)
Bishop Planetarium, 201 10th Street West, Bradenton, FL, USA

Abstract. In April of 2008, Ringling College of Art and Design partnered with the South Florida Museum’s Bishop Planetarium to create Cosmix, an installation/performance event. This was conceived as a community-building project. The title, Cosmix, suggested works that might be cosmic in their aspirations and mixed in their media. In addition, the event was oriented around the theme of the cardinal directions, in this case seven directions—East, South, West, North, Up, Down, and Inward. Several classes were involved in creating installations within the space of the museum and in crafting a final evening event within the planetarium’s fulldome.

Our contribution to the Cosmix final show was called Dreamcatcher. Through some innovative programming on the part of the planetarium director, rather than rendering it as a conventional fulldome piece, we were able to feed a separate MPEG-2 file to each of the seven projectors of the SkyScan system. We created a previsualization tool using Max/MSP/Jitter that allowed us to synchronize our seven video and audio tracks before seeing them on the dome. We believe this is a low-cost model for working with educational institutions and independent new media artists that may interest other planetariums.

On April 5, 2008, Ringling College of Art and Design and the South Florida Museum presented an evening of installation, media, and performance art entitled Cosmix: A Multidirectional Synthesis of Art and Technology. As part of that evening, several works for fulldome projection were shown in the Bishop Planetarium, a recently rebuilt 125-seat SkyScan-equipped fulldome theater. These works deployed several strategies developed by Ringling College faculty and students to avoid the lengthy development times and demanding technical resources that inhibit spontaneous content development and limit the diversity of offerings for the current generation of high-definition dome theaters. This paper reports on some of the strategies and methods we used to create several programs of media art for fulldome projection that bypassed or reduced the inhibiting factors of machine-choking image resolutions, daunting software learning curves, and mind-numbing render times associated with traditional production techniques.

(a) [email protected]  (b) [email protected]  (c) [email protected]  (d) [email protected]


Figure 1. Bishop Planetarium (courtesy Camille Pyatte)

Conception

The Cosmix project was conceived as a community-building effort on the part of the South Florida Museum/Bishop Planetarium and Ringling College of Art and Design to lay groundwork for a number of potential collaborations that would extend the reach of both institutions into relationships with new audiences. One aim of our initial effort was to prototype ways in which artists across the community, including the artists-in-training at the college, local schoolchildren, independent media artists, and local amateurs, might participate in making works for dome projection. The larger scope of the project involved enlisting students and faculty from Ringling College to create site-specific installation works that would recontextualize and refresh the experience of the South Florida Museum for its many repeat visitors. The South Florida Museum/Bishop Planetarium is an institution with a very eclectic collection and a mission rooted in its role as a community cultural center. Ringling artists created installations for all areas of the museum. Some installations appeared in the interstices between exhibits, such as stairwells, lobbies, and service areas. Others interpenetrated the exhibits themselves: next to the mastodon, for example, inside the 19th-century pharmacy exhibit, in the recreation of the Hernando De Soto birthplace, or in the corridors leading to the tank of Snooty the Manatee, who has lived at the museum for 50 years.


Figure 2. Subaqua installation by Dee Hood/Sheryl Haler (courtesy Camille Pyatte)

In creating original works for the planetarium theater, the aim was to restore some of the spontaneity and live interaction that, over the years, had helped build community support for the facility. In August 2001, the interior of the Bishop Planetarium and some of the surrounding facilities had been destroyed in an electrical fire. This event sparked a community debate over whether to replace the facility and what role the planetarium played within the local cultural landscape. The debate clarified support for the institution and culminated in the construction of a state-of-the-science digital projection theater, thanks, in part, to the community support generated by a diverse audience of science buffs, educators, and fans of the entertainment-oriented laser light shows that had been a popular feature of the old planetarium. Part of the impulse behind our experiments was finding ways to provide new and much more complex cultural programming for the audience segment that had sought out the planetarium as a venue for unique multimedia experiences.

Production

The entire project, from first contact between the Museum and the College through the night of the final performance, took three months. This timeline virtually eliminated any possibility of creating fulldome projects in the usual fashion, using high-end software like Maya to render dome masters of very high-resolution 3D and 2D imagery. Ringling College has a strong program in Maya training and its own substantial render farm, but these could not be made available for this project. Although the College has the premier undergraduate computer animation program in the country, only a couple of participating students had links to that program. Most of our participating students and faculty came from the school's programs in Fine Arts and Photography, with a smattering of other majors.


Only a few of the participating students had any familiarity with Maya or similar programs, so we were interested in developing other ways to make use of the facilities of the new planetarium theater.

Experimentation. All but one of the participating artists entered the project in complete ignorance of the technical aspects of fulldome projection. Wendy Wischer, our Visiting Professor of Installation and Media Art, had a previous work commissioned by the Miami Planetarium, which we hoped to reengineer and adapt to the Bishop Planetarium. In the end, given the time and technology restrictions, we were unable to accomplish that goal, although Wendy hopes to adapt it for a proposed repeat of the Cosmix project tentatively scheduled for next spring. But Wendy's experiments and trials working through the technology with Jeff Rodgers, Director of the Bishop Planetarium, provided the technical insights and programming necessary to open the theater to more improvisational use.

From the beginning, we felt intuitively that the way to rapidly develop creative content for the fulldome theater was to think of the potential imagery not as a single image sliced into the seven sections of our projector environment (six projectors around the perimeter of the 180˚ dome and one for the dome cap) but as a synchronous projection system capable of projecting seven synchronized simultaneous images. Through Wendy and Jeff's experiments, about six weeks into the project, nearly halfway through, we gained a reasonable grasp of the technical situation and had a means to develop pieces that would use the fulldome projection capacity of the venue.

The approach we chose was very much influenced by the aesthetic developed by early pioneers in dome projection like Stan VanDerBeek. Forged by the collaborative aesthetic of Black Mountain College, VanDerBeek, besides being one of the first artists to work with computer animation, explored a number of interesting approaches to multimedia projection, including his Movie-Drome, a multi-projector dome environment he created in his backyard using the top of a grain silo. VanDerBeek believed in the transformative and consciousness-raising power of immersive artistic spaces. His pieces were often designed to involve large communities of collaborators producing a range of imagery integrated with live performances in a collage-like assemblage of media hot and media cool. Similar approaches were used in experimental projection environments around the country, wherever there was interest and enough available 16 mm and slide projectors to put on a show. Events of this type had the quality of uniqueness because no individual performance could be exactly replicated; in fact, the operative process of randomness creating its own context was often used in these events to ensure no two performances would ever be the same [1].

In the case of the Cosmix project, we retained several elements of the projection environments of the psychedelic era. Instead of slide projectors, we used portable LCD video projectors, the slide projector of the 21st century. The portability of these devices makes it easy to direct them by hand or move them around the performance space, highlighting or "imagizing" portions or features of live performances in real time. We also


used the video player mode of the Digital Sky software package to put moving images behind live performers, using one of the projectors at the front of the house to project onto the dome area behind the performance/lecture space of the theater.

Figure 3. Head box with projection from Cosmix performance. (courtesy David Steiling)

It was the digital player function of the Digital Sky package that we were able to trick into functioning as a synchronized seven-projector multi-image system playing MPEG-2 files. It turned out that the Digital Sky system was very particular about what files it would play; it would project only MPEG-2 files of a very strict type. The system is supposed to play .avi files, but we were unable to get it to do this in simultaneous mode. It also refused a variety of MPEG-2 variants, and we ended up using Adobe After Effects to process our MPEG-2 files into a variant Digital Sky could project. Much of the time it took to develop content for this multi-projector system was spent experimenting with file format variants until we settled on a process that produced files that would reliably play in the player. Because the Cosmix imagery was designed as either texture or as "pre-sliced" video files, we used Digital Sky's video player function to project onto the dome. Using MPEG-2 files, we simply renamed the files, placed them in the correct sequence on the hard drives, and wrote a simple line of code instructing the system to play all videos simultaneously.
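For readers reproducing this kind of workflow today, the conforming step can be scripted. The sketch below is not our method (we conformed files in After Effects); it is a minimal Python wrapper around ffmpeg, with hypothetical directory names, that batch-transcodes sources into a plain MPEG-2 program stream. Whatever tool is used, the exact variant a given player accepts still has to be found by trial, as described above.

    # Hedged sketch: batch-transcode arbitrary sources to plain MPEG-2 with
    # ffmpeg. Directory names are hypothetical; the authors used After Effects.
    import subprocess
    from pathlib import Path

    Path("conformed").mkdir(exist_ok=True)
    for src in Path("sources").glob("*.*"):
        out = Path("conformed") / (src.stem + ".mpg")
        subprocess.run([
            "ffmpeg", "-y", "-i", str(src),
            "-c:v", "mpeg2video",   # plain MPEG-2 video
            "-q:v", "2",            # high, near-constant quality
            "-c:a", "mp2",          # MPEG-1 Layer II audio
            str(out),
        ], check=True)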

Subaqua and Clouds. Besides using the digital projector function of Digital Sky to put moving imagery behind performers at one section of the dome, we developed some works that were essentially texture pieces: a single MPEG-2 file that could be mapped to all seven projectors. This was not a single image sliced, but the same image file projected simultaneously from all seven projectors, so that the effect was like a texture tiled in a mosaic seven times across the entire dome. How effective this looks is a matter of how well the image tiles. Using this tiling technique, it is relatively easy to produce original imagery that can be projected effectively in fulldome aspect in a matter of days, hours, or even minutes.
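As a concrete illustration of how little tooling the texture approach takes, the sketch below simply duplicates one file under seven per-projector names; the naming scheme shown is hypothetical, not Digital Sky's.

    # One texture file duplicated for the six ring projectors plus the cap (G);
    # the "projector_X.mpg" naming is illustrative only.
    import shutil

    for name in ["A", "B", "C", "D", "E", "F", "G"]:
        shutil.copy("texture.mpg", f"projector_{name}.mpg")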

Figure 4. Still from Subaqua (courtesy Dee Hood)

For these pieces we used MPEG-2 files created by our colleague and Cosmix participant Dee Hood. One movie was shot underwater; the second was of clouds, shot from an airliner seat. The underwater work, called Subaqua, was also featured in an installation in the museum in collaboration with Sheryl Haler and was accompanied by a soundtrack of whale and dolphin song. The soundtrack was played from one of the seven simultaneously playing MPEG-2 files. The second movie, Clouds, was accompanied by live original music by Rik Tweed.

Prism techniques. In our next level of experimentation with the synchronized projector system, one of the student participants, Kayla Carlson, shot video footage using an improvised prismatic lens. The footage was then brought into After Effects, where it was edited and colorized. Six versions of the MPEG-2 file were produced, each tinted a different color. These six files were projected simultaneously from the six projectors around the perimeter of the dome; the dome cap projection space was left blank. The projection was accompanied by a performance from a live band. The piece was called "Refracted Perceptions of Nature through the Florida Spectrum."


Figure 5. Still from Refracted Perceptions of Nature through the Florida Spectrum (courtesy Kayla Carlson)

Kayla describes the technical aspects of creating the work and its projected effect in her documentation statement: "Footage for 'Perceptions of Florida Light & Color' was recorded at a remote farm near Myakka State Park in Sarasota, Florida. A crystal glass prism was placed over the video camera's lens to create refracted and segmented shots. Rainbow spectrums and lens flares were produced by the glass prism, creating a dream-like effect later magnified by additional, computer-generated color alterations…using After Effects. The individual video shots were compiled in a montage format and layered together with the opacity feature. Slow dissolves were used as transitions. The hue/saturation and brightness/contrast tools were used to alter the colors on the six videos to create a 'Roy G. Biv' light spectrum when projected on the dome of the planetarium. The videos were exported as mpeg-2's to comply with the planetarium's computer projection format."

Dreamcatcher

Our most complex experiment with the possibilities of the synchronized video playback system was developed by Lance Ford Jones and Claudia Cumbie-Jones and involved taking their own archival footage and structuring it across all seven projectors to make a truly time-based media work of eight minutes' duration. The work included an original soundtrack. This work, entitled Dreamcatcher (Asabikeshiinh), demonstrates the high level of structural complexity possible within the relatively modest technical requirements of the synchronized projector system of the Digital Sky video player process. By careful and exacting preplanning of the image selections, images can be made to "track" around the dome, forming thematic chains of associated images and subsequent variations on those


themes. The creators of the work describe their technical process in detail below, but we collectively feel that we have just scratched the surface of what is possible in seven-projector synchronized assemblages. The discussion that follows lays out working techniques that can liberate some of the complex possibilities of this simple fulldome projection technology.

Concept. Dreamcatcher was conceived late in the Cosmix process. We were asked to create a five-minute "image mix," and we decided to attempt a coordinated seven-projector piece. One suggestion for this approach was to vignette images in a black surround to avoid seams. This, combined with the round venue of the planetarium dome, reminded both of us of some of our early experiments with analog video feedback. The second idea for content came from the arrangement of the seven projectors in Bishop Planetarium's SkyScan system, with six of the projectors forming a ring and the seventh filling in the cap of the dome. The ring of six projectors encouraged us to explore imagery that was continuous and circular, which led to an area of interest of ours for the past several years: panoramic photography, specifically 360˚ panoramic landscapes (see Figure 6). The panoramic landscape seemed to work well with the cardinal directions in the theme of Cosmix. Many of the locations were photographed while traveling in the American Southwest; they were also sacred sites to Native Americans, and this seemed to reemphasize the importance of the spiritual nature of place and direction.

Figure 6. Fourth World (detail), 2004, Lance Ford Jones and Claudia Cumbie-Jones. Stitched 360˚ photographic panorama of Magdalena, New Mexico, wrapped around a glass of ice water. (courtesy Cumbie-Jones/Jones)

Dreamcatcher, or Asabikeshiinh, the original Ojibwe word meaning spider, refers to the familiar Native American object that hangs above many beds (and from the occasional rear-view mirror), serving to catch and hold bad dreams in its web while allowing the


good to pass through the hole in the center. The work comprises three basic elements. The music is a piece called Mock Coda, from a CD completed in 2005. The 360˚ panoramic landscapes are from original digital photographs shot at several sacred locations in the American Southwest in 2004: Shiprock, or Tsé Bit'a'í as the Diné call it, in New Mexico; Canyon de Chelly, or Tséyi, in Arizona; and the Great Sand Dunes, near Alamosa, Colorado. The geometric elements were created through a process known as video feedback and were generated in 1982 as part of a collective experiment called the Vivid Index (see note e). The spiraling patterns and spontaneous generation of symbols and mandalas in the feedback were the inspiration for the title [3]. The locations were chosen for their association with important figures like Coyote, the trickster, and Spider Grandmother, significant teachers of lessons and imparters of wisdom to Native Americans.

Figure 7. Still of video feedback from Dreamcatcher (courtesy Cumbie-Jones/Jones)

e. For anyone who has accumulated thirty years of tape library in a variety of formats (Betamax, VHS, ¾" U-Matic, Betacam, 8mm, Hi8, Digital8, and Fisher Price): many tapes from the late 1980s suffer from what is called "sticky shed syndrome." Due to changes in the chemistry of the binders used to hold the oxide particles to the Mylar backing, certain tape tends to stick together, and when it comes apart in playback, it tends to lose the oxide and the coherent video signal. If it doesn't come apart, it can easily burn up the motors in most consumer tape decks. Our solution was to build a "desiccator" to dry out the tapes. We took the low-energy-consumption route and placed a canister filled with calcium chloride (purchased from the Drierite Corporation for about $20) in an airtight plastic bin with a small computer fan to circulate the air within the bin. You can find a lot of information online about this, from people baking tapes in the oven to placing them in modified food dehydrators [2].


Video feedback. If you're not familiar with video feedback, the simplest way to produce it is to point a video camera at a video monitor that is showing the camera's output. In its simplest form, you see a monitor within a monitor within a monitor, ad infinitum. However, with a few adjustments to the monitor and the camera, you can generate much more complex patterns and imagery [4]. These fractal patterns are similar to those produced by cellular automata software simulations of life. Depending on the off-axis rotation of the camera, you can generate an array of symbols like crosses, swastikas, pentagrams, hexagrams, and complex mandalas [5]. The video monitor and camera essentially combine to create an analog computer that calculates feedback. The output is recorded and/or digitized, taken into After Effects, and modified, enhanced, and edited. The analog world gives you a great deal of complexity in real time for a very small investment.

Panoramic sequences. The next problem was how to create a 360˚ continuous panoramic image for the landscape. Experimentation led to at least three possible techniques for creating the initial panning image/movie. The simplest is to place a video camera on a tripod and slowly pan a full 360˚. The second takes a series of sequential still images with incremental pans between shots, uses Canon's ImageStitch or another piece of commercially available software to create a seamless two-dimensional panoramic image, and then pans across that image in After Effects. The problem with both of these methods is their seamlessness: what would ordinarily be a positive was, here, a negative. The six projectors are normally fed MPEG-2 movie files that have been "pre-distorted" by displacement. The movie file is convolved by the projector and further distorted by being projected onto the curved surface of the dome. The projectors' fields also overlap with one another and are feathered at the edges, all adding up to a seamless and undistorted recreation of the original. Our source video, on the other hand, would be distorted by the projector and the dome and would overlap and feather, but had not been pre-distorted in the first place. If we used a seamless original, the result would appear seamless and consistent across the span of each projector, with distinct seams and doubling of images at each overlap. The third method is to take the sequence of images into After Effects and create a sequence of dissolves from the first to the second, the second to the third, and so on. This is the approach we took. There are 23 images in the sequence, so we simply set up one-second dissolves to create a 23-second sequence. The images, originally shot at 3072x2048, were scaled and cropped to fill a standard digital video frame, and the sequence was then time-stretched to thirty seconds in After Effects. With this third method, the source video was filled with ghostly afterimages that never completely settled. Not only did this seem appropriate to the somewhat mystical and psychedelic orientation of Cosmix, but it also significantly masked seams and discontinuities.
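For readers who want to play with the feedback idea without a camera and monitor, the following is a toy simulation of the camera/monitor loop described above, written in Python with numpy and scipy (our assumption; nothing like this was used in the actual production). Each iteration "re-photographs" the screen with a slight off-axis rotation, zoom, blur, and gain; the rotational offset is what produces the spiral and mandala-like patterns.

    import numpy as np
    from scipy import ndimage

    H = W = 240
    frame = np.zeros((H, W))
    frame[H//2 - 8:H//2 + 8, W//2 - 8:W//2 + 8] = 1.0  # seed image "on the monitor"

    def rephotograph(img, angle=7.0, scale=0.97, gain=1.15):
        """One camera-to-monitor round trip: off-axis rotation, zoom, blur, gain."""
        out = ndimage.rotate(img, angle, reshape=False, order=1)  # tilted camera
        c = np.array([(H - 1) / 2.0, (W - 1) / 2.0])
        mat = np.eye(2) / scale                                   # zoom about center
        out = ndimage.affine_transform(out, mat, offset=c - mat @ c, order=1)
        out = ndimage.gaussian_filter(out, 1.0)                   # optical blur
        return np.clip(gain * out, 0.0, 1.0)                      # monitor clipping

    for _ in range(300):  # let the pattern evolve
        frame = rephotograph(frame)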
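The dissolve sequence of the third method is also easy to script. Below is a minimal sketch under stated assumptions: the 23 stills live in hypothetically named files, and imageio (not our After Effects workflow) writes the result. The dissolves wrap from the last image back to the first so the pan closes the circle; the 23-second result would then be time-stretched to thirty seconds as described above.

    import numpy as np
    import imageio.v2 as imageio

    # hypothetical filenames for the 23 stills of the 360-degree pan
    stills = [imageio.imread(f"pan_{i:02d}.jpg").astype(float) for i in range(23)]

    FPS = 30
    writer = imageio.get_writer("panorama.mp4", fps=FPS)
    for a, b in zip(stills, stills[1:] + stills[:1]):  # wrap to close the circle
        for f in range(FPS):                           # a one-second dissolve
            t = f / FPS
            writer.append_data(np.uint8((1 - t) * a + t * b))
    writer.close()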


Testing. Solutions to timing issues were worked out with the aid of a piece of software we wrote using Cycling '74's Max/MSP/Jitter (MAX) environment. Figure 8 shows a screenshot of the main window, or patcher as it is called in MAX, and two subpatchers.

Figure 8. Screenshot of the patchers in MAX. (courtesy Cumbie-Jones/Jones)

The software allows the simultaneous, synchronized playback of seven QuickTime movies, corresponding to the six ring projectors A through F and the seventh cap projector G. We tended to think of the B projector as the main projector because it displayed directly in front of the planetarium audience, and we used it as the starting point for sequencing. Notice that it is labeled 1B. To its left is the A projector, labeled 2A, and continuing to the left, or counterclockwise around the dome, 3F, 4E, 5D, and 6C. The cap projector is labeled 7G. The letters relate to the planetarium's projector naming scheme and the numbers to our edit scheme. Notice also that the 1B movie is repeated to the left of the 6C movie so that both edges of the 1B movie can be seen. To test the timing and find the proper offset, we did the following (a scripted sketch of this setup appears after the list):

1. Using the “control” subpatcher, open the same movie on each of the six projectors (the 7G projector isn’t used for this part)

2. Open the “offset” subpatcher and select “get framecount” to show the number of frames in each movie (they should all be the same; 900 in this case)

3. Adjust the offset for the 2A projector until you find the best match between the right edge of 2A and the left edge of 1B.

4. Repeat step 3 for the 3F, 4E, 5D, and 6C projectors.
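Our tool was a MAX patcher; as a rough scripted analogue, the Python/OpenCV sketch below (an assumption on our part, with a hypothetical test-movie filename) lays out the same row of panes, with 1B repeated at the far left so both of its edges are visible, and applies per-pane frame offsets that can be edited by hand between runs until neighboring edges match.

    import cv2
    import numpy as np

    RING = ["1B", "2A", "3F", "4E", "5D", "6C"]
    ROW = ["1B", "6C", "5D", "4E", "3F", "2A", "1B"]  # on-screen, left to right
    offsets = {p: 0 for p in RING}                    # adjust by hand and re-run

    cap = cv2.VideoCapture("test_movie.mov")          # hypothetical test file
    total = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))    # 900 in the example above

    def frame_at(k):
        """Fetch frame k (looping), scaled down to a small preview pane."""
        cap.set(cv2.CAP_PROP_POS_FRAMES, k % total)
        ok, img = cap.read()
        return cv2.resize(img, (160, 120)) if ok else np.zeros((120, 160, 3), np.uint8)

    t = 0
    while True:
        row = np.hstack([frame_at(t - offsets[p]) for p in ROW])
        cv2.imshow("previz", row)
        if cv2.waitKey(33) == 27:                     # Esc quits
            break
        t += 1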


Figure 9. Screenshot of the patcher in MAX demonstrating offset. (courtesy Cumbie-Jones/Jones)

Eyeballing it this way, we found that the offset jumps in increments of 150 frames in this case. This comes down to the total number of frames (900) divided by the number of projectors (six), or 150 frames. If we set the 1B projector to 0 (the beginning of its movie), the 2A projector is at -150 (five seconds prior to start), and so on, until the 6C projector is set to -750 frames (twenty-five seconds prior to start). Note that five seconds after the last projector (6C) starts, the first projector (1B) reaches the end of the thirty-second movie and loops back to the beginning, keeping the offsets consistent. Although the timing issues were not as critical with the feedback sequences, the software allowed us to pre-visualize the sequencing and synchronization before final editing.
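The same arithmetic, restated as a small Python sketch (the names are ours, not from the MAX patcher); the modulo in frame_on is what keeps the offsets consistent when a movie wraps around.

    FRAMES = 900                                 # the thirty-second test movie
    RING = ["1B", "2A", "3F", "4E", "5D", "6C"]  # counterclockwise around the dome

    step = FRAMES // len(RING)                   # 900 / 6 = 150 frames = 5 s at 30 fps
    offsets = {p: -i * step for i, p in enumerate(RING)}
    # {'1B': 0, '2A': -150, '3F': -300, '4E': -450, '5D': -600, '6C': -750}

    def frame_on(projector, t):
        """Frame showing on a looping projector at global frame t."""
        return (t - offsets[projector]) % FRAMES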

Figure 10. Still from final confidence test of Dreamcatcher (courtesy Cumbie-Jones/Jones)

In addition to using the MAX software to pre-visualize and work out timing issues in pre-production, the software allowed us to see our seven final movies played back in sync in post-editing. We rendered the seven movies at low resolution (160x120), loaded each onto its corresponding projector, and set the offsets to zero. This gave us a good sense of the overall flow of the piece. Since the music for


Dreamcatcher is embedded in the QuickTime files, we were able to hear the sound synchronized to the playback as well. As a final confidence test, we created a similar setup in After Effects and rendered a final composite of all seven movies. It is important to remember that while MAX is playing back seven movies simultaneously, any delay or slack in the timing of the playback will be made up by dropping frames, so it is not possible to watch the playback in real time and be guaranteed that you are seeing every frame. By rendering the composite in After Effects, you ensure that every single frame from every movie is present in the composite render. Since you are then playing back only one (albeit slightly larger than normal) movie, the likelihood of skipped frames in playback is diminished.

Seven-projector edit structure. The overall structure of Dreamcatcher can be seen in Figure 11. A quick narrative of the piece follows. It begins in black on all projectors for the first fifteen seconds while the music fades in. At fifteen seconds, the 1B projector fades up from black to a segment of video feedback (F). Five seconds later the same footage starts on 2A, five seconds later on 3F, and so on around the dome until, at forty seconds, the 6C projector completes the circle. At this point the 7G projector fades up to a continuous and constantly varying edit of video feedback; this feedback sequence continues to the end of the piece. The feedback segment playing on the six ring projectors is one minute in length and fades out sequentially in the same order it faded in.

Figure 11. Whiteboard edit scheme for all seven projectors. (courtesy Cumbie-Jones/Jones)

The first panorama (P) fades up sequentially, 1B, 2A, 3F, 4E, 5D, until it completes the circle with 6C. The thirty-second landscape repeats twice on each projector and then fades out sequentially, first on 1B at 2:15:00. The second feedback segment (synchronized so that events happen simultaneously on all six projectors) fades in, projector by projector, again taking thirty seconds to complete the circle, and runs until all six projectors simultaneously fade out at 3:22:15. The second landscape fades up simultaneously on all six projectors. This panorama lasts one minute and repeats twice. The offset of ten seconds per projector is factored into the starting point of each movie, so the landscape comes up correctly timed and rotates around the dome twice until all projectors fade out at 5:22:15. The third feedback sequence begins with all


projectors fading up simultaneously, synchronized, and fading out sequentially between 6:05:00 and 6:30:00. The third and final panorama staggers in on the six projectors at five-second intervals, repeating twice for a total of one minute. As it fades out on each projector, the final feedback segment fades up, staggered, and finally fades out projector by projector until the 6C projector fades out at 8:30:00. The feedback that faded in at forty-five seconds on the 7G projector, and has run continuously, also fades out at 8:30:00, as does the music.
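Much of this narrative reduces to one pattern: a segment enters projector by projector at a fixed stagger and leaves in the same order. A toy helper (ours, not part of the production tooling) makes the bookkeeping explicit, using the opening feedback segment's times from the narrative above.

    RING = ["1B", "2A", "3F", "4E", "5D", "6C"]  # fade order around the dome

    def staggered(start, length, stagger=5.0, ring=RING):
        """(fade-in, fade-out) seconds per projector for a 'chased' segment."""
        return {p: (start + i * stagger, start + i * stagger + length)
                for i, p in enumerate(ring)}

    # opening feedback: 1B up at 0:15, one minute long, five-second stagger
    print(staggered(start=15.0, length=60.0))
    # {'1B': (15.0, 75.0), '2A': (20.0, 80.0), ..., '6C': (40.0, 100.0)}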

Figure 12. Whiteboard edit scheme for the 1B projector. Circled numbers along the bottom indicate steps required to turn this into the 2A projector. (courtesy Cumbie-Jones/Jones)

Timing effects. Using the testing software, we achieved three distinct types of timing effects in the final piece. The first uses feedback and brings up each segment with a time offset, so that an event occurring on one projector "chases" from one projector to the next.

Figure 13. The “chase” effect. (courtesy Cumbie-Jones/Jones)

The second effect uses the landscape pictures and brings up each segment with an offset to create a continuous panorama.


Figure 14. The continuous panorama effect. (courtesy Cumbie-Jones/Jones)

The third effect uses feedback and brings up the projections in synchronization, so that each event occurs on all projectors simultaneously.

Figure 15. The synchronized feedback effect. (courtesy Cumbie-Jones/Jones)

For Dreamcatcher, we used the MAX programming environment to create a tool that served both in pre-production, for previsualization and working out timing issues, and in post-production, for viewing the seven renders synchronously as a confidence test. Our involvement in Cosmix spanned a total of three weeks, from our first thoughts about the venue to the night of the final show; we doubt we could have solved the various problems and produced the work on so short a timeline without the aid of these tools. In our current work we are using another set of software tools we have developed in MAX, again working within a very tight time frame, to generate 3600x3600-resolution domemasters directly from MAX (see Figure 16).


Figure 16. Screenshot of custom domemaster software in MAX. (courtesy Cumbie-Jones/Jones)
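Our domemaster generator is built in MAX; purely as an illustration of what a domemaster is, the numpy sketch below (hypothetical filenames, not our code) resamples an equirectangular panorama into the square fisheye layout that fulldome systems expect: radius from the image center maps to zenith angle, and angle around the center maps to azimuth.

    import numpy as np
    import imageio.v2 as imageio

    pano = imageio.imread("equirect_pano.jpg")   # hypothetical source panorama
    Hs, Ws = pano.shape[:2]
    N = 1024                                     # domemaster edge (3600 in the text)

    y, x = np.mgrid[0:N, 0:N]
    cx = (N - 1) / 2.0
    dx, dy = (x - cx) / cx, (y - cx) / cx
    r = np.sqrt(dx**2 + dy**2)                   # 0 at zenith, 1 at the springline
    theta = r * (np.pi / 2)                      # zenith angle
    phi = np.arctan2(dy, dx)                     # azimuth

    u = ((phi + np.pi) / (2 * np.pi) * (Ws - 1)).astype(int)
    v = (theta / np.pi * (Hs - 1)).astype(int)
    dome = pano[v.clip(0, Hs - 1), u.clip(0, Ws - 1)]
    dome[r > 1] = 0                              # black outside the fisheye circle
    imageio.imwrite("domemaster.png", dome)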

Future Work

We feel we now have a reasonable grasp of the means by which a range of would-be dome artists can access some of the powerful image effects of fulldome theater without having to master high-end software or expensive digital equipment. The methods we outline in this paper are capable of producing extremely complex and aesthetically satisfying works using software like Adobe After Effects and simple MPEG-2 files, and they are within the technical ability of even beginners at digital media art. These techniques effectively bridge consumer photographic technology, at relatively low resolutions, to high-definition fulldome projection. They make it possible for planetaria to integrate locally produced, community-generated content into the programming of their fulldome facilities, and they promise to recreate some of the spirit of spontaneity and ad hoc collaboration that were salient features of many early dome projection experiments. In the next generation of our own experiments we hope to pursue the possibilities of using computer visualization in combination with dome projectors to bring real-time music/image improvisation back to the new generation of planetarium theaters. We also hope to experiment with high-definition still imagery from large-format digital cameras, using


stop motion to simply and quickly create imagery that could be sliced for fulldome presentation. In addition, we are experimenting with ways to shorten production time on the creation of domemasters for more traditional means of planetarium programming. We are in the process of organizing a Cosmix 2 project and symposium, to unfold in Florida in the spring of 2009, during which we hope other working groups from around the country might come to share their work and approaches to lowering the technical barriers to fulldome media projects.

References

[1] Youngblood, G., Expanded Cinema, pp. 246-248, 349-350, 1970.

[2] Website: http://newsgroups.derkeiler.com/Archive/Uk/uk.rec.audio/2006-05/msg00069.html; accessed 23 June 2008. Website: http://www.wendycarlos.com/bake%20a%20tape/baketape.html; accessed 23 June 2008.

[3] Talman, L., "Dream-Catcher Mandalas: Mathematical Art." Website: http://clem.mscd.edu/~talmanl/Mandalas.html; accessed 23 June 2008.

[4] Hofstadter, D., Gödel, Escher, Bach: An Eternal Golden Braid, pp. 488-493, 1979.

[5] Courtial, J., and Padgett, M., "Fractals in pixellated video feedback," Contemp. Phys. 44, 137-143, 2003. Website: http://www.physics.gla.ac.uk/Optics/play/fractalVideoFeedback/research.html; accessed 22 June 2008.