CS184 Assignment 9 -- 2004

Texturing, Lighting, and Rendering


Introduction

The main objectives of this assignment are to take a procedurally generated SLIDE model from the folder http://www.cs.berkeley.edu/~sequin/CS184/AS9_SCU/, give it a suitable surface definition or texture, illuminate it properly with two or more light sources, choose a viewing angle that brings out the model's best features, and then use a more sophisticated rendering system than SLIDE can offer to produce a high-quality rendering of the object. To use a more powerful rendering system, you must generate a RIB output file, which you can then render with BMRT. The procedural models used in this assignment will need to include a new surface property called texture. You will use the rendering system to produce a series of images highlighting various visual effects, including shadows, radiosity (color bleeding), antialiasing, and area lights. These images, along with a brief write-up, will be your submission. See the references section for links to other pages and to informative sections of your textbook.

A bit of caution: Do not waste time rendering high-resolution images with radiosity until you are almost done! Instead, concentrate on getting your geometry, surface properties, light positions, and viewing parameters optimized. This will make your work much more efficient. Do not forget to read the "summary of how to start" section at the end of the document.

This is your chance to show that this semester's 184 class can produce great displays. We will post a few of the most interesting images when the assignment is complete!




Procedural Modeling

Procedural modeling is an algorithmic way of producing geometric data. For this assignment we will give you a few SLIDE files with geometrical shapes that might make impressive constructivist sculptures if they are built at the right scale, from the right materials, placed on a proper pedestal, and properly illuminated. You will find these sculpture models in http://www.cs.berkeley.edu/~sequin/CS184/AS9_SCU/ . Finding the best possible presentation for one of these models is your assignment! Thus this assignment is primarily about textures, lighting, and rendering; the generation of geometry is secondary for now -- you can let your creativity roam fully in that respect during the final project. A first key decision is to choose a suitable texture. Look at the geometry of the piece that you chose, and ask yourself whether it would best be realized in stone, or in metal, or in glass... Then you may have to add texture coordinates to your vertices, and textures to your surfaces, to make your object more interesting. Texturing is discussed in more detail in the section on texture mapping. If you want to refine or enhance the basic geometry given, you may want to use a tclinit block to generate that more sophisticated geometry with the extra information for the renderer. The full version of SLIDE that you will use for this assignment knows how to process this tclinit block.


Texture Mapping

In addition to standard geometry, your procedural model will also include surface properties, such as color, diffuse and specular coefficients, and also texture information. The idea of texture mapping is to repeatedly paint the same small picture onto your geometry. The motivation is to efficiently model intricate patterns that would require a great deal of very small geometry information. In SLIDE each point can also specify a texture coordinate, and a surface contains the name for a texture file. For SLIDE your texture file must be a .gif file, and the height and width in pixels must each be a power of 2. On the PC, Microsoft Imaging editor or some other utility should let you convert images from other formats. On UNIX xv can do this for you. In SLIDE you will specify a texture file for each textured surface. If a surface is to be textured you should probably set its color to ( 1 1 1 ) because SLIDE multiplies the surface color by the texture map values. The next step is to specify texture coordinates for each vertex of the faces in your surface. When the texture map is applied, the renderer linearly interpolates the texture coordinates you supply to obtain a coordinate for a particular point on the surface. The texture mapping algorithm then samples the texture at the coordinates obtained and uses that value to color the pixel. The following figure gives some examples.
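The interpolation-then-sampling step described above can be sketched in a few lines. This is a minimal illustration, not SLIDE's actual renderer internals; the function names, the barycentric weights, and nearest-neighbor lookup are all assumptions chosen for clarity.

```python
# Sketch: how a renderer might interpolate per-vertex texture
# coordinates across a triangle and then sample the texture.
# All names are illustrative, not SLIDE's actual internals.

def interp_texcoord(bary, uv0, uv1, uv2):
    """Barycentric interpolation of (s, t) texture coordinates
    given the three vertices' coordinates and weights summing to 1."""
    b0, b1, b2 = bary
    s = b0 * uv0[0] + b1 * uv1[0] + b2 * uv2[0]
    t = b0 * uv0[1] + b1 * uv1[1] + b2 * uv2[1]
    return (s, t)

def sample_nearest(texture, s, t):
    """texture is a 2D list of texel values; s, t in [0, 1].
    Nearest-neighbor lookup: pick the texel under (s, t)."""
    h = len(texture)
    w = len(texture[0])
    x = min(int(s * w), w - 1)
    y = min(int(t * h), h - 1)
    return texture[y][x]
```

A real renderer would typically filter (e.g., bilinearly blend) several texels instead of taking the single nearest one, but the flow, interpolate coordinates then look up the texture, is the same.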


Texture coordinates are usually two values (s and t), but SLIDE requires three values and ignores the third. The geometric coordinates in the diagram are in 2D, but for SLIDE (as you know!) they are in 3D. Read the SLIDE language description and your textbook for more details. The tricky part about texture mapping an object consisting of multiple polygons is assigning texture coordinates so that the texture is smooth and continuous over the object. There are some tricks to this. One is to use a texture with a solid background color and some separated foreground elements (stars on black or random polka dots are examples); then it is easier to get away with approximating continuity. The torus.tcl file shows a good example of how to wrap a texture around an object smoothly. Note that the circuit board texture we use for the torus is tileable, which means that if you put a bunch of copies next to each other there are no discontinuities in the texture. This and the wood texture both came from this site. Note that while all of the textures on that site tile, many are not a power of 2 by a power of 2 in size, so you will have to resize them in an image-editing program.
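The torus wrap mentioned above can be sketched by mapping the two parametric angles of the torus directly to (s, t). This is an illustrative reconstruction in the spirit of torus.tcl, not its actual code; the function names and parameters (R, r, n, m) are assumptions.

```python
import math

# Sketch: wrapping a texture around a torus by using the two
# parametric angles as texture coordinates. Illustrative only.

def torus_vertex(R, r, u, v):
    """Point on a torus with major radius R and minor radius r;
    u, v in [0, 1) parameterize the two angles."""
    theta, phi = 2 * math.pi * u, 2 * math.pi * v
    x = (R + r * math.cos(phi)) * math.cos(theta)
    y = (R + r * math.cos(phi)) * math.sin(theta)
    z = r * math.sin(phi)
    return (x, y, z)

def torus_mesh(R, r, n, m):
    """Vertices paired with texture coordinates (s, t) = (u, v).
    Because s and t wrap from 1 back to 0 at the seams, the
    texture must be tileable for the wrap to look continuous."""
    verts = []
    for i in range(n):
        for j in range(m):
            u, v = i / n, j / m
            verts.append((torus_vertex(R, r, u, v), (u, v)))
    return verts
```

The seam where u wraps from just under 1 back to 0 is exactly where a non-tileable texture would show a visible discontinuity.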


Area Light

Area light sources are a good approximation of diffuse light sources in the real world. A diffuse light source emits light from each point on its surface in all directions out of the surface. In fact, a perfectly diffuse surface with a light shining on it acts as a diffuse light source. Calculating the illumination due to an area light source at a point can be very difficult. It requires computing the amount of light leaving each individual point on the light source in the direction of the illuminated point, and this direction changes over the area light source. Because this integral is complex, the result is usually approximated by representing the area light source as a number of small patches over which the direction from light source to point changes little, or by completely discretizing the light source into a collection of point light sources on the surface of the area light. The number of facets or point light sources used to simulate the area light affects the accuracy of the lighting. Area light sources provide softer light and soft shadows on surfaces. This illumination is much more realistic than that of the other light sources we have used. Look around you and try to find a really sharp shadow. It is difficult unless you have a bright light with a very small bulb/reflector approximating a point light source or spotlight (for instance a small halogen bulb). This is why area light sources can add realism to rendered images. In BMRT, area light sources are defined as geometry which emits light. For instance, a polygon would emit light in the direction of its normal. (Note: this may be different from the normals you specify for the vertices; it is the normal computed from the positions of all the vertices of the plane, and depends on the order in which you specify the vertices.)
While the SLIDE viewer (really OpenGL) is not capable of displaying the illumination effects of area light sources properly, it will approximate the effect with a spotlight with a very small angular fall-off, and by making the area light source glow in the image. In addition, SLIDE will let you specify the parameters for an area light source and produce the appropriate commands in the RIB file output for a scene, so that BMRT can properly model the light from the area light source. There are examples of area light sources in the SLIDE files for the examples discussed in the summary.
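The discretization described above, replacing an area light by a grid of point lights, can be sketched as follows. The function name, the parallelogram parameterization, and the simple inverse-square falloff are illustrative assumptions, not BMRT's actual sampling scheme (which also accounts for surface orientation).

```python
# Sketch: approximating a parallelogram area light by an n x n
# grid of point lights and summing their contributions at a point.
# Illustrative only; real renderers also weight by cosine terms.

def area_light_illumination(corner, edge_u, edge_v, n, point, total_intensity):
    """corner + edge_u + edge_v span the light; the total emitted
    intensity is split evenly over n*n point-light samples."""
    per_sample = total_intensity / (n * n)
    total = 0.0
    for i in range(n):
        for j in range(n):
            # Position of this sample on the light's surface.
            fu, fv = (i + 0.5) / n, (j + 0.5) / n
            sx = corner[0] + fu * edge_u[0] + fv * edge_v[0]
            sy = corner[1] + fu * edge_u[1] + fv * edge_v[1]
            sz = corner[2] + fu * edge_u[2] + fv * edge_v[2]
            d2 = (point[0]-sx)**2 + (point[1]-sy)**2 + (point[2]-sz)**2
            total += per_sample / d2   # inverse-square falloff
    return total
```

Soft shadows arise from exactly this summation: a point that can "see" only some of the samples receives a fraction of the light, instead of the all-or-nothing result a single point light gives.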

Radiosity

Radiosity methods allow the computation of illumination due to light that bounces off multiple diffuse objects along its path from the light source to the viewer. Since most objects are modeled as having both a diffuse coefficient and a specular coefficient, we usually see multiple "diffuse bounces" off of objects. Ray tracing by itself only models light that makes one diffuse bounce and then multiple specular bounces on its way from the light source to the viewer.

In order to compute the effect of diffuse light bouncing off of objects, we break the scene down into small facets on all the surfaces. We assume the facets are Lambertian surfaces and that they are small enough that light striking them from a nearby light source comes in at the same angle over the entire facet (note that this is not true for a large polygon!). Then we compute how much light reaches the diffuse surfaces directly from the light sources. As discussed earlier, a diffuse surface patch under illumination looks like an area light source, so in the next step we treat all the illuminated diffuse patches as light sources, and once again calculate how much light falls on each of the diffuse patches. Ideally we would repeat this calculation a very large number of times, summing the light directly from the light source, from one bounce, from two bounces, etc. To do this would require computing the powers of a large matrix describing how much light travels from each facet to each other facet. If our scene had, say, one million facets, this becomes a million-by-million matrix, which is difficult to raise to a power! In practice, people estimate the effects which will appear brightest in the image. For your .rib files you can help the radiosity renderer by telling it how big to make the facets for your scene. If it breaks your scene up into too many small facets, the rendering time will be very long! On the other hand, if it breaks your scene up into too few facets, the illumination will be clearly wrong.
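The bounce-by-bounce computation above can be sketched as a simple iteration on a toy scene. This is a minimal illustration of the idea, not a production radiosity solver; `F[i][j]` plays the role of the form factor from facet j to facet i, and `rho` is each facet's diffuse reflectance.

```python
# Sketch: iterative radiosity gathering, B_new = E + rho * (F @ B).
# Each iteration adds one more level of diffuse bounce.

def radiosity(emission, rho, F, bounces):
    """emission[i]: light emitted directly by facet i.
    rho[i]: diffuse reflectance of facet i.
    F[i][j]: fraction of facet j's light reaching facet i.
    Returns the radiosity of each facet after `bounces` gathers."""
    n = len(emission)
    B = list(emission)
    for _ in range(bounces):
        B = [emission[i] + rho[i] * sum(F[i][j] * B[j] for j in range(n))
             for i in range(n)]
    return B
```

With two facets, a light-emitting one and a reflective one that sees it, the reflective facet picks up half the emitter's light after a single bounce, which is precisely the "illuminated diffuse patch acts as a light source" effect described above.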


Antialiasing

Antialiasing depends heavily on ideas from signal analysis and sampling theory, but for our purposes in this assignment we will take a simple approach to the subject. The goal of antialiasing is to remove sampling artifacts from the image. In this case the problem is sampling only one color value per pixel. This results in what has come to be known (it's actually in the dictionary as a computer graphics term) as the "jaggies": the staircase-like effect along the edges of lines or polygons in an image. One solution to this problem is to color a pixel based on the percentage of its area covered by a polygon. This is called "unweighted area sampling". An example of the normal technique next to an example of unweighted area sampling follows:

While analytically computing the area of this intersection would be difficult, especially in a ray tracing system, there is a more straightforward way to approximate this result. By sending out a large number of rays through a pixel and averaging their color values, we get an approximation to unweighted area sampling. The following image shows a number of rays (the black circles) sent out in the direction of a pixel. The color of the pixel is determined by averaging the colors of the surfaces the rays hit.
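The average-many-rays idea can be sketched directly. The `coverage` function here is a hypothetical stand-in for "trace a ray through this point and return the color seen"; the function and parameter names are illustrative, not part of any renderer's API.

```python
import random

# Sketch: approximating unweighted area sampling by averaging
# many jittered samples within one pixel. Illustrative only.

def supersample_pixel(coverage, px, py, n, rng=random.random):
    """Average n samples at random offsets inside pixel (px, py).
    coverage(x, y) returns the color (a float here) seen at a point."""
    total = 0.0
    for _ in range(n):
        x = px + rng()   # random offset in [0, 1) within the pixel
        y = py + rng()
        total += coverage(x, y)
    return total / n
```

As n grows, the average converges to the fraction of the pixel's area covered by each surface, weighted by that surface's color, which is exactly the unweighted-area-sampling result.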

Producing RIB

What is RIB? RIB files are RenderMan Interface Bytestream files. They are written to comply with the RenderMan interface standard, developed by PIXAR. All the documentation for the format is available on-line from PIXAR, and you can find a summary here. You should read this documentation when you want to go in and modify your .rib files by hand. You will need to do this for some of the more complex effects at the end of the assignment. Many programs produce RIB output for rendering with RenderMan (a fancy rendering program from Pixar) or other tools that comply with the RenderMan standard (e.g., Pixie, BMRT, or public-domain renderers that support ray tracing and radiosity).

How do I get a RIB file?

From the SLIDE viewer you can output a RIB file of the current geometry by selecting "save as rib" from the file menu. For this assignment you will be using the full SLIDE viewer, called slide, which has all the features of SLIDE implemented for you. Once you have written the RIB file (name it with extension .rib), quit slide. You can examine the RIB file in a text editor; it is basically a tagged text file. The RIB files produced by the SLIDE viewer have comments that should help you find which parts of your geometry have been converted into which parts of the RIB file. The RIB file will contain information about geometry, surface properties, and light sources. While RIB supports texture mapping, it cannot use the .gif files that SLIDE does. At the end of the RIB file produced by SLIDE there will be a list of the texture files used in the file. You need to convert each of these from a .gif file to a .tif file, and then to a .tx (texture) file. You can use Microsoft Imaging editor (under Accessories on the NT machines in the lab) to produce .tif files from your .gif files. Load the .gif and then select "save as". Be sure to save the file as a 24-bit TIFF (this is an option under "more" in the save-as dialog), otherwise it will not work. Follow the instructions from the next step for setting the path to the BMRT binaries, and then you can run the following command to convert your TIFF into a texture file:

T:\mkmip.exe myTexture.tif myTexture.tx  (for BMRT)
T:\texmake myTexture.tif myTexture.tx  (for Pixie)

This will produce a file called myTexture.tx which you should place in the same directory as the RIB file. Once you have done this for all the textures you are using, you are ready to begin rendering!


Making a Rendering

We are supplying two RIB compliant renderers: Pixie and BMRT. You are also free to use other renderers, such as Maya.

Pixie (Renderman Renderer)

Using Pixie will probably be slightly more of a hassle, but what you will learn is more useful.  For standard features, Pixie uses the same interface as Pixar's Renderman (the Renderman Interface).  This will give you practice editing rib files and get you somewhat acquainted with some things that people actually do, including writing shaders.

Pixie is already set up in the labs. If you download Pixie for Windows (http://www.cs.berkeley.edu/~okan/Pixie/download) at home, there will be a couple of environment variables to change.

Add "C:\Program Files\Pixie\bin" (or wherever the bin folder is on your machine) to your path
Add a new environment variable called "PIXIEHOME" and point it to "C:/Program Files/Pixie" (or wherever Pixie is)

When you render the .rib file straight from SLIDE, you'll notice that there are no shadows.  This is a good first thing to fix.  You must (1) add lines to the .rib file and (2) create a custom shader that takes visibility into account.

SHADOWS:
To turn on ray tracing (for shadows, etc.) add this to the beginning of the .rib file:

Attribute "visibility" "trace" [1]
Attribute "visibility" "shadow" [1]
Attribute "visibility" "transmission" "opaque"

Once ray tracing is turned on, you can use visibility(A,B) to compute
shadows inside light source shaders, or trace(A,B) inside surface shaders.
For example, to make the pointlight shader cast raytraced shadows, change the
shader from

illuminate (from) {
Cl = intensity * lightcolor / (L . L);
}


to

illuminate (from) {
Cl = visibility(from,Ps)*intensity * lightcolor / (L . L);
}

Also change the name of the shader in the file from 'light' to 'mylight' for example.

Save your changed shader in your as9 folder (perhaps a /shader subfolder).

You will then need to recompile the shader that you've changed. 
At the command line, in the directory containing the .sl file:

sdrc myshader.sl

This should create mylight.sdr (following from my example)
Note that even if you change pointlight.sl and save it as pointlight2.sl, when you compile it, it WILL
OVERWRITE pointlight.sdr INSTEAD of generating pointlight2.sdr. The shader name is the name in the
file, not the filename.

Now you'll have to replace the previously named light shader 'light' in the .rib file with your new shader 'mylight'.  Also, you'll have to add a line that adds your shader directory to the shader search path.

Option "searchpath" "shader" "H:/myshaders:&"

Now rendering the .rib file should produce an image with shadowing.

  

That's sort of a lot to get shadows working, but these are exactly the same issues you would face using Pixar's Renderman.

 

AREA LIGHTS:
You shouldn't actually have to do anything for area lights, but you may want to tweak the sampling parameter.  The line that SLIDE already places in the RIB file for you is:

Attribute "trace" "numarealightsamples" [30]

30 is a pretty good number, but a higher number will give you a smoother result.

 


PHOTONMAPS:
Pixie does not implement radiosity. It uses photon maps, which are more useful for generating all aspects of global illumination (not just diffuse bounces).

Normally a modeling package like Maya will have options to add lines for irradiance caching to the RIB file for you.  It will format your file to do fancy things like two-pass rendering. That would be difficult for you to do by hand, so we are going to stick with one pass.

RIB file changes:

Option "hider" "photonmap" [1]
Attribute "visibility" "photon" [1]
Attribute "trace" "maxdiffusedepth" [1]

Note that global illumination is a messy process and involves lots of
parameter tweaking. The parameters involved in photon mapping and their default values are:

Attribute "globalphoton" "estimator" [100]     # the number of photons to use in radiance estimate
Attribute "globalphoton" "maxdistance" [1]     # The lookup distance
Attribute "globalphoton" "emit" [10000]        # Number of photons to emit
Option "irradiance" "maxerror" [0.5]           # The irradiance cache error control
Option "irradiance" "minsampledistance" [0.1]  # The irradiance cache error control
Option "irradiance" "maxsampledistance" [1]    # The irradiance cache error control

You can use causticphoton instead of globalphoton above.

Okan was nice enough to fiddle with the parameters and find what looks good for one of the example scenes.  Try these values first when you're looking at photon maps:

Attribute "globalphoton" "estimator" [500]
Attribute "globalphoton" "maxdistance" [1]
Attribute "globalphoton" "emit" [100000]

Also, you must change all instances of the "plastic" and "matte" shaders to "gplastic" and "gmatte", respectively.

The Pixie documentation can be found online at http://www.cs.berkeley.edu/~okan/Pixie/doc/ .

You may also find it useful to browse the complete PIXAR RenderMan Interface Specification.

BMRT (the Blue Moon Rendering Tools)

You can download the Win32 binaries for BMRT here. If you need it for other platforms, email the TA's.

The first step of rendering is to set up some environment variables so that you can easily run the BMRT utilities from the DOS command prompt. The software is installed on the caffe partition, and you will need to add some entries to your path and some environment variables. To do this, right-click on the My Computer icon in the upper left of your Windows NT desktop, and select the Environment tab.

For your path add the following directory:
S:\bmrt\SYSTEM\NT\bmrt2.4\bin
Then add the following two environment variables with the values shown:
SHADERS 
S:\bmrt\SYSTEM\NT\bmrt2.4\shaders

HOME 
S:\bmrt\SYSTEM\NT\bmrt2.4

Make sure to set these and click Apply before opening a DOS shell. Also, if you ever get a "cannot load shader xxxx" error, check these variables. Now open a DOS shell and test your setup by going to the S:\bmrt\SYSTEM\NT\bmrt2.4\examples directory and typing
rgl.exe shadtest.rib
This should produce a faint image. Then try:
rendrib.exe -d shadtest.rib
This should produce a more detailed image. Now you are ready to go to the directory with your .rib file and .tx texture files and render away. Near the beginning of a SLIDE-produced RIB file is a line beginning with Display. The first argument in quotes is the name of the file the rendered image will be saved to, if you save it. The next argument tells the renderer to render to the screen. Once the image has been rendered to the screen, you can press "w" in the viewing window and it will be saved to the filename specified. The output is always a TIFF file, and should have extension .tif. Also near the beginning of the RIB file is a line beginning with Format, with two parameters that specify the size of the output image in pixels. You can make this smaller to speed up rendering. We suggest commenting it out completely and using the -res command-line option described below. The following command-line options for rendrib.exe will probably be useful while you render various images.
Command line options:

-samples a b
    Makes the ray tracer send out a columns of b rays per pixel, for a total of a*b rays per pixel.
-var threshold min max
    Sends out min samples per pixel; if these samples have a variance above threshold, sends out up to max samples until the variance falls below the threshold.
-d [n]
    Tells BMRT to render to the screen. The optional argument n makes it trace every nth line, creating a progressive refinement of the image. (Good for a quick check.) When the rendering is done, pressing "w" in the display window will save the .tif file, if one was specified in the .rib file.
-res x y
    Tells BMRT to make an image x by y pixels in size.
-radio n
    Tells BMRT to use radiosity and compute at most n radiosity passes.
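The adaptive scheme behind the -var option can be sketched as follows. This is an illustrative reconstruction of the idea, not BMRT's actual implementation; `sample` is a hypothetical stand-in for tracing one ray through the pixel.

```python
# Sketch: variance-driven adaptive sampling, as in "-var threshold
# min max": start with min samples, add more while the sample
# variance exceeds the threshold, up to max samples.

def adaptive_pixel(sample, threshold, min_s, max_s):
    """Return the average of an adaptively chosen number of samples."""
    samples = [sample() for _ in range(min_s)]
    while len(samples) < max_s:
        mean = sum(samples) / len(samples)
        var = sum((s - mean) ** 2 for s in samples) / len(samples)
        if var <= threshold:
            break   # pixel is uniform enough; stop sampling
        samples.append(sample())
    return sum(samples) / len(samples)
```

This is why -var is cheaper than -samples at comparable quality: smooth regions of the image stop at the minimum count, and only pixels that straddle edges or fine detail spend the extra rays.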

You can find more information in the README files in the BMRT installation (see Links and References below).


Hand-in Requirements

You will need to hand in some code, a .rib file, a sequence of five images, and several short write-ups, as follows:
put your write-ups in a file called ANSWERS,
and each one of your images in a file named "image[1-5]xx.jpg", where the "xx" part represents the letters of your instructional account.
Then submit all of these files from your as9 directory.

You may do this assignment alone or with your preferred partner;
but if you work as a pair you will have to submit twice as many images:
labeled as "image[1-5]xx.jpg" and "image[1-5]yy.jpg", respectively, reflecting your two account logins.
You may use the same sculpture model and the same basic setup, but you should choose two different sets of materials, surface treatments, light constellations, and viewpoints.