Rendering Guidelines

Basic Scene Anatomy

The fundamental building blocks of an ɴsɪ scene

A minimal (and useful) ɴsɪ scene graph contains the three following components:

  1. Geometry linked to the .root node, usually through a transform chain.

  2. ᴏsʟ materials linked to scene geometry through an attributes node.

  3. At least one outputdriver​→​outputlayer​→​screen​→​camera​→​.root chain to describe a view and an output device.
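
Such a minimal scene can be sketched in the ɴsɪ stream format used throughout this document. All handles, as well as the camera and driver types, are illustrative; a material (an attributes node carrying a surface shader) would still be needed on the geometry for anything to be visible:

```
# Geometry linked to .root through a transform chain
Create "mesh1" "mesh"
Create "mesh1_xform" "transform"
Connect "mesh1" "" "mesh1_xform" "objects"
Connect "mesh1_xform" "" ".root" "objects"

# A view: outputdriver → outputlayer → screen → camera → .root
Create "cam1" "perspectivecamera"
Create "cam1_xform" "transform"
Connect "cam1" "" "cam1_xform" "objects"
Connect "cam1_xform" "" ".root" "objects"

Create "screen1" "screen"
Connect "screen1" "" "cam1" "screens"

Create "layer1" "outputlayer"
SetAttribute "layer1" "variablename" "string" 1 ["Ci"]
Connect "layer1" "" "screen1" "outputlayers"

Create "driver1" "outputdriver"
SetAttribute "driver1" "drivername" "string" 1 ["exr"]
Connect "driver1" "" "layer1" "outputdrivers"
```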

The scene graph above shows a renderable scene with all the necessary elements. Note how the connections always lead to the .root node.

In this view, a node with no output connections is not relevant by definition and will be ignored.

Caution

For the scene to be visible, at least one of the materials has to be emissive.

A Word – or Two – About Attributes

Those familiar with the RenderMan standard will remember the various ways to attach information to elements of the scene: standard attributes, user attributes, primitive variables, and construction parameters, i.e. parameters passed to RenderMan Interface calls to build certain objects, such as the knot vectors passed to RiNuPatch().

Attribute inheritance and override

In ɴsɪ things are simpler: all attributes are set through the NSISetAttribute() mechanism. The only distinction is that some attributes are required (intrinsic attributes) and some are optional. A mesh node, for example, needs to have P and nvertices defined; otherwise the geometry is invalid.
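
For example, a minimal triangle mesh could declare its intrinsic attributes as follows in the ɴsɪ stream format (the handle name is illustrative):

```
Create "triangle" "mesh"
SetAttribute "triangle"
    "nvertices" "int" 1 [3]
    "P" "point" 3 [0 0 0  1 0 0  0 1 0]
```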

Note

In this documentation, all intrinsic attributes are documented at the beginning of each section describing a particular node.

In ᴏsʟ shaders, attributes are accessed using the getattribute() function, and this is the only way to access attributes in ɴsɪ. Having a single way to set and a single way to access attributes makes things simpler (a design goal) and allows for extra flexibility (another design goal). The figure above shows two features of attribute assignment in ɴsɪ:

Attribute inheritance

Attributes attached at some parent (in this case, a metal material) affect geometry downstream.

Attribute override

It is possible to override attributes for a specific geometry by attaching them to a transform node directly upstream (the plastic material overrides metal upstream).

Note that any non-intrinsic attribute can be inherited and overridden, including vertex attributes such as texture coordinates.
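
Both mechanisms can be sketched in the ɴsɪ stream format. The handles below are illustrative; attributes nodes are assumed to be connected through the geometryattributes attribute of transform nodes:

```
# "metal_attr" is inherited by all geometry below "group_xform"
Connect "metal_attr" "" "group_xform" "geometryattributes"

# "plastic_attr", attached to a transform directly upstream of one
# geometry, overrides the inherited metal material for that geometry
Connect "plastic_attr" "" "ball_xform" "geometryattributes"
```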

Instancing

Instancing in ɴsɪ is performed naturally by connecting a geometry node to more than one transform (that is, by connecting the geometry node into the transform.objects attribute of several transforms).

Instancing in ɴsɪ with attribute inheritance and per-instance attribute override

The above figure shows a simple scene with a geometry instanced three times. The scene also demonstrates how to override an attribute for one particular geometry instance, an operation very similar to what we have seen in the attributes section. Note that transforms can also be instanced and this allows for instances of instances using the same semantics.
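
The instancing above can be sketched in the ɴsɪ stream format (all handles are illustrative):

```
Create "blade" "mesh"
# Three transforms, each producing one instance of the same geometry
Connect "blade" "" "xform1" "objects"
Connect "blade" "" "xform2" "objects"
Connect "blade" "" "xform3" "objects"

# Per-instance override: a dedicated attributes node on one transform
Connect "red_attr" "" "xform2" "geometryattributes"
```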

Creating ᴏsʟ Networks

A simple ᴏsʟ network connected to an attributes node

The semantics used to create ᴏsʟ networks are the same as for scene creation. Each node in the network corresponds to a shader node, which must be created using NSICreate. Each shader node has implicit attributes corresponding to the shader's parameters, and connections between those parameters are made using NSIConnect. The diagram above depicts a simple ᴏsʟ network connected to an attributes node.

Some observations:

  • Both the source and destination attributes (passed to NSIConnect) must be present and map to valid and compatible shader parameters (Lines 21–23).

Note

There is an exception to this: any non-shader node can be connected to a string attribute of a shader node. This will result in the non-shader node’s handle being used as the string’s value.

This behavior is useful when the shader needs to refer to another node, in an ᴏsʟ call to transform() or getattribute(), for example.

  • There is no symbolic linking between shader parameters and geometry attributes (a.k.a. primvars). One has to explicitly use the getattribute() ᴏsʟ function to read attributes attached to the geometry. In the listing below, this is done in the read_attribute node (Lines 11–14). Also see the section on attributes.

 1  Create "ggx_metal" "shader"
 2  SetAttribute "ggx_metal"
 3      "shaderfilename" "string" 1 ["ggx.oso"]
 4
 5  Create "noise" "shader"
 6  SetAttribute "noise"
 7      "shaderfilename" "string" 1 ["simplenoise.oso"]
 8      "frequency" "float" 1 [1.0]
 9      "lacunarity" "float" 1 [2.0]
10
11  Create "read_attribute" "shader"
12  SetAttribute "read_attribute"
13      "shaderfilename" "string" 1 ["read_attributes.oso"]
14      "attributename" "string" 1 ["st"]
15
16  Create "read_texture" "shader"
17  SetAttribute "read_texture"
18      "shaderfilename" "string" 1 ["read_texture.oso"]
19      "texturename" "string" 1 ["dirt.exr"]
20
21  Connect "read_attribute" "output" "read_texture" "uv"
22  Connect "read_texture" "output" "ggx_metal" "dirtlayer"
23  Connect "noise" "output" "ggx_metal" "roughness"
24
25  # Connect the OSL network to an attributes node
26  Connect "ggx_metal" "Ci" "attr" "surfaceshader"

Lighting in the Nodal Scene Interface

Creating lights in ɴsɪ

There are no special light source nodes in ɴsɪ (although the environment node, which defines a sphere of infinite radius, could be considered a light in practice).

Any scene geometry can become a light source if its surface shader produces an emission() closure. Some operations on light sources, such as light linking, are done using more general approaches.

What follows is a quick summary of how to create different kinds of lights in ɴsɪ.

Area Lights

Area lights are created by attaching an emissive surface material to geometry. Below is a simple ᴏsʟ shader for such lights (standard ᴏsʟ emitter).

Example emitter for area lights
// Copyright (c) 2009-2010 Sony Pictures Imageworks Inc., et al.  All Rights Reserved.
surface emitter     [[ string help = "Lambertian emitter material" ]]
(
    float power = 1 [[ string help = "Total power of the light" ]],
    color Cs = 1    [[ string help = "Base color" ]])
{
    // Because emission() expects a weight in radiance, we must convert by dividing
    // the power (in Watts) by the surface area and the factor of PI implied by
    // uniform emission over the hemisphere. N.B.: The total power is BEFORE Cs
    // filters the color!
    Ci = (power / (M_PI * surfacearea())) * Cs * emission();
}
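
Assuming the shader above is compiled to emitter.oso, turning a piece of geometry into an area light is then a matter of wiring it through an attributes node. The handles below (including "quad_xform", a transform assumed to carry the light geometry) are illustrative:

```
Create "emitter1" "shader"
SetAttribute "emitter1"
    "shaderfilename" "string" 1 ["emitter.oso"]
    "power" "float" 1 [100.0]

Create "light_attr" "attributes"
Connect "emitter1" "Ci" "light_attr" "surfaceshader"

# Attach to the transform carrying the light geometry
Connect "light_attr" "" "quad_xform" "geometryattributes"
```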

Spot and Point Lights

Such lights are created using an epsilon-sized geometry (a small disk, a particle, etc.) and, optionally, extra arguments to the emission() closure.

An example OSL spot light shader
surface spotlight(
    color i_color = color(1),
    float intensity = 1,
    float coneAngle = 40,
    float dropoff = 0,
    float penumbraAngle = 0
) {
    color result = i_color * intensity * M_PI;

    // Cone and penumbra
    float cosangle = dot(-normalize(I), normalize(N));
    float coneangle = radians(coneAngle);
    float penumbraangle = radians(penumbraAngle);

    float coslimit = cos(coneangle / 2);
    float cospen = cos((coneangle / 2) + penumbraangle);
    float low = min(cospen, coslimit);
    float high = max(cospen, coslimit);

    result *= smoothstep(low, high, cosangle);

    if (dropoff > 0) {
        result *= clamp(pow(cosangle, 1 + dropoff), 0, 1);
    }
    Ci = result / surfacearea() * emission();
}

Directional and HDR Lights

Directional lights are created by using the environment node and setting its angle attribute to 0. HDR lights are also created using the environment node, albeit with a 2π cone angle, and reading a high dynamic range texture in the attached surface shader. Other directional constructs, such as solar lights, can also be obtained using the environment node.

Since the environment node defines a sphere of infinite radius, any connected ᴏsʟ shader must rely only on the I variable and disregard P, as shown below.

An example OSL shader to do HDR lighting
shader hdrlight(
    string texturename = ""
) {
    vector wi = transform("world", I);

    float longitude = atan2(wi[0], wi[2]);
    float latitude = asin(wi[1]);

    float s = (longitude + M_PI) / M_2PI;
    float t = (latitude + M_PI_2) / M_PI;

    Ci = emission() * texture(texturename, s, t);
}

Note

Environment geometry is visible to camera rays by default, so it will appear as a background in renders. To disable this, simply switch off camera visibility on the associated attributes node.
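
Putting the pieces together, an HDR environment light could be sketched as follows. The handles and the texture name are illustrative, and the visibility.camera attribute is used here on the assumption that it controls camera-ray visibility:

```
Create "env" "environment"
Connect "env" "" ".root" "objects"

Create "hdr1" "shader"
SetAttribute "hdr1"
    "shaderfilename" "string" 1 ["hdrlight.oso"]
    "texturename" "string" 1 ["studio.exr"]

Create "env_attr" "attributes"
Connect "hdr1" "Ci" "env_attr" "surfaceshader"
# Hide the environment sphere from camera rays
SetAttribute "env_attr" "visibility.camera" "int" 1 [0]
Connect "env_attr" "" "env" "geometryattributes"
```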

Defining Output Drivers and Layers

ɴsɪ graph showing the image output chain

ɴsɪ allows for a very flexible image output model. All the following operations are possible:

  • Defining many outputs in the same render (e.g. many EXR outputs)

  • Defining many output layers per output (e.g. multi-layer EXRs)

  • Rendering different scene views per output layer (e.g. one pass stereo render)

  • Rendering images of different resolutions from the same camera (e.g. two viewports using the same camera, in an animation software)

The figure above depicts an ɴsɪ scene that creates one file with three layers. In this case, all layers are saved to the same file and the render uses one view. A more complex example is shown in the figure below: left and right cameras drive two file outputs, each having two layers (Ci and Diffuse colors).
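
A single file with several layers could be sketched as follows in the ɴsɪ stream format. The handles are illustrative and assume a screen node "screen1" already connected to a camera:

```
Create "layer_ci" "outputlayer"
SetAttribute "layer_ci" "variablename" "string" 1 ["Ci"]
Connect "layer_ci" "" "screen1" "outputlayers"

Create "layer_diffuse" "outputlayer"
SetAttribute "layer_diffuse" "variablename" "string" 1 ["diffuse"]
Connect "layer_diffuse" "" "screen1" "outputlayers"

# One driver writing both layers into the same multi-layer EXR
Create "exr1" "outputdriver"
SetAttribute "exr1" "drivername" "string" 1 ["exr"]
Connect "exr1" "" "layer_ci" "outputdrivers"
Connect "exr1" "" "layer_diffuse" "outputdrivers"
```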

ɴsɪ graph for a stereo image output

Light Layers

The ability to render a certain set of lights per output layer has a formal workflow in ɴsɪ. One can use three methods to define the lights used by a given output layer:

  1. Connect the geometry defining lights directly to the outputlayer.lightset attribute

  2. Create a set of lights using the set node and connect it into outputlayer.lightset

  3. A combination of both 1 and 2

The diagram above shows a scene using this workflow to create an output layer containing only the illumination from two lights of the scene. Note that if there are no lights or light sets connected to the lightset attribute, then all lights are rendered. The final output pixels contain the illumination from the considered lights on the specific surface variable specified in outputlayer.variablename.
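
Methods 1 and 2 could be sketched as follows. The handles are illustrative, and the set node is assumed to collect its members through a members attribute:

```
# Method 1: connect light geometry directly to the layer's lightset
Connect "key_light" "" "layer1" "lightset"

# Method 2: group lights in a set node and connect the set
Create "rim_lights" "set"
Connect "rim_left" "" "rim_lights" "members"
Connect "rim_right" "" "rim_lights" "members"
Connect "rim_lights" "" "layer1" "lightset"
```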

Inter-Object Visibility

Some common rendering features are difficult to achieve using attributes and hierarchical tree structures. One such example is inter-object visibility in a 3D scene. A special case of this feature is light linking which allows the artist to select which objects a particular light illuminates, or not. Another classical example is a scene in which a ghost character is invisible to camera rays but visible in a mirror.

In ɴsɪ such visibility relationships are implemented using cross-hierarchy connections between one object and another. In the case of the mirror scene, one would first tag the character invisible using the visibility attribute and then connect the attributes node of the receiving object (mirror) to the visibility attribute of the source object (ghost) to override its visibility status. Essentially, this “injects” a new value for the ghost's visibility for rays coming from the mirror.

Visibility override, both hierarchically and inter-object

Above figure shows a scenario where both hierarchy attribute overrides and inter-object visibility are applied:

  • The ghost transform has a visibility attribute set to 0 which makes the ghost invisible to all ray types

  • The hat of the ghost has its own attribute with a visibility set to 1 which makes it visible to all ray types

  • The mirror object has its own attributes node that is used to override the visibility of the ghost as seen from the mirror. The ɴsɪ stream code to achieve that would look like this:

    Connect "mirror_attribute" "" "ghost_attributes" "visibility"
        "value" "int" 1 [1]
        "priority" "int" 1 [2]
    

    Here, a priority of 2 has been set on the connection for documentation purposes, but it could have been omitted since connections always override regular attributes of equivalent priority.
