Pixel Processor Ray Tracer


The pixel processor ray tracer sample uses the Python API and the sbsMath library to generate a complex pixel processor node that ray traces a sphere.

  • raytracer.py A script that creates the pixel processor functions required for the sample and builds a network so the result can be visualized.
  • sbsmath.py A Python module introducing Python types for generating pixel processors, functions and dynamic values in a convenient way.
  • sbsmath_tools.py A set of higher-level operations that make it more convenient to work with sbsmath.py.

The output is written to the ray tracer directory inside the samples directory and consists of a new sbs file with a pixel processor that ray traces a sphere.

To run the demo, make sure Python is installed and on your PATH, and that the Substance Python API is installed. Then go to the samples directory and run:

python raytracer.py

It will generate a Substance file containing a node that ray traces a sphere and lights it.


Ray tracing was not chosen here as a model of how pixel processor nodes should be used. Rather, it is an example of computations that would be hard to create manually in a pixel processor, and of how the autograph library makes them easier to understand and work with.

The fundamental process is described here. Briefly, what happens is:

  • A camera ray is generated for each pixel.
  • The ray is intersected with a sphere.
  • The intersection point is turned into a uv coordinate.
  • Textures for roughness and diffuse are sampled.
  • The point is lit by a directional light source.
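The intersection step above can be sketched in plain Python (this is ordinary math, not the sbsmath API; the function name and signature are illustrative):

```python
import math

def intersect_sphere(ray_origin, ray_dir, center, radius):
    """Return the distance t to the nearest hit along the ray, or None on a miss."""
    # Vector from the sphere center to the ray origin
    oc = [o - c for o, c in zip(ray_origin, center)]
    # Quadratic coefficients for |origin + t*dir - center|^2 = radius^2;
    # ray_dir is assumed normalized, so the t^2 coefficient is 1.
    b = 2.0 * sum(d * e for d, e in zip(ray_dir, oc))
    c = sum(e * e for e in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

# A ray from the origin straight down +z hits a sphere at (0, 0, 130)
# with radius 30 at distance 100.
print(intersect_sphere([0.0, 0.0, 0.0], [0.0, 0.0, 1.0], [0.0, 0.0, 130.0], 30.0))
```

In the sample, the same quadratic is built out of sbsmath nodes so it runs per pixel on the GPU.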

Also note that there are some oddities in how samples that miss the sphere are dealt with: they are still lit, but the result is discarded at the end. This is a consequence of the programming model used in pixel processors.
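Because every node is evaluated for every pixel, with no early-out branching, the discard described above amounts to a branchless select. A minimal plain-Python sketch of the idea (the function name is illustrative, not part of the sample):

```python
def select(hit, lit_color, background_color):
    """Blend two colors per channel: hit is 1.0 where the ray struck the
    sphere and 0.0 where it missed, so missed pixels keep the background
    even though the lighting was computed for them anyway."""
    return [hit * l + (1.0 - hit) * b for l, b in zip(lit_color, background_color)]

print(select(1.0, [1.0, 0.0, 0.0], [0.3, 0.3, 0.6]))  # hit: lit color wins
print(select(0.0, [1.0, 0.0, 0.0], [0.3, 0.3, 0.6]))  # miss: background wins
```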

The bulk of this demo is in sbsmath.py, where a set of new Python types with overloaded arithmetic operators is introduced. To understand it better, let's take a look at a snippet of code that generates a function in the Substance document.

def sphere_uv(fn):
    """
    Substance function for generating a uv coordinate from a sphere normal and a tiling factor

    :param fn: The function context to create the function in
    :type fn: FunctionContext
    :return: function, a function to call to instantiate the function
    """
    normal = fn.input_parameter('normal', widget_type=sbsenum.WidgetEnum.COLOR_FLOAT3)
    tiling = fn.input_parameter('tiling', widget_type=sbsenum.WidgetEnum.COLOR_FLOAT1)
    fn_pi = fn.import_external_function('sbs://functions.sbs/Functions/Math/Pi')
    pi = fn_pi()
    two_pi = pi * 2.0
    u = (fn.atan2(fn.expand(normal[0], normal[2])) + pi) / two_pi
    v = fn.atan2(fn.expand(1.0, normal[1])) / pi
    uv = fn.expand(u * tiling, v * tiling)
    return fn.generate(uv)

This code generates a function that computes uvs for a position on a sphere, using the sphere normal as input together with a tiling parameter that sets how many times the texture should wrap along the u and v directions.

  • The fn parameter represents a function context, a helper object for code generation. You never need to create these function contexts yourself; the library handles them behind the scenes.
  • The calls to fn.input_parameter generate the inputs for the function.
  • The call to fn.import_external_function imports a function (in this case a constant) from the Substance library of functions.
  • Calls such as fn.atan2 invoke built-in functions.
  • Operators such as +, - and * are overloaded to generate nodes for addition, subtraction, multiplication and so on.
  • The call to fn.generate takes the output node of the function as input. Any function created in the system needs a signature like this.
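The operator-overloading idea behind the bullet points above can be sketched in a few lines. This is a toy illustration, not the real sbsmath implementation: each arithmetic operator on a Node returns a new Node recording the operation, so an ordinary Python expression builds up a graph rather than computing a value.

```python
class Node:
    """A tiny expression-graph node: an operation name plus child nodes."""
    def __init__(self, op, *children):
        self.op = op
        self.children = children

    def __add__(self, other):
        # '+' does not add numbers; it records an 'add' node in the graph.
        return Node('add', self, _wrap(other))

    def __mul__(self, other):
        return Node('mul', self, _wrap(other))

def _wrap(value):
    """Promote plain Python numbers to constant nodes."""
    return value if isinstance(value, Node) else Node('const', value)

# Writing a normal-looking expression produces a graph, not a number:
expr = Node('input', 'u') * 2.0 + Node('input', 'tiling')
print(expr.op)  # the root of the graph is the last operation applied
```

In sbsmath the same trick emits actual Substance function nodes instead of these toy objects, which is why plain-looking arithmetic in sphere_uv ends up as a node network.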

On its own this function doesn't do anything useful; to create it in a Substance document it needs to be passed to st.generate_function, like below:

st.generate_function(sphere_uv, doc, name='sphere_uv')

This will generate a function that can now be imported and called from other functions. The first parameter is the function to be processed, the second is the sbs document it should live in, and the name parameter gives the function a name so it can be found when called from other functions.

The output of this function will look like this in Substance Designer. 

As you can see, the normal and tiling inputs are created, together with a set of math nodes, finally feeding into an output node representing the value.

There are four other similar functions in the sample:

  • phong_lighting
  • ambient_lighting
  • intersect_sphere
  • render_sphere

These are similar to the sphere_uv function. An interesting part of the render_sphere function looks like this:

fn_sphere_uv = fn.import_local_function('sphere_uv')

In this section, the functions defined earlier are imported so they can be called directly in the graph, like this:

uv = fn_sphere_uv(i_normal, tiling)

To actually turn a function into a pixel processor node there is a final function called raytracer_pixel_processor. It is not fundamentally different from the other functions, but it gets created in a slightly different way:

pp = pp_node.getPixProcFunction()
st.generate_function(raytracer_pixel_processor, doc, fn_node=pp)

This generate_function call is almost identical to the previous one, but instead of passing a name parameter we pass an fn_node parameter. The fn_node is essentially a function graph; in our case it comes out of a pixel processor node. This means the function will not be a free function that can be accessed by name; instead it is put inside the pixel processor node.

There is also code in raytracer.py for creating a simple network and saving it out to disk.

The final result looks like this in Substance Designer:

All the functions defined are now available in the document and the pixel processor renders out the resulting image.

The sample also shows how a number of input parameters are exposed on the top-level graph:

graph.addInputParameter('sphere_radius', aWidget=sbsenum.WidgetEnum.SLIDER_FLOAT1, aDefaultValue=30.0)
graph.addInputParameter('sphere_origin', aWidget=sbsenum.WidgetEnum.SLIDER_FLOAT3, aDefaultValue=[0.0, 0.0, 130.0])
graph.addInputParameter('light_direction', aWidget=sbsenum.WidgetEnum.SLIDER_FLOAT3, aDefaultValue=[1.0, -1.0, -1.0])
graph.addInputParameter('light_color', aWidget=sbsenum.WidgetEnum.COLOR_FLOAT4, aDefaultValue=[1.0, 1.0, .7, 0])
graph.addInputParameter('ambient_color', aWidget=sbsenum.WidgetEnum.COLOR_FLOAT4, aDefaultValue=[0.3, 0.3, .6, 0])
graph.addInputParameter('background_color', aWidget=sbsenum.WidgetEnum.COLOR_FLOAT4, aDefaultValue=[0.3, 0.3, .6, 1.0])
graph.addInputParameter('tiling_scale', aWidget=sbsenum.WidgetEnum.SLIDER_FLOAT1, aDefaultValue=10.0)

In the raytracer_pixel_processor function we import references to these parameters so we can control the ray tracer through them:

sO = fn.variable('sphere_origin', widget_type=sbsenum.WidgetEnum.SLIDER_FLOAT3)
sR = fn.variable('sphere_radius', widget_type=sbsenum.WidgetEnum.COLOR_FLOAT1)
light_dir = fn.variable('light_direction', widget_type=sbsenum.WidgetEnum.SLIDER_FLOAT3)
light_color = fn.variable('light_color', widget_type=sbsenum.WidgetEnum.COLOR_FLOAT4)
ambient_color = fn.variable('ambient_color', widget_type=sbsenum.WidgetEnum.COLOR_FLOAT4)
background_color = fn.variable('background_color', widget_type=sbsenum.WidgetEnum.COLOR_FLOAT4)
tiling = fn.variable('tiling_scale', widget_type=sbsenum.WidgetEnum.COLOR_FLOAT1)

When editing the pixel processor function, it will look like this:

This allows us to tweak the ray tracer's parameters from the Substance Designer GUI: