
Cycles Volume displacement Shader (in early development)
Needs Review · Public

Authored by Nikolaij (FATTOMCAT) on Oct 20 2020, 12:17 AM.

Details

Reviewers
None
Group Reviewers
Cycles
Summary

Volume displacement is (to my knowledge) based on the idea of distorting the input position and checking the density at this distorted (or displaced) position. To address this I did my best at writing a node that checks whether a given point in space is inside a given mesh/object.
The idea is to have a bounding geometry (a cube, for example) and use an "Inside Test" node to find out whether or not the input position is inside another object.
If it is, the node returns 1, otherwise 0. One can use this information to drive a volume shader and thereby distort the volume.
This way volume displacement is possible.

I'm aware of the volume displacement modifier, but as far as I can tell it is still "locked" to a discrete grid. The shader would be entirely "procedural" and therefore better suited for fine detail, although it should be somewhat slower.

Here is a quick test render:

The node setup (the node is used in the cube's material, while the pipe's material is transparent):

I implemented this using raycasting and counting the number of intersections with the given object (odd -> inside | even -> outside).
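For illustration, here is a minimal, self-contained sketch of that parity test. This is not the patch code: the brute-force loop over triangles stands in for what the patch does with Cycles' BVH via a per-object intersection, and all names are mine.

```cpp
#include <array>
#include <cmath>
#include <vector>

struct Vec3 {
  float x, y, z;
};

static Vec3 sub(const Vec3 &a, const Vec3 &b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 cross(const Vec3 &a, const Vec3 &b) {
  return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
static float dot(const Vec3 &a, const Vec3 &b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

/* Möller-Trumbore ray/triangle intersection; only hits in front of the origin count. */
static bool ray_hits_triangle(const Vec3 &orig, const Vec3 &dir,
                              const Vec3 &v0, const Vec3 &v1, const Vec3 &v2)
{
  const float eps = 1e-7f;
  Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
  Vec3 p = cross(dir, e2);
  float det = dot(e1, p);
  if (std::fabs(det) < eps)
    return false; /* Ray is parallel to the triangle. */
  float inv_det = 1.0f / det;
  Vec3 t = sub(orig, v0);
  float u = dot(t, p) * inv_det;
  if (u < 0.0f || u > 1.0f)
    return false;
  Vec3 q = cross(t, e1);
  float v = dot(dir, q) * inv_det;
  if (v < 0.0f || u + v > 1.0f)
    return false;
  return dot(e2, q) * inv_det > eps; /* Hit must lie in front of the ray origin. */
}

/* Parity test: cast one ray in an arbitrary direction and count crossings with the
 * (closed) mesh. An odd number of crossings means the point lies inside. */
static bool point_inside_mesh(const Vec3 &p, const std::vector<std::array<Vec3, 3>> &triangles)
{
  const Vec3 dir = {0.577f, 0.577f, 0.577f}; /* Arbitrary, ideally not axis-aligned. */
  int hits = 0;
  for (const auto &tri : triangles)
    if (ray_hits_triangle(p, dir, tri[0], tri[1], tri[2]))
      hits++;
  return (hits & 1) == 1; /* odd -> inside, even -> outside */
}
```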

The existing concept could be improved by some kind of cache, but my Blender API knowledge is too limited to know exactly how this could be accomplished.
In addition, VDB support is not implemented yet.


This is still at an early alpha stage, but I figured sharing the progress makes it easier to improve the overall approach and exchange opinions and ideas.
OSL is also not working at the moment.

Diff Detail

Repository
rB Blender

Event Timeline

Nikolaij (FATTOMCAT) requested review of this revision. Oct 20 2020, 12:17 AM
Nikolaij (FATTOMCAT) created this revision.
Nikolaij (FATTOMCAT) edited the summary of this revision.

The node no longer uses the object pass but the actual object's ID, since the inside test is accomplished by counting the number of intersections with the tested object (odd -> inside | even -> outside). This way a local, per-object intersection can be tested, which is much faster than a general scene intersection.
In addition, normals no longer have to be checked, which saves time again, and since isect.Ng is not generally available (on the GPU, for example) the code is more general (at the time of writing I don't even know how to get the Ng of a GPU intersection test).

Nikolaij (FATTOMCAT) edited the summary of this revision. Oct 20 2020, 3:22 AM

I'm not sure this approach is something we should support in Cycles. Volume rendering is already expensive, doing a ray-trace for every shader evaluation step makes it even more so.

I think displacement support for rendering volume objects makes a lot of sense. It would be done by expanding the bounds by some specified amount of voxels (dilating the OpenVDB grid used for generating the bounds mesh), and then modifying the position used for shading and associated voxel data lookups.

The mesh based volume shading I consider more something for adding absorption or other effects within a mesh with a surface shader. For objects with soft boundaries I think the volume object should be used.
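To make the suggestion above concrete, here is a rough sketch of the dilation idea on the OpenVDB side, assuming a FloatGrid density grid and a maximum displacement given in voxels. The function names and overall structure are illustrative only, not Cycles code.

```cpp
#include <openvdb/openvdb.h>
#include <openvdb/tools/Interpolation.h>
#include <openvdb/tools/Morphology.h>

/* One-time preprocessing: grow the active region by the maximum displacement
 * (in voxels) so the bounds mesh generated from it encloses the displaced volume. */
void dilate_for_displacement(openvdb::FloatGrid &density, int max_displacement_voxels)
{
  openvdb::tools::dilateActiveValues(density.tree(), max_displacement_voxels);
}

/* Per-sample shading: look up the density at the displaced position instead of
 * the original shading position. */
float displaced_density(const openvdb::FloatGrid &density,
                        const openvdb::Vec3d &world_pos,
                        const openvdb::Vec3d &displacement)
{
  openvdb::tools::GridSampler<openvdb::FloatGrid, openvdb::tools::BoxSampler> sampler(density);
  return sampler.wsSample(world_pos + displacement);
}
```

Dilating once at bounds-mesh generation time would keep the per-sample cost to a single displaced grid lookup.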

I share your performance concerns, so I tried to do some kind of performance comparison.
I rendered the cube with and without the extra node (without the extra node means the whole cube is rendered as a normal volume).

| Adaptive Sampling | Volume Displacement | Normal   | Ratio |
|-------------------|---------------------|----------|-------|
| On                | 7.48 min            | 6.47 min | 1.156 |
| Off               | 9.48 min            | 7.35 min | 1.290 |

This isn't exactly fast if one considers that render engines are optimized for fractions of a percent, but it is faster than I expected.
Regarding further performance improvements, I am currently thinking about how one could implement a cache.
I imagine a two-step inside test to accelerate testing, as sketched below.
The first step is to have a 3D grid that approximates the mesh, just like a VDB, but in addition to the "standard" voxelization one stores whether or not a voxel directly intersects a polygon.
One first checks whether a point is inside this grid. If it is, one further tests whether the hit cell intersects a polygon (which is cached); if it does, one casts a ray to make the result exact, and if it doesn't, no ray needs to be cast at all.
This way a ray only has to be cast at the rim of the object; all other parts of the volume can be tested by simply reading the data structure.
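A minimal sketch of that two-level test, with illustrative names only; the exact ray-cast test is passed in as a callable (for example the parity test sketched further up), and how the cache would actually be built and stored in Cycles is left open.

```cpp
#include <cstdint>
#include <functional>
#include <vector>

enum class CellState : std::uint8_t { Outside, Inside, Boundary };

/* Coarse grid that approximates the mesh; cells are classified once, up front. */
struct InsideCache {
  int res[3];                    /* Grid resolution per axis. */
  float min[3], cell_size[3];    /* Bounding-box origin and per-axis cell size. */
  std::vector<CellState> cells;  /* res[0] * res[1] * res[2] cell states. */

  CellState lookup(const float p[3]) const
  {
    int idx[3];
    for (int i = 0; i < 3; i++) {
      idx[i] = int((p[i] - min[i]) / cell_size[i]);
      if (idx[i] < 0 || idx[i] >= res[i])
        return CellState::Outside; /* Point lies outside the cached bounding box. */
    }
    return cells[(idx[2] * res[1] + idx[1]) * res[0] + idx[0]];
  }
};

/* Two-level inside test: the cache answers most queries directly; a ray is only
 * cast (via exact_test) for cells that intersect the mesh surface. */
bool point_inside_cached(const float p[3],
                         const InsideCache &cache,
                         const std::function<bool(const float[3])> &exact_test)
{
  switch (cache.lookup(p)) {
    case CellState::Inside:
      return true;
    case CellState::Outside:
      return false;
    case CellState::Boundary:
      return exact_test(p); /* Only the rim of the object needs an exact ray cast. */
  }
  return false;
}
```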

As mentioned, I am new to the Blender API and will therefore have to learn quite a few things in order to know how exactly to do that, but I'll do my best.

Fixed a bug.
(scene_intersect_local expects lcg_state != NULL; otherwise hit counting won't work.)

For performance, my concern is mostly with more complex meshes and GPU rendering. Having a trace call in the middle of volume stepping is going to be hard to optimize for the GPU compiler. So we should try to design things in a way that avoids this entirely, even if there was caching.

It's just not a feature that I want to have in Cycles and maintain as we make changes to volume and GPU rendering. Volume displacement like this should only be supported for volume objects, not meshes.