
Fix T92308: OptiX denoising fails with high resolutions
ClosedPublic

Authored by Patrick Mours (pmoursnv) on Dec 1 2021, 12:02 PM.

Details

Summary

The OptiX denoiser has an upper limit on how many pixels it can denoise at once, so
this changes the OptiX denoising process to use tiles for high-resolution images.
The OptiX SDK provides a utility function for this purpose, so the changes are minor:
adjusting the configured tile size and including enough overlap between tiles.

Diff Detail

Repository
rB Blender
Branch
cycles_optix_denoiser_tiled (branched from master)
Build Status
Buildable 19121
Build 19121: arc lint + arc unit

Event Timeline

Patrick Mours (pmoursnv) requested review of this revision. Dec 1 2021, 12:02 PM
Patrick Mours (pmoursnv) created this revision.

Looks good, one question inline.

intern/cycles/device/optix/device_impl.cpp
890–891

Where does the 2048 limit come from?

I sometimes use a 3D viewport bigger than that on a 4K monitor, and I'm wondering if tiling in that case would negatively impact performance, or if it doesn't matter.

intern/cycles/device/optix/device_impl.cpp
890–891

It's an arbitrary limit that I chose so a normal HD image still fits into one tile. OptiX can denoise up to around 8100x8100, so this could still be increased. I wouldn't expect a huge performance difference between tiling and no tiling, but tiling does slightly more work because of the additional overlap between tiles, so it doesn't perform identically.

intern/cycles/device/optix/device_impl.cpp
890–891

I think we might as well increase it to 4096 then.

Accepting, assuming 2048 will be changed to 4096.

This revision is now accepted and ready to land. Dec 1 2021, 6:48 PM