The OptiX denoiser has an upper limit on how many pixels it can denoise at once, so
this changes the OptiX denoising process to use tiles for high-resolution images.
The OptiX SDK provides a utility function for exactly this purpose, so the changes are minor:
adjusting the configured tile size and including enough overlap between tiles.
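For reference, a minimal sketch of what the tiled invocation looks like with the SDK helper from `optix_denoiser_tiling.h` (available since OptiX 7.3). The wrapper function and its parameter names are illustrative, not the actual Cycles code; the overlap window size is the one reported by `optixDenoiserComputeMemoryResources()`.

```cpp
// Illustrative wrapper, not the actual Cycles code: denoise one layer in
// tile_w x tile_h tiles using the OptiX SDK utility. The helper expands each
// tile by the overlap window and invokes the denoiser once per tile.
#include <optix.h>
#include <optix_denoiser_tiling.h>

static OptixResult denoise_tiled(OptixDenoiser denoiser,
                                 CUstream stream,
                                 const OptixDenoiserParams &params,
                                 CUdeviceptr state,
                                 size_t state_size,
                                 const OptixDenoiserGuideLayer &guide_layer,
                                 const OptixDenoiserLayer &layer,
                                 CUdeviceptr scratch,
                                 size_t scratch_size,
                                 unsigned int overlap, /* from optixDenoiserComputeMemoryResources() */
                                 unsigned int tile_w,
                                 unsigned int tile_h)
{
  return optixUtilDenoiserInvokeTiled(denoiser,
                                      stream,
                                      &params,
                                      state,
                                      state_size,
                                      &guide_layer,
                                      &layer,
                                      1, /* numLayers */
                                      scratch,
                                      scratch_size,
                                      overlap,
                                      tile_w,
                                      tile_h);
}
```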
Details
Diff Detail
- Repository
- rB Blender
- Branch
- cycles_optix_denoiser_tiled (branched from master)
- Build Status
- Buildable 19121 Build 19121: arc lint + arc unit
Event Timeline
Looks good, one question inline.
| intern/cycles/device/optix/device_impl.cpp | |
|---|---|
| 890–891 | Where does the 2048 limit come from? I sometimes use a 3D viewport bigger than that on a 4K monitor, and I'm wondering if tiling in that case would negatively impact performance, or if it doesn't matter. |
| intern/cycles/device/optix/device_impl.cpp | |
|---|---|
| 890–891 | It's an arbitrary limit that I went with so a normal HD image still fits into one tile. OptiX can denoise up to around 8100x8100, so this could still be increased. I wouldn't expect a huge performance difference between tiling and no tiling, but tiling does slightly more work because of the additional overlap between tiles, so it doesn't perform exactly the same. |
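To get a rough sense of that overlap overhead, here is a small standalone estimate. The 64-pixel overlap is an assumed value for illustration only; the real overlap window size is model-dependent and reported by `optixDenoiserComputeMemoryResources()`.

```cpp
// Standalone estimate of the extra denoising work caused by tile overlap.
// The 64 px overlap is an assumed value for illustration; the real overlap
// window size is queried from the OptiX denoiser at setup time.
#include <algorithm>
#include <cstdio>

int main()
{
  const int W = 3840, H = 2160;  // 4K viewport
  const int tile = 2048;         // configured tile size
  const int overlap = 64;        // assumed overlap window

  long long denoised = 0;
  for (int y = 0; y < H; y += tile) {
    for (int x = 0; x < W; x += tile) {
      // Each tile is expanded by the overlap window, clamped to the image.
      const int x0 = std::max(0, x - overlap);
      const int y0 = std::max(0, y - overlap);
      const int x1 = std::min(W, x + tile + overlap);
      const int y1 = std::min(H, y + tile + overlap);
      denoised += (long long)(x1 - x0) * (y1 - y0);
    }
  }

  const long long pixels = (long long)W * H;
  std::printf("%lld px denoised for a %lld px image (%.1f%% extra)\n",
              denoised, pixels, 100.0 * (denoised - pixels) / pixels);
  return 0;
}
```

With these assumed numbers a 4K image split into 2048-pixel tiles denoises roughly 10% more pixels than the untiled path, which matches the "slightly more work" described above.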
| intern/cycles/device/optix/device_impl.cpp | |
|---|---|
| 890–891 | I think we might as well increase it to 4096 then. |
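With that change, the decision of whether to tile at all would presumably reduce to something like the following hypothetical sketch (`width` and `height` stand for the image dimensions; this is not the actual Cycles code):

```cpp
// Hypothetical sketch of the tiling decision after raising the limit:
// a full HD image, and most 4K viewports, now fit into a single tile,
// so the overlap cost of tiling only kicks in above 4096 pixels per side.
static bool use_tiled_denoising(unsigned int width, unsigned int height)
{
  const unsigned int max_tile_size = 4096; /* raised from the initial 2048 */
  return width > max_tile_size || height > max_tile_size;
}
```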