
Viewport denoising does not work for Pixel Size other than 1x
Closed, Resolved · Public

Description

System Information
Kubuntu Linux 20.04
Graphics card: GTX 960 4GB

Blender Version
Broken: master (own build, modified to support OptiX on a GTX card)

Open e.g. the BMW27.blend scene. Enable OptiX viewport denoising. Set a value other than 1x in “Performance > Viewport > Pixel Size”. Switch to viewport rendering.
Note that I don't have an RTX card; I applied this mod:
https://devtalk.blender.org/t/blender-2-8-cycles-optix-on-non-rtx-card/11224
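
For completeness, the same setup can be applied from the Python console. This is a minimal sketch; the property names (use_preview_denoising, preview_denoiser, preview_pixel_size) are my assumption for current master and may differ in older builds:

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'

# Enable viewport (preview) denoising with OptiX.
# Assumed property names; older builds exposed this differently.
scene.cycles.use_preview_denoising = True
scene.cycles.preview_denoiser = 'OPTIX'

# Performance > Viewport > Pixel Size: anything other than 1x triggers the issue.
scene.render.preview_pixel_size = '2'
```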

EDIT 2:
An important thing I had not noticed before: the problem I describe only occurs when "CUDA" is selected under System in Preferences. It works if OptiX is selected instead.
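
For reference, the device selection can also be switched from the Python console. This is only a sketch, assuming the usual Cycles add-on preference names (compute_device_type, get_devices); verify them in your build:

```python
import bpy

# The Cycles compute backend lives in the add-on preferences.
cycles_prefs = bpy.context.preferences.addons['cycles'].preferences

cycles_prefs.compute_device_type = 'CUDA'    # denoising broken with Pixel Size 2x
# cycles_prefs.compute_device_type = 'OPTIX' # denoising works with Pixel Size 2x

# Refresh the device list after changing the backend.
cycles_prefs.get_devices()
print(cycles_prefs.compute_device_type)
```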

Event Timeline

Since this is not reported on supported drivers, this could be due to the modification in the build.
Also, the bug tracker is only for official builds; you can ask about this on https://devtalk.blender.org though.

Could you add a simple .blend file?

Change Pixel Size from Automatic to 2x for example.

I can confirm that this bug is now in the master branch of 2.9 (tested with a GTX 1070 / R7 1700X on June 6th's nightly buildbot version). Since the "Automatic" pixel size resolves to more than 1x on a high-DPI screen (I tested with a 3840x2160 screen), denoising appears not to work at all, with no indication to the user as to what is wrong. Setting the pixel size manually to 1x does enable it, but the Cycles viewport at 1x pixel size at 4K is not really feasible.

YAFU (YAFU) updated the task description. Jun 8 2020, 3:22 PM
Patrick Mours (pmoursnv) closed this task as Resolved. Jun 10 2020, 2:13 PM
Patrick Mours (pmoursnv) claimed this task.