
Fix T82042: Crash when rendering huge images
ClosedPublic

Authored by Jeroen Bakker (jbakker) on Nov 10 2020, 5:06 PM.

Details

Summary

Issue is related to the definition of the GL_MAX_TEXTURE_SIZE.

OpenGL does not clearly define what GL_MAX_TEXTURE_SIZE exactly means. On both AMD and NVIDIA we have issues where huge textures fail to be created even when they are smaller than the reported limit. (See D9530: Added test case to research support of texture sizes.)
This patch tries to create the texture at a smaller size when texture creation fails.

Future

In my opinion we should create a solution that doesn't need downscaling. For this specific case ARB_sparse_texture might help to create cleaner code, but you would still have to commit the whole image. Another option is to optimize the scaling; the current implementation isn't optimized for performance.

Diff Detail

Repository
rB Blender

Event Timeline

Jeroen Bakker (jbakker) requested review of this revision.Nov 10 2020, 5:06 PM
Jeroen Bakker (jbakker) created this revision.
Jeroen Bakker (jbakker) planned changes to this revision.Nov 11 2020, 11:23 AM

@Brecht Van Lommel (brecht) yes will change it.

I also want to perform more tests to find out where the current limitations are. We might want to adapt the patch to those findings.

Clément Foucault (fclem) requested changes to this revision.Nov 14 2020, 3:12 PM

A better workaround would be to simply test whether texture creation worked. If it did not, divide its size by 2 and try again. Since the size is clamped to the maximum GL size (16K), its half always fits the 8K requirement. See P1767. However, this does not fix T82591 at all.
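The halve-and-retry workaround described above could look roughly like this. This is a minimal sketch, not the actual patch: the creation call is stubbed with a hypothetical 8K driver limit, since a real implementation would call glTexImage2D (or a proxy-texture query) and check for failure.

```c
#include <assert.h>
#include <stdbool.h>

/* Hypothetical limit used by the stub below. Real drivers report
 * GL_MAX_TEXTURE_SIZE but may still fail below that value. */
#define FAKE_DRIVER_LIMIT 8192

/* Stand-in for the real allocation (e.g. glTexImage2D + error check);
 * here it simply rejects anything above the fake limit. */
static bool try_create_texture(int width, int height)
{
  return width <= FAKE_DRIVER_LIMIT && height <= FAKE_DRIVER_LIMIT;
}

/* Halve both dimensions until creation succeeds or the size degenerates.
 * On success, *width and *height hold the size that was actually used. */
static bool create_texture_with_fallback(int *width, int *height)
{
  while (*width > 0 && *height > 0) {
    if (try_create_texture(*width, *height)) {
      return true;
    }
    *width /= 2;
    *height /= 2;
  }
  return false;
}
```

With a size clamped to 16K, at most one halving step is needed under this stub, which matches the "its half always fits the 8K requirement" reasoning above.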

You could always improve this and add more cases for the 8K limit and the buffer limit, but that's probably too driver-dependent to be useful.

In the future, we might want to avoid requesting the full-size image.

After some experiments I stuck with your solution, as the code didn't become too complex.
My alternative was to do this in 3 steps (100%, 75% and 50% of the max texture size).

In my opinion we should create a solution that doesn't need downscaling (e.g. sparse textures, https://www.khronos.org/registry/OpenGL/extensions/ARB/ARB_sparse_texture.txt). I would assume that users who want to render huge resolutions have modern GPUs. For GPUs not supporting this ARB extension we could still do the scaling.

Next to that, we could also optimize the current pipeline so that even when the extension isn't supported, the GPU texture is updated by tiles. During rendering, the area that changes is small. I could look into this for 2.92.
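The tile-based update idea above could be sketched like this. The tile size and helper name are assumptions for illustration, not Blender's actual code; a real implementation would issue one partial upload (glTexSubImage2D-style) per overlapping tile.

```c
#include <assert.h>

/* Assumed tile size for this sketch; not a Blender constant. */
#define TILE_SIZE 256

/* Map a dirty rectangle onto the tile grid and count how many tiles
 * would need a partial GPU upload. During rendering the dirty region
 * is usually small, so only a few tiles are touched per update. */
static int tiles_to_update(int dirty_x, int dirty_y, int dirty_w, int dirty_h)
{
  int tile_x_min = dirty_x / TILE_SIZE;
  int tile_y_min = dirty_y / TILE_SIZE;
  int tile_x_max = (dirty_x + dirty_w - 1) / TILE_SIZE;
  int tile_y_max = (dirty_y + dirty_h - 1) / TILE_SIZE;
  return (tile_x_max - tile_x_min + 1) * (tile_y_max - tile_y_min + 1);
}
```

For example, a 300x300 dirty region starting at (100, 100) crosses a tile boundary on both axes and touches a 2x2 block of tiles, while the rest of a 16K texture stays untouched.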

Jeroen Bakker (jbakker) edited the summary of this revision. (Show Details)Nov 16 2020, 1:10 PM
This revision is now accepted and ready to land.Nov 17 2020, 2:37 AM

Just a suggestion: the automatic scaling solution certainly seems like friendly behaviour, but from a UX perspective the user isn't informed when it happens. Would there be some way to notify the user when an image is too large? Perhaps one of those on-screen notification popups?
Something like:

"Warning: texture_64k.png has been downscaled to the maximum size supported by your GPU: 8,192px x 8,192px."

I'm just thinking about the possibility of a situation where a user is expecting a very high resolution image. Say, for example, a user loading a monster-sized texture for a photogrammetry scan of an area produced by other software. They load the texture they need, and might be left scratching their head when it appears at a much lower resolution in the render result. You wouldn't want the user to think the issue is a bug in Blender (and file a bug report here), so it might be helpful to notify them that Blender is just working around a limitation of their GPU.