
Baking system overhaul: Move baking settings from render settings into BakePasses
Needs Review · Public

Authored by Lukas Stockner (lukasstockner97) on May 4 2018, 11:23 PM.

Details

Summary

The current baking workflow has some issues:

  • The way to specify which image should be baked to is cumbersome and unintuitive.
  • Baking multiple channels requires baking, changing the settings and image, baking again, and so on.
  • When baking two objects that share a material, the image has to be changed before each baking pass.
  • Bake types are hardcoded, so other render engines have no way of specifying their own.

Therefore, this patch introduces a system that persistently stores the combinations of Object, Image, Material and Settings that should be used for baking.

This is handled by the BakePass, a new type that replaces the Bake settings that used to be part of the rendering settings.
Each object holds a list of bake passes, and each bake pass holds an image and the settings that should be used to bake that image.
The Bake operator iterates over all bake passes of the currently selected object and executes them.
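The proposed data model and operator loop could be sketched in plain Python as follows. All names here are illustrative; the actual RNA names in the patch may differ:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class BakePass:
    """One persistent bake job: a target image plus the settings for it."""
    name: str
    image: str                      # target image datablock name
    material: Optional[str] = None  # optional: limit the pass to one material
    settings: dict = field(default_factory=dict)

@dataclass
class BakeObject:
    name: str
    bake_passes: list = field(default_factory=list)

def bake_object(obj: BakeObject) -> list:
    """Mimics the Bake operator: run every pass stored on the object."""
    results = []
    for bp in obj.bake_passes:
        # A real implementation would ask the render engine for pixels here.
        suffix = f" (material {bp.material})" if bp.material else ""
        results.append(f"baked {bp.image} for {obj.name}{suffix}")
    return results

cube = BakeObject("Cube", bake_passes=[
    BakePass("Diffuse", image="cube_diffuse"),
    BakePass("Normal", image="cube_normal", settings={"is_normal": True}),
])
print(bake_object(cube))
# → ['baked cube_diffuse for Cube', 'baked cube_normal for Cube']
```

The optional `material` field corresponds to the image/material-pair workflow described below: one pass per pair reproduces the old per-material image assignment.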

This way, the user can configure which images should be generated from each object and rebake quickly. Especially with the trend of baking PBR maps for e.g. game engines, this workflow is a major use case of baking.

Previously, it was possible to specify a different image for each material. To keep that functionality, the user can optionally select a material, in which case the baking pass will be limited to that material. Then, by adding a pass per image/material-pair, you get the same functionality.

In order to get rid of the hardcoded types, the Blender-side baking code no longer cares about the type - the engine just provides the pixels and some additional info (for example, setting is_normal enables the normal-space conversion).
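As an aside on what the normal-space handling involves: the standard way of writing a normal map is to remap each unit-normal component from [-1, 1] into the [0, 1] RGB range. A minimal sketch (the patch's engine-side conversion would also handle the change of basis into tangent space, which is omitted here):

```python
def encode_normal(n):
    """Map a unit normal's XYZ components from [-1, 1] into [0, 1] RGB.

    This is the standard remap used when writing normal maps; the
    tangent-space basis transform that a real baker also performs is
    omitted for brevity.
    """
    return tuple(c * 0.5 + 0.5 for c in n)

# A normal pointing straight along +Z becomes the familiar
# "normal map blue" (0.5, 0.5, 1.0).
print(encode_normal((0.0, 0.0, 1.0)))  # → (0.5, 0.5, 1.0)
```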

This patch is not really finished yet, here's a list of ToDos:

  • Polish the UI
  • Versioning code - do we want to create a BakePass from the old settings? If so, some versioning has to be done in Cycles, which means that we need to keep the old data in RNA :/
  • Share the render session (currently, as far as I can see, the scene is synced and a Cycles session is started for each pass; it would be better to keep one session for all passes)
  • Setting verification (used to be part of the Blender side, has to be moved into Cycles)
  • Also support the new way of specifying images in the case of multires baking
  • More testing

Diff Detail

Repository
rB Blender
Branch
arcpatch-D3203 (branched from master)
Build Status
Buildable 7350
Build 7350: arc lint + arc unit

Event Timeline

There are a very large number of changes, so older changes are hidden.

I believe that this system, even with all the improvements over the previous one (it has undoubtedly cleaned up the code on the programming side), still has some very basic usability problems.

The first and fundamental problem is that it is designed to bake one particular object. I can't find a way to bake several objects that share the same UVs, and if one exists, it is even more hidden than in the previous system.

It does not make sense for a baking system to be necessarily linked to a single object. BakePasses should contain the objects to be baked, both high and low poly. A bake pass should be a separate entity, an object within the outliner.

I highly disagree. Nearly every other piece of functionality in Blender is on a per-object basis. If you want an operator, material, texture, physics, particle or any other kind of system to work with multiple objects, you either group the objects or script it to run repeatedly over multiple objects. So many people are waiting for this feature, and it keeps getting postponed every release. You should put it in on a per-object basis, which is the most common use case, and extend the functionality to a per-scene basis later. It would be one of the first features on a scene basis, but if you really need that, it can be done later.

I highly disagree. Nearly every other piece of functionality in Blender is on a per-object basis. If you want an operator, material, texture, physics, particle or any other kind of system to work with multiple objects, you either group the objects or script it to run repeatedly over multiple objects. So many people are waiting for this feature, and it keeps getting postponed every release. You should put it in on a per-object basis, which is the most common use case, and extend the functionality to a per-scene basis later. It would be one of the first features on a scene basis, but if you really need that, it can be done later.

  • Particles are an independent system that can be linked to multiple objects.
  • Particle nodes will be an independent node system.
  • Everything Nodes follows the same concept.

Baking only one object is not the normal use case; in practice it's the rare case.

  • Characters have a lot of parts or objects and different materials.
  • Props have a lot of parts.
  • If you want to bake a lightmap, it's normal to work with multiple objects.
  • If you want to use an atlas, you use multiple low-poly objects.

An independent bake task (in a custom editor, object or node system) is the only correct way this bake overhaul should be implemented. The difference between the two implementations is that this one doesn't solve all the problems of the current bake system and remains a strange implementation, while a task-oriented bake system would solve all baking problems for the rest of Blender's history.

The new system:

  • Doesn't allow working with multiple objects, breaking any non-destructive pipeline.
  • Hides the UI inside the object properties (baking cannot be a property of a single object).
  • Doesn't allow saving all images to disk automatically.
  • Doesn't allow custom passes (selecting a material that overrides the materials of the objects).
  • Doesn't allow using multiple collections as low and high poly.

As I commented, a few changes to the system would leave it complete for the rest of Blender's history.

Comparing it to other aspects of Blender isn't fair. Compare it to every other piece of software that does baking, and you'll see that they don't structure baking to fit the software; they bake multiple objects and treat them as jobs, because that's what's practical. It's what works. Look at any video game and the level of detail in its texture work. These kinds of workflows demand it. It's best to get it right from the start, or the baking system will have to be replaced a third time.

There's another aspect to be considered here I think, which is how you can easily preview or use baked textures. This may be to check for quality before exporting to a game engine, or because you're using baking as a way to speed up rendering with Cycles or Eevee. I'd like to avoid having too much of a disconnect between the settings for baking and using textures.

This is especially important for exporters. When exporting to glTF, USD or another interchange format, you not only need to know what the baked textures for an object are, but also how to use them in the exported shader.

There are many places we could store baking settings:

  • On every object, but make editing many at once possible with presets and multi-object properties editing
  • In a new type of datablock with baking settings
  • In material datablocks
  • In image datablocks
  • In collections for all objects contained within

I'm thinking that maybe material datablocks and shader nodes are the right place?

Materials Datablocks

Materials would store a few settings that are common to all bake passes that you would bake together. Particularly cage objects and related settings.

There would be a new Bake Texture shader node. This would be like an Image Texture node, with an image datablock. Renderers would simply render it as an Image Texture. For baking it would have the bake pass and associated settings. There would be some restrictions, maybe directly specifying a UV map rather than a Vector input for textures, only Flat mapping, etc, whatever is needed to make it efficiently bakeable.

For the simple use case when you just use a single material as input for baking, you could put all baking related settings in the same material.

This design would support baking multiple objects with multiple materials into a single texture too. You would link the "baking material" to all objects that you want to bake together, and all other materials would be baked into that. Making a clear UI for this might be a bit tricky but I think something can be figured out.

Some ideas to improve on this beyond an initial implementation:

  • Besides a bake pass name, Bake Texture nodes could also support an input socket. If unbaked it would pass through, if baked it would read from the baked image texture. This would avoid having to create multiple BSDF nodes, you could just insert nodes in an existing shader node setup where needed.
  • Bake Texture nodes could have an optimization where if it detects the input is constant across all materials used for baking, it outputs a fixed parameter rather than an image. Or there may be a better solution to this, but it would be great if users wouldn't have to think about this aspect somehow.
  • Ideally it would be possible to create a shader node group matching a material for the target game engine, which specifies both how to bake it and how to preview it in Blender. With additional Python scripting it could automatically create image datablocks with particular naming conventions.
  • Materials or individual Bake Texture nodes could have a state indicating if they are "Baked" or not. This could be used to determine what to display in the viewport, the baked or unbaked results. As you bake it would automatically display the results, or revert back to the unbaked state after clearing.
  • This "Baked" state could also make it easier to ensure you have baked everything before exporting. You could select all objects, press bake, and it would only bake what is not ready yet. Exporters could check this state and show warnings or offer to bake before exporting.

For the implementation, Bake Texture nodes could perhaps be handled fully on the Blender side with the renderers unaware. Blender already performs some shader graph modification, and turning Bake Texture nodes into Image Texture nodes or fixed parameter nodes should work.
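The pass-through behavior proposed for the Bake Texture node can be sketched in plain Python (all class and method names here are hypothetical; nothing like this exists in Blender's API yet):

```python
class FakeImage:
    """Stand-in for an image datablock (illustrative only)."""
    def __init__(self, pixels):
        self.pixels = pixels  # dict mapping a uv tuple -> stored value

    def sample(self, uv):
        return self.pixels[uv]

class BakeTextureNode:
    """Sketch of the proposed Bake Texture node.

    Unbaked, it passes its input socket through unchanged; once baked,
    it reads from the stored image instead, so it can be dropped into
    an existing shader node setup without duplicating BSDF nodes.
    """
    def __init__(self, pass_name):
        self.pass_name = pass_name
        self.baked = False
        self.image = None

    def evaluate(self, input_value, uv):
        if self.baked and self.image is not None:
            return self.image.sample(uv)
        return input_value  # pass-through while unbaked

node = BakeTextureNode("AO")
print(node.evaluate(0.8, (0.5, 0.5)))  # unbaked: passes 0.8 through

node.image = FakeImage({(0.5, 0.5): 0.25})
node.baked = True
print(node.evaluate(0.8, (0.5, 0.5)))  # baked: reads 0.25 from the image
```

The `baked` flag is also the state the comment above describes for determining what to display in the viewport and for letting exporters detect unbaked materials.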

I don't see materials as a good place to have bake settings. A bake is independent of a material. I can bake an object with a lot of materials or with none, or do a bake that has nothing to do with my actual materials. A bake is like a function in Everything Nodes, the Sorcar addon, or a script: an independent element that has low-poly meshes, high-poly meshes, some parameters, and outputs. And all of those inputs, parameters, and outputs can change at any moment.

This allows easily changing the objects inside the bake pass or adding passes.


Then what would happen if you had the same material applied to multiple objects, with parts of these multiple objects overlapping the same UV space? This doesn't make much sense to me. In a scene, I want to be able to use one material I've created on multiple different objects without worrying about baking them all at once. This would be devastating, as one would have to create unlinked copies of the same material just for the sake of baking. Such an ugly solution.

Baking should be per object, because baking is based on UVs, and UVs are stored within the object, not the material.

All of the most successful texture authoring/baking tools out there such as Substance Painter have object based baking.

Then what would happen if you had the same material applied to multiple objects, with parts of these multiple objects overlapping the same UV space?

You'd link a different baking material to each, with different images.

All of the most successful texture authoring/baking tools out there such as Substance Painter have object based baking.

Are you sure about that?

Doesn't Substance painter create one Texture Set per imported material, and then have baking settings stored in that Texture Set?

https://docs.substance3d.com/spdoc/texture-set-154140919.html
https://docs.substance3d.com/spdoc/baking-109608997.html

I don't see materials as a good place to have bake settings. A bake is independent of a material. I can bake an object with a lot of materials or with none, or do a bake that has nothing to do with my actual materials. A bake is like a function in Everything Nodes, the Sorcar addon, or a script: an independent element that has low-poly meshes, high-poly meshes, some parameters, and outputs. And all of those inputs, parameters, and outputs can change at any moment.

You can say it's independent of a material, but that still leaves the question about how to preview the results and more importantly, how to export the baked materials for your model.

Would you set up both these baking nodes and a material that uses the images resulting from these baking nodes for that? Isn't that redundant?

how to preview the results

If you use collection as input for baking and collection has option to override materials, there could be bunch of shaders (overlays) that show baked maps.

how to export the baked materials for your model.

You just export set of maps, not materials.

If you use collection as input for baking and collection has option to override materials, there could be bunch of shaders (overlays) that show baked maps.

So you would only see individual baked maps, but not the result of all them together in an actual material? Or would it auto construct that material, and if so how would it guess the material parameters that were not baked to textures?

You just export set of maps, not materials.

If you're using Substance Painter for creating texture maps, and then another software for using those texture maps in a material and exporting, then just exporting texture maps makes sense. But the big advantage of baking in Blender is exactly that we can integrate things better than that.

If you want to export to glTF or USD, you need to export a material one way or the other. Those file formats do not have "set of maps" as a data structure as far as I know, nor would it be sufficient to actually render the model in the target game engine or other software.

I don't see materials as a good place to have bake settings. A bake is independent of a material. I can bake an object with a lot of materials or with none, or do a bake that has nothing to do with my actual materials. A bake is like a function in Everything Nodes, the Sorcar addon, or a script: an independent element that has low-poly meshes, high-poly meshes, some parameters, and outputs. And all of those inputs, parameters, and outputs can change at any moment.

You can say it's independent of a material, but that still leaves the question about how to preview the results and more importantly, how to export the baked materials for your model.

Would you set up both these baking nodes and a material that uses the images resulting from these baking nodes for that? Isn't that redundant?

  • I can preview the results in the Image Editor, as usual. I don't see a reason to have another way to see the result. Maybe a button could be added that gives direct access to the baked texture (like we show the render in a floating window). But in a normal workflow you bake 5-6 images at the same time, so it's not really useful. Anyway, it's not an important feature, imho.
  • How to export the baked maps inside a material is a secondary task. Rarely, though not impossibly, you export everything (models and materials with maps) from Blender; you normally use those maps in other software like Substance, Unreal or similar. So that situation must not be the standard workflow reference.
  • Anyway, the moment you select an output image, if you have configured your materials correctly, the change will be visible instantly in the viewport or Image Editor and you can export everything without problems.
  • Yes, I would prefer this system. I don't see a problem or a redundant situation.

Another thing that must be considered is that when you bake a map, you need the ability to MIX CHANNELS, because in video games it's really normal not to use raw texture channels. For example, in my current project:

Texture 1
RGB - Base color
Alpha - Roughness

Texture 2
RG - Normal RG Color
B - Metallic

With a node system we can do this kind of thing.
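The channel layout above can be sketched with a small packing helper. This is purely illustrative (a real implementation would operate on image buffers, not per-pixel lists):

```python
def pack_channels(base_color, roughness, normal, metallic):
    """Pack per-pixel bake results into the two textures described above:
    Texture 1: RGB = base color, Alpha = roughness
    Texture 2: RG = normal X/Y, B = metallic
    Inputs are parallel per-pixel lists; this only shows the channel mixing.
    """
    tex1 = [(r, g, b, rough)
            for (r, g, b), rough in zip(base_color, roughness)]
    tex2 = [(nx, ny, metal)
            for (nx, ny, _nz), metal in zip(normal, metallic)]
    return tex1, tex2

tex1, tex2 = pack_channels(
    base_color=[(0.8, 0.2, 0.1)],
    roughness=[0.5],
    normal=[(0.5, 0.5, 1.0)],
    metallic=[1.0],
)
print(tex1)  # [(0.8, 0.2, 0.1, 0.5)]
print(tex2)  # [(0.5, 0.5, 1.0)]
```

Whether this is best expressed as a node graph (as Bake Wrangler does) or as per-pass channel options is exactly the design question discussed below.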

Then what would happen if you had the same material applied to multiple objects, with parts of these multiple objects overlapping the same UV space?

You'd link a different baking material to each, with different images.

All of the most successful texture authoring/baking tools out there such as Substance Painter have object based baking.

Are you sure about that?

Doesn't Substance painter create one Texture Set per imported material, and then have baking settings stored in that Texture Set?

https://docs.substance3d.com/spdoc/texture-set-154140919.html
https://docs.substance3d.com/spdoc/baking-109608997.html

Yes, it does, and many people are not happy about that, because you then end up with multiple overlapping UV channels on one object, and you then have to do ugly workarounds, combining separate textures afterwards into one image using some clumsy image-combiner tool they've built in to mitigate that. That's why most of the people I know always export the object with only one material ID and material selection masks.

It's just an all-around inferior workflow. In game engines, you want to minimize the number of material IDs on a single object, as each material ID adds another mesh draw call. With 6 material IDs you suddenly end up with 6 draw calls instead of one. So the common workflow is one texture per object, not one texture per material.

Separating multiple materials into multiple texture sets using multiple UV channels on a single object just doesn't make sense, from both complexity and performance standpoints.

Just to add to the discussion: on our last project we needed to bake the buildings of an archviz masterplan to take them to Unity for an app we were developing. What saved us from the current clunky workflow (which is still better than the default 3ds Max baking system we used to use) was Bake Wrangler (https://blenderartists.org/t/bake-wrangler-node-based-baking-tool-set-ver-b0-9-4-curvycavity/). The node-based approach and the ability to mix the results from different passes to create a single texture for specular/roughness/AO, for example, was great. But I do agree with @Brecht Van Lommel (brecht): the ability to see the results and apply them in Blender would be great, since we always check the results in Blender and export the objects with a semi-working material.

So you would only see individual baked maps, but not the result of all them together in an actual material? Or would it auto construct that material, and if so how would it guess the material parameters that were not baked to textures?

In most cases people bake things like AO, normals, world position... being able to inspect those individually is more convenient. For more exotic cases (metalness, roughness, transparency), a display material output could be added to the node setup presented above, where you can wire a specific image channel to a specific parameter.

But if you want to export to glTF or USD, you need to export a material one way or the other. Those file formats do not have "set of maps" as a data structure as far as I know, nor would it be sufficient to actually render the model in the target game engine or other software.

I'm not sure how these formats work if you use RGBA as a container for roughness, metalness and all that. I think these kinds of exports should be set up by the user separately.

  • I can preview the results in the Image Editor, as usual. I don't see a reason to have another way to see the result. Maybe a button could be added that gives direct access to the baked texture (like we show the render in a floating window). But in a normal workflow you bake 5-6 images at the same time, so it's not really useful. Anyway, it's not an important feature, imho.

To me this reads similar to "realtime viewport rendering is not important, just press F12".

You can go through exporting to another application before you can see that you didn't have quite enough texture resolution in some part of the model or that the UVs were too stretched, but surely that's inefficient? Wouldn't it be much better to see the model with baked textures applied in the Blender viewport immediately after you've pressed Bake? Isn't that a standard feature in applications like Substance Painter?

  • How to export the baked maps inside a material is a secondary task. Rarely, though not impossibly, you export everything (models and materials with maps) from Blender; you normally use those maps in other software like Substance, Unreal or similar. So that situation must not be the standard workflow reference.

To me it's important that the design takes into account the use case where users want to see baked materials in Blender, either for preview, for actual use in some external game engine, or for use in Cycles or Eevee.

That may not be your current workflow, but I doubt this is rare for Blender users in general, especially if we provide the tools to do it easily.

Another thing that must be considered is that when you bake a map, you need the ability to MIX CHANNELS, because in video games it's really normal not to use raw texture channels. For example, my current project.

There could be an option to say "bake this pass to only these channels of the image". A node setup seems overkill for that.

Yes, it does, and many people are not happy about that, because you then end up with multiple overlapping UV channels on one object, and you then have to do ugly workarounds, combining separate textures afterwards into one image using some clumsy image-combiner tool they've built in to mitigate that. That's why most of the people I know always export the object with only one material ID and material selection masks.

Ok, so it's not actually object based in Substance Painter, but in your workflow you try to make it behave as such.

It's just an all-around inferior workflow. In game engines, you want to minimize the number of material IDs on a single object, as each material ID adds another mesh draw call. With 6 material IDs you suddenly end up with 6 draw calls instead of one. So the common workflow is one texture per object, not one texture per material.

Separating multiple materials into multiple texture sets using multiple UV channels on a single object just doesn't make sense, from both complexity and performance standpoints.

Right, but this is possible in my design. You would be able to have 6 materials with their own shader node setups, and then 1 baking material that bakes the result of all those materials into one set of textures.

And then the baking material could be used for previewing the bake in the viewport and for exporting.

In most cases people bake things like AO, normals, world position... being able to inspect those individually is more convenient. For more exotic cases (metalness, roughness, transparency), a display material output could be added to the node setup presented above, where you can wire a specific image channel to a specific parameter.

What is an exotic use case for one user is the main use case for another.

I'm not sure how these formats work if you use RGBA as a container for roughness, metalness and all that. I think these kinds of exports should be set up by the user separately.

Standardization of material exchange is very much a thing that's being worked on in the industry now (USD, MaterialX, MDL, Standard Surface, ...). It's not there yet, but we should assume this situation will improve and not get locked into a design that assumes users will set up their materials two or three times.

To me this reads similar to "realtime viewport rendering is not important, just press F12".

You can go through exporting to another application before you can see that you didn't have quite enough texture resolution in some part of the model or that the UVs were too stretched, but surely that's inefficient? Wouldn't it be much better to see the model with baked textures applied in the Blender viewport immediately after you've pressed Bake? Isn't that a standard feature in applications like Substance Painter?

Showing the baked maps in the viewport, which I think is what you mean, is a good idea. But I think it's outside the scope of a texture baking system. It would be part of the passes shown in the Blender viewport.

And unlike Blender, Substance is closed software; it lets you see some specific maps that it decides you need to see, and not others. Blender is a suite with many more possibilities and particularities. In Substance you can't bake lightmaps, AOVs, ... How do we show the ambient occlusion or the curvature? They are not input textures in any shader.

There are solutions to this, like the Workbench engine allowing you to see the bake output. But since your scene contains more things than just the model you are working on, how do you decide what you will be able to see or not? Depending on the selected object?

To me it's important that the design takes into account the use case where users want to see baked materials in Blender, either for preview, for actual use in some external game engine, or for use in Cycles or Eevee.

That may not be your current workflow, but I doubt this is rare for Blender users in general, especially if we provide the tools to do it easily.

The user will be able to see what they want in real time. If, for example, you have the curvature map connected in the node system, once the bake is done the change will be seen instantly in the viewport, Eevee or Cycles.

There could be an option to say "bake this pass to only these channels of the image". A node setup seems overkill for that.

Bake Wrangler does this with nodes, and its UI is perfect for that.

And unlike Blender, Substance is closed software; it lets you see some specific maps that it decides you need to see, and not others. Blender is a suite with many more possibilities and particularities. In Substance you can't bake lightmaps, AOVs, ... How do we show the ambient occlusion or the curvature? They are not input textures in any shader.

Right now if you use Solid draw mode with colors set to Texture, what you see is the active Image Texture node. In my design there would be a Bake Textures node for AO or Curvature in the material (not necessarily connected to anything), and activating it would work exactly like an Image Texture node.

The problem with tying the baking system to a material is that it limits the baking system very much. Everything depends on a material, when that workflow is only sometimes used. We must differentiate between the programs and their functions:

Substance: its objective is to export only final texture maps for assets.
Blender: used to bake any kind of map, which may not be final and need not be a material output.

One option could be this

Or maybe pick the output images of the bake system and automatically convert them into visible channels in the Workbench.

Yes, it does, and many people are not happy about that, because you then end up with multiple overlapping UV channels on one object and have to do ugly workarounds, combining separate textures afterwards into one image using a clumsy image combiner tool they've built in to mitigate that. That's why most of the people I know always export the object with only one material ID and material selection masks.

Ok, so it's not actually object based in Substance Painter, but in your workflow you try to make it behave as such.

It's just an all-around inferior workflow. In game engines, you want to minimize the number of material IDs on a single object, as each material ID adds another mesh draw call: 6 material IDs and suddenly you end up with 6 draw calls instead of one. So the common workflow is having one texture per object, not one texture per material.

Separating multiple materials into multiple texture sets using multiple UV channels on a single object just doesn't make sense, from both complexity and performance standpoints.

Right, but this is possible in my design. You would be able to have 6 materials with their own shader node setups, and then 1 baking material that bakes the result of all those materials into one set of textures.

And then baking material could be used for previewing the bake in the viewport and exporting.

Alright, I guess I will just wait to see how it turns out and go from there. But to me, the most straightforward baking I could imagine is simply selecting an object, selecting one of the UV channels as the destination UV to bake to, and specifying several baking channels (albedo, roughness, normal, etc.) with their respective output file paths; that would be it. I would not have to care about interfering with the materials of the object in any way.

Practical example:

  1. I have a car mesh object. It has multiple materials (paint, plastic, metal, etc.). Those materials use several different UV channels to map their textures.
  2. I select this object and create a new, unwrapped, non-overlapping UV channel for use as the baking target.
  3. I specify this UV channel in the baking UI.
  4. I create the required baking channels in the baking UI: BaseColor, Metallic, Roughness, Normal, Emission.
  5. The baking UI automatically fills out the channel output paths with auto-generated file names in the format MESHNAME_CHANNEL.extension, for example Car_Roughness.png. The names can of course be modified.
  6. I press a single Bake button, and the baked textures for the car object appear on my hard drive (by default at some relative path).
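The auto-naming convention in step 5 could be as simple as string formatting. A minimal sketch, assuming a hypothetical helper named `default_bake_paths` and the `//`-prefixed relative paths Blender uses; none of this is part of the actual patch:

```python
# Illustrative sketch of auto-generated bake file names in the format
# MESHNAME_CHANNEL.extension (e.g. Car_Roughness.png). The helper name,
# channel list and default directory are assumptions, not from the patch.

def default_bake_paths(mesh_name, channels, extension="png", directory="//textures"):
    """Return a {channel: relative path} mapping for the given mesh."""
    return {
        channel: f"{directory}/{mesh_name}_{channel}.{extension}"
        for channel in channels
    }

paths = default_bake_paths(
    "Car", ["BaseColor", "Metallic", "Roughness", "Normal", "Emission"]
)
print(paths["Roughness"])  # //textures/Car_Roughness.png
```

The user would then only override individual names where the defaults don't fit.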

Then, as an advanced option, I am able to specify certain "material groups" where I can bake multiple different texture sets for a single object. For example if I wanted all metallic materials in one set, and all dielectric in the other.

By default, there would be only one material group, containing all materials present on the given object. Each material group would have just a list of materials to bake out and a UV channel selector to define which UV layout they use for baking.

But if you look at the average baking workflow from the high-level user standpoint, then what people are mostly interested in is just "I have this thing, and I want to bake these specific material channels into this particular UV channel and save them as textures in this format at this file path." That's pretty much all, in 99% of cases.

JPG (joules) added a comment. Edited Mar 30 2020, 6:12 PM

Hi,
Just going to throw in a curve ball here.
You have to consider multiple textures for a single mesh object. The way I approached it with my add-on (I might show screenshots tomorrow) follows how I approached materials on objects, and that is with "Material Groups". The add-on cycles through the groups and bakes out "skin" textures.
To preview the baked results, there is a completely separate shader within the Material Group (e.g. "RedHelmet_BAKE") that is omitted from the bake. To view the result, you make that Material Group active.
It works simply and manually, which is fine because there may well be a "BlueHelmet_BAKE" etc.
I've always thought of a one-to-many option with regard to materials for objects: isolating and experimenting with shaders on mesh objects.
This involves a level below the Material Slots.
It manages and automatically fills the slots. It's rather convoluted as an add-on, but perhaps worth considering as an alternative, rather than messing about too much with simple bake procedures?

JPG (joules) added a comment. Edited Mar 30 2020, 6:25 PM

Sorry for the double post. I'll try to be a bit clearer about my perspective.
At this point I think bake handlers of some sort would suffice (I'm currently using a hack to figure out when the bake is finished).
However, looking at it from the perspective of how materials can be managed on objects, that is something I'm keen to see some extra features for (a layer below the material slots to manage multiple material slots per object).
This currently works well for me (however, it needs fixing for UDIM).
Material Group management.

Bake management.

Edit: an older video, but it gives a good insight into the workflow.

Please let me know if you need some clarification on what I'm suggesting (Material Slot Management)
Joules.

But if you look at the average baking workflow from the high-level user standpoint, then what people are mostly interested in is just "I have this thing, and I want to bake these specific material channels into this particular UV channel and save them as textures in this format at this file path." That's pretty much all, in 99% of cases.

In your workflow you join everything you want to bake together into one object. But from what I understand, there are two users here arguing it needs to work for multiple objects too: not multiple texture sets on one object, but the same texture set applied to multiple objects. One of the main reasons people asked for multi-object editing in Blender was to be able to pack UVs for multiple objects together; this is not uncommon.

The other things you're not addressing are how to preview baked materials in Blender, how to export materials to file formats like glTF or USD, and how to handle the case where you want to bake materials for use in Blender itself.

So I just don't see how this covers 99% of use cases; I wouldn't generalize so easily.

You have to consider multiple textures for a single mesh object. The way I approached it with my add-on (I might show screenshots tomorrow) follows how I approached materials on objects, and that is with "Material Groups". The add-on cycles through the groups and bakes out "skin" textures.

I think a general concept of material groups is too much. There's already much user confusion around the difference between materials and material slots, adding another layer to that would make this worse.

To me it seems better to do something specifically for baking. Let me expand a bit on what the UI for a "baking material" could look like. In the material properties or shader editor, you'd have an "Add Baked Material" button. This would create a new material datablock, containing a Principled BSDF node and one or more Bake Texture nodes connected to it (or not connected, if that doesn't make sense, e.g. for Curvature).

Maybe this would just be a fixed default set, maybe it would look at the existing materials to see which parts need textures and which parts have a fixed value, maybe there's a popup that lets you easily choose which channels to bake. Once this is added you can go and edit parameters in the various nodes. Due to the presence of Bake Texture nodes, there would be a Baking panel visible with material-wide baking settings and a Bake button.

If you want to change the channels to bake afterwards, you can remove or add Bake Texture nodes. The Add node menu could have a category for that, for common channels like Base Color, Metallic, Roughness, Normal Map, Curvature, etc. Adding a bake texture to an input of the Principled BSDF could also auto detect the right type of channel to bake. Image file names could also be set up automatically based on object/material name and channel.

The baked material would be added outside of the regular list of material slots. The exact UI for that I'm not sure about. Maybe it's in the material slot list separate from the rest, with a specific icon, or in a completely separate material slot list where you can toggle between input and baked material slots. But regardless, this would be an actual material that can be used by the viewport, renderers and exporters.
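The auto-detection mentioned in this proposal (inferring the bake channel from which Principled BSDF input a Bake Texture node is plugged into) could start as a simple lookup table. A hypothetical sketch; the mapping, function name and channel names are assumptions for illustration, not part of the proposal:

```python
# Hypothetical sketch: infer which channel a Bake Texture node should bake
# from the name of the Principled BSDF input socket it is connected to.
# The socket names follow Cycles' Principled BSDF; the channel names are
# illustrative only.

SOCKET_TO_CHANNEL = {
    "Base Color": "BaseColor",
    "Metallic": "Metallic",
    "Roughness": "Roughness",
    "Normal": "Normal",
    "Emission": "Emission",
}

def detect_bake_channel(socket_name):
    """Return the bake channel for a Principled input, or None if unknown."""
    return SOCKET_TO_CHANNEL.get(socket_name)

print(detect_bake_channel("Roughness"))  # Roughness
print(detect_bake_channel("Subsurface"))  # None (no obvious bake channel)
```

Image file names could then be derived from the detected channel together with the object/material name, as described above.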

But if you look at the average baking workflow from the high-level user standpoint, then what people are mostly interested in is just "I have this thing, and I want to bake these specific material channels into this particular UV channel and save them as textures in this format at this file path." That's pretty much all, in 99% of cases.

In your workflow you join everything you want to bake together into one object. But from what I understand, there are two users here arguing it needs to work for multiple objects too: not multiple texture sets on one object, but the same texture set applied to multiple objects. One of the main reasons people asked for multi-object editing in Blender was to be able to pack UVs for multiple objects together; this is not uncommon.

The other things you're not addressing are how to preview baked materials in Blender, how to export materials to file formats like glTF or USD, and how to handle the case where you want to bake materials for use in Blender itself.

So I just don't see how this covers 99% of use cases; I wouldn't generalize so easily.

Well, the issue here is that if you want to actually bake the same texture set on multiple objects, you first need to ensure that the UVs do not overlap between the objects. For example, if there is object A and object B, and the destination is the same texture set, then no UV islands of object A can overlap with UV islands of object B. This is tricky, as the only way to get it right is if both originated from an originally combined object. So this would come down just to the order of operations: either first detach, then bake, or first bake, then detach. Baking multiple objects alone should not be an issue with proper multi-object editing: clicking the Bake button with multiple objects selected would initiate the bake on all of them, using their individual bake setups. Baking multiple objects to the same texture set, though, just seems to me like a messier, more chaotic way of doing the same thing as baking one object and then detaching pieces of it.

But as I said, I will just wait to see how your idea turns out, and complain only if things are too cumbersome to use that way.

If I take a step back from the specifics, my general concern is just having to perform way too many manual steps to get baking working, which is especially the case with the current baking system in the official Blender release.
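The cross-object overlap requirement described above could be approximated conservatively by comparing bounding boxes of UV islands between the two objects. A rough sketch, assuming islands are already given as lists of UV coordinates; real island detection would need mesh connectivity data, and real islands can overlap more tightly than their boxes suggest:

```python
# Rough sketch: treat each object's UV islands as axis-aligned bounding
# boxes in UV space and report whether any island of object A collides
# with any island of object B. The data layout is an assumption for
# illustration only.

def bbox(uvs):
    """Axis-aligned bounding box (min_u, min_v, max_u, max_v) of a UV island."""
    us, vs = zip(*uvs)
    return (min(us), min(vs), max(us), max(vs))

def boxes_overlap(a, b):
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def objects_conflict(islands_a, islands_b):
    """True if any UV island of object A overlaps any island of object B."""
    boxes_a = [bbox(i) for i in islands_a]
    boxes_b = [bbox(i) for i in islands_b]
    return any(boxes_overlap(a, b) for a in boxes_a for b in boxes_b)

# Object A uses the left half of UV space, object B the right half: no conflict.
a = [[(0.0, 0.0), (0.4, 0.4)]]
b = [[(0.6, 0.6), (1.0, 1.0)]]
print(objects_conflict(a, b))  # False
```

A check like this could warn the user before a shared-texture-set bake, rather than silently producing overwritten texels.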

Baking multiple objects is not hard or complex. It is also easy because you don't need to detach and join meshes or use a placeholder mesh.

An example of a node system is Bake Wrangler, but it has problems working with multiple objects (though that is due to the way Blender works).

To me it seems better to do something specifically for baking. Let me expand a bit on what the UI for a "baking material" could look like. In the material properties or shader editor, you'd have an "Add Baked Material" button. This would create a new material datablock, containing a Principled BSDF node and one or more Bake Texture nodes connected to it (or not connected, if that doesn't make sense, e.g. for Curvature).

Maybe this would just be a fixed default set, maybe it would look at the existing materials to see which parts need textures and which parts have a fixed value, maybe there's a popup that lets you easily choose which channels to bake. Once this is added you can go and edit parameters in the various nodes. Due to the presence of Bake Texture nodes, there would be a Baking panel visible with material-wide baking settings and a Bake button.

If you want to change the channels to bake afterwards, you can remove or add Bake Texture nodes. The Add node menu could have a category for that, for common channels like Base Color, Metallic, Roughness, Normal Map, Curvature, etc. Adding a bake texture to an input of the Principled BSDF could also auto detect the right type of channel to bake. Image file names could also be set up automatically based on object/material name and channel.

The baked material would be added outside of the regular list of material slots. The exact UI for that I'm not sure about. Maybe it's in the material slot list separate from the rest, with a specific icon, or in a completely separate material slot list where you can toggle between input and baked material slots. But regardless, this would be an actual material that can be used by the viewport, renderers and exporters.

I don't understand this concept, the extra steps it adds to the workflow, or what the improvement is. It's simpler to bake into images as we do now, with those images connected to the Principled BSDF as always. The moment you finish a bake, you see the output in the viewport.

I don't see the problem with a separate node system for baking images, connecting its outputs to the material that you need. For example this.

  • It allows having a specific editor for the bake system, cleaning up the UI.
  • It allows more possibilities for the whole bake system, especially in complex situations.
  • It is compatible with Blender's current UX.
  • It removes the limitation of a bake material.

@Lukas Stockner (lukasstockner97), apologies for letting this code review turn into a long discussion.

But I would love to hear your opinion on this, if you're interested in making design changes or have other opinions of how this would work best.

Some of the changes in this patch could be committed already: internal AOV baking support for Cycles, IMB_filter_extend improvements, ImageFormatData moving to the DNA_image_types header. Most of it depends on design decisions though.

@Alberto Velázquez (dcvertice), setting up baking nodes like that would definitely be more work than my proposal. It can be argued that it's more powerful (it's not clear to me that it is in ways that really matter), but both for users and for the implementation an additional node system is going to be more complex. To be honest baking nodes is not an option I would seriously consider. If you want that kind of more complex but specialized/powerful workflow I think an add-on is suitable. For native Blender features we should aim for something simpler.

@Alberto Velázquez (dcvertice), setting up baking nodes like that would definitely be more work than my proposal. It can be argued that it's more powerful (it's not clear to me that it is in ways that really matter), but both for users and for the implementation an additional node system is going to be more complex. To be honest baking nodes is not an option I would seriously consider. If you want that kind of more complex but specialized/powerful workflow I think an add-on is suitable. For native Blender features we should aim for something simpler.

Blender already has nodes for shaders, textures and the compositor, and before long will have them for Everything Nodes and particles (and maybe more parts of the program, I suppose). I don't see why having nodes in baking would be a problem for users who already use nodes in five different parts of the program.

For me it's an obvious solution when I work with a production file like this and try to build a complete pipeline, where you need ad-hoc output as I described (moving data between channels of different images).

For example: how do you rebake an asset after a new object has been added to the high-poly and low-poly versions, which requires remaking the final textures?

With a node system:

  • Add the high-poly object to the HP collection
  • Add the low-poly object to the LP collection
  • Click Bake (it automatically bakes AO/curvature/..., changes the output image in Blender and updates all materials that depend on these maps)
  • Select the second bake node tree to bake the final material (note that we need the intermediate maps baked to make an asset)
  • Click Bake (it automatically exports all the files to a folder, with all channels correctly exported)

Completed

How many steps would be needed to do the same in the solution that you propose? Multiply that by hundreds of objects and hundreds of rounds of feedback changes, and you have a big black hole of productivity. Normally one of the worst parts of my workflow is baking the maps, because you need to make a lot of tests and changes, add parts, etc., and those iterations take a lot of time, because making changes in a bake system is a pain. Substance solves part of that because you can configure the output to mix map channels. But how does Blender solve that efficiently?

How many steps would be needed to do the same in the solution that you propose?

Approximately the same number of steps. Maybe fewer; I can imagine this being possible in 3 steps, since I'm not sure why you'd need to bake twice. If there are dependencies between maps, they could be figured out automatically, though that's possible to implement in either system.

How many steps would be needed to do the same in the solution that you propose?

Approximately the same number of steps. Maybe fewer; I can imagine this being possible in 3 steps, since I'm not sure why you'd need to bake twice. If there are dependencies between maps, they could be figured out automatically, though that's possible to implement in either system.

Hmm, interesting. Maybe I don't fully see your implementation. But how do you automatically mix texture channels and automatically export them to files?
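The channel mixing being asked about here amounts to writing several baked grayscale maps into the color channels of one output image. A minimal sketch over flat pixel lists; the channel assignment (metallic/roughness/AO into R/G/B) is just an illustrative convention, not anyone's proposed design:

```python
# Sketch of channel packing: combine three baked grayscale maps into the
# R/G/B channels of a single output image. Flat pixel lists stand in for
# real image buffers; the specific channel assignment is illustrative.

def pack_channels(r_map, g_map, b_map):
    """Interleave three equally sized grayscale maps into RGB triples."""
    assert len(r_map) == len(g_map) == len(b_map)
    return [(r, g, b) for r, g, b in zip(r_map, g_map, b_map)]

metallic = [0.0, 1.0]
roughness = [0.5, 0.25]
ao = [1.0, 0.75]
print(pack_channels(metallic, roughness, ao))
# [(0.0, 0.5, 1.0), (1.0, 0.25, 0.75)]
```

Whether this lives in a per-channel setting (as proposed earlier in the thread) or in an output node, the underlying operation is the same.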

As the author of Bake Wrangler add-on (Here) I have some input:

Firstly, my add-on is designed to show how I and others believe the Blender internal system should be moving forward: a flexible, node-based system that allows you to decide what your workflow is, rather than having to conform to a fixed path that may or may not provide what you want. It's not perfect yet and I have some changes coming in the pipe, but it does completely replace the current baking system with a much more flexible solution.

Arguments about "but this is how it is done in blender" are irrelevant. The question is: Do the current developers see a fully featured bake system as a part of blender? If not then maybe don't change it?

There has never been a 'good' internal bake system. Much of the rendering side has been created (though still not all of it), but the interface has never really been designed around what users of the system actually need.

While no doubt these changes here are an improvement, they don't actually address the real issue. Which is that users have wildly different workflows and requirements for their baking. What is needed is a system that allows the user to design their baking workflow, similar to how we allow users to design very elaborate materials if they so desire.

I have a bit of grievance here because these changes will not actually address any of the issues my add-on is designed to fix but will require me to re-write essentially the entire core of the add-on :(

I understand it's tricky from a development point of view. You see the way it has been done for years and how that can be made better while maintaining the same feature set. But in this case, the way it has been done for years isn't that useful to most users. It's been a somewhat neglected system that needs a complete rethink about what it is actually supposed to offer.

Arguments about "but this is how it is done in blender" are irrelevant. The question is: Do the current developers see a fully featured bake system as a part of blender? If not then maybe don't change it?
..
While no doubt these changes here are an improvement, they don't actually address the real issue. Which is that users have wildly different workflows and requirements for their baking. What is needed is a system that allows the user to design their baking workflow, similar to how we allow users to design very elaborate materials if they so desire.

An important Blender design principle is "Simple things should be simple, complex things should be possible".

I am convinced that for the most common cases, users having to set up both baking nodes and shading nodes is neither required nor simple enough. The choice is not a binary one between what we have now and a workflow aimed at more technical users. If there are use cases we don't cover or someone prefers a very different workflow, add-ons are still possible.

I still have not seen it explained which important problems baking nodes solve that can't be solved otherwise. The ability to pack multiple maps into color channels is important, but is possible to implement with a simple setting per baking channel. Being able to write to disk instead of memory is also something that could be an option or just default behavior, independent of using nodes or not.

I think it's easy to underestimate the value of having a system where it is explicitly defined what the baking feeds into. If Blender knows that roughness will be packed in channel X, users don't have to manually set up shading nodes to extract it from that channel. If we know that simulation X depends on texture Y being baked for particle emission, or an exporter can verify whether all baked textures are up to date, that helps automation and validation.

Systems that more loosely couple things together where users have to manually ensure that this filename or setting in one place matches this other thing in another place are the source of many problems in production in my experience, and we should try hard to avoid that.
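The validation benefit described here could be sketched as a staleness check: if Blender tracked, per bake target, when its sources last changed and when it was last baked, an exporter could flag out-of-date textures automatically. The record layout below is hypothetical, purely to illustrate the idea:

```python
# Hypothetical sketch of the validation idea: with explicit coupling
# between bake targets and their sources, stale bakes can be detected
# automatically instead of relying on users to remember to rebake.
# Timestamps are plain integers here; real code would use file mtimes
# or depsgraph update counters.

def stale_bakes(bake_records):
    """Return names of bake targets whose sources changed after the last bake."""
    return [
        name
        for name, rec in bake_records.items()
        if rec["source_changed"] > rec["last_baked"]
    ]

records = {
    "Car_Roughness": {"source_changed": 105, "last_baked": 100},
    "Car_Normal": {"source_changed": 90, "last_baked": 100},
}
print(stale_bakes(records))  # ['Car_Roughness']
```

With a loosely coupled node system, this information would have to be reconstructed from filename conventions, which is exactly the fragility the comment above warns about.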

I have a bit of grievance here because these changes will not actually address any of the issues my add-on is designed to fix but will require me to re-write essentially the entire core of the add-on :(

We could provide a low-level baking API function for this kind of thing. But even without that, I doubt it is that hard to programmatically set up a temporary scene to bake just what you need.

I am convinced that for the most common cases, users having to set up both baking nodes and shading nodes is neither required nor simple enough. The choice is not a binary one between what we have now and a workflow aimed at more technical users. If there are use cases we don't cover or someone prefers a very different workflow, add-ons are still possible.

I still have not seen it explained which important problems baking nodes solve that can't be solved otherwise. The ability to pack multiple maps into color channels is important, but is possible to implement with a simple setting per baking channel. Being able to write to disk instead of memory is also something that could be an option or just default behavior, independent of using nodes or not.

I think it's easy to underestimate the value of having a system where it is explicitly defined what the baking feeds into. If Blender knows that roughness will be packed in channel X, users don't have to manually set up shading nodes to extract it from that channel. If we know that simulation X depends on texture Y being baked for particle emission, or an exporter can verify whether all baked textures are up to date, that helps automation and validation.

Systems that more loosely couple things together where users have to manually ensure that this filename or setting in one place matches this other thing in another place are the source of many problems in production in my experience, and we should try hard to avoid that.

Well, Blender's future is supposed to be "Everything Nodes" afaik, so I do not see why baking should not follow. Node-based workflows are exactly the "Simple things should be simple, complex things should be possible" principle. With a regular, panel-based UI, you have tons of visual clutter you have to skim over to find what you need. So instead of having a chunk of UI panel dedicated to channel packing, one would not even come in contact with anything packing-related unless they added the appropriate nodes. The beauty of a node-based workflow is that you only have to add as much of the UI as you need. You can see that clearly with Particle Nodes, where the giant, monolithic particle panel full of cryptic knobs can be replaced with 3-4 simple nodes for very simple particle simulations, or expanded into a huge network which can do many times more than the hardcoded particle panel ever could, if the user desires. The same would apply to baking.

Furthermore, for alleviating the issue of "systems that more loosely couple things together, where users have to manually ensure that this filename or setting in one place matches this other thing in another place", there's no better solution than nodes, especially if you can modify and pass along data such as strings between the nodes. Imagine users being able to create something like a Cycles node group, except for the string data type, which automatically generates all the channels and outputs them, along with string node outputs carrying the correct file names with all the prefixes and suffixes.

Following the logic of "nodes are too complex for non-technical users", we should remove shading nodes and go back to Blender Internal-style, panel-only material editing. Even beginner users are able to understand node-based workflows if they are done right.

The argument is not that there shouldn't be nodes. It's that there shouldn't be separate baking nodes and shading nodes, with a loose coupling between them.

An important Blender design principle is "Simple things should be simple, complex things should be possible".

I don't think that an exclusive abstraction of baking, with a concept and interface that have nothing to do with the rest of the software, and that makes decisions like putting baking inside a material, can be simpler than using the same concepts, abstractions and interface that users have been using to make materials, textures and compositing for the last 10 years. That would easily solve all the problems and could be extended simply.

How can it be complex to use a node system to bake, when a few minutes earlier you needed an even more complex node system to create the texture that you want to bake?

I think it's easy to underestimate the value of having a system where it is explicitly defined what the baking feeds into. If Blender knows that roughness will be packed in channel X, users don't have to manually set up shading nodes extract it from that channel. If we know that simulation X depends on texture Y being baked for particle emission, or an exporter can verify if all baked textures are up to date, that helps automation and validation.

The ramifications of that solution keep expanding within Blender, modifying the user's workflow just to use the bake system:

  • Changes in the Workbench
  • Changes in exporters
  • Checking whether there have been updates to the original models or textures that have not been baked yet, to warn the user

It looks more like a closed solution only for final assets, one that will change the whole user workflow and pipeline, than like a bake system inside a suite like Blender.

The argument is not that there shouldn't be nodes. It's that there shouldn't be separate baking nodes and shading nodes, with a loose coupling between them.

Ah, nevermind then.

The argument is not that there shouldn't be nodes. It's that there shouldn't be separate baking nodes and shading nodes, with a loose coupling between them.

They must be separated, because baking is a separate task from shading, one that only sometimes uses that system. It is like mixing vertex colors with shading nodes: you simply reference the vertex color layer that you want to use in the nodes; you don't work with vertex colors inside the nodes. Baking should work in a similar way, with shading nodes referencing outputs of the baking nodes. Maybe with some presets or a special node for some automation.

And can give users strange situations, lo for example, mix both system will give user nodes like output file paths that can't be used with the shading nodes.

Netherby (netherby) added a comment.EditedMar 31 2020, 4:02 PM

Arguments about "but this is how it is done in Blender" are irrelevant. The question is: do the current developers see a fully featured bake system as a part of Blender? If not, then maybe don't change it?
..
While these changes are no doubt an improvement, they don't actually address the real issue, which is that users have wildly different workflows and requirements for their baking. What is needed is a system that allows the user to design their baking workflow, similar to how we allow users to design very elaborate materials if they so desire.

An important Blender design principle is "Simple things should be simple, complex things should be possible".

I am convinced that for the most common cases, users having to set up both baking nodes and shading nodes is neither required nor simple enough. The choice is not a binary one between what we have now and a workflow aimed at more technical users. If there are use cases we don't cover or someone prefers a very different workflow, add-ons are still possible.

I still have not seen it explained which important problems baking nodes solve that can't be solved otherwise. The ability to pack multiple maps into color channels is important, but it is possible to implement with a simple setting per baking channel. Being able to write to disk instead of memory is also something that could be an option or just default behavior, independent of using nodes or not.

I think it's easy to underestimate the value of having a system where it is explicitly defined what the baking feeds into. If Blender knows that roughness will be packed in channel X, users don't have to manually set up shading nodes to extract it from that channel. If we know that simulation X depends on texture Y being baked for particle emission, or an exporter can verify whether all baked textures are up to date, that helps automation and validation.

Systems that more loosely couple things together where users have to manually ensure that this filename or setting in one place matches this other thing in another place are the source of many problems in production in my experience, and we should try hard to avoid that.
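To make that coupling concrete, here is a tiny hypothetical sketch: a declarative map from bake passes to packed-image channels that tooling could validate automatically, instead of the user keeping shader nodes and bake settings in sync by hand. All names here are invented for illustration; none of this is real Blender API.

```python
# Hypothetical declarative spec mapping bake passes to packed-image
# channels. None of these names are real Blender API.
PACKING = {
    "helmet_orm.png": {"R": "ao", "G": "roughness", "B": "metallic"},
    "helmet_normal.png": {"RGB": "normal"},
}

KNOWN_PASSES = {"ao", "roughness", "metallic", "normal"}

def validate_packing(packing, known_passes=KNOWN_PASSES):
    """Return a list of problems: unknown pass names, or a channel
    assigned twice within the same image."""
    errors = []
    for image, slots in packing.items():
        seen = set()
        for channels, bake_pass in slots.items():
            if bake_pass not in known_passes:
                errors.append(f"{image}: unknown pass '{bake_pass}'")
            for ch in channels:
                if ch in seen:
                    errors.append(f"{image}: channel {ch} assigned twice")
                seen.add(ch)
    return errors
```

With a spec like this, an exporter or simulation could check automatically that every texture it depends on has an assigned, conflict-free bake channel.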

I'm convinced that a node system can provide the simple use case more intuitively than the alternative. Consider: the user selects some objects and presses 'New Bake Tree', which then opens with the selected objects connected up to a basic bake node setup. They open a dropdown, select the pass they want, and they're done. Now it's saved for the future and took them two clicks to make. If they want something more elaborate, it's all there ready for them.

This is the system I've been working towards with Bake Wrangler, but some of this is difficult to impossible with an add-on (we need more ways to hook into Blender's functions). Of course I could be wrong, but I have had a lot of positive feedback on this approach.

Also, having it as part of the material system only makes limited sense. Yes, some baking is done to get material properties, but a whole lot of it is computing things like projected tangent normals, occlusion, curvature, etc., which logically don't really have anything to do with materials.

I have a bit of a grievance here, because these changes will not actually address any of the issues my add-on is designed to fix, but will require me to rewrite essentially the entire core of the add-on :(

We could provide a low-level baking API function for this kind of thing. But even without that, I doubt it is that hard to programmatically set up a temporary scene to bake just what you need.

More low-level APIs for add-ons are always nice, but that is more work. Similarly, yes, it's not hard to set up whatever is needed, but it is extra work and somewhat time-consuming!

(Actually, a more modular low-level baking API where you could perhaps insert shader programs or swap out different renderers would be great. There are quite a few things missing from the internal bake passes that aren't really possible to create via an add-on.)

JPG (joules) added a comment.EditedMar 31 2020, 4:59 PM

I'm trying to fathom the scenario where there is a complicated set of materials, all defined within an object's material slots. Anything more than one slot and you already have levels of complication, for example assigning faces to certain materials, or any kind of UV projection (texture).
I'm just not a fan of overly complicated shader nodes, so previewing two or more possible shaders on a mesh within one shader node setup can get tedious.

I know it's possible to set multiple "Material Output" nodes and make them active, so there is already a way to represent a bake result within the same shader node tree. However, when we get into higher numbers, the solution I found was to clear out the material slots and define a new set.

Think of this use case:
All country flags to be baked out on a racing helmet.
There are also face-assignment shaders for the rims and visor. (3 slots)
I want to bake them all at once.
And view the results of all the bakes.

I don't think what you are proposing will fit my workflow, particularly if I want to bake out 100+ textures for a single mesh object. It's achievable right now with my add-on. However, as I mentioned, a long-term solution with lower-level management of material slots, so that an object can have more than one set of material slots, is something I'm still a fan of, because it has become the solution for me and it really isn't a very complicated workflow (I can't speak for anyone else).

I'm convinced that a node system can provide the simple use case more intuitively than the alternative. Consider: the user selects some objects and presses 'New Bake Tree', which then opens with the selected objects connected up to a basic bake node setup. They open a dropdown, select the pass they want, and they're done. Now it's saved for the future and took them two clicks to make. If they want something more elaborate, it's all there ready for them.

The exact same thing can be done if the baking setup is part of the shader nodes. For me the big difference is what happens when you want to make changes after that, while also having a material set up to preview, use or export the resulting textures. That's where it becomes harder to hide the underlying complexity.

Also, having it as part of the material system only makes limited sense. Yes, some baking is done to get material properties, but a whole lot of it is computing things like projected tangent normals, occlusion, curvature, etc., which logically don't really have anything to do with materials.

Most of the time, tangent normals, occlusion and curvature are baked for use in a material, either in Blender or in an external application.

I'll admit that for other use cases it's less elegant, even though it will still work. But I think this kind of functionality can also fit into the upcoming geometry nodes directly, where you want to be able to use shader/texture nodes as input, either baked or not.

I don't think what you are proposing will fit my workflow, particularly if I want to bake out 100+ textures for a single mesh object. It's achievable right now with my add-on. However, as I mentioned, a long-term solution with lower-level management of material slots, so that an object can have more than one set of material slots, is something I'm still a fan of, because it has become the solution for me and it really isn't a very complicated workflow (I can't speak for anyone else).

I don't think what anyone is proposing (object bake passes, dedicated baking nodes, or baking in shader nodes) is designed to address your use case. And I would also consider it beyond what we are trying to solve here. I think a general solution for this would be more in the realm of overrides, to work with multiple variations of an object.

I don't think what you are proposing will fit my workflow, particularly if I want to bake out 100+ textures for a single mesh object. It's achievable right now with my add-on. However, as I mentioned, a long-term solution with lower-level management of material slots, so that an object can have more than one set of material slots, is something I'm still a fan of, because it has become the solution for me and it really isn't a very complicated workflow (I can't speak for anyone else).

I don't think what anyone is proposing (object bake passes, dedicated baking nodes, or baking in shader nodes) is designed to address your use case. And I would also consider it beyond what we are trying to solve here. I think a general solution for this would be more in the realm of overrides, to work with multiple variations of an object.

What use case is it designed to address, then? There is going to be a scenario that trips it up. I'm giving a perspective on a workflow and my experience with baking as a user and an artist. Also, overrides are not a solution to the problem I highlighted, because if they were, I would've chosen them instead of writing my own add-on.
I'm also not sure anyone would use overrides for situations like this; instead, you'd use a shader node setup with multiple "Material Output" nodes, or a linked object with its slots linked to Object instead of Data.
I'm pointing out that the scope may lie with an object's material slots. I'm all for good separate baking functionality that covers a good proportion of scenarios and use cases.

I personally think the current baking system is OK; some improvements and API hooks (pre/post handlers) would be useful. I do have issues with UDIMs as far as the API is concerned, but that's out of scope here.
Just giving some "use case" user feedback.

What use case is it designed to address, then?

See the description of this code review for the problems this is intended to solve. It's about easily baking and re-baking multiple channels for an object.

Also overrides is not a solution to the problem I highlighted, because if it was a solution, I would've chosen it instead of writing my own addon.

It's not possible yet in the current implementation of overrides, but if that system keeps improving and gets closer to USD, this would be a natural fit.

JPG (joules) added a comment.EditedMar 31 2020, 7:25 PM

I'm convinced that a node system can provide the simple use case more intuitively than the alternative. Consider: the user selects some objects and presses 'New Bake Tree', which then opens with the selected objects connected up to a basic bake node setup. They open a dropdown, select the pass they want, and they're done. Now it's saved for the future and took them two clicks to make. If they want something more elaborate, it's all there ready for them.

This is the system I've been working towards with Bake Wrangler, but some of this is difficult to impossible with an add-on (we need more ways to hook into Blender's functions). Of course I could be wrong, but I have had a lot of positive feedback on this approach.

Also, having it as part of the material system only makes limited sense. Yes, some baking is done to get material properties, but a whole lot of it is computing things like projected tangent normals, occlusion, curvature, etc., which logically don't really have anything to do with materials.

I'm kind of with you on this. My own bake add-on would probably do better with nodes (custom settings for each pass are in a popup). Essentially, what I was disappointed in was that, in the interim while a new baking system is hashed out, bake (pre/post etc.) handlers would've been super useful, and the patch that got passed up (it was for 2.79) seemed very straightforward.
Currently an ugly hack figures out when a pass is finished before moving to the next. But it works, and works well.

I guess the case I'm making is that the results of a bake, or baking in general, have nothing to do with the shader editor/materials/material slots directly. E.g., baked images (textures) are just inputs to the Principled BSDF, and it is as simple as that. You can even get them off the net. I'm all for leaving that up to the user to figure out, or to an add-on that simplifies this process.

I guess what I'm saying to the developers is: keep it within the scope of simple baking processes, even make a custom Bake Editor area (nodes), but not the shader editor; it already gets really messy in there. As for iterating over shaders and objects (or doing crazy things like wiping out material slots as it moves to the next group, etc.), leave that to an add-on; that is really advanced power-bake-level stuff. Give us some bake handlers so we know when a bake is done. I'm totally cool with that, and I won't mind if the API breaks my add-on if it means there is less in my add-on in the end.
UDIMs are an example of something not properly thought out (or even overthought), because at this stage I can't get my head around automating the process I have with the current implementation of UDIMs (doing it all in the API). So I'm worried that you are going to lock in a workflow with baking.

Try to keep it open to scenarios by doing less, not more. Again, the current (stable) baking system is OK, apart from a lack of bake handlers.
So I guess I'm trying to figure out exactly what you are trying to fix that wasn't really that broken.
Again, my add-on works great, and it covers the use case of 99% of users out there. We bake, we ship it. Done.

Netherby (netherby) added a comment.EditedApr 1 2020, 8:31 AM

I'm convinced that a node system can provide the simple use case more intuitively than the alternative. Consider: the user selects some objects and presses 'New Bake Tree', which then opens with the selected objects connected up to a basic bake node setup. They open a dropdown, select the pass they want, and they're done. Now it's saved for the future and took them two clicks to make. If they want something more elaborate, it's all there ready for them.

This is the system I've been working towards with Bake Wrangler, but some of this is difficult to impossible with an add-on (we need more ways to hook into Blender's functions). Of course I could be wrong, but I have had a lot of positive feedback on this approach.

Also, having it as part of the material system only makes limited sense. Yes, some baking is done to get material properties, but a whole lot of it is computing things like projected tangent normals, occlusion, curvature, etc., which logically don't really have anything to do with materials.

I'm kind of with you on this. My own bake add-on would probably do better with nodes (custom settings for each pass are in a popup). Essentially, what I was disappointed in was that, in the interim while a new baking system is hashed out, bake (pre/post etc.) handlers would've been super useful, and the patch that got passed up (it was for 2.79) seemed very straightforward.
Currently an ugly hack figures out when a pass is finished before moving to the next. But it works, and works well.

Like I said in my first post, I think Blender needs to decide whether a fully featured baking system is what it wants at this point, or whether a simple baking system is desired, letting add-ons deal with fancier solutions. Personally, I would like to see a fuller implementation within Blender, and I'm not opposed to putting bake stuff inside materials... But I've already invested many hours creating my solution, so continuing to evolve it isn't a problem. If we want to go that way, though, we really need some better add-on support in the baking pipeline.

I have a wish list for that:

  • Access to bake progress information (currently no good way to provide feedback on progress)
  • Some simple way to create bake passes via inserting OpenGL (or some shader language) and/or custom renderer
  • Accessible image format control (currently you have to modify the render output settings and "save as render", which also applies gamma and other things that aren't really desirable; I would naively suggest adding this as a data block on each image)
  • I'm sure people would like pre and post hooks for bake

The current bake function is fine, I think. Perhaps it just needs an optional callback to provide the progress information. There is a bunch of other stuff I think would be great for add-ons, but it's not really relevant to this discussion.
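For the "bake progress" item on the wish list, an optional callback could look something like the sketch below. The callback signature and the function itself are hypothetical suggestions (simulated here with a plain loop, not a real bake and not bpy API):

```python
def bake_with_progress(passes, on_progress=None):
    """Simulate baking a list of passes, invoking an optional
    progress callback after each one. The callback signature
    (pass_name, index, total) is a made-up suggestion, not bpy API."""
    results = []
    total = len(passes)
    for i, name in enumerate(passes, start=1):
        results.append(f"baked:{name}")  # stand-in for the real bake work
        if on_progress is not None:
            on_progress(name, i, total)
    return results

# Usage: collect progress messages (a real add-on would drive a UI bar).
messages = []
bake_with_progress(
    ["diffuse", "normal", "ao"],
    on_progress=lambda name, i, total: messages.append(f"{name} {i}/{total}"),
)
```

An add-on could use such a hook both for progress bars and to know when a pass has finished, replacing the "ugly hack" mentioned earlier in the thread.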

Like I said in my first post, I think Blender needs to decide whether a fully featured baking system is what it wants at this point, or whether a simple baking system is desired, letting add-ons deal with fancier solutions. Personally, I would like to see a fuller implementation within Blender, and I'm not opposed to putting bake stuff inside materials... But I've already invested many hours creating my solution, so continuing to evolve it isn't a problem. If we want to go that way, though, we really need some better add-on support in the baking pipeline.

I have a wish list for that:

  • Access to bake progress information (currently no good way to provide feedback on progress)
  • Some simple way to create bake passes via inserting OpenGL (or some shader language) and/or custom renderer
  • Accessible image format control (currently you have to modify the render output settings and "save as render", which also applies gamma and other things that aren't really desirable; I would naively suggest adding this as a data block on each image)
  • I'm sure people would like pre and post hooks for bake

The current bake function is fine, I think. Perhaps it just needs an optional callback to provide the progress information. There is a bunch of other stuff I think would be great for add-ons, but it's not really relevant to this discussion.

Agreed. Also, I would add/expand on the following:

  • Each bake pass needs its own image settings. Specular and environment maps often do not need to be high resolution, and some maps only require detail along the X axis. So this would be a data block on the bake pass.
  • Denoise features
  • Resizing features: baking at a higher resolution with low samples can be faster and give better interpolated detail.
  • Real-time preview (Not a biggie on the list).
  • Output to image channel (RGBA) or layer (openexr)
  • And perhaps more features to improve bake quality
  • Custom UVMAP per pass (See limitations of UDIM)*

*Note: custom image sizes per pass, for example, might reveal issues in the UDIM implementation, and perhaps an optional UV map per bake pass would address this. So how many UDIM settings can we have per UV map? One?

A fully featured baking system need only check these boxes (and those you suggested), with the rest left to be extended either directly by Blender developers in the UI or by an add-on, as long as the API doesn't limit workflow possibilities (as in the examples given): iterating over bake passes and automatically setting output names such as {object}_{shader}_{pass}_{udim_tile}_{resolution}. These things are actually quite important when shipping the texture/image maps, but you can see they are also very optional, so they are a good candidate for Python.
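A naming pattern like the one above is a natural fit for Python's `str.format`; a minimal sketch (the field names come from the comment, the values are hypothetical):

```python
TEMPLATE = "{object}_{shader}_{pass}_{udim_tile}_{resolution}"

def output_name(template, fields):
    """Expand an output-name template from a dict of fields; using a
    dict sidesteps 'pass' being a Python keyword. Raises KeyError if
    a placeholder has no matching field."""
    return template.format(**fields)

name = output_name(TEMPLATE, {
    "object": "helmet",
    "shader": "paint",
    "pass": "normal",
    "udim_tile": 1001,
    "resolution": "2k",
})
# name == "helmet_paint_normal_1001_2k"
```

This is the kind of thing that could live in an add-on even if the core system only exposed the per-pass data.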

Another sticking point seems to be the ability to view bake results within Blender effectively, without clobbering the actual material slot(s)/shader(s) that are the source of the bake. However, you may have more than one bake result (as in the example I gave). My add-on addresses this problem, but it wasn't within the scope of baking, nor within the scope of the source shader(s).
I'm all for simple design, and perhaps a one-to-many data model for an object's material slots can be addressed at a later time (this has been discussed over the years); in my opinion it doesn't matter right now, as it is not even specific to baking.
So a fully featured baking system is fine, as long as the API gives me the option of other workflow opportunities.

Possible process?

0. The Image Texture node gets an additional dropdown labelled 'bake type', which the user sets accordingly (DIFFUSE, COMBINED, AO, etc.). This removes the need to physically select a texture prior to baking.
1. Select an object or multiple objects.
2. The user selects which maps to bake.
3. The bake system goes through each object, and then for each material:

   3a. Goes through each bake type:

       Is this bake type material-independent (AO, NORMAL (if no bump/displacement), UV)?

           YES: During this entire bake process, has this bake type already been baked for this object (in another material assigned to this object, for example)?

               YES: Is there an Image Texture node with this 'bake type'?
                   YES: Populate this Image Texture node with the already baked texture, go to 4.
                   NO:  Did the user select the option 'create if doesn't exist'?
                       YES: Create a disconnected Image Texture node, populate it with the existing texture, go to 4.
                       NO:  Skip and go to 4.

               NO: Is there an Image Texture node with this 'bake type'?
                   YES: Bake the image, go to 4.
                   NO:  Did the user select the option 'create if doesn't exist'?
                       YES: Create an image (single or UDIM, based on the UV layout) named 'bakeType_object', bake, create a disconnected Image Texture node and populate it, go to 4.
                       NO:  Skip and go to 4.

           NO: Go to 3b.

   3b. Check whether there is a texture with this 'bake type' in this material:

       YES: Bake to that texture and go to 4.
       NO:  Did the user select the option 'create if doesn't exist'?
           YES: Create an image (single or UDIM, based on the UV layout) named 'objectName_matName_passName', bake, create a disconnected Image Texture node, go to 4.
           NO:  Skip and go to 4.

4. Have all bake types been completed for this material?

   YES: Are there any more materials on this object?
       YES: Go to the next material, then to 3a.
       NO:  Go to the next object, then to 3a.

   NO: Go to 3a.

The resolution of 'create if doesn't exist' textures should be defined globally for all selected objects in the bake settings, and also on a per-material basis in the material settings. If not defined in the material, use the resolution defined in the bake settings.
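A slice of the flow above (the resolution fallback plus the 'create if doesn't exist' branch) can be sketched as plain Python logic; the data model is hypothetical and none of this is Blender API:

```python
def pick_resolution(material_res, global_res):
    """Per-material resolution overrides the global bake setting;
    fall back to the global value when the material defines none."""
    return material_res if material_res is not None else global_res

def decide(texture_exists, create_if_missing):
    """One branch of the flow: bake to an existing texture, create
    one first, or skip the pass entirely."""
    if texture_exists:
        return "bake"
    return "create_and_bake" if create_if_missing else "skip"
```

Spelling the branches out like this also makes it easy to see which combinations the UI would need to expose.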

Perhaps when using selected-to-active, rather than using ray distance, the low-res seams could be projected onto the high-res (cutting where required), then the high-res baked directly and the low-res image textures populated with the result (because the UVs would match)? The cutting process could be: shrinkwrap the low-res to the high-res by nearest face, offset from the high-res normal, select the seams of the low-res, and knife-project the seams onto the high-res.

A total overhaul of the baking system sounds very nice, and I particularly like the idea of using nodes. That seems powerful, it solves the UI mess the baker can be, and Blender could provide some "baking templates" for Unity, Unreal, etc.
However, reading the conversation, a total overhaul may be slow to agree on and implement?

I came here with the idea of helping with some quick fixes to the current baker, based on my experience exporting my 3D models to Unity.
From a gamedev perspective (and there are a lot of indies using Blender out there), I mainly need to export the PBR channels (e.g. Roughness/Smoothness, Metallic, etc.) and to combine them into the same output image as R, G, B, A channels.
Although there are many baking solutions, I did not know about them and struggled to find one. I used this one: https://github.com/danielenger/Principled-Baker
My idea was to try to port some of the solutions from Principled Baker into Blender.

So I am just wondering whether it would be possible to start small and implement minor fixes that would make a lot of people happy.
I think a couple of quick fixes would help a lot of people while waiting for the complete overhaul.

@Geoffrey Megardon (Nodragem), there are some features orthogonal to the baking system design changes proposed in this patch. For example support for baking metallic and roughness from the principled BSDF, which are fine to add anytime.

For deeper design changes about the workflow, how to set up bake passes, we should not add temporary intermediate solutions. In that case it's better for users to use a more complete add-on anyway.

For deeper design changes about the workflow, how to set up bake passes, we should not add temporary intermediate solutions. In that case it's better for users to use a more complete add-on anyway.

Except it's currently impossible for any add-on to fix core baking design flaws. Every single baking add-on out there fails to bake more complex materials whose shading networks use node groups. At the same time, groups are an absolute necessity for managing material complexity and for reproducing the "smart materials" workflows commonly used in other texturing tools. I've talked to two different creators of baking add-ons for Blender, and they both agreed that they just can't make it work because of core baking design flaws, let alone the performance issues related to tile sizes.

Bottom line: postponing such an important patch and justifying it by saying "there are more complete add-ons" is one of the worst ways to go about it. Texture baking in Blender is not up to the quality standard of the rest of the software.

Furthermore, the current design, which relies on the actively selected image texture node in the shader editor, can destroy the user's data outside of Blender (texture files): https://devtalk.blender.org/t/blenders-texture-baking-design-is-extremely-dangerous/12193 It's super easy to click the bake button and then, upon exiting Blender, accidentally click the inviting "save all" button, which overwrites all your image texture files on your hard drive with solid black, just because they were the active selection in one of many materials on your object(s).

This patch would solve that too.

@Brecht Van Lommel (brecht) so if that sounds like a good first contribution, I would like to try :D !
Can we also add the possibility of combining the different PBR channels into the same image (e.g. metalness in RGB, and smoothness in A)? Or is that too many changes?

The issue with add-ons is that you need to find them. I went through the trouble :D ! Would it be possible to add something in the documentation to direct users to useful add-ons? https://docs.blender.org/manual/en/latest/render/cycles/baking.html
I can try to do it if you think that's a good idea.

I agree with @Ludvik Koutny (rawalanche) that if there is a danger of deleting external textures by accident, there could be a quick fix while waiting for the new overhaul.

In terms of postponing/waiting, is there anything we can do to help the process?
Are there blockers we can help resolve?

Can we also add the possibility of combining the different PBR channels into the same image (e.g. metalness in RGB, and smoothness in A)? Or is that too many changes?

I think an option could be added to specify which channels of the image to bake to, leaving the rest unmodified. That doesn't make it possible to bake multiple passes to the same image in a single click; baking multiple passes at the same time requires a bigger overhaul like this patch.
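Writing one bake result into a single channel while leaving the others untouched is straightforward on a flat RGBA pixel list (the same layout bpy's Image.pixels uses); a minimal pure-Python sketch, with no actual bpy calls:

```python
def write_channel(pixels, channel, values):
    """Overwrite one channel (0=R, 1=G, 2=B, 3=A) of a flat RGBA
    float list in place, leaving the other channels unmodified."""
    if len(pixels) != 4 * len(values):
        raise ValueError("pixel buffer / value count mismatch")
    for i, v in enumerate(values):
        pixels[4 * i + channel] = v
    return pixels

# Pack a roughness result into G of a 2-pixel image without touching R/B/A.
img = [0.1, 0.0, 0.3, 1.0,  0.2, 0.0, 0.4, 1.0]
write_channel(img, 1, [0.5, 0.9])
# img is now [0.1, 0.5, 0.3, 1.0,  0.2, 0.9, 0.4, 1.0]
```

Baking several passes to different channels of the same image would then just be repeated calls with different channel indices.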

The issue with add-ons is that you need to find them. I went through the trouble :D ! Would it be possible to add something in the documentation to direct the users to useful add-ons? https://docs.blender.org/manual/en/latest/render/cycles/baking.html
I can try to do it if you think that's a good idea.

If you want to contribute documentation for that it's welcome.

In terms of postponing/waiting, is there anything we can do to help the process?
Are there blockers we can help resolve?

The issue is that I think the design of this patch isn't quite right, and that it should be material node based. That's a significant change compared to this patch, but help with implementing that is welcome.

Thanks for your answer.

I will check how I can contribute to the documentation.

In terms of the proposal, I have just finished reading the whole conversation.

Basically to move forward, I feel that these points need to be addressed:

  • the discussion has revealed several use cases for baking; can we come up with a list of requirements? I can start writing them in a Google Doc
  • the discussion brought interesting UI considerations (e.g. baking parameters in object properties, or in a centralised editor)
  • based on requirements and UI consideration, a new proposal can be written,
  • based on that new proposal, what can be reused from this patch?
  • based on that new proposal, what can be ingested from external solutions (e.g. Bake Wrangler, etc)
  • based on that new proposal, can we split the overhaul into smaller chunks to be delivered sequentially (e.g. the automatic creation of a material that used the baked texture could be part of a future patch)?

Also, as this conversation is beyond the review of this patch, we could possibly move it somewhere else if needed.

Let me know what you think.

I think a redesign of the bake process is just what Blender needs for use in games and interactive visualization. I fall more strongly on the side of the "it's not best in a material" (or maybe even per-object) argument. You aren't just doing something like an "apply modifiers" or "convert to mesh" operation; you are describing a set of rendering procedures. Here are the points I would like to make when you break through the abstractions of what baking is:

  • Baking is just rendering from the perspective of the texels of a series of objects in your scene.
  • Blender already has the infrastructure to process this data, render different groups of objects, swizzle channels and combine passes and output this data to multiple image files: The Compositor.
  • Yes, there will need to be some quality-of-life improvements so that the user doesn't have to manually set up a custom node network every time they want to bake. Other areas of Blender (like the motion tracker) already create default compositor setups to render a scene like this.
  • As for the automatic setup of materials on a mesh based on a series of baked images... that seems like something really complex and specific that should be the task of an add-on. (Personally, I don't want a bake process touching my material setup to try to be "helpful".)

So could a bake pass perhaps:

  • Specify a collection to use as the target object (the texel camera)
  • A collection to render as the "high poly", where applicable for passes that require it.
  • (optional) Cage
  • Which maps to bake
  • (optional) Target UV map name (warn and/or fall back to the last/first available if not found)
  • (possibly optional) Compositor data block with the bake passes (to feed the data into for swizzling, processing and de-noising)

This approach could reuse the existing infrastructure in Blender and make for a more robust solution for baking anything from a set of mesh maps on a single asset to tasks like light-mapping entire scenes, without having to create a monolithic or custom node solution just for baking.

Yes, it wouldn't cover everything; yes, it would need a decent amount of friendly UX additions for the texture painting workspace. Those are still problems that I think specialized solutions should be designed for, building on top of existing ones.

I think it would be worth shifting your perspective towards baking being just another form of rendering; this gives it room to grow in the future.
It is especially useful to look past these concrete black-boxed solutions to abstract problems if Blender is going to move towards a more open-ended, unified, and procedural future.

Would love to contribute to the ideas document! :)

Just wanted to add my 2 cents to this.

What I would want out of any new baking system is the ability to do the following:

  • Create multiple 'Bake Jobs', and for each Bake Job:
    • Select a collection of objects that would be used for baking
    • Select the target object and target UV map to bake to
    • Select a 'cage' optionally for more control over ray tracing
    • Customise settings for ray distance, samples, denoising, clear image, etc
    • Define a list of output textures, and for each output texture:
      • Choose a texture to bake the output into from a texture dropdown, so output is immediately applied to a texture possibly in use in an existing material in the scene for preview
      • Define what baking output values will go into each channel of the output texture
      • Choose a resolution, margin, etc and if the modified texture should be saved to disk automatically after baking
  • For baking output values, the existing choices of baked lighting outputs, plus a choice of all the standard PBR values (metallic, roughness, base colour, clearcoat, SSS, etc), plus other common baking outputs (AO, cavity, displacement, normals, etc)
  • Bake all jobs with one click, or just an individual job
  • Save All textures in one click, after reviewing the output to decide if the output is acceptable

So for example, let's say I've modelled a very detailed hard-surface character made up of a few hundred objects and millions of polygons, I've created a simple retopologised low-poly version of the character of just 40k quads, and I want to bake the PBR values of the character out to the low-poly version.

I'd create a bake job, set my options, and set up a number of output textures to bake. For each texture, I would choose the output values to go into each texture channel. In this example, I'll choose the following setup:
Texture 1: Base Colour + Alpha (alpha being 0 if the baking raytrace didn't encounter geometry)
Texture 2: Specular + Metallic + Roughness in RGB and AO in Alpha
Texture 3: Normal Map in RGB + Displacement in Alpha
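The channel packing described above (e.g. Specular/Metallic/Roughness in RGB with AO in alpha) amounts to interleaving four single-channel bakes into one RGBA buffer. A minimal pure-Python sketch of that interleaving, assuming the flat `[R, G, B, A, R, G, B, A, ...]` layout that `bpy.types.Image.pixels` uses (real code would of course work on full-resolution buffers, e.g. via numpy):

```python
def pack_rgba(r, g, b, a):
    """Interleave four equal-length single-channel lists into one flat
    RGBA pixel list (the layout used by Image.pixels in Blender)."""
    assert len(r) == len(g) == len(b) == len(a)
    out = []
    for pr, pg, pb, pa in zip(r, g, b, a):
        out.extend((pr, pg, pb, pa))
    return out

# "Texture 2" from the example above, for a tiny 2-pixel image:
# Specular in R, Metallic in G, Roughness in B, AO in A.
pixels = pack_rgba([0.5, 0.5],   # specular
                   [1.0, 0.0],   # metallic
                   [0.3, 0.8],   # roughness
                   [1.0, 0.2])   # ambient occlusion
# pixels == [0.5, 1.0, 0.3, 1.0, 0.5, 0.0, 0.8, 0.2]
```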

Here's what that would look like:

([v] = Dropdown)
([Label] = Button)

Bake Job: - [Bake]

  • Collection: Character [v]
  • Target Object: CharacterLowPoly [v]
  • Target UV: UVMap [v]
  • Cage: --
  • Samples: 32
  • Denoising: Yes
  • Textures:
    • Texture 1:
      • Settings:
        • Resolution: [4096] x [4096], Margin: 16px, Texture: basealpha.png [v]
      • Channels:
        • Red: Base.Red [v] Green: Base.Green [v] Blue: Base.Blue [v] Alpha: Alpha [v]
    • Texture 2:
      • Settings:
        • Resolution: [4096] x [4096], Margin: 16px, Texture: smrao.png [v]
      • Channels:
        • Red: Specular [v] Green: Metallic [v] Blue: Roughness [v] Alpha: Ambient Occlusion [v]
    • Texture 3:
      • Settings:
        • Resolution: [4096] x [4096], Margin: 16px, Texture: normalsdisplace.png [v]
      • Channels:
        • Red: Normal.Red [v] Green: Normal.Green [v] Blue: Normal.Blue [v] Alpha: Displacement [v]

[Bake All]
[Save All]

Then after choosing those options, I can just click 'Bake All', all the values are baked, and I can see a preview on another instance of the low-poly model in the scene using those textures in its material slots.

Then if I like the outputs, I could choose 'Save All' to save each texture. Any texture generated by the baking process would be saved to disk: if a texture was loaded from a file, that file is updated; any texture not loaded from a file and existing only in memory in Blender is also saved, but for each such texture the user would be asked to pick a location and filename, as is currently the case in Blender.

This may not be the best imaginable baking system, but it would be a massive improvement over what's currently in Blender, while retaining all of the existing functionality. For that reason, I see no reason anyone would feel it isn't at least a step forward, even if it isn't the ideal system for some users. More advanced features such as custom baking passes, node systems, etc., could always be added later.

Thoughts?

@Geoffrey Megardon (Nodragem), there are some features orthogonal to the baking system design changes proposed in this patch. For example support for baking metallic and roughness from the principled BSDF, which are fine to add anytime.

I hope I'm not being a pest asking this, but I'm curious: you say support for baking metallic and roughness from the Principled BSDF would be fine to add any time. Is there any chance of us getting that, perhaps in 2.91?

I've recently had to bake textures for a number of game-ready characters made in Blender, and to bake base colour, roughness, metallic and other properties, I had to modify every material on the character between bake passes, feeding the base colour, metallic and roughness values into Emission shaders connected to the final output so they would emit those values. There were up to 40 different materials on some of these characters, and I ended up going through this process multiple times for each character to achieve a nice result.
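The manual rewiring described above can at least be scripted. The sketch below shows the core logic on a toy node graph represented as a dict of `{destination_input: source_output}` links, purely as an illustration (a real script would do the same thing through `bpy` node trees and links; all node and socket names here are hypothetical):

```python
def emission_bake_links(links, channel="Principled.Base Color"):
    """Return a rewired copy of a material's link table in which whatever
    fed the given BSDF input now feeds an Emission shader that is
    connected to the material output, so baking 'Emit' captures it.
    Mirrors the manual node surgery described above."""
    src = links.get(channel)  # e.g. "ImageTex.Color"
    new = dict(links)         # don't mutate the original material
    new["Emission.Color"] = src
    new["Output.Surface"] = "Emission.Emission"
    return new

# Toy material: an image texture driving the Principled BSDF's base colour.
links = {"Principled.Base Color": "ImageTex.Color",
         "Output.Surface": "Principled.BSDF"}
baked = emission_bake_links(links)
# baked routes ImageTex.Color -> Emission -> Output instead of the BSDF.
```

Running such a rewiring over every material before a bake pass, then restoring the originals, automates the repetitive part of the workaround, though a built-in base colour/metallic/roughness bake type would obviously be far nicer.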

Having an option to bake base colour, specular, metallic and roughness from the baking menu would be a huge time saver while a larger overhaul of the baking system is still in the design phase.

I would like to voice my support for putting baking in a node or several nodes in the shader editor. In my mind this would generalize to other workflows naturally by using node groups, which are already a very powerful way of coordinating behaviour across materials. And materials are already a good way to organize groups of textures onto an object. I often work with many objects in a scene with different materials, each of which would benefit from baked lighting passed into the shader. This baking-as-shader-nodes proposal would also perhaps nicely solve the problem of extremely complex procedural shader baking: a sort of "cache" node that renders all its inputs in a particular UV layout and can then be sampled as a texture (this would be especially nice for volume textures; "nebula shaders" can get very slow!). Baking could be set per node to happen on every frame, on every animation render, or only when manually triggered.
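The per-node trigger could be as simple as a mode setting checked before the node's texture is sampled. A toy sketch of that decision, with entirely hypothetical mode names:

```python
def needs_rebake(mode, is_new_frame, is_final_render, manual_flag):
    """Decide whether a hypothetical 'cache' shader node should re-render
    its inputs into its texture before being sampled."""
    if mode == 'EVERY_FRAME':
        return is_new_frame       # viewport/playback frame changes trigger a bake
    if mode == 'ON_RENDER':
        return is_final_render    # only re-bake when a final render starts
    if mode == 'MANUAL':
        return manual_flag        # set by a "Bake Now" button, cleared afterwards
    raise ValueError("unknown bake trigger mode: %r" % mode)

# e.g. a viewport frame change does nothing for an 'ON_RENDER' cache node:
result = needs_rebake('ON_RENDER', is_new_frame=True,
                      is_final_render=False, manual_flag=False)
# result == False
```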

If a material is shared by multiple material slots or multiple objects, perhaps different settings could be stored and accessed based on which slot/object is currently selected? I understand this would somewhat break the idea of material datablocks being separate from objects but perhaps it's the most intuitive/efficient way to work?

Obviously this node would have settings for where to store the texture: it could be packed into the .blend for the render-optimization use case, or written to a file on disk for working with larger pipelines.

I think realistically the only way you could do better than this is with a full bake/render queue editor, like the Render Queue window in Reaper. However, that seems more related to render-farm/server scaling for very large workloads, outside the scope of this patch.

Please don't do "thread bumping" on tasks and code reviews. Comments are for constructive feedback on the task, not requests.

Sorry, Brecht, for my small comment/request, but I can't find anything here about an alpha check. Here is my quite old suggestion on RCS. Current baking does the alpha trace check only for geometry, not for texture/material instructions. It would be amazing to make it work with material alpha.