Differential D1721
Implementation of an OpenVDB based cache system for smoke simulations. Authored by Kévin Dietrich (kevindietrich) on Jan 9 2016, 3:04 PM.
Details
This is the cache part of the openvdb branch. Documentation for the cache system is available here: http://wiki.blender.org/index.php/User:Kevindietrich/OpenVDBSmokeExport.
Event Timeline

Comment Actions A few quick comments.
I wouldn't bother with this, I've never seen VDB files used that way and I don't know any renderers that support such files.
I think it would be better to give a suffix to the low resolution fields instead of the high resolution fields. Generally for rendering you just want to render the highest resolution in the file, and it would be inconvenient to have to use different grid names depending on whether high resolution is used or not.
As far as I know Blosc is the much better compression for OpenVDB because of its decompression speed, and also the default in Houdini. So I think Blosc should be the default.
Maybe use Blender/Smoke? I see Houdini uses a value like Houdini/SOP_VdbAtomWriter, which makes sense as SOP_VdbAtomWriter is the name of a node / SOP that users can choose to use. But OpenVDBWriter in Blender is not a concept that means anything to users.
Comment Actions I haven't analysed the code in depth, but could you explain why this duplicates so much point cache code? I was expecting VDB to be just an alternative file format, which I wouldn't expect to require its own baking operators, caching logic or depsgraph hooks.
Comment Actions I remember being asked if this was feasible; it is, but it's quite impractical indeed. I think I'll remove this line.
Can do.
See my inline comment.
That sounds a bit better, and more descriptive.
While @Lukas Tönne (lukastoenne) was working on Alembic during Gooseberry, he gave me the idea of making an OpenVDB exporter for smoke sims, since OpenVDB would be more suited to this than Alembic. After that, people wanted it to have its own cache system next to the point cache, without reusing the point cache code, so that the point cache could eventually be removed. That followed the idea that Alembic would replace the current point cache system for everything else (cloth, hair...). So I turned my little exporter into a little cache system. Unless I misunderstood what they (the Gooseberry team, and other developers) wanted/expected. The depsgraph hooks are apparently required to invalidate the cache when you modify a flow/collision object while the simulation is running and cached on the fly (before a final bake). I thought I could get away without those.
Comment Actions The point cache system was designed to be a unified physics caching and baking system: to avoid duplicating code, to make the UI and behavior consistent, and to make it possible to bake all physics systems in one go so that they can all interact properly. I wasn't involved with the Alembic point cache design or code, so I don't know the rationale, but I would hope the eventual goal is still to keep physics caching and baking unified, even if some systems use different file formats. If the idea was to eventually have two separate Alembic and OpenVDB systems, then that seems wrong to me. I feel a bit bad coming late into this, but reading T37578: Point Cache replacement based on Alembic and the Alembic branch code, I still don't see a good reason to have a separate system even as an intermediate step. To me it seems we'd end up with duplicated code which will only be more work to unify again later.

Comment Actions Yes. To give you some context: if I recall correctly, Lukas' goal was to make a unified cache library where various file types would be used. It's only when I asked about volumes and Alembic that he got me working on OpenVDB caching (I had already worked on an OpenVDB patch which didn't make it in (D1008)). To say the least, a couple of functions in this patch were copied/adapted from his branch. Since this cache library work is pretty much on hold, I just merged my work with the point cache, which will make the patch smaller and code review faster ;) (I guess I could have done that before.)

There's one issue now: the point cache does not read .vdb files. For some reason PointCache.cached_frames is set to zero after a file was written. By commenting out some code I managed to get the point cache to read .vdb files, so it works, but I don't know where/why it fails to register written .vdb files. I'll upload a new patch shortly for people to have a look.
Also, there's another issue (an old one) which I kept for visual debugging: the smoke read from vdb files is darker. That's an easy fix, but I keep it to tell where the cache is read from.
Comment Actions I didn't think you'd be able to re-unify the code so quickly, nice :) More comments from reading through the code.
Comment Actions Some build warnings to fix, no actual bugs in there; for the last one, using (T)TOLERANCE should be fine.

intern/openvdb/intern/openvdb_reader.h:31:1: warning: 'OpenVDBReader' defined as a class here but previously declared as a struct [-Wmismatched-tags]
intern/openvdb/intern/openvdb_writer.h:31:1: warning: 'OpenVDBWriter' defined as a class here but previously declared as a struct [-Wmismatched-tags]
intern/openvdb/openvdb_capi.cc:80:44: warning: struct 'OpenVDBWriter' was previously declared as a class [-Wmismatched-tags]
intern/openvdb/openvdb_capi.cc:123:30: warning: struct 'OpenVDBReader' was previously declared as a class [-Wmismatched-tags]
intern/openvdb/intern/openvdb_dense_convert.h:62:49: warning: implicit conversion from 'float' to 'typename Tree<RootNode<InternalNode<InternalNode<LeafNode<int, 3>, 4>, 5> > >::ValueType' (aka 'int') changes value from 0.001 to 0 [-Wliteral-conversion]
tools::copyFromDense(dense_grid, grid->tree(), TOLERANCE);
~~~~~ ^~~~~~~~~

Actually, testing this patch with the default Quick Smoke gives me some visual artifacts; for example, frame 3 looks like this. Other than these dark patches the shape of the fire and smoke looks about correct to me. If you're not seeing those artifacts I can try to track down the cause. It might be a mistake in my OpenVDB library build, or a bug that only happens on OS X, or something else in my environment.
Comment Actions Those compile warnings about class/struct are a bit nitpicky :) But that's an easy fix. The dark patches that you see are most likely the drawing issue I mentioned in one of my previous posts. I suspect the issue comes from the clipping of the shadow grid.
Comment Actions I went for lower case names, to be consistent with other .vdb files. Does caching work for you, though? I ask because the only way for you to notice the drawing issue is if the vdb file is read, but here it does not read them unless I comment out these two lines: https://developer.blender.org/diffusion/B/browse/master/source/blender/blenkernel/intern/pointcache.c;e2715b129c1e62c8076ab4a212634b2a99cd8bc3$2696-2697 I tried to figure out why there is this issue, but haven't found the cause yet.

Comment Actions It seems the caching worked in a file with an existing smoke sim, but trying now in a new file I got an openvdb::v3_2_0::ArithmeticError exception and crash. It seems obmat is entirely zero; commenting out these lines makes it work for me:

//mul_m4_m4m4(sds->fluidmat, sds->obmat, sds->fluidmat);
//mul_m4_m4m4(sds->fluidmat_wt, sds->obmat, sds->fluidmat_wt);

The drawing issue appears to have been fixed, though I can't tell which change did it.

Comment Actions I think obmat is zero because it needs to be read from the cache. And I just fixed the caching issue: some code was relying on the extension being 5 characters ("bphys"). (The drawing issue was fixed by not clipping the shadow grid (NULL parameter in OpenVDB_export_grid_fl(..., "shadow", ...)). We clip the grids based on the density grid to optimize file size.)

Comment Actions Ok, the patch seems to be in pretty good shape now. From my side the two remaining questions are regarding using the object matrix for the grid transform, and ensuring we handle all exceptions in case of I/O failures.
Comment Actions For the object matrix, I'm 100% sure I added it for rendering, but I don't remember exactly why or what I was trying to fix. So I'm just going to remove it from the volume matrix computation, and see what's up when I go back to the Cycles implementation. I do feel bad about having it here, actually. For I/O failures we have a few cases:
Comment Actions I just noticed that in the Quick Smoke setup the domain object is scaled, and that the smoke domain voxels are scaled to stay cube shaped in world space. So from that I think actually the scaling should be included in the grid transform? Otherwise it will look squashed when the VDB is loaded in another app? But to me it still makes sense to have the VDB contain object space data, and let Cycles use the object matrix to transform things into world space, just like meshes. That way you could instance the smoke without any special exceptions I think, which seems more messy if (part of) the object matrix is baked into the VDB file. I'm not immediately sure about the right solution, need to think about it more.
An error print in the console is probably enough in that case, telling people that they should have compiled with Blosc.

Comment Actions Now I think it was partly to get uniform voxels, as for some obscure reason the openvdb::tools::VolumeRayIntersector only works with uniform voxels. But even in that case you can still get non-uniform voxels, depending on the object's dimensions with respect to the domain's resolution. Maybe we could just multiply the voxel size by the object matrix, and leave the translation untouched.
In case you don't know, VDB files do support instancing (by simply storing the instances' transforms alongside the original grid). However, I don't know how that would fit into Cycles' and Blender's instancing paradigms, so to speak. Another thing worth mentioning is that (most) exported vdb files also have a per-grid is_local_space metadata. (All of the non-Blender vdb files that I have here have that metadata set to false.)
Comment Actions Let's keep the object matrix in the grid transform for now then. Later on we can always decide to change the behavior, perhaps using a new option to choose between local/global transforms. One more thing before this is committed: it's best to leave OpenVDB OFF in CMakeLists.txt and blender_full.cmake until the OpenVDB libraries are available for all platforms, to avoid breaking the build. As far as I'm concerned this looks ready to commit.

Comment Actions Went over this patch and made various minor edits.