Bug: YES
Repeatable: YES
Example file available: YES
OS/platform dependant: NO
Setup:
Blender 2.55 r330091 (bug is older)
Gentoo Linux AMD64 last update 16.11.2010 5:02
NVidia GT 430 / nvidia-drivers 260.19.21
ASUS Crosshair Formula IV
AMD Phenom(tm) II X6 1090T Processor (not overclocked)
Description:
All points of focus of the depth of field are currently calculated to lie on a plane. This plane is parallel to the plane of the (imaginary) "photographic film" of the camera in use; its distance is given by the DOF settings of the camera.
This leads to unnatural renderings whenever something with a large horizontal or vertical extent is rendered.
The correct calculation would instead put all points of focus onto a sphere around the camera whose radius is identical to the distance from the plane to the camera.
This would lead to a more realistic rendering.
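To make the difference concrete, here is a small, purely illustrative Python sketch (not Blender code; the focus distance of 10 and the sample points are made-up values) comparing the two focus models: the current one, where only the depth along the camera's view axis counts, and the proposed one, where the full Euclidean distance to the camera counts.

import math

FOCUS_DISTANCE = 10.0  # assumed DOF distance set on the camera

def planar_defocus(point):
    # Focus on a plane parallel to the film: only the depth along
    # the view axis (z) matters.
    x, y, z = point
    return abs(z - FOCUS_DISTANCE)

def spherical_defocus(point):
    # Focus on a sphere of radius FOCUS_DISTANCE around the camera:
    # the full Euclidean distance matters.
    x, y, z = point
    return abs(math.sqrt(x * x + y * y + z * z) - FOCUS_DISTANCE)

on_axis = (0.0, 0.0, 10.0)   # straight ahead at the focus distance
off_axis = (8.0, 0.0, 6.0)   # same Euclidean distance, but far off axis

for p in (on_axis, off_axis):
    print(p, "planar:", planar_defocus(p), "spherical:", spherical_defocus(p))

The off-axis point sits exactly on the sphere, yet the planar model blurs it by 4 units, which is exactly what the example file shows for the green lines.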
Example file:
The neon green lines are part of the sphere mentioned above and the neon red
lines are part of the plane.
After rendering has finished, you can see that the lines of the sphere are blurred (which, as explained above, should not happen) while the red lines are not.
Hi Meino,
The implementation of the DOF node is a post-process trick that tries to approximate real DOF, with a lot of limitations by design. Moreover, the code was provided to us by a developer who does not maintain it, and nobody on the team understands it well or maintains it either.
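For context, a post-process defocus of this general kind derives a per-pixel blur radius from a stored depth value compared against the focus distance, roughly along the lines of the hypothetical sketch below (illustration only, not the node's actual code). If that depth value is the depth along the view axis (a plain Z buffer), the in-focus locus is automatically a plane, which matches the behaviour reported above.

def blur_radius(depth, focus_distance, strength=1.0, max_radius=8):
    # Circle-of-confusion radius (in pixels) for one sample.
    return min(max_radius, int(round(strength * abs(depth - focus_distance))))

def defocus(image, depth_map, focus_distance):
    # image and depth_map: 2D lists of floats with identical dimensions.
    # Each pixel is averaged over a square neighbourhood whose radius
    # grows with its distance from focus -- a simple variable box blur.
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            r = blur_radius(depth_map[y][x], focus_distance)
            total, count = 0.0, 0
            for dy in range(-r, r + 1):
                for dx in range(-r, r + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        total += image[yy][xx]
                        count += 1
            out[y][x] = total / count
    return out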
Bugs are only accepted as bugs here when code behaves wrongly, as in crashing or not providing functionality at the level the developer meant to provide. User expectations or improvements are not bugs, and *for sure* we would love to see a better DOF implementation some day. This has been noted on our todo as well.
The choice basically is: remove the code, or live with it... such cases are always very hard for open source projects to deal with.
FYI: our todo wiki:
http://wiki.blender.org/index.php/Dev:2.5/Source/Development/Todo/Tools#Compositing