User:Jehuty/GSoC 2013/Proposal
Texturing for Volume Rendering in Cycles
Initial Proposal
Synopsis
The following features will be developed as part of this project:
i) Creation of a Voxel Data texture for Cycles, analogous to the existing one in Blender Internal, to allow the rendering of participating media, such as fog and smoke, and other volumetric effects. Volume shading will be developed independently of texturing.
ii) Integration of the OpenVDB library to represent the sparse volumes defined by voxel data; voxel information from Smoke simulations, for example, will be exported to an OpenVDB grid, which will be the representation used when rendering.
Benefits to Blender
Volume Rendering in Cycles is one of the features most highly anticipated by the user community. While full Volume Rendering requires both shading and texturing functionality to be available, a Voxel Data texture is useful on its own for visualisation, since surfaces can also use 3D textures (for instance, importing voxel information from 2D slice images of 3D sampled data and projecting it onto a stack of planes). With Volume Rendering fully implemented, users will be able to render Smoke simulations, as well as other participating media, with Cycles. Additionally, the integration of OpenVDB is expected to make the Smoke rendering process more intuitive and to bring performance improvements, such as a smaller memory footprint during rendering.
Deliverables
- Voxel Data texture implementation in Cycles and accompanying nodes for operations on volumes.
- OpenVDB integration, with support for representing all voxel data sources of the Voxel Data texture in its grid structure. The supported sources will be the same as those currently supported by Blender Internal: Smoke, Image Slices and Raw formats.
- Empty Space Traversal Optimization applied to the rendering of sparse volumes, stored as an OpenVDB grid.
- [Tentative] Motion Blur support for the Voxel Data texture node.
- Documentation, with implementation details and usage guides.
Project Details
1. Voxel Data texture implementation
Mapping color onto the volumetric entities being rendered requires a 3D texture, since we are following the same strategy used by Blender Internal for Volume Rendering. Implementing such a 3D texture is the goal here: it will take a source of voxel data, which can be voxel output from Smoke simulations, external raw binary data with voxel information, or 2D image slices of 3D sampled data, such as those obtained from CT scans. This volume data will be stored internally as an OpenVDB grid, addressed in (2), and queried for values during rendering. Interpolation methods will be made available; exactly which methods will be implemented is still an open question. At least one new node, corresponding to the Voxel Data texture, will be added to the Node system, and others with additional options may follow.
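To illustrate the kind of lookup a Voxel Data texture performs, here is a minimal, self-contained C++ sketch of trilinear interpolation over a dense density array. This is not Cycles code; the VoxelGrid structure and sample_trilinear function are hypothetical stand-ins for whatever storage and filtering the final implementation uses (which, per (2), will be an OpenVDB grid).

```cpp
// Minimal sketch (not actual Cycles code): trilinear interpolation of a density
// value from a dense voxel grid, given texture coordinates in [0, 1].
#include <algorithm>
#include <cmath>
#include <vector>

struct VoxelGrid {
    int nx, ny, nz;               /* grid resolution */
    std::vector<float> density;   /* nx * ny * nz density samples */

    float at(int x, int y, int z) const {
        x = std::max(0, std::min(x, nx - 1));
        y = std::max(0, std::min(y, ny - 1));
        z = std::max(0, std::min(z, nz - 1));
        return density[(z * ny + y) * nx + x];
    }
};

/* Sample the grid at normalized coordinates (u, v, w) with trilinear filtering. */
float sample_trilinear(const VoxelGrid &grid, float u, float v, float w)
{
    float fx = u * grid.nx - 0.5f;
    float fy = v * grid.ny - 0.5f;
    float fz = w * grid.nz - 0.5f;

    int x0 = (int)std::floor(fx), y0 = (int)std::floor(fy), z0 = (int)std::floor(fz);
    float tx = fx - x0, ty = fy - y0, tz = fz - z0;

    /* Interpolate along x, then y, then z. */
    float c00 = grid.at(x0, y0,     z0    ) * (1 - tx) + grid.at(x0 + 1, y0,     z0    ) * tx;
    float c10 = grid.at(x0, y0 + 1, z0    ) * (1 - tx) + grid.at(x0 + 1, y0 + 1, z0    ) * tx;
    float c01 = grid.at(x0, y0,     z0 + 1) * (1 - tx) + grid.at(x0 + 1, y0,     z0 + 1) * tx;
    float c11 = grid.at(x0, y0 + 1, z0 + 1) * (1 - tx) + grid.at(x0 + 1, y0 + 1, z0 + 1) * tx;

    float c0 = c00 * (1 - ty) + c10 * ty;
    float c1 = c01 * (1 - ty) + c11 * ty;
    return c0 * (1 - tz) + c1 * tz;
}
```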
2. OpenVDB integration
The OpenVDB grid will be the data structure used to store the volume data internally. It has significant advantages over other representations, namely the efficient storage of sparse volumes: empty voxels incur only a marginal storage overhead, which makes the grid effectively a boundless representation and, in theory, lets us do away with the domain object bounding the volume. Moreover, OpenVDB supports disk caching, keeping only the grid topology in memory at all times, making the rendering of massive data sets viable.
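As a rough sketch of how such a grid could be built and queried with the existing OpenVDB API, the following assumes a hypothetical smoke_density() callback standing in for Blender's smoke simulation output; only non-empty voxels are written, so the sparsity of the source data is preserved.

```cpp
// Illustrative sketch only: converting a dense block of smoke density values
// into a sparse OpenVDB grid and sampling it in world space.
#include <openvdb/openvdb.h>
#include <openvdb/tools/Interpolation.h>

float smoke_density(int x, int y, int z); /* hypothetical voxel data source */

openvdb::FloatGrid::Ptr build_density_grid(int nx, int ny, int nz, double voxel_size)
{
    openvdb::initialize();

    /* Background value 0.0f: voxels that are never written stay implicit. */
    openvdb::FloatGrid::Ptr grid = openvdb::FloatGrid::create(0.0f);
    grid->setName("density");
    grid->setTransform(openvdb::math::Transform::createLinearTransform(voxel_size));

    openvdb::FloatGrid::Accessor acc = grid->getAccessor();
    for (int z = 0; z < nz; ++z)
        for (int y = 0; y < ny; ++y)
            for (int x = 0; x < nx; ++x) {
                float d = smoke_density(x, y, z);
                if (d > 0.0f) /* only populate non-empty voxels */
                    acc.setValue(openvdb::Coord(x, y, z), d);
            }
    return grid;
}

/* Query the grid at a world-space position with trilinear (box) filtering. */
float sample_density(const openvdb::FloatGrid &grid, const openvdb::Vec3R &p_world)
{
    openvdb::tools::GridSampler<openvdb::FloatGrid, openvdb::tools::BoxSampler>
        sampler(grid);
    return sampler.wsSample(p_world);
}
```

Because the background value is zero, only the occupied regions of the smoke domain contribute to the memory footprint, which is what makes the boundless representation practical.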
3. Empty Space Optimization
Strategies to optimize the integration step over empty areas of the volume will be investigated, and one solution will be chosen for development. One possibility is checking which tiles in the OpenVDB grid are populated and which can be skipped; the integration step can then be carried out over the disjoint intervals obtained. At first glance, the resulting integration intervals would not be as tight as they could be, but there is likely a trade-off between finding tighter bounds and evaluating intervals quickly. This remains to be investigated further.
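One hypothetical way this could look, using iterators that OpenVDB already provides: collect the index-space bounding boxes of all active leaf nodes, and let the ray marcher integrate only over the intervals where the ray overlaps those boxes. This is only a sketch of the idea; active tiles stored at higher levels of the tree would need separate handling, and the final implementation may use a different traversal entirely.

```cpp
// Sketch: gather the bounding boxes of populated (active) leaf nodes so that
// empty space between them can be skipped during ray marching.
#include <openvdb/openvdb.h>
#include <vector>

std::vector<openvdb::CoordBBox> active_leaf_boxes(const openvdb::FloatGrid &grid)
{
    typedef openvdb::FloatGrid::TreeType TreeType;
    const int dim = TreeType::LeafNodeType::DIM; /* leaf nodes span dim^3 voxels (8^3 by default) */

    std::vector<openvdb::CoordBBox> boxes;
    for (TreeType::LeafCIter leaf = grid.tree().cbeginLeaf(); leaf; ++leaf) {
        const openvdb::Coord org = leaf->origin();
        boxes.push_back(openvdb::CoordBBox(org, org.offsetBy(dim - 1)));
    }
    return boxes;
}
```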
4. Motion Blur support
If time allows, I would like to implement Motion Blur support for the Voxel Data texture node. Although there might be alternative approaches, a simple one is to smear voxel values along their motion over the duration of a frame, distributing density values across the voxels affected by the moving samples.
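A rough illustration of that smearing idea follows; the DenseVolume structure, its per-voxel velocity field (in voxels per frame), and the fixed number of sub-steps are assumptions made for the example, not a description of how Blender stores smoke data.

```cpp
// Sketch of motion-blur smearing: each source voxel's density is split into
// equal fractions and splatted along its velocity over the shutter interval.
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

struct DenseVolume {
    int nx, ny, nz;
    std::vector<float> density;   /* nx * ny * nz density samples */
    std::vector<Vec3> velocity;   /* per-voxel velocity, in voxels per frame */

    int index(int x, int y, int z) const { return (z * ny + y) * nx + x; }
    bool inside(int x, int y, int z) const {
        return x >= 0 && x < nx && y >= 0 && y < ny && z >= 0 && z < nz;
    }
};

/* Smear src into dst over `steps` sub-steps of the frame. */
void smear_motion_blur(const DenseVolume &src, DenseVolume &dst, int steps)
{
    dst.nx = src.nx; dst.ny = src.ny; dst.nz = src.nz;
    dst.density.assign(src.density.size(), 0.0f);

    for (int z = 0; z < src.nz; ++z)
        for (int y = 0; y < src.ny; ++y)
            for (int x = 0; x < src.nx; ++x) {
                const float d = src.density[src.index(x, y, z)];
                if (d == 0.0f)
                    continue;
                const Vec3 v = src.velocity[src.index(x, y, z)];
                for (int s = 0; s < steps; ++s) {
                    /* Position of this sample at sub-step s within the frame. */
                    const float t = (s + 0.5f) / steps;
                    const int xi = (int)std::floor(x + v.x * t + 0.5f);
                    const int yi = (int)std::floor(y + v.y * t + 0.5f);
                    const int zi = (int)std::floor(z + v.z * t + 0.5f);
                    if (dst.inside(xi, yi, zi))
                        dst.density[dst.index(xi, yi, zi)] += d / steps;
                }
            }
}
```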
5. Documentation
Writing the necessary wiki entries documenting the code, as well as user-level instructions on how to use the new features. A separate wiki page tracking the ongoing development of the project will also be created.
Bio
I'm a Computer Engineering undergraduate at the Federal University of São Carlos (UFSCar) in Brazil, and I'll graduate this year.