User:Joeedh/GoogleSoC2007 Zbuf

Note
This is only a working draft. I only have this posted here because I'm going to be out of town for spring break and I need this online to work on it.


Proposal to Upgrade Blender's Zbuffer With Streaming and Micropoly Displacement

Introduction

Micropoly displacement is the technique of rendering fully displaced geometry, where the tessellation of the geometry is so fine that the user cannot see the individual faces.

Streaming Zbuffer is simply the technique of not keeping faces in memory once they have been rendered into the Zbuffer. This avoids excessive memory overhead from many linked duplicate objects, or from geometry such as hair strands.
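
The idea above can be sketched in C. This is a minimal, illustrative model, not Blender's actual render code: the buffer keeps only depth values, and each object's geometry is freed immediately after it has been rasterized, so only one object is resident at a time. All names (ZBuffer, Mesh, zbuf_write, render_streaming) are hypothetical.

```c
#include <assert.h>
#include <float.h>
#include <stdlib.h>

/* Hypothetical minimal z-buffer; names are illustrative, not Blender's. */
typedef struct { int w, h; float *depth; } ZBuffer;

static ZBuffer *zbuf_new(int w, int h) {
    ZBuffer *zb = malloc(sizeof *zb);
    zb->w = w; zb->h = h;
    zb->depth = malloc(sizeof(float) * w * h);
    for (int i = 0; i < w * h; i++) zb->depth[i] = FLT_MAX;
    return zb;
}

/* Write one sample only if it is closer than what is already stored. */
static void zbuf_write(ZBuffer *zb, int x, int y, float z) {
    float *d = &zb->depth[y * zb->w + x];
    if (z < *d) *d = z;
}

typedef struct { int nfaces; float *verts; } Mesh;

/* Streaming: rasterize each object, then free its geometry at once,
 * so the full scene never has to fit in memory simultaneously. */
static void render_streaming(ZBuffer *zb, Mesh *(*next_mesh)(void)) {
    Mesh *m;
    while ((m = next_mesh()) != NULL) {
        /* rasterize_mesh(zb, m);  -- omitted for brevity */
        free(m->verts);            /* geometry is not kept after rasterization */
        free(m);
    }
}
```

The key point is that `zbuf_write` needs only the final depth values, so once a mesh's faces have passed through it, the mesh itself can be discarded.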

Limitations

The main limitation of these changes is that they will not work with ray tracing. Interfacing ray tracing with micropoly displacement requires a complicated caching scheme, which is out of the scope of this project.

User Impact

Blender users will benefit greatly from the ability to have true, good-looking, and accurate displacement. In scanline rendering mode, standard bump mapping techniques (which modify the shading of existing surfaces to appear bumpy, but do not perform any actual displacement) become unnecessary, avoiding all the issues inherent in bump mapping.

Proper bump mapping is not easy; and even with advanced techniques such as parallax mapping, the "displacements" remain a shading artifact, causing accuracy problems and leaving the object's outline intact.

Micropoly displacement mapping solves these problems by actually creating displaced geometry, subdividing faces until each face is smaller than the size of a pixel. This allows for true, correct displacement of surfaces, and also allows for correctly smooth subdivision surfaces.

Concrete Goals

The first goal would be for Blender's zbuffer to embed texture information in the buffer, rather than computing it on the fly.
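
One plausible shape for such a buffer is shown below. This is a hypothetical sketch, and the field choice is an assumption about what "texture info" would include: each pixel sample carries the data needed to shade it in a later pass, instead of the rasterizer invoking shading directly.

```c
#include <assert.h>

/* Hypothetical per-pixel sample carrying deferred shading inputs.
 * The fields are illustrative, not Blender's actual buffer layout. */
typedef struct {
    float z;          /* depth */
    float u, v;       /* texture coordinates at the hit point */
    float nor[3];     /* interpolated surface normal */
    int   mat_index;  /* which material to shade with later */
} ZSample;

/* Store a sample if it wins the depth test; shading then happens in a
 * separate pass over the buffer, after all geometry has been streamed
 * through. Returns 1 if the sample was stored. */
static int zsample_store(ZSample *dst, const ZSample *src) {
    if (src->z < dst->z) { *dst = *src; return 1; }
    return 0;
}
```

Decoupling rasterization from shading this way is also what makes the streaming goal below workable: geometry can be freed once its samples are in the buffer.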

The next goal would be to build a system that streams objects into the buffer, freeing geometry as it is rasterized. This would of course only happen in scanline mode.

The third goal would be to implement a simple REYES system for rendering displaced simple (i.e. faceted) and Catmull-Clark (i.e. smooth) surfaces.
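
The core of a REYES pipeline is the bound-split-dice loop, sketched below under stated assumptions: a patch is bounded in screen space; if it is small enough it is diced into micropolygons, otherwise it is split into four sub-patches and the test recurses. All names and the dicing threshold are illustrative.

```c
#include <assert.h>

enum { MAX_DICE_PIXELS = 16 };  /* dice when a patch spans <= this many pixels */

/* Screen-space bounding box of a surface patch (simple or Catmull-Clark). */
typedef struct { float px_width, px_height; } Patch;

static int count_diced = 0, count_split = 0;

static void split_or_dice(Patch p, int depth) {
    if (depth > 32) return;  /* safety cap against runaway recursion */
    if (p.px_width <= MAX_DICE_PIXELS && p.px_height <= MAX_DICE_PIXELS) {
        count_diced++;       /* dice_patch(p): tessellate into micropolygons */
    } else {
        count_split++;
        Patch q = { p.px_width * 0.5f, p.px_height * 0.5f };
        for (int i = 0; i < 4; i++)
            split_or_dice(q, depth + 1);  /* recurse into four sub-patches */
    }
}
```

For instance, a 32x32-pixel patch would be split once into four 16x16 sub-patches, each of which is then small enough to dice. The same loop serves both faceted and Catmull-Clark surfaces; only the bounding and dicing steps differ per surface type.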