Particle System Design Thoughts
On this page I try to summarize and structure my thoughts on particle system design. After getting some insight into the particle code internals while coding my fire simulation, I found that the basic design has some major flaws in terms of extensibility and does not allow very customized particle behaviour. With every new feature these problems get worse, because everything is implemented in one central location (2 or 3 files).
It is already planned to make the particle system more adaptable through node editors (similar to compositing and materials), but imho it cannot hurt to have several people think this through. I start here with very basic ideas on why particles are used at all, in order to avoid any preconceptions. Then I will take a look at how other 3D packages work in this area and whether and how their concepts could be implemented in Blender's own terms. Last but not least, I will try to concretize the design with some code sketches and skeletons.
Motivation to use particle systems
Particle systems are a special way of handling geometric, physical and other data. As opposed to meshes, curves and other graphical objects used in 3D, they are usually very simple, representing only a small amount of data per particle, usually a position, velocity and the like.
More importantly, a system of particles, i.e. a collection of a large number of particles, can easily be set up by artists without the need to specify the properties and behaviour of individual particles. Emitters define some kind of (statistical) initialisation and effectors define behaviour, while treating each particle in the same way.
A particle system uses particles of the same type, which means that the data types making up each particle are the same for all particles in that system. This means that particles can be stored in memory in a contiguous buffer and easily cached to and restored from disk in an efficient manner. In addition, the behaviour of particles is defined on the particle system level instead of each particle having its own set of behavioural effectors. This means that particle effects can also be calculated very efficiently, allowing a large number of particles to be simulated at once.
Compared to this, a classic object or mesh has a lot of internal data (vertices, edges, faces, ...) and per-instance behaviour (e.g. Blender's modifiers). This makes it virtually impossible to store and handle larger amounts of general objects efficiently.
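To make the contrast concrete, here is a minimal C sketch of the classic fixed-layout approach described above. All names (Particle, ParticleSystem, integrate_positions) are hypothetical and not actual Blender code:

```c
#include <stdlib.h>

/* Every particle in a system has the same layout, so the whole system is
 * one contiguous array that can be iterated, cached and written to disk
 * efficiently. */
typedef struct Particle {
    float loc[3];
    float vel[3];
    float life;       /* remaining lifetime */
} Particle;

typedef struct ParticleSystem {
    Particle *particles;  /* one contiguous buffer for all particles */
    int totpart;
} ParticleSystem;

/* System-level behaviour: one function treats every particle the same way. */
static void integrate_positions(ParticleSystem *psys, float dt)
{
    for (int i = 0; i < psys->totpart; i++) {
        Particle *pa = &psys->particles[i];
        pa->loc[0] += pa->vel[0] * dt;
        pa->loc[1] += pa->vel[1] * dt;
        pa->loc[2] += pa->vel[2] * dt;
    }
}
```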
Basic Concepts
Relation of particles to other features
While particles as a separate feature can already produce quite interesting effects (e.g. fireworks), their real power is revealed only when they are made to interact with other objects and features in a 3D environment.
The simplest interactions handle all particles in a system in the same way, regardless of the current state of their data. This includes global force fields, such as gravity or the simplest wind fields. Naturally, the use of such effects is pretty limited.
The next step in complexity are effects which take a (single) particle's state into account. These can be related to the particle's position (e.g. many effectors have range limits) or velocity (air friction, magnetic fields) or any other data element belonging to an individual particle. Notice that most of these effects are also based on the settings of a non-particle object, i.e. a position vector and object orientation, an arbitrary surface, a volume or something even more complex, depending on the type of effect.
A special case of particle effects are the so-called "emitters". These are objects which trigger the creation of new particles for a particle system. A typical setup would emit particles from an object's surface or volume in a random fashion over time, using a constant or animated rate of emission. There are however much more complex cases that can be thought of, like triggering the emission of particles based on external effects (pyrite only releases sparks when struck with flint stone) or other particles' states (e.g. colliding particles triggering secondary particles, http://en.wikipedia.org/wiki/Photoelectric_effect). So in general, an emitter needs some kind of external impulse (a "trigger") in order to emit a particle, which in the simple first case mentioned above would be just a random timer.
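A hedged sketch of the trigger idea, building on the ParticleSystem sketch above (all names, including the random_interval helper, are hypothetical): the emitter itself only reacts to impulses, and the random timer is just one possible trigger source.

```c
float random_interval(void);  /* hypothetical helper: time to next emission */

typedef struct Emitter {
    /* Called when an external impulse (timer, collision, another
     * particle's state, ...) fires; creates one or more new particles. */
    void (*emit)(struct Emitter *em, struct ParticleSystem *psys);
    void *trigger_data;  /* e.g. timer state or a collision record */
} Emitter;

/* Simplest case: a random timer decides when to pull the trigger. */
void step_timer_trigger(Emitter *em, struct ParticleSystem *psys, float dt)
{
    float *time_left = em->trigger_data;
    *time_left -= dt;
    if (*time_left <= 0.0f) {
        em->emit(em, psys);
        *time_left = random_interval();
    }
}
```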
Going one step further, there are inter-particle effects. These take not just a single particle into account, but several. In physics, the number of "particles" (be it molecules, electrons or whatever) in elementary reactions is usually restricted to 2, since (nearly) all forces acting on particles arise from such 1-on-1 particle interactions. However, in 3D simulation most effects are not "elementary", but actually use some kind of statistics. Often one will want to calculate an average effect over a number of particles in close proximity (local density, temperature) or reduce the number of necessary calculations by looking only at surrounding particles.
For the design, however, it doesn't really make much of a difference whether one or a larger number of particles is considered. What counts is that these effects only influence the particles themselves, instead of having an influence on the rest of the 3D environment.
This brings us to the last step of complexity: effectors using particles to influence other objects. Many interesting examples can be brought up in this category, and the number of visual and physical effects that can be achieved is unlimited:
- snow falling on a roof can pile up, increase weight and even make the roof collapse
- rain makes surfaces wet and slippery
- hot gas molecules in a container collide with the boundary surface, increasing pressure (eventually leading to explosion)
- etc.
This flexibility in possible reactions comes at a price, however: it becomes possible to introduce cycles into the dependency graph (an effector influencing particles, influencing the effector, ad infinitum), which has to be taken care of.
Particle types and rendering
The simplest type of particle is just a dot. This means it has a location in space, which is depicted as a single pixel when rendering (actually, the most generic type of particle shouldn't even need a location, more on that later). Even then, however, there are multiple ways of rendering the particle location: it could be displayed using several pixels or a fixed-size circle. With an added size property, a halo or a billboard facing the camera can be rendered. When adding an orientation, an object can be instantiated at that location.
Advanced particle types can be hair strands or flame filaments, which contain a set of relative locations forming a curved line and use specialized rendering methods. Other particles may include physical data, like temperature or electrical charge, which are not directly depicted by a renderer, but necessary for the attached effectors to do their simulation.
In conclusion, the type of particle is determined by the types of data it consists of. The availability of data also determines which kind of rendering methods can be used for a particle system and what kind of effectors can be attached to it.
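This suggests describing a particle type as a set of data channels. A small sketch of the idea (hypothetical names, not an existing API):

```c
/* A particle type as a set of data channels, and a renderer that can only
 * be attached if the channels it requires actually exist in the system. */
typedef enum {
    PAR_LOCATION = 1 << 0,
    PAR_VELOCITY = 1 << 1,
    PAR_SIZE     = 1 << 2,
    PAR_ORIENT   = 1 << 3,
} ParticleChannels;

typedef struct ParticleRenderer {
    const char *name;
    int required_channels;  /* channels this renderer needs */
} ParticleRenderer;

static const ParticleRenderer halo_renderer = { "Halo", PAR_LOCATION | PAR_SIZE };

/* A system can use a renderer only if it stores all required channels. */
int renderer_compatible(int system_channels, const ParticleRenderer *r)
{
    return (system_channels & r->required_channels) == r->required_channels;
}
```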
Existing work
Below I try to compare the particle systems of some of the major 3D packages. This proves a bit difficult, since (apart from Blender) I don't own any of this software and therefore cannot really provide first-hand user experience. I tried to find out how they work by looking at tutorial websites and videos (linked below the respective sections) and demo versions where available. Any information from experienced users is welcome! Also keep in mind that regarding a feature as pro or contra is to a certain degree subjective and reflects only my personal opinion.
Blender
Blender's current particle system implements a lot of different features, including:
- volume and surface emission, fixed rate or animated
- force fields, boids physics, keyed animation
- rendering: halo, billboard, object instance, hair strands and more
- ...
It has however a lot of serious limitations, which make a new approach necessary:
- The particle data structure is fixed, which means that every particle system has all the data types, whether it needs them or not. Even though large parts of the data are outsourced to separate structs and only allocated when necessary, the data struct grows with every new feature, making large systems more and more inefficient and maintenance harder.
- All simulation code apart from effector force calculation lives in a single central location, which also makes maintaining and extending the simulation very cumbersome.
- There is no separation between particle owners and emitters: a particle-owning object is essentially also its only emitter. This is also mixed up with a general particle "type" property. Emission rate is random and can only be controlled by basic parameters.
- There is no general approach to linking particle systems to other objects. Effector influence is roughly controlled via object groups, but all other effects (smoke sim, point density textures, etc) have to directly access particle system data from C code using a confusing system of weak references (object pointer + psys index).
3dsMax: Thinking Particles
Links
3dsMax: Particle Flow
Particle "systems" are made up of events. Each event contains an sequence of effects, called actions, starting with emission and ending in rendering.
The main types of actions are:
- Flow: presets for standard particle setups
- Operator: manipulates particle data
- Test: checks some condition of particles and generates an output, which can be connected to inputs of other events (terminology?).
By connecting events with different emitters, operators and rendering methods, one can generate complex interactions.
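As far as I understand it, the structure could be sketched roughly like this in C terms (all names hypothetical, not the actual Particle Flow API):

```c
/* An event owns an ordered list of actions; test actions can route
 * particles that pass their condition to another event. */
typedef struct PFlowAction {
    enum { ACTION_FLOW, ACTION_OPERATOR, ACTION_TEST } type;
    void (*execute)(struct PFlowAction *act, struct ParticleSystem *psys);
    struct PFlowEvent *test_target;  /* for tests: event receiving particles */
} PFlowAction;

typedef struct PFlowEvent {
    PFlowAction *actions;  /* executed in sequence: emission ... rendering */
    int totaction;
} PFlowEvent;
```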
Pro
- Easy to set up (drag actions to event boxes)
- Clear separation between different particle types (events)
Contra
- No control over individual properties
- Fuzzy definition of event input/output
- Data flow between events is hidden from the user
Links
Houdini
Everything is organized in networks, where a node represents an operator and a connection means the flow of data between nodes. This extends over multiple hierarchical levels (e.g. scene->object->particle->subnodes, roughly comparable to edit mode in Blender). Nodes can be "grouped" to form new node types (terminology?).
Pro
- Very consistent interface (nodes everywhere)
- Clear concept of particle data flow through nodes
Contra
- No control over individual properties
- Creating complex effects difficult, understanding of operator math required
Links
Maya??
Conclusion
A simple categorization of the different approaches used in these packages could look like this:
- Fixed pipeline (Blender)
- Network of sequences (Particle Flow)
- Collections (Rules) of low-level networks (Thinking particles)
- Hierarchy of networks (Houdini)
Each of them emphasises either simplicity or flexibility on different levels.
- Low-level networks allow a high degree of flexibility by combining properties in a nearly unlimited fashion, but need to be confined to as few nodes as possible to make them understandable. Having presets and the ability to create new templates to share with others is a great help.
Implementation Details
Possible particle modifiers
Incomplete list of possible particle modifiers by category, trying to cast the currently existing features into modular particle modifiers.
Emission:
Name | Explanation | Data used |
---|---|---|
Mesh | emits from verts, faces or volume, can use normal for speed, tags particles to allow sticking to emitter | location, velocity, face index/UV |
Curve | emits along a curve, can use tangent for speed | location, velocity, curve param |
Copy | copies (a subset of) particles from another particle system | location, velocity |
Animation/Physics/Behaviour:
Name | Explanation | Data used |
---|---|---|
Gravity | stuff falls down | velocity |
Force Field (Group) | applies external effector force or group | location, velocity, mass |
Collision | lets particles bounce off or bump into meshes, etc. and applies friction | location, velocity, mass, size/geometry, ... |
Drag | slows down particles in proportion to speed | velocity, mass, size |
Boids | implements swarming behaviour | location, velocity, mass, ... |
Keyed | uses explicit particle states to interpolate between | everything interpolatable |
Hair | uses softbody physics to animate hair | hair data |
Rendering:
Name | Explanation | Data used |
---|---|---|
Dot | simple solid dot | location |
Halo | smoothly fading circle | location, size |
Billboard | UV-textured quad | location, size |
Strands | shaded lines for hair or grass | hair data |
Path | interpolated particle path | location history |
Object | renders object instances | location, orientation, size |
Surface | surface generated from SPH simulation | location |
Drawing:
Name | Explanation | Data used |
---|---|---|
Point | fixed-size point | location |
Axes | oriented xyz axes | location, orientation |
Path | draws location history | location history |
Helpers/Performance/Memory:
Name | Explanation | Data used |
---|---|---|
Cache store/retrieve | records/loads particle states for performance | everything cacheable |
Python | custom operator using python methods on particle data | everything |
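The "Data used" columns above suggest a common modifier interface. A minimal sketch with hypothetical names, reusing the channel flags and the Particle/ParticleSystem buffer from the earlier sketches:

```c
/* Each modifier declares which particle data it uses (cf. the "Data used"
 * columns) and operates on the whole system at once. */
typedef struct ParticleModifier {
    const char *name;        /* "Gravity", "Drag", "Halo", ... */
    int required_channels;   /* e.g. PAR_LOCATION | PAR_VELOCITY */
    void (*apply)(struct ParticleModifier *pmd,
                  struct ParticleSystem *psys, float dt);
} ParticleModifier;

/* Example: a gravity modifier that only needs the velocity channel. */
static void gravity_apply(struct ParticleModifier *pmd,
                          struct ParticleSystem *psys, float dt)
{
    (void)pmd;  /* no per-modifier settings in this sketch */
    for (int i = 0; i < psys->totpart; i++)
        psys->particles[i].vel[2] -= 9.81f * dt;
}

static ParticleModifier gravity_modifier = {
    "Gravity", PAR_VELOCITY, gravity_apply
};
```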
Questions
- How can Blender's code features be used?
I deliberately used the Houdini term Particle Operator in the previous sections, because I think it is the best of the existing systems, and there are some similarities to the new operator+context concept in Blender 2.5: particle operators get data from a context. I could however be on the wrong track here, since Blender's operators might be so fundamentally different that a new Blender term for the particle thing should be coined.
- How can the data structure be constructed?
There should be no a-priori assumptions about which data members the particle data struct incorporates, as the kinds of operators in the system (possibly even implemented as plugins etc. at runtime!) are not known at compile time. Rather, the data types needed by the various operators and renderers should be collected on a per-configuration basis, i.e. each operator tells the particle system what data it needs, and the system then allocates buffers accordingly.
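A sketch of this per-configuration collection, with all names hypothetical: operators request named properties, and the system computes offsets and allocates the buffer once all requests are known.

```c
#include <stdlib.h>

typedef struct PropRequest {
    const char *name;   /* "mass", "location", ... */
    size_t size;        /* sizeof(float), 3 * sizeof(float), ... */
    size_t offset;      /* filled in by the system */
} PropRequest;

typedef struct DynParticleSystem {
    PropRequest *props;
    int totprop;
    size_t stride;      /* total size of one particle */
    char *buffer;       /* totpart * stride bytes */
    int totpart;
} DynParticleSystem;

/* Lay out all requested properties back to back, then allocate. */
static void psys_build_layout(DynParticleSystem *psys)
{
    size_t offset = 0;
    for (int i = 0; i < psys->totprop; i++) {
        psys->props[i].offset = offset;
        offset += psys->props[i].size;
    }
    psys->stride = offset;
    psys->buffer = calloc(psys->totpart, psys->stride);
}
```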
- How can individual properties be accessed?
Since the particle code will not know the correct interpretation of particle data at compile time (e.g. a float for mass, a float[3] for location or some complex struct for more complicated operators), each operator needs to use index offsets/void pointers to access the correct data. When an operator is applied to the particle data (which should always be all particles at once; single-particle operators would be bad design), it will simply get a number of pointers describing the addresses of the actual properties in the buffer (see the next point). Casting from an arbitrary void* to the actual data type represented is tricky to implement with a generic interface - any input on this would be appreciated!
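A hedged sketch of such offset-based access, building on the layout sketch above (psys_prop_offset is a hypothetical lookup helper, and the drag formula is deliberately crude):

```c
size_t psys_prop_offset(DynParticleSystem *psys, const char *name);
/* hypothetical: looks up the PropRequest offset registered under 'name' */

/* The system resolves names to offsets once; the operator then casts
 * the raw bytes to the types it registered. */
static void drag_apply(DynParticleSystem *psys, float dt)
{
    size_t vel_off  = psys_prop_offset(psys, "velocity");
    size_t mass_off = psys_prop_offset(psys, "mass");

    for (int i = 0; i < psys->totpart; i++) {
        char *pa = psys->buffer + (size_t)i * psys->stride;
        float *vel  = (float *)(pa + vel_off);   /* float[3] */
        float *mass = (float *)(pa + mass_off);  /* float */
        float f = 1.0f - dt / *mass;             /* crude drag factor */
        vel[0] *= f; vel[1] *= f; vel[2] *= f;
    }
}
```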
- Should data be stored per-particle or in per-property blocks?
Storing data per-property (i.e. num_props buffers of num_particles*prop_size) could make allocations a bit easier, but per-particle (i.e. num_particles buffers of sum(prop_size)) is IMHO a cleaner way of addressing properties.
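For comparison, the two addressing schemes side by side (hypothetical helper functions):

```c
#include <stddef.h>

/* Per-particle ("array of structs"): one buffer, a property is found via
 * its offset within the particle stride. */
void *prop_per_particle(char *buffer, size_t stride, size_t prop_offset,
                        int particle)
{
    return buffer + (size_t)particle * stride + prop_offset;
}

/* Per-property ("struct of arrays"): one buffer per property, indexed by
 * particle only. */
void *prop_per_property(char *prop_buffer, size_t prop_size, int particle)
{
    return prop_buffer + (size_t)particle * prop_size;
}
```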
- Can Python provide ultimate flexibility?
It would be an extremely powerful feature if Python could be used to create custom particle operators. While a C implementation is the best choice for performance-critical operations, Python operators (excuse the mix-up of terms here) would be very handy for custom effects. They could also make a nice testbed for features before implementing them in C and waiting for svn acceptance.