User:PrototypeNM1/Multitouch Framework Proposal


Multitouch Framework Proposal

This project extends Blender to recognize multitouch input via libSDL for navigation, and lays down a framework for future additions. As envisioned, the immediate result would serve as a complement to a stylus. This would prepare Blender for the incoming slate form factor machines (see Samsung Series 7 Slate and Asus Eee EP121), and potentially ease ports to Android touch devices.

I intend to create a framework for adding and maintaining touch support in Blender, along with basic navigation controls built on the approach first proposed in "Proton: Multitouch Gestures as Regular Expressions" [1].


Benefits to Blender

It is apparent that multitouch interaction is a hot topic in 3D content creation: three of the six papers published by the Pixar Research Group in the last four years focus on it [1-3], and there is significant further documentation on the subject.

With Microsoft pushing slate form factor computers, I believe we will see increased adoption of them among artists. As it stands, Blender is unusable without a mouse or a 3D mouse, because the program cannot be navigated efficiently otherwise. Neither a mouse nor a 3D mouse is optimal for slate computers, as both work against the form factor's core benefit: portability while maintaining versatility. Adding basic touch navigation and orientation controls is the fastest way to make Blender's 3D content creation usable on these highly portable machines.

Imagine an artist with 15 minutes of free time and a slate on hand when an idea for a better mouse suddenly hits her. With the current implementation, she has to hope she holds onto the idea long enough to reach a proper workstation to sculpt it out. Worse yet, she might finally cave and purchase an iPad and Autodesk 123D Sculpt [4]! This project will help avert such an unfortunate situation.

Adding a framework for multitouch now will discourage the hackish and unmaintainable solutions that might start popping up as slate computers become more prominent, saving developer time and sanity. It will also provide the means for extending touch interactions [1] should such additions be deemed desirable in the future, and it primes Blender for potential ports to ARM touch devices.

Deliverables

Ability to navigate Blender with touch and pen input alone.

Single touch should work as MMB click and drag for all editors. Zoom out should be accomplished by pinching two fingers, and zoom in by spreading two fingers. Two fingers moving in the same direction should pan the 3D View. As a bonus, two fingers rotating should spin the 3D View accordingly; further testing is necessary to see whether this is useful. A possible gesture-to-action mapping is sketched below.
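The following is a minimal sketch of that mapping, assuming hypothetical GestureType and ViewAction names purely for illustration; neither exists in Blender's source.

    // Hypothetical mapping of recognized gestures to view-navigation actions.
    // GestureType and ViewAction are illustrative names only.
    enum class GestureType { SingleDrag, Pinch, Spread, TwoFingerPan, TwoFingerRotate };
    enum class ViewAction  { Orbit, ZoomOut, ZoomIn, Pan, Roll };

    ViewAction map_gesture(GestureType g)
    {
        switch (g) {
            case GestureType::SingleDrag:      return ViewAction::Orbit;    // MMB click-and-drag equivalent
            case GestureType::Pinch:           return ViewAction::ZoomOut;
            case GestureType::Spread:          return ViewAction::ZoomIn;
            case GestureType::TwoFingerPan:    return ViewAction::Pan;
            case GestureType::TwoFingerRotate: return ViewAction::Roll;     // bonus item; usefulness needs testing
        }
        return ViewAction::Orbit;  // unreachable with valid input
    }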

Wiki Documentation

Self-explanatory.

Project Details

  • Further consultation with a mentor will be necessary to make sure I understand the appropriate scope for this project.

The most promising lead I have found for a multitouch framework is the one documented in the Pixar Research paper "Proton: Multitouch Gestures as Regular Expressions" [1]. At present, my impression is that this would be best implemented as a separate file in GHOST. A small sketch of the core idea follows.
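To make the idea concrete, here is a minimal, self-contained illustration of the Proton approach: the touch event stream is encoded as a string of symbols and gesture patterns are matched with ordinary regular expressions. The D/M/U-plus-finger-id encoding is an assumption made for this sketch, not the notation Blender would necessarily adopt.

    // Proton-style illustration: encode the touch event stream as a string of
    // symbols and match gesture patterns with ordinary regular expressions.
    #include <iostream>
    #include <regex>
    #include <string>

    int main()
    {
        // One-finger drag: touch 1 down, any number of moves, touch 1 up.
        const std::regex one_finger_drag("D1(M1)*U1");

        // Two-finger gesture shell: pinch vs. spread would be decided by the
        // distance trend between the fingers, not by the expression itself.
        const std::regex two_finger("D1(M1)*D2((M1)|(M2))*U1(M2)*U2");

        const std::string stream = "D1M1M1D2M1M2M2U1M2U2";  // example event stream

        std::cout << "one-finger drag matches: " << std::regex_match(stream, one_finger_drag) << "\n";
        std::cout << "two-finger matches:      " << std::regex_match(stream, two_finger) << "\n";
        return 0;
    }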

Catching multitouch input should be handled through the SDL library to ease potential Android ports in the future; a sketch of what this might look like follows.
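Below is a hedged sketch of how GHOST might consume SDL finger events, assuming SDL's touch event API (SDL_FINGERDOWN, SDL_FINGERMOTION, SDL_FINGERUP). The forward_to_ghost() helper is a placeholder for whatever GHOST-side handler the framework ends up defining, not an existing Blender function.

    #include <SDL.h>

    static void forward_to_ghost(const SDL_TouchFingerEvent &tf)
    {
        // Placeholder: translate the normalized (0..1) coordinates and finger id
        // into a GHOST touch event and push it onto the event queue.
        (void)tf;
    }

    void pump_touch_events()
    {
        SDL_Event event;
        while (SDL_PollEvent(&event)) {
            switch (event.type) {
                case SDL_FINGERDOWN:
                case SDL_FINGERMOTION:
                case SDL_FINGERUP:
                    forward_to_ghost(event.tfinger);
                    break;
                default:
                    break;
            }
        }
    }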

Support for multitouch outside of the 3D View should come last unless there is a generic way to add basic navigation (e.g. panning) to all editors at once; further research on this is necessary.


References

1. http://graphics.pixar.com/library/ProtonChi/paper.pdf

2. http://graphics.pixar.com/library/Eden/paper.pdf

3. http://graphics.pixar.com/library/MultiTargetUserStudyA/paper.pdf

4. http://www.youtube.com/watch?v=dIuRnaym_hQ&feature=related

5. http://globalgamejam.org/2012/alone - example of prior work in design (please note: I was not heavily involved with coding in this project)