Doc:2.6/Manual/Render/Output/Frameserver


Connecting to arbitrary codecs: the frameserver

Introduction

Video editors' favorite question #1:

What is the best way to connect my editing system to a video codec if there is no integrated interface for it? JPEG files, PPM files, raw AVI? And then pipe them into the target application?

Hey, where is my disk space, dude? (Ever done high-definition video editing? Then you know what I mean...)

If we take a look at our big brothers (or small open-source brothers like VirtualDub), we notice that they solved this problem by creating more or less hackish generic interfaces that let the target application work directly on the rendered stream of the source application, a feature called frameserving.

It boils down to letting the target application connect to our video editing system through some remote procedure call interface.

The crucial point is that the editing system is driven by the codec, and not the other way around as usual!

The codec asks for some general information about the video (frame rate, frame count, size, etc.) and then simply requests frames at arbitrary positions.

If we want such an interface in Blender, it had better be simple and usable by everyone. VirtualDub, in contrast, does some interesting hacks with the AVI subsystem of Windows, effectively doing RPC over AVI-DLL calls. That's nice for a Windows Only(tm) program but not a very good idea for Blender. If we want transparent AVI support, it should be a separate client to our server interface, not something at the core of the system.

The easiest RPC interface I know of is HTTP. You can write a simple HTTP server even if you are drunk and it is 4 o'clock in the morning.

The same goes for the client. If your codec is command-line driven and can read its video input from standard input, you can write a small shell script that does the procedure calls using wget...

That leads us to the...

Implementation: writeframeserver.c

If you choose frameserver rendering as the output format in the render settings and press the "Anim" button, the frameserver is started and listens for connections on port 8080 by default. You can change the port in the user preferences.
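
The same can be done without the GUI. Something along these lines should start a background render that only serves frames; the .blend file name is just a placeholder, and the flags below are an assumption based on the 2.6x command line, so check blender --help for the exact render-format names your build accepts:

  # assumed flags: -b render in background, -F set render format, -a render the animation
  blender -b yourproject.blend -F FRAMESERVER -a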

The Blender renderer is now driven by the frameserver, which in effect hands control over to the codec. The codec is only allowed to perform a small set of operations. The only harm a malicious client can do is bring your load average to 1.0 by rendering frames. The server handles only one request at a time, so a real denial-of-service attack is not possible. There are only two points in the code where you could try to inject a buffer overrun, and both are thoroughly checked.

If you point your favorite web browser to http://localhost:8080/ you will receive a small help screen that explains the interface. (This is just like calling Blender with --help...)
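
The same works without a browser, of course. A quick sanity check from the shell (using wget, just like the sample client further down) could look like this:

  # dump the built-in help screen to the terminal
  wget -O - http://localhost:8080/ 2>/dev/null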

 /info.txt:

Gives the general information as key-value pairs:

   * start: start-frame
   * end: end-frame
   * width: width of frame
   * height: height of frame
   * rate: frame rate numerator
   * ratescale: frame rate denominator (target frame rate = rate/ratescale) 
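
The reply is plain text with one pair per line, key and value separated by whitespace (which is what the sample client below relies on). For a 25 fps scene it might look something like this (the values are made up, yours will differ):

  start 1
  end 250
  width 1920
  height 1080
  rate 25
  ratescale 1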
 /close.txt:

Stops the frameserver.

 /images/ppm/%d.ppm:

Contains the rendered images in PPM format. Any other formats that make sense should go to /images/openexr/... or the like some day.
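
To grab a single frame by hand and check that it really is a PPM, something like the following should do (frame 1 is just an example, and the file check is only a sanity test):

  wget -O /tmp/frame1.ppm http://localhost:8080/images/ppm/1.ppm 2>/dev/null
  # "file" should report a Netpbm PPM image
  file /tmp/frame1.ppm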

A sample client using sh and encodedv

If you prefer the DV encoder of libdv over the one integrated into ffmpeg, this little shell script might be for you:

  #!/bin/sh
  BLENDER=http://localhost:8080
  OUTPUT=/tmp/output.dv
  # fetch the stream info and turn every "key value" line into
  # a shell variable R_key=value (R_start, R_end, R_rate, ...)
  eval `wget ${BLENDER}/info.txt -O - 2>/dev/null |
      while read key val ; do
        echo R_$key=$val
      done`
  i=$R_start
  # request every frame as PPM and pipe the concatenated stream
  # into encodedv, which writes the DV file
  {
    while [ $i -le $R_end ] ; do
         wget ${BLENDER}/images/ppm/$i.ppm -O - 2>/dev/null
         i=$(($i+1))
    done
  } | encodedv -v -- - - >$OUTPUT
  # tell the frameserver we are done
  wget ${BLENDER}/close.txt -O - 2>/dev/null >/dev/null

It should be pretty self-explanatory. If you have problems with encodedv, I'm the one to blame, since I wrote that too...
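
If you don't have encodedv but do have ffmpeg, any encoder that reads images from standard input works just as well. As a rough, untested sketch, the encodedv line in the script above could be swapped for something like this (with OUTPUT pointing at, say, /tmp/output.avi, and -r set to rate/ratescale from info.txt):

  } | ffmpeg -f image2pipe -vcodec ppm -r 25 -i - $OUTPUT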

A sample client for TMPGenc: release/windows/contrib/vfapi

There is a vfapi plugin included for use with TMPGenc. Why the heck support a commercial codec in an open-source tool like Blender? (And thereby even put 10 lines of HTML code into Blender ;-)

We have to face it: the current state of ffmpeg is not good enough to render DVD-quality MPEG-2 streams with it, *period*.

If you read any codec comparisons you will notice that ffmpeg is more oriented towards speed than quality.

TMPGenc, on the other hand, is to my knowledge the best MPEG-2 encoder around. It is fast and it just works(tm).

So: if we already ported Blender to Windows and Mac OS X, why not add a client that supports TMPGenc, which is a small shareware tool? For most people, the codec decision isn't based on religion but on the need for a good end result. TMPGenc delivers that.

To compile the vfapi plugin using MinGW, simply do:

  i586-mingw32msvc-gcc -shared vfapi-plugin.c -o blenderserver.vfp -lwsock32
  i586-mingw32msvc-strip blenderserver.vfp

and copy the resulting plugin into the TMPGenc directory.

Then create a file called "localhost.blu" which just contains "hostname:port", e.g. "localhost:8080".
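
If you create the file from a shell (for example while testing under Wine), a one-liner is enough:

  echo "localhost:8080" > localhost.blu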

You can open the blu file in TMPGenc. That's all; the rest is automagic. By the way: the whole thing was developed completely under Linux, tested successfully with a vanilla wine-0.9.6, and worked out of the box.