<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://wiki.blender.jp/index.php?action=history&amp;feed=atom&amp;title=Community%3AScience%2FRobotics%2FYarp_Camera_Example</id>
	<title>Community:Science/Robotics/Yarp Camera Example - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://wiki.blender.jp/index.php?action=history&amp;feed=atom&amp;title=Community%3AScience%2FRobotics%2FYarp_Camera_Example"/>
	<link rel="alternate" type="text/html" href="https://wiki.blender.jp/index.php?title=Community:Science/Robotics/Yarp_Camera_Example&amp;action=history"/>
	<updated>2026-04-21T12:15:49Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.31.0</generator>
	<entry>
		<id>https://wiki.blender.jp/index.php?title=Community:Science/Robotics/Yarp_Camera_Example&amp;diff=90781&amp;oldid=prev</id>
		<title>Yamyam: Imported 1 revision</title>
		<link rel="alternate" type="text/html" href="https://wiki.blender.jp/index.php?title=Community:Science/Robotics/Yarp_Camera_Example&amp;diff=90781&amp;oldid=prev"/>
		<updated>2018-06-28T18:43:30Z</updated>

		<summary type="html">&lt;p&gt;Imported 1 revision&lt;/p&gt;
&lt;table class=&quot;diff diff-contentalign-left&quot; data-mw=&quot;interface&quot;&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;1&quot; style=&quot;background-color: #fff; color: #222; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;1&quot; style=&quot;background-color: #fff; color: #222; text-align: center;&quot;&gt;Revision as of 18:43, 28 June 2018&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-notice&quot; lang=&quot;en&quot;&gt;&lt;div class=&quot;mw-diff-empty&quot;&gt;(No difference)&lt;/div&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;</summary>
		<author><name>Yamyam</name></author>
		
	</entry>
	<entry>
		<id>https://wiki.blender.jp/index.php?title=Community:Science/Robotics/Yarp_Camera_Example&amp;diff=90780&amp;oldid=prev</id>
		<title>wiki&gt;Mindrones: moved Robotics/Yarp Camera Example to Community:Science/Robotics/Yarp Camera Example: moving under Community:Science (&quot;Community&quot; being a new namespace)</title>
		<link rel="alternate" type="text/html" href="https://wiki.blender.jp/index.php?title=Community:Science/Robotics/Yarp_Camera_Example&amp;diff=90780&amp;oldid=prev"/>
		<updated>2011-11-02T19:51:49Z</updated>

		<summary type="html">&lt;p&gt;moved &lt;a href=&quot;/Robotics/Yarp_Camera_Example&quot; class=&quot;mw-redirect&quot; title=&quot;Robotics/Yarp Camera Example&quot;&gt;Robotics/Yarp Camera Example&lt;/a&gt; to &lt;a href=&quot;/Community:Science/Robotics/Yarp_Camera_Example&quot; title=&quot;Community:Science/Robotics/Yarp Camera Example&quot;&gt;Community:Science/Robotics/Yarp Camera Example&lt;/a&gt;: moving under Community:Science (&amp;quot;Community&amp;quot; being a new namespace)&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;= Using VideoTexture to export the video from a camera =&lt;br /&gt;
&lt;br /&gt;
== Overview ==&lt;br /&gt;
This is an extension of a previous example posted on this wiki. In the example [[Robotics:Middleware/YARP/Python Simulator Example|Robotics:Yarp_Python_Simulator_Example]], a secondary viewport is created to show what is being captured by the camera object parented to the robot. The OpenGL function glReadPixels is then used to scan the region of the screen where the secondary viewport is located.&lt;br /&gt;
&lt;br /&gt;
A more robust solution is to use Blender's VideoTexture module to render the scene from the camera into a texture, and use the image rendered directly. Here is a screenshot of the results:&lt;br /&gt;
&lt;br /&gt;
[[Image:yarp-VideoTexture-demo.png|none|650px]]&lt;br /&gt;
&lt;br /&gt;
== Pre-requisites ==&lt;br /&gt;
This new example is mostly based on the previous one, [[Robotics:Middleware/YARP/Python Simulator Example|Robotics:Yarp_Python_Simulator_Example]], including all the procedures necessary to set up and use YARP.&lt;br /&gt;
It is also based on the VideoTexture documentation&lt;br /&gt;
[http://wiki.blender.org/index.php/Dev:Source/GameEngine/2.49/VideoTexture].&lt;br /&gt;
&lt;br /&gt;
== Description ==&lt;br /&gt;
Basically the only changes from the previous example are two new Python scripts, written as explained in the VideoTexture documentation, and a change to the &amp;quot;image_grabber_yarp&amp;quot; script.&lt;br /&gt;
&lt;br /&gt;
First it is necessary to create an object onto which the video from the camera will be &amp;quot;projected&amp;quot;. Add a new plane (here called Screen) and place it anywhere in the scene. It must be on one of the active layers, otherwise the video update will not work. Give the plane a material that includes an image texture. The particular image is not important, but one must be specified, or again VideoTexture won't work.&lt;br /&gt;
&lt;br /&gt;
Following the instructions for VideoTexture, create a Python Controller that is executed once when the Game Engine is started. This script links the output of the camera to the texture on the Screen plane:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;source lang=&amp;quot;python&amp;quot;&amp;gt;&lt;br /&gt;
import VideoTexture&lt;br /&gt;
&lt;br /&gt;
contr = GameLogic.getCurrentController()&lt;br /&gt;
scene = GameLogic.getCurrentScene()&lt;br /&gt;
obj = scene.objects['OBScreen']&lt;br /&gt;
tvcam = scene.objects['OBCameraRobot']&lt;br /&gt;
&lt;br /&gt;
if not hasattr(GameLogic, 'tv'):&lt;br /&gt;
	matID = VideoTexture.materialID(obj, 'IMplasma.png')&lt;br /&gt;
	GameLogic.tv = VideoTexture.Texture(obj, matID)&lt;br /&gt;
	GameLogic.tv.source = VideoTexture.ImageRender(scene,tvcam)&lt;br /&gt;
&amp;lt;/source&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This gets references to the camera we want to use (in the example, the one parented to the robot) and the Screen object. It then initializes the VideoTexture and links it to the appropriate objects.&lt;br /&gt;
&lt;br /&gt;
NOTE: Initially I tried to limit the size of the captured texture, using something like:&lt;br /&gt;
&amp;lt;code&amp;gt;GameLogic.tv.source.capsize = [512,512]&amp;lt;/code&amp;gt;&lt;br /&gt;
This gave me duplicated images when projecting the texture; simply removing that line solved the problem.&lt;br /&gt;
&lt;br /&gt;
Now, add another Python Controller linked to an Always Sensor set to repeat constantly. Put this script in it.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;source lang=&amp;quot;python&amp;quot;&amp;gt;&lt;br /&gt;
if hasattr(GameLogic, 'tv'):&lt;br /&gt;
  GameLogic.tv.refresh(True)&lt;br /&gt;
&amp;lt;/source&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Finally, to capture the video images and send them through a YARP port, the &amp;quot;image_grabber_yarp&amp;quot; script changes to this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;source lang=&amp;quot;python&amp;quot;&amp;gt;&lt;br /&gt;
import Blender, yarp, Rasterizer&lt;br /&gt;
import array&lt;br /&gt;
import struct	&lt;br /&gt;
&lt;br /&gt;
from Blender.BGL import *&lt;br /&gt;
from middleware.yarp import YarpBlender&lt;br /&gt;
&lt;br /&gt;
def decode_image (image_string):&lt;br /&gt;
	&amp;quot;&amp;quot;&amp;quot;	Remove the alpha channel from the images taken from Blender.&lt;br /&gt;
		Convert the binary images to an array of integers, to be&lt;br /&gt;
		passed to the middleware &amp;quot;&amp;quot;&amp;quot;&lt;br /&gt;
	image_buffer = []&lt;br /&gt;
	length = len(image_string)&lt;br /&gt;
&lt;br /&gt;
	# Grab 4 bytes of data, representing a single pixel&lt;br /&gt;
	for i in range(0, length, 4):&lt;br /&gt;
		rgb = GameLogic.structObject.unpack(image_string[i:i+3])&lt;br /&gt;
		image_buffer.extend ( rgb )&lt;br /&gt;
&lt;br /&gt;
	return image_buffer&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
sensor = GameLogic.getCurrentController().sensors['imageGrabbing']&lt;br /&gt;
&lt;br /&gt;
# Create an instance of the Struct object, to make the unpacking more efficient&lt;br /&gt;
GameLogic.structObject = struct.Struct('=BBB')&lt;br /&gt;
&lt;br /&gt;
ok = sensor.positive&lt;br /&gt;
&lt;br /&gt;
# execute only when the 'grab_image' key is released&lt;br /&gt;
# (if we don't test for that, the code gets executed twice:&lt;br /&gt;
# once when the key is pressed, and once when it is released)&lt;br /&gt;
if ok:&lt;br /&gt;
	#retrieve the YARP port we want to write on &lt;br /&gt;
	p = YarpBlender.getPort('/blender_simu/cam')&lt;br /&gt;
&lt;br /&gt;
	# extract the VideoTexture image (only if the setup script has run,&lt;br /&gt;
	# otherwise &amp;quot;buf&amp;quot; would be undefined below)&lt;br /&gt;
	if hasattr(GameLogic, 'tv'):&lt;br /&gt;
		imX,imY = GameLogic.tv.source.size&lt;br /&gt;
		buf = decode_image (GameLogic.tv.source.image)&lt;br /&gt;
&lt;br /&gt;
		# Convert it to a form where we have access to a memory pointer&lt;br /&gt;
		data = array.array('B',buf)&lt;br /&gt;
		info = data.buffer_info()&lt;br /&gt;
&lt;br /&gt;
		# Wrap the data in a YARP image&lt;br /&gt;
		img = yarp.ImageRgb()&lt;br /&gt;
		img.setTopIsLowIndex(0)&lt;br /&gt;
		img.setQuantum(1)&lt;br /&gt;
		img.setExternal(info[0],imX,imY)&lt;br /&gt;
&lt;br /&gt;
		# copy to an image with &amp;quot;regular&amp;quot; YARP pixel order&lt;br /&gt;
		img2 = yarp.ImageRgb()&lt;br /&gt;
		img2.copy(img)&lt;br /&gt;
&lt;br /&gt;
		# Write the image&lt;br /&gt;
		p.write(img2)&lt;br /&gt;
&amp;lt;/source&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The main difficulty here is the different image formats used by YARP and Blender. The image returned by VideoTexture is a string containing the pixels in binary RGBA format, i.e. 4 bytes per pixel. YARP's image object expects an array of integers in RGB order.&lt;br /&gt;
The Python struct.unpack function is used to perform the conversion.&lt;br /&gt;
&lt;br /&gt;
A struct object is created with the format that will be used for unpacking the binary data. Every time an image is captured, the &amp;lt;code&amp;gt;decode_image&amp;lt;/code&amp;gt; function is called: for each pixel it unpacks the 3 RGB bytes and ignores the fourth (alpha) byte. Unpacking converts the data into an array, as required by YARP.&lt;br /&gt;
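The alpha-stripping step can be tried outside Blender with plain Python. This is a minimal sketch of the same technique; the pixel values here are made up for illustration, under the assumption that the source delivers raw RGBA bytes as VideoTexture does.

```python
import struct

def decode_image(image_string):
    """Strip the alpha channel from raw RGBA pixel data and
    return the RGB components as a flat list of integers."""
    # One Struct instance, reused for every pixel (3 unsigned bytes)
    rgb_struct = struct.Struct('=BBB')
    image_buffer = []
    # Each pixel occupies 4 bytes: R, G, B, A
    for i in range(0, len(image_string), 4):
        # Unpack only the first 3 bytes; the 4th (alpha) is skipped
        image_buffer.extend(rgb_struct.unpack(image_string[i:i + 3]))
    return image_buffer

# Two made-up pixels: opaque red, then opaque green
pixels = bytes([255, 0, 0, 255, 0, 255, 0, 255])
print(decode_image(pixels))  # [255, 0, 0, 0, 255, 0]
```

For large images a slicing approach such as dropping every fourth byte of the string avoids the per-pixel unpack call, but the struct version above mirrors what the script in this page does.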
&lt;br /&gt;
The rest is done exactly as in the previous example.&lt;br /&gt;
&lt;br /&gt;
Here is the blend file for the example:&lt;br /&gt;
&lt;br /&gt;
[[Image:Yarp_python_simulator-5.blend]]&lt;/div&gt;</summary>
		<author><name>wiki&gt;Mindrones</name></author>
		
	</entry>
</feed>