binarymillennium<br />
<br />
<h3>
Moving to lucasw.github.io (2016-04-08)</h3>
Moving to <a href="http://lucasw.github.io/">http://lucasw.github.io/</a><br />
<br />
<h3>
Getting started with V-Rep with Octave on Ubuntu for AMRx (2014-02-20)</h3>
This <a href="https://www.edx.org/course/ethx/ethx-amrx-autonomous-mobile-robots-1342">edX Autonomous Mobile Robots</a> course started last week, and the V-REP simulator with an Octave/Matlab interface is going to be a big part of the optional exercises for the course. There is a free temporary license available for Matlab, but I don't like installing proprietary binaries on my Linux system, especially temporarily (Linux and Ubuntu really need a standard way of installing applications into per-user directories that doesn't require system root). So I'm trying out the <a href="https://www.gnu.org/software/octave/">Octave</a> route.<br />
<br />
<span style="font-family: inherit;">Octave 3.6.2 is available as a standard package for my Ubuntu 12.10 install, but didn't work initially with the AMRx exercise 1 test scripts, so I had to build my own remApi.oct.</span><br />
<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEii9EDqUPcfqlV6OADZ-mf-QB_hOGYxuWzxEWlDlK25hj-dhC6-SPBZv-8pQjcyjXQKr4UCAiGPtsGuq5hbugvImgety_r6yk1pi9bXeXo1-blqtJ6_AArIrRLfyxwKa3choJU_Eg/s1600/vrep2.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEii9EDqUPcfqlV6OADZ-mf-QB_hOGYxuWzxEWlDlK25hj-dhC6-SPBZv-8pQjcyjXQKr4UCAiGPtsGuq5hbugvImgety_r6yk1pi9bXeXo1-blqtJ6_AArIrRLfyxwKa3choJU_Eg/s1600/vrep2.png" height="344" width="640" /></a></div>
<br />
<br />
<h3>
<br /></h3>
<div style="background-color: white; color: #222222; font-size: 15px;">
<div>
<h3>
<span style="font-family: inherit;">Building remApi.oct</span></h3>
<div>
<span style="font-family: inherit;">Get the mkoct binary for octave:</span></div>
<div style="font-family: arial, sans-serif;">
<br /></div>
</div>
<div>
<span style="font-family: Courier New, Courier, monospace;">sudo apt-get install octave-pkg-dev</span><br />
<div style="font-family: arial, sans-serif;">
<br /></div>
</div>
<div>
<span style="font-family: inherit;">There is a file in the vrep tar ball to run within octave:</span></div>
<div>
<br /></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">V-REP_PRO_EDU_V3_1_0_64_Linux/<wbr></wbr>programming/remoteApiBindings/<wbr></wbr>octave/buildLin.m</span></div>
<div style="font-family: arial, sans-serif;">
<br /></div>
<div>
<span style="font-family: inherit;">It needs some setup first, which is documented within it:</span></div>
<div>
<br /></div>
<div>
<div>
<span style="font-family: Courier New, Courier, monospace;">cd V-REP_PRO_EDU_V3_1_0_64_Linux/<wbr></wbr>programming/remoteApiBindings/<wbr></wbr>octave/</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">cp ../../remoteApi/* .</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">cp ../../include/* .</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">octave</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">octave:4> buildLin</span></div>
</div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<div>
<span style="font-family: Courier New, Courier, monospace;">extApiPlatform.c: In function ‘extApi_readFile’:</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">extApiPlatform.c:222:8: warning: ignoring return value of ‘fread’, declared with attribute warn_unused_result [-Wunused-result]</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">remApi.cc: In function ‘octave_value_list FsimxAddStatusbarMessage(const octave_value_list&, int)’:</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">remApi.cc:161:35: warning: ‘octave_value::octave_value(<wbr></wbr>const charNDArray&, bool, char)’ is deprecated (declared at /usr/include/octave-3.6.2/<wbr></wbr>octave/../octave/ov.h:237) [-Wdeprecated-declarations]</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">remApi.cc: In function ‘octave_value_list FsimxCopyPasteObjects(const octave_value_list&, int)’:</span></div>
</div>
<div>
<span style="font-family: Courier New, Courier, monospace;">...</span></div>
<div>
<div>
<div>
<span style="font-family: Courier New, Courier, monospace;">remApi.cc:2834:31: warning: ‘void Array<T>::resize(octave_idx_<wbr></wbr>type) [with T = float; octave_idx_type = int]’ is deprecated (declared at /usr/include/octave-3.6.2/<wbr></wbr>octave/../octave/Array.h:459) [-Wdeprecated-declarations]</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">remApi.cc: In function ‘octave_value_list FsimxUnpackInts(const octave_value_list&, int)’:</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">remApi.cc:2856:29: warning: ‘void Array<T>::resize(octave_idx_<wbr></wbr>type) [with T = octave_int<int>; octave_idx_type = int]’ is deprecated (declared at /usr/include/octave-3.6.2/<wbr></wbr>octave/../octave/Array.h:459) [-Wdeprecated-declarations]</span></div>
</div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">octave:4> exit</span></div>
</div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<div>
<span style="font-family: Courier New, Courier, monospace;">cp remApi.oct ~/own/edx_amrx/exercise1/code/<wbr></wbr>common/libs/octave/<wbr></wbr>linuxLibrary64Bit</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">cd ~/own/edx_amrx/exercise1/code/<wbr></wbr>common/vrep</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">octave</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">octave:1> test</span></div>
</div>
<div style="font-family: arial, sans-serif;">
<br /></div>
<div>
<span style="font-family: inherit;">And it works! The connection from octave to the running V-REP is established, and the script commands the simulation to start, then stops it soon after.</span></div>
<div>
<span style="font-family: inherit;"><br /></span></div>
</div>
<h3>
<span style="font-family: inherit;">
Annoyances</span></h3>
<span style="font-family: inherit;"><br /></span>
<span style="font-family: inherit;">The edX platform has some annoying quirks I've documented <a href="https://plus.google.com/103190342755104432973/posts/e9zCCmyAZuM">elsewhere</a>. But other than that it is pretty good. </span><br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg14heLlz_h_D5xcyljF074B2emdA18-O_QxeBGVQotK90g63DWcB_sBJdDdJ74ddgCy2YSKwQ5gh1dkRMFhkh_L9JAqHwUVAdyBFiD37P9av7ZguuMIusUCbFhceyLbj3NNa710Q/s1600/vrep_r2d2.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg14heLlz_h_D5xcyljF074B2emdA18-O_QxeBGVQotK90g63DWcB_sBJdDdJ74ddgCy2YSKwQ5gh1dkRMFhkh_L9JAqHwUVAdyBFiD37P9av7ZguuMIusUCbFhceyLbj3NNa710Q/s1600/vrep_r2d2.png" height="342" width="640" /></a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<br />
<br />
<span style="font-family: inherit;">V-REP is also impressive (coming from using Gazebo a great deal over the past six months), but it has an annoying quirk: a mouse right click can both rotate the view and open a context menu whose very first option closes the 3D view window. So it is easy, especially on a laptop trackpad, to try to rotate the view and accidentally close it instead.</span><br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhqrbpv2x_Mso-4S7__ojpPM6554c-3dcOeBs4CungbdJkNZ9Fr_QV7-pDIFQYXdqGlfZCsvaYd3RnamWsCudx9RHZnlPkC2Dn24BrbBwBeqKgND8sKCzn_UJcVE4fJog7Zgy25pQ/s1600/2014-02-20-193632_1920x1080_scrot.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhqrbpv2x_Mso-4S7__ojpPM6554c-3dcOeBs4CungbdJkNZ9Fr_QV7-pDIFQYXdqGlfZCsvaYd3RnamWsCudx9RHZnlPkC2Dn24BrbBwBeqKgND8sKCzn_UJcVE4fJog7Zgy25pQ/s1600/2014-02-20-193632_1920x1080_scrot.png" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<h3>
Debugging steps (not part of the solution)</h3>
<br />
<span style="font-family: inherit;">The first thing I tried was to launch vrep.sh from the vrep tar ball, load the exercise 1 ttt scene, and then enter the </span><span style="background-color: white; color: #222222; font-size: 15px;"><span style="font-family: Courier New, Courier, monospace;">exercise1/code/common/vrep/ </span><span style="font-family: inherit;">directory, launch octave, and try to run test.m:</span></span><br />
<br />
<div style="background-color: white; color: #222222; font-size: 15px;">
<span style="font-family: Courier New, Courier, monospace;">octave:2> conn = simulation_setup();</span></div>
<div style="background-color: white; color: #222222; font-size: 15px;">
<span style="font-family: Courier New, Courier, monospace;">/home/lwalter/own/edx_amrx/<wbr></wbr>exercise1/code/common/vrep/../<wbr></wbr>libs/octave/linuxLibrary64Bit/<wbr></wbr>remApi.oct</span></div>
<div style="background-color: white; color: #222222; font-size: 15px;">
<span style="font-family: Courier New, Courier, monospace;">octave:3> robot_nb=0</span></div>
<div style="background-color: white; color: #222222; font-size: 15px;">
<span style="font-family: Courier New, Courier, monospace;">robot_nb = 0</span></div>
<div style="background-color: white; color: #222222; font-size: 15px;">
<span style="font-family: Courier New, Courier, monospace;">octave:4> conn = simulation_openConnection(<wbr></wbr>conn, robot_nb )</span></div>
<div style="background-color: white; color: #222222; font-size: 15px;">
<span style="font-family: Courier New, Courier, monospace;">error: simulation_openConnection: /home/lwalter/own/edx_amrx/<wbr></wbr>exercise1/code/common/vrep/../<wbr></wbr>libs/octave/linuxLibrary64Bit/<wbr></wbr>remApi.oct: failed to load: liboctinterp.so: cannot open shared object file: No such file or directory</span></div>
<div style="background-color: white; color: #222222; font-size: 15px;">
<span style="font-family: Courier New, Courier, monospace;">error: called from:</span></div>
<div style="background-color: white; color: #222222; font-size: 15px;">
<span style="font-family: Courier New, Courier, monospace;">error: /home/lwalter/own/edx_amrx/<wbr></wbr>exercise1/code/common/vrep/<wbr></wbr>simulation_openConnection.m at line 30, column 28</span></div>
<div style="background-color: white; color: #222222; font-family: arial, sans-serif; font-size: 15px;">
<br /></div>
<div style="background-color: white; color: #222222; font-size: 15px;">
<span style="font-family: inherit;">I have liboctinterp.so.1, but no liboctinterp.so, so in a user directory on my LD_LIBRARY_PATH I added links to it and to the other libraries that subsequently failed to load:</span></div>
<div style="background-color: white; color: #222222; font-family: arial, sans-serif; font-size: 15px;">
<br /></div>
<div style="background-color: white; color: #222222; font-size: 15px;">
<div>
<span style="font-family: Courier New, Courier, monospace;">ln -s /usr/lib/x86_64-linux-gnu/<wbr></wbr>liboctinterp.so.1 ~/other/install/lib/<wbr></wbr>liboctinterp.so</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">ln -s /usr/lib/x86_64-linux-gnu/<wbr></wbr>liboctave.so.1 ~/other/install/lib/liboctave.<wbr></wbr>so</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">ln -s /usr/lib/x86_64-linux-gnu/<wbr></wbr>libcruft.so.1 ~/other/install/lib/libcruft.<wbr></wbr>so</span><br />
<span style="font-family: Courier New, Courier, monospace;"><br /></span><span style="font-family: inherit;">Update: on Ubuntu 13.04, where I built remApi.oct first, these steps were unnecessary.</span></div>
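When the links are needed, the three ln commands follow the same pattern, so a loop avoids typos. This is a sketch that can be run anywhere: temporary directories stand in for /usr/lib/x86_64-linux-gnu and the LD_LIBRARY_PATH directory.

```shell
# Create lib.so -> lib.so.1 development-style links for each library in a loop.
libdir=$(mktemp -d)   # stands in for /usr/lib/x86_64-linux-gnu
destdir=$(mktemp -d)  # stands in for ~/other/install/lib (on LD_LIBRARY_PATH)
touch "$libdir/liboctinterp.so.1" "$libdir/liboctave.so.1" "$libdir/libcruft.so.1"
for lib in liboctinterp liboctave libcruft; do
  ln -sf "$libdir/$lib.so.1" "$destdir/$lib.so"
done
ls "$destdir"
```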
<div>
<span style="font-family: inherit;"><br /></span></div>
<div>
<span style="font-family: inherit;">I tried test.m again and ran into this problem:</span></div>
<div style="font-family: arial, sans-serif;">
<br /></div>
<div>
<div>
<span style="font-family: Courier New, Courier, monospace;">octave:3> connection = simulation_openConnection(<wbr></wbr>connection, robotNb);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">error: simulation_openConnection: /home/lwalter/own/edx_amrx/<wbr></wbr>exercise1/code/common/vrep/../<wbr></wbr>libs/octave/linuxLibrary64Bit/<wbr></wbr>remApi.oct: failed to load: /home/lwalter/own/edx_amrx/<wbr></wbr>exercise1/code/common/vrep/../<wbr></wbr>libs/octave/linuxLibrary64Bit/<wbr></wbr>remApi.oct: undefined symbol: _ZN5ArrayI12octave_valueED0Ev</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">error: called from:</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">error: /home/lwalter/own/edx_amrx/<wbr></wbr>exercise1/code/common/vrep/<wbr></wbr>simulation_openConnection.m at line 30, column 28</span></div>
</div>
<div style="font-family: arial, sans-serif;">
<br /></div>
<div>
<span style="font-family: inherit;">I saw some references to being able to rebuild remApi.oct, so I set out to do that next.</span><br />
<span style="font-family: inherit;"><br /></span></div>
<div>
</div>
</div>
<br />
<br />
<h3>
Text-to-speech audio books with text image videos for youtube (2014-02-05)</h3>
<div style="background-color: white; color: #222222; font-family: arial, sans-serif; font-size: 14px;">
</div>
<br />Down and Out in the Magic Kingdom by Cory Doctorow has a very permissive license for reuse, so I've gone through the steps of making an audio book with images of the text and putting it on youtube:<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen='allowfullscreen' webkitallowfullscreen='webkitallowfullscreen' mozallowfullscreen='mozallowfullscreen' width='320' height='266' src='https://www.youtube.com/embed/q1emEbNj-Rc?feature=player_embedded' frameborder='0'></iframe></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
To do this, the first thing was to download the text from the Cory Doctorow site:<br /><a href="http://craphound.com/down/Cory_Doctorow_-_Down_and_Out_in_the_Magic_Kingdom.txt">http://craphound.com/down/Cory_Doctorow_-_Down_and_Out_in_the_Magic_Kingdom.txt</a><br /><br />There are some issues with text encoding that I mostly plowed through, though I suspect another process for conversion to UTF-8 could have worked better.</div>
<div>
<br /></div>
<div>
First, get rid of the stray &amp;#45; sequences (HTML-encoded hyphens, I think originally dashes), in vim:</div>
<div>
<br /><span style="font-family: Courier New, Courier, monospace;">:%s/&amp;#45;//g</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
The U+FFFD Unicode <a href="http://en.wikipedia.org/wiki/Specials_(Unicode_block)">replacement characters</a> also need to be removed:<br /><br /><span style="font-family: Courier New, Courier, monospace;">:%s/\%uFFFD//g</span></div>
<div>
<br /></div>
<div>
Also replacing tabs with spaces turned out to be necessary.</div>
<div>
<br /></div>
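The same three cleanup steps can also be done non-interactively with sed and expand. This is a sketch: a short sample string stands in for the book text, and the U+FFFD character is matched by its UTF-8 bytes.

```shell
# Strip HTML-encoded hyphens and U+FFFD replacement characters, then expand tabs.
printf 'foo&#45;bar\357\277\275baz\tqux\n' > sample.txt  # sample with all three problems
sed -i 's/&#45;//g' sample.txt                            # drop the &#45; sequences
sed -i "s/$(printf '\357\277\275')//g" sample.txt         # U+FFFD is EF BF BD in UTF-8
expand -t 8 sample.txt > cleaned.txt                      # tabs to spaces
cat cleaned.txt
```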
Imagemagick wouldn't do automatic line breaks for me later in this process (though pango might have worked), so adding line breaks to keep lines under 80 characters was necessary:<div>
<br /></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">fmt ../Cory_Doctorow_-_Down_and_Out_in_the_Magic_Kingdom.txt > ../Cory_Doctorow_-_Down_and_Out_in_the_Magic_Kingdom_line_breaks.txt </span><div>
<br /></div>
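fmt's default goal width is around 75 columns; to be explicit about the 80-character limit, the width can be passed and the result checked. A sketch on generated sample text:

```shell
# Wrap one very long line and count any output lines exceeding 80 columns.
printf '%s ' $(seq 1 300) > long.txt   # a single line of ~1000 characters
fmt -w 80 long.txt > wrapped.txt
over=$(awk 'length($0) > 80' wrapped.txt | wc -l)
echo "lines over 80 columns: $over"
```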
<div>
There were still some odd question marks generated by convert in the text; I hand-edited the worst one out, the one that would have appeared on the title of the book.</div>
<div>
<br /></div>
<div>
The next step was to split the book at every blank line into roughly 1500 text files, each hopefully short enough to show in a single image:</div>
<div>
<br /></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">csplit -f down -b '%05d.txt' ../*.txt '/^$/' '{*}'</span><br /><br />Next is the conversion of each of the split text files into HD png files</div>
<div>
<br /><span style="font-family: Courier New, Courier, monospace;">for i in *.txt; </span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">do convert -background black -fill white -size 1920x1080 -pointsize 45 -gravity center label:"$(<$i)" PNG8:"$i.png"; </span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">done</span><br /><br />And then generate wave files from each of the 1500 text files:<br /><br /><span style="font-family: Courier New, Courier, monospace;">for i in *txt;<br />do pico2wave -w $i.wav "$(<$i)"<br />done</span><br /><br />Videos are then created from putting the png images together with the images, this part is very similar to the process in http://binarymillenium.com/2013/07/turn-set-of-mp3s-into-static-image.html</div>
<div>
<br /><span style="font-family: Courier New, Courier, monospace;">for i in *.txt; </span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">do avconv -loop 1 -r 1 -i "$i.png" -c:v libx264 -i "$i.wav" -c:a aac -b:a 32k -strict experimental -shortest "$i.mp4"; </span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">done</span><br /><br />Some conversions result in 0 length mp4s with this error:<br /><span style="font-family: Courier New, Courier, monospace;">[buffer @ 0x8959e0] Invalid pixel format string '-1' , </span></div>
<div>
This turned out to be caused by some of the PNGs from convert being 16-bit instead of 8-bit (oddly inconsistent, since most were 8-bit); prefixing the output filename with PNG8: in the convert command fixed it.<br /><br />Create a text file listing of all the mp4 files:</div>
<div>
<br /><span style="font-family: Courier New, Courier, monospace;">rm all_videos.txt </span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">for i in *mp4; </span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">do
echo $i
echo "file '$i'" >> all_videos.txt </span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">done</span><br /><br />And concatenate all the mp4 files together into one giant 6 hour video with no recompression (only 500MB though):</div>
</div>
<div>
<br /></div>
<div>
<pre style="background-color: white; color: #222222; font-size: 14px; line-height: 18.479999542236328px;"><code><span style="font-family: Courier New, Courier, monospace;">mkdir output
avconv -f concat -i all_videos.txt -c copy output/down_and_out.mp4</span></code></pre>
<pre style="background-color: white; color: #222222; font-size: 14px; line-height: 18.479999542236328px;"><code>
</code></pre>
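The listing loop above works for these generated names; using printf keeps the concat demuxer's <code>file '…'</code> format in one place. A sketch with stand-in mp4 filenames:

```shell
# Rebuild the concat list with printf; empty demo files stand in for real output.
cd "$(mktemp -d)"
touch down00000.txt.mp4 down00001.txt.mp4 down00002.txt.mp4
rm -f all_videos.txt
for i in *.mp4; do
  printf "file '%s'\n" "$i" >> all_videos.txt
done
cat all_videos.txt
```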
For the first few minutes on youtube it looked like the video was all black instead of showing the titles, but a few minutes later it displayed correctly.</div>
<div>
<br /></div>
<h3>
Installing Full Desktop ROS Hydro from source on Ubuntu 13.10 (2014-02-03)</h3>
Since there aren't any ROS packages for 13.10, I did a full catkin source install as specified in <a href="http://wiki.ros.org/hydro/Installation/Source">http://wiki.ros.org/hydro/Installation/Source</a>. I'm also going to do a full gazebo 2.0 install from source in order to debug <a href="http://answers.gazebosim.org/question/5223/setting-projector-pose-vs-enclosing-link-pose/">http://answers.gazebosim.org/question/5223/setting-projector-pose-vs-enclosing-link-pose/</a>.<br />
<br />
As I understand it, the proper use of catkin is to create a catkin workspace for all the standard ROS packages, build and install it ( ./src/catkin/bin/catkin_make_isolated --install ), source the resulting setup ( source ~/ros_catkin_ws/install_isolated/setup.bash ), and then create a new catkin workspace to actually do development in. Otherwise the build times will be ridiculous, since catkin has to traverse 250 packages. <br />
<br />
<br />
<h3>
Gazebo</h3>
<div>
<br /></div>
Since the core gazebo isn't a ros package (yet?) it ought to be built separately following the instructions on <a href="http://gazebosim.org/wiki/2.0/install">http://gazebosim.org/wiki/2.0/install</a> .<br />
<br />
I ran into this error near the end of the build:<br />
<div>
<br />
<div>
<span style="font-family: 'Courier New', Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">[ 99%] Building CXX object interfaces/player/CMakeFiles/gazebo_player.dir/GazeboDriver.cc.o<br />In file included from /home/lwalter/other/gazebo_source/gazebo/interfaces/player/GazeboInterface.hh:26:0,<br /> from /home/lwalter/other/gazebo_source/gazebo/interfaces/player/GazeboDriver.cc:25:<br />/home/lwalter/other/gazebo_source/gazebo/interfaces/player/player.h:22:38: fatal error: libplayercore/playercore.h: No such file or directory<br /> #include <libplayercore/playercore.h></span><br />
<br />
<br />
So install libplayer-dev? No, that is a different player. I had libplayerc3.0-dev and libplayerc++3.0-dev installed already, and the file in question was located in /usr/include/player-3.0/libplayercore/playercore.h but gazebo wasn't seeing it. <br />
<br />
I'm sure I could have done this cleaner, but I just hand-edited interfaces/player/CMakeLists.txt:<br />
<br />
<br />
<span style="font-family: Courier New, Courier, monospace;">include_directories( /usr/include/player-3.0 ${SDF_INCLUDE_DIRS} ${PLAYER_INCLUDE_DIRS} ${OPENGL_INCLUDE_DIR} ${OGRE_INCLUDE_DIRS} ${Boost_INCLUDE_DIRS})</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span>I got a lot of these warnings, but the build reached 100% (I haven't fully tested yet, so they may still cause problems):</div>
<div>
<br />
<br />
<span style="font-family: Courier New, Courier, monospace;">/usr/bin/ld: warning: libboost_system.so.1.49.0, needed by /usr/lib/gcc/x86_64-linux-gnu/4.8/../../../x86_64-linux-gnu/libsdformat.so, may conflict with libboost_system.so.1.53.0</span><br />
<br />
The post-install bashrc instructions are not quite what is on the gazebo install page; I had to do this:</div>
<div>
<br />
<div>
<br /></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">export DEST_DIR=/home/lwalter/other/install<br />export LD_LIBRARY_PATH=$DEST_DIR/lib/x86_64-linux-gnu/:$LD_LIBRARY_PATH</span><br />
<span style="font-family: Courier New, Courier, monospace;"></span><br />
<span style="font-family: Courier New, Courier, monospace;">export PATH=$DEST_DIR/bin:$PATH</span><br />
<span style="font-family: Courier New, Courier, monospace;">export PKG_CONFIG_PATH=$DEST_DIR/lib/x86_64-linux-gnu/pkgconfig:$DEST_DIR/lib/pkgconfig:$PKG_CONFIG_PATH</span><br />
<h3>
ROS</h3>
<br />
Something went wrong in the ros libstage package: it never generated a config.h from ros_catkin_ws/src/stage/config.h.in ( <a href="https://github.com/rtv/Stage/blob/master/config.h.in">https://github.com/rtv/Stage/blob/master/config.h.in</a> ), possibly because the environment variables weren't pointing at gazebo correctly.</div>
<div>
<br />
<span style="font-family: Courier New, Courier, monospace;"><br />[ 10%] Building CXX object libstage/CMakeFiles/stage.dir/gl.o<br />[ 12%] Building CXX object libstage/CMakeFiles/stage.dir/logentry.o<br />/home/lwalter/other/ros_catkin_ws/src/stage/libstage/file_manager.cc:5:45: fatal error: config.h: No such file or directory<br /> #include "config.h" // to get INSTALL_PREFIX<br />compilation terminated.<br />[ 14%] make[2]: *** [libstage/CMakeFiles/stage.dir/file_manager.o] Error 1<br />make[2]: *** Waiting for unfinished jobs....<br />Building CXX object libstage/CMakeFiles/stage.dir/model.o<br />/home/lwalter/other/ros_catkin_ws/src/stage/libstage/model.cc:141:45: fatal error: config.h: No such file or directory<br /> #include "config.h" // for build-time config<br />compilation terminated.<br />make[2]: *** [libstage/CMakeFiles/stage.dir/model.o] Error 1<br />make[1]: *** [libstage/CMakeFiles/stage.dir/all] Error 2<br />make: *** [all] Error 2<br /><== Failed to process package 'stage':<br /> Command '/home/lwalter/other/ros_catkin_ws/install_isolated/env.sh make -j4 -l4' returned non-zero exit status 2<br />Reproduce this error by running:<br />==> cd /home/lwalter/other/ros_catkin_ws/build_isolated/stage && /home/lwalter/other/ros_catkin_ws/install_isolated/env.sh make -j4 -l4</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
The really ugly hack solution is to create config.h by hand:</div>
<div>
<br />
<br />
<span style="font-family: Courier New, Courier, monospace;">vi /home/lwalter/other/ros_catkin_ws/src/stage/libstage/config.h<br /> <br />#define INSTALL_PREFIX "/home/lwalter/other/install/"<br />#define PLUGIN_PATH "/home/lwalter/other/install/usr/local/lib"<br />#define VERSION "3.0.2"<br />#define PROJECT "Stage"</span><br />
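The same file can be written without opening an editor, using a here-document. The values are copied from above; INSTALL_PREFIX and PLUGIN_PATH still need to match the actual install prefix.

```shell
# Generate the hand-written config.h; a temp dir stands in for src/stage/libstage.
cd "$(mktemp -d)"
cat > config.h <<'EOF'
#define INSTALL_PREFIX "/home/lwalter/other/install/"
#define PLUGIN_PATH "/home/lwalter/other/install/usr/local/lib"
#define VERSION "3.0.2"
#define PROJECT "Stage"
EOF
grep -c '^#define' config.h
```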
<div>
<br />
That much worked, though those values may cause problems later if not correct.<br />
<br />
<br />
<h4>
Telling ROS about Gazebo</h4>
<div>
<br /></div>
(I didn't discover the gazebo bashrc instructions were wrong until after going through these steps; they probably aren't necessary.)<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">==> cmake /home/lwalter/other/ros_catkin_ws/src/gazebo_plugins -...<br />CMake Error at CMakeLists.txt:40 (find_package):<br /> By not providing "Findgazebo.cmake" in CMAKE_MODULE_PATH this project has asked CMake to find a package configuration file provided by "gazebo", but CMake did not find one.<br /> Could not find a package configuration file provided by "gazebo" with any of the following names:<br /> gazeboConfig.cmake<br /> gazebo-config.cmake<br /> Add the installation prefix of "gazebo" to CMAKE_PREFIX_PATH or set "gazebo_DIR" to a directory containing one of the above files. If "gazebo" provides a separate development package or SDK, be sure it has been installed.<br />-- Configuring incomplete, errors occurred!<br /><== Failed to process package 'gazebo_plugins':<br /> <br /> Command '/home/lwalter/other/ros_catkin_ws/install_isolated/env.sh cmake /home/lwalter/other/ros_catkin_ws/src/gazebo_plugins -DCATKIN_DEVEL_PREFIX=/home/lwalter/other/ros_catkin_ws/devel_isolated/gazebo_plugins -DCMAKE_INSTALL_PREFIX=/home/lwalter/other/ros_catkin_ws/install_isolated' returned non-zero exit status 1<br />Reproduce this error by running:<br />==> cd /home/lwalter/other/ros_catkin_ws/build_isolated/gazebo_plugins && /home/lwalter/other/ros_catkin_ws/install_isolated/env.sh cmake /home/lwalter/other/ros_catkin_ws/src/gazebo_plugins -DCATKIN_DEVEL_PREFIX=/home/lwalter/other/ros_catkin_ws/devel_isolated/gazebo_plugins -DCMAKE_INSTALL_PREFIX=/home/lwalter/other/ros_catkin_ws/install_isolated</span><br />
<br />
<br />
Command failed, exiting.<br />
<br />
<br />
It can't find gazebo, so run cmake-gui . in ros_catkin_ws/build_isolated/gazebo_plugins and set gazebo_DIR to <br />
<br />
<span style="font-family: Courier New, Courier, monospace;">/home/lwalter/other/install/share/gazebo/cmake</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span>
<h4>
SDFormat</h4>
<div>
<br /></div>
Now it looks like the Debian-supplied sdformat is conflicting with the one gazebo built; uninstall it and rebuild the ros_catkin_ws <br />
<br />
<br />
<br />
<span style="font-family: Courier New, Courier, monospace;">cd /home/lwalter/other/ros_catkin_ws/build_isolated/gazebo_plugins<br />cmake-gui .</span></div>
<div>
<br /></div>
<div>
SDFormat_DIR needs to be set to<br />
<span style="font-family: Courier New, Courier, monospace;">/home/lwalter/other/install//lib/x86_64-linux-gnu/cmake/sdformat</span><br />
<br />
Have to set the above for several packages.<br />
<br />
<h3>
RVIZ build problems with libshiboken</h3>
<span style="font-family: Courier New, Courier, monospace;"><br /><br />Linking CXX shared library /home/lwalter/other/ros_catkin_ws/devel_isolated/rviz/lib/libdefault_plugin.so<br />[ 95%] Built target default_plugin<br />make: *** [all] Error 2<br /><== Failed to process package 'rviz':<br /> Command '/home/lwalter/other/ros_catkin_ws/install_isolated/env.sh make -j4 -l4' returned non-zero exit status 2<br />Reproduce this error by running:<br />==> cd /home/lwalter/other/ros_catkin_ws/build_isolated/rviz && /home/lwalter/other/ros_catkin_ws/install_isolated/env.sh make -j4 -l4</span><br />
<br />
<br />
Investigate this with make VERBOSE=1</div>
<div>
<br />
<br />
<span style="font-family: Courier New, Courier, monospace;">...<br /> type 'QX11EmbedWidget' is specified in typesystem, but not defined. This could potentially lead to compilation errors.<br />Segmentation fault (core dumped)<br />make[2]: *** [src/python_bindings/shiboken/librviz_shiboken/librviz_shiboken_module_wrapper.cpp] Error 139<br />make[2]: Leaving directory `/home/lwalter/other/ros_catkin_ws/build_isolated/rviz'<br />make[1]: *** [src/python_bindings/shiboken/CMakeFiles/rviz_shiboken.dir/all] Error 2<br />make[1]: Leaving directory `/home/lwalter/other/ros_catkin_ws/build_isolated/rviz'<br />make: *** [all] Error 2</span></div>
<div>
There is some discussion of probably the same issue at<br />
<a href="https://aur.archlinux.org/packages/ros-hydro-rviz/">https://aur.archlinux.org/packages/ros-hydro-rviz/</a><br />
<br />
The solution seems to be to remove shiboken:<br />
<br />
sudo apt-get remove libshiboken-dev<br />
<br />
Cmake generates this new warning output:</div>
<div>
<br /></div>
<div>
<br />
<span style="font-family: Courier New, Courier, monospace;">Add the installation prefix of "GeneratorRunner" to CMAKE_PREFIX_PATH or<br />set "GeneratorRunner_DIR" to a directory containing one of the above files.<br />If "GeneratorRunner" provides a separate development package or SDK, be<br />sure it has been installed.<br />Call Stack (most recent call first):<br />src/python_bindings/shiboken/CMakeLists.txt:9 (include)<br />CMake Warning at /home/lwalter/other/ros_catkin_ws/install_isolated/share/python_qt_binding/cmake/shiboken_helper.cmake:41 (message):<br />Shiboken binding generator NOT available.<br />Call Stack (most recent call first):<br />src/python_bindings/shiboken/CMakeLists.txt:9 (include)<br />SIP binding generator available.<br />Python binding generators: sip<br />Configuring done</span><br />
<br />
But the packages all build and install now.</div>
<div>
<br />
<h3>
Misc</h3>
<div>
<br /></div>
Next, try building the catkin workspace with the projects I'm working on. The first thing missing appears to be the joy package, so clone it and rerun the catkin make install in the main ROS catkin workspace:</div>
<div>
<br /></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">git clone https://github.com/ros-drivers/joystick_drivers.git<br />sudo apt-get install libusb-dev libspnav-dev</span><br />
<div>
<br /></div>
<div>
What I don't understand about re-running ./src/catkin/bin/catkin_make_isolated --install is how much has to be re-done even when nothing or very little has changed. Object files are correctly recognized as already compiled, but something higher level gets dirtied, and many shared-library links and install scripts are rerun, presumably regenerating exactly the same output files that were already there.</div>
</div>
<div>
<br /></div>
<div>
<br /></div>
</div>
</div>
</div>
Anonymoushttp://www.blogger.com/profile/12190442172800693364noreply@blogger.com0tag:blogger.com,1999:blog-28093388.post-65637486542824838702013-10-24T12:44:00.001-07:002013-10-24T12:56:12.563-07:00<h3>
Software Archaeology #1: GPS tagged street video</h3>
Around 10 years ago I was working on a number of personal software projects with a mostly common C++ code-base that had a lot of boilerplate OpenGL and vector classes I'd built up from reading the <a href="http://nehe.gamedev.net/">NeHe tutorials</a>. Some of that work was <a href="http://bioviewer.sourceforge.net/">properly documented, put into source control, and made public</a>; the rest was periodically made into version-numbered tarballs. When I finished or lost interest in developing some graphics technique or physics simulation or anything else, I would rename the directory to reflect the new project and start on new functionality: some of the old code was still useful, some of it had to get ifdeffed out, and some just sat unused. Some of those projects were <a href="http://icculus.org/~lucasw/">documented but not open-sourced</a>, and a few of those tarballs were archived in my online home directory. Eventually a lot of the code was superseded by vastly superior open source libraries, so it didn't make sense to continue using it, but I would sometimes make backups of the old stuff on DVD and copy them to multiple hard drives as I bought them, with less and less care as time went by. <br />
<div>
<br /></div>
<div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjXnkbbzGKbgYVWmXB1OVrbcwgXkFnuQZpL7A0F874PoRNjcKhksm-yBN8bS1UAesT7M3Ir3Ae5vGPl1wk9JiHdYz9Y_8KNMiBecGw-gD9gf5bOI-I2c-Ydb6UcqOZVgW_LlQVL3A/s1600/PICT0225+car+tripod.JPG" imageanchor="1" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjXnkbbzGKbgYVWmXB1OVrbcwgXkFnuQZpL7A0F874PoRNjcKhksm-yBN8bS1UAesT7M3Ir3Ae5vGPl1wk9JiHdYz9Y_8KNMiBecGw-gD9gf5bOI-I2c-Ydb6UcqOZVgW_LlQVL3A/s320/PICT0225+car+tripod.JPG" width="240" /></a></div>
Fast forward to the present: reading a section of <a href="http://www.amazon.com/Planet-Google-Companys-Audacious-Everything/dp/1416546960">Planet Google</a> about StreetView started me thinking about a particular project where I was driving around Seattle with a DV camera mounted on the passenger side and a GPS receiver on my roof being logged to a laptop. I'm pretty sure I was inspired by reading about the <a href="http://en.wikipedia.org/wiki/Aspen_Movie_Map">Aspen Movie Map</a> in the <a class="g-profile" href="http://plus.google.com/105273428597140573510" target="_blank">+Howard Rheingold</a> book <a href="http://books.google.com/books/about/Virtual_reality.html?id=hHZQAAAAMAAJ">Virtual Reality</a>.<br />
<br />
<br /></div>
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
</div>
<div>
Some OpenGL software loaded the images extracted from the video and then displayed them on top of a 3D GPS trajectory. It worked fine, but I only ran it once, took no screenshots or videos, and told no more than one or two people about it. Maybe I thought it was such a good idea that it had to be kept secret until the opportunity to capitalize arose; obviously that opportunity is now long past. But it was still fun to have done, and having it run again would be cool... but I couldn't find it on any of my still-running desktop computers or laptops. Eventually I found a 250GB Maxtor drive in a shoebox and plugged it in with a usb-to-sata adapter, and there it was: 700 megabytes of video and images all nicely organized along with scripts and source code. And it compiled: after resolving the SDL dependencies, the only thing I had to do was reorder the link command so the -lGL etc. linker options came after the object files: <span style="font-family: Courier New, Courier, monospace;">$(CXX) -o $(PROGRAM) $(OBJECTS) $(LIBS)</span> instead of <span style="font-family: Courier New, Courier, monospace;">$(CXX) -o $(PROGRAM) $(LIBS) $(OBJECTS)</span>. And it ran fine with <span style="background-color: white; color: #222222; font-family: 'Courier New', Courier, monospace; font-size: 16px;">./gpsimage --gps ../capture_10_22_</span><wbr style="background-color: white; color: #222222; font-size: 16px;"></wbr><span style="background-color: white; color: #222222; font-size: 16px;"><span style="font-family: Courier New, Courier, monospace;">2004.txt --bmp biglist.txt</span><span style="font-family: inherit;">, and with some minor modification to the keyboard controls and the resolution I was able to take screenshots and a video:</span></span></div>
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen='allowfullscreen' webkitallowfullscreen='webkitallowfullscreen' mozallowfullscreen='mozallowfullscreen' width='320' height='266' src='https://www.youtube.com/embed/AdT3bh-SrUY?feature=player_embedded' frameborder='0'></iframe></div>
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh1IFUc5FiJBxsxs6kpGeLgVrnFFNEXT2I4nEZbtMDxrAF5HLT86keoH1VZLVv5v-0WSGTer8Gq4tHtLzA6ott8YpHzsFTPZRgCQaTQxsvN8sacMGJaz4_KZ4wdQ6aTgNxLeH2j6A/s1600/vlcsnap-2013-10-24-07h49m27s3.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="188" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh1IFUc5FiJBxsxs6kpGeLgVrnFFNEXT2I4nEZbtMDxrAF5HLT86keoH1VZLVv5v-0WSGTer8Gq4tHtLzA6ott8YpHzsFTPZRgCQaTQxsvN8sacMGJaz4_KZ4wdQ6aTgNxLeH2j6A/s320/vlcsnap-2013-10-24-07h49m27s3.png" width="320" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Ballard surface streets</td></tr>
</tbody></table>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhqYAe_YDbNxnd6DPPrf7YnFlq4oUVq9p22LWRi1YbtM4t-318kx85abuHxjgdfTPNTvwzOpD3Pl9lJSDp1r_9zLtp-F4ELa2mN9E3iB_cdRsd5TsFqqt6g1eGma-kMEToJ1aTySw/s1600/vlcsnap-2013-10-24-07h50m06s140.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="188" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhqYAe_YDbNxnd6DPPrf7YnFlq4oUVq9p22LWRi1YbtM4t-318kx85abuHxjgdfTPNTvwzOpD3Pl9lJSDp1r_9zLtp-F4ELa2mN9E3iB_cdRsd5TsFqqt6g1eGma-kMEToJ1aTySw/s320/vlcsnap-2013-10-24-07h50m06s140.png" width="320" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
</div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiGtxZ5xBq8fz_3_8Wnimj4p1-5E1SHj2i-KtBWWu9dIUNFFxdTDsuA1GtPt9w4S9BR7PZzXzlu_s5N0lMxuseQlOCGO7C8edc77-b7_MFG3DfI7OJ6KlEz2XbPCsvr5UOMwyeFnA/s1600/vlcsnap-2013-10-24-07h52m23s213.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="188" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiGtxZ5xBq8fz_3_8Wnimj4p1-5E1SHj2i-KtBWWu9dIUNFFxdTDsuA1GtPt9w4S9BR7PZzXzlu_s5N0lMxuseQlOCGO7C8edc77-b7_MFG3DfI7OJ6KlEz2XbPCsvr5UOMwyeFnA/s320/vlcsnap-2013-10-24-07h52m23s213.png" width="320" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Ballard surface streets</td></tr>
</tbody></table>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj5epxfMsWB9uLy1wctf-1qAOlPiDROg3VQUrjRapZpNtMD1P-31szfERkaY6TxWNfBcfwwhHIhSQLln-pxdGVpcwLUX4LqbEVAFaefExk5lLakihQYqvAIPY3GEdMifFNOOIDAZw/s1600/vlcsnap-2013-10-24-07h50m24s75.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="188" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj5epxfMsWB9uLy1wctf-1qAOlPiDROg3VQUrjRapZpNtMD1P-31szfERkaY6TxWNfBcfwwhHIhSQLln-pxdGVpcwLUX4LqbEVAFaefExk5lLakihQYqvAIPY3GEdMifFNOOIDAZw/s320/vlcsnap-2013-10-24-07h50m24s75.png" width="320" /></a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgJ-6NLx2_PZ89wRcGOnIkd-EzPmVcGBpN6bN6eDD6D_kuK8TVzHrRjh2M_7nmb24sCF4SYPMfMUZ-5TeuK_RJ_Bf7I7hDXgc9AAMxMTVCdFus3e2uBktqFmfuwcEdkcRex-V3tFQ/s1600/vlcsnap-2013-10-24-07h52m33s93.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="188" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgJ-6NLx2_PZ89wRcGOnIkd-EzPmVcGBpN6bN6eDD6D_kuK8TVzHrRjh2M_7nmb24sCF4SYPMfMUZ-5TeuK_RJ_Bf7I7hDXgc9AAMxMTVCdFus3e2uBktqFmfuwcEdkcRex-V3tFQ/s320/vlcsnap-2013-10-24-07h52m33s93.png" width="320" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhv9xFxM9DTUyKUZlqlRMiMHBJemiDvhpSsLYvh4GEZPvarbx6YCYYirIyZHwxM_uiWwijEqc18ucQJ7ekM_4kCA34wl8gUH-zj4HJ-ro1fXEpRJcW_8skDYjMk7qcZOBB4PzcQPA/s1600/vlcsnap-2013-10-24-07h52m42s205.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="188" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhv9xFxM9DTUyKUZlqlRMiMHBJemiDvhpSsLYvh4GEZPvarbx6YCYYirIyZHwxM_uiWwijEqc18ucQJ7ekM_4kCA34wl8gUH-zj4HJ-ro1fXEpRJcW_8skDYjMk7qcZOBB4PzcQPA/s320/vlcsnap-2013-10-24-07h52m42s205.png" width="320" /></a></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjIbPkIGoI-TsLfslyZkm-AjlBeuNfBDN89D5rGb5-uVankDGd95pFwkWOjOyqVXlaG08iGbJz5dXzB1YiVVgm8g4GcQa6Uzx87KnJYc6gJIJVB6A0dADdgXGV8By72t2823AFpTg/s1600/vlcsnap-2013-10-24-07h53m39s241.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="188" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjIbPkIGoI-TsLfslyZkm-AjlBeuNfBDN89D5rGb5-uVankDGd95pFwkWOjOyqVXlaG08iGbJz5dXzB1YiVVgm8g4GcQa6Uzx87KnJYc6gJIJVB6A0dADdgXGV8By72t2823AFpTg/s320/vlcsnap-2013-10-24-07h53m39s241.png" width="320" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Exiting the tunnel to get on the viaduct</td></tr>
</tbody></table>
<div class="separator" style="clear: both; text-align: center;">
</div>
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="clear: right; margin-bottom: 1em; margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg_YJR8AfpWHTtgGiLkiu2j-8e6cM8uZ2ClXg39wK6K357OBwHIdU5rSJOJ-_0RPJLDMPRkwmilS-RKCU0dmloTej_iYRvt02Jemb-FQlK6HWabbBi3SH9oqROBkCgrm4zI-50NcA/s1600/vlcsnap-2013-10-24-07h54m25s202.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="188" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg_YJR8AfpWHTtgGiLkiu2j-8e6cM8uZ2ClXg39wK6K357OBwHIdU5rSJOJ-_0RPJLDMPRkwmilS-RKCU0dmloTej_iYRvt02Jemb-FQlK6HWabbBi3SH9oqROBkCgrm4zI-50NcA/s320/vlcsnap-2013-10-24-07h54m25s202.png" width="320" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Driving south on the 99 viaduct looking west</td></tr>
</tbody></table>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjqf3ijxTqbwZK8tiQtRf-GMCa8-pn7dfbYRv4xZnP4Yt7yWFnRsxob_hAB4HRCkH3_HlXU8J2_ylYFRBCqf3POJvCNY1lAKcaP4g1jPSKIuEykgJ4LwUmvfeGho0ubsOGAGbH6Sw/s1600/vlcsnap-2013-10-24-07h54m02s212.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="188" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjqf3ijxTqbwZK8tiQtRf-GMCa8-pn7dfbYRv4xZnP4Yt7yWFnRsxob_hAB4HRCkH3_HlXU8J2_ylYFRBCqf3POJvCNY1lAKcaP4g1jPSKIuEykgJ4LwUmvfeGho0ubsOGAGbH6Sw/s320/vlcsnap-2013-10-24-07h54m02s212.png" width="320" /></a></div>
<br />
<h3>
<span style="background-color: white; color: #222222; font-size: 16px;"><span style="font-family: inherit;">Implementation</span></span></h3>
<div>
It might be nice to actually check in some of the code to github or something, but for now I'll document the important parts here.</div>
<div>
<br /></div>
<div>
I used <a href="http://linux.die.net/man/1/dvgrab">dvgrab</a> to extract video from the camera, and converted that to decimated, timestamped bmp images. The text GPS log, which looks like this:</div>
<div>
<br /></div>
<div>
<div>
<span style="font-family: Courier New, Courier, monospace;">$GPGGA,162651.395,4740.2379,N,12222.4207,W,1,06,1.5,15.0,M,-17.3,M,0.0,0000*7E</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">$GPGSA,A,3,23,13,16,20,01,25,,,,,,,2.8,1.5,2.4*3A</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">$GPGSV,3,1,09,23,81,041,46,13,51,298,48,16,46,083,46,20,42,175,44*7F</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">$GPGSV,3,2,09,01,20,100,37,04,19,284,34,27,19,240,40,25,16,061,40*7E</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">$GPGSV,3,3,09,24,12,320,30*47</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">$GPRMC,162651.395,A,4740.2379,N,12222.4207,W,22.57,179.63,221004,,*2C</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">$GPGGA,162652.395,4740.2316,N,12222.4208,W,1,06,1.5,14.4,M,-17.3,M,0.0,0000*7E</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">$GPGSA,A,3,23,13,16,20,01,25,,,,,,,2.8,1.5,2.4*3A</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">$GPRMC,162652.395,A,4740.2316,N,12222.4208,W,22.64,178.75,221004,,*2F</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">$GPGGA,162653.395,4740.2253,N,12222.4208,W,1,06,1.5,13.8,M,-17.3,M,0.0,0000*74</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">$GPGSA,A,3,23,13,16,20,01,25,,,,,,,2.8,1.5,2.4*3A</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">$GPRMC,162653.395,A,4740.2253,N,12222.4208,W,22.76,178.28,221004,,*25</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">$GPGGA,162654.395,4740.2189,N,12222.4208,W,1,06,1.5,13.2,M,-17.3,M,0.0,0000*7D</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">$GPGSA,A,3,23,13,16,20,01,25,,,,,,,2.8,1.5,2.4*3A</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">...</span></div>
</div>
<div>
<br /></div>
<div>
was converted like this:</div>
<div>
<br /></div>
<div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> ifstream parts(fileName.c_str());</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> if (!parts) {</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> OUT("File \"" << fileName << "\" not found.");</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> exit(1);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> }</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> vector3f initialPos;</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> string lines;</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> while (getline(parts,lines)) {</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> //cout << lines << "\n";</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> vector<string> tokens = tokenize(lines,",");</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> if ((tokens.size() > 0) && (tokens[0] == "$GPGGA") && tokens.size() > 9) {</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> float rawTime = atof(tokens[1].c_str());</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> int tsec = (int)rawTime%100;</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> int tmin = ((int)rawTime/100)%100;</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> /// convert to local time</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> int thr = (int)rawTime/10000 -7;</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> float time = (float)thr + ((float)tmin+tsec/60.0f)/60.0f;</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> vector3f pos = vector3f(10000.0f*atof(tokens[2].c_str())-initialPos[0],</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> atof(tokens[9].c_str())-initialPos[1],</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> -10000.0f*atof(tokens[4].c_str())- initialPos[2]</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> );</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> if (initialPos == vector3f()) {</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> initialPos = pos;</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> pos = vector3f(0,0,0);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> }</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> pair<float,vector3f> tp(time,pos);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> timePos.push_back(tp);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> }</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> }</span></div>
</div>
<div>
<br /></div>
<div>
<br /></div>
<div>
(tokenize was a helper function to split up lines of text; I think the C++ standard library didn't provide one at the time.)</div>
<div>
<br /></div>
<div>
The timestamped bmp files look like this in a directory:</div>
<div>
<br /></div>
<div>
<div>
<span style="font-family: Courier New, Courier, monospace;">vid_2004.10.20_09-24-49.bmp</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">vid_2004.10.20_09-24-50.bmp</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">vid_2004.10.20_09-24-51.bmp</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">vid_2004.10.20_09-24-52.bmp</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">vid_2004.10.20_09-24-53.bmp</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">vid_2004.10.20_09-24-54.bmp</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">...</span></div>
</div>
<div>
<br /></div>
<div>
And read in like this:</div>
<div>
<br /></div>
<div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> ifstream bmpList(bmpListFileName.c_str());</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> if (!bmpList) {</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> OUT("File \"" << fileName << "\" not found.");</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> exit(1);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> }</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> while (getline(bmpList,lines)) {</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> vector<string> tokens = tokenize(lines,".");</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> if (tokens.size() > 3) {</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> string messyTime = tokens[tokens.size()-2];</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> vector<string> items = tokenize(tokenize(messyTime,"-"),"_");</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> if (items.size() == 4) {</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> //OUT( items[1] << ":" << items[2] << ":" << items[3]);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> float time = atof(items[1].c_str())</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> +(atof(items[2].c_str())</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> +(atof(items[3].c_str())/60.0f))/60.0f;</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> /// arbitrary offset to match gps to images better</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> time += .012f;</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> timeImage.push_back(pair<float,string>(time,lines));</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> } else {</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> OUT("list time wrongly formatted " << messyTime);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> }</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> } else {</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> OUT("list items have wrong format" << lines);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> }</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> }</span></div>
</div>
<div>
<br /></div>
<div>
<br /></div>
<div>
Then a brute-force O(n^2) search matches each image timestamp to the surrounding pair of GPS timestamps and interpolates:</div>
<div>
<br /></div>
<div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> /// using the times extracted from the bmp file names, find what the closest</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> /// gps coordinates for those times</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> for (unsigned i = 0; i < timeImage.size(); i++) {</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> for (unsigned j = 0; j < timePos.size()-1; j++) {</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> if ((timePos[j].first <= timeImage[i].first)</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> && (timePos[j+1].first > timeImage[i].first)) {</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> struct tpi newTpi;</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> newTpi.time = timeImage[i].first;</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> /// interpolate - is this working?</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> float factor = (newTpi.time - timePos[j].first)</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> / (timePos[j+1].first - timePos[j].first);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> //OUT(i << " " <<j << " " <<factor); </span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> newTpi.pos = timePos[j].second</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> + (timePos[j+1].second - timePos[j].second) * factor;</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> createTexture(newTpi.texture, timeImage[i].second);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> /// don't interpolate just use the same point</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> //newTpi.pos = timePos[j].second;</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> /// attitude</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> vector3f up = vector3f(0,1.0f,0);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> /// this is arbitrary based on the fact the video was shot at a right angle to </span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> /// the direction of travel</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> vector3f right = (timePos[j+1].second - timePos[j].second);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> right = right/right.Length();</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
</div>
<div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> // make all axes orthogonal</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> vector3f out = Cross(up,right);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> up = Cross(right,out);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> // normalize</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> out = out/out.Length();</span></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhP-Iy2SCx5IA5gK2KiTgpOTMKRysTiCU9bInmJpo3MLrkbKVpQpeAXJQlk0Yf9oXNsb7xXBbucljY1NWMV63Kcn2N0BvY6sTtu_HevgHjSpLqsPLbDl5UsMKIdtjwTPh1IG3V7yA/s1600/vlcsnap-2013-10-24-07h48m45s170.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="188" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhP-Iy2SCx5IA5gK2KiTgpOTMKRysTiCU9bInmJpo3MLrkbKVpQpeAXJQlk0Yf9oXNsb7xXBbucljY1NWMV63Kcn2N0BvY6sTtu_HevgHjSpLqsPLbDl5UsMKIdtjwTPh1IG3V7yA/s320/vlcsnap-2013-10-24-07h48m45s170.png" width="320" /></a></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> up = up/up.Length();</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> newTpi.attitude.Set(right,up,out);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> /// scale</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> if (i >0) {</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> newTpi.scale = (newTpi.pos - tpiList[i-1].pos).Length()/2.0f;</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> } else {</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> newTpi.scale = 5.0f;</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> }</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> tpiList.push_back(newTpi);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> }</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> }</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> }</span></div>
</div>
<div>
<br /></div>
<div>
And then draw it later:</div>
<div>
<br /></div>
<div>
<div>
<span style="font-family: Courier New, Courier, monospace;">void gps::draw()</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">{</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> /// the gps signal</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glPushAll();</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glColor3f(0.67398f,.459f, 0.459f);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glBegin(GL_LINE_STRIP);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> for (unsigned i = 0; i <timePos.size(); i++) {</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> /// subtract first position to make path always start from origin</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glVertex3fv((timePos[i].second).vertex);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> }</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glEnd();</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glColor3f(0.67398f,.159f, 0.059f);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glPointSize(9.0f);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glBegin(GL_POINTS);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> for (unsigned i = 0; i <timePos.size(); i++) {</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> /// subtract first position to make path always start from origin</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glVertex3fv((timePos[i].second).vertex);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> }</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glEnd();</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> /// interpolated image position</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glColor3f(0.37398f, 0.659f, 0.459f);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glBegin(GL_LINE_STRIP);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> for (unsigned i = 0; i < tpiList.size(); i++) {</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glVertex3fv((tpiList[i].pos).vertex);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> }</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glEnd();</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">/* glColor3f(0.17398f,0.559f, 0.859f);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glPointSize(10.0f);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glBegin(GL_POINTS);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> for (unsigned i = 0; i < tpiList.size(); i++) {</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glVertex3fv((tpiList[i].pos).vertex); </span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> } </span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glEnd(); </span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">*/</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glPopAll();</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glPushAll();</span></div>
</div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glEnable(GL_TEXTURE_2D);</span><br />
<span style="font-family: Courier New, Courier, monospace;"> glColor3f(1.0f,1.0f,1.0f);</span><br />
<span style="font-family: Courier New, Courier, monospace;"><br /></span>
<span style="font-family: Courier New, Courier, monospace;"> /// always pointed at camera </span><br />
<span style="font-family: Courier New, Courier, monospace;"> //matrix16f temp = Registry::instance()->theCamera->location;</span><br />
<span style="font-family: Courier New, Courier, monospace;"> //temp.SetTranslation(vector3f(0.0f,0.0f,0.0f));</span><br />
<span style="font-family: Courier New, Courier, monospace;"><br /></span>
<span style="font-family: Courier New, Courier, monospace;"> vector3f loc = Registry::instance()->theCamera->location.GetTranslation();</span><br />
<span style="font-family: Courier New, Courier, monospace;"><br /></span>
<span style="font-family: Courier New, Courier, monospace;"> int oldI = 0;</span><br />
<span style="font-family: Courier New, Courier, monospace;"> for (unsigned i = 0; i < tpiList.size(); i++) {</span><br />
<span style="font-family: Courier New, Courier, monospace;"> float scale = tpiList[i].scale;</span><br />
<span style="font-family: Courier New, Courier, monospace;"><br /></span>
<span style="font-family: Courier New, Courier, monospace;"> /// simple distance culling</span><br />
<span style="font-family: Courier New, Courier, monospace;"> float dist = (loc - tpiList[i].pos).Length();</span><br />
<span style="font-family: Courier New, Courier, monospace;"> /*if ((dist >= 5000)) {</span><br />
<span style="font-family: Courier New, Courier, monospace;"> /// make far away textures bigger, and show less of them</span><br />
<span style="font-family: Courier New, Courier, monospace;"> float f= dist/5000;</span><br />
<span style="font-family: Courier New, Courier, monospace;"> f =f*f;</span><br />
<span style="font-family: Courier New, Courier, monospace;"> i += (int)f+1;</span><br />
<span style="font-family: Courier New, Courier, monospace;"> scale*= f;</span><br />
<span style="font-family: Courier New, Courier, monospace;"> }*/</span><br />
<span style="font-family: Courier New, Courier, monospace;"> if ((dist > 3000) && (dist <= 8000)) {</span><br />
<span style="font-family: Courier New, Courier, monospace;"> if (i%5==0) {</span><br />
<span style="font-family: Courier New, Courier, monospace;"> //i+=10;</span><br />
<span style="font-family: Courier New, Courier, monospace;"> scale *=5;</span><br />
<span style="font-family: Courier New, Courier, monospace;"> } else {</span><br />
<span style="font-family: Courier New, Courier, monospace;"> dist = 20000;</span><br />
<span style="font-family: Courier New, Courier, monospace;"> }</span><br />
<span style="font-family: Courier New, Courier, monospace;"> }</span><br />
<span style="font-family: Courier New, Courier, monospace;"> if (dist > 8000) {</span><br />
<span style="font-family: Courier New, Courier, monospace;"> if (i%10==0) {</span><br />
<span style="font-family: Courier New, Courier, monospace;"> //i+=10;</span><br />
<span style="font-family: Courier New, Courier, monospace;"> scale *=10;</span><br />
<span style="font-family: Courier New, Courier, monospace;"> } else {</span><br />
<span style="font-family: Courier New, Courier, monospace;"> dist = 20000;</span><br />
<span style="font-family: Courier New, Courier, monospace;"> }</span></div>
<div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> }</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> if (dist < 16000) {</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glBindTexture(GL_TEXTURE_2D, tpiList[i].texture);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glBegin(GL_QUADS);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> matrix16f temp = tpiList[i].attitude;</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glTexCoord2f(0.0f, 0.0f);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glVertex3fv((tpiList[i].pos+temp.Transform(scale*vector3f(1.0,1.0,0.0))).vertex);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glTexCoord2f(1.0f, 0.0f);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glVertex3fv((tpiList[i].pos+temp.Transform(scale*vector3f(-1.0,1.0,0.0))).vertex);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glTexCoord2f(1.0f, 1.0f);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glVertex3fv((tpiList[i].pos+temp.Transform(scale*vector3f(-1.0,-1.0,0.0))).vertex);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glTexCoord2f(0.0f, 1.0f);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glVertex3fv((tpiList[i].pos+temp.Transform(scale*vector3f(1.0,-1.0,0.0))).vertex);</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glEnd();</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> }</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> oldI = i;</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> }</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"> glPopAll();</span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;"><br /></span></div>
<div>
<span style="font-family: Courier New, Courier, monospace;">}</span></div>
</div>
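The distance-based thinning in gps::draw() above (every 5th billboard drawn at 5x scale between 3000 and 8000 units, every 10th at 10x beyond 8000, everything culled past 16000) is easier to see in isolation. A minimal sketch of the same selection rule, in Python rather than the original C++, with the function name invented for illustration:

```python
def lod_select(index, dist):
    """Mirror the culling rule in gps::draw():
    returns (should_draw, scale_multiplier) for one billboard."""
    scale = 1.0
    if 3000 < dist <= 8000:
        if index % 5 == 0:
            scale *= 5  # draw every 5th billboard, enlarged
        else:
            dist = 20000  # push past the far limit to cull it
    if dist > 8000:
        if index % 10 == 0:
            scale *= 10  # draw every 10th billboard, enlarged further
        else:
            dist = 20000
    return dist < 16000, scale
```

A billboard rejected inside a band has its distance pushed to 20000, so the final dist < 16000 test culls it, matching the original control flow.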
<div>
<br />
<h3>
Future</h3>
</div>
<div>
<br />
A few other old projects could be revived, though some have more obscure dependencies (paragui, and maybe another OpenGL GUI). It's not a high priority, but it would be better to create records now than to wait for more bitrot to set in. I also have a renewed interest in low-ish level OpenGL, so it would be nice to get refreshed on the work I've already done.</div>
Anonymoushttp://www.blogger.com/profile/12190442172800693364noreply@blogger.com0tag:blogger.com,1999:blog-28093388.post-50484093270946076362013-07-25T00:09:00.001-07:002013-07-25T00:09:57.432-07:00Turn a set of mp3s into static image music videos<br />
I wanted to take a directory full of mp3s, in this case a bunch of Creative Commons Attribution tracks from Kevin MacLeod (<a href="http://incompetech.com/music/">http://incompetech.com/music/</a>), make videos that simply show the artist and track name, and then string many of those videos together into a longer compilation. The Linux bash script to do this follows.<br />
<br />
It seems like ffmpeg fails to concatenate once the combined video reaches an hour in length: I would get a segfault at that point. The music and video were also drifting out of sync, which causes each title card to run longer than its music; I'll have to look into that.<br />
<br />
<iframe allowfullscreen="" frameborder="0" height="390" src="http://www.youtube.com/embed/iJ_WavjyUoI" width="640"></iframe>
<b>Make title image videos from a directory of mp3s:
</b>
<br />
<pre><code>
mkdir output
rm output/*
for i in *mp3;
do
convert -background black -fill white \
-size 1920x1080 -pointsize 80 -gravity center \
 label:"Kevin MacLeod\n\n`echo $i | sed 's/\.mp3$//'`" output/"$i.png"
# TBD replace with ffmpeg
avconv -loop 1 -r 1 -i output/"$i.png" -i "$i" -c:v libx264 -c:a aac -strict experimental -shortest output/"$i.mp4"
done
</code></pre>
Then concatenate into one long video (thanks to <a href="https://trac.ffmpeg.org/wiki/How%20to%20concatenate%20(join,%20merge)%20media%20files">https://trac.ffmpeg.org/wiki/How%20to%20concatenate%20(join,%20merge)%20media%20files</a>)
<br />
<pre><code>
rm all_videos.txt
for i in *mp4;
do
echo $i
echo "file '$i'" >> all_videos.txt
done
mkdir output
ffmpeg -f concat -i all_videos.txt -c copy output/kevin_macleod_1.mp4
</code></pre>
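One caveat with the list-building loop above: the concat demuxer expects single quotes inside a quoted filename to be escaped as '\''. A minimal Python sketch of the same list generation with that escaping handled (write_concat_list is a name invented here for illustration):

```python
import os

def write_concat_list(mp4_dir, list_path):
    """Write an ffmpeg concat-demuxer list, replacing each embedded
    single quote with '\'' (close quote, escaped quote, reopen quote)."""
    names = sorted(n for n in os.listdir(mp4_dir) if n.endswith(".mp4"))
    with open(list_path, "w") as f:
        for name in names:
            f.write("file '%s'\n" % name.replace("'", "'\\''"))
    return names
```

The resulting list file can then be fed to the same `ffmpeg -f concat` command as above.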
Anonymoushttp://www.blogger.com/profile/12190442172800693364noreply@blogger.com0tag:blogger.com,1999:blog-28093388.post-46838423083108162462013-05-23T22:17:00.000-07:002013-05-23T22:19:13.057-07:00soundpaint<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen='allowfullscreen' webkitallowfullscreen='webkitallowfullscreen' mozallowfullscreen='mozallowfullscreen' width='320' height='266' src='https://www.youtube.com/embed/xz0ClQ67k7I?feature=player_embedded' frameborder='0'></iframe></div>
<a href="http://www.youtube.com/watch?v=xz0ClQ67k7I">http://www.youtube.com/watch?v=xz0ClQ67k7I</a><br />
<br />
Draw sound waveforms with a mouse, then play the sounds with keys that vary in pitch. The frequency and phase spectrum can also be manipulated in the same way.<br />
<br />
Mostly I want to create crude chiptune sound effects, which it can do pretty well, though I think it needs more layering and modulation capability to be really useful. Also, most of the interesting frequencies sit in the left-hand fifth of the frequency plot; the ability to zoom there, and on the time waveform, would be very useful. Doubling or tripling the horizontal resolution devoted to the plots would be nice as well.<br />
<br />
The mouse drawing code is pretty crude, it can't even interpolate between two different sampled mouse y positions yet.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgi0Zr5IZ_9J4KvEcQCATsQ81fxUJZCdXuCPBy5h_XeLvb7EUiCZ6ONBs6GGaaPJEm7fkbybHKSpQOqiz_1MW8_0gxZhsjouHkp8beVdQwJISTVI_DXOOWiBXzyo2dYHCZ5mFWSHg/s1600/Screenshot+from+2013-05-23+22:11:57.png" imageanchor="1" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" height="379" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgi0Zr5IZ_9J4KvEcQCATsQ81fxUJZCdXuCPBy5h_XeLvb7EUiCZ6ONBs6GGaaPJEm7fkbybHKSpQOqiz_1MW8_0gxZhsjouHkp8beVdQwJISTVI_DXOOWiBXzyo2dYHCZ5mFWSHg/s640/Screenshot+from+2013-05-23+22:11:57.png" width="640" /></a></div>
<br />
<br />
I used Processing and the minim sound library, which doesn't directly support manipulating or viewing phase information. The trick was to subclass FFT like this:<br />
<br />
<a href="https://github.com/lucasw/soundpaint/blob/master/soundpaint.pde#L40">https://github.com/lucasw/soundpaint/blob/master/soundpaint.pde#L40</a><br />
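Independent of minim, the idea behind the subclass is that a forward transform gives one magnitude and one phase per bin, and either can be edited before resynthesis. A toy round trip in Python (naive O(n^2) DFT, no minim involved, names invented for illustration):

```python
import cmath

def dft(samples):
    """Naive forward DFT returning (magnitude, phase) per bin."""
    n = len(samples)
    bins = []
    for k in range(n):
        s = sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n))
        bins.append((abs(s), cmath.phase(s)))
    return bins

def idft(bins):
    """Rebuild the waveform from (possibly edited) magnitude/phase bins."""
    n = len(bins)
    out = []
    for t in range(n):
        s = sum(mag * cmath.exp(1j * (ph + 2 * cmath.pi * k * t / n))
                for k, (mag, ph) in enumerate(bins))
        out.append(s.real / n)
    return out
```

Editing an entry of bins before calling idft is the same kind of manipulation the soundpaint spectrum plots expose with the mouse.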
<br />
<br />Anonymoushttp://www.blogger.com/profile/12190442172800693364noreply@blogger.com0tag:blogger.com,1999:blog-28093388.post-43783864147329502332011-11-23T20:03:00.001-08:002011-11-28T13:07:09.300-08:00Lunar DTM100 to Blender displacment map<div>
This post is an aggregation of multiple threads created on Google Plus, plus additional findings; the scattered nature of those threads made it hard to find all the information in one place. It's also a work in progress with some blank spots: help is welcome!<br />
<br />
(I've mostly moved to Google Plus, which is great for mini-blogging (as opposed to the micro-blogging of Twitter or full-size regular Blogger blogging) and has very good engagement once you find good people to put in your circles. I expect greater Plus/Blogger integration in the future, probably starting with comments becoming plusified.)<br />
<br />
<table style="width: auto;"><tbody>
<tr><td><a href="https://picasaweb.google.com/lh/photo/Xv6S3L3VANIMyLaK05bDp9MTjNZETYmyPJy0liipFm0?feat=embedwebsite"><span class="Apple-style-span" style="font-family: inherit;"><img height="356" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiRjMh3cZVWCLyUM26Q4aHYkrdTyRAgYTqZdu-1yeip4QVLnKauyfuQINrN0cx6ryWjaKAmpzxVD9IxQjvor2g758bviISPtqTg4put9zknV0gIU0-V8ndK55V4hCN7mtBX-aiMog/" width="500" /></span></a></td></tr>
<tr><td style="text-align: right;"><span class="Apple-style-span" style="font-family: inherit;">From <a href="https://picasaweb.google.com/wsacul/LROCLunarMap?authuser=0&feat=embedwebsite">LROC Lunar Map</a></span><br />
<div>
<br /></div>
</td></tr>
</tbody></table>
<br />
<b><span class="Apple-style-span" style="font-family: inherit;"><br /></span></b><br />
<b><span class="Apple-style-span" style="font-family: inherit;">Lunar Elevation Data</span></b><br />
<span class="Apple-style-span" style="font-family: inherit;"><br /></span><br />
<span class="Apple-style-span" style="font-family: inherit;">Source DTM files are in the IMG files:</span><br />
<span class="Apple-style-span" style="font-family: inherit;"><br /></span><br />
<span class="Apple-style-span" style="font-family: inherit;"><a class="ot-anchor" href="http://lroc.sese.asu.edu/data/LRO-L-LROC-5-RDR-V1.0/LROLRC_2001/DATA/SDP/WAC_DTM/" style="background-color: white; color: #3366cc; cursor: pointer; line-height: 18px; text-decoration: none;">http://lroc.sese.asu.edu/data/LRO-L-LROC-5-RDR-V1.0/LROLRC_2001/DATA/SDP/WAC_DTM/</a><span class="Apple-style-span" style="background-color: white; line-height: 18px;"> </span></span><br />
<span class="Apple-style-span" style="font-family: inherit;"><br /></span><br />
<span class="Apple-style-span" style="font-family: inherit;">The highest resolution maps are in the </span>100M.IMG files, which means 100 meters/pixel (which seems coarse for an object as close as the moon: why don't we have 1 meter per pixel, or 0.1 meter, yet?).<br />
<br />
I haven't written a script for handling the files that don't completely cover the lunar globe, and will probably use other tools to get that right. (grass gis <a href="http://grass.fbk.eu/gdp/index.php">http://grass.fbk.eu/gdp/index.php</a>?)<br />
<span class="Apple-style-span" style="font-family: inherit;"><br /></span><br />
<span class="Apple-style-span" style="font-family: inherit;">TBD detail on file format.</span><br />
<span class="Apple-style-span" style="font-family: inherit;"><br /></span><br />
<b><br /></b><br />
<span class="Apple-style-span" style="font-family: inherit;"><br /></span><br />
<b><span class="Apple-style-span" style="font-family: inherit;">Generating 32-bit elevation tifs with Python</span></b></div>
<div>
<span class="Apple-style-span" style="font-family: inherit;"><br /></span><br />
<a class="ot-anchor" href="http://code.google.com/p/binarymillenium/source/browse/trunk/python/lroc_to_vtx.py" style="background-color: white; color: #3366cc; cursor: pointer; line-height: 18px; text-decoration: none;"><span class="Apple-style-span" style="font-family: inherit;">http://code.google.com/p/binarymillenium/source/browse/trunk/python/lroc_to_vtx.py</span></a><br />
<span class="Apple-style-span" style="font-family: inherit;"><br /></span></div>
<div>
<span class="Apple-style-span" style="font-family: inherit;"><span style="background-color: white; line-height: 18px;">Current 32-bit tifs:</span></span><br />
<span class="Apple-style-span" style="font-family: inherit;"><a href="https://docs.google.com/open?id=0BwP06t4a405kNDFhNjE3NTUtNTc2YS00NjFlLWExNTgtOTBkZjVmM2M0MGM2" style="background-color: white; color: #3366cc; line-height: 18px; text-decoration: none;" target="_blank">https://docs.google.com/open?<wbr></wbr>id=<wbr></wbr>0BwP06t4a405kNDFhNjE3NTUtNTc2Y<wbr></wbr>S00NjFlLWExNTgtOTBkZjVmM2M0MGM<wbr></wbr>2</a></span><br />
<span class="Apple-style-span" style="font-family: inherit;"><br style="background-color: white; line-height: 18px;" /></span><br />
<span class="Apple-style-span" style="font-family: inherit;"><a href="https://docs.google.com/open?id=0BwP06t4a405kMzA0MGMzY2ItZGJmZC00NDI2LWE5NmYtYjM1ZTI5MmQ1OWZh" style="background-color: white; color: #3366cc; line-height: 18px; text-decoration: none;" target="_blank">https://docs.google.com/open?<wbr></wbr>id=<wbr></wbr>0BwP06t4a405kMzA0MGMzY2ItZGJmZ<wbr></wbr>C00NDI2LWE5NmYtYjM1ZTI5MmQ1OWZ<wbr></wbr>h</a></span><br />
<span class="Apple-style-span" style="font-family: inherit;"><br style="background-color: white; line-height: 18px;" /></span><br />
<a href="https://docs.google.com/open?id=0BwP06t4a405kNzM3ODIyZjktOWM3MC00NDJlLWI3ZGYtNGIwMWRmMWJjYTUz" style="background-color: white; color: #3366cc; line-height: 18px; text-decoration: none;" target="_blank"><span class="Apple-style-span" style="font-family: inherit;">https://docs.google.com/open?<wbr></wbr>id=<wbr></wbr>0BwP06t4a405kNzM3ODIyZjktOWM3M<wbr></wbr>C00NDJlLWI3ZGYtNGIwMWRmMWJjYTU<wbr></wbr>z</span></a></div>
<div>
<span class="Apple-style-span" style="font-family: inherit;"><br /></span><br />
<span class="Apple-style-span" style="font-family: inherit;"><b>Conversion to viewable</b></span><br />
<span class="Apple-style-span" style="font-family: inherit;"><b><br /></b></span></div>
<div>
<span class="Apple-style-span" style="font-family: inherit;">Not a lot of programs can display those 32-bit tifs correctly, and Blender didn't like them. Get a version of imagemagick with hdri enabled (./configure --enable-hdri when building it from source) so they can be converted to friendlier formats.</span><br />
<br /></div>
<span class="Apple-style-span" style="font-family: inherit;">Making an easier to view jpg from the 32-bit tifs:</span><br />
<div>
<span class="Apple-style-span" style="font-family: inherit;"><br /></span><br />
<pre><code>convert -define quantum:scale=255.0 -normalize moon.tif moon_fromtif.jpg</code></pre><br />
<span class="Apple-style-span" style="font-family: inherit;"><br /></span><br />
<br />
<span class="Apple-style-span" style="font-family: inherit;">And the following is the result:</span><br />
<table style="width: auto;"><tbody>
<tr><td><a href="https://picasaweb.google.com/lh/photo/-V7RXQLsAGzZNlKXUsa749MTjNZETYmyPJy0liipFm0?feat=embedwebsite"><span class="Apple-style-span" style="font-family: inherit;"></span></a></td></tr>
<tr><td style="text-align: right;"><div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEja0xPod_-_2E7gD2OUH6iPH3bPrNDdnsja9CUn9MpsJl1g5B3TIa9_BCfGJGn_jbadFOO_9OGmzOIj1Yu2pNwRVm_Frk4cJaEfCUtJZKi6w5dnCINM83iTtI33hbI07t7WsGixhg/s1600/WAC_GLD100_E000N1800_032P_norm_downsized.jpg" imageanchor="1" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" height="72" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEja0xPod_-_2E7gD2OUH6iPH3bPrNDdnsja9CUn9MpsJl1g5B3TIa9_BCfGJGn_jbadFOO_9OGmzOIj1Yu2pNwRVm_Frk4cJaEfCUtJZKi6w5dnCINM83iTtI33hbI07t7WsGixhg/s144/WAC_GLD100_E000N1800_032P_norm_downsized.jpg" width="144" /></a></div>
<span class="Apple-style-span" style="font-family: inherit;">From <a href="https://picasaweb.google.com/wsacul/LROCLunarMap?authuser=0&feat=embedwebsite">LROC Lunar Map</a></span><br />
<div>
<br /></div>
</td></tr>
</tbody></table>
</div>
<div>
<span class="Apple-style-span" style="font-family: inherit;"><br /></span><br />
The jpeg is usable in Blender but doesn't hold up well under a lot of zooming: the 256 levels of elevation possible in a jpeg produce stair-step artifacts:<br />
<table style="width: auto;"><tbody>
<tr><td><a href="https://picasaweb.google.com/lh/photo/4aD7wu5CE12GtVpmghsKBdMTjNZETYmyPJy0liipFm0?feat=embedwebsite"><img height="80" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgOgoi1xNV5-B8Jc8eqZftNBMbgs7LqEId3zIhVqO9NUZBMnKPHt7R0gSPbDEX9MIti0NYLKBvSlq2-trxYNJ8OKApyY_K2t_T0IwRgmkxpw3gs2O2hGct7TiUGA1S-6szdZ9HYCw/s144/moon_quantized.png" width="144" /></a></td></tr>
<tr><td style="font-family: arial,sans-serif; font-size: 11px; text-align: right;">From <a href="https://picasaweb.google.com/wsacul/LROCLunarMap?authuser=0&feat=embedwebsite">LROC Lunar Map</a></td></tr>
</tbody></table>
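The terracing is just 8-bit quantization; a quick pure-Python check (no image tools needed) of how many distinct levels a smooth elevation ramp keeps at 8 versus 16 bits:

```python
def quantize_levels(samples, bits):
    """Quantize 0..1 samples to the given bit depth and count
    how many distinct output levels actually occur."""
    maxval = (1 << bits) - 1
    return len({round(s * maxval) for s in samples})

# 100000 evenly spaced samples standing in for a smooth elevation slope
ramp = [i / 99999.0 for i in range(100000)]
```

An 8-bit image collapses the ramp to 256 levels (0-255), the visible terraces, while 16 bits keeps 65536, which is why a 16-bit format holds up much better under zooming.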
<br />
<b>Conversion to blender usable</b><br />
<br />
So Blender<span class="Apple-style-span" style="font-family: inherit;"> can use a 16-bit format like OpenEXR for 256x smoother gradations; use this conversion:</span><br />
<span class="Apple-style-span" style="font-family: inherit;"><br /></span><br />
<span class="Apple-style-span" style="font-family: inherit;"><br /></span><br />
<pre><code>convert -define quantum:scale=255.0 moon.tif moon.exr</code></pre><br />
<span class="Apple-style-span" style="background-color: white; color: #222222;"><span class="Apple-style-span" style="font-family: inherit;"><br /></span></span><br />
<span class="Apple-style-span" style="background-color: white; color: #222222;"><span class="Apple-style-span" style="font-family: inherit;">The quantum scale there seems like it ought to be 65535.0, but 255.0 works; imagemagick identify -verbose shows that the range of values is 65535.0.</span></span><br />
<span class="Apple-style-span" style="font-family: inherit;"><br /></span></div>
<div>
<span class="Apple-style-span" style="font-family: inherit;"><b>Setting up image based displacement + bump in Blender 2.6x Cycles (latest svn)</b></span><br />
<span class="Apple-style-span" style="font-family: inherit;"><br /></span><br />
<span class="Apple-style-span" style="font-family: inherit;">TBD flesh this out in greater detail</span><br />
<span class="Apple-style-span" style="font-family: inherit;"><br /></span><br />
<span class="Apple-style-span" style="font-family: inherit;">Add | Mesh | UV sphere</span><br />
<span class="Apple-style-span" style="font-family: inherit;"><br /></span><br />
<span class="Apple-style-span" style="font-family: inherit;">Object Modifiers | Add Modifier | Subdivision Surface | Render 6</span><br />
<br />
<span class="Apple-style-span" style="font-family: inherit;">Material | Surface | Use Nodes</span><br />
<span class="Apple-style-span" style="font-family: inherit;"><br /></span><br />
<span class="Apple-style-span" style="font-family: inherit;">Shift-A | Texture | Image Texture | Open moon.exr</span><br />
<span class="Apple-style-span" style="font-family: inherit;"><br /></span><br />
<span class="Apple-style-span" style="font-family: inherit;">Connect the color output to the Diffuse BSDF; you should then see the texture on the sphere in texture or render view mode (you may have to force a redraw/update).</span><br />
<br />
<span class="Apple-style-span" style="font-family: inherit;">Edit Mode | Select all edges</span><br />
<span class="Apple-style-span" style="font-family: inherit;"><br /></span><br />
<span class="Apple-style-span" style="font-family: inherit;">Mesh | UV Unwrap | Sphere projection | Align to object</span><br />
<span class="Apple-style-span" style="font-family: inherit;"><br /></span><br />
<span class="Apple-style-span" style="font-family: inherit;">Connect the image texture through a Color to BW converter and then into the displacement input on the material output. TBD: proper height scaling of craters.</span><br />
<span class="Apple-style-span" style="font-family: inherit;"><br /></span><br />
<br />
<span class="Apple-style-span" style="font-family: inherit;"><br /></span><br />
<span class="Apple-style-span" style="font-family: inherit;"><br /></span><br />
<span class="Apple-style-span" style="font-family: inherit;"><br /></span><br />
Object Data | Displacement | Method | Both<br />
<span class="Apple-style-span" style="font-family: inherit;"><br /></span><br />
Link to .blend file:<br />
<br />
<span class="zj"><a class="ot-anchor" href="http://goo.gl/ZrPJ8">http://goo.gl/ZrPJ8</a></span> <br />
<br />
<table style="width: auto;"><tbody>
<tr><td><a href="https://picasaweb.google.com/lh/photo/Z5zT1e3oEWTnEtsic-Kb_NMTjNZETYmyPJy0liipFm0?feat=embedwebsite"><span class="Apple-style-span" style="font-family: inherit;"><img height="250" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiEAAvwaOHCVqu92oFAq5tviaRASgcb4AUwuqBX-maQEcRuHflED3ZuC5F3Xzbpu7kLBtBUlhWLqNrGIf9xr9MeNWPw2AVhExUjC_ADiPGWaMaRuR_Z9SDEq0L6F5yfMgZm2SK_cA/" width="500" /></span></a></td></tr>
<tr><td style="text-align: right;"><span class="Apple-style-span" style="font-family: inherit;">From <a href="https://picasaweb.google.com/wsacul/LROCLunarMap?authuser=0&feat=embedwebsite">LROC Lunar Map</a></span></td></tr>
</tbody></table>
<br />
<b>UV Sphere Projection Polar Problems</b><br />
<b><br /></b><br />
The UV spheres generated by Blender have triangles instead of quads around the poles. The spherical projection produces distortion there; smoothing the sphere prior to the UV unwrap helps minimize it. I wonder if there is something problematic about making the pole polygons quads, since it would involve multiple polygon points and edges right on top of each other.<br />
<br />
You can see the problem areas in the UV image below: the nice quad projections become distorted triangles at the top and bottom.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjKt_1C5FINykQ9-oHfJwS-C7mGjEB0LVRtfHBWWnEWfffwY5lCGK2-SU1RUUrcmfsZvCeK8L3DkudJcoACR5iSCa1dfaF95hF33D3IYXIUCwK0lLmoNQE1iaJYM0GmdMti7w4q_w/s1600/blender_uv_sphere_problems.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="134" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjKt_1C5FINykQ9-oHfJwS-C7mGjEB0LVRtfHBWWnEWfffwY5lCGK2-SU1RUUrcmfsZvCeK8L3DkudJcoACR5iSCa1dfaF95hF33D3IYXIUCwK0lLmoNQE1iaJYM0GmdMti7w4q_w/s320/blender_uv_sphere_problems.png" width="320" /></a></div>
<br />
The result is these pinched areas:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgGT8fWdUux3GMrbfjCKwokRV3mPn2r0EGtg4r-Pphu3eqLkJT_qig26TY1sa_JEdR7NKfxRaJPKlqedS_uSPDDXxZuBrVwrp7QmPvpEjmX8b80CAlOuDn62vG9y4kMQ3jhR90V3A/s1600/blender_uv_sphere_distortion.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="154" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgGT8fWdUux3GMrbfjCKwokRV3mPn2r0EGtg4r-Pphu3eqLkJT_qig26TY1sa_JEdR7NKfxRaJPKlqedS_uSPDDXxZuBrVwrp7QmPvpEjmX8b80CAlOuDn62vG9y4kMQ3jhR90V3A/s320/blender_uv_sphere_distortion.png" width="320" /></a></div>
<br />
<b><br /></b><br />
<br />
<span class="Apple-style-span" style="font-family: inherit;"><b>Lunar Visual Mosaics</b></span><br />
<span class="Apple-style-span" style="font-family: inherit;"><b><br /></b></span><br />
The moon isn't perfectly grey; there are many interesting light and dark features. I haven't located a good texture generated from LROC or Clementine data (LROC would be ideal, since it would probably guarantee that visual features line up with elevation features).<br />
<br />
<a href="http://wms.lroc.asu.edu/lroc/wac_mosaic">http://wms.lroc.asu.edu/lroc/wac_mosaic</a><br />
<br />
<a href="http://ser.sese.asu.edu/MOON/clem_color.html">http://ser.sese.asu.edu/MOON/clem_color.html</a><br />
<br />
There are some random ones to be found on the web but I haven't tried them yet.<br />
<br />
<br />
<span class="Apple-style-span" style="font-family: inherit;"><b>Future</b></span><br />
<span class="Apple-style-span" style="font-family: inherit;"><b><br /></b></span><br />
Grass gis <a href="http://grass.fbk.eu/gdp">http://grass.fbk.eu/gdp</a><br />
<br />
gdal - has python bindings (ubuntu install python-gdal gdal-bin) <a href="http://www.gdal.org/">http://www.gdal.org/</a><br />
Turn python-generated tiff images into geotiffs<br />
<br />
<br />
osgearth - uses geotiff output from gdal to produce lod/paged terrain databases viewable in OpenSceneGraph osgviewer.<br />
<br />
<span class="Apple-style-span" style="font-family: inherit;">Use the 100M data (the 256P IMG files). Parse the DTM within Python to do this? The minimum is extracting width x height.</span><br />
<span class="Apple-style-span" style="font-family: inherit;"><br /></span><br />
<br />
<span class="Apple-style-span" style="font-family: inherit;"><b>Sources</b></span><br />
<span class="Apple-style-span" style="font-family: inherit;"><br /></span></div>
<div>
<span class="Apple-style-span" style="font-family: inherit;">Original discussions that originated this post:</span></div>
<div>
<span class="Apple-style-span" style="font-family: inherit;"><br /></span></div>
<div>
<a href="http://feeds.laughingsquid.com/~r/laughingsquid/~3/mGUeRmGUfu0/"><span class="Apple-style-span" style="font-family: inherit;">http://feeds.laughingsquid.com/~r/laughingsquid/~3/mGUeRmGUfu0/</span></a></div>
<div>
<span class="Apple-style-span" style="font-family: inherit;"><br /></span></div>
<div>
<a href="https://plus.google.com/u/0/107180534974005900062/posts/713nRXGEhn7"><span class="Apple-style-span" style="font-family: inherit;">https://plus.google.com/u/0/107180534974005900062/posts/713nRXGEhn7</span></a></div>
<div>
<span class="Apple-style-span" style="font-family: inherit;"><br /></span></div>
<div>
<a href="https://plus.google.com/u/0/116599331662269985445/posts/5GJw1T8Tu27"><span class="Apple-style-span" style="font-family: inherit;">https://plus.google.com/u/0/116599331662269985445/posts/5GJw1T8Tu27</span></a></div>
<div>
<span class="Apple-style-span" style="font-family: inherit;"><br /></span></div>
<div>
<a href="https://plus.google.com/u/0/103190342755104432973/posts/1krmFVvnh7d"><span class="Apple-style-span" style="font-family: inherit;">https://plus.google.com/u/0/103190342755104432973/posts/1krmFVvnh7d</span></a></div>
<div>
<br />
<br /></div>Anonymoushttp://www.blogger.com/profile/12190442172800693364noreply@blogger.com0tag:blogger.com,1999:blog-28093388.post-70840316830200507472011-01-30T15:49:00.000-08:002011-05-08T09:04:55.171-07:00Google Docs StorageI've been playing with Google Docs storage for about a month. The user interface is inferior to <a href="http://aws.amazon.com/s3/">Amazon S3</a> + <a href="http://www.s3fox.net/">S3Fox</a> in every way, but for 1/5th the cost I'm willing to put up with it (though it's only 1/5th the cost if I fill up all the storage for the given price tier; I think they expect most users not to use a large fraction). AVIs and JPEGs can be dragged-and-dropped in quantity, but NEF and .ini files have to use the 'select more pictures' dialog, which can only handle 10-15 files at a time (otherwise a strange character appears instead of a list of all the files). Upload speeds seem good (a megabyte every 3 seconds or so), better than I remember S3 being last time I tried it.<br /><br />There are some linux filesystem programs that can mount my entire google online storage (including blog posts), but they only allow uploading of the google docs formats and not arbitrary files the way the upload dialog can. Hopefully this changes.<br /><br /><span style="font-weight:bold;">5/8/2011 update</span><br /><br />Folder upload is now possible (no more shift-selecting the contents of a folder and having to cut and paste folder names), and it looks like subfolders are uploaded properly. But sometimes a file fails to upload properly, and with the file method it was possible to see which file failed and re-queue it for upload. Now, after a 90-item folder upload that took 20 minutes, there is an error message saying that one file failed to upload, but no way to know which one. Retry fails repeatedly. 
<a href="http://www.google.com/support/forum/p/Google+Docs/thread?tid=6f9fda70c3b02ce5&hl=en">Some reports of same</a>, though I haven't seen the other bugs mentioned.binarymilleniumhttp://www.blogger.com/profile/17419830604356775608noreply@blogger.com8tag:blogger.com,1999:blog-28093388.post-1297279530448047572010-12-14T20:46:00.000-08:002010-12-14T21:05:56.778-08:00OpenGameArtYou can't help but want to make a 2d overhead or isometric game after looking at some of the art available on <a href="http://opengameart.org">OpenGameArt</a>- I'll settle for a little procedural terrain generation:<br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjHvJwtHKFJwFQRxKreBaKle2ZlbUlIFNW23hZVHIEf6LtRh02AOdRuIX-Nt2A-WNsoCwzkPTS6SkH1wfrOvdoC6iFPUBov8Sx2GAs0a_LmvBVpp83U3cleuUTSysGb_yva6Bbv/s1600/screenshot.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 251px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjHvJwtHKFJwFQRxKreBaKle2ZlbUlIFNW23hZVHIEf6LtRh02AOdRuIX-Nt2A-WNsoCwzkPTS6SkH1wfrOvdoC6iFPUBov8Sx2GAs0a_LmvBVpp83U3cleuUTSysGb_yva6Bbv/s400/screenshot.jpg" border="0" alt=""id="BLOGGER_PHOTO_ID_5550768964335813058" /></a><br /><br /><a href="http://www.openprocessing.org/visuals/?visualID=16704">http://www.openprocessing.org/visuals/?visualID=16704</a> (click run if it asks about running old java)<br /><br />The OGA site could use more contributors, as well as better organization to promote the best work- and of course it could be popularized by making use of the content there. 
I think I'll use this and some variations at the next VJ event I do- I've used recorded video from games before, but it's even better to generate some game-like imagery live.<br /><br />Another useful feature would be for terrain tilesets to have a standard connectivity definition file- an ASCII file that says which tiles connect best to which neighbors in which direction. Currently that's hardcoded into the processing sketch.binarymilleniumhttp://www.blogger.com/profile/17419830604356775608noreply@blogger.com1tag:blogger.com,1999:blog-28093388.post-12506831138026625252010-12-07T18:38:00.000-08:002010-12-07T18:43:45.779-08:00Gephex 0.4.3b built on 64-bit Ubuntu 10.10I haven't found a better graph-based VJ tool than Gephex to use in Linux or Windows (but haven't been looking much either), and it's not trivial to get it running on a modern Ubuntu system- but I've worked through all the compiler messages (with the big exception of leaving out ffmpeg) and have an archive of the results available for download.<br /><br />Go here for details and a link to the download:<br /><br /><a href="http://code.google.com/p/binarymillenium/wiki/Gephex64BitBuild">http://code.google.com/p/binarymillenium/wiki/Gephex64BitBuild</a><br /><br />The reconfiguring may screw it up a little, otherwise let it install to my choice of directories and then move the bin and lib etc.
files as appropriate.<br /><br />I'm hoping there is a new and well-supported tool out there that can take the place of gephex- <a href="http://movid.org/">movid</a> is not intended for VJ work but may be retrofittable.binarymilleniumhttp://www.blogger.com/profile/17419830604356775608noreply@blogger.com1tag:blogger.com,1999:blog-28093388.post-72511366198108534662010-08-23T19:30:00.000-07:002010-09-06T09:55:00.143-07:00First Steps in Bullet physicsI've played with <a href="http://www.ode.org/">ODE</a> for several small projects, but recently discovered <a href="http://bulletphysics.org/wordpress/">Bullet physics</a> while looking through <a href="http://www.ros.org/wiki/">ROS</a> (Robot Operating System) documentation. The difference in rigid body simulation between the two is not that obvious, the Bullet rigid body demos seem very impressive and perhaps speedier than ODE. But the soft body physics have no equivalent in ODE, and seem very worthy of investigation- and equally intriguing are the decomposition capabilities, where objects can be shattered into smaller pieces with a function call (it may be more difficult than that, I haven't looked at the code yet).<br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjqO52nUNhSLxymJ10akZBHRQIhJ4wcJxAVidqCKEH8mZ4AlbY9HQVu9idweeZuzSwax8kvNhlp7k2YyGl3pUblD9UoSZGTDlCrO52Qs1LBx0LTGTPLNiypH1Q_IAW1ip1ataZn/s1600/bullet_soft.png"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 300px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjqO52nUNhSLxymJ10akZBHRQIhJ4wcJxAVidqCKEH8mZ4AlbY9HQVu9idweeZuzSwax8kvNhlp7k2YyGl3pUblD9UoSZGTDlCrO52Qs1LBx0LTGTPLNiypH1Q_IAW1ip1ataZn/s400/bullet_soft.png" border="0" alt=""id="BLOGGER_PHOTO_ID_5508802714607033682" /></a><br /><br />There is a good forum, not a ton of documentation or second-party dispersed know-how in the 
form of tutorials, especially for the soft body physics, but the demo code is great- and beyond that it comes down to experimentation.<br /><br /><span style="font-weight:bold;">Soft bodies from 3d files</span><br /><br />A long time ago I made my own soft body physics simulation code where I would load an .obj file and replace all vertex edges with spring-dampers, and also create a grid of internal springs to give the body volume. It worked okay under the best conditions but would frequently explode in others. Now in Bullet there is a toolchain to take a 3d object, turn it volumetric, and then simulate it much more robustly than in my amateur effort (though explosions still can occur).<br /><br />The bunny.inl and cube.inl files in Demos/SoftDemo provide the first clues; they have auto-generated comments like '# Generated by tetgen -YY bunny.smesh'. <br /><br /><span style="font-weight:bold;">Installation of tools</span><br /><br /><a href="http://tetgen.berlios.de/">Tetgen</a> is in the Ubuntu repositories, but <a href="http://tetgen.berlios.de/tetview.html">tetview</a> has to be downloaded as a binary. 
libg2c isn't in Ubuntu 10.04 lucid lynx; I had to download a prebuilt version from http://www.fluvial.ch/d/libg2c.tgz , and also libstdc++.so.5 from http://packages.debian.org/lenny/i386/libstdc++5/download (extract with dpkg-deb -x file.deb .).<br /><br />Wings3D is a good program for generating 3d meshes, and is in the Ubuntu repositories.<br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjtOasdYfhlwn1kh8jUcjV9tXc7ywmokeOhPwIYzMkZg_-SLB6h9TFSPnAvk5CcQhD6rTXC_LnNQb10R0AK1XL90YHFWkMV6R5dpH-BM09QrnsSC7CkDVZ0VZqVlqG-ycSJYBa1/s1600/wings_b.png"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 293px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjtOasdYfhlwn1kh8jUcjV9tXc7ywmokeOhPwIYzMkZg_-SLB6h9TFSPnAvk5CcQhD6rTXC_LnNQb10R0AK1XL90YHFWkMV6R5dpH-BM09QrnsSC7CkDVZ0VZqVlqG-ycSJYBa1/s400/wings_b.png" border="0" alt=""id="BLOGGER_PHOTO_ID_5508820057084759042" /></a><br /><br />The STL it exports is binary and doesn't work with tetgen ("wrong number of vertices"); use meshlab (also in the Ubuntu repos) to save it as an STL with the binary option unchecked. Objects with holes in them didn't work right- tetview kind of locks up on them. 
Concave areas don't seem quite right either.<br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi1KE34ReFIHEmyvQESVb23_NseoerbguUGXvoRIFJ4GpIDr9d8ft4RdX0KlPbM2OBDHjVMHdJ2Pzs6oUctxXuLp9rBwAvBrUZ-RMUWo49ZGk68pvyOdTMV3dW682qebfLdDkmn/s1600/tetview.png"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 400px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi1KE34ReFIHEmyvQESVb23_NseoerbguUGXvoRIFJ4GpIDr9d8ft4RdX0KlPbM2OBDHjVMHdJ2Pzs6oUctxXuLp9rBwAvBrUZ-RMUWo49ZGk68pvyOdTMV3dW682qebfLdDkmn/s400/tetview.png" border="0" alt=""id="BLOGGER_PHOTO_ID_5508830242141832434" /></a><br /><br />I used 'tetgen -p file.stl' and it outputs file.1.ele, file.1.face, file.1.node, and file.1.smesh. 'tetview file.1' will view the output. In the bunny.inl file there is a getElements() and getNodes() function, the data there corresponds to the lists in file.1.ele and file.1.node. 
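<br /><br />The .node and .ele files are plain whitespace tables- a header line followed by indexed rows- so they are easy to inspect from Python before touching the .inl files. A minimal sketch of a parser (the example data below is hand-made in the same layout, not the real bunny mesh):

```python
def parse_tetgen_table(text):
    """Parse a tetgen .node or .ele table: a header line, then indexed rows.
    Comment lines starting with '#' are skipped."""
    lines = [l for l in text.splitlines()
             if l.strip() and not l.lstrip().startswith('#')]
    header = lines[0].split()
    rows = {}
    for line in lines[1:]:
        vals = line.split()
        rows[int(vals[0])] = [float(v) for v in vals[1:]]
    return header, rows

# Hand-made .node contents (header: num_points, dimension, attrs, markers):
node_text = """4 3 0 0
 1 0.0 0.0 0.0
 2 1.0 0.0 0.0
 3 0.0 1.0 0.0
 4 0.0 0.0 1.0
# Generated by tetgen"""

header, nodes = parse_tetgen_table(node_text)
print(len(nodes), 'points, first index', min(nodes))
```

A quick check like this also makes it obvious what the lowest vertex index in the output actually is.<br /><br />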
Cut and paste the lists into the functions, then use a text editor to put the quotes and line breaks in as seen in bunny.inl.<br /><br />On trying to use the inl in SoftDemo I get this error<br /><code>./AppSoftBodyDemo <br />*** glibc detected *** ./AppSoftBodyDemo: malloc(): memory corruption: 0x090f5c38 ***<br />======= Backtrace: =========<br />/lib/tls/i686/cmov/libc.so.6(+0x6b591)[0x75f591]<br />/lib/tls/i686/cmov/libc.so.6(+0x6e395)[0x762395]<br />/lib/tls/i686/cmov/libc.so.6(__libc_malloc+0x5c)[0x763f9c]<br />/usr/lib/nvidia-current/libGL.so.1(+0x33e80)[0x2fae80]</code><br /><br />Building the debug version helps out (cmake -DCMAKE_BUILD_TYPE=debug), but I haven't been able to figure it out yet.<br /><br />Here is the backtrace:<br /><code><br />Program received signal SIGABRT, Aborted.<br />0x0012d422 in __kernel_vsyscall ()<br />(gdb) bt<br />#0 0x0012d422 in __kernel_vsyscall ()<br />#1 0x003ff651 in raise () from /lib/tls/i686/cmov/libc.so.6<br />#2 0x00402a82 in abort () from /lib/tls/i686/cmov/libc.so.6<br />#3 0x0043649d in ?? () from /lib/tls/i686/cmov/libc.so.6<br />#4 0x00440591 in ?? () from /lib/tls/i686/cmov/libc.so.6<br />#5 0x00443395 in ?? 
() from /lib/tls/i686/cmov/libc.so.6<br />#6 0x00444f9c in malloc () from /lib/tls/i686/cmov/libc.so.6<br />#7 0x08157d51 in btAllocDefault (size=1195)<br /> at /home/bm/other/bullet-2.77/src/LinearMath/btAlignedAllocator.cpp:24<br />#8 0x08157e6f in btAlignedAllocInternal (size=1176, alignment=16)<br /> at /home/bm/other/bullet-2.77/src/LinearMath/btAlignedAllocator.cpp:170<br />#9 0x080b4803 in btCollisionObject::operator new (sizeInBytes=1176)<br /> at /home/bm/other/bullet-2.77/src/BulletCollision/CollisionDispatch/btCollisionObject.h:115<br />#10 0x080e9e85 in btSoftBodyHelpers::CreateFromTetGenData (worldInfo=..., <br /> ele=0x81707e4 "48 4 0\n 1 26 28 16 12\n 2 28 21 16 12\n 3 25 19 2 20\n 4 28 27 10 14\n 5 17 3 20 1\n 6 20 24 18 19"..., <br /> face=0x0, node=0x81707bb "# Generated by tetgen -YY test.1.smesh \n", <br /> bfacelinks=false, btetralinks=true, bfacesfromtetras=true)<br /> at /home/bm/other/bullet-2.77/src/BulletSoftBody/btSoftBodyHelpers.cpp:957<br />#11 0x080b0bc5 in Init_TetraCube (pdemo=0x8309c60)<br />---Type <return> to continue, or q <return> to quit---<br /> at /home/bm/other/bullet-2.77/Demos/SoftDemo/SoftDemo.cpp:1315<br />#12 0x080b122c in SoftDemo::clientResetScene (this=0x8309c60)<br /> at /home/bm/other/bullet-2.77/Demos/SoftDemo/SoftDemo.cpp:1451<br />#13 0x080b383f in SoftDemo::initPhysics (this=0x8309c60)<br /> at /home/bm/other/bullet-2.77/Demos/SoftDemo/SoftDemo.cpp:1849<br />#14 0x080a5b90 in main (argc=1, argv=0xbffff384)<br /> at /home/bm/other/bullet-2.77/Demos/SoftDemo/main.cpp:28<br /></code><br /><br />I put the question to the forum <a href="http://bulletphysics.org/Bullet/phpBB3/viewtopic.php?f=9&t=5567">here</a> but no responses- not many people are that adept at the soft body physics yet.<br /><br /><b>Something functional</b><br /><br />But I've been able to create a trivial example from scratch that works:<br /><code><br />static const char* getNodes() { return(<br />"8 3 0 0\n"<br />" 0 1 1 1\n"<br />" 1 1 1 
-1\n"<br />" 2 1 -1 -1\n"<br />" 3 1 -1 1\n"<br />" 4 -1 -1 1\n"<br />" 5 -1 1 1\n"<br />" 6 -1 1 -1\n"<br />" 7 -1 -1 -1\n"<br />"# \n"); }<br /><br />static const char* getElements() { return(<br />"4 4 0\n"<br />" 0 0 1 3 5\n"<br />" 1 4 7 5 3\n"<br />" 2 6 5 1 7\n"<br />" 3 2 1 7 3\n"<br />"# \n"); }<br /></code><br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiV8TuSK7FJulekDntpxCwQic3AfKPrj0ax1_LrTP6BliYO8qecEEYMDDg4mpeTuKvivgmYVNbqKpDXP6uGfW2OVxzhdLCoaYDc1FNPycHs0MW-eY1FnBQPWGAz56rbdfggmgEO/s1600/softbody_works.png"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 300px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiV8TuSK7FJulekDntpxCwQic3AfKPrj0ax1_LrTP6BliYO8qecEEYMDDg4mpeTuKvivgmYVNbqKpDXP6uGfW2OVxzhdLCoaYDc1FNPycHs0MW-eY1FnBQPWGAz56rbdfggmgEO/s400/softbody_works.png" border="0" alt=""id="BLOGGER_PHOTO_ID_5512523912069012658" /></a><br /><br />Maybe I should try to corrupt the above into provoking the same kind of crash?<br /><br /><span style="font-weight:bold;">Update- Solution Found</span><br /><br />tetgen by default is creating 1-based vertex indices, while Bullet is expecting 0-based indices- the -z flag will make tetgen output zero-based. 
So no more crashing.<br /><br /><iframe src="http://player.vimeo.com/video/14738742?portrait=0&color=01AAEA" width="400" height="287" frameborder="0"></iframe><p><a href="http://vimeo.com/14738742">Bullet Soft Body Physics</a> from <a href="http://vimeo.com/user168788">binarymillenium</a> on <a href="http://vimeo.com">Vimeo</a>.</p>binarymilleniumhttp://www.blogger.com/profile/17419830604356775608noreply@blogger.com0tag:blogger.com,1999:blog-28093388.post-28541872903932472422010-04-08T07:45:00.000-07:002010-04-08T08:00:58.472-07:00Bundler on 64-bit Ubuntu 9.10It builds fine, but seg faults when run:<br /><br /><code><br />...<br />[KeyMatchFull] Matching to image 312<br />[KeyMatchFull] Matching took 71.610s<br />[KeyMatchFull] Matching to image 313<br />[KeyMatchFull] Matching took 69.720s<br />mkdir: cannot create directory `bundle': File exists<br />[- Running Bundler -]<br />/home/binarymillenium/bundler-v0.3-source/RunBundler.sh: line 93: 22589 Segmentation fault $BUNDLER list.txt --options_file options.txt › bundle/out<br />[- Done -]<br /><br /><br />Program received signal SIGSEGV, Segmentation fault.<br />0x0000000000522650 in dscal_ ()<br />(gdb) bt<br />#0 0x0000000000522650 in dscal_ ()<br />#1 0x0000000000519d16 in dsytf2_ ()<br />#2 0x00000000004e9644 in dsytrf_ ()<br />#3 0x00000000004a6d99 in sba_symat_invert_BK ()<br />#4 0x00000000004a05ec in sba_motstr_levmar_x ()<br />#5 0x000000000049d058 in sba_motstr_levmar ()<br />#6 0x000000000049ae8e in run_sfm ()<br />#7 0x0000000000415e06 in BundlerApp::RunSFM(int, int, int, bool, camera_params_t*, v3_t*, int*, v3_t*, std::vector‹std::vector‹std::pair‹int, int›, std::allocator‹std::pair‹int, int› › ›, std::allocator‹std::vector‹std::pair‹int, int›, std::allocator‹std::pair‹int, int› › › › ›&, double, double*, double*, double*, double*, bool) ()<br />#8 0x000000000042a16e in BundlerApp::BundleAdjustFast() ()<br />#9 0x0000000000405c33 in BundlerApp::OnInit() ()<br />#10 0x0000000000406f77 in main ()<br 
/><br /></code><br /><br />I'll update this post with results of trying out the 32-bit binaries, and also trying to build it on 64-bit with 32-bit compilation.binarymilleniumhttp://www.blogger.com/profile/17419830604356775608noreply@blogger.com22tag:blogger.com,1999:blog-28093388.post-21460892380434748742010-02-14T09:01:00.000-08:002010-02-14T09:05:32.860-08:00There needs to be better buzz blogger integrationThe generic rss import gadget sort of works, but the rss headlines of buzz are really useless, only saying that there is a new buzz from username- they should have the first sentence at least of a buzz item:<br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjnwpzZbrj7pnSq25Anm9hX3DInExuAduGr_Eko2xitb7A7g0GOvwdtPug-lPWrDJxan3Bt2dlWWhyphenhyphenzR-Zu1XE5I3OPZrT93ryp1_zCqjTv58rGs69BKueXsUSbXtvY4DPx2auq/s1600-h/buzzimport.PNG"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 355px; height: 400px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjnwpzZbrj7pnSq25Anm9hX3DInExuAduGr_Eko2xitb7A7g0GOvwdtPug-lPWrDJxan3Bt2dlWWhyphenhyphenzR-Zu1XE5I3OPZrT93ryp1_zCqjTv58rGs69BKueXsUSbXtvY4DPx2auq/s400/buzzimport.PNG" border="0" alt=""id="BLOGGER_PHOTO_ID_5438146092346453074" /></a>binarymilleniumhttp://www.blogger.com/profile/17419830604356775608noreply@blogger.com0tag:blogger.com,1999:blog-28093388.post-20454673121582261022010-01-08T18:25:00.001-08:002010-01-08T18:47:27.639-08:00Xbox Project Natal - Estimated SpecsThis video went up yesterday, was pulled down and then restored later for some reason (not before other youtube users posted their own copies and gathered many more views than the official channel)<br /><br /><object width="560" height="340"><param name="movie" value="http://www.youtube.com/v/-_UzcnTYqc4&hl=en_US&fs=1&"></param><param name="allowFullScreen" value="true"></param><param 
name="allowscriptaccess" value="always"></param><embed src="http://www.youtube.com/v/-_UzcnTYqc4&hl=en_US&fs=1&" type="application/x-shockwave-flash" allowscriptaccess="always" allowfullscreen="true" width="560" height="340"></embed></object><br /><br />The video states 30Hz, and there is an interesting shot starting at two minutes.<br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgAslgKCy37nE2mTxL_oNjPJaRZB9TDbTDFAyiYlF7KUis6XFSr-kcn-yNSuc4I-iF3Nlgp-nftzdbvWrCOqPVrIkIhRKL4CPNfetzVRMz57Nm_epTk3exZfDTJmte5nLJLyKY_/s1600-h/natal_grid.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 239px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgAslgKCy37nE2mTxL_oNjPJaRZB9TDbTDFAyiYlF7KUis6XFSr-kcn-yNSuc4I-iF3Nlgp-nftzdbvWrCOqPVrIkIhRKL4CPNfetzVRMz57Nm_epTk3exZfDTJmte5nLJLyKY_/s400/natal_grid.jpg" border="0" alt=""id="BLOGGER_PHOTO_ID_5424562600967910082" /></a><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjHGGtXPiSaaSF5LhZoLunwoTE6yfAn4LfZf5DNZ2A-nKJOWr-hT4OfdhQSMa-3EF1A3Zm8O7nkTcvsnV4hiIWjvu09wP5sFEmrZFu6iuiCKbI2ksI6oM_K7C90aJRVl3alwC8r/s1600-h/natal_grid3.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 238px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjHGGtXPiSaaSF5LhZoLunwoTE6yfAn4LfZf5DNZ2A-nKJOWr-hT4OfdhQSMa-3EF1A3Zm8O7nkTcvsnV4hiIWjvu09wP5sFEmrZFu6iuiCKbI2ksI6oM_K7C90aJRVl3alwC8r/s400/natal_grid3.jpg" border="0" alt=""id="BLOGGER_PHOTO_ID_5424565705343486194" /></a><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" 
href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj5__g9Ybd5qaqkTPuDxupKlei6mSVxL42UMBBk3uz8ZJhD2Dqy3Fbb445o6T9YmAraJgxSMIOlt6e0it-CJA7hhv1qdgoGDRtOOOFMzfLxrt1-0d-NrsqqvGmX8SWvbBFRXTq6/s1600-h/natal_grid2.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 238px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj5__g9Ybd5qaqkTPuDxupKlei6mSVxL42UMBBk3uz8ZJhD2Dqy3Fbb445o6T9YmAraJgxSMIOlt6e0it-CJA7hhv1qdgoGDRtOOOFMzfLxrt1-0d-NrsqqvGmX8SWvbBFRXTq6/s400/natal_grid2.jpg" border="0" alt=""id="BLOGGER_PHOTO_ID_5424565324931287666" /></a><br />From the above screenshots it's possible to count the number of pixels horizontally and vertically, and also most of the depth bins (there may be more range beyond the wall behind the person, but a minimum count is better than nothing).<br /><br />So there are at least 18 depth bins, counted at 2:01-2:04 in the video. Each is maybe 2 or 3 inches apart.<br /><br />The resolution looks like 64x64.binarymilleniumhttp://www.blogger.com/profile/17419830604356775608noreply@blogger.com2tag:blogger.com,1999:blog-28093388.post-75395698231645500362010-01-04T06:17:00.000-08:002010-01-04T06:45:24.074-08:00Structured Light For 3d ScanningI learned a little about structured light from the 2008 House of Cards Radiohead video, and more since Kyle McDonald created an open source <a href="http://code.google.com/p/structured-light/">google code implementation</a>- but I've only very recently really gotten into how it works.<br /><br /><b><a href="http://binarymillenium.googlecode.com/svn/trunk/processing/simstructuredlight/">Simulated Structured Light</a></b><br /><br />I had trouble getting good scans initially and I thought it would be good to create a perfectly controlled setup that didn't require a projector or camera- instead generate scenes from software, projecting textures onto 3d objects.
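<br /><br />A three-phase scan projects three copies of a sinusoid fringe pattern, each shifted by a third of a period. A rough sketch of the pattern generator and the per-pixel wrapped-phase recovery (the wavelength here is an arbitrary choice, not necessarily what the google code app uses):

```python
import math

WAVELENGTH = 64.0  # pixels per fringe period (arbitrary choice)

def pattern_value(x, k):
    """Projected intensity 0..255 at column x for pattern k (k = 0, 1, 2)."""
    phase = 2.0 * math.pi * (x / WAVELENGTH + k / 3.0)
    return int(round(127.5 + 127.5 * math.cos(phase)))

def wrapped_phase(i0, i1, i2):
    """Recover a pixel's phase (mod 2*pi) from its three intensities."""
    return math.atan2(math.sqrt(3.0) * (i2 - i1), 2.0 * i0 - i1 - i2)

# A quarter of a period into a fringe, the recovered phase is ~pi/2:
x = 16
print(wrapped_phase(*[pattern_value(x, k) for k in range(3)]))
```

The wrapped phase still has to be unwrapped across fringe boundaries before it means anything as depth.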
<br /><br />One of the generated input phase images:<br /><img src="http://binarymillenium.googlecode.com/svn/trunk/processing/simstructuredlight/phase1.jpg"></img><br /><br />Output from the google code ThreePhase processing app:<br /><img src="http://binarymillenium.googlecode.com/svn/trunk/processing/simstructuredlight/threephase1.jpg"></img><br /><br /><br />The obvious improvement is to add in an obj loader to this program, instead of just generating a semi-random blobby shape.<br /><br /><b>Real World Scans</b><br /><br />I was hoping for a lot more but managed to get a few scans of people during a New Year's Eve party:<br /><br /><table style="width:auto;"><tr><td><a href="http://picasaweb.google.com/lh/photo/O0MHQyKynl8Hd3IHTAtxfg?feat=embedwebsite"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhXwN7qaqFdULo9lZyiC1oHoMbLkXhLbL-POUTRqJ1Agn9SWc2qvcyME8Y-1y-p8I5QKe3lX4U9ZuFfJZ-YCoKvXwSGHpg3g6hYihDP4Cw1abWp5UHAfG9Iv8KMIaX7Y174TSIv/s400/out2hud.jpg" /></a></td></tr><tr><td style="font-family:arial,sans-serif; font-size:11px; text-align:right">From <a href="http://picasaweb.google.com/binarymillenium/20091231_nextyear?feat=embedwebsite">2009.12.31_nextyear</a></td></tr></table><br /><br /><table style="width:auto;"><tr><td><a href="http://picasaweb.google.com/lh/photo/D4fkrfwcFvycnei0Ki_RaA?feat=embedwebsite"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiT3oOnoqzMkA1N3id9u9hou0KYUQGsrHBbM3OekvopZQV6PM-_13xGFbc5PmxMAa1atAADtwo7b294dZt8J2HJ3VNT5_AUrFGUsSkffNsav4sgjzDJd7zsAg1w28nwZyNfzctX/s400/out4hud.jpg" /></a></td></tr><tr><td style="font-family:arial,sans-serif; font-size:11px; text-align:right">From <a href="http://picasaweb.google.com/binarymillenium/20091231_nextyear?feat=embedwebsite">2009.12.31_nextyear</a></td></tr></table><br /><br /><table style="width:auto;"><tr><td><a href="http://picasaweb.google.com/lh/photo/6Yj-wxn5tDBNvIay7LcVyQ?feat=embedwebsite"><img 
src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjw0G7F-2RGu3wkO5YSkyg8YPz_QbmW7JifJmQl_gAAFMkrR0NA-ChxDjipRejtz4E0avkA0kx0mOkf6WqkCL11GDiIE5KF22UCjECQJmyzPUy-Bo8_iQte9VKH9flqAfrxMQ1A/s400/out3hud.jpg" /></a></td></tr><tr><td style="font-family:arial,sans-serif; font-size:11px; text-align:right">From <a href="http://picasaweb.google.com/binarymillenium/20091231_nextyear?feat=embedwebsite">2009.12.31_nextyear</a></td></tr></table><br /><br />Those were all made using modified versions of Kyle McDonald's slDecode and slCapture programs. I'll post those somewhere eventually. <br /><br /><br /><b><a href="http://binarymillenium.googlecode.com/svn/trunk/processing/simstructuredlight/slight.m">Matlab Script</a></b><br /><br />This script is currently very very slow, but it's good to be able to debug in matlab and have easy access to fft and other functions.<br /><br />Wrapped phase:<br /><img src="http://binarymillenium.googlecode.com/svn-history/r487/trunk/processing/simstructuredlight/wrapped.png" width=600></img><br /><br />Unwrapped phase:<br /><img src="http://binarymillenium.googlecode.com/svn-history/r487/trunk/processing/simstructuredlight/unwrapped.png" width=600></img><br /><br />Note the vertical lines where phase propagates vertically in glitchy ways- some filtering (and even slower processing time) ought to be able to clean that during the flood fill.<br /><br /><b>Next</b><br /><br />I hope to go mainly in the direction of high resolution high fidelity scans, as opposed to high frame-rate low sample-time scans, though I think I can access several high frame rate cameras for use in the latter. 
There is also lots of room for improvements (especially at the phase unwrapping stage) that would benefit either one of those.binarymilleniumhttp://www.blogger.com/profile/17419830604356775608noreply@blogger.com5tag:blogger.com,1999:blog-28093388.post-1849472352876368712009-11-21T13:13:00.000-08:002009-11-21T14:01:42.679-08:00Natal competitor: Optricam?<a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEinODP-SYmJOC7aQi-Am7yM099bKGRvKhO3gSEPRYiYbluRwmu0OPAq4QDfQqDuic7OY9AtDGhfgRZlkRmlttMZIgDj7tQTcg8hYE5km_Cd0OlV6ZclihNIIC6rUYIzEwbJJ-uG/s1600/ico_camera.png"><img style="margin: 0px auto 10px; display: block; text-align: center; cursor: pointer; width: 90px; height: 90px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEinODP-SYmJOC7aQi-Am7yM099bKGRvKhO3gSEPRYiYbluRwmu0OPAq4QDfQqDuic7OY9AtDGhfgRZlkRmlttMZIgDj7tQTcg8hYE5km_Cd0OlV6ZclihNIIC6rUYIzEwbJJ-uG/s400/ico_camera.png" alt="" id="BLOGGER_PHOTO_ID_5406677518973003490" border="0" /></a><br /><br />For a while I was worried that after Microsoft acquired 3DV Systems and their depth sensing ZCam, which might have been available at the beginning of this year, they were going to release the <a href="http://binarymillenium.com/2009/06/xbox-project-natal.html">Natal</a> exclusively for the Xbox and the possibilities for non-MS sanctioned applications would be severely hampered. And then to have to suffer delays for software I don't care about when the hardware may be ready now. But there may be similar products from other vendors due to arrive soon.<br /><br />I learned about the <a href="http://www.pmdtec.com/products-services/pmdvisionr-cameras/pmdvisionr-camcube-20/">PMDTec Camcube</a> a few months ago. It is a lot less expensive than a Swissranger, but there are no U.S. 
distributors and nothing on the web from customers that are using it.<br /><br />But this past week there was <a href="http://www.pr-inside.com/softkinetic-optrima-and-texas-instruments-collabor-r1591685.htm">this press-release</a> about a Belgian company <a href="http://www.optrima.com/">Optrima</a> (Israel, Germany, Switzerland, and now Belgium- all the U.S. flash lidar vendors seem to have missed the boat on low cost continuous-wave IR led range finding systems, focusing instead on extremely expensive aerospace and high end mobile robotics applications) teaming with TI and a body motion capture software vendor (<a href="http://www.softkinetic.net/">Softkinetic</a>) to produce Natal/Zcam-like results in the same application area- games and general computer UI. There is mention of Beagleboard support in the press-release, so getting, say, Ubuntu to talk to it is very likely possible.<br /><br />Hopefully TI will really get behind it and all the economies of scale that MS can bring to bear can be matched, and it will only cost around $100. 
<br /><br />Also, I'm seeing more and more red clothing and jumpsuits- possibly with specially IR reflective component, which makes me suspicious about the limitations of these sensors (also using them with windows letting in direct sunlight is probably out of the question).<br /><br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgPpZBw4WE4EgEEszF77YcHvXyDx688N2-N-f8trN5Q5ZB4kTj_yxcIs-YvLn6m7w3UfQWVK0ssrg2wFjcFtZSuhrOeesPVo9CmgVeR64185z_FNu56IJ_5sI2MPQ8Dgmzq1-P3/s1600/Project-NatalXBOX-6_816648a.jpg"><img style="margin: 0px auto 10px; display: block; text-align: center; cursor: pointer; width: 400px; height: 235px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgPpZBw4WE4EgEEszF77YcHvXyDx688N2-N-f8trN5Q5ZB4kTj_yxcIs-YvLn6m7w3UfQWVK0ssrg2wFjcFtZSuhrOeesPVo9CmgVeR64185z_FNu56IJ_5sI2MPQ8Dgmzq1-P3/s400/Project-NatalXBOX-6_816648a.jpg" alt="" id="BLOGGER_PHOTO_ID_5406676577180754738" border="0" /></a><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhYGvj3f-l9xRdCYrwJxm-yT3DtuPcsU41_iYcrlw7AQE9gXE63-qm_Pf5yZQR8porA-craUYSonuiBpZBya4mZKjQFRkP2OVOX66fmyQ2lr7yBLaAy2uhmQrmHVx2ryI9cOG66/s1600/natal-on-jimmy-fallon.jpg"><img style="margin: 0px auto 10px; display: block; text-align: center; cursor: pointer; width: 400px; height: 267px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhYGvj3f-l9xRdCYrwJxm-yT3DtuPcsU41_iYcrlw7AQE9gXE63-qm_Pf5yZQR8porA-craUYSonuiBpZBya4mZKjQFRkP2OVOX66fmyQ2lr7yBLaAy2uhmQrmHVx2ryI9cOG66/s400/natal-on-jimmy-fallon.jpg" alt="" id="BLOGGER_PHOTO_ID_5406676368829438082" border="0" /></a><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" 
href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi4ycUoOfQzGR-6MGAkTxCICWGS5b1WSxsn5vdqXhBhyphenhyphenYLx39pm3x6reJbSZHXM-jhlON9W_l1xwYKW9kDi8Yv2SBSokMgfc7jTYd_hZ0SfoPc-unH-mV-sa7IBhSuZwaCRwyfw/s1600/Karate-300x200.png"><img style="margin: 0px auto 10px; display: block; text-align: center; cursor: pointer; width: 300px; height: 200px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi4ycUoOfQzGR-6MGAkTxCICWGS5b1WSxsn5vdqXhBhyphenhyphenYLx39pm3x6reJbSZHXM-jhlON9W_l1xwYKW9kDi8Yv2SBSokMgfc7jTYd_hZ0SfoPc-unH-mV-sa7IBhSuZwaCRwyfw/s400/Karate-300x200.png" alt="" id="BLOGGER_PHOTO_ID_5406676142076452370" border="0" /></a>binarymilleniumhttp://www.blogger.com/profile/17419830604356775608noreply@blogger.com0tag:blogger.com,1999:blog-28093388.post-32648749101435395352009-11-02T20:40:00.000-08:002009-11-02T20:49:42.342-08:00Using a canon camera to view jpegsI tried using my Canon camera as a picture viewer, putting some downloaded jpegs on it- the pictures didn't show up at all when I tried to review them, only the pictures I had taken with the camera. Renaming the pictures to have camera names like DSC_0025.jpg made the camera show a question mark icon for the picture at least.<br /><br />A little searching later I discovered a tool called <a href="http://paint.net/">paint.net</a>, which saves jpegs in the proper format so the Canon will like them. The other trick is to re-size the canvas the picture is on so all dimensions are multiples of 8. 
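<br /><br />The canvas-resizing step is easy to script. A sketch using PIL (the padding math is the point here; whether a given camera accepts the result still depends on the JPEG variant and metadata, so treat the save options as a guess):

```python
def round_up(n, multiple=8):
    """Smallest multiple of `multiple` that is >= n."""
    return (n + multiple - 1) // multiple * multiple

def pad_to_multiple_of_8(path_in, path_out):
    """Pad an image's canvas so both dimensions are multiples of 8."""
    from PIL import Image  # deferred import so round_up works without PIL
    img = Image.open(path_in).convert('RGB')
    w, h = img.size
    canvas = Image.new('RGB', (round_up(w), round_up(h)))
    canvas.paste(img, (0, 0))
    # Baseline (non-progressive) JPEG is the most conservative variant:
    canvas.save(path_out, 'JPEG', quality=95, progressive=False)

print(round_up(3888), round_up(2595))  # -> 3888 2600
```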
<br /><br />paint.net itself is somewhat interesting, quicker to load than <a href="http://www.gimp.org">Gimp</a>, but not so much easier, more intuitive, or Deluxe-Paint-like that I'll use it for anything else.binarymilleniumhttp://www.blogger.com/profile/17419830604356775608noreply@blogger.com0tag:blogger.com,1999:blog-28093388.post-37013233896345141462009-09-12T18:54:00.000-07:002009-09-12T18:42:00.075-07:00Instructions for rendering with Processing on Amazon EC2There are detailed instructions elsewhere on how to get started with EC2 in general; here are the high-level things to do for my <a href="http://binarymillenium.com/2009/08/computing-cloud-rendering-with.html">headless rendering project</a>:<br /><br />Get a unix command line environment that has python and ssh; I use cygwin under Windows, and other times I dual boot into Ubuntu.<br /><br />Get an <a href="http://aws.amazon.com/ec2/">Amazon EC2 account</a>, create a ~/username.pem file, and make environment variables for the keys (follow the <a href="http://boto.googlecode.com/svn/trunk/doc/ec2_tut.txt">boto instructions</a>).<br />Make sure the pem file's permissions are set to 700.<br /><br />Edit ssh_config so that StrictHostKeyChecking is set to no, otherwise ssh sessions started by the scripts will ask if it's okay to connect to every created instance- I could probably automate that response, though.<br /><br />Make sure there are no carriage returns (\r) in the pem file in Linux.<br /><br />Get <a href="http://developer.amazonwebservices.com/connect/entry.jspa?externalID=609">Elasticfox</a>, and put your credentials in.<br /><br />Get <a href="http://code.google.com/p/boto/">boto</a>.<br /><br />Get <a href="http://code.google.com/p/trajectorset/source/checkout">trajectorset</a>.<br /><br />Create a security group called http that at least allows your ip to access a webserver of an ec2 instance that uses it. 
<br /><br />At this point it should be possible to run ec2start.py, visit the ip address of the head node, and watch the results come in. The ec2start script launches a few instances, one head node that will create noise seeds to send to the worker nodes via sqs, and then wait for the workers to process the seeds and send sqs messages back. The head node then copies the results files and renders the graphics, copying the latest results to a folder that can be seen by index.html for web display.<br /><br />My code is mainly for demonstration, so the key things I did that will help with alternate applications follow:<br /><br /><b>Custom AMI</b><br /><br />You can use the AMI I created with the id 'ami-2bfd1d42'; I used one of the Alestic Ubuntu amis and added Java, Xvfb, Boto, and a webserver like lighttpd (I forget if Xvfb was already installed or not). <br /><br /><b>Headless rendering</b><br /><br />The EC2 instances lack graphics contexts at first, and trying to run a graphical application like an exported Processing project will not work (TBD did I ever try that?). <a href="http://en.wikipedia.org/wiki/Xvfb">Xvfb</a> creates a virtual frame buffer that Processing can render to after running these commands:<br /><br /><code>Xvfb :2<br />export DISPLAY=:2<br /></code><br /><br /><b>Launching processes and detaching from them</b><br /><br />I use python subprocess.Popen frequently to execute commands on the instances like this:<br /><code><br />cmd = "Xvfb :2" <br />whole_cmd = "ssh -i ~/lucasw.pem root@" + dns_name + " \"" + cmd + "\"" <br />proc = subprocess.Popen(whole_cmd, shell=True, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE)<br />(stdout,stderr) = proc.communicate()<br /></code><br />The problem is when one wants to run something, close the connection, and leave it running - like Xvfb above, which needs to start and then stay running. 
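For reference, the Popen wrapper behaves the same with a local command substituted for the ssh invocation, which is handy for trying the pattern without an instance (echo stands in for Xvfb here):

```python
import subprocess

def run_cmd(cmd):
    """Run a shell command the way the ssh wrapper does, capturing output."""
    proc = subprocess.Popen(cmd, shell=True,
                            stdin=subprocess.PIPE,
                            stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE)
    stdout, stderr = proc.communicate()
    return stdout, stderr

out, err = run_cmd("echo blah")
print(out)  # b'blah\n'
```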
One method is to leave the ssh connection open, but there is a limit of about 20 ssh sessions.<br /><br />The trick is to use nohup:<br /><code>cmd = "nohup Xvfb :2"<br /></code><br /><br />Don't put extra quotes around the command to execute, which brings me to the next topic.<br /><br /><b>Quote escaping</b><br /><br />There are a few bash commands that require parts to be in quotes- but in python the bash command is already in quotes, and python will not understand the inner set of quotes unless they are escaped with a backslash:<br /><code>cmd = "echo \"blah\" > temp.txt";</code><br /><br />Then at other times an additional level of quote escaping is required:<br /><code>cmd = "echo \\\"blah\\\" > temp.txt";</code><br />(I do this when I pass all of the cmd variable to be executed by ssh, and ssh wants it in quotes)<br /><br />One backslash escapes one level of quoting, so why do three escape two levels? It's because the escaping backslash itself needs to be escaped. This gets confusing fast, and some experimentation with python in interactive mode is required to get it right.<br /><br /><b>Config file driven</b><br /><br />It isn't currently, not as much as it needs to be, which makes it very brittle- changing plots requires about three different edits, when a single config file should specify them all.binarymilleniumhttp://www.blogger.com/profile/17419830604356775608noreply@blogger.com0tag:blogger.com,1999:blog-28093388.post-8834875802976234652009-08-27T22:26:00.000-07:002009-08-27T22:31:58.956-07:00Computing Cloud Rendering with Processing and Amazon EC2This <a href="code.google.com/p/trajectorset/source/browse/#svn/trunk/ec2">project</a> is my first experiment with using <a href="http://aws.amazon.com/ec2/">Amazon EC2</a> for cloud rendering. 
The source code is all there and I'll post detailed instructions on how to use it later, but here is a speeded up video of the output:<br /><br /><object width="400" height="225"><param name="allowfullscreen" value="true" /><param name="allowscriptaccess" value="always" /><param name="movie" value="http://vimeo.com/moogaloop.swf?clip_id=6299622&server=vimeo.com&show_title=1&show_byline=1&show_portrait=0&color=&fullscreen=1" /><embed src="http://vimeo.com/moogaloop.swf?clip_id=6299622&server=vimeo.com&show_title=1&show_byline=1&show_portrait=0&color=&fullscreen=1" type="application/x-shockwave-flash" allowfullscreen="true" allowscriptaccess="always" width="400" height="225"></embed></object><p><a href="http://vimeo.com/6299622">Computing Cloud Rendering</a> from <a href="http://vimeo.com/user168788">binarymillenium</a> on <a href="http://vimeo.com">Vimeo</a>.</p><br /><br />It looks kind of cool, but not too exciting- but there's potential for better things.<br /><br />What I've done is launched several compute instances on EC2, where worker nodes create individual lines seen in the plots, and then pass data back to a head node, which creates the plots, puts them on a web page for real-time feedback, and stores all the frames for retrieval at the end of the run. <br /><br />The plots are aggregations of all the results, blue is the presence of any line, and white is a high density of lines, and greenish tinge signifies the line was from a recently aggregated set. It's interesting because the more lines are aggregated, the less the plot changes, so it becomes increasingly boring.<br /><br />All the plotting and data generation is done using java applications exported from Processing. 3D graphics are also possible, and something like this <a href="http://vimeo.com/5648243">earlier video</a> could be ported to the scripts I've made. 
There is no graphics card accessible on the EC2 machines, but virtual frame buffer software like Xvfb and software rendering (either Processing's P3D or software opengl) make it possible to trick the application into thinking there is.<br /><br />It's not distributed rendering since all the rendering is on one computer, but I think I need to distribute the rendering in order to speed it up.<br /><br />There is potential for more dynamic applications, involving user interaction through webpages, or simulations that interact with the results of previous simulations, and communicate with other nodes to alter what they are doing.binarymilleniumhttp://www.blogger.com/profile/17419830604356775608noreply@blogger.com0tag:blogger.com,1999:blog-28093388.post-50014076717793318662009-08-23T17:30:00.000-07:002009-08-23T20:47:36.628-07:00Save Image As And Close Tab Firefox AddonI haven't made a firefox addon before, but I thought I'd try something simple: <a href="http://binarymillenium.googlecode.com/svn/trunk/mozilla/saveasclosetab/saveasclosetab.xpi">combine the context menu "Save Image As..." with closing the current tab</a>. My contribution consists of putting these two lines together:<br /><code><br />gContextMenu.saveImage();<br />gBrowser.removeCurrentTab();<br /></code> <br /><br />To start out with I used the <a href="http://ted.mielczarek.org/code/mozilla/extensionwiz/">Firefox/Thunderbird Extension Wizard</a>. Initially I didn't select the 'Create context menu item' option, and that may have caused problems with gContextMenu not being defined - it was either that or the fact that I was trying to embed the commands in the firefoxOverlay.xul file as inline javascript instead of putting them in the overlay.js file.<br /><br />I found the first function by searching the firefox source code for the menuitem name "Save Image As", and from there finding saveImage. 
The removeCurrentTab function was harder to find, but this addon provided source code that showed it: <a href="https://addons.mozilla.org/en-US/firefox/addon/1466">Stephen Clavering's CTC</a>.<br /><br />Tutorial pages I initially found about extension development were helpful, but I didn't see anything that talks about mozilla fundamentals- probably I need to find a book about it.<br /><br />This addon goes well with the <a href="https://addons.mozilla.org/en-US/firefox/addon/710">Menu Editor</a> and <a href="https://addons.mozilla.org/en-US/firefox/addon/25">Download Sort</a>.<br /><br />There is code in the real Save Image As for determining whether an image is actually selected (my addon shows up regardless) that I should add next, and there should be logic that prevents the close action if the save was canceled (I'm less sure how to do that).binarymilleniumhttp://www.blogger.com/profile/17419830604356775608noreply@blogger.com0tag:blogger.com,1999:blog-28093388.post-88186584828140323432009-08-05T09:32:00.000-07:002009-08-05T09:33:14.813-07:00Quick jmatio in Processing example1. Download jmatio from <a href="http://www.mathworks.com/matlabcentral/fileexchange/10759">mathworks file exchange</a><br />2. unzip and put contents in folder called jmatio<br />3. rename lib dir to library<br />4. rename library/jamtio.jar to library/jmatio.jar<br />5. create a mat file in the sketch data dir called veh_x.mat which contains an array called veh_x<br />6. 
Run the following code:<br /><br /><code><br />import com.jmatio.io.*;<br />import com.jmatio.types.*;<br /><br /> MatFileReader mfr = null;<br /> try {<br /> mfr = new MatFileReader(sketchPath + "/data/veh_x.mat" );<br /> } catch (IOException e) {<br /> e.printStackTrace();<br /> exit(); <br /> }<br /> <br />if (mfr != null) {<br /> double[][] data = ((MLDouble)mfr.getMLArray( "veh_x" )).getArray();<br /> <br /> println(data.length +" " + data[0].length + " " + data[0][0]);<br /> <br />}<br /><br /></code><br /><br />TBD use getContents instead of requiring the mat file name and array name be the same.binarymilleniumhttp://www.blogger.com/profile/17419830604356775608noreply@blogger.com1tag:blogger.com,1999:blog-28093388.post-82246469386807194192009-07-03T14:59:00.000-07:002009-07-03T15:14:11.999-07:00Building Bundler v0.3 on Ubuntu'The Office Box' requested help with running Bundler on linux (specifically Ubuntu 9 - I have a virtualbox vm of Ubuntu 8.04 fully updated to today, but I'll try this out on Ubuntu 9 soon) so I went through the process myself.<br /><br />The binary version depends on libgfortran.so.3, which I couldn't find with aptitude, so I tried building from source - it turned out to be not that hard. There is no 'configure' for bundler 0.3 to search for dependencies that aren't installed, so I built incrementally and installed packages as I ran into build failures. 
I might be missing a few I already had installed for other purposes, but do a sudo aptitude install on the following:<br /><code><br />build-essential<br />gfortran-4.2 <br />zlib1g-dev<br />libjpeg-dev<br /></code><br /><br />A missing gfortran produces the cryptic 'error trying to exec 'f951': execvp: No such file or directory)' message.<br /><br />These might be necessary, I'm not sure:<br /><code><br />lapack3<br />libminpack1<br />f2c<br /></code><br /><br />After that, run the provided makefile, add the bundler bin folder to your LD_LIBRARY_PATH, and then go into the examples/kermit directory and run ../../RunBundler.sh to see that there are good ply files in the bundle directory. Bundler is a lot slower than Photosynth for big jobs; I haven't tried the intel math libs though.<br /><br />The full output from a successful kermit RunBundler run looks like this:<br /><code><br />Using directory '.'<br />0<br />Image list is list_tmp.txt<br />[Extracting exif tags from image ./kermit000.jpg]<br /> [Focal length = 5.400mm]<br />[Couldn't find CCD width for camera Canon Canon PowerShot A10]<br />[Found in EXIF tags]<br /> [CCD width = 5.230mm]<br /> [Resolution = 640 x 480]<br /> [Focal length (pixels) = 660.803<br />[Extracting exif tags from image ./kermit001.jpg]<br /> [Focal length = 5.400mm]<br />[Couldn't find CCD width for camera Canon Canon PowerShot A10]<br />[Found in EXIF tags]<br /> [CCD width = 5.230mm]<br /> [Resolution = 640 x 480]<br /> [Focal length (pixels) = 660.803<br />[Extracting exif tags from image ./kermit002.jpg]<br /> [Focal length = 5.400mm]<br />[Couldn't find CCD width for camera Canon Canon PowerShot A10]<br />[Found in EXIF tags]<br /> [CCD width = 5.230mm]<br /> [Resolution = 640 x 480]<br /> [Focal length (pixels) = 660.803<br />[Extracting exif tags from image ./kermit003.jpg]<br /> [Focal length = 5.400mm]<br />[Couldn't find CCD width for camera Canon Canon PowerShot A10]<br />[Found in EXIF tags]<br /> [CCD width = 5.230mm]<br /> 
[Resolution = 640 x 480]<br /> [Focal length (pixels) = 660.803<br />[Extracting exif tags from image ./kermit004.jpg]<br /> [Focal length = 5.400mm]<br />[Couldn't find CCD width for camera Canon Canon PowerShot A10]<br />[Found in EXIF tags]<br /> [CCD width = 5.230mm]<br /> [Resolution = 640 x 480]<br /> [Focal length (pixels) = 660.803<br />[Extracting exif tags from image ./kermit005.jpg]<br /> [Focal length = 5.400mm]<br />[Couldn't find CCD width for camera Canon Canon PowerShot A10]<br />[Found in EXIF tags]<br /> [CCD width = 5.230mm]<br /> [Resolution = 640 x 480]<br /> [Focal length (pixels) = 660.803<br />[Extracting exif tags from image ./kermit006.jpg]<br /> [Focal length = 5.400mm]<br />[Couldn't find CCD width for camera Canon Canon PowerShot A10]<br />[Found in EXIF tags]<br /> [CCD width = 5.230mm]<br /> [Resolution = 640 x 480]<br /> [Focal length (pixels) = 660.803<br />[Extracting exif tags from image ./kermit007.jpg]<br /> [Focal length = 5.400mm]<br />[Couldn't find CCD width for camera Canon Canon PowerShot A10]<br />[Found in EXIF tags]<br /> [CCD width = 5.230mm]<br /> [Resolution = 640 x 480]<br /> [Focal length (pixels) = 660.803<br />[Extracting exif tags from image ./kermit008.jpg]<br /> [Focal length = 5.400mm]<br />[Couldn't find CCD width for camera Canon Canon PowerShot A10]<br />[Found in EXIF tags]<br /> [CCD width = 5.230mm]<br /> [Resolution = 640 x 480]<br /> [Focal length (pixels) = 660.803<br />[Extracting exif tags from image ./kermit009.jpg]<br /> [Focal length = 5.400mm]<br />[Couldn't find CCD width for camera Canon Canon PowerShot A10]<br />[Found in EXIF tags]<br /> [CCD width = 5.230mm]<br /> [Resolution = 640 x 480]<br /> [Focal length (pixels) = 660.803<br />[Extracting exif tags from image ./kermit010.jpg]<br /> [Focal length = 5.400mm]<br />[Couldn't find CCD width for camera Canon Canon PowerShot A10]<br />[Found in EXIF tags]<br /> [CCD width = 5.230mm]<br /> [Resolution = 640 x 480]<br /> [Focal length (pixels) 
= 660.803<br />[Found 11 good images]<br />[- Extracting keypoints -]<br />Finding keypoints...<br />1245 keypoints found.<br />Finding keypoints...<br />1305 keypoints found.<br />Finding keypoints...<br />1235 keypoints found.<br />Finding keypoints...<br />1220 keypoints found.<br />Finding keypoints...<br />1104 keypoints found.<br />Finding keypoints...<br />1159 keypoints found.<br />Finding keypoints...<br />949 keypoints found.<br />Finding keypoints...<br />1108 keypoints found.<br />Finding keypoints...<br />1273 keypoints found.<br />Finding keypoints...<br />1160 keypoints found.<br />Finding keypoints...<br />1122 keypoints found.<br />[- Matching keypoints (this can take a while) -]<br />../../bin/KeyMatchFull list_keys.txt matches.init.txt<br />[KeyMatchFull] Reading keys took 1.020s<br />[KeyMatchFull] Matching to image 0<br />[KeyMatchFull] Matching took 0.010s<br />[KeyMatchFull] Matching to image 1<br />[KeyMatchFull] Matching took 0.170s<br />[KeyMatchFull] Matching to image 2<br />[KeyMatchFull] Matching took 0.380s<br />[KeyMatchFull] Matching to image 3<br />[KeyMatchFull] Matching took 0.560s<br />[KeyMatchFull] Matching to image 4<br />[KeyMatchFull] Matching took 0.740s<br />[KeyMatchFull] Matching to image 5<br />[KeyMatchFull] Matching took 0.960s<br />[KeyMatchFull] Matching to image 6<br />[KeyMatchFull] Matching took 1.060s<br />[KeyMatchFull] Matching to image 7<br />[KeyMatchFull] Matching took 1.210s<br />[KeyMatchFull] Matching to image 8<br />[KeyMatchFull] Matching took 1.410s<br />[KeyMatchFull] Matching to image 9<br />[KeyMatchFull] Matching took 1.600s<br />[KeyMatchFull] Matching to image 10<br />[KeyMatchFull] Matching took 1.760s<br />[- Running Bundler -]<br />[- Done -]<br /></code>binarymilleniumhttp://www.blogger.com/profile/17419830604356775608noreply@blogger.com19tag:blogger.com,1999:blog-28093388.post-55885765606425497342009-06-16T05:08:00.000-07:002009-06-24T07:37:03.988-07:00Xbox Project NatalA little less than a 
year ago I remember stumbling across <a href="http://www.3dvsystems.com/gallery/gallery.html">the Zcam from 3DV Systems</a>, the company promised two orders of magnitude decreases in the cost of flash array lidar through mass production- the trick is to market it as a device anyone can use, not just as a robotics or general automation tool. The company promised to be in the market by the end of 2008, and after emails went unanswered I assumed it was vaporware. <br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://www.3dvsystems.com"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 382px; height: 400px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg1KeGREuiMB3OMOCRnES-oYzd63b1HvMPEMsj296vpC9D8AifMRTIi-MVovyk-j4BCKmv7D8ZZXkJ91NWH1zGeEtLHcV3xPX7sNE5Kmgp-Q4RJS3T1SoqU3-5CvAtQU0O2AKau/s400/pic3.jpg" border="0" alt=""id="BLOGGER_PHOTO_ID_5347905888547478850" /></a><br /><br />The closest competitor would be the <a href="http://www.mesa-imaging.ch/prodview4k.php">Mesa Imaging SwissRanger</a>, which I think goes for $5000-$10000. Beyond that there are very expensive products from <a href="http://www.advancedscientificconcepts.com/">Advanced Scientific Concepts</a> or <a href="http://www.ballaerospace.com/page.jsp?page=30&id=297">Ball Aerospace</a> that are in the hundreds of thousands of dollars range at least. ASC made a deal with iRobot that might bring the price down through economies of scale, though they probably aren't going to put it on the Roomba anytime soon. More likely the Packbot which already costs $170K, why not round that up to half a million?<br /><br />In late 2008 to early 2009 rumors surfaced that Microsoft was going to buy 3DV Systems, and now we have the official announcements about Natal. 
And of course no mention of 3DV Systems (which hasn't updated their webpage in over a year) or even how it measures the phase shift or time of flight of light pulses in a sensor array to produce depth images. Given enough processing power, the right software, and good lighting, it would be possible to do everything seen in the Natal videos with a single camera. The next step up would be stereo vision to get depth images- it's possible that's what Natal is, but it seems like they would have mentioned that since that technology is so conventional.<br /><br />But that won't stop me from speculating:<br /><br />Natal is probably a 0.5-2 megapixel webcam combined with a flash lidar with a resolution of 64x64 or 128x128 pixels, and maybe a few dozen levels of depth bins.<br /><br />The low resolution means there is a ton of software operating on the video image and the depth information to derive the skeletal structure for full body motion capture. All that processing means the speed and precision is going to be somewhat low- it would be great to buy one of these and be able to record body movements for use in 3D animation software, machinima, independent games, or full body 3D chat (there's no easy way to do intersections or collisions with other people in an intuitive way so don't get too excited), but I doubt it will capture a lot of nuance.<br /><br />The lidar might be continuous wave (CW) like the SwissRanger. This has an interesting property where beyond the maximum range of the sensor, objects appear closer again- if the range was 10 feet, an object 12 feet away is indistinguishable from one 2 feet away, or 22 feet away.<br /><br />Beyond that, hopefully MS sees the potential for this beyond an Xbox peripheral. It would be criminal not to be able to plug this into a PC, and have at least Windows drivers, an SDK + DirectX support. The next most obvious thing would be to use it to promote MS Robotics Studio, and offer a module for that software to use the Natal. 
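The range rollover of a continuous-wave sensor described above is just modular arithmetic - with the hypothetical 10 foot maximum range:

```python
def apparent_range(true_range_ft, max_range_ft=10.0):
    """A CW time-of-flight sensor reports range modulo its ambiguity interval."""
    return true_range_ft % max_range_ft

# 2 ft, 12 ft, and 22 ft all look the same to the sensor:
print([apparent_range(r) for r in (2, 12, 22)])  # [2.0, 2.0, 2.0]
```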
If it just has a USB connection then it could be placed on a moderately small mobile robot, and software could use the depth maps for collision avoidance and with some processing power be able to compute 3D or 2D grid maps (maybe like <a href="http://vimeo.com/2383465">this</a>) and figure out when it has returned to the same location.<br /><br />The next step is to make a portable camera that takes a high megapixel normal image along with a depth image. Even with the low resolution and limited range (or range that rolls over), the depth information could be passed on to photosynth to reduce the number of pictures needed to make a good synth. MS doesn't make cameras, but why not license the technology to Nikon or Canon? Once in dedicated cameras, it's on to cell phone integration...<br /><br />The one downside is that the worst application seems to be as a gaming device, which is bad because I'd like it to be very successful in order to inspire competing products and later generations of the same technology. 
It is certainly not going to have the precision of a Wii MotionPlus, and maybe not even a standard Wii controller (granted that it can do some interesting things that a Wii controller can't).<br /><br />But even if it isn't a huge success, it should be possible to get a device from the first generation, and it's only a matter of time before someone hacks it and produces Linux drivers, right?binarymilleniumhttp://www.blogger.com/profile/17419830604356775608noreply@blogger.com1tag:blogger.com,1999:blog-28093388.post-5462257462600775612009-04-06T06:05:00.000-07:002009-04-06T06:26:09.665-07:00OpenCV example, and why does Google do so poorly?Take searching for cvGetSpatialMoment:<br /><a href="http://www.google.com/search?hl=en&q=cvGetSpatialMoment&btnG=Google+Search&aq=f&oq=">http://www.google.com/search?hl=en&q=cvGetSpatialMoment&btnG=Google+Search&aq=f&oq=</a><br /><br />All the top results are nearly useless, just code that doesn't help much if you don't know what cvGetSpatialMoment does.<br /><br />The "CV Reference Manual" that comes with an install of OpenCV probably should come up first (the local html files of course aren't google searchable), or any real text explanation or tutorial of the function. So scrolling down further there are some odd but useful sites like <a href="http://www.ieeta.pt/~jmadeira/OpenCV/OpenCVdocs/ref/opencvref_cv.htm">http://www.ieeta.pt/~jmadeira/OpenCV/OpenCVdocs/ref/opencvref_cv.htm</a>. 
I guess the official <a href="http://opencv.willowgarage.com/wiki/CxCore">Willow Garage docs here</a> haven't been linked to enough.<br /><br />The <a href="http://books.google.com/books?id=seAgiOfu2EIC&printsec=frontcover&dq=opencv#PPP1,M1">official OpenCV book on Google</a> is highly searchable, some pages are restricted but many are not.<br /><br />Through all that frustration I did manage to learn a lot of basics to load an image and process a portion of the image to look for a certain color, and then find the center of the region that has that color.<br /><br /><blockquote><code>IplImage* image = cvLoadImage( base_filename, CV_LOAD_IMAGE_COLOR );</code></blockquote><br /><br />split it into two halves for separate processing<br /><blockquote><code>IplImage* image_left = cvCreateImage( cvSize( image->width/2, image->height), IPL_DEPTH_8U, 3 );<br />cvSetImageROI( image, cvRect( 0, 0, image->width/2, image->height ) );<br />cvCopy( image, image_left );</code></blockquote><br /><br />convert it to hsv color space<br /><blockquote><code> IplImage* image_left_hsv = cvCreateImage( cvSize(image_left->width, image_left->height), IPL_DEPTH_8U, 3 );<br />cvCvtColor(image_left,image_left_hsv,CV_BGR2HSV);</code></blockquote><br /><br />get only the hue component using the COI '[color] Channel Of Interest' function<br /><blockquote><code>IplImage* image_left_hue = cvCreateImage( cvSize(image_left->width, image_left->height), IPL_DEPTH_8U, 1 );<br />cvSetImageCOI( image_left_hsv, 1);<br />cvCopy(image_left_hsv, image_left_hue); </code></blockquote><br /><br />find only the parts of an image within a certain hue range<br /><blockquote><code>cvInRangeS(image_left_hue, cvScalarAll(huemin), cvScalarAll(huemax), image_msk);</code></blockquote><br /><br />erode it down to get rid of noise<br /><blockquote><code>cvErode(image_msk,image_msk,NULL, 3);</code></blockquote><br /><br />and then find the centers of mass of the found regions <br /><blockquote><code>CvMoments moments;<br 
/> cvMoments(image_msk, &moments, 1);<br /> double m00, m10, m01;<br /><br /> m00 = cvGetSpatialMoment(&moments, 0,0);<br /> m10 = cvGetSpatialMoment(&moments, 1,0);<br /> m01 = cvGetSpatialMoment(&moments, 0,1);<br /> <br /> // TBD check that m00 != 0<br /> float center_x = m10/m00;<br /> float center_y = m01/m00;</code></blockquote><br /><br />Copy the single channel mask back into a three channel rgb image<br /><blockquote><code> <br /> IplImage* image_rgb = cvCreateImage( cvSize(image_msk->width, image_msk->height), IPL_DEPTH_8U, 3 );<br /> cvSetImageCOI( image_rgb, 2);<br /> cvCopy(image_msk,image_rgb);<br /> cvSetImageCOI( image_rgb, 0);</code></blockquote><br /><br />and draw circles on a temp image where the centers of mass are<br /><blockquote><code>cvCircle(image_rgb,cvPoint(int(center_x),int(center_y)), 10, CV_RGB(200,50,50),3);</code></blockquote><br /><br />All the work of setting channels of interest and regions of interest was new to me. Creating many new images rather than operating on them in place takes up more memory (and I need to remember to free all of them), but for debugging it's nice to keep the intermediate steps around.binarymilleniumhttp://www.blogger.com/profile/17419830604356775608noreply@blogger.com3
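For reference, the centroid math that cvGetSpatialMoment supports can be reproduced on a toy binary mask in pure Python (no OpenCV, just to show what m00, m10, and m01 mean):

```python
def centroid(mask):
    """Compute spatial moments of a binary mask and return the center of mass."""
    m00 = m10 = m01 = 0.0
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            m00 += v      # zeroth moment: total mass (pixel count for a 0/1 mask)
            m10 += x * v  # first moment in x
            m01 += y * v  # first moment in y
    return (m10 / m00, m01 / m00) if m00 else None  # the m00 != 0 check from above

# A 2x2 blob with its top-left corner at (1, 1):
mask = [[0, 0, 0, 0],
        [0, 1, 1, 0],
        [0, 1, 1, 0],
        [0, 0, 0, 0]]
print(centroid(mask))  # (1.5, 1.5)
```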