2008-10-14



Hey lil feller. I'm just going to pick you up with this stick and you can hang on, take you back to the garden. I didn't mean to wake you up from your hibernation or whatever.

2008-10-07

ARToolKit rangefinding continued

I've discovered the arParamObserv2Ideal() function to correct for image distortion, and I think it has improved things. But my main problem is figuring out how to properly project the line from the origin through the laser dot in marker/fiducial space. I have a message out on the mailing list but it is not that active.
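For reference, the undistortion call itself is simple. Here's roughly how I'm using it (a sketch: undistort_dot is just my own wrapper name, and cparam is the camera parameter struct loaded at startup):

    #include <AR/param.h>

    /* Map the laser dot from observed (distorted) pixel coordinates to
       ideal (undistorted) coordinates using the loaded camera parameters.
       cparam comes from arParamLoad()/arParamChangeSize() at startup. */
    void undistort_dot(ARParam *cparam, double ox, double oy,
                       double *ix, double *iy)
    {
        arParamObserv2Ideal(cparam->dist_factor, ox, oy, ix, iy);
    }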



The results of my crude fillgaps processing app are shown below, using the somewhat sparse points from above.



The above results look pretty good: the points along the edge between the wall and floor are the furthest from the camera and so appear black, while the floor and wall toward the top and bottom of the image are closer and get brighter.
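The fillgaps app itself is nothing fancy- the gist is to repeatedly dilate known depth samples into the empty pixels around them. A sketch of the idea (not the actual source; the fixed dimensions and the zero-means-no-sample convention are just for illustration):

    #include <string.h>

    #define W 640
    #define H 480

    /* depth[][] holds the sparse range samples; 0.0f marks pixels with no
       sample yet.  Each pass fills an empty pixel with the average of its
       known 4-neighbors, gradually dilating samples into the gaps. */
    void fillgaps(float depth[H][W], int passes)
    {
        static float next[H][W];
        for (int p = 0; p < passes; p++) {
            memcpy(next, depth, sizeof(next));
            for (int y = 1; y < H - 1; y++) {
                for (int x = 1; x < W - 1; x++) {
                    float sum = 0.0f;
                    int n = 0;
                    if (depth[y][x] != 0.0f) continue;
                    if (depth[y-1][x] != 0.0f) { sum += depth[y-1][x]; n++; }
                    if (depth[y+1][x] != 0.0f) { sum += depth[y+1][x]; n++; }
                    if (depth[y][x-1] != 0.0f) { sum += depth[y][x-1]; n++; }
                    if (depth[y][x+1] != 0.0f) { sum += depth[y][x+1]; n++; }
                    if (n > 0) next[y][x] = sum / n;
                }
            }
            memcpy(depth, next, sizeof(next));
        }
    }

A few dozen passes smooths over the gaps between sample points, at the cost of blurring real depth edges.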

My main problem is getting live feedback on where I've already gotten points. With a live view showing all the found depth points it would be easier to achieve uniform coverage, rather than going off of memory. The problem there is that to use ARToolKit I have to detect the markers, then shrink the image down and draw dots over it- which doesn't sound too hard, but the first time I tried it the output got all messed up.
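The overlay step I have in mind is just a nearest-neighbor shrink with the stored dot locations scaled down by the same factor- something like this sketch (the function and argument names are made up for illustration):

    /* Downsample an interleaved image by an integer factor and stamp a
       white pixel at every (scaled) location where a depth sample exists. */
    void draw_overlay(const unsigned char *src, int sw, int sh, int channels,
                      unsigned char *dst, int factor,
                      const int *pt_x, const int *pt_y, int npts)
    {
        int dw = sw / factor, dh = sh / factor;
        for (int y = 0; y < dh; y++)
            for (int x = 0; x < dw; x++)
                for (int c = 0; c < channels; c++)
                    dst[(y * dw + x) * channels + c] =
                        src[((y * factor) * sw + x * factor) * channels + c];

        for (int i = 0; i < npts; i++) {
            int x = pt_x[i] / factor, y = pt_y[i] / factor;
            if (x < 0 || x >= dw || y < 0 || y >= dh) continue;
            for (int c = 0; c < channels; c++)
                dst[(y * dw + x) * channels + c] = 255;
        }
    }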

2008-10-05

Depth Maps with ARToolKit and a Laser Pointer

Flying home from a recent trip to the east coast, I tried to figure out what the most inexpensive method for approximating scanning lidar would be. This is my answer:


ARToolKit assisted laser rangefinding from binarymillenium on Vimeo.


It's not that inexpensive, since I'm using a high-resolution network camera similar to an Elphel- but I could possibly replace it with a good consumer still camera that has a Linux-supported USB interface for grabbing live images.




In the above screen shot, the line projected from the found fiducial is shown, along with a target marking the detected red laser dot. The two ought to cross each other, but I need to learn more about transforming coordinates into and out of camera space in ARToolKit to improve on this.







This picture shows the left side of the screen as 'further' away than the wall on the right, but that is not quite right: that region is definitely further from the fiducial, though not necessarily from the camera, so I may be making an error.
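What I understand of the transform side so far: arGetTransMat() gives a 3x4 matrix taking marker (fiducial) coordinates to camera coordinates, so applying it to a point should look something like this sketch (marker_to_camera is my own name for it):

    /* patt_trans is the 3x4 transform from arGetTransMat(); pm is a point
       in marker (fiducial) coordinates, pc the same point in camera
       coordinates. */
    void marker_to_camera(double patt_trans[3][4],
                          const double pm[3], double pc[3])
    {
        for (int i = 0; i < 3; i++)
            pc[i] = patt_trans[i][0] * pm[0]
                  + patt_trans[i][1] * pm[1]
                  + patt_trans[i][2] * pm[2]
                  + patt_trans[i][3];
    }

Once a point is in camera coordinates its range is just sqrt(pc[0]*pc[0] + pc[1]*pc[1] + pc[2]*pc[2]), which would measure distance from the camera rather than from the fiducial- presumably the fix for the picture above. Going the other way, arParamIdeal2Observ() should map ideal image coordinates back to observed pixels for drawing.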

source code

Intel Research Seattle - Open House

[photo: 2008.10.01 Intel Research Seattle]


Robotic Arm

[photos: 2008.10.01 Intel Research Seattle]


This robot had a camera and EM field sensors in its hand, and could detect the presence of an object within grasping distance. Some objects it had been trained to recognize; others it did not, but it would attempt to grab them anyway. Voice synthesis provided additional feedback- most humorously, when it accidentally (?) dropped something it said 'oops'. Motor feedback also sensed when the arm was pushed on, and the arm would give way- making it much safer than an arm that would blindly power through any obstacle.

[photos: 2008.10.01 Intel Research Seattle]


It's insane, this application level taint

[photo: 2008.10.01 Intel Research Seattle]


I don't really know what this is about...


Directional phased array wireless networking

[photo: 2008.10.01 Intel Research Seattle]


The phased array part is pretty cool, but the application wasn't that compelling: using two directional antennas, a select zone can be provided with wireless access while other zones not overlapped by the two are excluded. Maybe it's more power efficient that way?

[photos: 2008.10.01 Intel Research Seattle]


This antenna also had a motorized base, so that comparisons between physically rotating the antenna and rotating the field pattern could be made.

Haptic squeeze

[photo: 2008.10.01 Intel Research Seattle]


This squeeze thing has a motor in it to resist pressure, but it was broken at the time I saw it. The presenter said it wasn't really intended to simulate handling real objects in virtual space like other haptic interfaces might, but to be used more abstractly as an interface to anything.

[photo: 2008.10.01 Intel Research Seattle]


RFID Accelerometer


[photos: 2008.10.01 Intel Research Seattle]


This was one of two RFID accelerometers. Powered entirely from the RFID antenna, the device sends back accelerometer data to rotate a planet on a computer screen. The range was very limited, and the update rate was about 10 Hz. The second device could be charged within range of the field, taken out of range and moved around, then brought back to the antenna to download a time history (only 2 Hz in that mode) of the measurements taken. The canonical application is putting the device in a shipped package and then reading what it experienced upon receipt.

Wireless Resonant Energy

[photos: 2008.10.01 Intel Research Seattle]


This has had plenty of coverage elsewhere but is very cool to see in person. Currently, moving the receiver end more than an inch forward or back, or rotating it, causes the light bulb to dim and go out.

Scratch Interface
[photo: 2008.10.01 Intel Research Seattle]


A simple interface where placing a microphone on a surface and then tapping on the surface in distinct ways can be used to control a device. There was also a very simple demo of using OpenCV face tracking to reorient a displayed video, correcting for the distortion seen when viewing a flat screen from an angle.

Look around the building

[photos: 2008.10.01 Intel Research Seattle]


I found myself walking in a circle without quite intuitively feeling that I had completed a circuit when I actually had.


Also see another article about this, Intel's flickr photos, and a video.