2008-07-30

Velodyne Lidar- Monterey Data




There's something wrong with how I'm viewing the data- the streaks indicate that every revolution of the lidar adds an angular drift. Maybe I'm re-adding a rotational offset and letting it accumulate?

No, it turned out to be a combination of things- mainly applying the wrong calibration data (essentially swapping the lower and upper laser blocks). Now it's starting to look pretty good, though there is a lot of blurriness because the environment is changing over time- that's a car pulling out into an intersection.




Velodyne Lidar - Monterey from binarymillenium on Vimeo.

2008-07-29

Velodyne Lidar Sample Data: Getting a .pcap into Python

Velodyne has provided me with this sample data from their HDL-64E lidar.

Instead of data exported from their viewer software, it's a pcap captured with Wireshark or another network capture tool that uses the standard pcap format.

Initially I was trying to extract the lidar packets with libpcap and Python, using pcapy or similar, and went through a lot of trouble getting the pcap library to build in the cygwin environment. Python and libpcap were able to load the data from the pcap, but rendered the binary into a long string with escape codes like '\xff'.

I then discovered that Wireshark has a function called 'follow udp stream'- right-click on the data part of a packet in Wireshark, follow the stream, and then export as 'raw'. The exported data no longer preserves the division between packets, but since each packet has a consistent length (1206 bytes) it's easy to parse.
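Since the raw export is nothing but those fixed-size packets laid end to end, splitting it back into packets is just a matter of reading fixed-size chunks- a sketch (the filename and function name are placeholders):

```python
def read_packets(path, packet_size=1206):
    """Return a list of fixed-size packet byte strings from a raw dump."""
    packets = []
    with open(path, 'rb') as f:  # binary mode, or the data gets mangled
        while True:
            chunk = f.read(packet_size)
            if len(chunk) < packet_size:
                break  # ignore any trailing partial packet
            packets.append(chunk)
    return packets
```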



Python does work well for loading binary data from file:


import array

f = open('data.raw', 'rb') # the data exported from wireshark in the raw format; open in binary mode

bin = array.array('B') # set up an array with typecode 'B' (unsigned byte)

bin.fromfile(f, 1206) # each packet has 1206 bytes of data



The contents of bin now have 1206 bytes of data that looks like this:


>>> bin
array('B', [255, 221, 33, 39, 5, 9, 67, 178, 8, 116, 160, 13, 126, 222, 13, 63, 217, 8, 162, 204, ...


The 'B' typecode shown in the repr doesn't prevent indexing into the real data- bin[0] is 255, and so on.

The first two bytes indicate whether the data is from the upper or lower laser block, and the 33 and 39 form a little-endian 16-bit value giving the rotation of the sensor head in hundredths of a degree: (39*256 + 33)/100, or 100.17 degrees.

After the rotation bytes there are 32 readings, each a pair of distance bytes followed by a single intensity byte; then the whole block starts over, for a total of 12 blocks per packet, with 6 more bytes at the end whose information is unnecessary for now. See here for what I currently have for parsing the raw file; later I will get into adding the per-laser calibration data.
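The layout described above can be sketched as a parser- the field names and return structure are my own, not from Velodyne's documentation:

```python
def parse_packet(data):
    """Parse one 1206-byte packet into 12 (flag, rotation, readings) blocks.

    Each 100-byte block: 2-byte flag, 2-byte rotation, then 32 readings of
    2 distance bytes + 1 intensity byte. The 6 trailing status bytes are
    ignored. 'data' is anything indexable as bytes (bytearray, array('B')).
    """
    blocks = []
    for b in range(12):
        off = b * 100
        # 2-byte little-endian flag: which laser block this came from
        flag = data[off] | (data[off + 1] << 8)
        # rotation of the sensor head, in hundredths of a degree
        rotation = (data[off + 2] | (data[off + 3] << 8)) / 100.0
        readings = []
        for i in range(32):
            p = off + 4 + i * 3
            dist = data[p] | (data[p + 1] << 8)  # 2-byte distance
            intensity = data[p + 2]              # 1-byte intensity
            readings.append((dist, intensity))
        blocks.append((flag, rotation, readings))
    return blocks
```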

2008-07-26

More HoC: Preprocessing into pngs

house of cards height
house of cards intensity

2008-07-21

Radiohead - House of Cards 2


Radiohead - House of Cards from binarymillenium on Vimeo.

This rendered slowly, less than 1 fps. One possible speedup would be to pre-process the csv data into binary heightmap files, rather than loading and processing each frame.
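That pre-processing could be as simple as packing each csv row into flat binary floats once, so each frame loads without any text parsing- a sketch, with placeholder filenames:

```python
import csv
import struct

def csv_to_bin(csv_path, bin_path):
    """Pack the x,y,z columns of a csv frame into little-endian floats."""
    with open(csv_path) as src, open(bin_path, 'wb') as dst:
        for row in csv.reader(src):
            x, y, z = (float(v) for v in row[:3])
            dst.write(struct.pack('<fff', x, y, z))
```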

Processing code is here:
http://code.google.com/p/binarymillenium/source/browse/trunk/processing/hoc/hoc.pde

It would be nice if they uploaded the data from the woman singing, and the party scene.

2008-07-15

Radiohead - House of Cards



I've been playing with the data for a couple of hours in Processing. The main problem is that the points are not consistent across animation frames, so it is necessary to produce a new set of points in a regular grid that can then be tessellated easily. By the end of the week I ought to have a video up in the official group on youtube and in higher quality on vimeo.
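One way to produce that regular grid is to bin the scattered points and average the heights that land in each cell- a minimal sketch in Python rather than Processing, where the grid size and extents are assumptions:

```python
def grid_points(points, nx, ny, xmin, xmax, ymin, ymax):
    """Bin scattered (x, y, z) points onto an nx-by-ny grid of mean heights.

    Returns a list of rows; cells that received no points are None.
    """
    sums = [[0.0] * nx for _ in range(ny)]
    counts = [[0] * nx for _ in range(ny)]
    for x, y, z in points:
        # map the point into integer cell coordinates
        i = int((x - xmin) / (xmax - xmin) * nx)
        j = int((y - ymin) / (ymax - ymin) * ny)
        if 0 <= i < nx and 0 <= j < ny:
            sums[j][i] += z
            counts[j][i] += 1
    return [[sums[j][i] / counts[j][i] if counts[j][i] else None
             for i in range(nx)]
            for j in range(ny)]
```

Because every frame is resampled onto the same grid, the mesh topology stays fixed and only the heights change, which is exactly what makes the tessellation easy.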