Blogs

Test release for BlenSor 1.0.13

We want to give you an update on the progress of BlenSor development. One of the most important aspects of BlenSor 1.0.13 is support for Blender's material textures, including UV-mapped as well as procedural textures. This allows a more realistic simulation of materials that do not have a uniform reflectivity. Textures can be mapped to the material color and to the intensity of the diffuse reflection. This release also includes further improvements to the Kinect simulation (it is still very slow, but the results should be improved).
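Conceptually, texture-driven reflectivity means the texel under the hit point scales the diffuse response of the material. The following is a minimal toy sketch of that idea, not BlenSor's actual code; every name in it is made up for illustration:

```python
def return_intensity(base_diffuse, texture_value, incidence_cos):
    """Toy model: the texture value sampled at the hit point scales the
    diffuse reflectivity, so dark texels produce weaker returns.
    All names here are illustrative, not BlenSor API."""
    reflectivity = base_diffuse * texture_value     # texture modulates diffuse intensity
    return reflectivity * max(incidence_cos, 0.0)   # simple Lambertian falloff

# A bright texel reflects more than a dark one at the same incidence angle.
bright = return_intensity(0.8, 1.0, 0.9)
dark = return_intensity(0.8, 0.2, 0.9)
```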

Mesh and Pointcloud alignment

After upgrading to Blender 2.62.3 (which is the last step before we upgrade to Blender 2.63) we added a Python module to Blender's mathutils (mathutils.eigen). This module exposes a least-squares solver from the Eigen3 library and is used by our meshalign tool. meshalign will be available in the next BlenSor release; it allows you to align two meshes from point correspondences provided by the user. A short intro can be seen here
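At its core, aligning two meshes from point correspondences is a least-squares rigid registration problem. A self-contained NumPy sketch of the standard SVD-based (Kabsch) solution, shown here only to illustrate the math behind such a tool, not the mathutils.eigen module itself:

```python
import numpy as np

def align_rigid(src, dst):
    """Least-squares rigid transform: find R, t minimizing
    ||R @ p + t - q||^2 over corresponding points p in src, q in dst.
    src, dst: (N, 3) arrays of user-supplied correspondences."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

Given at least three non-collinear correspondences, the recovered transform can be applied to the whole mesh to bring it into alignment with the other one.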

Preparing upgrade to Blender 2.63

Since development towards Blender 2.63 is reaching its final stages, we are working on switching BlenSor from Blender 2.60 to Blender 2.63.
Hopefully the new BMesh system does not break the simulation. If everything works as expected, the switch will bring us polygonal faces and the
Remesh modifier.

Michael

Kinect sensor improvements

After Maurice sent me a scan from the Kinect outlining the major deficits in the simulation, I started adding the 9x9 matching window I mentioned in an earlier post. Combined with disparity quantization and the shadow effects caused by the displacement between the projector and the camera, the simulation already looks a lot better. This is a minimalistically modeled scene; it is loosely modeled after this image
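The matching-window idea can be illustrated with a bare-bones sum-of-absolute-differences search: for each pixel, a 9x9 patch is compared against horizontally shifted patches to find the best disparity. This is a sketch of the general block-matching technique, not BlenSor's implementation:

```python
import numpy as np

def block_match(left, right, row, col, max_disp, win=9):
    """Estimate the disparity of pixel (row, col) by comparing a
    win x win patch of `left` against shifted patches of `right`
    using the sum of absolute differences (SAD)."""
    h = win // 2
    patch = left[row - h:row + h + 1, col - h:col + h + 1]
    best_d, best_cost = 0, np.inf
    for d in range(max_disp + 1):
        if col - h - d < 0:          # candidate window would leave the image
            break
        cand = right[row - h:row + h + 1, col - h - d:col + h + 1 - d]
        cost = np.abs(patch - cand).sum()
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d
```

In the real sensor (and its simulation) the winning disparity is then refined to sub-pixel precision and quantized, which is where the characteristic depth banding comes from.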

Kinect sensor finished

A first version of the Kinect sensor is now finished. It quantizes the disparity to 1/8 pixel, since this is what the real Kinect does. Thanks to Alexandru Ichim for this info (pointcloud blog). I've scanned the human scene (you can download it from the scans section) to demonstrate the sensor. Here is a picture of the scan setup and a picture of the scan
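The 1/8-pixel quantization can be reproduced in a few lines: convert depth to disparity, snap the disparity to the 1/8-pixel grid, and convert back. A minimal sketch; the focal length and baseline below are placeholder values, not the real device constants:

```python
def quantize_depth(z, focal_px=580.0, baseline_m=0.075, step=1.0 / 8.0):
    """Quantize a depth value the way a disparity sensor does:
    depth -> disparity (pixels) -> nearest 1/8 pixel -> depth.
    focal_px and baseline_m are illustrative placeholders."""
    disparity = focal_px * baseline_m / z          # pixels
    disparity_q = round(disparity / step) * step   # snap to the 1/8-pixel grid
    return focal_px * baseline_m / disparity_q
```

Because disparity falls off as 1/z, the resulting depth levels grow further apart with distance, which is why far geometry in Kinect scans collapses onto visible depth bands.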

Kinect update

The occlusions due to parallax effects (the holes around objects close to the camera) are already supported properly.

I am currently adding the quantization effects and expect a release within the week.

Michael

Coming up - Kinect simulation

We are currently implementing support for PrimeSense sensors such as the Microsoft Kinect camera. It will properly support:
  • Holes in the depth map due to projector shadows (explanation)
  • Quantization effects (thanks to Alexandru Ichim)

Color support

Starting with version 1.0.10 (commit 1279ec4f91) the material color is stored for every return (every measured point). This does not yet apply to textures, but you can color your scene with different materials, and the RGB value is stored in the PCD file (or the EVD file). In the case of PCD files the colors can be displayed in the PCD viewer.

NOTE: There is still a color problem. The colors in the PCD viewer are not the same as the material colors in blender.
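For reference, the PCD format carries the color as a single float whose bit pattern holds the packed 24-bit 0xRRGGBB integer. A minimal sketch of that packing convention (the standard PCL scheme, not necessarily BlenSor's exact code):

```python
import struct

def pack_rgb(r, g, b):
    """Pack 8-bit RGB channels into the float used by the PCD 'rgb'
    field: the integer 0xRRGGBB reinterpreted as an IEEE-754 float32."""
    rgb_int = (r << 16) | (g << 8) | b
    return struct.unpack('f', struct.pack('I', rgb_int))[0]

def unpack_rgb(rgb_float):
    """Recover the 8-bit channels from the packed float."""
    rgb_int = struct.unpack('I', struct.pack('f', rgb_float))[0]
    return (rgb_int >> 16) & 0xFF, (rgb_int >> 8) & 0xFF, rgb_int & 0xFF
```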

Google Summer of Code 2012 proposal

Research in the field of autonomous vehicles is rapidly expanding. We believe that improving BlenSor through GSoC would help extend this research even further. Google, being one of the biggest players in this field, seems to be a perfect match for BlenSor. However, since BlenSor is still a very young project, we would like to participate in GSoC 2012 with the Blender Foundation as the umbrella organisation for the project. A student should extend the core functionality of BlenSor by implementing several of the following features:

BlenSor 1.0.9 released

Today I've merged the patches from Peter Morton, which brings you the 1.0.9 release of BlenSor. Changelog:
* Add scans to Blender even if a range of frames is scanned
* Scans that are done in sensor coordinates are now transformed to their world coordinates by transforming the whole pointcloud. This means the points are still represented in sensor coordinates but shown in their correct positions in world coordinates
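Transforming a whole sensor-frame point cloud into world coordinates is a single homogeneous matrix multiply. A NumPy sketch of the idea (the transform matrix and function name here are illustrative):

```python
import numpy as np

def transform_cloud(points, world_from_sensor):
    """Apply a 4x4 sensor-to-world transform to an (N, 3) point cloud.
    The stored point values stay in sensor coordinates; this only places
    them at their correct world positions for display."""
    n = points.shape[0]
    homog = np.hstack([points, np.ones((n, 1))])   # (N, 4) homogeneous points
    return (homog @ world_from_sensor.T)[:, :3]
```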
