OFH has extensive expertise in 3D mapping and distance measurement. We have worked on LIDAR, stereo imaging, time-of-flight, computational photography, light coding, structured illumination, and many other methods. Our clients have sold tens of millions of units and are global leaders in robotic vision.
Below we describe a new approach developed by OFH and described in US Application No. 14/256,085. This method uses a pattern projector and an astigmatic lens placed in front of an image sensor to generate a depth map.
Circular spots are projected (via a laser with a lenslet array or grating)
Astigmatic spots are collected at the image sensor
The image is decoded using various methods (Hough transform, boundary definition)
Distance is determined by the ratio between the long and short axes of each spot
The method is not sensitive to multi-path interference
Less sensitive to changes in object reflection/scattering, or to changes in ambient light within a scene, because the method relies on changes in spot shape, not spot intensity, to determine distance
For certain applications the system may be used without a projected pattern
The system can be very small (in fact, performance is better when the projection axis and collection path are next to each other)
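The decoding step in the bullets above can be sketched in code. The following is an illustrative implementation only: it estimates a spot's long/short axis ratio from intensity-weighted image moments (a simpler alternative to the Hough transform or boundary definition mentioned above), tested on a synthetic astigmatic spot. All numbers and the moment-based approach are our assumptions, not the patented algorithm.

```python
import numpy as np

def spot_axis_ratio(img):
    """Estimate the long/short axis ratio of a bright elliptical spot
    from the second central moments of its intensity distribution."""
    ys, xs = np.nonzero(img > img.max() * 0.5)     # threshold at half maximum
    w = img[ys, xs].astype(float)
    cov = np.cov(np.vstack([xs, ys]), aweights=w)  # weighted 2x2 covariance
    eigvals = np.linalg.eigvalsh(cov)              # eigenvalues, ascending
    # Axis lengths scale as the square roots of the eigenvalues.
    return float(np.sqrt(eigvals[1] / eigvals[0]))

# Synthetic astigmatic spot: Gaussian with sigma_x = 6 px, sigma_y = 3 px,
# so the true axis ratio is 6/3 = 2 (illustrative values).
yy, xx = np.mgrid[0:64, 0:64]
spot = np.exp(-((xx - 32) ** 2 / (2 * 6.0 ** 2)
                + (yy - 32) ** 2 / (2 * 3.0 ** 2)))

ratio = spot_axis_ratio(spot)
print(f"estimated axis ratio: {ratio:.2f}")
```

In a real system this ratio would then be mapped to distance through a calibration curve measured for the specific astigmatic lens.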
By adding astigmatism to the image-collecting objective lens, the shape of the point spread function (PSF) becomes dependent on the distance to the object: the eccentricity ε of the elliptical PSF varies with the distance to the object.
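A minimal thin-lens sketch of why the PSF shape encodes distance: an astigmatic lens has different focal lengths for its two meridians, so the geometric blur diameter along each axis depends differently on the object distance, and their relative size varies monotonically between the two focal planes. All focal lengths, distances, and the signed shape measure below are hypothetical illustration values, not the parameters of the actual system.

```python
def blur_diameter(z, f, sensor_dist, aperture):
    """Geometric blur-spot diameter on the sensor for an object at
    distance z, for one lens meridian with focal length f (thin lens)."""
    s_image = 1.0 / (1.0 / f - 1.0 / z)   # in-focus image distance
    return aperture * abs(sensor_dist - s_image) / s_image

# Hypothetical astigmatic lens: the two meridians focus at different depths.
F_X, F_Y = 0.050, 0.052   # focal lengths of the two meridians [m]
SENSOR = 0.0525           # fixed sensor distance behind the lens [m]
APERTURE = 0.010          # aperture diameter [m]

def astigmatic_measure(z):
    """Signed PSF shape measure: -1 near the x-meridian focal plane,
    +1 near the y-meridian focal plane, monotonic in between
    (that interval is the working range of the depth sensor)."""
    bx = blur_diameter(z, F_X, SENSOR, APERTURE)
    by = blur_diameter(z, F_Y, SENSOR, APERTURE)
    return (bx - by) / (bx + by)

distances = (1.2, 2.0, 4.0)   # object distances inside the working range [m]
measures = [astigmatic_measure(z) for z in distances]
for z, m in zip(distances, measures):
    print(f"z = {z:.1f} m -> shape measure {m:+.2f}")
```

Inverting this monotonic relation (via a calibration table) recovers distance from the observed spot shape.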
Multi-path error is a characteristic feature of TOF-based rangefinders, caused by unwanted reflections from object surfaces placed at an angle.
Our proposed solution belongs to the “Depth from Defocus” class of range-finding methods, which are based on image analysis rather than on the analysis of light phase shifts used in TOF systems.
The multi-path error is most perceptible when the angle between surfaces is 90°, because the returning light travels in exactly the opposite direction. We have tested this worst case using two surfaces (glossy paper-coated carton) placed at a 90° angle (see the next slide, upper row of pictures).
Pictures obtained with the same flat object surface, but perpendicular to the system's optical axis, are shown for comparison (bottom row).
The astigmatic attachment can be optimized based on the required distance measurement range, the distance measurement (longitudinal) accuracy, the working wavelength, and the projected pattern geometry.
The transversal resolution of the distance map depends on the number of projected spots.
Resolution, accuracy, and repeatability are influenced by sensor quality and imaging lens quality.
The passive layout means that the system does not use structured illumination. In the image below, the test targets have a grid pattern to better show the astigmatic effect. The distances from the camera to the targets are: left: 1.20 m; middle: 2.70 m; right: 0.50 m.
Low sensitivity to illumination variations over the entire scene
Low sensitivity to the different colors in the scene
Easy adaptability to off-the-shelf lenses and/or to specific requirements regarding the range of distances to be recognized
This principle cannot resolve the so-called “White Wall” problem: the object to be measured needs a surface with optically resolvable features.
These disadvantages of the passive layout may be reduced by further refinement of the image processing algorithms, or practically eliminated by adding a structured-pattern projector, i.e. transforming this passive system into an active one.