OFH has extensive expertise in 3D mapping and distance measurement; we have worked on LIDAR, stereo imaging, time-of-flight, computational photography, light coding, structured illumination, and many other methods. Our clients have sold tens of millions of units and are global leaders in robotic vision.
Below we describe a new approach developed by OFH and described in US Application No. 14/256,085. This method uses a pattern projector and an astigmatic lens placed in front of an image sensor to generate a depth map.
Circular spots are projected (via a laser with a lenslet array or grating)
Astigmatic spots are collected at the image sensor
The image is decoded using various methods (Hough transform, boundary definition)
Distance is determined by the ratio between the long and short axes of each spot
The method is not sensitive to multi-path interference
It is less sensitive to changes in object reflection/scattering, or to changes in ambient light within a scene, because it relies on changes in spot shape, not spot intensity, to determine distance
For certain applications the system may be used without a projected pattern
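The decoding steps above can be sketched in code. The calibration table and function below are illustrative assumptions (a real system would calibrate the axis-ratio-to-distance curve by imaging targets at known depths); they are not the decoder from the patent application:

```python
import bisect

# Hypothetical calibration: measured long/short axis ratio of an
# astigmatic spot versus known object distance in metres.
CALIBRATION = [
    (0.80, 0.5),   # ratio < 1: the other axis dominates (near side of focus)
    (1.00, 1.0),   # circular spot: object at the balanced focus distance
    (1.40, 2.0),
    (1.90, 3.0),
]

def distance_from_ratio(ratio):
    """Linearly interpolate object distance from the spot axis ratio."""
    ratios = [r for r, _ in CALIBRATION]
    if ratio <= ratios[0]:
        return CALIBRATION[0][1]
    if ratio >= ratios[-1]:
        return CALIBRATION[-1][1]
    i = bisect.bisect_left(ratios, ratio)
    (r0, d0), (r1, d1) = CALIBRATION[i - 1], CALIBRATION[i]
    t = (ratio - r0) / (r1 - r0)
    return d0 + t * (d1 - d0)
```

For example, with the table above a measured ratio of 1.40 decodes to 2.0 m, and a ratio of 1.65 interpolates to 2.5 m. Because the ratio, not the absolute intensity, carries the depth signal, changes in target reflectivity or ambient light shift both axes together and largely cancel out.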
The system can be very small (in fact, performance is better when the projection axis and collection path are next to each other)
By adding astigmatism to the image-collecting objective lens, the shape of the point spread function (PSF) becomes dependent on the distance to the object: the eccentricity ε of the elliptical PSF varies with the distance to the object.
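The eccentricity of an elliptical spot can be estimated from the second central moments of its pixels (the covariance of the pixel cloud). This is a generic image-moments sketch on synthetic data, assumed here for illustration rather than taken from the application:

```python
import math

def spot_eccentricity(pixels):
    """Estimate eccentricity of a spot from its pixel coordinates
    via second central moments (covariance of the pixel cloud)."""
    n = len(pixels)
    cx = sum(x for x, _ in pixels) / n
    cy = sum(y for _, y in pixels) / n
    mxx = sum((x - cx) ** 2 for x, _ in pixels) / n
    myy = sum((y - cy) ** 2 for _, y in pixels) / n
    mxy = sum((x - cx) * (y - cy) for x, y in pixels) / n
    # Eigenvalues of the 2x2 covariance matrix are proportional to the
    # squared semi-axes of the best-fit ellipse.
    tr, det = mxx + myy, mxx * myy - mxy ** 2
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    a2, b2 = tr / 2 + disc, tr / 2 - disc   # long^2, short^2 (up to scale)
    return math.sqrt(1 - b2 / a2)

# Synthetic elongated spot: filled 2:1 ellipse sampled on a pixel grid.
spot = [(x, y) for x in range(-10, 11) for y in range(-5, 6)
        if (x / 10) ** 2 + (y / 5) ** 2 <= 1]
```

For an ideal 2:1 ellipse the eccentricity is √(1 − 1/4) ≈ 0.87, and the discrete sampling above lands close to that value; a circular in-focus spot would give ε ≈ 0.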
The passive layout means that the system does not use structured illumination. In the image below, the test targets have a grid pattern to better show the astigmatic effect. The distances from the camera to the targets are: left: 1.20 m; middle: 2.70 m; right: 0.50 m.
Copyright © 2002- 2018 Ellis Amalgamated LLC, All rights reserved.