
LIDAR Lens Design


Not all optical systems require sharp images and low aberrations. Imaging systems are evaluated with metrics such as MTF, Strehl ratio, and wavefront error, but other systems, such as solar collectors, form no image at all; their goal is to maximize the captured optical power. This is also the case with LIDAR (Light Detection and Ranging) systems, fundamental components of autonomous vehicles.

LIDAR

The main function of a LIDAR is to measure the distance to an object. A laser pulse is sent from a source and reflected by the object; the reflected light is detected, and the time-of-flight (TOF) of the pulse gives an estimate of the distance to the object.
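As a quick illustration, the range follows directly from the round-trip time of the pulse. Below is a minimal sketch of that conversion; the 200 ns example value is hypothetical.

```python
# Minimal sketch: convert a measured round-trip pulse time into a range estimate.
C = 299_792_458.0  # speed of light in m/s

def tof_to_range(round_trip_time_s: float) -> float:
    """Estimated target distance in meters; the factor of 2 accounts for
    the pulse travelling to the target and back."""
    return C * round_trip_time_s / 2.0

# Example: a return detected 200 ns after emission corresponds to roughly 30 m.
print(tof_to_range(200e-9))  # ~29.98
```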

In a LIDAR lens design project, delivering high efficiency in sending the light pulse and collecting the return pulse is essential. For the collection lens design, this means optimizing the captured energy over the lens’ field of view (FOV).  

Below we will discuss the lens designs used in two common LIDAR optical architectures: the flash LIDAR sensor and the scanning system.

Flash LIDAR

In a flash LIDAR, the entire field of view is illuminated by a single laser source. The reflected light is then imaged onto a detector array and the TOF is calculated for each individual element in the detector. Since they lack moving parts, they tend to be very robust, but they are usually used as short-range sensors (<30 m) and have reduced fields of view compared to scanning systems. Flash LIDARs require homogeneous, full-area illumination of the scene, so the laser beam is expanded with diffusers and then projected onto the FOV.
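Because every detector element measures its own TOF from a single flash, the output is naturally a depth map. The sketch below assumes a hypothetical 4x4 detector array with made-up timing values, just to show the per-pixel conversion.

```python
import numpy as np

C = 299_792_458.0  # speed of light in m/s

# Hypothetical 4x4 detector array: each element holds the round-trip time (s)
# it measured after a single flash illuminated the whole FOV.
tof_array = np.full((4, 4), 150e-9)   # background at roughly 22.5 m
tof_array[1:3, 1:3] = 60e-9           # a closer object (~9 m) in the center

depth_map = C * tof_array / 2.0       # per-pixel range in meters
print(depth_map.round(2))
```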

In the image below, we can see the emission optical system for a flash LIDAR. The EWOD (electrowetting) prism is used to scan the laser beam over a 15.6-degree arc. The triplet lens delivers a telecentric beam to the fisheye, and the fisheye increases the FOV to almost a 180-degree arc.

Flash LIDAR optical layout

Scanning LIDAR

Scanning LIDAR has a single collimated source that scans the system’s field of view using a MEMS-based micromirror or rotating prisms. At each new location, the light is detected by a single photodetector and the TOF is then calculated. These systems can be more accurate than flash systems and allow for longer detection ranges, but they are bulkier, more complex, and more expensive.
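Since a scanning system builds up its measurement one direction at a time, each sample pairs the instantaneous mirror angles with a TOF. The sketch below shows one way to turn those samples into 3D points; the angle and timing values, and the function name, are hypothetical.

```python
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def scan_to_points(azimuths_rad, elevations_rad, round_trip_times_s):
    """Convert per-sample scan angles and round-trip times into 3D points (meters),
    assuming the angles are referenced to the sensor origin."""
    r = C * np.asarray(round_trip_times_s) / 2.0
    az = np.asarray(azimuths_rad)
    el = np.asarray(elevations_rad)
    x = r * np.cos(el) * np.cos(az)
    y = r * np.cos(el) * np.sin(az)
    z = r * np.sin(el)
    return np.column_stack((x, y, z))

# Example: three samples along one scan line at constant elevation.
print(scan_to_points([-0.1, 0.0, 0.1], [0.0, 0.0, 0.0], [100e-9, 110e-9, 120e-9]))
```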

One problem with scanning LIDAR is that if a scene changes rapidly, the scan may not provide an accurate description of it.

An example of a scanning LIDAR system is shown in the figure below. A focusing lens focuses the light onto a MEMS mirror, which reflects it into an f-theta lens. The f-theta lens, consisting of three optical elements with an effective focal length of 100 mm, creates an image on a flat plane. The next stage is a wide-angle group that increases the FOV to a 120-degree arc.

Scanning LIDAR optical layout
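What distinguishes an f-theta lens is its mapping: the image height is proportional to the scan angle itself (y = f·θ) rather than to its tangent, so a mirror rotating at constant speed sweeps the spot across the flat image plane at constant velocity. Below is a small sketch of that mapping, using the 100 mm effective focal length mentioned above.

```python
import math

EFL_MM = 100.0  # effective focal length of the f-theta group (from the text)

def ftheta_image_height(scan_angle_deg: float) -> float:
    """Ideal f-theta mapping: y = f * theta, with theta in radians; result in mm."""
    return EFL_MM * math.radians(scan_angle_deg)

def ftan_theta_image_height(scan_angle_deg: float) -> float:
    """Conventional distortion-free mapping y = f * tan(theta), for comparison."""
    return EFL_MM * math.tan(math.radians(scan_angle_deg))

for angle_deg in (5, 10, 15):
    print(angle_deg, round(ftheta_image_height(angle_deg), 2),
          round(ftan_theta_image_height(angle_deg), 2))
```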

Applications

Although we have mentioned autonomous vehicles as an application for LIDAR, there are many other interesting applications.

Agriculture: besides classifying crops and monitoring their growth, LIDAR can be used to detect types of insects and track their movement.

Archeology: LIDAR systems mounted on drones or airplanes have been used to identify ruins and settlements that are usually covered by the canopy or heavy vegetation.

Robotics: similar to autonomous vehicles, LIDAR in robotics is used to map an environment and provide the robot with enough information to interact with or avoid obstacles.
