Title: Autofocusing method for a digital fringe projection system with dual projectors
This paper presents a novel technique to achieve autofocusing for a three-dimensional (3D) profilometry system with dual projectors. The proposed system uses a camera fitted with an electronically focus-tunable lens (ETL) that allows dynamic adjustment of the camera's focal plane so that the camera can focus on the object; the camera captures fringe patterns projected by each projector to establish corresponding points between the two projectors, and the two pre-calibrated projectors form the triangulation for 3D reconstruction. We pre-calibrate the relationship between the depth and the ETL driving current for each focal plane, perform a 3D shape measurement at an unknown focus level, and calculate the desired current value from the initial 3D result. We developed a prototype system that can automatically focus on an object positioned between 450 mm and 850 mm.
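The autofocus loop described in the abstract (pre-calibrate a depth-to-current mapping, take an initial 3D measurement, then compute the desired ETL current) can be sketched in Python. All calibration values, the polynomial model choice, and the function names below are illustrative assumptions, not the paper's data:

```python
import numpy as np

# Hypothetical calibration data: object depth (mm) vs. the ETL driving
# current (mA) that brings the camera's focal plane to that depth.
# (Illustrative values only; the paper pre-calibrates this mapping.)
depths_mm = np.array([450.0, 550.0, 650.0, 750.0, 850.0])
currents_mA = np.array([120.0, 105.0, 93.0, 84.0, 77.0])

# Fit a low-order polynomial current = p(depth) as one simple model of
# the pre-calibrated depth-to-current relationship.
coeffs = np.polyfit(depths_mm, currents_mA, deg=2)

def current_for_depth(depth_mm: float) -> float:
    """Return the ETL current that focuses the camera at depth_mm."""
    return float(np.polyval(coeffs, depth_mm))

# Autofocus step: after an initial (possibly defocused) 3D measurement,
# take the mean depth of the reconstructed points and look up the
# current that focuses the camera there.
initial_point_depths = np.array([640.0, 655.0, 648.0])  # mm, illustrative
target_current = current_for_depth(float(initial_point_depths.mean()))
```

A real system would then drive the ETL with `target_current` and repeat the measurement in focus.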
Award ID(s):
1637961
PAR ID:
10144198
Publisher / Repository:
Optical Society of America
Date Published:
Journal Name:
Optics Express
Volume:
28
Issue:
9
ISSN:
1094-4087; OPEXFF
Format(s):
Medium: X
Size(s):
Article No. 12609
Sponsoring Org:
National Science Foundation
More Like this
  1. Full surround 3D imaging for shape acquisition is essential for generating digital replicas of real-world objects. Surrounding an object we seek to scan with a kaleidoscope, that is, a configuration of multiple planar mirrors, produces an image of the object that encodes information from a combinatorially large number of virtual viewpoints. This information is practically useful for the full surround 3D reconstruction of the object, but cannot be used directly, as we do not know which virtual viewpoint each image pixel corresponds to (the pixel label). We introduce a structured light system that combines a projector and a camera with a kaleidoscope. We then prove that we can accurately determine the labels of projector and camera pixels, for arbitrary kaleidoscope configurations, using the projector-camera epipolar geometry. We use this result to show that our system can serve as a multi-view structured light system with hundreds of virtual projectors and cameras. This makes our system capable of scanning complex shapes precisely and with full coverage. We demonstrate the advantages of the kaleidoscopic structured light system by scanning objects that exhibit a large range of shapes and reflectances.
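The pixel-labeling result above rests on the projector-camera epipolar constraint x2ᵀ F x1 = 0. A minimal NumPy sketch, using two illustrative projection matrices rather than a real kaleidoscope calibration, shows the constraint that a true correspondence satisfies:

```python
import numpy as np

def fundamental_from_projections(P1, P2):
    """Fundamental matrix of a camera/projector pair from their 3x4
    projection matrices: F = [e2]_x P2 P1^+, where e2 = P2 C and C is
    the centre (right null vector) of P1."""
    C = np.linalg.svd(P1)[2][-1]            # right null vector of P1
    e2 = P2 @ C                             # epipole in the second view
    e2x = np.array([[0.0, -e2[2], e2[1]],
                    [e2[2], 0.0, -e2[0]],
                    [-e2[1], e2[0], 0.0]])  # skew-symmetric [e2]_x
    return e2x @ P2 @ np.linalg.pinv(P1)

# Two illustrative projection matrices (a camera and one virtual view).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-100.0], [0.0], [0.0]])])

F = fundamental_from_projections(P1, P2)

# A 3D point and its two projections satisfy the epipolar constraint,
# which is what allows pixels to be matched across virtual views.
X = np.array([50.0, 20.0, 400.0, 1.0])
x1, x2 = P1 @ X, P2 @ X
residual = x2 @ F @ x1                      # ~0 for a true correspondence
```

Labeling a pixel then amounts to testing which virtual view's epipolar constraint it satisfies.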
  2. This Letter presents a novel, to the best of our knowledge, method to calibrate multi-focus microscopic structured-light three-dimensional (3D) imaging systems with an electrically adjustable camera focal length. We first leverage the conventional method to calibrate the system with a reference focal length f0. Then we calibrate the system with other discrete focal lengths fi by determining virtual features on a reconstructed white plane using f0. Finally, we fit the polynomial function model using the discrete calibration results for fi. Experimental results demonstrate that our proposed method can calibrate the system consistently and accurately.
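The idea of fitting a polynomial model through discrete calibrations at focal lengths fi can be sketched as below. The lens settings, the calibrated parameter values, and the choice of a single intrinsic parameter are all illustrative assumptions, not the Letter's results:

```python
import numpy as np

# Illustrative discrete calibrations: for each lens setting we assume a
# calibrated effective focal length (in pixels) was recovered.
settings = np.array([0.0, 1.0, 2.0, 3.0, 4.0])       # lens control values
fx_pixels = np.array([2400.0, 2465.0, 2534.0, 2607.0, 2684.0])

# Fit a polynomial model fx(setting) through the discrete calibrations,
# mirroring the idea of interpolating calibration parameters between
# the discretely calibrated focal lengths fi.
model = np.polyfit(settings, fx_pixels, deg=2)

def fx_at(setting: float) -> float:
    """Interpolated intrinsic parameter at an arbitrary lens setting."""
    return float(np.polyval(model, setting))
```

With the fitted model, the system can be used at focal lengths between the calibrated ones without re-running a full calibration.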
  3. A new lens capability for three-dimensional (3D) focal control is presented using an optofluidic system consisting of n × n arrayed liquid prisms. Each prism module contains two immiscible liquids in a rectangular cuvette. Using the electrowetting effect, the shape of the fluidic interface can be rapidly adjusted to form a straight profile at the prism's apex angle. Consequently, an incoming ray is steered at the tilted interface due to the refractive index difference between the two liquids. To achieve 3D focal control, individual prisms in the arrayed system are simultaneously modulated, allowing incoming light rays to be spatially manipulated and converged on a focal point located at Pfocal(fx, fy, fz) in 3D space. Analytical studies were conducted to precisely predict the prism operation required for 3D focal control. Using three liquid prisms positioned on the x-, y-, and 45°-diagonal axes, we experimentally demonstrated 3D focal tunability of the arrayed optofluidic system, achieving focal tuning along lateral, longitudinal, and axial directions as wide as 0 ≤ fx ≤ 30 mm, 0 ≤ fy ≤ 30 mm, and 500 mm ≤ fz ≤ ∞. This focal tunability of the arrayed system allows for 3D control of the lens's focusing power, which could not be attained by solid-type optics without the use of bulky and complex mechanical moving components. This innovative lens capability for 3D focal control has potential applications in eye-movement tracking for smart displays, autofocusing of smartphone cameras, or solar tracking for smart photovoltaic systems.
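The per-prism steering can be illustrated with Snell's law at the tilted liquid-liquid interface. This is a deliberately simplified single-interface model (it ignores the cuvette walls and the exit refraction, and the refractive indices and geometry are illustrative assumptions), not the paper's full analysis:

```python
import numpy as np

def steering_angle(apex_deg: float, n1: float, n2: float) -> float:
    """Ray deviation (degrees) at a tilted liquid-liquid interface via
    Snell's law, in a simplified single-interface model that ignores
    the cuvette walls and the exit refraction."""
    a = np.radians(apex_deg)
    # A ray along the optical axis meets the interface at an incidence
    # angle equal to the apex angle.
    t = np.arcsin(n1 * np.sin(a) / n2)
    return float(np.degrees(a - t))

def apex_for_target(dx_mm: float, dz_mm: float, n1: float, n2: float) -> float:
    """Apex angle (degrees) that steers a ray toward a focal point
    offset dx at distance dz, found by bisection on the same model."""
    target = np.degrees(np.arctan2(dx_mm, dz_mm))
    lo, hi = 0.0, 60.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if steering_angle(mid, n1, n2) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Example: n1 = 1.33 (water) over n2 = 1.50 (an illustrative oil),
# aiming at a focal point 15 mm off-axis at 500 mm distance.
apex = apex_for_target(15.0, 500.0, 1.33, 1.50)
```

Solving this per prism, with each prism's own offset to the common focal point, is the essence of converging the array on Pfocal.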
  4. A lens performs an approximately one-to-one mapping from the object to the image plane. This mapping in the image plane is maintained within a depth of field (or referred to as depth of focus, if the object is at infinity). This necessitates refocusing of the lens when the images are separated by distances larger than the depth of field. Such refocusing mechanisms can increase the cost, complexity, and weight of imaging systems. Here we show that by judicious design of a multi-level diffractive lens (MDL) it is possible to drastically enhance the depth of focus by over 4 orders of magnitude. Using such a lens, we are able to maintain focus for objects that are separated by as large a distance as ∼6 m in our experiments. Specifically, when illuminated by collimated light at λ = 0.85 µm, the MDL produced a beam which remained in focus from 5 to 1200 mm. The measured full width at half-maximum of the focused beam varied from 6.6 µm (5 mm away from the MDL) to 524 µm (1200 mm away from the MDL). Since the side lobes were well suppressed and the main lobe was close to the diffraction limit, imaging with a horizontal × vertical field of view of 40° × 30° over the entire focal range was possible. This demonstration opens up a new direction for lens design, where by treating the phase in the focal plane as a free parameter, extreme-depth-of-focus imaging becomes possible.
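A back-of-the-envelope Gaussian-beam estimate (an assumption for scale, not the paper's analysis) compares the conventional depth of focus implied by the measured 6.6 µm spot with the 5-1200 mm range over which the MDL beam stayed in focus:

```python
import math

# Rough Gaussian-beam estimate: depth of focus ~ 2 * Rayleigh range.
wavelength_mm = 0.85e-3                             # 0.85 um
fwhm_mm = 6.6e-3                                    # measured spot near the MDL
w0_mm = fwhm_mm / math.sqrt(2.0 * math.log(2.0))    # 1/e^2 waist radius
rayleigh_mm = math.pi * w0_mm ** 2 / wavelength_mm
conventional_dof_mm = 2.0 * rayleigh_mm             # roughly 0.23 mm

mdl_focus_range_mm = 1200.0 - 5.0                   # measured focus range
enhancement = mdl_focus_range_mm / conventional_dof_mm
```

The resulting factor of several thousand is consistent in scale with the abstract's claim of an enhancement of about 4 orders of magnitude.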
  5. Abstract This paper develops a position estimation system for a robot moving over a two-dimensional plane with three degrees-of-freedom. The position estimation system is based on an external rotating platform containing a permanent magnet and a monocular camera. The robot is equipped with a two-axis magnetic sensor. The rotation of the external platform is controlled using the monocular camera so as to always point at the robot as it moves over the 2D plane. The radial distance to the robot can then be obtained using a one-degree-of-freedom nonlinear magnetic field model and a nonlinear observer. Extensive experimental results are presented on the performance of the developed system. Results show that the position of the robot can be estimated with sub-mm accuracy over a radial distance range of ±60 cm from the magnet.
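The radial-distance idea can be illustrated with a bare 1/r³ dipole-magnitude model standing in for the paper's one-degree-of-freedom nonlinear model and observer; the magnet constant k and all values are illustrative assumptions:

```python
import math

def field_magnitude(r_cm: float, k: float = 5.0e4) -> float:
    """Dipole-like field magnitude model B = k / r^3 (arbitrary units),
    with k an illustrative magnet constant; the paper uses a richer
    1-DOF nonlinear model together with a nonlinear observer."""
    return k / r_cm ** 3

def radial_distance(b: float, k: float = 5.0e4) -> float:
    """Invert the 1/r^3 model to estimate the robot's radial distance."""
    return (k / b) ** (1.0 / 3.0)

# Simulated measurement at 40 cm, then recovery of the distance.
b = field_magnitude(40.0)
r_est = radial_distance(b)
```

Because the platform's camera keeps the magnet pointed at the robot, a single radial model like this (plus the platform's bearing angle) is enough to place the robot in the plane.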