Patterned color filter arrays are important components in digital cameras, camcorders, scanners, and multispectral detection and imaging instruments. Alongside the rapid, continuous progress in improving camera resolution and imaging-sensor efficiency, research into color filter array design is important for extending imaging capability beyond conventional applications. This paper reports the use of colored SU-8 photoresists as a material for fabricating color filter arrays. Optical properties, fabrication parameters, and pattern spatial resolution are systematically studied for five color photoresists: violet, blue, green, yellow, and red. An end-to-end fabrication process is developed to realize a five-color filter array designed for a wide-angle multiband artificial compound eye camera system for pentachromatic and polarization imaging. Colored SU-8 photoresists present notable advantages, including patternability, color tunability, low-temperature compatibility, and process simplicity. The results on the optical properties and fabrication process of colored SU-8 photoresists provide significant insight into their use as an optical material for investigating nonconventional color filter designs.
To address color polarization demosaicking in polarization imaging with a color polarization camera, we propose a color polarization demosaicking convolutional neural network (CPDCNN) with a two-branch structure that preserves the fidelity of polarization signatures and enhances image resolution. To train the network, we built a unique dual-camera system and captured a pairwise color polarization image dataset. Experimental results show that CPDCNN outperforms other methods by a large margin in contrast and resolution.
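The two-branch idea can be sketched as a minimal model: one branch focuses on polarization fidelity, the other on spatial detail, and their features are fused into the full color-polarization output. This is an illustrative sketch only, not the published CPDCNN; the channel counts, layer depths, and the assumed 3-color × 4-angle output layout are all assumptions.

```python
import torch
import torch.nn as nn

class TwoBranchDemosaick(nn.Module):
    """Illustrative two-branch demosaicking CNN (not the published CPDCNN).

    One branch is meant to preserve polarization signatures, the other to
    recover spatial detail; their features are fused to predict the full
    color-polarization cube from the raw sensor mosaic.
    """
    def __init__(self, out_ch=12):  # 3 colors x 4 polarizer angles (assumed)
        super().__init__()
        def branch():
            return nn.Sequential(
                nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 32, 3, padding=1), nn.ReLU())
        self.pol_branch = branch()     # polarization-fidelity branch
        self.detail_branch = branch()  # resolution-enhancement branch
        self.fuse = nn.Conv2d(64, out_ch, 3, padding=1)

    def forward(self, mosaic):
        # mosaic: (N, 1, H, W) raw color-polarization sensor image
        feats = torch.cat([self.pol_branch(mosaic),
                           self.detail_branch(mosaic)], dim=1)
        return self.fuse(feats)  # (N, out_ch, H, W)
```

In practice such a network would be trained on pairs like the dual-camera dataset described above, with losses targeting both color accuracy and polarization fidelity.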
- PAR ID: 10290625
- Publisher / Repository: Optical Society of America
- Journal Name: Optics Letters
- Volume: 46
- Issue: 17
- ISSN: 0146-9592; OPLEDP
- Format: Article No. 4338
- Sponsoring Org: National Science Foundation
More Like this
-
We calibrate and test a division-of-focal-plane red–green–blue (RGB) full-Stokes imaging polarimeter in a variety of indoor and outdoor environments. The polarimeter, acting as a polarization camera, utilizes a low dispersion microretarder array on top of a sensor with Bayer filters and wire-grid linear polarizers. We also present the design and fabrication of the microretarder array and the assembly of the camera and validate the performance of the camera by taking multiple RGB full-Stokes images and videos. Our camera has a small form factor due to its single-sensor design and the unique capability to measure the intensity, color, and polarization of an optical field in a single shot.
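Under the ideal-polarizer model, the linear Stokes parameters can be recovered from the four wire-grid analyzer intensities measured behind the 0°, 45°, 90°, and 135° polarizers. The sketch below shows this standard reduction; it omits S3 (circular polarization), which in the camera above is obtained via the microretarder array, and assumes ideal analyzers with no calibration corrections.

```python
import numpy as np

def linear_stokes(i0, i45, i90, i135):
    """Recover the linear Stokes parameters (S0, S1, S2) from intensities
    measured behind ideal linear polarizers at 0, 45, 90, and 135 degrees.

    S3 is not recoverable from linear analyzers alone; a retarder
    (as in the microretarder array above) is needed for full Stokes.
    """
    s0 = 0.5 * (i0 + i45 + i90 + i135)  # total intensity
    s1 = i0 - i90                        # horizontal vs. vertical
    s2 = i45 - i135                      # +45 vs. -45
    return s0, s1, s2
```

For example, unpolarized light gives equal intensities in all four channels, so S1 = S2 = 0, while fully horizontally polarized light gives S1 = S0.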
-
The rapid progress in intelligent vehicle technology has led to a significant reliance on computer vision and deep neural networks (DNNs) to improve road safety and driving experience. However, the image signal processing (ISP) steps required for these networks, including demosaicing, color correction, and noise reduction, increase the overall processing time and computational resources. To address this, our paper proposes an improved version of the Faster R-CNN algorithm that integrates camera parameters into raw image input, reducing dependence on complex ISP steps while enhancing object detection accuracy. Specifically, we introduce additional camera parameters, such as ISO speed rating, exposure time, focal length, and F-number, through a custom layer into the neural network. Further, we modify the traditional Faster R-CNN model by adding a new fully connected layer, combining these parameters with the original feature maps from the backbone network. Our proposed model, which incorporates camera parameters, achieves a 4.2% improvement in mAP@[0.5, 0.95] over the traditional Faster R-CNN model for object detection tasks on raw image data.
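The fusion of scalar camera metadata with backbone feature maps can be sketched as follows. This is an illustrative module, not the paper's exact layer: the feature channel count, the 1×1 fusion convolution, and the broadcast-and-concatenate strategy are all assumptions about one reasonable way to combine a fully connected embedding of the four camera parameters with spatial features.

```python
import torch
import torch.nn as nn

class CameraParamFusion(nn.Module):
    """Illustrative fusion of camera metadata with backbone features.

    Embeds the four scalar camera parameters (ISO, exposure time,
    focal length, F-number) with a fully connected layer, broadcasts
    the embedding over the spatial grid, and fuses it with the
    backbone feature map. Sizes and fusion scheme are assumptions.
    """
    def __init__(self, feat_ch=256, n_params=4):
        super().__init__()
        self.param_fc = nn.Linear(n_params, feat_ch)
        self.fuse = nn.Conv2d(feat_ch * 2, feat_ch, kernel_size=1)

    def forward(self, feats, params):
        # feats: (N, C, H, W) backbone features; params: (N, n_params)
        p = self.param_fc(params)                 # (N, C) embedding
        p = p[:, :, None, None].expand_as(feats)  # broadcast over H, W
        return self.fuse(torch.cat([feats, p], dim=1))
```

The fused map keeps the backbone's spatial resolution, so it can be dropped in ahead of the region proposal stage without changing the rest of the detector.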
-
We present EgoRenderer, a system for rendering full-body neural avatars of a person captured by a wearable, egocentric fisheye camera that is mounted on a cap or a VR headset. Our system renders photorealistic novel views of the actor and her motion from arbitrary virtual camera locations. Rendering full-body avatars from such egocentric images comes with unique challenges due to the top-down view and large distortions. We tackle these challenges by decomposing the rendering process into several steps, including texture synthesis, pose construction, and neural image translation. For texture synthesis, we propose Ego-DPNet, a neural network that infers dense correspondences between the input fisheye images and an underlying parametric body model, and extracts textures from egocentric inputs. In addition, to encode dynamic appearances, our approach also learns an implicit texture stack that captures detailed appearance variation across poses and viewpoints. For correct pose generation, we first estimate body pose from the egocentric view using a parametric model. We then synthesize an external free-viewpoint pose image by projecting the parametric model to the user-specified target viewpoint. We next combine the target pose image and the textures into a combined feature image, which is transformed into the output color image using a neural image translation network. Experimental evaluations show that EgoRenderer is capable of generating realistic free-viewpoint avatars of a person wearing an egocentric camera. Comparisons to several baselines demonstrate the advantages of our approach.
-
Abstract: The broadband solar K-corona is linearly polarized due to Thomson scattering. Various strategies have been used to represent coronal polarization. Here, we present a new way to visualize the polarized corona, using observations from the 2023 April 20 total solar eclipse in Australia in support of the Citizen CATE 2024 project. We convert observations in the common four-polarizer orthogonal basis (0°, 45°, 90°, and 135°) to −60°, 0°, and +60° (MZP) polarization, which is homologous to the R, G, B color channels. The unique image generated provides some sense of how humans might visualize polarization if we could perceive it in the same way we perceive color.
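The basis conversion follows from Malus-type analyzer transmission: for an ideal linear polarizer at angle θ, I(θ) = ½(S0 + S1 cos 2θ + S2 sin 2θ), so the four measured channels determine the linear Stokes parameters, from which intensities at −60°, 0°, and +60° can be synthesized. The sketch below shows this reduction; the exact color mapping and any instrument calibration used by the project are not reproduced here.

```python
import numpy as np

def mzp_from_four_polarizer(i0, i45, i90, i135):
    """Convert 0/45/90/135-degree polarizer intensities to the
    M (-60), Z (0), P (+60) basis via the linear Stokes parameters.

    Assumes ideal polarizers: I(theta) = 0.5*(S0 + S1*cos 2t + S2*sin 2t).
    """
    s0 = 0.5 * (i0 + i45 + i90 + i135)
    s1 = i0 - i90
    s2 = i45 - i135

    def analyzer(theta_deg):
        t = np.deg2rad(2.0 * theta_deg)
        return 0.5 * (s0 + s1 * np.cos(t) + s2 * np.sin(t))

    # M, Z, P map homologously onto the R, G, B color channels
    return analyzer(-60.0), analyzer(0.0), analyzer(60.0)
```

For unpolarized light the three channels are equal (a gray pixel), while linearly polarized light tints the pixel according to its polarization angle, which is what makes the MZP-to-RGB visualization intuitive.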