Article

Angular Light, Polarization and Stokes Parameters Information in a Hybrid Image Sensor with Division of Focal Plane

by Francelino Freitas Carvalho 1,2,*, Carlos Augusto de Moraes Cruz 1, Greicy Costa Marques 1 and Kayque Martins Cruz Damasceno 1,3

1 Department of Electronics and Computation, Universidade Federal do Amazonas, Manaus 69077-000, Brazil
2 INFRAERO Empresa Brasileira de Infraestrutura Aeroportuária, Manaus 69041-000, Brazil
3 SIDIA Instituto de Ciências e Tecnologia, Manaus 69055-035, Brazil
* Author to whom correspondence should be addressed.
Submission received: 29 April 2020 / Revised: 5 June 2020 / Accepted: 8 June 2020 / Published: 16 June 2020
(This article belongs to the Special Issue CMOS Image Sensors and Related Applications)

Abstract

Targeting 3D image reconstruction and depth sensing, a desirable feature for complementary metal oxide semiconductor (CMOS) image sensors is the ability to detect the local light incident angle and the light polarization. In recent years, advances in CMOS technologies have enabled dedicated circuits to determine these parameters in an image sensor. However, due to the great number of pixels required in a cluster to enable such functionality, implementing these features in regular CMOS imagers is still not viable. The current state-of-the-art solutions require eight pixels in a cluster to detect local light intensity, incident angle and polarization. The technique to detect the local incident angle is widely exploited in the literature, and the authors have shown in previous works that it is possible to perform this task with a cluster of only four pixels. In this work, the authors explore three novelties: a means to determine three of the four Stokes parameters, a new paradigm in polarization cluster-pixel design, and the extended ability to detect both the local light angle and intensity. The features of the proposed pixel cluster are demonstrated through simulation program with integrated circuit emphasis (SPICE) simulations of the regular Quadrature Pixel Cluster and Polarization Pixel Cluster models, the results of which are compliant with experimental results presented in the literature.

1. Introduction

Recently, there has been increasing demand for new multimedia resources with three-dimensional content, particularly in images, games, movies and augmented reality. In this context, capturing 3D information or depth sensing is essential for many applications, including object and material classification [1,2,3], navigation [4,5], image polarization contrast in biological tissues [6,7,8,9], improved vision in haze conditions [10], diagnosis in oncology [11,12], different views of the same image [13,14], facial recognition in the surveillance area [15,16,17,18], atmospheric remote sensing and other applications. Therefore, the search for improvements in depth sensing and 3D image capture has become an important target in the field of image sensors.
It has been shown that, with the aid of the local angle of incidence and the intensity of the received light, the whole 3D information of the recorded scene can be reconstructed. In order to provide solutions to this problem, numerous methods for 3D image capture have been proposed in the literature, including Time-of-Flight (ToF) [19,20], multi-apertures [21,22,23], Talbot's diffraction pixel sets [24,25,26], division of amplitude [27], division of focal plane microgrid (DoFP) with polarization pixel filters [24,28,29,30] and quadrature pixels [9,25,26]. Disadvantages of these techniques include the additional laser source and the time to process the laser signal in ToF, the large number of micro-lenses that require a very large imager array in the case of multi-aperture image sensors, and the large number of pixels needed to decode angle variations, which involves non-trivial post-processing, in the case of Talbot's pixels. In order to avoid such issues, recent approaches propose the use of two sets of different pixel clusters [31,32,33,34]: the polarization pixel cluster (PPC) and the quadrature pixel cluster (QPC). All these techniques have their benefits and disadvantages; however, in this work, only the most relevant characteristics of the polarization and quadrature sets are taken into account.
According to the proposition of [31,32,34], detecting and measuring the angle of incidence and the polarization of light employs two sets of different pixel clusters, the PPC and the QPC. The PPC is composed of four pixels, A, B, C and D, as shown in Figure 1a. It is built with an unshielded pixel used to detect the local light intensity, which serves as a reference value; a pixel with horizontally oriented microwires strongly sensitive to 90° polarized light; a pixel with vertical grids strongly sensitive to 0° polarized light; and a 45° grid pixel to determine the Stokes parameters. The PPC provides high sensitivity to both light polarization and local incident angle [31,32,34]. It is shown to be much more sensitive to the variation of the angle of incidence than the classic Talbot pixel. To determine the incident angle of the local light, only the horizontally and vertically grated PPC pixels are required. However, positive and negative angles produce similar PPC outputs, and consequently, an extra technique is required to determine whether the sign of the light incidence angle is positive or negative.
In order to determine the sign of the angle of incidence, the QPC is employed. It produces a relatively weak response, which is, however, very linear and proportional to the variation of the light incident angle. The QPC, Figure 1b, is much less sensitive to incidence angle variation than the PPC, but its output is not symmetric around zero, as is that of the PPC. Therefore, this cluster can be employed to determine the sign of the light incident angle.
As shown above, in terms of imager resolution and consumption of additional resources, the solution proposed in [31] is so far the best known solution for local intensity and angle detection. However, the configuration of the PPC and QPC pixel components can be further optimized in order to achieve improvements in terms of imager resolution. This work shows the possibility of building a more optimized pixel cluster with the same functionalities presented in [31] using only four pixel components out of the PPC and QPC set of pixels.
In this paper, the behavior of the photocurrents in the QPC and PPC component pixels as a function of pixel grating, local light intensity, polarization and incident angle is evaluated in a simulation program with integrated circuit emphasis (SPICE), and the results are employed as the basis of a new compact pixel cluster that is able to replace the QPC and PPC. It is worth noting that tools and seminal analyses of the approach presented herein were already reported by the authors in [35,36,37], where details of an optimized pixel cluster were presented and discussed; however, that configuration was unable to provide the data needed to determine the Stokes parameters, as will be discussed. The new configuration of a hybrid quadrature-polarization pixel cluster (HPC) presented in this work is able to perform all the functions performed by previous solutions and to provide the necessary data to determine the first three Stokes parameters. Moreover, the minimum resolution of an imager array employing the proposed pixel cluster is 75% higher than that of previous solutions, and it may be extended to its complete potential resolution, as will be shown along this text.
The paper is organized as follows: Section 2 presents the background concepts of the photocurrent and topology model used, of the pixels sensitive to polarization and to the incidence of light, and a short introduction to the Stokes parameters and the degree of linear polarization (DoLP). The proposed hybrid macropixel is described at the end of Section 2. The simulation results and the discussions showing the viability of the proposed solution are presented in Section 3. The conclusions drawn from this work are given in Section 4.

2. Materials and Methods

This section describes the basic concepts used throughout this work.

2.1. Operating Principles of Micropolarizer Array-Based Polarization Sensors

The first simulated cluster was the PPC. It works based on the principle of polarization by absorption, also known as division of focal plane (DoFP). When the light strikes with its electric field vector parallel to the thin wires, electric currents are generated along the wires and the energy of the light is absorbed by the microwires [33,38,39]. If the electric field is perpendicular to the grids, the maximum level of light is transmitted, as shown in Figure 2a.
The sensor array of photodetectors is covered with an array of matched wire-grid microwire polarization filters. A pattern of the polarization filter array is shown in Figure 2b,c. It consists of four distinct filters offset from one another by 45 degrees: 0°, 45°, 90° and 135° (the same angle as −45°).

2.2. Photodiode Model

All the pixels of the clusters have the same schematic topology, presented in Figure 3, known as the 3T-APS. BSIM3v3 simulations were executed with an active pixel sensor in a 6-metal, 1-poly, complementary metal oxide semiconductor (CMOS) 0.18 μm TSMC technology. The pixels of these clusters operate in the linear mode, which provides excellent sensitivity at low light intensities and a reduced dynamic range at high light intensities. In the SPICE circuit simulator, the pixel photodiode (PD) of the 3T-APS was modeled according to the schematic detailed on the right of Figure 3, in red dashed detail, wherein Rs is the PD series resistance, Rsh is the PD shunt resistance, Iph is the PD photocurrent, Idark is the PD current level in the dark, and Cj is the PD junction capacitance. For the analyses presented herein, the noise contribution, current In in Figure 3, was not taken into account. The photocurrent Iph is the current generated by the photons reaching the depletion region of the PN junction [40], which in this case is a function of the pixel grating and of the light polarization, incident angle and intensity.
The light information is transduced and treated as a voltage or current signal. Voltage was chosen as the output signal, wherein all stages of the linear mode CMOS active pixel sensor (APS) operation cycle may be observed. The linear mode CMOS APS operating cycle can be divided into three intervals: the reset time, the exposure time and the sampling time, detailed in [40]. The principal parameters are given in Table A1 in Appendix A.
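As a concrete illustration of the linear mode described above, the following Python sketch models the pixel output as the photodiode capacitance being discharged by the photocurrent during the exposure time. The reset level (VDD minus an assumed threshold voltage) and the unity-gain source follower are simplifying assumptions not taken from the paper; Cj, the dark current and the supply voltage follow Table A1.

```python
# Minimal sketch of the linear-mode 3T-APS output, assuming a soft reset to
# VDD - Vth and an ideal (unity-gain) source follower.
VDD = 1.8         # supply voltage (V), Table A1
VTH = 0.5         # assumed reset-transistor threshold (V), not from the paper
CJ = 20e-15       # photodiode junction capacitance (F), Table A1
I_DARK = 0.1e-15  # dark current (A), Table A1

def aps_output(i_ph: float, t_exp: float) -> float:
    """Pixel output voltage after integrating (i_ph + I_DARK) on Cj for t_exp seconds."""
    v_reset = VDD - VTH                  # reset level (assumption)
    dv = (i_ph + I_DARK) * t_exp / CJ    # discharge during the exposure time
    return max(v_reset - dv, 0.0)        # linear mode saturates near 0 V

# Example: a 50 fA photocurrent integrated for 10 ms drops the output by 25 mV.
print(aps_output(50e-15, 10e-3))
```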

2.3. Polarization Pixel Cluster

The light transmitted along the transmission axis of a division-of-focal-plane polarizer reaches the surface of the photodetector PD according to the extended Malus's law [33], and the resulting photocurrent is given by Equation (1):
$$ I_{ph}(\psi, \alpha) = I(0)\cos^{2}(\psi)\,k\,A\cos(\alpha) + I_{dark} \qquad (1) $$
where ψ is the angle between the maximum electric field and the transmission axis of the analyzer, and I(0) is the initial irradiance incident on the grid. In the case of the wire-grid polarizer, the transmission axis of the grid is perpendicular to the microwires, as shown in Figure 2b. To achieve this polarization behavior using the photocurrent model shown in Figure 3, the value of Iph is modeled to be proportional to the square of the cosine of the polarization angle ψ. Moreover, a cosine factor with a multiplier k proportional to the area of the photodiode A and to the angle of incidence α was added according to Lambert's law [31], as well as the dark current Idark.
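As a rough illustration of this model, the sketch below evaluates Equation (1) in Python. The numerical values of I(0), k and A are placeholders, and ψ is taken as the difference between the light polarization angle and the transmission axis of the pixel's grid, an interpretation consistent with the description above but not the exact normalization used in the SPICE model.

```python
import math

def i_ph(psi_deg: float, alpha_deg: float, i0: float, k: float, area: float,
         i_dark: float = 0.1e-15) -> float:
    """Photocurrent of a grated pixel following Equation (1): Malus's cos^2(psi)
    term combined with a Lambertian cos(alpha) factor scaled by k and the area A."""
    psi = math.radians(psi_deg)
    alpha = math.radians(alpha_deg)
    return i0 * math.cos(psi) ** 2 * k * area * math.cos(alpha) + i_dark

# Example (placeholder values): light polarized at 30 deg, incidence of 10 deg.
pol = 30.0
print(i_ph(pol - 0.0, 10.0, i0=1e-3, k=1.0, area=25e-12))   # 0 deg transmission axis (pixel D)
print(i_ph(pol - 45.0, 10.0, i0=1e-3, k=1.0, area=25e-12))  # 45 deg transmission axis (pixel B of the HPC)
```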
The pixel A of the PPC cluster is completely unshielded and thus exposed to the entire external incident light, as shown in Figure 1a. Apart from the grating scheme, the circuit topology of each PPC pixel is the same as that presented in Figure 3. In Pixel B with horizontal grids, the maximum transmission will take place when the light is polarized at 90° and in Pixel D with vertical grids, the maximum transmission will occur when the light is at 0°, as can be seen in Figure 1a. In the grid arranged at −45°, the maximum transmission will occur when the polarization of the light is 90° displaced, that is, at 45° with respect to the chosen orientation of the grids, which is exactly the Pixel C in Figure 1a.

2.4. Quadrature Pixel Cluster

The second simulated cluster was the QPC, illustrated in Figure 1b. This is a 3D view of the cluster, consisting of a block of metal on top of the four photodiodes, arranged in such a way that the area receiving light varies with the light incident angle.
As detailed in [25], in this cluster, the direction of the incident light is determined from the unshaded area, i.e., the area of the photodiode that receives light. The total area is given by A and the unshaded area, i.e., the area that receives light, is given by AushD in Equation (2), where αx and αy are the incidence angles along the x- and y-axes, respectively. The other parameters are given in Table A1, in Appendix A.
This expression was embedded in the Iph photocurrent model, making the photocurrent proportional to the unshaded area, thus enabling the simulation of the QPC according to a topology like that shown in Figure 1b.
$$ A_{ushD} = A - \left\{ X_{D0} + \left[ (T_{im\varepsilon} + T_{M}) \tan\!\left( \arcsin\!\left( \frac{n_{air}}{n_{\varepsilon}} \sin\alpha_{x} \right) \right) \right] \right\} \times \left\{ X_{D0} + \left[ (T_{im\varepsilon} + T_{M}) \tan\!\left( \arcsin\!\left( \frac{n_{air}}{n_{\varepsilon}} \sin\alpha_{y} \right) \right) \right] \right\} \qquad (2) $$
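The sketch below evaluates this geometry in Python using the values from Table A1. The subtraction of the shaded corner rectangle follows the reconstruction of Equation (2) above, and the clamping of the shadow widths to the pixel side is an added assumption to keep the area physically meaningful near the edges of the ±45° range.

```python
import math

A_PIX = 5e-6 * 5e-6   # total pixel area (m^2), Table A1
SIDE = 5e-6           # pixel side length (m)
X_D0 = 2.3e-6         # shaded width under normal illumination (m), Table A1
T_IME = 6.360e-6      # height of the metal M, T_im_eps (m), Table A1
T_M = 1.720e-6        # thickness of the metal M (m), Table A1
N_AIR = 1.0           # refractive index of air
N_EPS = 1.46          # refractive index of SiO2

def shadow_shift(alpha_deg: float) -> float:
    """Lateral shift of the shadow edge for incidence angle alpha (Snell refraction
    into the SiO2 stack plus the geometric projection over the metal height)."""
    alpha = math.radians(alpha_deg)
    return (T_IME + T_M) * math.tan(math.asin(N_AIR / N_EPS * math.sin(alpha)))

def a_ush(alpha_x_deg: float, alpha_y_deg: float) -> float:
    """Unshaded (light-collecting) area of a QPC pixel, following Equation (2)."""
    sx = min(max(X_D0 + shadow_shift(alpha_x_deg), 0.0), SIDE)  # clamp: assumption
    sy = min(max(X_D0 + shadow_shift(alpha_y_deg), 0.0), SIDE)
    return A_PIX - sx * sy

# Example: unshaded area at normal incidence and at a 20 deg tilt along x.
print(a_ush(0.0, 0.0), a_ush(20.0, 0.0))
```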

2.5. Stokes Parameters and Degree of Polarization

The parameters S0, S1, S2 and S3 are referred to as the Stokes polarization parameters for a plane wave [1,5,33]. The polarization state of an electromagnetic wave can be described more easily through this set of parameters. They were first introduced in optics by Sir George Gabriel Stokes in 1852. The Stokes parameters are real quantities and are simply the observable variables of the polarization ellipse and, consequently, of the optical field [41,42].
The first Stokes parameter S0 is the overall light intensity, i.e., the unfiltered intensity value [43]. The parameter S1 describes the amount of linear horizontal or vertical polarization. The parameter S2 describes the amount of linear polarization at +45° or −45° (135°). Finally, the parameter S3 describes the amount of right or left circular polarization contained in the beam. It is possible to observe that the four Stokes parameters are expressed in terms of intensity, and again it is emphasized that the Stokes parameters are measurable real quantities and provide the polarization properties of light [41,42].
The Stokes parameters are computed based on the intensity measurements from the photodiodes and are presented in Equations (3)–(7), in which I0° is the intensity of the light after passing through a linear polarizer oriented at 0°, and I90°, I45° and I135° are the intensities after the light passes through polarizers oriented at 90°, 45° and 135°, respectively. Thus, the parameters are given by:
$$ S_{0} = \text{Intensity} = I_{tot} \qquad (3) $$
$$ S_{0} = I_{0^{\circ}} + I_{90^{\circ}} \qquad (4) $$
$$ S_{1} = I_{0^{\circ}} - I_{90^{\circ}} \qquad (5) $$
$$ S_{2} = I_{45^{\circ}} - I_{135^{\circ}} \qquad (6) $$
$$ S_{3} = I_{RHC} - I_{LHC} \qquad (7) $$
The circular polarization element S3 describes the difference between the right-hand and left-hand circularly polarized portions of the beam; it will not be used in this work.
The works of [28,37,43,44] show that the Stokes parameters can also be expressed by the following equations:
$$ S_{1} = 2 I_{0^{\circ}} - I_{tot} \qquad (8) $$
$$ S_{2} = 2 I_{45^{\circ}} - I_{tot} \qquad (9) $$
With the Stokes parameters, it is possible to determine the degree of polarization (DoP) of any optical signal directly from its S components. The DoP quantifies the fraction of the optical signal that is actually polarized: the DoP of a totally polarized optical signal is equal to unity, while the DoP of a non-polarized optical signal is zero, as given by Equation (10).
If a beam of light is linearly polarized, the circular and elliptical polarization components are zero. Its degree of polarization is therefore commonly known as the degree of linear polarization (DoLP). The degree of linear polarization of a light beam is described by Equation (11).
In order to obtain all the optical properties of partially polarized light, three parameters are important to know: the wave intensity, the degree of linear polarization (DoLP) and the angle of polarization (AoP). The AoP can be expressed by Equation (12).
$$ DoP = \frac{\sqrt{S_{1}^{2} + S_{2}^{2} + S_{3}^{2}}}{S_{0}} \qquad (10) $$
$$ DoLP = \frac{\sqrt{S_{1}^{2} + S_{2}^{2}}}{S_{0}} \qquad (11) $$
$$ AoP = \frac{1}{2} \arctan\!\left( \frac{S_{2}}{S_{1}} \right) \qquad (12) $$
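For reference, a minimal Python sketch of Equations (3), (8), (9), (11) and (12) is given below. It uses NumPy, and arctan2 instead of a plain arctangent to preserve the quadrant of the polarization angle, which is a common implementation choice rather than something stated in the text.

```python
import numpy as np

def stokes_from_hpc(i_tot, i_0, i_45):
    """First three Stokes parameters from the HPC intensities, Equations (3), (8) and (9)."""
    s0 = np.asarray(i_tot, dtype=float)
    s1 = 2.0 * np.asarray(i_0, dtype=float) - s0
    s2 = 2.0 * np.asarray(i_45, dtype=float) - s0
    return s0, s1, s2

def dolp(s0, s1, s2):
    """Degree of linear polarization, Equation (11)."""
    return np.sqrt(s1**2 + s2**2) / s0

def aop(s1, s2):
    """Angle of polarization in degrees, Equation (12), using arctan2 to keep the quadrant."""
    return 0.5 * np.degrees(np.arctan2(s2, s1))

# Example with scalar intensities (arbitrary units).
s0, s1, s2 = stokes_from_hpc(1.0, 0.8, 0.5)
print(dolp(s0, s1, s2), aop(s1, s2))
```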
An alternative way to visualize the Stokes parameters is through the Poincaré sphere shown in Figure 4, used to view the four independent S components as points on or within the sphere [5,41,45].
In Figure 4, it is possible to observe the polarization azimuthal angle ψ and the ellipticity angle χ. The surface of the Poincaré sphere is typically used to give an overview of all the possible polarization states of completely polarized light. In this case, the polarization azimuthal angle ψ and the ellipticity angle χ can be expressed in terms of the Stokes parameters through Equations (13) and (14):
$$ \sin 2\chi = \frac{S_{3}}{\sqrt{S_{1}^{2} + S_{2}^{2} + S_{3}^{2}}} \qquad (13) $$
$$ \tan 2\psi = \frac{S_{2}}{S_{1}} \qquad (14) $$

2.6. The Hybrid Pixel Cluster Proposed

Employing different pixel clusters as in [31] reduces the resolution of the imager sensor, so more compact pixel clusters are desirable. To mitigate this issue, a new approach using features of both the PPC and the QPC is proposed in this paper. The new HPC pixel cluster is a hybridization of the two preceding pixel clusters. The proposition is to replace the 90°-sensitive pixel B in Figure 1a with a 45°-sensitive pixel, and to replace both the light intensity detection pixel A of the PPC cluster and pixel C with two QPC pixels at opposite locations, resulting in the cluster of pixels shown in Figure 5a.
The two grated HPC pixels still have the same functions as those of the PPC cluster: one filters at 0° of polarization and the other at 45°. These two PPC pixels were chosen in order to enable the calculation of the third Stokes parameter S2 as described in Equation (6) or (9). With the previous solution in [37], in which the 0° and 90° pixels were used, it was not possible to determine S2. The two other pixels, A and C, have two different functions: the first is to compose two different QPC clusters and the other is to detect the local light intensity. An example of an array pattern formed by 16 instances of the proposed HPC cluster is presented in Figure 5b. All four pixels of the proposed pixel cluster have the same schematic topology, shown in Figure 3. A 3D view of the HPC is presented in Figure 5c.
Besides being a more compact pixel cluster, the other main advantage of the HPC is its capability to detect the light intensity using pixels A and C. For the same light intensity level, the individual outputs of pixels A and C depend on the local light incident angle. Nevertheless, the sum or average of the two outputs is independent of the local light incident angle over the complete angle detection range, as the sketch below illustrates. The results presented in [25] show that the complete angle detection range for the QPC cluster is between −45° and 45°.
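The following standalone Python check condenses the Equation (2) geometry to one tilt axis and verifies numerically that the summed light-collecting area of two oppositely placed pixels stays constant with tilt: the shadow that grows on one pixel shrinks on the other. The geometry values follow Table A1; assigning the positive shift to pixel A and the negative shift to pixel C is an assumption made purely for illustration.

```python
import math

# Self-contained check of the A + C intensity argument under the Equation (2) model.
A_PIX, X0 = 25e-12, 2.3e-6                       # pixel area (m^2) and nominal shadow width (m)
T_STACK, N_AIR, N_EPS = 6.360e-6 + 1.720e-6, 1.0, 1.46

def unshaded(alpha_deg: float) -> float:
    """Unshaded area of one pixel, tilt applied along x only (normal incidence along y)."""
    shift = T_STACK * math.tan(math.asin(N_AIR / N_EPS * math.sin(math.radians(alpha_deg))))
    return A_PIX - (X0 + shift) * X0

for alpha in (-20.0, -10.0, 0.0, 10.0, 20.0):
    total = unshaded(+alpha) + unshaded(-alpha)  # pixel A + pixel C (assumed opposite signs)
    print(f"alpha = {alpha:6.1f} deg  ->  combined unshaded area = {total:.3e} m^2")
```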
The shielded area within pixels A and C reduces their sensitivity, and for a given light intensity, the average of the two outputs will be lower than the output of the unshielded pixel A of the PPC in Figure 1a. Although there is a reduction in sensitivity, it is modest, as will be shown, and does not restrain the usability of the proposed procedure for local light intensity detection.
The intensity pixel A within the PPC, in Figure 1a, was employed in [5] and [43] in order to normalize the response of the polarization pixels B, C and D. The pixel cluster presented in [43] has a structure similar to that of the HPC proposed in this work, but it does not have the ability to determine the sign of the incident angle as the QPC does. However, it is possible to determine the local light intensity in a polarization pixel cluster even without the specialized light intensity detection pixel used in the PPC.
The proposed HPC produces a virtual light intensity response in the middle of the pixel cluster, at the position shown by the red square in Figure 5b, instead of producing it in one of its pixels as in the case of the PPC. Despite the reduced light intensity sensitivity, the resolution of a matrix employing the HPC cluster is 75% higher than that of one employing the solution proposed in [34]. This happens because the information of a single point in an imager employing the cluster of [34] requires seven pixels to be completely represented, resulting in 1/7th of the total array resolution, whereas the proposed cluster solution requires only four pixels, resulting in 1/4th of the total array resolution.
The reduced light intensity sensitivity of the proposed technique will be discussed in the next section.

3. Results and Discussion

3.1. Circuit Simulation Data

The SPICE simulation results of pixel D with vertical grids, in Figure 5a, are presented in Figure 6. In this situation, the maximum transmission takes place when the light is polarized at 0°, and the highest output level occurs at an incidence angle of 0°. The simulation results of the proposed model are compliant with experimental reports found in the literature, as can be verified by comparing the dashed and dotted plots against the experimental results extracted from [31]. Such a comparison is possible because two of the four pixels that compose the HPC are similar to two of the pixels of the QPC cluster, whereas the other two are similar to two of the pixels of the PPC cluster. Therefore, similar pixels, even in different clusters, are expected to yield the same output results. Thus, the new proposed HPC model can be regarded as quite reliable, as will be discussed in the following statistical analysis.
The PPC and QPC results for a light polarization of 0° and an incident angle varying from −45° to 45° in steps of 5° are plotted in Figure 6, where the dotted plot is the result of the differential output of the QPC pixels A and C (Vout_A − Vout_C) and the dashed plot is the output of the PPC pixel D with 0° polarization. It is worth noting that using the difference between the outputs of QPC pixels A and C results in a higher signal swing than using each one individually [36]. The difference between the two output signals yields an output range from −0.2 to 0.2 V and allows the measurement of the incident angle from −45° to 45°. This information complements the symmetry deficiency of the PPC. These results are consistent with those demonstrated in [31]. Similar results are generated by the proposed HPC.
Minitab statistical software, Version 16, was used to manage the data. The R2 value between the 0° PPC pixel data simulated with our model and the experimental results in the literature was 0.969. The Pearson correlation coefficient between these data was 0.984. A coefficient value of 1 means that the correlation between the variables is perfectly positive, a value of −1 means that the relationship between the variables is perfectly negative, and a value of 0 means that the variables are independent of each other. As the value obtained was very close to unity, with a relative error of 1.6%, we can conclude that the simulated data presented good accuracy in relation to the experimental data, and the model can therefore be considered reliable. For the QPC data, the R2 value between the simulation data of our proposed model and the experimental data obtained from [31] was 0.934. The Pearson correlation coefficient was 0.966. As this value was also very close to unity, with a relative percentage error of 3.4%, this result also makes the model reliable due to its good accuracy.
Performing a linear regression between the PPC data sets, to obtain a relationship of the type y = ax + b, with y as the simulated data and x as the measured data from the literature, yielded y = 0.940x + 0.042; in the ideal case this equation would be y = x. This result can be considered satisfactory to validate the PPC model. Applying the same linear regression to the QPC data, with the same convention for y and x, yielded y = 1.08x − 0.0103. Again, this result is considered satisfactory for the validation of the QPC model.
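The validation metrics quoted above can be reproduced with a few lines of NumPy. The sketch below uses one common definition of R2 (the coefficient of determination of the fitted line) and placeholder arrays named simulated and measured standing for the model output and the experimental points digitized from [31].

```python
import numpy as np

def validate(simulated: np.ndarray, measured: np.ndarray):
    """Pearson correlation, coefficient of determination and linear fit y = a*x + b."""
    r = np.corrcoef(simulated, measured)[0, 1]              # Pearson's correlation
    a, b = np.polyfit(measured, simulated, 1)               # least-squares line
    residuals = simulated - (a * measured + b)
    ss_res = np.sum(residuals ** 2)
    ss_tot = np.sum((simulated - simulated.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot                              # coefficient of determination
    return r, r2, a, b

# Placeholder example data (not the paper's values).
measured = np.array([0.10, 0.20, 0.30, 0.40, 0.50])
simulated = np.array([0.12, 0.19, 0.31, 0.41, 0.48])
print(validate(simulated, measured))
```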
The results of the 45° polarization-sensitive pixel of the proposed HPC model follow the same behavior as those of the PPC; that is, when the light polarization was 45°, maximum light transmission was obtained, and when it was −45°, parallel to the grids, minimum light transmission was obtained. The result of the pixel sensitive to 0° polarization in the HPC was similar to that of the PPC, shown in Figure 6, since they have the same 0° polarization sensitivity configuration. With regard to the proposed HPC solution, the output voltage values for the difference between pixel A and pixel C show that these pixels behave in the same way as the QPC pixels. The difference between the two pixel outputs indicates the angle of light incidence.
Figure 7 depicts the three-dimensional simulated pixel-level voltage output as a function of the incidence angle and the polarization angle for pixels D (0° polarizer) and B (45° polarizer) of the HPC, respectively. In these plots, the incident angle varies from −45° to 45° in steps of 5° and the polarization angle varies from 0° to 360° in steps of 5°.
It is possible to see that in a particular situation in which the polarization angle is 0° in Figure 7a with variation only of the incidence angle, it produces precisely the concave plot shown in Figure 6.
The simulated voltage distribution map on the focal plane for light polarized at 0° is shown in Figure 8a and for light polarized at 45° in Figure 8b. This is a two-dimensional distribution of the output voltage of the light intensity pixel and a corresponding voltage line profile for 0° polarization (left) and 45° polarization (right) as a function of the incidence angle and the polarization angle. In this figure, the symmetry around 0° incidence is evident, and consequently, the QPC is essential to determine the sign of the incidence angle.
In the specific situation where the polarization angle varies from 0° to 360° in steps of 5° and only normal incidence is considered, i.e., the beam perpendicular to the surface of the sensor, the results match those displayed in Figure 9. These results are also similar to the experimental reports of [46] at the higher light transmission. The light intensity plots are shown in Figure 9, where it is possible to compare the output of pixel A of the PPC, 0.456 V, with that of pixel A of the proposed HPC, 0.344 V. The HPC voltage level is lower than that of the PPC because of the metal shielding, which reduces the pixel's light incident area and therefore its sensitivity. This issue can be overcome, for example, by summing two or four adjacent pixels.
Using the Poincaré sphere to visualize the different types of polarized light and using Equations (3), (8) and (9), in which only the intensities Itot, I0° and I45° are used, it is possible to determine the Stokes parameters S0, S1 and S2. Although the HPC does not produce the same intensity levels as the PPC, it is possible to produce a proportional Itot value through normalization.
Our proposed cluster model, in which the 90° grated pixel was replaced by the 45° grated pixel, is now analyzed. All parameters are acquired using Equations (3), (8) and (9). However, in this model there is no combination of I0° and I90° to obtain Itot, and consequently Equation (4) cannot be used, unlike the work of [46], which used these two pixels to determine the value of Itot. Although the intensity value is lower than in the unshielded pixel (PPC pixel A), the following relation can be employed to correct this distortion: Itot = cte × I′tot, in which I′tot is the light intensity in the partially shielded pixels of the HPC cluster and cte is a constant that represents the ratio between the light-sensitive area of a fully exposed pixel and that of a partially shielded pixel. For example, if the metal shield covers 25% of the pixel's light-sensitive area, cte is determined by Itot/I′tot = 1/0.75 = 1.333.
The plots in Figure 9 show the voltage output signals of the 0° and 45° grated pixels, of the intensity pixel of the PPC and of the intensity of the HPC, considering normal light incidence with the light polarization angle varying from 0° to 360° in steps of 5°. In this case, Vtot = 0.456 V and V′tot = 0.344 V, the latter being equivalent to 75.4% of the total incident light, a percentage error of 0.53% with respect to the ideal 75%, resulting in a cte value of 1.326. The real value of this parameter can only be determined experimentally, as it depends on the sensitivity conditions of the photodiode, the size and mismatches of the metal shielding in the manufacturing process, as well as the losses in the SiO2 dielectric medium.
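The correction described above reduces to a single scale factor. The small sketch below computes cte from the simulated voltage levels quoted in this section and applies it to a partially shielded intensity reading; the function name corrected_intensity is introduced here only for illustration.

```python
# Intensity correction Itot = cte * I'tot using the simulated levels above.
V_TOT = 0.456        # unshielded PPC pixel A output (V)
V_TOT_PRIME = 0.344  # partially shielded HPC intensity output (V)

cte = V_TOT / V_TOT_PRIME   # ~1.326; the ideal geometric value would be 1/0.75 = 1.333

def corrected_intensity(i_tot_prime: float) -> float:
    """Scale a partially shielded intensity reading to its fully exposed equivalent."""
    return cte * i_tot_prime

print(cte, corrected_intensity(0.344))
```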

3.2. 3D Simulation Results

To show the feasibility of the proposed HPC concept, images polarized at 0° and 45°, as well as an image at 3/4 of the total intensity, equivalent to the partially shielded pixels, were acquired. With these three images, the equivalent operation of the proposed pixel cluster is presented and discussed. Figure 10 shows an image with different plastic materials, including a black plastic bowl. On the right side of the black bowl, a plastic film linearly polarized at 135° was included for testing.
For the acquisition of the polarized images, a 2304 × 3456 pixel CMOS photographic sensor was used for polarimetric imaging, with the addition of a linear polarization filter in front of the image acquisition device, as illustrated in Figure 11. A set of two images was collected using a Canon EF-S 15–85 mm lens with a linear polarization filter (72CP, Tiffen) oriented at 0° and 45° in front of the CMOS sensor device, rotated to the appropriate angles to obtain the two desired polarizations, as seen in Figure 11a,b. All lighting, positioning, framing and sensor settings were kept constant for each set of images to ensure that there were no differences between the images. A third image was obtained without polarization, and a computational treatment was performed with MATLAB using digital image processing to obtain an image with 75% light intensity transmittance. Figure 11c illustrates, in a schematic manner, that the light was attenuated.
The images of the scene with 0° and 45° polarization can be observed in Figure 12a,b; they are equivalent to whole images formed by HPC type D and B pixels, shown previously in Figure 5a, i.e., the I0° and I45° images, respectively. Figure 12c shows the image equivalent to pixels A and C in Figure 5a, in which a metal cover blocks 25% of the photodiode, herein named I′tot. These images were used as inputs, simulating what would be obtained with the HPC shown in Figure 5a, and will be used to calculate the Stokes parameters.
There are only small noticeable differences among the images under visual inspection, although the black bowl shows slight visual variations. It is emphasized here that, although human beings have a well-developed vision that detects variations of color and brightness over several orders of magnitude, they are unable to detect polarization information [32], and therefore we do not observe many differences between the images. The most noticeable visual change is in the 135° polarized plastic film on the right side of the bowl, which has a darker color in Figure 12b, the 45° polarized image. The plastic film is polarized at 135° and the image is polarized at 45°; that is, the image is polarized with an absolute difference of 90° with respect to the film. Thus, the polarizations are perpendicular to each other, and consequently, the film shows a color approaching black. In the image of Figure 12a, the plastic film is polarized at 135° (−45°), but the image is polarized at 0°, i.e., the image is polarized with an absolute difference of 45° with respect to the plastic film, which results in an intermediate color between white and black. Figure 12c shows an image with 3/4 of the pixel intensity level of the original image without a polarization filter, which corresponds to 75% of the total incident light intensity.
It is noticeable that the variations in intensity along the black plastic bowl are minimal in most of the image, and the shape of the bowl is difficult to determine from the two-dimensional intensity image alone, as shown in Figure 10.
A 700 × 1938 resolution image was chosen that included the bowl, a piece of plastic cover on the left and the 135° polarized film on the right, as highlighted in Figure 13. The image was cropped tangent to the upper part of the bowl, as shown in Figure 13, to improve the formation of the three-dimensional image, as will be shown next.
The MATLAB R2015a software was used to process the polarized images, running on a DELL INSPIRON N4110 computer with the Windows 10 operating system (64 bits), an Intel Core i3-2350M CPU @ 2.3 GHz and 8 GB of RAM. Initially, the image of Figure 12a was cropped and converted to grayscale. To obtain the points equivalent to those of 0° polarization (pixel D of the HPC in Figure 5a), the intensity value was collected at those specific positions and zero values were assigned to the other positions, resulting in Figure 14a. Considering the image without the zeros, we have the result shown in Figure 14b, which is the 0° polarized pixel image without the zero points.
Then, the image in Figure 12b was also cropped and converted to grayscale. To obtain the points equivalent to those of 45° polarization (pixel B of the HPC in Figure 5a), in the same way as before, the intensity value was collected at those specific positions and zero values were assigned to the other positions, resulting in Figure 14c. With the removal of the zero points, the image of Figure 14d was obtained.
To obtain the third image, a similar process was performed in which the image of Figure 12c was cropped and converted to grayscale to obtain pixels A and C of the proposed HPC of Figure 5a. Similar to the previous procedure, the other pixels were set to zero, keeping only the 75% intensity pixels, as can be seen in Figure 14e. Again, the zero points were removed, with Figure 14f as the result.
Thus, the images on the right side of Figure 14, i.e., Figure 14b,d,f, show the I0°, I45° and I′tot images of the black bowl, respectively.
In the image sensor shown in Figure 15, a 2 by 2 pixel HPC cluster is addressed and accessed simultaneously. In the 2 by 2 calculation window, one pixel records the image projected through the 0° polarizer (I0°), another records the image projected through the 45° polarizer (I45°) and two partially covered pixels record the unfiltered intensity image (I′tot). The polarimetric parameters are estimated by reading all four pixels in parallel and scaling them individually at the periphery, that is, away from the image matrix.
With the weighted multiplications shown in Figure 15, the Stokes parameters S0, S1 and S2 are obtained, which is equivalent to using Equations (3), (8) and (9), previously given in Section 2.5.
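A compact way to express this per-cluster computation is to slice the raw mosaic with a stride of two, as in the Python sketch below. The particular corner assigned to each filter within the 2 × 2 window is an assumption made for illustration; the actual arrangement is the one defined by Figure 5a and Figure 15. The correction factor cte defaults to the ideal 4/3 discussed earlier.

```python
import numpy as np

def stokes_from_mosaic(raw: np.ndarray, cte: float = 4.0 / 3.0):
    """Stokes parameters S0, S1, S2 from an HPC mosaic using Equations (3), (8), (9).
    Assumed 2x2 layout per cluster: (0,0) -> 0 deg pixel D, (1,1) -> 45 deg pixel B,
    (0,1) and (1,0) -> partially covered intensity pixels A and C."""
    raw = raw.astype(float)
    i0 = raw[0::2, 0::2]                                    # 0 deg polarized samples
    i45 = raw[1::2, 1::2]                                   # 45 deg polarized samples
    itot = cte * 0.5 * (raw[0::2, 1::2] + raw[1::2, 0::2])  # corrected intensity
    s0 = itot
    s1 = 2.0 * i0 - itot
    s2 = 2.0 * i45 - itot
    return s0, s1, s2

# Example on a small synthetic mosaic (values are arbitrary).
mosaic = np.tile(np.array([[0.80, 0.60], [0.58, 0.50]]), (4, 4))
print(stokes_from_mosaic(mosaic)[0].shape)  # one Stokes sample per 2x2 cluster
```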
The original cropped image of the scene, already in grayscale, can be seen in Figure 16a, next to the result of the intensity information in Figure 16b obtained by the procedure explained above, in which values close to zero represent the darkest areas and values close to unity represent the brightest areas, as indicated by the grayscale bar on the right side of the image. Thus, we obtain the equivalent intensity Itot, which is equal to the first Stokes parameter S0 from Equation (3).
In Figure 16c, the second Stokes parameter S1 was obtained, in which the brighter portions correspond to horizontal polarization, that is, the areas with polarization oriented closer to 0°, whose values approach +1, while the darker values represent the areas with vertical polarization (90°), whose values approach −1, as indicated by the grayscale bar on the right side of the image.
In Figure 16d, the third Stokes parameter S2 can be observed, where the brighter portions correspond to the 45° polarization areas, that is, the areas with polarization oriented closer to 45°, whose values approach +1, while the darker values represent the areas with polarization at 135°, whose values approach −1, as indicated by the grayscale bar on the right side of the image. The 135° polarized filter that was added on the right side of the image presents the values closest to −1, in contrast to the other pixels of the image; that is, it is the area with polarization closest to 135°, which corresponds to −45° polarization. It is also observed that the plastic bowl presents the largest variation range, from −1 to +1, which means its polarization ranges from −45° to +45°.
The normalized DoLP parameter for the scene was obtained using Equation (11), as shown in Figure 16e. In order to increase the visibility of certain features in the scene, a false color, also called pseudocolor [41,47,48], was applied to these images, resulting in Figure 16e, similar to what was done in the works of [49,50], where the bias percentage was shown as a different color. Pixels in black represent a low degree of linear polarization, and colors closer to red represent a high degree of linear polarization, according to the color scale on the right side of Figure 16e. The linear polarization filter located on the right side of Figure 16e also has a high DoLP value due to the intrinsic properties of such filters.
In Figure 16f, the angle of linear polarization (AoP) is represented with the application of pseudocolor for the scene, where 0° of linear polarization is shown in red, 90° of linear polarization is shown in light blue, and 180° of linear polarization is again shown in red. It is observed that the plastic bowl has a wide range of polarization angles, but these linear polarization angles vary smoothly along the plastic object due to its curvature, since the light reflected from a surface changes its polarization state based on the angle of reflection. Thus, observing the state of polarization allows the recovery of the surface normal [47]. The polarization-sensitive sensor captures intrinsic information about the scene, i.e., surface curvature and refractive index, represented in the space of the polarization angles.
As noted above, the variations in intensity along the black plastic bowl are minimal in most of the image, and the shape of the bowl is difficult to determine from the two-dimensional intensity image alone, as shown in Figure 13.
In Figure 16e, in which the false color was applied to improve the visual perception of the image, the difference between the plastic bowl and the linear polarizing filter and the rest of the image is more evident; that is, these regions presented a high degree of polarization.
As the region of the image of the plastic bowl showed high variations in polarization angles (AoP) and presented greater degrees of linear polarization (DoLP), this image was chosen with a section tangent to the top of the bowl to improve the formation of the 3D image, as will be shown next.
The information obtained from the polarization image sensor was used for the reconstruction of a 3D image using a single camera, instead of the usual approach in which several cameras are used to obtain various points of view of the same scene and, consequently, form the three-dimensional image. The degree of polarization captured by the CMOS polarization image sensor is directly related to the angle of incidence of the incoming light and to the surface normal of the imaged object [46]. The polarization measurements of the CMOS image sensor (CIS) are converted into surface normal information of the object contained in the image. The DoLP shown in Figure 16e provides an estimation of the surface profile of the plastic bowl, as well as of the polarized plastic film and part of the plastic cover on the left.
Figure 17 shows a three-dimensional reconstruction of this scene, in which the normalized amplitude of the DoLP was multiplied by one thousand to improve the relationship between the width, thickness and height dimensions of the image. The three-dimensional image reconstruction from the DoLP in the CIS is presented from different viewpoints. It is possible to see the formation of the 3D image of the bowl, the polarized film and the piece of plastic cover, which stand in high relief with respect to the base of the image (in black). This result is important, as it is similar to those obtained in [46,47], which reconstructed images of a plastic horse and of a PET bottle, respectively, using the same principles as this work. In this sense, it was possible to reach the state of the art in the area of polarized sensors, the extraction of Stokes parameters from an image and the reconstruction of a three-dimensional image from them.
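The rendering step described above amounts to plotting the scaled DoLP map as a relief surface. A minimal Matplotlib sketch is given below; dolp_map is a placeholder 2D array (for example, the output of the DoLP computation sketched in Section 2.5), and the factor of 1000 follows the scaling mentioned in the text.

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_relief(dolp_map: np.ndarray, scale: float = 1000.0) -> None:
    """Render a normalized DoLP map as a 3D relief surface, scaled as in the text."""
    y, x = np.mgrid[0:dolp_map.shape[0], 0:dolp_map.shape[1]]
    fig = plt.figure()
    ax = fig.add_subplot(111, projection="3d")
    ax.plot_surface(x, y, scale * dolp_map, cmap="viridis")
    ax.set_xlabel("x (pixel)")
    ax.set_ylabel("y (pixel)")
    ax.set_zlabel("scaled DoLP")
    plt.show()

# Example with a synthetic dome-shaped DoLP map (placeholder data).
yy, xx = np.mgrid[-1:1:100j, -1:1:100j]
plot_relief(np.clip(1.0 - (xx**2 + yy**2), 0.0, 1.0))
```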

4. Conclusions

In this paper, a compact hybrid pixel cluster was presented with the capacity to detect the local light intensity, the local incident angle and the local light polarization, which also enables it to determine the Stokes parameters. The proposed hybrid pixel cluster embeds the functionality of the two pixel clusters previously presented in the literature, the polarization pixel cluster and the quadrature pixel cluster. The greatest advantage of the proposed solution is its ability to perform the same functions as the previous pixel cluster solutions with 75% improved resolution. The main downside of the proposed solution is the need to employ more than one pixel to determine the local light intensity level; however, this did not prevent the determination of the Stokes parameters. Analyses over a set of images captured with a polarizer system in front of the image sensor were presented and discussed, showing the expected 3D image reconstruction capabilities of an imager employing the proposed pixel cluster. The simulation results show that the proposed solution, based on a simple modification of well-established pixel cluster topologies, results in an improved CMOS image sensor for depth sensing and 3D image reconstruction purposes at no additional fabrication cost of the integrated circuit.

Author Contributions

The contributions of the authors of this work are pointed as follows: Conceptualization, F.F.C. and C.A.d.M.C.; data curation, F.F.C., C.A.d.M.C. and G.C.M.; formal analysis, F.F.C., C.A.d.M.C. and K.M.C.D.; funding acquisition, C.A.d.M.C.; investigation, F.F.C. and C.A.d.M.C.; methodology, F.F.C. and C.A.d.M.C.; project administration, C.A.d.M.C.; resources, C.A.d.M.C.; software, F.F.C., C.A.d.M.C. and K.M.C.D.; supervision, C.A.d.M.C.; validation, F.F.C. and C.A.d.M.C.; visualization, F.F.C. and C.A.d.M.C.; writing—original draft, F.F.C., C.A.d.M.C., G.C.M. and K.M.C.D.; writing—review & editing, F.F.C., C.A.d.M.C., G.C.M. and K.M.C.D. All authors have read and agreed to the published version of the manuscript.

Funding

This work is supported by the Fundação de Pesquisa do Estado do Amazonas—FAPEAM under the PAPAC program, EDITAL N° 005/2019. This research, according to Article 48 of Decree N° 6.008/2006, was also funded by Samsung Electronics of Amazonia Ltda, under the terms of Federal Law N° 8.387/1991, through agreement N° 004, signed with CETELI/UFAM.

Acknowledgments

The authors would like to acknowledge the PPGEE—Postgraduate Program in Electrical Engineering of the Universidade Federal do Amazonas, Manaus, AM, Brazil. The authors would also like to express their thanks to Martin How and Shelby Temple for sharing information and knowledge about polarized images, which contributed to the authors' expertise, and to the Fundação de Pesquisa do Estado do Amazonas and Samsung Electronics of Amazonia Ltda.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Important sensor parameters.

Parameter | Value
Technology | 0.18 µm standard TSMC CMOS process
Pixel size (A) | 5 µm × 5 µm
Pixel type | Photodiode 3T APS
Width/Length of the CMOS transistors W1/L1, W2/L2, W3/L3, W4/L4 | 0.4/0.2, 5/0.2, 5/0.2, 0.4/0.2 µm
Fill factor | 40%
Capacitance of the photodiode junction (Cj) | 20 fF
Dark current | 0.1 fA
Supply voltage | 1.8 V
Height of the metal M (Timε) | 6.360 µm
Thickness of the metal M (TM) | 1.720 µm
Refractive index of air (nair) | 1
Refractive index of SiO2 (nε) | 1.46
Widths of the shaded regions of the diodes under normal illumination (XA0, XB0, XC0, XD0) | 2.3 µm
Light intensity | 500 klux = 732 W/m2 [λ = 555 nm]

References

  1. Sarkar, M.; Bello, D.; Hoof, C.; Theuwissen, A. Integrated Polarization Analyzing CMOS Image Sensor for Material Classification. IEEE Sens. J. 2011, 11, 1692–1703. [Google Scholar] [CrossRef]
  2. Sarhangnejad, N.; Katic, N.; Xia, Z.; Wei, M.; Gusev, N.; Dutta, G.; Gulve, R.; Li, P.Z.X.; Ke, H.F.; Haim, H. Dual-Tap Computational Photography Image Sensor with Per-Pixel Pipelined Digital Memory for Intra-Frame Coded Multi-Exposure. IEEE J. Solid-State Circuits 2019, 54, 3191–3202. [Google Scholar] [CrossRef]
  3. Hu, F.; Cheng, Y.; Gui, L.; Wu, L.; Zhang, X.; Peng, X.; Su, J. Polarization-based material classification technique using passive millimeter-wave polarimetric imagery. Appl. Opt. 2016, 55, 8690–8697. [Google Scholar] [CrossRef] [PubMed]
  4. Karman, S.B.; Diah, S.Z.M.; Gebeshuber, I.C. Bio-inspired polarized skylight-based navigation sensors: A review. Sensors 2012, 12, 14232–14261. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  5. Sarkar, M.; Bello, D.S.S.; Van Hoof, C.; Theuwissen, A.J. Integrated polarization-analyzing CMOS image sensor for detecting the incoming light ray direction. IEEE Trans. Instrum. Meas. 2011, 60, 2759–2767. [Google Scholar] [CrossRef]
  6. Dung, L.-R.; Wu, Y.-Y. A wireless narrowband imaging chip for capsule endoscope. IEEE Trans. Biomed. Circuits Syst. 2010, 4, 462–468. [Google Scholar] [CrossRef] [PubMed]
  7. Faramarzpour, N.; Munir, E.-D.; Deen, M.J.; Fang, Q.; Liu, L. CMOS imaging for biomedical applications. IEEE Potentials 2008, 27, 31–36. [Google Scholar] [CrossRef]
  8. Parikh, S.; Gulak, G.; Chow, P. A CMOS image sensor for DNA microarrays. In Proceedings of the 2007 IEEE Custom Integrated Circuits Conference, San Jose, CA, USA, 16–19 September 2007; IEEE: Piscataway, NJ, USA, 2007; pp. 821–824. [Google Scholar]
  9. Zhang, M.; Ihida-Stansbury, K.; Van der Spiegel, J.; Engheta, N. Polarization-based non-staining cell detection. Opt. Express 2012, 20, 25378–25390. [Google Scholar] [CrossRef] [PubMed]
  10. Schechner, Y.Y.; Narasimhan, S.G.; Nayar, S.K. Instant dehazing of images using polarization. In Proceedings of the 2001 IEEE Computer Society Conference on Computer Vision and Pattern Recognition. CVPR 2001, Kauai, HI, USA, 8–14 December 2001; IEEE: Piscataway, NJ, USA, 2001; pp. 325–332. [Google Scholar]
  11. Dias, R.; Correia, J.; Minas, G. CMOS optical sensors for being incorporated in endoscopic capsule for cancer cells detection. In Proceedings of the 2007 IEEE International Symposium on Industrial Electronics, Vigo, Spain, 4–7 June 2007; IEEE: Piscataway, NJ, USA, 2007; pp. 2747–2751. [Google Scholar]
  12. Song, H.; Kono, H.; Seo, Y.; Azhari, A.; Somei, J.; Suematsu, E.; Watarai, Y.; Ota, T.; Watanabe, H.; Hiramatsu, Y. A radar-based breast cancer detection system using CMOS integrated circuits. IEEE Access 2015, 3, 2111–2121. [Google Scholar] [CrossRef]
  13. Sansoni, G.; Trebeschi, M.; Docchio, F. State-of-the-art and applications of 3D imaging sensors in industry, cultural heritage, medicine, and criminal investigation. Sensors 2009, 9, 568–601. [Google Scholar] [CrossRef] [PubMed]
  14. Li, D.; Cao, Y.; Shi, G.; Cai, X.; Chen, Y.; Wang, S.; Yan, S. An overlapping-free leaf segmentation method for plant point clouds. IEEE Access 2019, 7, 129054–129070. [Google Scholar] [CrossRef]
  15. Eriksson, J.; Bergström, D.; Renhorn, I. Characterization and performance of a LWIR polarimetric imager. In Proceedings of the Electro-Optical Remote Sensing XI, Warsaw, Poland, 5 October 2017; International Society for Optics and Photonics: Bellingham, WA, USA; p. 1043407. [Google Scholar]
  16. Gurton, K.P.; Yuffa, A.J.; Videen, G.W. Enhanced facial recognition for thermal imagery using polarimetric imaging. Opt. Lett. 2014, 39, 3857–3859. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  17. Yuffa, A.J.; Gurton, K.P.; Videen, G. Three-dimensional facial recognition using passive long-wavelength infrared polarimetric imaging. Appl. Opt. 2014, 53, 8514–8521. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  18. Iranmanesh, S.M.; Dabouei, A.; Kazemi, H.; Nasrabadi, N.M. Deep cross polarimetric thermal-to-visible face recognition. In Proceedings of the 2018 International Conference on Biometrics (ICB), Gold Coast, QLD, Australia, 20–23 February 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 166–173. [Google Scholar]
  19. Kim, S.-J.; Han, S.-W.; Kang, B.; Lee, K.; Kim, J.D.; Kim, C.-Y. A three-dimensional time-of-flight CMOS image sensor with pinned-photodiode pixel structure. IEEE Electron Device Lett. 2010, 31, 1272–1274. [Google Scholar] [CrossRef]
  20. Kato, Y.; Sano, T.; Moriyama, Y.; Maeda, S.; Yamazaki, T.; Nose, A.; Shina, K.; Yasu, Y.; van der Tempel, W.; Ercan, A. 320 × 240 back-illuminated 10 µm CAPD pixels for high speed modulation Time-of-Flight CMOS image sensor. In Proceedings of the 2017 Symposium on VLSI Circuits, Kyoto, Japan, 5–8 June 2017; IEEE: Piscataway, NJ, USA, 2017; pp. C288–C289. [Google Scholar]
  21. Fife, K.; El Gamal, A.; Wong, H.-S.P. A 3MPixel multi-aperture image sensor with 0.7 μm pixels in 0.11 μm CMOS. In Proceedings of the 2008 IEEE International Solid-State Circuits Conference-Digest of Technical Papers, San Francisco, CA, USA, 3–7 February 2008; IEEE: Piscataway, NJ, USA, 2008; pp. 48–594. [Google Scholar]
  22. Lumsdaine, A.; Georgiev, T. The focused plenoptic camera. In Proceedings of the 2009 IEEE International Conference on Computational Photography (ICCP), San Francisco, CA, USA, 16–17 April 2009; IEEE: Piscataway, NJ, USA, 2009; pp. 1–8. [Google Scholar]
  23. Mochizuki, F.; Kagawa, K.; Okihara, S.; Seo, M.-W.; Zhang, B.; Takasawa, T.; Yasutomi, K.; Kawahito, S. Single-event transient imaging with an ultra-high-speed temporally compressive multi-aperture CMOS image sensor. Opt. Express 2016, 24, 4155–4176. [Google Scholar] [CrossRef] [PubMed]
  24. Wang, A.; Molnar, A. A light-field image sensor in 180 nm CMOS. IEEE J. Solid-State Circuits 2011, 47, 257–271. [Google Scholar] [CrossRef]
  25. Varghese, V.; Qian, X.; Chen, S.; ZeXiang, S.; Jin, T.; Guozhen, L.; Wang, Q.J. Track-and-tune light field image sensor. IEEE Sens. J. 2014, 14, 4372–4384. [Google Scholar] [CrossRef]
  26. Varghese, V.; Qian, X.; Chen, S.; ZeXiang, S. Linear angle sensitive pixels for 4D light field capture. In Proceedings of the 2013 International SoC Design Conference (ISOCC), Busan, Korea, 17–19 November 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 072–075. [Google Scholar]
  27. Morel, O.; Seulin, R.; Fofi, D. Handy method to calibrate division-of-amplitude polarimeters for the first three Stokes parameters. Opt. Express 2016, 24, 13634–13646. [Google Scholar] [CrossRef]
  28. Perkins, R.; Gruev, V. Signal-to-noise analysis of Stokes parameters in division of focal plane polarimeters. Opt. Express 2010, 18, 25815–25824. [Google Scholar] [CrossRef] [PubMed]
  29. Garcia, M.; Gao, S.; Edmiston, C.; York, T.; Gruev, V. A 1300× 800, 700 mW, 30 fps spectral polarization imager. In Proceedings of the 2015 IEEE International Symposium on Circuits and Systems (ISCAS), Lisbon, Portugal, 24–27 May 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 1106–1109. [Google Scholar]
  30. Zhang, J.; Luo, H.; Liang, R.; Ahmed, A.; Zhang, X.; Hui, B.; Chang, Z. Sparse representation-based demosaicing method for microgrid polarimeter imagery. Opt. Lett. 2018, 43, 3265–3268. [Google Scholar] [CrossRef] [PubMed]
31. Varghese, V.; Chen, S. Polarization-based angle sensitive pixels for light field image sensors with high spatio-angular resolution. IEEE Sens. J. 2016, 16, 5183–5194.
32. Varghese, V.; Chen, S. Angle Sensitive Imaging: A New Paradigm for Light Field Imaging. Ph.D. Dissertation, Nanyang Technological University, Singapore, 2016.
33. Hecht, E. Optics, 4th ed.; Addison Wesley: San Francisco, CA, USA, 2002.
34. Varghese, V.; Chen, S. Incident light angle detection technique using polarization pixels. In Proceedings of the IEEE SENSORS 2014, Valencia, Spain, 2–5 November 2014; pp. 2167–2170.
35. Carvalho, F.F.; Souza, A.K.P.; de Moraes Cruz, C.A. A novel hybrid polarization-quadrature pixel cluster for local light angle and intensity detection. In Proceedings of the 30th Symposium on Integrated Circuits and Systems Design: Chip on the Sands (SBCCI '17), Fortaleza, Brazil, 28 August–1 September 2017.
36. Carvalho, F.F.; Souza, A.K.P.; de Moraes Cruz, C.A. Hybrid grated pixel cluster for local light angle and intensity detection. In Proceedings of the 2017 2nd International Symposium on Instrumentation Systems, Circuits and Transducers (INSCIT), Fortaleza, Brazil, 28 August–1 September 2017; pp. 1–6.
37. Carvalho, F.F.; de Moraes Cruz, C.A.; Marques, G.C.; Bezerra, T.B. A Novel Hybrid CMOS Pixel-Cluster for Local Light Angle, Polarization and Intensity Detection with Determination of Stokes Parameters. J. Integr. Circuits Syst. 2018, 13, 1–10.
38. Ferreira, M. Óptica e Fotónica; Lidel: Lisboa, Portugal, 2003; ISBN 972-757-288-X.
39. Yu, X.; Kwok, H.S. Optical wire-grid polarizers at oblique angles of incidence. J. Appl. Phys. 2003, 93, 4407–4412.
40. de Moraes Cruz, C.A. Simplified Wide Dynamic Range CMOS Image Sensor with 3T APS Reset-Drain Actuation. Ph.D. Dissertation, Universidade Federal de Minas Gerais, Belo Horizonte, Brazil, 2014.
41. Goldstein, D.H. Polarized Light; CRC Press: Boca Raton, FL, USA, 2017; ISBN 1-351-83367-7.
42. Sarkar, M.; Theuwissen, A. A Biologically Inspired CMOS Image Sensor; Springer: Berlin, Germany, 2012; ISBN 3-642-34901-3.
43. Gruev, V.; Spiegel, J.; Engheta, N. Integrated polarization image sensor for cell detection. In Proceedings of the International Image Sensor Workshop, Bergen, Norway, 26–28 June 2009.
44. Guo, J.; Brady, D. Fabrication of thin-film micropolarizer arrays for visible imaging polarimetry. Appl. Opt. 2000, 39, 1486–1492.
45. Yang, R.; Sen, P.; O'Connor, B.; Kudenov, M. Intrinsic coincident full-Stokes polarimeter using stacked organic photovoltaics. Appl. Opt. 2017, 56, 1768–1774.
46. Gruev, V.; York, T. High resolution CCD polarization imaging sensor. In Proceedings of the International Image Sensor Workshop, Sapporo, Japan, 8–11 June 2011.
47. Garcia, N.M.; De Erausquin, I.; Edmiston, C.; Gruev, V. Surface normal reconstruction using circularly polarized light. Opt. Express 2015, 23, 14391–14406.
48. Kruse, A.W.; Alenin, A.S.; Tyo, J.S. Review of visualization methods for passive polarization imaging. Opt. Eng. 2019, 58, 082414.
49. Zhanghao, K.; Chen, X.; Liu, W.; Li, M.; Liu, Y.; Wang, Y.; Luo, S.; Wang, X.; Shan, C.; Xie, H. Super-resolution imaging of fluorescent dipoles via polarized structured illumination microscopy. Nat. Commun. 2019, 10, 1–10.
50. Cronin, T.W.; Garcia, M.; Gruev, V. Multichannel spectrometers in animals. Bioinspir. Biomim. 2018, 13, 021001.
Figure 1. (a) Polarization Pixel Cluster (PPC) [31]; (b) Quadrature Pixel Cluster (QPC) [31].
Figure 2. (a) Three-dimensional view of division-of-focal-plane polarization sensors on the imaging plane. (b) A micropolarizer-array-based polarimeter and its orientation pattern. (c) Top view of the microgrid with the four main filters.
Figure 3. Schematic of a single hybrid polarization-quadrature pixel cluster (HPC) pixel in the 3T-APS topology, designed in a 0.18 µm standard TSMC CMOS process, with the electrical model of the photodiode highlighted in the red dashed box.
Figure 4. Poincaré sphere representation.
Figure 5. (a) Layout of the proposed hybrid polarization-quadrature pixel cluster (HPC) sensor. (b) Example of an array of 16 HPC units, with one unit marked by the red square. (c) A 3D view of the HPC.
Figure 6. Pixel voltage versus incidence angle for the differential quadrature signal (Vout_A − Vout_C) of the QPC (dotted plot) and for the 0°-polarized pixel D of the PPC (dashed plot), illustrating the angle-detection technique in simulation compared with experimental data from [31].
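The angle-detection technique compared in Figure 6 maps the quadrature differential signal back to a local incidence angle. The Python sketch below is a minimal illustration of that inversion, assuming a monotonic differential response over the calibrated range; the calibration arrays `cal_diff_volts` and `cal_angles_deg` are hypothetical placeholders, not values taken from the paper.

```python
import numpy as np

# Hypothetical calibration (not from the paper): differential voltage of the
# quadrature pair at known incidence angles, e.g. from a sweep like Figure 6.
cal_diff_volts = np.array([-0.42, -0.30, -0.16, 0.00, 0.16, 0.30, 0.42])
cal_angles_deg = np.array([-30.0, -20.0, -10.0, 0.0, 10.0, 20.0, 30.0])

def incidence_angle_from_quadrature(v_out_a, v_out_c):
    """Estimate the local incidence angle (degrees) from the QPC differential signal.

    Assumes Vout_A - Vout_C is monotonic over the calibrated range, as the
    simulated and experimental curves of Figure 6 suggest.
    """
    diff = v_out_a - v_out_c
    # np.interp requires increasing x-coordinates; cal_diff_volts is increasing here.
    return np.interp(diff, cal_diff_volts, cal_angles_deg)

print(incidence_angle_from_quadrature(1.10, 0.94))  # ~ +10 deg for a 0.16 V differential
```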
Figure 7. Intensity response (pixel output voltage) of the 3T-APS (a) 0°-polarized pixel D and (b) 45°-polarized pixel B versus the sign of the incidence angle and the polarization angle.
Figure 8. Two-dimensional map of the light-intensity pixel output voltage and the corresponding voltage line profile (right column) for (a) 0° polarization and (b) 45° polarization, as functions of the incidence angle and the polarization angle.
Figure 9. Intensity response (pixel output voltage) of PPC pixel A and HPC pixel A under illumination normal to the sensor, linearly polarized at 0° and 45°.
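For an ideal micropolarizer, the relative attenuation seen in Figure 9 when the illumination polarization rotates away from the pixel's filter axis follows Malus's law. The short sketch below only illustrates that ideal relation and ignores circuit non-idealities and the finite extinction ratio of the filters; the filter angle used in the example is purely illustrative.

```python
import numpy as np

def malus_response(pol_angle_deg, filter_angle_deg, i_in=1.0):
    """Ideal transmitted intensity of linearly polarized light through a micropolarizer."""
    delta = np.deg2rad(pol_angle_deg - filter_angle_deg)
    return i_in * np.cos(delta) ** 2

# Expected relative response of a 0° filter under 0°- and 45°-polarized light:
print(malus_response(0, 0), malus_response(45, 0))  # 1.0 and 0.5
```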
Figure 10. Scene containing different plastic materials, including a black plastic bowl. On the right side, a plastic film linearly polarized at 135° was also added for testing.
Figure 11. (a) Polarized-image detection setup with a 0° polarizer in front of the image sensor. (b) The polarization filter rotated by a 45° step to obtain the 45°-polarized image. (c) Representation of an image with 75% transmittance and no polarization.
Figure 12. Polarized images of the scene with (a) 0° polarization, (b) 45° polarization and (c) 75% intensity without polarization.
Figure 13. Image highlighting the black bowl, a piece of the plastic cover on the left and the 135°-polarized film, with dimensions of 700 × 1938, which are used to form the three-dimensional image.
Figure 14. (a) Image of the 0°-polarized pixels with zeros in the other positions. (b) The same 0°-polarized pixels without the zero points. (c) Image of the 45°-polarized pixels with zeros in the other positions. (d) Image of the 45°-polarized pixels without the zero points. (e) Image of the partially covered pixels (75% open) with zeros in the other positions. (f) Image of the partially covered pixels without the zero points.
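The channel images of Figure 14 are obtained by sub-sampling the raw mosaic by pixel type. The sketch below assumes a hypothetical 2 × 2 cluster layout, with the 0°-polarized pixel at (0, 0), the 45°-polarized pixel at (0, 1) and the partially covered intensity pixel at (1, 0); the real HPC layout may place the channels differently.

```python
import numpy as np

def split_hpc_mosaic(raw):
    """Split a raw mosaic into the three channels shown in Figure 14 (layout assumed)."""
    i_0   = raw[0::2, 0::2]   # 0°-polarized pixels
    i_45  = raw[0::2, 1::2]   # 45°-polarized pixels
    i_int = raw[1::2, 0::2]   # partially covered (75% open) intensity pixels
    return i_0, i_45, i_int

def channel_with_zeros(raw, row_off, col_off):
    """Keep one channel in place and zero every other position, as in Figure 14a,c,e."""
    out = np.zeros_like(raw)
    out[row_off::2, col_off::2] = raw[row_off::2, col_off::2]
    return out
```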
Figure 15. Block diagram of a complete focal plane polarization imaging system.
Figure 16. (a) Original image. (b) Image of the Stokes parameter S0. (c) Image resulting from the Stokes parameter S1. (d) Stokes parameter S2. (e) Degree of linear polarization (DoLP) in pseudo-color. (f) Angle of polarization (AoP) in pseudo-color.
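As a rough guide to how the images of Figure 16 can be derived, the sketch below computes S0, S1, S2, DoLP and AoP from the three channels. It assumes ideal linear polarizers obeying I(θ) = (S0 + S1·cos 2θ + S2·sin 2θ)/2 and treats the 75% figure as a simple transmittance factor for the intensity pixel; the calibration actually used by the authors may differ.

```python
import numpy as np

def stokes_from_channels(i_0, i_45, i_int, transmittance=0.75):
    """Compute S0, S1, S2, DoLP and AoP from 0°, 45° and intensity channels.

    Assumes ideal polarizers, I(theta) = (S0 + S1*cos(2*theta) + S2*sin(2*theta)) / 2,
    and an intensity pixel collecting a fraction `transmittance` of the total power.
    """
    s0 = i_int / transmittance           # total intensity
    s1 = 2.0 * i_0 - s0                  # 0° vs. 90° preference
    s2 = 2.0 * i_45 - s0                 # 45° vs. 135° preference
    dolp = np.sqrt(s1 ** 2 + s2 ** 2) / np.maximum(s0, 1e-12)
    aop = 0.5 * np.arctan2(s2, s1)       # radians, in (-pi/2, pi/2]
    return s0, s1, s2, dolp, aop
```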
Figure 17. Three-dimensional surface reconstruction of the bowl using the degree of linear polarization (DoLP), shown from several viewpoints (a–i).
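Reconstructions like the one in Figure 17 typically recover the zenith angle of the surface normal from the DoLP and its azimuth from the AoP. The sketch below uses a commonly adopted diffuse-polarization model for a dielectric surface with an assumed refractive index of 1.5 for plastic; it is not necessarily the authors' exact pipeline, and the final integration of the normals into a height map is omitted.

```python
import numpy as np

def zenith_from_dolp(dolp, n=1.5):
    """Invert DoLP to the zenith angle of the surface normal (radians).

    Uses a standard diffuse-polarization model for a dielectric; n = 1.5 is an
    assumed refractive index for plastic.
    """
    theta = np.linspace(0.0, np.pi / 2 - 1e-3, 2048)
    s, c = np.sin(theta), np.cos(theta)
    rho = ((n - 1.0 / n) ** 2 * s ** 2) / (
        2 + 2 * n ** 2 - (n + 1.0 / n) ** 2 * s ** 2 + 4 * c * np.sqrt(n ** 2 - s ** 2)
    )
    # rho grows monotonically with theta, so a 1-D interpolation inverts the model.
    return np.interp(np.clip(dolp, 0.0, rho[-1]), rho, theta)

def surface_normals(dolp, aop, n=1.5):
    """Per-pixel surface normals from DoLP (zenith) and AoP (azimuth)."""
    theta = zenith_from_dolp(dolp, n)
    phi = aop  # the true azimuth is aop or aop + pi; resolving this needs extra cues
    return np.stack([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)], axis=-1)
```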
