Article

Imaging Attitude Control and Image Motion Compensation Residual Analysis Based on a Three-Axis Inertially Stabilized Platform

1 Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences, Changchun 130033, China
2 Key Laboratory of Airborne Optical Imaging and Measurement, Chinese Academy of Sciences, Changchun 130033, China
3 University of Chinese Academy of Sciences, No. 19, Yuquan Rd., Beijing 100049, China
* Author to whom correspondence should be addressed.
Submission received: 30 April 2021 / Revised: 21 June 2021 / Accepted: 21 June 2021 / Published: 24 June 2021
(This article belongs to the Section Earth Sciences)

Featured Application

The airborne area camera imaging technique introduced in this article can be used in fields that need wide-area coverage with high resolution for accurate inspection, such as photogrammetry mapping, environmental reconnaissance, and efficient natural disaster surveying.

Abstract

The airborne area camera has received broad application in aerial reconnaissance, land resource surveying, environmental monitoring, photogrammetry mapping, and natural disaster information acquisition. In order to realize a large-angle sweep imaging function for airborne area cameras, a three-axis inertially stabilized platform based on a cantilever structure is designed, which provides a large rotation range for the roll axis. An imaging attitude control algorithm in inertial space is proposed, which regulates both the line of sight (LOS) and the image orientation. An area camera image motion calculation model and an image motion compensation residual computing method are proposed, based on space position and velocity vector transformations and derivations. The variation of the linear velocity of the image motion in the sensor frame is analyzed, and the laws governing the maximum image motion deviation as a function of the imaging attitude are studied. Flight tests show that the vertical imaging technique correctly regulates the LOS along the local geodetic vertical. The along-flight overlap rate is greater than 65%, which meets the stereo mapping requirement. The sweep imaging technique considerably enlarges the cross-flight angle of view. The LOS and image orientation during sweep imaging are correctly controlled, and gap-free coverage of the survey area is maintained. The image azimuth and roll deviations are less than 0.1°, and the image pitch deviation is less than 0.35°. The quality of the test images is high: black and white line pairs used for evaluation can be clearly distinguished, the image motion is well compensated, and the image motion compensation residual is well constrained. These results verify the validity of the proposed imaging technique and the image motion analysis model.

1. Introduction

The area camera is one of the most important visible light sensors for aerial remote sensing. With the rapid development of manufacturing techniques, the sensitive area of an area camera has surpassed two thousand square millimeters, and the number of pixels has exceeded one hundred million [1,2,3]. An area camera can achieve a higher ground resolution from a higher flight altitude and provide larger aerial coverage, resulting in higher aerial survey productivity. These advantages make area cameras very popular in the aerial reconnaissance and photogrammetry mapping fields. An inertially stabilized platform (ISP) is essential for airborne area cameras [4,5,6]. Its primary function is to isolate the aerial camera from aircraft vibrations and disturbances, regulate the line of sight (LOS), and compensate for image motion during the exposure process [7,8,9]. A two-axis platform designed for aerial imaging and tracking is described in [10]. In [11], an inertially stabilized two-axis airborne camera platform is presented, which is applied to object pointing and tracking. The two-axis platform has broad applications in searching and surveillance. However, a two-axis platform cannot control the image orientation, which is essential for photogrammetry mapping and surveying. Therefore, a three-axis ISP [12,13] has been developed, which is equipped with a position and orientation system (POS) to control the LOS and image orientation in inertial space.
In order to increase the efficiency of airborne area camera imaging and meet the urgent aerial sensing demand of natural disasters, the cross-flight angle of view should be enlarged. Sweep imaging through the roll axis is an effective way to enlarge the cross-flight angle of view. The rotation range of the roll axis for an available three-axis ISP, such as the PAV30, PAV80, and GSM3000, is usually within 30°, which cannot support wide-range sweep imaging. A novel three-axis ISP based on the cantilever structure is designed, the rotation range of which is greater than 150°. The aircraft’s attitude will vary during imaging, and the gimbal angle for each axis should be adjusted so that the area camera can maintain the desired inertial attitude. LOS pointing control and stabilization techniques for a two-axis platform have already been studied in [9,14]. An attitude control strategy for a three-axis platform is presented in [15], which provides the stabilization of the camera’s LOS to the desired attitude. To the authors’ knowledge, the image attitude control technique for a three-axis ISP, which is essential for gap-free coverage in sweep imaging, has not been reported in the literature. In accordance with the rotation coupling laws of the three gimbal axes and the space vector decomposition and transformation theorems, an image attitude control algorithm for a three-axis ISP is proposed in this paper.
Image motion [14,16,17,18,19] is the primary factor that affects the quality of an aerial image. Image motion on the sensor plane will significantly vary with the position while the angle of view for an area camera is large and the area camera tilts remarkably. Image motion variation (IMV) on the sensor plane makes it impossible to completely compensate for the image motion for all pixels. Therefore, the quantity of image motion compensation (IMC) residual resulting from IMV should be quantitatively analyzed, and the exposure parameters should be adjusted correspondingly. Held K.J. and Brendan H.R. proposed an image motion calculation model and LOS stabilization techniques using a fast steering mirror for aerial imaging [14]. Their LOS stabilization techniques can also be used in three-axis platforms. However, their image motion calculation model is based on the LOS and cannot be used for calculating the IMV. The image motion calculation model for analyzing IMV and the IMC residual computing method for area cameras are proposed, utilizing space position and velocity vector transformation mathematics and derivations. The distribution of image motion vectors on the sensor plane is presented, and the changing laws of the maximum IMV with the image attitude are studied.
The layout of the rest of this paper is as follows. Section 2 describes the structure design of the three-axis ISP and the imaging control process. The calculation formulas of the gimbal angles for controlling the imaging attitude are deduced in Section 3. The image motion analysis model is established in Section 4, and the IMV, as well as the residual of the IMC, are quantitatively studied. Section 5 presents the flight test results that verify the validity of the proposed imaging techniques. Finally, conclusions are drawn in Section 6.

2. Imaging System Design

2.1. Platform Mechanism

The platform (Figure 1) is composed of three axes, namely the pitch axis, roll axis, and azimuth axis. The three axes are orthogonal and rotate independently so that the LOS and imaging orientation can be adjusted simultaneously. The pitch mechanism is assembled on the rolling mechanism, and the rolling mechanism is assembled on the azimuth mechanism. The roll axis and pitch axis rotate together with the azimuth axis; therefore, the azimuth axis conforms to the rule of rigid body rotation. The orientation of the pitch axis changes with the rotation of the roll axis. These rotation dependencies are essential for the imaging attitude calculation, which is discussed further in Section 3. The rolling mechanism employs a cantilever structure, and its bearing connects the pitch and azimuth mechanisms. A back-to-back angular contact bearing is applied to the rolling mechanism, which can support radial torque and increase the bearing capacity for the payload. The cantilever structure increases the payload space of the pitch frame and enlarges the rotating range of the roll axis. The rotation range of the azimuth axis is from −180° to +180°, that of the pitch axis is from −45° to +45°, and that of the roll axis is from −85° to +85°. The rolling mechanism of the platform lays the foundation for large-angle sweep imaging.
Permanent magnet, direct-current torque motors are used to drive the shaft of each axis directly, which eliminate transmission clearance and guarantee rotation accuracy. The direct drive mechanism improves structural stiffness and increases the closed-loop bandwidth of the servo system accordingly. A gyro is mounted on each axis to provide the angular rate in inertial space. The POS is mounted on the pitch frame, which can provide the position, velocity, pitch angle, roll angle, and azimuth angle in inertial space. The area camera is loaded on the bearing base of the pitch frame so that the inertial attitude of the area camera can be measured directly.

2.2. Imaging Control Principle

The block diagram of the imaging control is shown in Figure 2. The imaging controller first calculates the imaging cycle $t_p$ using Equation (1) and adjusts the timing sequence of each imaging cycle to guarantee that $\rho$, which is essential for photogrammetry mapping, is satisfied:

$$ t_p = \frac{L_C (1 - \rho) H}{v_W f} \qquad (1) $$

where $\rho$ is the along-flight image overlap rate, $L_C$ is the along-flight length of the sensor, $f$ is the focal length of the optics, $v_W$ is the velocity of the aircraft, and $H$ is the relative height of the aircraft.
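As a quick illustration, Equation (1) can be evaluated directly. The sketch below is a hypothetical sizing example (not flight software); the sensor length, overlap, and flight parameters are illustrative values we chose, not parameters from the tests:

```python
def imaging_cycle(L_C, rho, H, v_W, f):
    """Imaging cycle t_p = L_C * (1 - rho) * H / (v_W * f), Equation (1).
    All lengths in meters, velocity in m/s; returns seconds."""
    return L_C * (1.0 - rho) * H / (v_W * f)

# Illustrative values (assumed): 60 mm along-flight sensor length,
# 65% overlap, 1000 m relative height, 45 m/s ground speed, 50 mm focal length.
t_p = imaging_cycle(L_C=0.060, rho=0.65, H=1000.0, v_W=45.0, f=0.050)
print(f"{t_p:.2f} s")  # -> 9.33 s
```

A shorter cycle than this value would raise the overlap above the target; a longer one would break the overlap requirement.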
During the preparation period of each imaging cycle, the gimbal angle calculation model calculates the azimuth gimbal angle $\kappa$, pitch gimbal angle $\omega$, and roll gimbal angle $\varphi$ based on the attitude of the three-axis platform and the desired imaging attitude (refer to Section 3.2). An adaptive fast terminal sliding mode control strategy subject to uncertain disturbances is used for the Servo Control module [20,21,22], which estimates the parameters of the lumped uncertain disturbances online and drives each axis to track the desired gimbal angle. The IMC Calculation module calculates the longest exposure duration $T_M$ according to the image motion compensation residual. Then, the Imaging Controller module sets the exposure duration $T_E$ and other parameters of the area camera for the current imaging cycle (refer to Section 4.2). During the photographing period, the IMC Calculation module computes the angular rate $\omega_F$ (refer to Section 4.1), the Imaging Controller module switches the servo state of the pitch axis to the velocity mode, and the Servo Control module makes the pitch axis rotate at the angular velocity $\omega_F$. Meanwhile, the Imaging Controller module sends a level signal of an exposure pulse to the area camera to trigger the exposure.
The primary function of the velocity control for the Servo Control module is to compensate the image motion angular velocity. The lead–lag compensation algorithm is used for the velocity control, which not only increases the bandwidth of the velocity servo system but also achieves high-precision velocity tracking performance. The velocity in inertial space sensed by the gyro is used for the velocity feedback in the velocity servo system. The uncertain disturbances of the aircraft can be also sensed by the gyro and will be compensated by the closed loop of the velocity control of the servo system.

3. Imaging Attitude Control

3.1. Coordinate Frames

Coordinate frames are used to express the position of a point in relation to a certain reference. Five principal coordinate frames (Figure 3) were used for image attitude control and IMC analysis:

3.1.1. World (W) Frame

This frame has its origin $O_W$ at the local horizontal plane. The $Z_W$ axis lies along the local geodetic vertical and is positive upward. The $X_W$ axis points toward true north along the local horizontal plane. The $Y_W$ axis completes an orthogonal right-hand (RH) set.

3.1.2. Navigation (N) Frame

The navigation frame is the reference in which navigation computations are performed. The origin $O_N$ of this frame is located at the POS location, and the frame rotates at the Earth's inertial rate. $Z_N$ lies along the local geodetic vertical and is positive downward. $X_N$ is normal to $Z_N$ in the local horizontal plane and points north. $Y_N$ completes the RH orthogonal set, pointing east.

3.1.3. Target (T) Frame

This frame defines the target imaging attitude and has its origin $O_T$ coincident with $O_N$. $\sigma$ is the direction of the aircraft's velocity. When the target pitch and target roll angles are zero, rotating the N frame about $Z_N$ by $\sigma$ yields the target frame.

3.1.4. Aircraft Body (AC) Frame

This frame also has its origin O A C at the POS location. The X A C direction points out the nose of the aircraft and is aligned with the aircraft body roll axis. Z A C points out the bottom of the aircraft and is aligned with the aircraft body’s heading axis. Y A C forms an RH orthogonal set and nominally points out the right wing.

3.1.5. Sensor (S) Frame

The origin $O_S$ of this frame is also at $O_N$ and is the center of the sensor's sensitive area. When boresighting has been completed and the three gimbal angles are zero, the sensor roll axis $X_S$ is coincident with the aircraft roll axis and the longitudinal direction of the pixel array, and the sensor pitch axis $Y_S$ is coincident with the aircraft pitch axis and the latitudinal direction of the pixel array. $Z_S$ is aligned with the LOS and completes the RH orthogonal set.

3.2. Imaging Attitude Analysis

3.2.1. Gimbal Angle Calculation

During the sweep imaging process, the attitude of each image should be controlled to satisfy the overlap rate and gap-free coverage requirements. For a three-axis platform, the attitude of the area camera can be adjusted by rotating the gimbal axes. Let us define $R_K^M$ as the matrix that transforms a vector from the K frame to the M frame. Then, $R_S^{AC}$ represents the matrix that transforms a vector from the S frame to the AC frame. According to the mechanism design of the three-axis ISP, the x gimbal axis and y gimbal axis rotate with the z gimbal axis, and the y gimbal axis rotates with the x gimbal axis. The transformation order from the S frame to the AC frame is therefore pitch, roll, and azimuth, and $R_S^{AC}$ can be calculated as follows:
$$ R_S^{AC} = R_{z,yaw}(\kappa)^T R_{x,roll}(\varphi)^T R_{y,pitch}(\omega)^T = \begin{bmatrix} \cos\kappa\cos\omega - \sin\kappa\sin\varphi\sin\omega & -\sin\kappa\cos\varphi & \cos\kappa\sin\omega + \sin\kappa\sin\varphi\cos\omega \\ \sin\kappa\cos\omega + \cos\kappa\sin\varphi\sin\omega & \cos\kappa\cos\varphi & \sin\kappa\sin\omega - \cos\kappa\sin\varphi\cos\omega \\ -\cos\varphi\sin\omega & \sin\varphi & \cos\varphi\cos\omega \end{bmatrix} \qquad (2) $$

where $R_{x,roll}$ is the direction cosine matrix (DCM) for the x axis, $R_{y,pitch}$ is the DCM for the y axis, and $R_{z,yaw}$ is the DCM for the z axis [23]; $\Psi$, $\Theta$, and $\Phi$ are the azimuth, pitch, and roll angles from the POS, respectively; and $\kappa$, $\omega$, and $\varphi$ are the gimbal angles from the z axis, y axis, and x axis encoders, respectively.
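The elementary DCMs used throughout this section can be written down directly. The following is a minimal NumPy sketch; the sign conventions are those implied by Equation (2), which is an assumption on our part:

```python
import numpy as np

def R_x(a):
    """Roll DCM about the x axis."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, s], [0.0, -s, c]])

def R_y(a):
    """Pitch DCM about the y axis."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, -s], [0.0, 1.0, 0.0], [s, 0.0, c]])

def R_z(a):
    """Yaw DCM about the z axis."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]])

def R_S_to_AC(kappa, phi, omega):
    """Equation (2): sensor-to-aircraft DCM; pitch, then roll, then azimuth."""
    return R_z(kappa).T @ R_x(phi).T @ R_y(omega).T

# Sanity check: zero gimbal angles give the identity matrix.
print(np.allclose(R_S_to_AC(0.0, 0.0, 0.0), np.eye(3)))  # -> True
```

The composed product matches the closed-form matrix in Equation (2), which a reader can verify column by column.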
The matrix $R_N^{AC}$, which transforms a vector from the N frame to the AC frame, is defined in Equation (3); since the POS is mounted on the pitch frame, the product $R_{x,roll}(\Phi) R_{y,pitch}(\Theta) R_{z,yaw}(\Psi)$ transforms a vector from the N frame to the S frame:

$$ R_N^{AC} = R_S^{AC} R_{x,roll}(\Phi) R_{y,pitch}(\Theta) R_{z,yaw}(\Psi) \qquad (3) $$
The projection of $Y_S$ on the ground should be perpendicular to the direction of the aircraft's velocity to achieve gap-free coverage, which is essential for reconnaissance. Therefore, the transformation matrix from the T frame to the N frame should be calculated using Equation (4), where $\Psi_V$ is the direction of the aircraft's velocity, $\Theta_T$ is the target pitch angle, and $\Phi_T$ is the target roll angle. The LOS first rotates about the y axis by $\Theta_T$ and then about the x axis by $\Phi_T$:

$$ R_T^N = R_{z,yaw}(\Psi_V)^T R_{y,pitch}(\Theta_T)^T R_{x,roll}(\Phi_T)^T \qquad (4) $$
The unit vector of the x axis of the T frame can be transformed to the vector $r_X^{AC}$ in the AC frame, and the unit vector of the y axis of the T frame can be transformed to the vector $r_Y^{AC}$ in the AC frame. After rotating about the platform's azimuth gimbal axis by $\kappa$, the roll gimbal axis by $\varphi$, and the pitch gimbal axis by $\omega$ simultaneously, the unit vector of the x axis of the S frame should equal $r_X^{AC}$, and the unit vector of the y axis of the S frame should equal $r_Y^{AC}$. Therefore, the following equation set can be established:

$$ \begin{cases} R_S^{AC} \, [1 \; 0 \; 0]^T = [u_1 \; u_2 \; u_3]^T \\ R_S^{AC} \, [0 \; 1 \; 0]^T = [v_1 \; v_2 \; v_3]^T \end{cases} \qquad (5) $$

where

$$ [u_1 \; u_2 \; u_3]^T = r_X^{AC} = R_N^{AC} R_T^N \, [1 \; 0 \; 0]^T $$

$$ [v_1 \; v_2 \; v_3]^T = r_Y^{AC} = R_N^{AC} R_T^N \, [0 \; 1 \; 0]^T $$
According to Equations (2) and (5), $\kappa$, $\varphi$, and $\omega$ can be solved as follows:

$$ \kappa = \tan^{-1}(-v_1 / v_2) \qquad (6) $$

$$ \varphi = \sin^{-1}(v_3) \qquad (7) $$

$$ \omega = \sin^{-1}\!\left(-u_3 \big/ \sqrt{1 - v_3^2}\right) \qquad (8) $$

3.2.2. Imaging Attitude Analysis

An example is presented to analyze the characteristics of the imaging attitude for the three-axis and two-axis platforms. In the example, the longitudinal angle of view of the area camera is 15°, the latitudinal angle of view is 20°, and the relative height of the aircraft is 5000 m. Three images are taken during each sweeping cycle. The cross-flight overlap angle for two successive images is 4°, and the along-flight distance between two successive images is 500 m. Table 1 presents the three-axis platform gimbal angles for the imaging attitude control using Equations (6)–(8) and the two-axis platform gimbal angles using the algorithm proposed by Held K.J. and Brendan H.R. [14]. According to the gimbal angles and aircraft attitudes in Table 1, the coverage area on the ground for each imaging step during one cycle can be obtained (Figure 4). Each quadrangle, colored blue or red in Figure 4, represents the coverage area for one image; the black dot in the center of the quadrangle represents the projection of the LOS; and the dashed line represents the projection of $Y_S$.
It can be seen from Figure 4 that the proposed gimbal angle calculation algorithm adjusts the LOS to the target direction and controls the projection of $Y_S$ on the ground, which ensures that the along-flight overlap rate is satisfied and realizes gap-free surveying. The three dashed lines representing three-axis platform imaging are parallel to each other and perpendicular to the flight direction, and the direction of the LOS is aligned with the target direction, which verifies the validity of the imaging attitude control. The algorithm for the two-axis platform controls the LOS correctly, but the orientation of the projected footprints cannot be controlled, and full coverage of the survey area cannot be guaranteed.
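The ground coverage quadrangles of Figure 4 can be reproduced by projecting the sensor corners through the exit pupil onto the ground. The sketch below is a simplified illustration (not the authors' code), assuming a flat ground plane and a sensor-to-world DCM whose world z axis is measured downward from the camera:

```python
import numpy as np

def footprint(f, half_x, half_y, H, R_S_W):
    """Project the four sensor corners onto flat ground H below the camera.
    f: focal length; half_x, half_y: sensor half-extents (same units as f);
    R_S_W: DCM from the sensor frame to a world frame with z pointing down.
    Returns the four ground corners (x, y) in meters."""
    corners = []
    for sx, sy in [(-1, -1), (-1, 1), (1, 1), (1, -1)]:
        ray = R_S_W @ np.array([sx * half_x, sy * half_y, f])  # corner ray
        t = H / ray[2]               # scale factor to reach the ground plane
        corners.append((t * ray[0], t * ray[1]))
    return corners

# Nadir-looking example: 50 mm lens, 30 x 20 mm sensor, 1000 m height.
for x, y in footprint(0.050, 0.015, 0.010, 1000.0, np.eye(3)):
    print(f"({x:.0f}, {y:.0f})")
# corners at (+/-300, +/-200) m
```

Tilting `R_S_W` away from nadir skews the footprint into the trapezoids visible in the sweep-imaging steps of Figure 4.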

4. Image Motion Modeling and Analysis

Image motion is inevitable in aerial camera photography [16] because of attitude variations and the forward flight of the aircraft. It must be calculated and compensated for; otherwise, the image quality will be degraded. The part of the image motion induced by attitude variations of the aircraft can be sensed in real time by the gyros mounted on the axes of the platform and automatically compensated by the closed velocity loop of the servo system, while the part resulting from forward flight must be modeled and calculated. Therefore, the angular rate of the IMC should be derived and set as the given value for the servo system so that the servo system drives the motor to rotate the corresponding axis at the given angular rate to compensate for the image motion. Unfortunately, it is impossible to completely compensate the image motion for all pixels because of the IMV of the airborne area camera. Thus, the IMC residual resulting from the IMV has to be calculated and limited during the aerial imaging process.

4.1. Image Motion Analysis

4.1.1. Image Motion Model

The mapping relation of the airborne area camera, based on geometrical optics, is demonstrated in Figure 5. The origin $O_W$ of the W frame is the projection of $O_L$ on the local, level ground, where $O_L$ is the center of the exit pupil of the camera lens. The supposition is made that the aircraft stands still and the scene on the ground moves at the relative velocity $-v_W$ while the aircraft flies at the velocity $v_W$. $A_S$ is an arbitrary point on the sensor plane, and $A$ is the corresponding point on the ground. $A$ moves to $B$ on the ground during a short time interval $\Delta t$, and $B_S$ is the corresponding point of $B$ on the sensor. Therefore, $A_S B_S$ is the image motion vector of $A_S$ during $\Delta t$.
$O_L O_S$ is first rotated about $y_S$ by the angle $\alpha$ and then about $x_S$ by the angle $\beta$, which transforms it into a vector with the same direction as the primary light ray $O_L A_S$. The angle between $O_L A_S$ and $v_W$, denoted $\theta_A$, can be calculated as follows:
$$ \theta_A = \cos^{-1}\!\left( \frac{r_{A_S}^S \cdot v^S}{|r_{A_S}^S| \, |v^S|} \right) \qquad (9) $$

where

$$ r_{A_S}^S = R_{x,roll}(\beta)^T R_{y,pitch}(\alpha)^T \, [0 \; 0 \; 1]^T, \qquad v^S = R_{x,roll}(\Phi_T) R_{y,pitch}(\Theta_T) R_W^N v^W \qquad (10) $$
Therefore, the image motion angular rate $\omega_A$ of $A$ can be calculated as follows:

$$ \omega_A = \frac{|v^W| \sin\theta_A}{|r_A^W - r_{O_L}^W|} \qquad (11) $$

where $r_{O_L}^W = [o_1 \; o_2 \; o_3]^T$ represents the coordinate of $O_L$ in the W frame, $r_A^W = [\,o_1 - o_3 m_1/m_3, \;\; o_2 - o_3 m_2/m_3, \;\; 0\,]^T$ represents the coordinate of $A$ in the W frame, and $r_{A_S}^W = R_S^W r_{A_S}^S = [m_1 \; m_2 \; m_3]^T$.
According to the laws of geometrical optics, the direction of the intersecting line between the $O_L A B$ plane and the sensor plane, denoted $r_{A_d}^S$, is the direction of the image motion linear velocity $v_{A_S}^S$ of $A_S$. According to the vector cross product law, $r_{A_d}^S$ can be described as follows:

$$ r_{A_d}^S = \left(v^S \times r_{A_S}^S\right) \times [0 \; 0 \; 1]^T \qquad (12) $$
Then,

$$ \theta_{A_S} = \cos^{-1}\!\left( \frac{r_{A_S}^S \cdot r_{A_d}^S}{|r_{A_S}^S| \, |r_{A_d}^S|} \right) $$
The magnitude of the image motion angular rate of $A_S$ equals that of $A$. Therefore, $v_{A_S}^S$ can be expressed as

$$ v_{A_S}^S = \frac{r_{A_d}^S}{|r_{A_d}^S|} \cdot \frac{\omega_A \, |O_L A_S|}{\sin\theta_{A_S}} = \frac{r_{A_d}^S \, f \omega_A}{|r_{A_d}^S| \cos\alpha \cos\beta \sin\theta_{A_S}} \qquad (13) $$
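Equations (9)–(13) can be chained into a single routine. The sketch below is our own illustrative NumPy implementation, with frame conventions we assume from Figure 5 (W frame z axis up at the ground point below $O_L$; the scene velocity $v^S$ supplied by the caller per Equation (10)):

```python
import numpy as np

def R_x(a): c, s = np.cos(a), np.sin(a); return np.array([[1, 0, 0], [0, c, s], [0, -s, c]], float)
def R_y(a): c, s = np.cos(a), np.sin(a); return np.array([[c, 0, -s], [0, 1, 0], [s, 0, c]], float)

def image_motion_velocity(alpha, beta, v_S, r_OL_W, R_S_W, f):
    """Image-motion linear velocity at sensor point A_S, Equations (9)-(13).
    alpha, beta: ray angles about y_S and x_S; v_S: scene velocity in the
    S frame (Equation (10)); r_OL_W: exit-pupil position in the W frame;
    R_S_W: S -> W DCM; f: focal length."""
    z = np.array([0.0, 0.0, 1.0])
    r_AS = R_x(beta).T @ R_y(alpha).T @ z                        # Eq. (10)
    theta_A = np.arccos(r_AS @ v_S / (np.linalg.norm(r_AS) * np.linalg.norm(v_S)))  # Eq. (9)
    o, m = r_OL_W, R_S_W @ r_AS
    r_A_W = np.array([o[0] - o[2] * m[0] / m[2],
                      o[1] - o[2] * m[1] / m[2], 0.0])           # ground point A
    omega_A = np.linalg.norm(v_S) * np.sin(theta_A) / np.linalg.norm(r_A_W - o)  # Eq. (11)
    r_Ad = np.cross(np.cross(v_S, r_AS), z)                      # Eq. (12)
    theta_AS = np.arccos(r_AS @ r_Ad / (np.linalg.norm(r_AS) * np.linalg.norm(r_Ad)))
    return r_Ad / np.linalg.norm(r_Ad) * f * omega_A / (
        np.cos(alpha) * np.cos(beta) * np.sin(theta_AS))         # Eq. (13)

# Nadir check: f = 100 mm, H = 2000 m, 150 m/s -> |v| = f*|v_W|/H = 7.5 mm/s.
v = image_motion_velocity(0.0, 0.0, np.array([150.0, 0.0, 0.0]),
                          np.array([0.0, 0.0, 2000.0]),
                          np.diag([1.0, -1.0, -1.0]), 0.1)
print(round(np.linalg.norm(v) * 1e3, 2))  # -> 7.5 (mm/s)
```

At nadir the routine reduces to the familiar $f |v_W| / H$ scaling, which serves as a sanity check on the more general oblique cases of Figure 6.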

4.1.2. Image Motion Distribution

The distribution of the image motion linear velocity (IMLV) on the sensor plane can be analyzed using Equation (13) and is illustrated in Figure 6. The parameters were set as follows: f = 100 mm, H = 2000 m, $v^W = [150 \; 0 \; 0]^T$ m/s, pixel size 10 μm, camera array scale 6000 × 6000, cross-flight angle of view 33.4°, along-flight angle of view 33.4°, $\Phi_T = 30°$, and $\Theta_T = 20°$. The green square $E_S F_S G_S H_S$ represents the sensitive area of the camera. Each blue arrow represents the vector of the image motion linear velocity, and the starting point of each vector is the corresponding image point. The image projection area on the ground is shown in Figure 7, indicated by the green quadrangle EFGH. The green line segment in the middle of quadrangle EFGH represents the ground projection of $Y_S$, which is perpendicular to $v^W$. Each red arrow represents the primary light ray of the ground point at the ending point of the vector. The corresponding image points for E, F, G, and H are $E_S$, $F_S$, $G_S$, and $H_S$, respectively. It can be seen that the IMLV varies significantly on the sensor plane. The magnitude of the IMLV increases with increasing $y_S$, and the phase angle of the IMLV increases with increasing $x_S$. $E_S$ had the minimum magnitude of the IMLV, 3.81 mm/s, and the maximum phase angle of the IMLV, 16.98°. The diagonally opposite corner had the maximum magnitude of the IMLV, 8.18 mm/s, and the minimum phase angle of the IMLV, 4.57°.
The variation of the IMLV on the sensor plane degrades the image motion compensation performance, so it is necessary to study the variation laws of the maximum deviation of the IMLV. The variations of the maximum deviation of the IMLV with the target roll and pitch angles are shown in Figure 8. The parameters were the same as those in Figure 6, except for the target roll and pitch angles: $\Phi_T$ increased from −30° to −5° while $\Theta_T$ = −5°, and $\Theta_T$ increased from −30° to −5° while $\Phi_T$ = −5°. The magnitude of the maximum deviation of the IMLV increases with $|\Phi_T|$ or $|\Theta_T|$. The absolute value of the phase angle of the maximum deviation of the IMLV increases with $|\Phi_T|$ but decreases as $|\Theta_T|$ increases. The camera attitude provided by the POS contains measurement errors; in the system, the azimuth, pitch, and roll measurement errors were 0.08°, 0.05°, and 0.05°, respectively. The sensitivity of the maximum deviation of the IMLV to the attitude error is illustrated in Figure 9. The magnitude error of the maximum deviation of the IMLV decreases with $|\Phi_T|$ or $|\Theta_T|$, and the maximum magnitude error was 0.011 mm/s. For an exposure duration of 10 ms, the maximum image motion arising from the attitude error is 0.11 μm, about one hundredth of a pixel, which has little impact on the imaging quality.

4.2. Residual of Image Motion Compensation

4.2.1. Residual Model Establishment

The linear velocity of the image motion varies with position on the sensor plane when the target roll or pitch angle is nonzero. The IMC residual must therefore be analyzed quantitatively so that the exposure parameters can be set to limit it. According to the image motion model established in the previous section, the component of the image motion velocity along the $x_S$ axis plays the dominant role. Therefore, the servo system of the three-axis ISP rotates the pitch axis to compensate for the image motion along the $x_S$ axis.
The image motion angular rate of $O_S$ about the pitch axis can be expressed as in Equation (14). $\omega_F$ is set as the given value for the pitch axis servo system during the imaging process to compensate for the image motion:

$$ \omega_F = \frac{r_x \, \omega_D}{|r_{O_d}^S| \sin\theta_{O_S}} \qquad (14) $$

where $\omega_F$ is the image motion angular rate of $O_S$ about the pitch axis; $r_{O_d}^S = (v^S \times r_{O_S}^S) \times [0 \; 0 \; 1]^T = [r_x \; r_y \; r_z]^T$ is the direction vector of the image motion linear velocity of $O_S$; $r_{O_S}^S$ is the direction vector of $O_L O_S$; $\theta_{O_S}$ is the intersection angle between the image motion linear velocity of $O_S$ and $O_L O_S$, which can be computed by analogy with $\theta_{A_S}$; and $\omega_D$ is the image motion angular rate of the object $D$, which can be computed by analogy with $\omega_A$.
The angular rate of $A_S$ while the pitch axis rotates at $\omega_F$, denoted $\omega_{A_S}^{IMC}$, can be described as follows:

$$ \omega_{A_S}^{IMC} = [\omega_x \; \omega_y \; \omega_z]^T = R_{y,pitch}(\beta) \, R_{x,roll}(\alpha) \, [0 \; \omega_F \; 0]^T \qquad (15) $$
The linear velocity of $A_S$ corresponding to $\omega_{A_S}^{IMC}$, denoted $v_{A_L}^S$, is expressed as

$$ v_{A_L}^S = R_{x,roll}(\alpha)^T R_{y,pitch}(\beta)^T \, [\,\omega_y |O_L A_S|, \;\; -\omega_x |O_L A_S|, \;\; 0\,]^T \qquad (16) $$
According to the vector cross product law, the direction of the linear velocity of the IMC for $A_S$, denoted $r_{A_F}^S$, is given as follows:

$$ r_{A_F}^S = \left(v_{A_L}^S \times r_{A_S}^S\right) \times [0 \; 0 \; 1]^T \qquad (17) $$
The intersection angle between $r_{A_F}^S$ and $r_{A_S}^S$, denoted $\theta_{A_F}$, is calculated using the vector dot product law:

$$ \theta_{A_F} = \cos^{-1}\!\left( \frac{r_{A_S}^S \cdot r_{A_F}^S}{|r_{A_S}^S| \, |r_{A_F}^S|} \right) \qquad (18) $$
The linear velocity of the IMC for $A_S$, denoted $v_{A_F}^S$, is calculated as in Equation (19):

$$ v_{A_F}^S = \frac{r_{A_F}^S \, |v_{A_L}^S|}{|r_{A_F}^S| \sin\theta_{A_F}} \qquad (19) $$
Then, the linear velocity residual of the IMC for $A_S$, denoted $v_{A_E}^S$, can be calculated from Equations (13) and (19):

$$ v_{A_E}^S = \frac{r_{A_d}^S \, f \omega_A}{|r_{A_d}^S| \cos\alpha \cos\beta \sin\theta_{A_S}} - \frac{r_{A_F}^S \, |v_{A_L}^S|}{|r_{A_F}^S| \sin\theta_{A_F}} \qquad (20) $$
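Equations (15)–(20) can likewise be sketched numerically. The function below is an illustrative implementation under our assumed sign conventions (it is not the authors' code); the uncompensated velocity `v_AS` is taken from Equation (13), and the compensation velocity is subtracted from it:

```python
import numpy as np

def R_x(a): c, s = np.cos(a), np.sin(a); return np.array([[1, 0, 0], [0, c, s], [0, -s, c]], float)
def R_y(a): c, s = np.cos(a), np.sin(a); return np.array([[c, 0, -s], [0, 1, 0], [s, 0, c]], float)

def imc_residual(v_AS, alpha, beta, omega_F, f):
    """IMC residual velocity at sensor point A_S, Equations (15)-(20).
    v_AS: uncompensated image-motion velocity (Equation (13));
    alpha, beta: ray angles; omega_F: pitch compensation rate (Eq. (14));
    f: focal length."""
    z = np.array([0.0, 0.0, 1.0])
    w = R_y(beta) @ R_x(alpha) @ np.array([0.0, omega_F, 0.0])   # Eq. (15)
    d = f / (np.cos(alpha) * np.cos(beta))                       # |O_L A_S|
    v_AL = R_x(alpha).T @ R_y(beta).T @ np.array([w[1] * d, -w[0] * d, 0.0])  # Eq. (16)
    r_AS = R_x(beta).T @ R_y(alpha).T @ z
    r_AF = np.cross(np.cross(v_AL, r_AS), z)                     # Eq. (17)
    theta_AF = np.arccos(r_AS @ r_AF / (np.linalg.norm(r_AS) * np.linalg.norm(r_AF)))  # Eq. (18)
    v_AF = r_AF / np.linalg.norm(r_AF) * np.linalg.norm(v_AL) / np.sin(theta_AF)       # Eq. (19)
    return v_AS - v_AF                                           # Eq. (20)

# At the sensor center, a pitch rate matched to the image-motion angular
# rate cancels the motion exactly: the residual vanishes.
f, omega = 0.1, 0.075  # focal length (m), pitch rate (rad/s)
v_center = np.array([-f * omega, 0.0, 0.0])
print(np.allclose(imc_residual(v_center, 0.0, 0.0, omega, f), 0.0))  # -> True
```

Away from the center the two terms of Equation (20) no longer align, producing the nonzero residual field of Figure 11.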

4.2.2. Distribution of the IMC Residual

The magnitude and phase angle of $v_{A_F}^S$ for $A_S$ at different positions on the sensor plane were calculated using Equation (19), with the results shown in Figure 10. The parameters were the same as those in Figure 6, and $\omega_F$ was 3.29°/s according to Equation (14). It can be seen that the linear velocity of the IMC varies across the sensor plane. The maximum magnitude of the IMC linear velocity was 6.252 mm/s, at $x_S$ = ±30 mm and $y_S$ = 0 mm; the minimum magnitude was 5.73 mm/s, at $x_S$ = 0 mm. The phase angle of the IMC linear velocity was 0° at $y_S$ = 0 mm. The absolute value of the phase angle of the IMC linear velocity increases with $|x_S|$ or $|y_S|$, and its maximum value was 4.79°. The distribution of $v_{A_E}^S$ according to Equation (20) is illustrated in Figure 11, where $v_{A_E}^S$ is represented by a purple arrow on the sensor plane and the starting point of each arrow is the corresponding pixel. The phase angle of $v_{A_E}^S$ varies with $x_S$ and $y_S$, so it is impossible to fully compensate the image motion for every pixel. $E_S$ had the maximum magnitude of the IMC residual, 2.60 mm/s. If the permitted image motion is one half of a pixel, the maximum exposure duration is 1.9 ms.
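The 1.9 ms figure follows directly from the residual: the exposure must end before the residual velocity smears the image by more than the permitted fraction of a pixel. A one-line sketch using the values from the text:

```python
def max_exposure(pixel_size_m, residual_velocity_mps, fraction=0.5):
    """Longest exposure keeping residual smear below `fraction` of a pixel."""
    return fraction * pixel_size_m / residual_velocity_mps

# 10 um pixel, 2.60 mm/s maximum IMC residual (values from the text).
t_max = max_exposure(10e-6, 2.60e-3)
print(f"{t_max * 1e3:.2f} ms")  # -> 1.92 ms
```

This is the calculation the IMC Calculation module performs each cycle to bound the exposure duration $T_M$.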

5. Experiments and Results

In order to verify the airborne area camera imaging control technique and the image motion calculation algorithm, a series of flight tests was performed. The three-axis ISP (Figure 12) was developed using the mechanism structure presented in Section 2.1. The roll axis of the platform can rotate from −85° to +85°, so large-scale sweep imaging can be performed. The iXU-RS1000 full-frame aerial camera [24] and a video module (Figure 12) were integrated as a replaceable combination mounted on the pitch frame. The focal length was 50 mm, the pixel size was 4.6 μm, the array scale was 11,608 × 8708, the cross-flight angle of view was 56.2°, and the along-flight angle of view was 43.6°. The three-axis ISP was loaded into the payload chamber of an unmanned aircraft (Figure 13). The elevating mechanism lowered the camera and video module out of the payload chamber before imaging. The aircraft velocity was 45 m/s, and the relative flight height was 1000 m during the flight tests. Both the vertical imaging mode and the sweep imaging mode were tested in the flight tests.

5.1. Vertical Imaging Mode

The vertical imaging mode was used for photogrammetry mapping. The LOS of the area camera lay along the local geodetic vertical. Therefore, the target roll and pitch angles were 0°, $X_S$ was parallel to the direction of the aircraft's velocity, and the along-flight overlap rate of two successive images had to be greater than 56%. The imaging cycle $t_p$ was calculated using Equation (1) in real time, and the time sequence in each imaging cycle was adjusted to meet the overlap rate requirement. During the preparation period, $\varphi$, $\omega$, and $\kappa$ were calculated in real time using Equations (6)–(8), respectively. The servo control module swiftly drove the three axes of the platform to the given gimbal angles. During the photographing period, $\omega_F$ was calculated using Equation (14) and set as the given value for the velocity loop of the pitch axis to compensate for the image motion. The maximum magnitude of $v_{A_E}^S$ was 0.49 mm/s according to Equation (20), and the permitted image motion was 2.3 μm during the flight tests; therefore, the exposure duration had to be no longer than 4.7 ms. Figure 14 shows the mosaic image series of six successive images in the vertical imaging mode. The circled number on each image in Figure 14 represents the imaging order. The enlarged images in the square show that the image quality was excellent and the image motion was well compensated. The image attitude deviations of thirty successive images are shown in Figure 15, where HD stands for the heading deviation, i.e., the deviation from $X_S$ to the flight direction. The image azimuth and roll deviations were less than 0.1°, and the image pitch deviation was less than 0.33°. The imaging attitude in the vertical imaging mode was accurately controlled, and the overlap rate was greater than 65%.

5.2. Sweep Imaging Mode

The sweep imaging mode enlarges the cross-flight angle of view considerably and yields higher aerial survey productivity. During the flight tests, the roll gimbal stepped two times, and the iXU-RS1000 camera took three images in one sweeping cycle. The target pitch angle was 0° for each image, and X_S was parallel with the direction of the aircraft velocity. The cross-flight overlap rate between two cross-flight adjacent images was 5%, and the target roll angles for the three successive images in one sweeping cycle were −50.58°, 0°, and +50.58°, respectively. The total cross-flight angle of view for one sweeping cycle was thus enlarged to 157.36°. The along-flight overlap rate between two successive sweeping cycles was 5%. The maximum exposure duration was 4.7 ms when the target roll angle was 0°, the same as in the vertical imaging mode. When the target roll angle was −50.58° or +50.58°, the maximum magnitude of v_AES was 1.02 mm/s according to Equation (20); therefore, the maximum exposure duration was 2.3 ms. Figure 16 shows the mosaic image series of two adjacent sweeping cycles. Images 1, 2, and 3 are the successive images of one sweeping cycle, and images 4, 5, and 6 are the successive images of the next sweeping cycle. Table 2 presents the imaging attitude of each image in Figure 16. The imaging attitude was accurately controlled, and gap-free coverage of the survey area was maintained. The enlarged area in image 4 shows that the black and white line pairs used for evaluation can be clearly distinguished, which verifies that the image motion was well compensated. The enlarged areas in image 1 and image 3 are distinct, which implies that the residual of the IMC was well constrained. Figure 17 shows the image attitude deviation over 180 s, covering ten sweeping cycles and thirty images. The image's azimuth or roll deviation was less than 0.1°, and the image's pitch deviation was less than 0.35°. The image attitude in the sweep imaging mode was correctly controlled.

6. Conclusions

In this paper, a three-axis ISP based on a cantilever structure was designed that supports a roll range from −85° to +85° for the sweep imaging mode. Vertical imaging and sweep imaging techniques for airborne area cameras were developed based on this three-axis platform. The image attitude was regulated in inertial space by rotating the three axes to the derived gimbal angles. An area camera image motion calculation model was proposed, and the distribution of the image motion on the sensor plane was analyzed. The linear velocity of the image motion can vary considerably at different positions on the sensor plane when the target roll or pitch angle is nonzero, so it is impossible to fully compensate for the image motion of every pixel. The linear velocity of the IMC residual was analyzed using the proposed method, and the exposure duration was controlled to limit the residual of the image motion compensation. The flight tests showed that the imaging attitude was correctly controlled: the image azimuth or roll deviation was less than 0.1°, and, with the pitch axis rotated at the given angular rate to compensate for the image motion while imaging, the image pitch deviation was less than 0.35°. The along-flight overlap rate in the vertical imaging mode was greater than 65%, which meets the stereo mapping requirement. The sweep imaging technique considerably enlarged the cross-flight angle of view, and gap-free coverage of the survey area was maintained. The image quality was excellent, and the black and white line pairs used for evaluation could be clearly distinguished, which verified that the image motion was well compensated and the residual of the image motion compensation was well constrained. The image attitude control method and the IMC residual model proposed in this paper have promising application prospects in aerial sweeping surveys and oblique photogrammetry mapping with area cameras.

Author Contributions

Conceptualization, Y.Y.; software, Y.Y., C.Y. and N.H.; validation, Y.Y., C.Y., Y.W., N.H. and H.K.; writing—original draft preparation, Y.Y. and C.Y.; writing—review and editing, Y.Y., Y.W. and C.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the National Science and Technology Major Project of China (No. 2016YFC0803000) and the Key Laboratory of Airborne Optical Imaging and Measurement at the Chinese Academy of Sciences.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Ben-Ezra, M. A digital gigapixel large-format tile-scan camera. IEEE Comput. Graph. Appl. 2011, 31, 49–61.
  2. UltraCam Eagle Mark 3: One System for Endless Possibilities; Vexcel Imaging GmbH: Graz, Austria, 2019.
  3. Phase One Full Frame Aerial Cameras iXM-RS150F/iXM-RS100F; Phase One A/S: Frederiksberg, Denmark, 2018.
  4. Hilkert, J.M. Inertially stabilized platform technology: Concepts and principles. IEEE Control Syst. Mag. 2008, 28, 26–46.
  5. Masten, M.K. Inertially stabilized platforms for optical imaging systems. IEEE Control Syst. Mag. 2008, 28, 47–64.
  6. Li, J.; Zhou, X. Study on dynamic modeling and simulation for an air-based three-axis ISP system. Pervasive Comput. Signal Process. Appl. 2010, 175–178.
  7. Hilkert, J.M. Kinematic algorithms for line-of-sight pointing and scanning using INS/GPS position and velocity information. Proc. SPIE 2005, 5810, 11–22.
  8. Miller, R.; Mooty, G.; Hilkert, J.M. Gimbal system configurations and line-of-sight control techniques for small UAV applications. Proc. SPIE 2013, 8713, 871308.
  9. Xiu, J.; Huang, P.; Li, J.; Zhang, H.; Li, Y. Line of sight and image motion compensation for step and stare imaging system. Appl. Sci. 2020, 10, 7119.
  10. Helble, H.; Cameron, S. OATS: Oxford aerial tracking system. Robot. Auton. Syst. 2007, 55, 661–666.
  11. Hurak, Z.; Rezac, M. Image-based pointing and tracking for inertially stabilized airborne camera platform. IEEE Trans. Control Syst. Technol. 2012, 20, 1146–1159.
  12. Zhou, X.; Gong, G.; Li, J.; Zhang, H.; Yu, R. Decoupling control for a three-axis inertially stabilized platform used for aerial remote sensing. Trans. Inst. Meas. Control 2015, 37, 1135–1145.
  13. Chen, L.; Mu, Q. A novel disturbance estimation and compensation approach applied to three-axis inertially stabilized platform. In Proceedings of the 2nd IEEE Advanced Information Management, Communicates, Electronic and Automation Control Conference (IMCEC), Xi’an, China, 25–27 May 2018; pp. 1420–1428.
  14. Held, K.J.; Brendan, H.R. Tier II Plus airborne EO sensor LOS control and image geolocation. IEEE Aerosp. Conf. 1997, 2, 377–405.
  15. Königseder, F.; Kemmetmüller, W.; Kugi, A. Attitude control strategy for a camera stabilization platform. Mechatronics 2017, 46, 60–69.
  16. Liu, M.; Kuang, H.P.; Wu, H.S.; Liu, G.; Xiu, J.H.; Zhai, L.P. Survey on the image motion compensation technology. Electron. Opt. Control 2004, 11, 46–49.
  17. Olson, G.G. Image motion compensation with frame transfer CCDs. Proc. SPIE 2002, 4567, 153–160.
  18. Lu, P.; Li, Y.; Jin, L.; Li, G.; Wu, Y.; Wang, W. Image motion velocity field model of space camera with large field and analysis on three-axis attitude stability of satellite. Opt. Precis. Eng. 2016, 24, 2173–2182.
  19. Ren, H.; Hu, T.; Song, Y.; Sun, H.; Liu, B.; Gao, M. An improved electronic image motion compensation method of aerial full-frame-type area array CCD camera based on the CCD multiphase structure and hardware implementation. Sensors 2018, 18, 2632.
  20. Wang, Y.; Yang, Y.; Kuang, H.; Yuan, D.; Yu, C.; Chen, J.; Hua, N.; Hou, H. High performance both in low-speed tracking and large-angle swing scanning based on adaptive nonsingular fast terminal sliding mode control for a three-axis universal inertially stabilized platform. Sensors 2020, 20, 5785.
  21. Wang, Y.; Tian, D.; Dai, M. Composite hierarchical anti-disturbance control with multisensor fusion for compact optoelectronic platforms. Sensors 2018, 18, 3190.
  22. Che, X.; Tian, D.; Jia, P. Terminal sliding mode control with a novel reaching law and sliding mode disturbance observer for inertial stabilization imaging sensor. Sensors 2020, 20, 3107.
  23. Noureldin, A.; Karamat, T.B.; Georgy, J. Fundamentals of Inertial Navigation, Satellite-Based Positioning and Their Integration; Springer: Berlin/Heidelberg, Germany, 2013; pp. 21–63.
  24. Phase One iXU/iXU-RS Aerial Camera Series; Phase One A/S: Frederiksberg, Denmark, 2016.
Figure 1. Mechanism of the three-axis inertially stabilized platform.
Figure 2. Block diagram of the imaging control.
Figure 3. Fundamental coordinate frames.
Figure 4. Image projection maps on the ground plane for the three-axis and two-axis platforms.
Figure 5. Schematic diagram of airborne area camera photography.
Figure 6. Image motion’s linear velocity distribution on the sensor plane.
Figure 7. Ground imaging area and primary light distribution.
Figure 8. Maximum deviation of the IMLV with the target roll and pitch angles.
Figure 9. Magnitude error of the maximum deviation of the IMLV with the target roll and pitch angles.
Figure 10. Magnitude or phase angle of the linear velocity of the IMC with a position on the sensor plane.
Figure 11. Distribution of the residual of the IMC linear velocity on the sensor plane.
Figure 12. Three-axis ISP and iXU-RS1000 camera.
Figure 13. Three-axis ISP loaded in the payload chamber of an unmanned aircraft.
Figure 14. Mosaic image series of six successive images.
Figure 15. Image attitude deviation in the vertical imaging mode over time.
Figure 16. Series of mosaic images of two adjacent sweeping cycles.
Figure 17. Image attitude deviation in sweep imaging mode over time.
Table 1. Three-axis and two-axis platform gimbal angles.

          Aircraft Attitude (°)          Target Pointing (°)       Three-Axis Platform            Two-Axis Platform
                                                                   Gimbal Angles (°)              Gimbal Angles (°)
  Image 1 Φ = 3.0; Θ = 1.5; Ψ = 6.0      Φ_T = 15.0; Θ_T = 20.0    φ = −16.7; ω = −22.8; κ = −11.8  φ = −21.0; ω = −19.0
  Image 2 Φ = 2.0; Θ = −2.5; Ψ = −5.0    Φ_T = 31; Θ_T = 20        φ = −31.2; ω = −20.4; κ = −5.3   φ = −32.8; ω = −17.5
  Image 3 Φ = −2.5; Θ = 5.5; Ψ = 4.0     Φ_T = 47; Θ_T = 20        φ = −38.8; ω = −33.2; κ = −27.6  φ = −47.7; ω = −14.0
Table 2. Image attitude in sweep imaging mode.

            Image 1   Image 2   Image 3   Image 4   Image 5   Image 6
  Roll (°)  50.57     0.02      −50.56    50.54     −0.02     −50.55
  Pitch (°) 0.26      0.21      0.25      0.31      0.27      0.28
  HD (°)    0.06      0.01      0.03      0.01      −0.03     −0.04

Yang, Y.; Yu, C.; Wang, Y.; Hua, N.; Kuang, H. Imaging Attitude Control and Image Motion Compensation Residual Analysis Based on a Three-Axis Inertially Stabilized Platform. Appl. Sci. 2021, 11, 5856. https://0-doi-org.brum.beds.ac.uk/10.3390/app11135856