Article

Tracking the 6-DOF Flight Trajectory of Windborne Debris Using Stereophotogrammetry

Chaman Lal Sabharwal 1 and Yanlin Guo 2
1 Computer Science Department, Missouri University of Science and Technology, Rolla, MO 65409, USA
2 Department of Civil & Environmental Engineering, Colorado State University, Fort Collins, CO 80523-1372, USA
* Author to whom correspondence should be addressed.
Submission received: 16 September 2019 / Revised: 22 October 2019 / Accepted: 22 October 2019 / Published: 24 October 2019

Abstract
Numerous post-windstorm investigations have reported that windborne debris can cause costly damage to the envelopes of buildings in urban areas under strong winds (e.g., during hurricanes or tornados). Thus, understanding the physics of debris flight is of critical importance. Previously developed numerical models describing debris flight physics have not been validated for the complex urban flow environment; such a validation requires experimentally measuring the debris flight trajectory in wind tunnel tests. In this context, this paper proposes a debris measurement algorithm using stereophotogrammetry. The algorithm aims to determine the six-degree-of-freedom (6-DOF) trajectory and velocity of flying debris, addressing a research gap, i.e., the lack of an algorithm/software for measuring the three-rotational-DOF using stereophotogrammetry. Although this is a civil engineering problem, computer graphics provides the foundation for solving it. This paper focuses on the theoretical development of the algorithm; the developed algorithm can be readily implemented in modern wind tunnel experiments.


1. Introduction

Windborne debris damage to the cladding and façades of buildings has been identified in numerous studies as a major contributor to damage in urban areas after windstorms (e.g., hurricanes, tornados, thunderstorms) [1,2,3,4,5,6,7,8,9,10,11]. To physically model debris-induced damage, one needs to understand debris flight behavior. In recent years, a number of experimental and numerical studies have investigated the mechanics of debris flight. Tachikawa [12] pioneered fundamental research on the two-dimensional (2D) trajectory of a flying plate in a uniform flow through both wind tunnel experiments and numerical simulation. Following Tachikawa's work, Lin et al. [13,14] conducted wind tunnel studies to investigate the trajectories of compact-, rod-, and plate-type debris in a uniform wind field and provided experimental support for the validation of numerical simulations using Tachikawa's equations. However, the numerical models developed in the literature have not been validated for the complex 3D urban wind field. Through both experimental and full-scale investigations, researchers have found that the complex turbulent flow structure in an urban environment can dramatically alter the typical flight distance/speed and lead to unusual debris trajectories [15,16,17,18]. Thus, experimental validation of the numerical debris flight model for an urban built environment is critical for understanding complex debris flight behavior and for predicting the debris impact location, energy/momentum, and ultimately the damage state of building envelope components. Such a validation first requires measuring the debris flight trajectory in a 3D field in wind tunnel tests.
On the hardware side, the high-speed cameras that have become increasingly accessible in modern wind tunnels can be used for this measurement; however, to the authors' knowledge, there is no off-the-shelf algorithm/software for extracting the entire six-degree-of-freedom (6-DOF) trajectory of flying debris from the images. The existing literature on trajectory measurement of windborne debris using high-speed cameras either measures only the three-translational-DOF (without the three-rotational-DOF) [19] or qualitatively analyzes the different flight modes of the debris instead of resolving the coordinates of the full trajectory [17].
In this context, this paper proposes a debris trajectory measurement algorithm that can reconstruct all 6-DOF displacements and the velocity of debris in a 3D field from pairs of stereoscopic 2D images captured by two high-speed, high-resolution cameras. The concept of using pairs of stereoscopic 2D images to track debris is similar to stereoscopic particle image velocimetry (PIV), a non-intrusive laser-optical technique for measuring flow velocity. However, it is distinct from stereoscopic PIV in the following three aspects: (i) the proposed method tracks a single piece of debris, while PIV tracks the patterns of a group of flow particles; (ii) the proposed method does not rely on a laser sheet to illuminate the debris, and thus can track debris flight with larger lateral motion, while PIV can only track particle motions within the thickness of a laser sheet; and (iii) the proposed method is developed to track the 6-DOF trajectory of debris, while PIV can only measure the 3-DOF motions of flow particles. In wind engineering, three types of debris are typically of concern: compact, rod, and thin plate. In this paper, we focus on the thin plate. The theoretical development of the three-translational-DOF of compact debris is included in this discussion as the foundation for the thin-plate algorithm. Furthermore, the algorithm can be extended to consider rod-like debris in the future.
The paper is organized as follows. Section 2 includes the problem formulation and notation. Section 3 focuses on the reconstruction of the 6-DOF coordinates of debris from stereopairs of camera images. Section 4 derives the differential operator for velocity reconstruction. Section 5 introduces procedures for the reconstruction of time histories of debris motion trajectory, including both displacements and velocity. Section 6 includes concluding remarks and a discussion of future directions.

2. Background

Problem Formulation and Notation

In this problem (depicted in Figure 1), a background grid is first used to calibrate the corresponding pixel locations of the two high-speed cameras; the grid can be removed once the calibration is completed. The two cameras are then used to measure the 2D coordinates of the projections of the flying debris onto the grid wall along the lines of sight from the camera locations. Using the stereopair of 2D coordinates of the debris (i.e., $d_1, h_1$ and $d_2, h_2$ measured by the two cameras), we seek to determine both the position (three-translational-DOF: x, y, z) and the orientation (three-rotational-DOF) of the debris, given the locations of the two cameras relative to the grid wall (i.e., $d$ and $l$).
This problem is solved in two steps: first for a single compact piece of debris, treated as a point object (only the three-translational-DOF are considered), and then for rigid-body thin-plate debris. The former lays the theoretical foundation for the latter.
The notation used in this problem is defined as follows. Points and matrices are denoted by uppercase letters, position vectors by bold lowercase letters, and vectors are column vectors by default. For example, $\mathbf{z}_j$ is the position vector of point $Z_j$, and $\mathbf{p}_j$ is the position vector of point $P_j$. The variables used in Figure 1 are listed below.
$Z_1, Z_2$ – points at the lens positions of the two cameras
$P$ – position of a point debris
$P_1, P_2$ – projected locations of the debris on the grid wall as seen from the two cameras (i.e., a stereopair), respectively
$d$ – distance between the cameras
$l$ – distance between the camera lenses and the wall along the y-direction; the wall is parallel to the zx plane
$d_1, d_2$ – distances of points on the wall along the x-axis from the y-axis
$h_1, h_2$ – heights of points on the wall along the z-axis from the xy plane
$\beta_1, \beta_2$ – angles that the rays $Z_1P_1$ and $Z_2P_2$ make with the xy plane
$\theta_1, \theta_2$ – angles between the x-axis and the projections of the rays $Z_1P_1$ and $Z_2P_2$ on the xy plane
We need to determine the 3D position of the debris, i.e., x, y, and z, in terms of these parameters.

3. Debris Position and Spatial Orientation (6-DOF) in 3D Space

3.1. The 2D Stereopairs’ Positions

In Figure 1, $Z_1$ is the origin and $Z_2$ is the origin offset by $d$ units along the x-axis: $\mathbf{z}_2 = \mathbf{z}_1 + [d, 0, 0]^T$. The position vector of point $P_1$ on the wall is
$\mathbf{p}_1 = [h_1\cot(\beta_1)\cos(\theta_1),\ h_1\cot(\beta_1)\sin(\theta_1),\ h_1]^T,$
with a distance from $Z_1$ of
$t_1 = h_1\csc(\beta_1) = |Z_1P_1|.$
Alternatively, $\mathbf{p}_1 = [d_1,\ d_1\tan(\theta_1),\ h_1]^T$ and $t_1 = \sqrt{d_1^2\sec^2(\theta_1) + h_1^2}$. Similarly, for point $P_2$ on the wall:
$\mathbf{p}_2 = [d, 0, 0]^T + [h_2\cot(\beta_2)\cos(\theta_2),\ h_2\cot(\beta_2)\sin(\theta_2),\ h_2]^T,$
with a distance from $Z_2$ of
$t_2 = h_2\csc(\beta_2) = |Z_2P_2|.$
Alternatively,
$\mathbf{p}_2 = [d, 0, 0]^T + [-(d - d_2),\ (d - d_2)\tan(\theta_2),\ h_2]^T = [d_2,\ (d - d_2)\tan(\theta_2),\ h_2]^T$
$t_2 = \sqrt{(d - d_2)^2\sec^2(\theta_2) + h_2^2}.$
In the simplest terms, the vectors along $Z_1P_1$ and $Z_2P_2$ are normalized to the unit vectors $\mathbf{d}_1$ and $\mathbf{d}_2$, where
$\mathbf{d}_1 = [\cos(\beta_1)\cos(\theta_1),\ \cos(\beta_1)\sin(\theta_1),\ \sin(\beta_1)]^T$
$\mathbf{d}_2 = [\cos(\beta_2)\cos(\theta_2),\ \cos(\beta_2)\sin(\theta_2),\ \sin(\beta_2)]^T.$
Then the position vectors for the stereopair $P_1$ and $P_2$ are
$\mathbf{p}_1 = \mathbf{z}_1 + t_1\mathbf{d}_1,\quad \mathbf{p}_2 = \mathbf{z}_2 + t_2\mathbf{d}_2.$
This is valid even if the cameras are not at the same distance from the wall.
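For readers who wish to experiment with these expressions, the minimal NumPy sketch below maps one camera's calibrated wall measurements to the unit ray direction, the range $t$, and the wall point $\mathbf{p}$, using the fact that the grid wall is the plane $y = l$ in the notation above. The function name ray_to_wall_point and all numerical values are hypothetical choices for illustration, not part of the original derivation.

```python
import numpy as np

def ray_to_wall_point(z_cam, d_i, h_i, l):
    """Unit ray direction and range from a camera at z_cam to its wall point P.

    d_i, h_i : calibrated x- and z-coordinates of the projected point on the grid wall
    l        : distance from the cameras to the wall along y (wall is the plane y = l)
    """
    p = np.array([d_i, l, h_i])        # wall point P measured by this camera
    t = np.linalg.norm(p - z_cam)      # |ZP|, equal to h_i * csc(beta) in the text
    d_hat = (p - z_cam) / t            # unit vector along Z -> P
    return d_hat, t, p

# Hypothetical example: camera 1 at the origin, camera 2 offset by d along x.
d, l = 0.5, 2.0                        # assumed camera separation and wall distance
z1 = np.zeros(3)
z2 = np.array([d, 0.0, 0.0])
d1_hat, t1, p1 = ray_to_wall_point(z1, d_i=0.6, h_i=0.8, l=l)
d2_hat, t2, p2 = ray_to_wall_point(z2, d_i=0.3, h_i=0.7, l=l)
```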

3.2. The Relationship between the 3D Spatial Position of Debris and the 2D Stereopairs’ Positions

Consider compact debris, i.e., a point object. The camera rays intersect at the true location of the debris in 3D space, with position vector $\mathbf{z}_1 + s_1\mathbf{d}_1 = \mathbf{z}_2 + s_2\mathbf{d}_2$. This leads to the following equations:
$s_1\mathbf{d}_1 - s_2\mathbf{d}_2 = \mathbf{z}_2 - \mathbf{z}_1$
$[\mathbf{d}_1, -\mathbf{d}_2]\begin{bmatrix} s_1 \\ s_2 \end{bmatrix} = \mathbf{z}_2 - \mathbf{z}_1.$
This is an overdetermined system of three equations in two unknowns, $s_1$ and $s_2$. Let $M = [\mathbf{d}_1, -\mathbf{d}_2]$, which is a $3\times 2$ matrix. Since a $3\times 2$ matrix does not have an inverse, we exploit the generalized inverse, denoted by $M^+$, to determine the least-squares-error solution. Now $M^T = \begin{bmatrix}\mathbf{d}_1^T \\ -\mathbf{d}_2^T\end{bmatrix}$ is a $2\times 3$ matrix, so we form the $2\times 2$ matrix $M^TM$:
$M^TM = \begin{bmatrix}\mathbf{d}_1^T \\ -\mathbf{d}_2^T\end{bmatrix}[\mathbf{d}_1, -\mathbf{d}_2] = \begin{bmatrix}\mathbf{d}_1^T\mathbf{d}_1 & -\mathbf{d}_1^T\mathbf{d}_2 \\ -\mathbf{d}_2^T\mathbf{d}_1 & \mathbf{d}_2^T\mathbf{d}_2\end{bmatrix} = \begin{bmatrix}\mathbf{d}_1\cdot\mathbf{d}_1 & -\mathbf{d}_1\cdot\mathbf{d}_2 \\ -\mathbf{d}_1\cdot\mathbf{d}_2 & \mathbf{d}_2\cdot\mathbf{d}_2\end{bmatrix}.$
Since $\mathbf{d}_1$ and $\mathbf{d}_2$ are unit vectors, let $k = \mathbf{d}_1\cdot\mathbf{d}_2 = \cos(\theta)$, where $\theta$ is the angle between $\mathbf{d}_1$ and $\mathbf{d}_2$. Then,
$M^TM = \begin{bmatrix}1 & -k \\ -k & 1\end{bmatrix},\quad \det(M^TM) = 1 - k^2$
$M^+ = (M^TM)^{-1}M^T = \frac{1}{1 - k^2}\begin{bmatrix}1 & k \\ k & 1\end{bmatrix}\begin{bmatrix}\mathbf{d}_1^T \\ -\mathbf{d}_2^T\end{bmatrix},\quad M^+ = \frac{1}{1 - k^2}\begin{bmatrix}\mathbf{d}_1^T - k\,\mathbf{d}_2^T \\ k\,\mathbf{d}_1^T - \mathbf{d}_2^T\end{bmatrix}.$
Then,
$\begin{bmatrix} s_1 \\ s_2 \end{bmatrix} = [\mathbf{d}_1, -\mathbf{d}_2]^+(\mathbf{z}_2 - \mathbf{z}_1) = \frac{1}{1 - k^2}\begin{bmatrix}\mathbf{d}_1^T - k\,\mathbf{d}_2^T \\ k\,\mathbf{d}_1^T - \mathbf{d}_2^T\end{bmatrix}(\mathbf{z}_2 - \mathbf{z}_1).$
This form is useful when the camera direction changes while the location is fixed; otherwise, it can be further simplified to
$\begin{bmatrix} s_1 \\ s_2 \end{bmatrix} = \frac{1}{1 - k^2}\begin{bmatrix}(\mathbf{d}_1^T - k\,\mathbf{d}_2^T)(\mathbf{z}_2 - \mathbf{z}_1) \\ (k\,\mathbf{d}_1^T - \mathbf{d}_2^T)(\mathbf{z}_2 - \mathbf{z}_1)\end{bmatrix}.$
We computed the vector forms $\mathbf{z}_1 + t_1\mathbf{d}_1$ and $\mathbf{z}_2 + t_2\mathbf{d}_2$ for the input points on the wall. Now we have computed $[s_1, s_2]^T$ for the corresponding object in 3D space; the actual object is at $\mathbf{z}_1 + s_1\mathbf{d}_1 = \mathbf{z}_2 + s_2\mathbf{d}_2$ for these $s_1$ and $s_2$. This suggests that the object size at $\mathbf{z}_1 + t_1\mathbf{d}_1$ can be scaled by $s_1/t_1$, with $s_1$ computed above, along direction $\mathbf{d}_1$ to obtain the actual size at $\mathbf{z}_1 + s_1\mathbf{d}_1$ in 3D space. Similarly, for camera 2, the object size at $\mathbf{z}_2 + t_2\mathbf{d}_2$ is scaled by $s_2/t_2$ ($s_2$ computed above) along direction $\mathbf{d}_2$. The rationale for scaling an object at $P_1$ by a factor of $s_1/t_1$ along the line $Z_1P_1$ is as follows:
$\mathbf{z}_1 + \frac{s_1}{t_1}\overrightarrow{Z_1P_1} = \mathbf{z}_1 + \frac{s_1}{t_1}(\mathbf{p}_1 - \mathbf{z}_1) = \mathbf{z}_1 + \frac{s_1}{t_1}(\mathbf{z}_1 + t_1\mathbf{d}_1 - \mathbf{z}_1) = \mathbf{z}_1 + \frac{s_1}{t_1}(t_1\mathbf{d}_1) = \mathbf{z}_1 + s_1\mathbf{d}_1.$
Similarly, for camera 2,
$\mathbf{z}_2 + \frac{s_2}{t_2}\overrightarrow{Z_2P_2} = \mathbf{z}_2 + s_2\mathbf{d}_2.$
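The generalized-inverse solution above translates directly into a few lines of NumPy, as sketched below. The function name triangulate and the final midpoint step (a practical choice when measurement noise keeps the two rays from intersecting exactly) are our own additions rather than part of the paper's derivation; np.linalg.pinv applied to the 3 × 2 matrix $M$ would return the same $s_1$, $s_2$.

```python
import numpy as np

def triangulate(z1, d1_hat, z2, d2_hat):
    """Least-squares reconstruction of a point-debris position from two camera rays.

    Solves [d1_hat, -d2_hat] [s1, s2]^T = z2 - z1 with the generalized inverse,
    written out explicitly with k = d1_hat . d2_hat.
    """
    b = z2 - z1
    k = d1_hat @ d2_hat                 # cosine of the angle between the rays
    denom = 1.0 - k * k                 # det(M^T M); rays must not be parallel
    s1 = (d1_hat - k * d2_hat) @ b / denom
    s2 = (k * d1_hat - d2_hat) @ b / denom
    q1 = z1 + s1 * d1_hat               # closest point on ray 1
    q2 = z2 + s2 * d2_hat               # closest point on ray 2
    return 0.5 * (q1 + q2), s1, s2      # midpoint as the practical 3D estimate

# Example usage with ray directions such as those from the sketch in Section 3.1:
# p_debris, s1, s2 = triangulate(z1, d1_hat, z2, d2_hat)
```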

3.3. Explicit Expressions of Debris Position in 3D Space

Once $s_1$ is determined, we can compute the desired position parameters x, y, and z in Figure 1. Now $s_1\mathbf{d}_1$ becomes
$s_1\mathbf{d}_1 = [z\cot(\beta_1)\cos(\theta_1),\ z\cot(\beta_1)\sin(\theta_1),\ z]^T.$
Since $\mathbf{d}_1$ is a unit vector, this implies that
$s_1 = z\csc(\beta_1),\quad z = s_1\sin(\beta_1),\quad r_1 = s_1\cos(\beta_1)$
$x = r_1\cos(\theta_1) = s_1\cos(\beta_1)\cos(\theta_1),\quad y = r_1\sin(\theta_1) = s_1\cos(\beta_1)\sin(\theta_1).$
Similarly, for camera 2,
$s_2 = z\csc(\beta_2),\quad z = s_2\sin(\beta_2),\quad r_2 = s_2\cos(\beta_2)$
$x = d - r_2\cos(\theta_2) = d - s_2\cos(\beta_2)\cos(\theta_2),\quad y = r_2\sin(\theta_2) = s_2\cos(\beta_2)\sin(\theta_2).$
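A minimal sketch of the camera-1 expressions, assuming camera 1 sits at the origin as in Figure 1 (the function name is hypothetical and the angles are in radians):

```python
import numpy as np

def debris_xyz_from_camera1(s1, beta1, theta1):
    """Explicit debris coordinates (x, y, z) from camera 1, assumed at the origin."""
    z = s1 * np.sin(beta1)              # height of the debris
    r1 = s1 * np.cos(beta1)             # horizontal range from Z1
    return np.array([r1 * np.cos(theta1), r1 * np.sin(theta1), z])
```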

3.4. Cartesian Position and Orientation for Thin Plate Debris

Now we derive the position and orientation for rigid body thin plate debris, based on the 3D positions of the point objects constructed from the stereo images in the previous section. In Figure 2, let C be the centroid (also called the pivot) of the rigid body plate with vertices Q 1 , Q 2 , Q 3 , Q 4 , Q 5 , Q 6 , Q 7 , and Q 8 . Assume that the vertices of the debris are identifiable in the stereo images using marker tracking. The 3D positions of the vertices can then be determined using the expressions in the previous section based on the stereo measurements. The problem now becomes using the 3D positions of vertices to determine the orientation (three-rotational-DOF) of the rigid body.
Firstly, a local coordinate system needs to be defined using any three non-collinear points on the debris. Here we use the points $C$, $Q_1$, and $Q_2$ to create a local coordinate system centered at $C$ with axis directions given by the unit vectors $\mathbf{u}$, $\mathbf{v}$, and $\mathbf{w}$, where $\mathbf{u}$ is along $CQ_1$; $\mathbf{v}$ is orthogonal to $CQ_1$, lies in the plane of $C$, $Q_1$, and $Q_2$, and has a positive projection of $CQ_2$ on it; and $\mathbf{w} = \mathbf{u}\times\mathbf{v}$, so that $\mathbf{u}$, $\mathbf{v}$, and $\mathbf{w}$ form a right-handed system of orthonormal vectors:
$\mathbf{u} = \frac{\overrightarrow{CQ_1}}{|\overrightarrow{CQ_1}|},\quad \mathbf{v} = \frac{\overrightarrow{CQ_2} - (\mathbf{u}\cdot\overrightarrow{CQ_2})\mathbf{u}}{|\overrightarrow{CQ_2} - (\mathbf{u}\cdot\overrightarrow{CQ_2})\mathbf{u}|},\quad \mathbf{w} = \mathbf{u}\times\mathbf{v}$
(equivalently, $\mathbf{w} = \frac{\mathbf{u}\times\overrightarrow{CQ_2}}{|\mathbf{u}\times\overrightarrow{CQ_2}|}$ and $\mathbf{v} = \mathbf{w}\times\mathbf{u}$).
The matrix $R = [\mathbf{u}\ \mathbf{v}\ \mathbf{w}]$ of axis unit vectors is the rotation matrix and defines the 3D local coordinate system at $C$. To simplify the notation for the time history of the plate trajectory determined from vertices reconstructed using stereopairs, the 3D plate is represented by a frame $F(t)$ at time $t$ that contains both the position and the orientation of the plate. Since there is a frame at each time point of the trajectory, the frame sequence will later be denoted by $F_k$, $k = 1, 2, \ldots, N$. This is similar to the frames of an $N$-link robotic system.
The matrix $F = \begin{bmatrix} R & \mathbf{c} \\ \mathbf{0} & 1 \end{bmatrix}$ represents the local coordinate system centered at $C$. The frame can be written interchangeably as
$F = \begin{bmatrix} R & \mathbf{c} \\ \mathbf{0} & 1 \end{bmatrix} = \begin{bmatrix} \mathbf{u} & \mathbf{v} & \mathbf{w} & \mathbf{c} \\ 0 & 0 & 0 & 1 \end{bmatrix} = \begin{bmatrix} u_1 & v_1 & w_1 & c_1 \\ u_2 & v_2 & w_2 & c_2 \\ u_3 & v_3 & w_3 & c_3 \\ 0 & 0 & 0 & 1 \end{bmatrix}.$
This matrix $F$ transforms $\begin{bmatrix} \mathbf{i} & \mathbf{j} & \mathbf{k} & \mathbf{0} \\ 0 & 0 & 0 & 1 \end{bmatrix}$ to $\begin{bmatrix} \mathbf{u} & \mathbf{v} & \mathbf{w} & \mathbf{c} \\ 0 & 0 & 0 & 1 \end{bmatrix}$. The inverse of $F$ transforms the coordinate system $\begin{bmatrix} \mathbf{u} & \mathbf{v} & \mathbf{w} & \mathbf{c} \\ 0 & 0 & 0 & 1 \end{bmatrix}$, located at the centroid $C$ of the debris, back to the universal coordinate system $\begin{bmatrix} \mathbf{i} & \mathbf{j} & \mathbf{k} & \mathbf{0} \\ 0 & 0 & 0 & 1 \end{bmatrix}$. The inverse of $F$ can be written as follows:
$F^{-1} = \begin{bmatrix} R^T & -R^T\mathbf{c} \\ \mathbf{0} & 1 \end{bmatrix} = \begin{bmatrix} \mathbf{u}^T & -\mathbf{u}^T\mathbf{c} \\ \mathbf{v}^T & -\mathbf{v}^T\mathbf{c} \\ \mathbf{w}^T & -\mathbf{w}^T\mathbf{c} \\ \mathbf{0} & 1 \end{bmatrix} = \begin{bmatrix} u_1 & u_2 & u_3 & -(u_1c_1 + u_2c_2 + u_3c_3) \\ v_1 & v_2 & v_3 & -(v_1c_1 + v_2c_2 + v_3c_3) \\ w_1 & w_2 & w_3 & -(w_1c_1 + w_2c_2 + w_3c_3) \\ 0 & 0 & 0 & 1 \end{bmatrix}.$
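The frame $F$ and its closed-form inverse can be assembled from the centroid and two vertices as in the following NumPy sketch. The function names build_frame and invert_frame are hypothetical, and the vertex positions are assumed to come from the stereo reconstruction of the previous section.

```python
import numpy as np

def build_frame(c, q1, q2):
    """4x4 homogeneous frame F = [[R, c], [0, 1]] at centroid c, from vertices q1, q2."""
    u = q1 - c
    u = u / np.linalg.norm(u)
    w = np.cross(u, q2 - c)                 # normal to the plane of C, Q1, Q2
    w = w / np.linalg.norm(w)
    v = np.cross(w, u)                      # completes the right-handed triad (u, v, w)
    F = np.eye(4)
    F[:3, :3] = np.column_stack((u, v, w))  # rotation submatrix R = [u v w]
    F[:3, 3] = c                            # translation: centroid position
    return F

def invert_frame(F):
    """Closed-form inverse [[R^T, -R^T c], [0, 1]] of a homogeneous frame."""
    R, c = F[:3, :3], F[:3, 3]
    Fi = np.eye(4)
    Fi[:3, :3] = R.T
    Fi[:3, 3] = -R.T @ c
    return Fi
```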

3.5. Three-Degree-of-Freedom (3-DOF) Frames in Universal Coordinate System

Here we describe the relationship between the local coordinate system and the axis and amount of rotation in the universal coordinate system. The local coordinate system at the centroid of the plate is a frame
$F = \begin{bmatrix} R & \mathbf{c} \\ \mathbf{0} & 1 \end{bmatrix} = \begin{bmatrix} \mathbf{u} & \mathbf{v} & \mathbf{w} & \mathbf{c} \\ 0 & 0 & 0 & 1 \end{bmatrix},$
where $R$ is the rotation matrix about an axis, which is a composite of rotations about the x, y, and z axes. If $\mathbf{n}$ is the unit vector along the axis through the origin and $\theta$ is the angle of counterclockwise rotation about $\mathbf{n}$, then [20]
$R = \mathbf{n}\mathbf{n}^T + (I - \mathbf{n}\mathbf{n}^T)\cos(\theta) + \mathbf{n}\times I\sin(\theta).$
This is directly related to the matrix representation $R = [\mathbf{u}\ \mathbf{v}\ \mathbf{w}]$. The angle $\theta$ and vector $\mathbf{n}$ are given by
$\theta = \cos^{-1}\!\left(\frac{u_1 + v_2 + w_3 - 1}{2}\right)$
$\mathbf{n} = \begin{bmatrix} n_1 \\ n_2 \\ n_3 \end{bmatrix} = \begin{bmatrix} v_3 - w_2 \\ w_1 - u_3 \\ u_2 - v_1 \end{bmatrix}\bigg/\big(2\sin(\theta)\big).$
The rotation axis and the amount of rotation about $\mathbf{n}$ are together denoted by $\theta\mathbf{n}$, giving the 3-DOF rotation, i.e., the components of rotation about the x, y, and z axes in the universal coordinate system.
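A short sketch of this axis–angle extraction is given below. The clipping of the arccosine argument is a numerical safeguard we add, not part of the derivation, and the formula degenerates when $\sin(\theta)$ approaches zero.

```python
import numpy as np

def axis_angle(R):
    """Rotation angle theta and unit axis n recovered from R = [u v w]."""
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    n = np.array([R[2, 1] - R[1, 2],    # v3 - w2
                  R[0, 2] - R[2, 0],    # w1 - u3
                  R[1, 0] - R[0, 1]]) / (2.0 * np.sin(theta))
    return theta, n
```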

3.6. Differential Operators for 6-DOF Motion

To facilitate the determination of the velocity, differential operators for 6-DOF motion with respect to the universal coordinate system are derived in this section. Let $dF = F_{k+1} - F_k$; explicitly, the 3-DOF translational and 3-DOF rotational differentials are
$dF = \begin{bmatrix} dR & d\mathbf{c} \\ \mathbf{0} & 0 \end{bmatrix},$
where $dR$ and $d\mathbf{c}$ are the differential changes in rotation ($R$) and translation ($\mathbf{c}$). The differential $d\mathbf{c}$ is a linear differential in the x, y, and z coordinates of $\mathbf{c}$; it is straightforward. However, the differential change $dR$ in the rotation is not as simple, because the differential of the rotation matrix is a matrix operator. Recall that the matrix $R$ [21] is
$R = \mathbf{n}\mathbf{n}^T + (I - \mathbf{n}\mathbf{n}^T)\cos(\theta) + \mathbf{n}\times I\sin(\theta).$
Differentiating with respect to $\theta$, we have
$dR/d\theta = -(I - \mathbf{n}\mathbf{n}^T)\sin(\theta) + \mathbf{n}\times I\cos(\theta).$
This derivative of the rotation matrix can be expressed as a matrix multiplication, i.e., a differential operator [21]. This is accomplished by cross-multiplying the expression for $R$ above by $\mathbf{n}$ as follows:
$\mathbf{n}\times R = \mathbf{n}\times\big(\mathbf{n}\mathbf{n}^T + (I - \mathbf{n}\mathbf{n}^T)\cos(\theta) + \mathbf{n}\times I\sin(\theta)\big).$
Using vector manipulation identities, this expression simplifies as follows:
$\mathbf{n}\times R = \mathbf{n}\times\mathbf{n}\mathbf{n}^T + \mathbf{n}\times(I - \mathbf{n}\mathbf{n}^T)\cos(\theta) + \mathbf{n}\times(\mathbf{n}\times I)\sin(\theta) = \mathbf{0} + (\mathbf{n}\times I - \mathbf{0})\cos(\theta) + (\mathbf{n}\mathbf{n}^T - I)\sin(\theta) = \mathbf{n}\times I\cos(\theta) - (I - \mathbf{n}\mathbf{n}^T)\sin(\theta),$
which is identical to $dR/d\theta$ as obtained above.
Recalling that the amount of rotation in the universal coordinate system is represented by $\theta\mathbf{n}$, the differential becomes $d\theta\,\mathbf{n}$, which yields the rotations about the x, y, and z axes. This is consistent with the differential operator derived from the matrix:
$dR/d\theta = \mathbf{n}\times R = (\mathbf{n}\times I)R,$
$dR = (\mathbf{n}\times R)\,d\theta = (\mathbf{n}\times I)R\,d\theta = (\mathbf{n}\times I)\,d\theta\,R = (\mathbf{n}\,d\theta\times I)R = \Delta R,$
where $\Delta$ is the differential operator defined below. Thus, the differential $dR$ of $R$ is a matrix multiplication of $R$ by a matrix, called the differential operator and denoted by $\Delta$:
$\Delta = \mathbf{n}\,d\theta\times I = \begin{bmatrix} 0 & -d\theta\,n_3 & d\theta\,n_2 \\ d\theta\,n_3 & 0 & -d\theta\,n_1 \\ -d\theta\,n_2 & d\theta\,n_1 & 0 \end{bmatrix},\quad\text{or simply}\quad \begin{bmatrix} 0 & -d\theta_z & d\theta_y \\ d\theta_z & 0 & -d\theta_x \\ -d\theta_y & d\theta_x & 0 \end{bmatrix}.$
In summary, $\Delta$ is the differential operator that represents the differential change from $R_k$ to $R_{k+1}$ with respect to the universal coordinate system, where $R_k$ is the rotational submatrix of frame $F_k$; its entries are the angular changes $d\theta_x$, $d\theta_y$, and $d\theta_z$ about the x, y, and z axes. In short, the rotational differential is $[d\theta_x\ d\theta_y\ d\theta_z]^T$ and the translational differential is $[dx\ dy\ dz]^T$. The composite differential is the 6-DOF vector $[dx\ dy\ dz\ d\theta_x\ d\theta_y\ d\theta_z]^T$, a composite vector of the translational and rotational changes about the x, y, and z axes, similar to the Jacobian notation used for link velocities in robotics.
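Numerically, $\Delta$ is simply the skew-symmetric (cross-product) matrix of the rotation vector $\mathbf{n}\,d\theta$, as the following sketch illustrates (the function names are hypothetical):

```python
import numpy as np

def skew(a):
    """Skew-symmetric matrix a x I, so that skew(a) @ x == np.cross(a, x)."""
    return np.array([[0.0,  -a[2],  a[1]],
                     [a[2],  0.0,  -a[0]],
                     [-a[1], a[0],  0.0]])

def rotation_differential_operator(n, dtheta):
    """Differential operator Delta = (n dtheta) x I, so that dR = Delta @ R."""
    return skew(n * dtheta)
```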

4. 6-DOF Motion Consideration

4.1. Motivation

It is not necessary to know the initial velocity in this case; only the differential change matters. At any time $t$, from the position of the plate, we obtain the orientation from the frame $F(t)$. During the motion, some features that are invisible to the cameras may become visible, while some visible features may become invisible. We therefore need to identify at least three vertices that remain flagged at all times for calculating the local coordinate system and the rotational motion, in addition to the translational motion.
As the debris moves, frame $F(t)$ at time $t$ moves to the next frame $F(t+h)$ with time step $h$; the differential change in the frame, denoted by $dF = F(t+h) - F(t)$, can be used to derive the numerical instantaneous translational and rotational velocities. If the debris flies behind a building (i.e., it is temporarily occluded), the average velocity can be derived from the visible positions (after the time lag of the hidden period) or continuously from the flagged features identified earlier. In such cases, we may opt for one continuous trajectory or for multiple trajectory segments.

4.2. The Relationship between Differential Transformations in Universal and Local Coordinate Systems

4.2.1. General Transformation

A transformation can be expressed with respect to either of two coordinate systems: the universal coordinate system and the local coordinate system. In general, if $T$ is the transformation of frame $F$ with respect to the universal coordinate system and $T'$ is its counterpart in the local coordinate system, they are related by the equation $T = FT'F^{-1}$. If a point or matrix $X$ has coordinates with respect to frame $F$, then $T(F(X)) = F(T'(X))$. Equivalently, if $Y$ has coordinates with respect to the universal frame, then $T(Y) = F(T'(F^{-1}(Y)))$. Thus, $T$ and $T'$ accomplish the same task from different coordinate systems and are related by the equations $T = FT'F^{-1}$ and $T' = F^{-1}TF$.
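A minimal sketch of these similarity relations, with hypothetical helper names, is given below; np.linalg.inv could equally be replaced by the closed-form frame inverse from Section 3.4.

```python
import numpy as np

def to_universal(F, T_local):
    """Express a local-frame transformation T' in the universal frame: T = F T' F^-1."""
    return F @ T_local @ np.linalg.inv(F)

def to_local(F, T_universal):
    """Express a universal-frame transformation T in the local frame: T' = F^-1 T F."""
    return np.linalg.inv(F) @ T_universal @ F
```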

4.2.2. Rotational Differential Transform with Respect to Universal Coordinate System

We have shown that for the rotation matrix $R$, the rotation differential operator with respect to the universal coordinate system is $\Delta = \mathbf{n}\,d\theta\times I$. Let $\boldsymbol{\omega} = \mathbf{n}\,d\theta$ be called the angular differential; then $\Delta = \boldsymbol{\omega}\times I$.

4.2.3. Differential Transform in Universal Coordinate System

The translational and rotational velocities are first computed in the universal coordinate system.
Definition 1.
For $F = \begin{bmatrix} R & \mathbf{c} \\ \mathbf{0} & 1 \end{bmatrix}$ and the universal differential transformation matrix $T$, the differential change in frames is defined as follows:
$dF = TF = \begin{bmatrix} dR & d\mathbf{c} \\ \mathbf{0} & 0 \end{bmatrix} = \begin{bmatrix} \Delta R & d\mathbf{c} \\ \mathbf{0} & 0 \end{bmatrix},$
where $\Delta = \boldsymbol{\omega}\times I$, $\boldsymbol{\omega} = \mathbf{n}\,d\theta$, and $d\mathbf{c} = [dx\ dy\ dz]^T$.
Now we define $T$ as the differential operator with respect to the universal coordinate system. Since $dF = TF$ implies $T = dF\,F^{-1}$, we obtain $T = \begin{bmatrix} \Delta & d\mathbf{c} - \Delta\mathbf{c} \\ \mathbf{0} & 0 \end{bmatrix}$.
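Numerically, $T$ can be obtained directly from two consecutive frames, as in the following sketch (the function name is hypothetical):

```python
import numpy as np

def universal_differential_operator(F, F_next):
    """T such that dF = T F, computed directly as (F_next - F) F^-1.

    Its blocks should match [[Delta, dc - Delta c], [0, 0]] from Definition 1.
    """
    dF = F_next - F
    return dF @ np.linalg.inv(F)
```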

4.2.4. Differential Transform in Local Coordinate System

For studying the different flight modes of a plate, it is desirable to have the velocities also available in the local coordinate system, which is conceptually more natural for this purpose. Before deriving the differential transformation in the local coordinate system, we first show a few auxiliary results.
  • Show that the rotation differential operator with respect to the local frame is $\Delta' = R^T\Delta R$, such that
    $\Delta' = \begin{bmatrix} \boldsymbol{\omega}\cdot\mathbf{u} \\ \boldsymbol{\omega}\cdot\mathbf{v} \\ \boldsymbol{\omega}\cdot\mathbf{w} \end{bmatrix}\times I.$
    Proof. 
    $\Delta' = R^T\Delta R = R^T(\boldsymbol{\omega}\times I)R = R^T(\boldsymbol{\omega}\times R) = R^T[\boldsymbol{\omega}\times\mathbf{u},\ \boldsymbol{\omega}\times\mathbf{v},\ \boldsymbol{\omega}\times\mathbf{w}] = [R^T(\boldsymbol{\omega}\times\mathbf{u}),\ R^T(\boldsymbol{\omega}\times\mathbf{v}),\ R^T(\boldsymbol{\omega}\times\mathbf{w})]$
    $= \begin{bmatrix} (\boldsymbol{\omega}\times\mathbf{u})\cdot\mathbf{u} & (\boldsymbol{\omega}\times\mathbf{v})\cdot\mathbf{u} & (\boldsymbol{\omega}\times\mathbf{w})\cdot\mathbf{u} \\ (\boldsymbol{\omega}\times\mathbf{u})\cdot\mathbf{v} & (\boldsymbol{\omega}\times\mathbf{v})\cdot\mathbf{v} & (\boldsymbol{\omega}\times\mathbf{w})\cdot\mathbf{v} \\ (\boldsymbol{\omega}\times\mathbf{u})\cdot\mathbf{w} & (\boldsymbol{\omega}\times\mathbf{v})\cdot\mathbf{w} & (\boldsymbol{\omega}\times\mathbf{w})\cdot\mathbf{w} \end{bmatrix} = \begin{bmatrix} 0 & -\boldsymbol{\omega}\cdot\mathbf{w} & \boldsymbol{\omega}\cdot\mathbf{v} \\ \boldsymbol{\omega}\cdot\mathbf{w} & 0 & -\boldsymbol{\omega}\cdot\mathbf{u} \\ -\boldsymbol{\omega}\cdot\mathbf{v} & \boldsymbol{\omega}\cdot\mathbf{u} & 0 \end{bmatrix} = \begin{bmatrix} \boldsymbol{\omega}\cdot\mathbf{u} \\ \boldsymbol{\omega}\cdot\mathbf{v} \\ \boldsymbol{\omega}\cdot\mathbf{w} \end{bmatrix}\times I.$
     □
  • Show that the linear differential $d\mathbf{c}' = R^T d\mathbf{c}$ in the local frame simplifies to
    $d\mathbf{c}' = \begin{bmatrix} d\mathbf{c}\cdot\mathbf{u} \\ d\mathbf{c}\cdot\mathbf{v} \\ d\mathbf{c}\cdot\mathbf{w} \end{bmatrix}.$
    Proof. 
    $d\mathbf{c}' = R^T d\mathbf{c} = \begin{bmatrix} \mathbf{u}^T \\ \mathbf{v}^T \\ \mathbf{w}^T \end{bmatrix}d\mathbf{c} = \begin{bmatrix} \mathbf{u}^T d\mathbf{c} \\ \mathbf{v}^T d\mathbf{c} \\ \mathbf{w}^T d\mathbf{c} \end{bmatrix} = \begin{bmatrix} \mathbf{u}\cdot d\mathbf{c} \\ \mathbf{v}\cdot d\mathbf{c} \\ \mathbf{w}\cdot d\mathbf{c} \end{bmatrix} = \begin{bmatrix} d\mathbf{c}\cdot\mathbf{u} \\ d\mathbf{c}\cdot\mathbf{v} \\ d\mathbf{c}\cdot\mathbf{w} \end{bmatrix}.$
     □
  • Differential operator with respect to local frame coordinate system.
    Theorem 1.
    The frame differential transformation matrix $T'$ in the local coordinate system is
    $T' = \begin{bmatrix} \Delta' & d\mathbf{c}' \\ \mathbf{0} & 0 \end{bmatrix},$
    where $\Delta' = \begin{bmatrix} \boldsymbol{\omega}\cdot\mathbf{u} \\ \boldsymbol{\omega}\cdot\mathbf{v} \\ \boldsymbol{\omega}\cdot\mathbf{w} \end{bmatrix}\times I$ and $d\mathbf{c}' = \begin{bmatrix} d\mathbf{c}\cdot\mathbf{u} \\ d\mathbf{c}\cdot\mathbf{v} \\ d\mathbf{c}\cdot\mathbf{w} \end{bmatrix}$.
    Proof. 
    From $T' = F^{-1}TF$ (Section 4.2.1) and $TF = \begin{bmatrix} \Delta R & d\mathbf{c} \\ \mathbf{0} & 0 \end{bmatrix}$ (Definition 1), we get
    $T' = F^{-1}(TF) = \begin{bmatrix} R^T & -R^T\mathbf{c} \\ \mathbf{0} & 1 \end{bmatrix}\begin{bmatrix} \Delta R & d\mathbf{c} \\ \mathbf{0} & 0 \end{bmatrix} = \begin{bmatrix} R^T\Delta R & R^T d\mathbf{c} \\ \mathbf{0} & 0 \end{bmatrix} = \begin{bmatrix} \Delta' & d\mathbf{c}' \\ \mathbf{0} & 0 \end{bmatrix}\quad(\text{recall that } \Delta' = R^T\Delta R \text{ and } d\mathbf{c}' = R^T d\mathbf{c}).$
    Hence, the theorem is proved. □
Finally, the 6-DOF local translational and rotational velocities in the local frame are derived from the differential transformation as follows:
$T'F = \begin{bmatrix} \Delta' & d\mathbf{c}' \\ \mathbf{0} & 0 \end{bmatrix}\begin{bmatrix} R & \mathbf{c} \\ \mathbf{0} & 1 \end{bmatrix} = \begin{bmatrix} \Delta'R & \Delta'\mathbf{c} + d\mathbf{c}' \\ \mathbf{0} & 0 \end{bmatrix}.$
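A corresponding sketch for the local-frame operator $T'$ follows directly from Theorem 1; the function name is hypothetical, and the rotation differential operator is recovered here as $R^T\,dR$, which equals $R^T\Delta R$ since $\Delta = dR\,R^T$ for an orthogonal $R$.

```python
import numpy as np

def local_differential_operator(F, F_next):
    """T' with blocks [[Delta', dc'], [0, 0]], where Delta' = R^T Delta R, dc' = R^T dc."""
    R = F[:3, :3]
    dF = F_next - F
    dR, dc = dF[:3, :3], dF[:3, 3]
    T_loc = np.zeros((4, 4))
    T_loc[:3, :3] = R.T @ dR            # R^T (dR R^T) R = R^T dR = Delta'
    T_loc[:3, 3] = R.T @ dc             # dc expressed on the local axes (u, v, w)
    return T_loc
```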

5. 6-DOF Motion Trajectory with Velocity

5.1. Velocity

We have determined the 6-DOF frame for position and orientation, as well as the differentials for translational and rotational motion of the debris. High-speed, high-resolution cameras are fast enough to provide data on very small intervals of time. Assuming that the time history of 6-DOF frames has been reconstructed according to Section 3, we now determine the numerical derivatives from this sequence of 3D temporal positions and orientations (i.e., displacements) of the debris using the differentials derived in Section 4. This gives the translational and rotational velocities from the numerical time histories of the displacements.

5.2. Experimenting with 6-DOF Position and Orientation and 6-DOF Motion Methods

For the experiments, the data structure of the debris is represented as a thin $m\times n\times q$ plate, where $m\times n$ is a polygonal face shape and $q$ is the thickness of the plate. The vertices of the plate are represented by a matrix array $deb = deb(x, y, z)$, where $x = x(1{:}m)$, $y = y(1{:}n)$, $z = z(1{:}q)$. For example, in Figure 2, for a simple rectangular $4\times 4\times 2$ plate, we may define $z(1) = 0$, $z(2) = 2$, $x(1{:}4) = [1, 50, 50, 1]$, $y(1{:}4) = [1, 1, 50, 50]$; thus, one flat face is $face_1 = \{(x(k), y(k), z(1));\ k = 1, 2, 3, 4\}$, and the second face is $face_2 = \{(x(k), y(k), z(2));\ k = 1, 2, 3, 4\}$.
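The same plate can be written down in a few lines of NumPy; the sketch below mirrors the 4 × 4 × 2 example above (the array names deb, face1, and face2 follow the text, while the use of NumPy itself is our own choice):

```python
import numpy as np

# A hypothetical 4 x 4 x 2 plate as described above: two parallel quadrilateral
# faces (vertices Q1..Q4 and Q5..Q8 in Figure 2) separated by the thickness in z.
x = np.array([1.0, 50.0, 50.0, 1.0])
y = np.array([1.0, 1.0, 50.0, 50.0])
z = np.array([0.0, 2.0])

face1 = np.column_stack((x, y, np.full(4, z[0])))   # vertices Q1..Q4
face2 = np.column_stack((x, y, np.full(4, z[1])))   # vertices Q5..Q8
deb = np.vstack((face1, face2))                     # 8 x 3 vertex array
centroid = deb.mean(axis=0)                         # pivot C of the plate
```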

5.3. Implementation Procedure for Debris Tracking with Both Displacement and Velocity Time Histories Determined

Conceptual algorithms and implementation go hand in hand. From the stereopairs of images captured by the two cameras, we compute the 3D spatial positions of the flagged points on the debris. Then, using three non-collinear points (e.g., the centroid and two vertices), we construct a frame matrix $F = \begin{bmatrix} R & \mathbf{c} \\ \mathbf{0} & 1 \end{bmatrix}$ to represent the position and orientation of the debris. Here $R = [\mathbf{u}\ \mathbf{v}\ \mathbf{w}]$ represents the rotation about the unit vector $\mathbf{n}$ through the origin by angle $\theta$, where $\theta = \cos^{-1}\!\left(\frac{u_1 + v_2 + w_3 - 1}{2}\right)$ and $\mathbf{n} = [v_3 - w_2,\ w_1 - u_3,\ u_2 - v_1]^T/(2\sin(\theta))$. $F$ changes its state over time as the debris moves. The motion trajectory is then captured by successive camera frames, from $F(t)$ to $F(t+h)$, where the time step $h$ depends on the camera speed. For the translational and rotational velocity, we calculate the differential of frame $F$ as $dF = F(t+h) - F(t) = \begin{bmatrix} dR & d\mathbf{c} \\ \mathbf{0} & 0 \end{bmatrix}$, where $d\mathbf{c}$ is the differential of the centroids of frame $F(t)$ and the subsequent frame $F(t+h)$, and $dR$ is the differential of the orientations of frame $F(t)$ and the subsequent frame $F(t+h)$. We have seen that $dR = \Delta R$ with $\Delta = \boldsymbol{\omega}\times I$, $\boldsymbol{\omega} = \mathbf{n}\,d\theta$, and $d\theta = \theta(t+h) - \theta(t)$, where $\theta = \cos^{-1}\!\left(\frac{u_1 + v_2 + w_3 - 1}{2}\right)$ is evaluated at the corresponding time instants.
Now $dF$ can be written in terms of the frame differential operator $T$ such that $dF = TF$, where $T = \begin{bmatrix} \Delta & d\mathbf{c} - \Delta\mathbf{c} \\ \mathbf{0} & 0 \end{bmatrix}$.
Now consider the trajectory of $N$ frames $F_k$, $k = 1, 2, \ldots, N$. Let the $k$th stereopair of debris vertices be denoted by $Q_k^{1i} = (x_i^1, y_i^1, z_i^1)$ and $Q_k^{2i} = (x_i^2, y_i^2, z_i^2)$, $i = 1, 2, \ldots, 8$.
The 3D spatial location corresponding to the points $Q_k^{1i}$ and $Q_k^{2i}$ is denoted by $P_k^i$, whose position vector is $\mathbf{p}_k^i$.
Let $\mathbf{p}_k^i$ be the position vectors of the vertices $P_k^i$, and let $\mathbf{c}_k$ be the centroid of the vertices $P_k^i$ for $i = 1, 2, \ldots, 8$.
Create frame $F_k$ with $\mathbf{u}_k$, $\mathbf{v}_k$, $\mathbf{w}_k$, $\mathbf{c}_k$ as follows:
$\mathbf{c}_k = \frac{1}{8}\sum_{i=1}^{8}\mathbf{p}_k^i,\quad \mathbf{u}_k = \mathbf{p}_k^1 - \mathbf{c}_k\ \text{(normalized: } \mathbf{u}_k = \mathbf{u}_k/|\mathbf{u}_k|\text{)},\quad \mathbf{w}_k = \mathbf{u}_k\times(\mathbf{p}_k^2 - \mathbf{c}_k)\ \text{(normalized: } \mathbf{w}_k = \mathbf{w}_k/|\mathbf{w}_k|\text{)}$
$\mathbf{v}_k = \mathbf{w}_k\times\mathbf{u}_k,\quad \theta_k = \cos^{-1}\!\left(\frac{u_{k1} + v_{k2} + w_{k3} - 1}{2}\right),\quad \mathbf{n}_k = [v_{k3} - w_{k2},\ w_{k1} - u_{k3},\ u_{k2} - v_{k1}]^T/(2\sin(\theta_k))$
$d\theta_k = \theta_{k+1} - \theta_k,\quad \boldsymbol{\omega}_k = \mathbf{n}_k\,d\theta_k,\quad \Delta_k = \boldsymbol{\omega}_k\times I.$
This creates a sequence of consecutive frames $F_k$, $k = 1, 2, \ldots, N$, representing the position and orientation of the debris at each time point. To derive the translational velocity of frame $F_k$, it is sufficient to know the linear differential change from $\mathbf{c}_k$ to $\mathbf{c}_{k+1}$, the centroids of the consecutive coordinate frames. The rotational velocity is more complex, as seen earlier. In the universal coordinate system:
The differential translational change is $d\mathbf{c}_k = \mathbf{c}_{k+1} - \mathbf{c}_k$.
The differential rotational change is $dR_k = \Delta_k R_k$, where $\Delta_k = \boldsymbol{\omega}_k\times I$.
The frame differential change is $dF_k = T_k F_k = \begin{bmatrix} \Delta_k R_k & d\mathbf{c}_k \\ \mathbf{0} & 0 \end{bmatrix}$, with
$T_k = \begin{bmatrix} \Delta_k & d\mathbf{c}_k - \Delta_k\mathbf{c}_k \\ \mathbf{0} & 0 \end{bmatrix}.$
With respect to the local coordinate system of frame $F_k$:
$T'_k = \begin{bmatrix} \Delta'_k & d\mathbf{c}'_k \\ \mathbf{0} & 0 \end{bmatrix}.$
The differential rotational change is $dR'_k = \Delta'_k R_k$, where
$\Delta'_k = \begin{bmatrix} \boldsymbol{\omega}_k\cdot\mathbf{u}_k \\ \boldsymbol{\omega}_k\cdot\mathbf{v}_k \\ \boldsymbol{\omega}_k\cdot\mathbf{w}_k \end{bmatrix}\times I.$
The differential translational change is $d\mathbf{c}'_k$, where
$d\mathbf{c}'_k = \begin{bmatrix} (\mathbf{c}_{k+1} - \mathbf{c}_k)\cdot\mathbf{u}_k \\ (\mathbf{c}_{k+1} - \mathbf{c}_k)\cdot\mathbf{v}_k \\ (\mathbf{c}_{k+1} - \mathbf{c}_k)\cdot\mathbf{w}_k \end{bmatrix}.$
The frame differential change is $T'_k F_k = \begin{bmatrix} \Delta'_k R_k & d\mathbf{c}'_k + \Delta'_k\mathbf{c}_k \\ \mathbf{0} & 0 \end{bmatrix}$.
This completes the derivation of the formulations for determining the 6-DOF position and orientation, along with the 6-DOF translational and rotational velocity, of the rigid-body debris. The full process of the proposed method for measuring the trajectory of plate-type debris is illustrated in Figure 3.
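As a summary, the sketch below strings the pieces together for a sequence of reconstructed vertex positions in the universal coordinate system, reusing the build_frame and axis_angle helpers sketched earlier. The function name track_plate, the input layout, and the division by the time step to form velocities are illustrative assumptions rather than the paper's prescribed implementation.

```python
import numpy as np

def track_plate(vertex_histories, h):
    """Sketch of the per-frame tracking procedure (universal coordinate system).

    vertex_histories : array of shape (N, 8, 3) holding the reconstructed 3D
                       positions of the 8 plate vertices at N time steps
                       (e.g., obtained by triangulating each stereopair).
    h                : time step between camera frames.
    Returns the list of frames F_k and an (N-1) x 6 array of
    [vx, vy, vz, wx, wy, wz] velocity estimates.
    """
    frames = []
    for verts in vertex_histories:
        c = verts.mean(axis=0)                             # centroid c_k
        frames.append(build_frame(c, verts[0], verts[1]))  # frame F_k from C, Q1, Q2
    velocities = []
    for Fk, Fk1 in zip(frames[:-1], frames[1:]):
        dc = Fk1[:3, 3] - Fk[:3, 3]                        # translational differential
        theta_k, n_k = axis_angle(Fk[:3, :3])
        theta_k1, _ = axis_angle(Fk1[:3, :3])
        dtheta = theta_k1 - theta_k                        # rotational differential d(theta_k)
        velocities.append(np.concatenate((dc, n_k * dtheta)) / h)
    return frames, np.array(velocities)
```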

6. Conclusions

In this paper, we propose an algorithm to reconstruct the 6-DOF (three translational and three rotational) trajectory of flying debris using stereopairs of images from two high-speed cameras. Both the time histories of displacements and those of velocities can be determined by this algorithm. Specifically, the algorithm addresses the research gap in the literature, i.e., the lack of algorithms/software for measuring the three-rotational-DOF using stereophotogrammetry, and thus contributes to the new measurement science enabled by high-speed cameras and computer graphics theory. It is intended for experimentally measuring the debris trajectory in a 3D wind field to support the investigation of debris flight physics in an urban built environment, ultimately contributing to enhancing the reliability of building envelopes under windborne debris hazards. However, it may also be applied to reconstructing the 3D motion of people using surveillance cameras in buildings (e.g., to see how a robber approaches a victim) or of vehicles in car accidents using traffic monitoring cameras. In this paper, we focused on the wind engineering application and developed the algorithm for two common types of debris in particular, i.e., compact and thin-plate-like debris. For the former, the three-translational-DOF were reconstructed; the results were then used to derive the three-rotational-DOF of the thin plate. In future studies, the algorithm can be extended to rod-like debris, another type of debris commonly observed during windstorms. The current paper dealt with the theoretical development of the algorithm, while its implementation in conjunction with image processing techniques in wind tunnel experiments will be explored in our future studies.

Author Contributions

Conceptualization, C.L.S. and Y.G.; methodology, C.L.S. and Y.G.; writing—original draft preparation, C.L.S. and Y.G.; writing—review and editing, C.L.S. and Y.G.; visualization, C.L.S. and Y.G.

Funding

This research received no external funding.

Acknowledgments

The help from Yue Dong (a graduate student at Colorado State University) with preparing the manuscript was greatly appreciated.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Holmes, J.D. Windborne debris and damage risk models: A review. Wind Struct. 2010, 13, 95–108.
  2. Minor, J.E. Lessons learned from failures of the building envelope in windstorms. J. Archit. Eng. 2005, 11, 10–13.
  3. Lee, B.E.; Wills, J. Vulnerability of fully glazed high-rise buildings in tropical cyclones. J. Archit. Eng. 2002, 8, 42–48.
  4. Behr, R.A.; Minor, J.E. A survey of glazing system behavior in multi-story buildings during Hurricane Andrew. Struct. Des. Tall Build. 1994, 3, 143–161.
  5. Sparks, P.R.; Schiff, S.D.; Reinhold, T.A. Wind damage to envelopes of houses and consequent insurance losses. J. Wind Eng. Ind. Aerodyn. 1994, 53, 145–155.
  6. Dade County. Building Code Evaluation Task Force Final Report; Dade County: Miami, FL, USA, 1992.
  7. Dade County. Final Report of the Dade County Grand Jury, Dade County Grand Jury—Spring Term A.D. 1992; Dade County: Miami, FL, USA, 1992.
  8. Federal Emergency Management Agency. Building Performance: Hurricane Andrew in Florida; Federal Insurance Administration, FEMA: Washington, DC, USA, 1992.
  9. Wind Engineering Research Council (WERC). Preliminary Observations of WERC Post-Disaster Team; WERC: College Station, TX, USA, 1992.
  10. Oliver, C.; Hanson, C. Failure of residential building envelopes as a result of Hurricane Andrew in Dade County, Florida. In Hurricanes of 1992: Lessons Learned and Implications for the Future; Cook, R.A., Soltani, M., Eds.; ASCE: New York, NY, USA, 1994; pp. 496–508.
  11. Beason, W.L.; Meyers, G.E.; James, R.W. Hurricane-related window glass damage in Houston. J. Struct. Eng. 1984, 110, 2843–2857.
  12. Tachikawa, M. Trajectories of flat plates in uniform flow with application to wind-generated missiles. J. Wind Eng. Ind. Aerodyn. 1983, 14, 443–453.
  13. Lin, N.; Letchford, C.; Holmes, J. Investigation of plate-type windborne debris. Part I. Experiments in wind tunnel and full scale. J. Wind Eng. Ind. Aerodyn. 2006, 94, 51–76.
  14. Lin, N.; Holmes, J.D.; Letchford, C.W. Trajectories of wind-borne debris in horizontal winds and applications to impact testing. J. Struct. Eng. 2007, 133, 274–282.
  15. Jain, A. Wind borne debris impact generated damage to cladding of high-rise building. In Proceedings of the 12th Americas Conference on Wind Engineering (12ACWE), Seattle, WA, USA, 16–20 June 2013.
  16. Visscher, B.T.; Kopp, G.A. Trajectories of roof sheathing panels under high winds. J. Wind Eng. Ind. Aerodyn. 2007, 95, 697–713.
  17. Kordi, B.; Kopp, G.A. Effects of initial conditions on the flight of windborne plate debris. J. Wind Eng. Ind. Aerodyn. 2011, 99, 601–614.
  18. Kordi, B.; Traczuk, G.; Kopp, G.A. Effects of wind direction on the flight trajectories of roof sheathing panels under high winds. Wind Struct. 2010, 13, 145–167.
  19. Crawford, K. Experimental and Analytical Trajectories of Simplified Debris Models in Tornado Winds. Master's Thesis, Iowa State University, Ames, IA, USA, 2012.
  20. Hughes, J.F.; van Dam, A.; McGuire, M.; Sklar, D.F.; Foley, J.D.; Feiner, S.K.; Akeley, K. Computer Graphics: Principles and Practice, 3rd ed.; Addison-Wesley Professional: Boston, MA, USA, 2013.
  21. Niku, S.B. Introduction to Robotics: Analysis, Control, Applications, 2nd ed.; Wiley: Hoboken, NJ, USA, 2011.
Figure 1. The setup of parameters for the trajectory tracking problem (adapted from [19]).
Figure 2. Rectangular plate representing debris.
Figure 3. Flowchart of the debris-tracking algorithm.
