Article

Heading Estimation for Indoor Pedestrian Navigation Using a Smartphone in the Pocket

School of Information Science and Technology, Dalian Maritime University, Dalian 116026, China
* Author to whom correspondence should be addressed.
Sensors 2015, 15(9), 21518-21536; https://0-doi-org.brum.beds.ac.uk/10.3390/s150921518
Submission received: 26 June 2015 / Revised: 20 August 2015 / Accepted: 21 August 2015 / Published: 28 August 2015
(This article belongs to the Special Issue Sensors for Indoor Mapping and Navigation)

Abstract

Heading estimation is a central problem for indoor pedestrian navigation using the pervasively available smartphone. For smartphones placed in a pocket, one of the most popular device positions, the essential challenges in heading estimation are the changing device coordinate system and the severe indoor magnetic perturbations. To address these challenges, we propose a novel heading estimation approach based on a rotation matrix and principal component analysis (PCA). Firstly, through a related rotation matrix, we project the acceleration signals into a reference coordinate system (RCS), where a more accurate estimation of the horizontal plane of the acceleration signals is obtained. Then, we utilize PCA over the horizontal plane of the acceleration signals for local walking direction extraction. Finally, in order to translate the local walking direction into the global one, we develop a calibration process that does not require noisy compass readings. In addition, a turn detection algorithm is proposed to improve the heading estimation accuracy. Experimental results show that our approach outperforms the traditional uDirect and PCA-based approaches in terms of accuracy and feasibility.

1. Introduction

Though various indoor pedestrian navigation systems, such as WiFi localization [1], ultra-wideband [2] and radio frequency identification (RFID) [3], have been developed, achieving accurate and seamless navigation at low cost is still a challenging task. Most existing technologies depend on some form of dedicated infrastructure, which is expensive to deploy at large scale and often not continuously available while the pedestrian is walking. Recently, pedestrian dead reckoning (PDR) [4] using a smartphone has been considered a promising solution to seamless indoor navigation. The smartphone plays an indispensable role in our daily lives and can be carried almost everywhere we go. This, coupled with the fact that inertial sensors are typically installed even in low-cost smartphones, makes smartphones ideal devices for continuous indoor navigation.
A central problem for PDR using a smartphone [5] is user heading estimation. When the user heading and the related walking distance per step are obtained, the user location can be determined by computing the relative displacement, starting from an initial location that is assumed to be known. Solving the heading estimation problem well also benefits many other application areas. In particular, the user heading can be translated into the user's facing direction, which is a critical component in augmented reality, social activities, and human-computer interaction [6].
Existing heading estimation approaches [7,8,9] using smartphone inertial sensors generally assume that the heading misalignment between the device forward direction and the user heading remains constant. Thus, the user heading can be directly determined by device attitude estimation once the offset is known. The assumption holds when the user holds the smartphone in hand or against the ear during a phone call. However, for a smartphone placed in a pocket [10], the yaw angle may change dynamically even when the pedestrian is walking straight, rendering previous heading estimation approaches inapplicable. Practically, it is more convenient and realistic to enable user acceptance of PDR by assuming a more natural device position, such as in the pocket. Thus, this work focuses on a smartphone placed in a pant pocket, since it is likely the most popular carrying position, especially for young males [11].
Recently, principal component analysis (PCA) has been utilized for heading estimation by exploiting the acceleration signal patterns in the horizontal plane. Though reasonable accuracy has been reported in these experiments, some critical problems still need to be addressed to enhance accuracy and feasibility. Firstly, it is challenging to obtain the horizontal plane of the acceleration signals, since the local device coordinate system may vary continuously while a pedestrian is walking. When placed in a pant pocket, the smartphone swings with the femur, continuously altering the device coordinate system. Secondly, even if the local walking direction is accurately extracted in the local device coordinate system, translating it into the global one is still a challenge. The PCA-based approach implements this translation by deploying a compass, which suffers from severe magnetic perturbations in modern man-made structures [12] and is therefore unreliable in most indoor environments.
In order to overcome the aforementioned challenges, we present a novel heading estimation approach based on Rotation Matrix and Principal Component Analysis (RMPCA). The proposed RMPCA may achieve accurate user heading estimation, regardless of the smartphone orientation and placement within the pocket. Our work makes the following three contributions:
First, to adapt to the changing device coordinate system and achieve an accurate estimation of the horizontal plane of the acceleration signals, we project all the acceleration signals into a reference coordinate system using the related rotation matrix. The rotation matrix is computed by an extended Kalman filter (EKF)-based attitude estimation model.
Second, after extracting the local walking direction by applying PCA over the horizontal plane, we develop a calibration process to translate the local walking direction into the global one. Compared with the traditional approaches, the major advantage of our calibration process is that it avoids the noisy compass, whose readings are unreliable in most indoor environments.
Third, we propose a turn detection algorithm to improve the heading estimation accuracy. PCA for local walking direction extraction suffers from significant performance degradation when a user turn occurs. Thus, we deploy relative heading change for heading estimation once a user turn is detected.
In the rest of this paper, we first discuss related work in Section 2. Section 3 introduces an overview and some key concepts of the proposed approach. Section 4 describes the approach in detail. The experimental results are reported and discussed in Section 5. Conclusions and future work are presented in the last section.

2. Related Works

Many existing heading estimation approaches for PDR deploy wearable sensors fixed on the human body. In [13], zero velocity updates (ZUPTs) are introduced into the EKF to achieve reliable user heading estimation with foot-mounted sensors. In [14], the user heading is obtained by computing the locations of two infrared ray receiver modules mounted on the subject's shoulders. In [15], stereo cameras attached to the user's chest are deployed for heading estimation during pedestrian navigation. Compared with unconstrained wearable sensors, those carefully attached to fixed body positions generally provide more accurate and robust heading estimation results. This is because not only can extra information such as ZUPTs be exploited, but the heading misalignment between the device forward direction and the user heading also remains constant during walking. However, carrying dedicated devices, which are typically useless for users in their daily lives, in a fixed position for a long duration makes them impractical for PDR applications. For the mass market, it is more practical and less intrusive to deploy the pervasively available smartphone as a common device for heading estimation.
Most existing smartphone-based user heading estimation approaches [16,17] rely on the highly simplified assumption that the heading misalignment between the device forward direction and the user heading remains constant. For a hand-held smartphone the assumption holds, since the device forward direction is always the same as the user heading. Thus, traditional attitude estimation techniques such as the Kalman filter [18] or its variants may be directly applied to achieve reliable heading estimation. However, for a smartphone freely placed in other positions, the assumption is violated by the changing device coordinate system. As a result, smartphone attitude estimation alone is insufficient for heading estimation, since the misalignment angle may vary with body locomotion.
Recently, studies [5,19] have demonstrated that the position of the device on the user's body can be successfully detected by analyzing the acceleration patterns generated during walking. Therefore, it is feasible to determine the user heading by assuming a known device placement. Kunze et al. [20] deployed PCA for walking direction extraction with the device placed in the user's pant pocket. They first obtain the horizontal plane of the acceleration signals using the related vertical acceleration component, i.e., the acceleration signal observed when the user is approximately static. Then, the local walking direction in the device coordinate system is extracted by applying PCA, and finally translated into the global walking direction by a digital compass. Steinhoff et al. [21] conducted an experimental study of such PCA-based techniques and further improved the heading estimation accuracy by properly filtering the acceleration signals. However, the PCA-based approach is prone to inaccurate estimation of the horizontal acceleration plane caused by the changing device coordinates. Hoseinitabatabaei et al. [22] proposed the uDirect approach to extract the user heading directly within a specific region of the gait cycle, where the forward acceleration dominates the horizontal acceleration components. Unfortunately, such a region is often corrupted by sideways acceleration components according to our experimental study. More importantly, neither the PCA nor the uDirect approach can be used indoors, because they both rely on a compass to translate the local direction into the global one, which is unreliable due to the severe indoor magnetic perturbations [23,24].
This work proposes a novel RMPCA approach for user heading estimation using a smartphone placed in the user’s pant pocket. We also utilize PCA due to its success in local walking direction extraction. Compared with the PCA-based approach, RMPCA achieves more accurate estimation of the horizontal plane, since we obtain the vertical acceleration component in the projected reference coordinate system. Furthermore, we choose a specific reference coordinate system so that the local walking direction can be easily transformed into the global one without requiring noisy compass readings. Experimental results show that the proposed RMPCA approach outperforms the previous PCA and uDirect approaches in terms of accuracy and feasibility.

3. Overview

User heading estimation is defined as the process of seeking the relative orientation of a user's coordinate system (UCS) with respect to the global coordinate system (GCS). Let the GCS be an Earth coordinate system defined by axes $X_G$, $Y_G$, and $Z_G$, which point east, north, and opposite to the gravity vector, respectively. As shown in Figure 1a, the UCS is defined by axes $X_U$, $Y_U$, and $Z_U$, with $Y_U$ tangential to the trajectory, $Z_U$ coinciding with $Z_G$, and $X_U$ pointing to the right side of the user's body, given by the cross product of $Y_U$ and $Z_U$. In order to define the user's orientation in the GCS, we exploit the acceleration pattern generated during pedestrian walking. The inertial signals are measured by a smartphone with a built-in three-axis accelerometer and three-axis gyroscope, which introduces a third coordinate system, the device coordinate system (DCS). The DCS is defined by axes $X_D$, $Y_D$, and $Z_D$, where the $X_D$ and $Y_D$ axes are parallel to the phone screen and point rightward and forward, respectively, and $Z_D$ is given by the cross product of $X_D$ and $Y_D$. The GCS and DCS are depicted in Figure 1d and Figure 1b, respectively. The final coordinate system is the reference coordinate system (RCS), as shown in Figure 1c. The RCS is the DCS at the specific moment when the user holds the smartphone in hand and initially starts the PDR application.
The proposed RMPCA mainly includes three steps. Firstly, acceleration signals measured in the DCS are projected into the RCS by computing the related rotation matrix. Secondly, PCA is applied over the horizontal plane of the projected acceleration signals at the RCS to extract the local walking direction. Finally, the user's local walking direction is transformed into a global one by a calibration process. Three assumptions are used in RMPCA. First, the smartphone is initially held and gazed at by the user for a few seconds, with the smartphone's forward axis $Y_D$ aligned with the user's walking direction axis $Y_U$. Such a period is always available, since the user needs to gaze at and manipulate the smartphone when starting the application and setting related parameters. Second, the user's initial orientation is assumed to be known a priori, as in many other works [25,26]. This assumption is reasonable, since the user's initial trajectory and orientation can be obtained by several external positioning systems, such as Global Positioning System (GPS) tracking [25] when the user enters a building, WiFi localization [27], or landmarks [28]. Third, the user walks forward and relatively straight during a short period most of the time. Practically, since the user sways sideways while taking each step during straight walking, the walking direction is determined within a stride (i.e., two adjacent steps). If the third assumption is invalid due to a user turn, we combine the approach with a turn detection algorithm to improve the heading estimation performance.
Figure 1. Illustration of the defined coordinate system: (a) UCS (b) DCS (c) RCS (d) GCS.

4. Methodology

4.1. Projection of Acceleration Signals into the RCS by Rotation Matrix

To extract the local walking direction, all the acceleration signals within a stride should first be projected into the horizontal plane. The PCA-based approach achieves this by deploying a recognized gravity vector. However, the DCS may vary considerably, even within one step, due to the rotational movement of the leg, rendering a gravity vector recognized in a particular DCS unusable directly. To adapt to the changing DCS, we project all the acceleration signals into the RCS, and then apply the gravity vector at the RCS to obtain the horizontal plane. To achieve this projection, we exploit the rotation matrix between the DCS and the RCS, since all inertial signals are measured in the varying DCS.
We develop an EKF-based attitude estimation model to compute the related rotation matrix. Instead of deploying Euler angles, we apply a rotation quaternion to describe the smartphone orientation, since it can avoid the singularity problems [29]. Firstly, we construct the relationship between a rotation quaternion and the smartphone orientation. The projection of acceleration signals at RCS into DCS can be represented as follows:
$$a^{DCS}(t) = \left( C_{RCS}^{DCS}(q(t)) \right)^{T} a^{RCS}(t) \quad (1)$$
where $C_{RCS}^{DCS}(q(t))$ is the rotation matrix of the DCS with respect to the RCS at time $t$, and $a^{RCS}(t)$ and $a^{DCS}(t)$ are the $3 \times 1$ acceleration vectors at time $t$ relative to the RCS and the DCS, respectively. For the sake of simplicity, we omit the argument $t$ below. The rotation matrix can be described by a quaternion:
$$C_{RCS}^{DCS}(q) = \begin{bmatrix} q_0^2 + q_1^2 - q_2^2 - q_3^2 & 2(q_1 q_2 - q_0 q_3) & 2(q_1 q_3 + q_0 q_2) \\ 2(q_1 q_2 + q_0 q_3) & q_0^2 - q_1^2 + q_2^2 - q_3^2 & 2(q_2 q_3 - q_0 q_1) \\ 2(q_1 q_3 - q_0 q_2) & 2(q_0 q_1 + q_2 q_3) & q_0^2 - q_1^2 - q_2^2 + q_3^2 \end{bmatrix} \quad (2)$$
where $q \equiv [q_0\; q_1\; q_2\; q_3]^T$ is the normalized quaternion, $q_0$ is the scalar part of the quaternion and $[q_1, q_2, q_3]$ is the vector part.
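As a concrete illustration, a minimal NumPy sketch of Equation (2) is given below; the function name and the explicit normalization step are our own illustrative choices rather than part of the authors' implementation.

```python
import numpy as np

def quat_to_rotation_matrix(q):
    """Rotation matrix C_RCS^DCS of Equation (2) from a scalar-first quaternion [q0, q1, q2, q3]."""
    q = np.asarray(q, dtype=float)
    q = q / np.linalg.norm(q)  # enforce a unit quaternion
    q0, q1, q2, q3 = q
    return np.array([
        [q0**2 + q1**2 - q2**2 - q3**2, 2*(q1*q2 - q0*q3),             2*(q1*q3 + q0*q2)],
        [2*(q1*q2 + q0*q3),             q0**2 - q1**2 + q2**2 - q3**2, 2*(q2*q3 - q0*q1)],
        [2*(q1*q3 - q0*q2),             2*(q0*q1 + q2*q3),             q0**2 - q1**2 - q2**2 + q3**2],
    ])
```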
Secondly, according to the rigid body kinematic equations [29], the discrete-time model of rotation quaternions can be given as:
$$q_{k+1} = \exp\left( 0.5\, \Omega(w_k T_s) \right) q_k = \left( I \cos(0.5\, \Delta\theta_k) + \Omega(w_k T_s)\, \frac{\sin(0.5\, \Delta\theta_k)}{\Delta\theta_k} \right) q_k \quad (3)$$
where $T_s$ is the sampling interval, $q_k$ and $q_{k+1}$ are the quaternions at time instants $k T_s$ and $(k+1) T_s$, respectively, $w_k = [w_k^x\; w_k^y\; w_k^z]^T$ is the angular velocity vector at time instant $k T_s$ relative to the DCS, $I$ is the $3 \times 3$ identity matrix, $\Delta\theta_k = T_s \sqrt{(w_k^x)^2 + (w_k^y)^2 + (w_k^z)^2}$, and $\Omega(w_k T_s)$ is given by:
$$\Omega(w_k T_s) = T_s \begin{bmatrix} 0 & -w_k^x & -w_k^y & -w_k^z \\ w_k^x & 0 & w_k^z & -w_k^y \\ w_k^y & -w_k^z & 0 & w_k^x \\ w_k^z & w_k^y & -w_k^x & 0 \end{bmatrix} \quad (4)$$
The quaternion $q_{k+1}$ is determined once the initial condition is set as $q_0 = [1\; 0\; 0\; 0]^T$.
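The discrete-time propagation of Equations (3) and (4) can be sketched as follows; the helper names are hypothetical, and the small-angle branch for a nearly static gyroscope is an implementation detail we add to avoid division by zero.

```python
import numpy as np

def omega_matrix(w, Ts):
    """4x4 matrix Omega(w*Ts) of Equation (4) for the angular velocity w = [wx, wy, wz]."""
    wx, wy, wz = w
    return Ts * np.array([
        [0.0, -wx, -wy, -wz],
        [wx,  0.0,  wz, -wy],
        [wy, -wz,  0.0,  wx],
        [wz,  wy, -wx,  0.0],
    ])

def propagate_quaternion(q, w, Ts):
    """One step of the closed-form quaternion kinematics in Equation (3)."""
    dtheta = Ts * np.linalg.norm(w)
    Om = omega_matrix(w, Ts)
    if dtheta < 1e-12:                      # nearly static: exp(0.5*Om) ~ I + 0.5*Om
        q_next = (np.eye(4) + 0.5 * Om) @ q
    else:
        q_next = (np.eye(4) * np.cos(0.5 * dtheta)
                  + Om * np.sin(0.5 * dtheta) / dtheta) @ q
    return q_next / np.linalg.norm(q_next)
```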
Finally, the EKF is applied to fuse the gyroscope data with the accelerometer data for device attitude estimation. We deploy the smartphone rotation quaternion as a state vector. The state transition vector equation can be given by:
$$q_{k+1} = F_k q_k + w_k^q \quad (5)$$
where the state transition matrix is $F_k = \exp\left( 0.5\, \Omega(w_k T_s) \right)$, following Equation (3), and:
$$w_k^q = \Xi_k w_k^{gyro} = \frac{T_s}{2} \begin{bmatrix} [e_k \times] + q_{0k} I \\ -e_k^T \end{bmatrix} w_k^{gyro} \quad (6)$$
where $q_k \equiv [q_{0k}\; q_{1k}\; q_{2k}\; q_{3k}]^T$ is the rotation quaternion at time instant $k T_s$, $q_{0k}$ is the scalar part and $e_k \equiv [q_{1k}\; q_{2k}\; q_{3k}]^T$ is the vector part, $w_k^{gyro}$ is the white Gaussian measurement noise vector of the gyroscope at time instant $k T_s$, and $[e_k \times]$ is the standard vector cross-product (skew-symmetric) operator. Equations (5) and (6) are derived from Equation (3), and can be considered a first-order approximation of the "noisy" transition matrix [30]. The approximation performs well, since the gyroscope measurement noise vector $w_k^{gyro}$ is small enough in this application. Consequently, the process noise covariance matrix $Q_k$ is given by:
$$Q_k = E\left[ w_k^q \left( w_k^q \right)^T \right] = \Xi_k Q_k^{gyro} \Xi_k^T \quad (7)$$
where $Q_k^{gyro} = \sigma_{gyro}^2 I$ is the covariance matrix of the gyroscope measurement noise vector $w_k^{gyro}$.
The measurement model is constructed from the acceleration signals observed when the human body is subject to negligible external acceleration:
$$a_{k+1} = \left( C_{RCS}^{DCS}(q_{k+1}) \right)^T g^{RCS} + v_{k+1}^a \quad (8)$$
where $a_{k+1}$ is the recognized gravity vector in the DCS, $g^{RCS}$ is the gravity vector in the RCS, and $v_{k+1}^a$ is the related white Gaussian measurement noise. Let $R_{k+1}$ be the covariance matrix of the measurement noise. To filter out the disturbance of significant human motions, we construct an adaptive measurement noise covariance matrix $R_{k+1} = R_{\sigma_a^2} I$:
$$R_{\sigma_a^2} = \begin{cases} \sigma_a^2, & \left| \left\| a_{k+1} \right\|_2 - g \right| < \varepsilon_a \\ \infty, & \text{otherwise} \end{cases} \quad (9)$$
Thus, the measurement model can be approximated as a linearized formula:
$$a_{k+1} = H_{k+1} q_{k+1} - H_{k+1} \bar{q}_{k+1} + \left( C_{RCS}^{DCS}(\bar{q}_{k+1}) \right)^T g^{RCS} + v_{k+1}^a \quad (10)$$
where $H_{k+1} = \partial a_{k+1} / \partial q_{k+1} \big|_{q_{k+1} = \bar{q}_{k+1},\, v_{k+1}^a = 0}$ is the related Jacobian matrix, $\bar{q}_{k+1} = F_k \hat{q}_k$ is the best available estimate of $q_{k+1}$, namely the a priori state estimate, and $\hat{q}_k$ is the quaternion estimate of the EKF at time instant $k T_s$.
Based on the state model in Equation (5) and the measurement model in Equation (10), with the process noise covariance matrix $Q_k$ and the measurement noise covariance matrix $R_{k+1}$, the EKF for estimating the state vector $q_{k+1}$ may be established. Detailed procedures for executing the EKF may be found in [31]. Therefore, after estimating the state vector $q_{k+1}$, the projection of the acceleration signals from the DCS into the RCS may be derived from Equation (1):
$$a^{RCS}(t) = C_{RCS}^{DCS}(q(t))\, a^{DCS}(t) \quad (11)$$
where $q(t)$ is the related smartphone rotation quaternion at time $t$, and $a^{RCS}(t)$ is the resulting acceleration signal projected into the RCS.
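A condensed sketch of one predict/update cycle of the EKF described above is given below, reusing `quat_to_rotation_matrix` and `omega_matrix` from the earlier sketches. The numerical Jacobian and the large-but-finite variance used to reject dynamic accelerometer samples are simplifications of our own; the authors' analytic Jacobian and exact tuning are not reproduced here.

```python
import numpy as np

def xi_matrix(q):
    """Noise coupling matrix of Equation (6) for a scalar-first quaternion (without the Ts/2 factor)."""
    q0, e = q[0], np.asarray(q[1:], dtype=float)
    ex = np.array([[0.0, -e[2], e[1]],
                   [e[2], 0.0, -e[0]],
                   [-e[1], e[0], 0.0]])
    return np.vstack([ex + q0 * np.eye(3), -e.reshape(1, 3)])

def ekf_step(q_hat, P, w, a_meas, Ts, g_rcs, sigma_gyro, sigma_acc, eps_a=0.5):
    """One EKF cycle fusing a gyroscope sample w and an accelerometer sample a_meas (both in the DCS)."""
    # Prediction (Equations (3), (5)-(7))
    dtheta = Ts * np.linalg.norm(w)
    Om = omega_matrix(w, Ts)
    F = (np.eye(4) + 0.5 * Om if dtheta < 1e-12
         else np.eye(4) * np.cos(0.5 * dtheta) + Om * np.sin(0.5 * dtheta) / dtheta)
    q_bar = F @ q_hat
    q_bar = q_bar / np.linalg.norm(q_bar)
    Xi = 0.5 * Ts * xi_matrix(q_hat)
    Q = Xi @ (sigma_gyro**2 * np.eye(3)) @ Xi.T
    P = F @ P @ F.T + Q

    # Measurement update (Equations (8)-(10))
    def h(q):  # predicted gravity reading in the DCS
        return quat_to_rotation_matrix(q).T @ g_rcs
    H = np.column_stack([(h(q_bar + 1e-6 * d) - h(q_bar)) / 1e-6 for d in np.eye(4)])
    static = abs(np.linalg.norm(a_meas) - 9.81) < eps_a
    R = (sigma_acc**2 if static else 1e12) * np.eye(3)   # Equation (9): reject dynamic samples
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    q_hat = q_bar + K @ (a_meas - h(q_bar))
    q_hat = q_hat / np.linalg.norm(q_hat)
    P = (np.eye(4) - K @ H) @ P
    return q_hat, P
```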

4.2. PCA for Local Walking Direction Extraction

PCA for walking direction extraction is based on the observation that most of the variation in the horizontal plane is parallel to the walking direction [20]. As shown in Figure 2a, the human walking cycle includes two main phases, the stance phase and the swing phase. The stance phase is usually defined as the period of the walking cycle starting with a heel-strike moment and ending with a toe-off moment [22]. Each phase corresponds to a footstep. As shown in Figure 2b, we empirically find that the norm of the acceleration signal is a robust feature for footstep detection. We deploy a peak detection algorithm [32,33] with two thresholds, which are used to eliminate false peaks caused by device shakes with too short time intervals or too small magnitudes:
$$\begin{cases} \Delta T \geq T_{Th} \\ \left| AccNorm - g \right| \geq A_{Th} \end{cases} \quad (12)$$
where $\Delta T$ is the time interval between two adjacent peaks, $AccNorm$ is the norm of the three-dimensional acceleration signal, $g$ is the local gravity, and $T_{Th}$ and $A_{Th}$ are the time threshold and the magnitude threshold for filtering false peaks, respectively. Each recognized peak therefore stands for a footstep. Moreover, as seen in Figure 2b, the peak during the swing phase has a larger magnitude than that during the stance phase, which can be used to distinguish the two phases within a stride. The peak detection algorithm can be applied successfully for most smartphone placements [33], such as in a hand or a pocket, though the two thresholds should be adjusted to the different magnitudes of the acceleration signals.
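A simple sketch of the thresholded peak detection described above is shown below; the local-maximum test, the default threshold values, and the function name are illustrative assumptions rather than the authors' exact parameters.

```python
import numpy as np

def detect_footsteps(acc, fs, t_th=0.25, a_th=1.5, g=9.81):
    """Return sample indices of footstep peaks from an (N x 3) acceleration array sampled at fs Hz.

    A sample is accepted as a footstep peak when it is a local maximum of the acceleration
    norm, deviates from gravity by at least a_th (m/s^2), and occurs at least t_th seconds
    after the previously accepted peak (Equation (12))."""
    norm = np.linalg.norm(acc, axis=1)
    peaks, last_t = [], -np.inf
    for k in range(1, len(norm) - 1):
        t = k / fs
        if (norm[k] > norm[k - 1] and norm[k] >= norm[k + 1]   # local maximum
                and abs(norm[k] - g) >= a_th                   # magnitude threshold A_Th
                and t - last_t >= t_th):                       # time threshold T_Th
            peaks.append(k)
            last_t = t
    return peaks
```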
After recognizing a stride and the related acceleration signals, we obtain the horizontal plane at the RCS by computing the related vertical component. We apply a sliding window over all three dimensions of the acceleration signals during the initial period when the smartphone is held by the user. If the variances of all dimensions are close to 0 and the total magnitude approaches $9.81\ \mathrm{m/s^2}$, the signal is very likely dominated by the vertical component and is thus recognized as the local gravity vector. Denoting the gravity vector at the RCS by the $3 \times 1$ vector $g^{RCS}$, the acceleration signals can be projected into the horizontal plane as:
$$a_{hor}^{RCS} = a^{RCS} - \frac{\left( \left( g^{RCS} \right)^T a^{RCS} \right) g^{RCS}}{\left\| g^{RCS} \right\|^2} \quad (13)$$
where $a^{RCS}$ and $a_{hor}^{RCS}$ are the acceleration at the RCS and its horizontal component, respectively.
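Equation (13) is an ordinary vector rejection of the gravity direction; a short NumPy sketch (with an illustrative function name) is:

```python
import numpy as np

def horizontal_component(a_rcs, g_rcs):
    """Remove the vertical (gravity-parallel) part of an acceleration sample at the RCS (Equation (13))."""
    a_rcs, g_rcs = np.asarray(a_rcs, dtype=float), np.asarray(g_rcs, dtype=float)
    return a_rcs - (g_rcs @ a_rcs) * g_rcs / (g_rcs @ g_rcs)
```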
Figure 2. Acceleration patterns of pedestrian walking with the smartphone put in the pocket: (a) Walking cycle includes two phases: stance phase and swing phase; (b) The peak detection algorithm detects one stride with two footsteps (peak points).
After obtaining the horizontal plane at the RCS, PCA extracts the local walking direction by computing the related first eigenvector, as shown in Figure 3. However, PCA for walking direction extraction suffers from a 180° ambiguity problem: the obtained first eigenvector cannot distinguish between the forward and backward walking directions. To address this problem, we exploit the observation that, around the peak point during the stance phase, the thigh keeps swinging forward distinctly with the rotating smartphone. Thus, at that moment, there should be a positive (negative) rotation along the positive (negative) direction of the rotation axis $X_U$. Figure 4 shows the angular velocity along the negative direction of the rotation axis, whose angular movement is negative at the peak point during the stance phase. After roughly estimating the direction parallel to the rotation axis $X_U$ as described in [21], we can align the positive direction of the rotation axis $X_U$ to the right side of the body by requiring the angular movement to be positive. Then, an approximation of the forward direction can be obtained by a rotation of 90° in the horizontal plane, and is ultimately used to resolve the 180° ambiguity.
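A minimal sketch of this step, assuming the horizontal acceleration samples of one stride and an approximate forward direction at the RCS are already available; the eigen-decomposition of the sample covariance and the sign test are standard PCA operations, and the function name is illustrative.

```python
import numpy as np

def local_walking_direction(a_hor, forward_approx):
    """First PCA eigenvector of the horizontal acceleration samples (N x 3) within a stride.

    The 180-degree ambiguity is resolved by flipping the eigenvector so that it points
    within 90 degrees of the approximate forward direction."""
    X = a_hor - a_hor.mean(axis=0)                 # center the samples
    cov = X.T @ X / (len(X) - 1)                   # sample covariance
    eigvals, eigvecs = np.linalg.eigh(cov)
    direction = eigvecs[:, np.argmax(eigvals)]     # eigenvector of the largest eigenvalue
    if direction @ forward_approx < 0:             # resolve the 180-degree ambiguity
        direction = -direction
    return direction / np.linalg.norm(direction)
```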
Figure 3. Illustration of PCA for eigenvector extraction over the horizontal plane of acceleration signals to infer the local walking direction at RCS.
Figure 4. The rotation value along the negative (positive) direction of the rotation axis $X_U$ should be negative (positive) at the peak point during the stance phase. This can be used to align the positive direction of the rotation axis $X_U$ to the right side of the user's body.

4.3. Calibration Process for Determining Global Walking Direction

The local walking direction vector at the RCS is first projected into the UCS. Let $C_{UCS}^{RCS}$ be the rotation matrix of the RCS with respect to the UCS, and $D^{RCS}$ be the local walking direction vector obtained at the RCS. The walking direction vector at the UCS is given by:
$$D^{UCS} = C_{UCS}^{RCS} D^{RCS} \quad (14)$$
where $D^{UCS} = [D_x^{UCS}\; D_y^{UCS}\; D_z^{UCS}]^T$ is the walking direction vector at the UCS. Assume that the RCS is subjected to three successive rotations, i.e., a rotation by a pitch angle $\theta_x$ around $X_U$, a rotation by a roll angle $\theta_y$ around $Y_U$, and, finally, a rotation by a yaw angle $\theta_z$ around $Z_U$. The total rotation matrix can then be obtained from three elementary rotation matrices:
$$C_{UCS}^{RCS} = C_z(\theta_z)\, C_y(\theta_y)\, C_x(\theta_x) \quad (15)$$
$$C_x(\theta_x) = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\theta_x & -\sin\theta_x \\ 0 & \sin\theta_x & \cos\theta_x \end{bmatrix} \quad (16)$$
$$C_y(\theta_y) = \begin{bmatrix} \cos\theta_y & 0 & \sin\theta_y \\ 0 & 1 & 0 \\ -\sin\theta_y & 0 & \cos\theta_y \end{bmatrix} \quad (17)$$
$$C_z(\theta_z) = \begin{bmatrix} \cos\theta_z & -\sin\theta_z & 0 \\ \sin\theta_z & \cos\theta_z & 0 \\ 0 & 0 & 1 \end{bmatrix} \quad (18)$$
According to the first assumption in Section 3, the yaw angle $\theta_z$ of the RCS is zero, since the smartphone's forward axis $Y_D$ is aligned with the user's walking direction axis $Y_U$. The unknown pitch angle $\theta_x$ and roll angle $\theta_y$ can be obtained by exploiting the local gravity vector as follows:
$$g^{UCS} = C_{UCS}^{RCS}\, g^{RCS} \quad (19)$$
where $g^{UCS} = [0\; 0\; 9.81]^T$ is the local gravity vector at the UCS and $g^{RCS}$ is the measured gravity vector at the RCS. Once the three angles, i.e., the pitch angle $\theta_x$, the roll angle $\theta_y$, and the yaw angle $\theta_z$, are known, the rotation matrix $C_{UCS}^{RCS}$ can be computed by Equation (15).
According to the second assumption, the initial yaw angle of the user orientation at the GCS is assumed to be $\psi_0$. Finally, the global walking direction, i.e., the yaw angle $\psi$ at the GCS, can be given as:
$$\psi = \arctan\left( \frac{D_y^{UCS}}{D_x^{UCS}} \right) + \psi_0 - \frac{\pi}{2} \quad (20)$$
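The calibration chain of Equations (14)-(20) can be sketched as follows, assuming $\theta_z = 0$ and $g^{UCS} = [0\; 0\; 9.81]^T$ as above. The closed-form recovery of pitch and roll from the measured gravity vector and the use of `arctan2` in place of `arctan` (to keep the correct quadrant) are simplifications of our own, and the function names are illustrative.

```python
import numpy as np

def rotation_ucs_from_rcs(g_rcs):
    """Rotation matrix C_UCS^RCS from the gravity vector measured at the RCS (Equations (15)-(19))."""
    gx, gy, gz = np.asarray(g_rcs, dtype=float) / np.linalg.norm(g_rcs) * 9.81
    theta_x = np.arctan2(gy, gz)                               # pitch around X_U
    theta_y = np.arcsin(np.clip(-gx / 9.81, -1.0, 1.0))        # roll around Y_U
    cx, sx = np.cos(theta_x), np.sin(theta_x)
    cy, sy = np.cos(theta_y), np.sin(theta_y)
    Cx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Cy = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    return Cy @ Cx                                             # C_z(0) is the identity

def global_heading(d_rcs, g_rcs, psi0):
    """Global walking direction psi of Equation (20) from the local direction at the RCS."""
    d_ucs = rotation_ucs_from_rcs(g_rcs) @ np.asarray(d_rcs, dtype=float)
    return np.arctan2(d_ucs[1], d_ucs[0]) + psi0 - np.pi / 2
```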

4.4. Turn Detection Algorithm for Improving RMPCA

The basic idea of turn detection is to exploit the heading change pattern during pedestrian walking. The user heading changes alternately between positive and negative with similar amplitudes when the pedestrian walks straight and sways sideways, as shown in Figure 5. If this heading change pattern is corrupted to some degree, a user turn is detected. We deploy the gyroscope outputs in the horizontal plane to compute the heading change for each step. Derived from Equations (11) and (14), the angular velocity at the UCS is given by the related rotation matrices:
$$w^{UCS}(t) = C_{UCS}^{RCS}\, C_{RCS}^{DCS}(q(t))\, w^{DCS}(t) \quad (21)$$
where $q(t)$ is the related rotation quaternion at time $t$, $w^{UCS}(t)$ and $w^{DCS}(t)$ are the representations of the angular velocity signal at the UCS and the DCS at time $t$, respectively, $C_{UCS}^{RCS}$ is the rotation matrix of the RCS with respect to the UCS, and $C_{RCS}^{DCS}(q(t))$ is the rotation matrix of the DCS with respect to the RCS at time $t$. Therefore, the heading change for each step can be given by:
$$\delta\theta_i = \sum_{k=1}^{N_i} w_{k,i}^{UCS}(Z_U)\, T_{gyro} \quad (22)$$
where $\delta\theta_i$ is the heading change for step $i$, $w_{k,i}^{UCS}(Z_U)$ is the angular velocity component rotating around $Z_U$ of the $k$-th angular velocity sample of step $i$, $N_i$ is the total number of samples within step $i$, and $T_{gyro}$ is the related sampling interval of the gyroscope outputs.
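A short sketch of Equations (21) and (22), accumulating the $Z_U$ component of the gyroscope samples over one step; the step boundaries and the rotation matrices are assumed to be available from the earlier stages, and the function name is illustrative.

```python
import numpy as np

def step_heading_change(w_dcs_samples, rot_dcs_to_rcs, C_ucs_rcs, T_gyro):
    """Heading change delta_theta_i of one step (Equation (22)).

    w_dcs_samples  : (N_i x 3) gyroscope samples of the step, expressed in the DCS
    rot_dcs_to_rcs : list of N_i rotation matrices C_RCS^DCS(q(t)), as in Equation (11)
    C_ucs_rcs      : rotation matrix of the RCS with respect to the UCS
    T_gyro         : gyroscope sampling interval (s)"""
    delta = 0.0
    for w_dcs, C in zip(w_dcs_samples, rot_dcs_to_rcs):
        w_ucs = C_ucs_rcs @ C @ np.asarray(w_dcs, dtype=float)  # Equation (21)
        delta += w_ucs[2] * T_gyro                              # accumulate the Z_U component
    return delta
```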
After computing the heading change for each step, if a positive (negative) heading change follows a positive (negative) one, and the absolute heading change over the two steps exceeds a given threshold $\delta\theta_{th1}$, the heading change pattern of straight walking is considered corrupted and a user turn is reported. In that case, we estimate the global walking direction by adding the current step's heading change to the heading of the previous step. Conversely, if a positive (negative) heading change follows a negative (positive) one, and the absolute heading change is smaller than a given threshold $\delta\theta_{th2}$, we recognize that the user turn has terminated and deploy RMPCA again.
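The turn detection logic can be summarized as a small state machine over the per-step heading changes; the threshold values and the labels below are illustrative assumptions consistent with the description above, not the authors' exact parameters.

```python
import numpy as np

def detect_turn_states(step_heading_changes, th_turn=np.deg2rad(15), th_end=np.deg2rad(5)):
    """Label each step as 'straight' or 'turn' from its heading change delta_theta_i (Equation (22)).

    A turn starts when two consecutive heading changes share the same sign and their combined
    magnitude exceeds th_turn; it ends when the sign alternates again and the magnitude drops
    below th_end, at which point RMPCA is applied again."""
    states, turning = [], False
    for i, d in enumerate(step_heading_changes):
        prev = step_heading_changes[i - 1] if i > 0 else 0.0
        same_sign = np.sign(d) == np.sign(prev) and np.sign(d) != 0
        if not turning and same_sign and abs(d) + abs(prev) > th_turn:
            turning = True                     # straight-walking pattern corrupted
        elif turning and not same_sign and abs(d) < th_end:
            turning = False                    # turn terminated
        states.append('turn' if turning else 'straight')
    return states
```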
Figure 5. Identification of turns.

5. Evaluation

5.1. Experimental Setup

We tested our heading estimation approach in an office building, as shown in Figure 6. Six subjects with different physical characteristics were asked to walk along the path denoted by the red solid line in Figure 6. The walking path covers an area of 14.4 m × 12 m with a total length of 52.8 m, which requires an average of 88 steps (44 strides) to complete. Each subject first held the phone in hand and stood still for a few seconds to start the application. Then, the subject put the phone into the pocket and walked along the path. The subjects were free to change the device orientation arbitrarily when putting the phone into the pocket. Each subject repeated the above procedure at least 10 times. Thus, 5280 samples could be collected for the evaluation of the proposed approach. To label the ground truth, the subjects were asked to walk carefully along densely placed tags on the ground. We also used a camera to record the entire walking procedure. Before carrying out the experiments, we performed the necessary calibrations to make the gyroscope outputs more precise and robust. As in many other works [25,26], the user's initial state, including location and orientation, is assumed to be known a priori.
The experiments were performed indoors, where severe magnetic perturbations existed and were difficult to calibrate. Thus, the compared PCA and uDirect approaches, originally adopted outdoors, could not deploy a compass to transform the local walking direction into the global one. For a fair comparison, the local walking directions obtained by the PCA and uDirect approaches were also transformed into the global one by our calibration process. Therefore, in our experiments, the accuracy differences between the compared approaches were mainly caused by their local walking direction extraction schemes.
Figure 6. Indoor test environment.

5.2. Heading Estimation Performance Analysis

Figure 7 compares the heading estimation error distributions of the different approaches. Clearly, RMPCA performs best, since far fewer of its estimation errors fall in the large-magnitude region than for the other two approaches. Similar results can be seen in Figure 8. The 50th percentile absolute heading estimation errors of RMPCA, PCA and uDirect are 4.6°, 9.5°, and 6.9°, respectively, while the 75th percentile absolute errors are 12.1°, 18.5° and 20.3°. RMPCA reduces the mean absolute estimation error by 25.8% (3.1°) and 31.0% (4.0°) compared with PCA and uDirect, respectively. Figure 9 shows the heading estimation results of the compared approaches for one trace. One can clearly see that the heading estimation of our RMPCA is the closest to the baseline direction.
The main difference between PCA and our RMPCA is the estimation scheme for the vertical acceleration component, which is used to infer the horizontal plane of the acceleration signals. PCA obtains the vertical component by finding acceleration signals whose magnitude approaches $9.81\ \mathrm{m/s^2}$ and whose variance in all three dimensions is close to 0. However, this scheme suffers from two drawbacks. First, the obtained vertical components are always suboptimal due to the noisy components introduced by walking locomotion. Second, the DCS of the obtained vertical component does not coincide with the DCS of the other acceleration signals within the same stride. Thus, directly deploying the obtained vertical component to compute the horizontal plane yields an inaccurate estimation. In contrast to the PCA approach, our RMPCA approach avoids the second drawback by projecting the acceleration signals into the same RCS. As for the first drawback, the noisy components introduced by body locomotion are alleviated substantially by defining the RCS during the few seconds when the user stands still and starts the application. Therefore, our RMPCA approach performs much better than the PCA approach.
For the uDirect approach, the local walking direction is extracted at the moment when the side-to-side acceleration component is minimized during the walking cycle. Unfortunately, such a moment is susceptible to sideways acceleration components. Even when the side-to-side acceleration component is indeed minimized, the walking direction component may not dominate the horizontal plane of the acceleration signals, according to our experimental study. Therefore, the uDirect approach performs the worst and is more likely to generate heading estimation errors of large magnitude.
Figure 7. Heading estimation error distribution.
Figure 8. Performance comparisons of the absolute heading estimation error: mean, standard deviation (SD), 50th percentile and 75th percentile.
Figure 9. Heading estimations of different approaches vs. baseline.

5.3. Turn Detection for Heading Estimation Improvement

If the user is making a turn, the proposed RMPCA and the compared approaches all suffer from degraded performance. This may be attributed to the substantial sideways acceleration jitters during the user turn, which corrupt the acceleration pattern exploited by the heading estimation approaches. As seen in Figure 9, the estimated headings all deviate significantly from the baseline headings when the user makes the three turns. Therefore, we deploy the turn detection algorithm proposed in Section 4.4 to improve the heading estimation. Once a turn is reported, we estimate the current heading for each step by adding the current step heading change of Equation (22) to the heading of the previous step. This heading change scheme performs relatively well over a short period, while the estimation error may accumulate rapidly due to the jitters caused by body locomotion. Thus, once a turn termination is recognized, we deploy the proposed RMPCA for heading estimation again. Figure 10 shows the heading estimation performance improvement of RMPCA when combined with the proposed turn detection algorithm. Since the large-magnitude estimation errors caused by the user turns are reduced, the tail of the error distribution is cut down. In particular, the turn detection algorithm reduces the mean absolute error, standard deviation (SD), and 90th percentile from 8.93°, 12.75°, and 21.3° to 8.05°, 11.12°, and 19.1°, respectively.
Figure 10. Heading estimation performance comparisons between RMPCA with and without turn detection. (a) Error distribution; (b) Statistical results of the absolute error.

5.4. PDR Application

Although heading estimation for PDR using a smartphone in a highly controlled situation, such as held in hand, has been widely studied, it is still an unsolved problem for smartphones placed in the user's pant pocket. The proposed RMPCA aims to solve this problem in three steps: project the acceleration signals into the RCS by the related rotation matrix, estimate the local walking direction at the RCS by applying PCA, and finally obtain the global walking direction at the GCS by a calibration process. In order to calculate the walking distance, a step length estimation model should also be developed. Though step length is determined by various factors, including height, attitude, and walking frequency [16,33], for the same pedestrian it mainly depends on the walking speed. For different pedestrians, we may develop a memorization function to store the personal parameters of the step length estimation model. Considering the strong correlation of the walking speed with the amplitude of the measured acceleration, we deploy the empirical model [34] given as follows:
$$StepLength = K \sqrt[4]{Acc_{\max} - Acc_{\min}} \quad (23)$$
where $Acc_{\max}$ and $Acc_{\min}$ represent the maximum and minimum amplitudes of the vertical acceleration component during each stride, and $K$ is a personalized parameter that needs to be calibrated for each pedestrian. Figure 11 shows the tracking trajectory comparison of one trace between the different heading estimation approaches. Clearly, the heading estimation error is the main localization error source for PDR. Thus, the proposed RMPCA, with the smallest heading estimation error, obtains the best tracking performance. As shown in Table 1, the proposed RMPCA is the only approach whose mean error and 50th percentile error are within 1.5 m. RMPCA reduces the mean localization error by 32.2% (0.66 m) and 37.7% (0.84 m) compared with PCA and uDirect, respectively.
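A one-line sketch of the empirical fourth-root step-length model above [34]; the default value of the calibration constant K is purely illustrative, since K is pedestrian-specific.

```python
def step_length(acc_max, acc_min, K=0.5):
    """Empirical step length from the vertical acceleration range within a stride."""
    return K * (acc_max - acc_min) ** 0.25
```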
Figure 11. Tracking performance comparisons of one trace between different heading estimation approaches.
Table 1. Comparisons of real-time localization errors (m).

Heading Estimation Approach | Mean | Standard Deviation | 50th Percentile | 95th Percentile
RMPCA | 1.39 | 1.16 | 1.43 | 3.58
PCA | 2.05 | 1.62 | 2.14 | 5.04
uDirect | 2.23 | 1.84 | 2.30 | 5.67

6. Conclusions and Future Work

This paper presents RMPCA, an approach to determine the pedestrian walking direction indoors using a smartphone placed in the user's pant pocket, independent of the smartphone's orientation. First, we develop an EKF-based attitude estimation model to compute the rotation matrix between the DCS and the RCS, which is used to adapt to the changing DCS and obtain the horizontal plane of the acceleration signals at the RCS. Then, PCA is applied over the horizontal plane for local walking direction extraction. Finally, we develop a calibration process for global walking direction translation without requiring noisy compass readings, which are unreliable in most indoor environments. In addition, a user turn detection algorithm exploiting the heading change pattern is used to improve the heading estimation. Experimental results show that our approach reduces the mean absolute heading estimation error by 25.8% (3.1°) and 31.0% (4.0°) compared with PCA and uDirect, respectively.
In future work, to verify the effectiveness of the proposed RMPCA, we will perform experiments under more complicated conditions, such as on various curved walking paths and with random walking velocities. The effect of other smartphone mounting positions, including in a bag, against the ear during a phone call, and in a swinging hand, will also be studied for the proposed RMPCA. More importantly, we will further extend the proposed heading estimation approach to long-term indoor navigation applications, which have received little attention. The main challenge is the accumulated error caused by gyroscope biases. We will introduce landmarks or building map information for gyroscope bias calibration and related orientation self-calibration. In addition, external localization techniques such as Wi-Fi fingerprinting and RFID may be combined to provide the initial user position and orientation estimates.

Acknowledgments

This research is supported by the National Natural Science Foundation of China (Grant Nos. 61301132, 61300188, and 61301131) and the Fundamental Research Funds for the Central Universities (Grant No. 3132014211).

Author Contributions

Zhi-An Deng proposed the original idea and wrote this paper; Guofeng Wang analyzed the data and gave some valuable suggestions; Ying Hu developed the turn detection algorithm; Di Wu designed the experiments and revised the paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Deng, Z.-A.; Xu, Y.B.; Ma, L. Indoor positioning via nonlinear discriminative feature extraction in wireless local area network. Comput. Commun. 2012, 35, 738–747. [Google Scholar] [CrossRef]
  2. Chehri, A.; Fortier, P.; Tardif, P.M. UWB-based sensor networks for localization in mining environments. Ad Hoc Netw. 2009, 7, 987–1000. [Google Scholar] [CrossRef]
  3. Kong, J. Fault-tolerant RFID reader localization based on passive RFID tags. IEEE Trans. Parall. Distr. 2014, 25, 2065–2076. [Google Scholar]
  4. Renaudin, V.; Combettes, C. Magnetic, acceleration fields and gyroscope quaternion (MAGYQ)-based attitude estimation with smartphone sensors for indoor pedestrian navigation. Sensors 2014, 14, 22864–22890. [Google Scholar] [CrossRef] [PubMed]
  5. Qian, J.; Pei, L.; Ma, J.; Ying, R.; Liu, P. Vector graph assisted pedestrian dead reckoning using an unconstrained smartphone. Sensors 2015, 15, 5032–5057. [Google Scholar] [CrossRef] [PubMed]
  6. Hoseinitabatabaei, S.A.; Gluhak, A.; Tafazoll, R. A survey on smartphone-based systems for opportunistic user context recognition. ACM Comput. Surv. 2013, 45, 1–55. [Google Scholar] [CrossRef]
  7. Deng, Z.-A.; Hu, Y.; Yu, J.; Na, Z. Extended kalman filter for real time indoor localization by fusing WiFi and smartphone inertial sensors. Micromachines 2015, 6, 523–543. [Google Scholar] [CrossRef]
  8. Callmer, J.; Tornqvist, D.; Gustafsson, F. Robust heading estimation indoors using convex optimization. In Proceedings of the 16th International Conference on Information Fusion (FUSION), Istanbul, Turkey, 9–12 July 2013; pp. 1173–1179.
  9. Chen, Z.; Zou, H.; Jiang, H.; Zhu, Q.; Soh, Y.C.; Xie, L. Fusion of WiFi, smartphone sensors and landmarks using the Kalman filter for indoor localization. Sensors 2015, 15, 715–732. [Google Scholar] [CrossRef] [PubMed]
  10. Diaz, E.M.; Gonzalez, A.L.M.; de Ponte Müller, F. Standalone inertial pocket navigation system. In Proceedings of the IEEE/ION Position, Location and Navigation Symposium (PLANS) 2014, Monterey, CA, USA, 5–8 May 2014; pp. 241–251.
  11. Ichikawa, F.; Chipchase, J.; Grignani, R. Where’s the phone? A study of mobile phone location in public spaces. In Proceedings of the 2005 Mobility Conference on Mobile Technology Applications & Systems Retrieve, Guangzhou, China, 15–17 November 2005; pp. 1–8.
  12. Afzal, M.H.; Renaudin, V.; Lachapelle, G. Assessment of indoor magnetic field anomalies using multiple magnetometers. In Proceedings of the 23rd International Technical Meeting of the Satellite Division of the Institute of Navigation (ION GNSS 2010), Portland, OR, USA, 21–24 September 2010; pp. 1–9.
  13. Foxlin, E. Pedestrian tracking with shoe-mounted inertial sensors. IEEE Comput. Gr. Appl. 2005, 25, 38–46. [Google Scholar] [CrossRef]
  14. Jung, W.; Woo, W.; Lee, S. Orientation tracking exploiting ubiTrack. In Proceedings of the UbiComp 2005, Tokyo, Japan, 11–14 September 2005; pp. 47–50.
  15. Jirawimut, R.; Prakoonwit, S.; Cecelja, F.; Balachandran, W. Visual odometer for pedestrian navigation. IEEE Trans. Instrum. Meas. 2003, 52, 1166–1173. [Google Scholar] [CrossRef]
  16. Pei, L.; Chen, R.; Chen, Y.; Leppakoski, H.; Perttula, A. Indoor/outdoor seamless positioning technologies integrated on smart phone. In Proceedings of the IEEE International Conference on Advances in Satellite and Space Communications, Colmar, France, 20–25 July 2009; pp. 141–145.
  17. Roy, N.; Wang, H.; Choudhury, R.R. I am a smartphone and I can tell my user’s walking direction. In Proceedings of the 12th Annual International Conference on Mobile Systems, Applications, and Services, Bretton Woods, NH, USA, 16–19 June, 2014; pp. 354–354.
  18. Evennou, F.; Marx, F. Advanced integration of WiFi and inertial navigation systems for indoor mobile positioning. Eurasip J. Appl. Signal Proc. 2006, 2006, 1–11. [Google Scholar] [CrossRef]
  19. Kunze, K.; Lukowicz, P.; Junker, H.; Troster, G. Where am I: Recognizing on-body positions of wearable sensors. Locat. Context Aware. 2005, 3479, 264–275. [Google Scholar]
  20. Kunze, K.; Lukowicz, P.; Partridge, K.; Begole, B. Which way am I facing: Inferring horizontal device orientation from an accelerometer signal. In Proceedings of the International Symposium on Wearable Computers, Linz, Austria, 4–7 September 2009; pp. 149–150.
  21. Steinhoff, U.; Schiele, B. Dead reckoning from the pocket—An experimental study. In Proceedings of the IEEE International Conference on Pervasive Computing and Communications (PerCom), Mannheim, Germany, 29 March–2 April 2010; pp. 162–170.
  22. Hoseinitabatabaei, S.A.; Gluhak, A.; Tafazolli, R.; Headley, W. Design, realization, and evaluation of uDirect—An approach for pervasive observation of user facing direction on mobile phones. IEEE Trans. Mob. Comput. 2014, 13, 1981–1994. [Google Scholar] [CrossRef]
  23. Afzal, M.H.; Renaudin, V.; Lachapelle, G. Use of earth’s magnetic field for mitigating gyroscope errors regardless of magnetic perturbation. Sensors 2011, 11, 11390–11414. [Google Scholar] [CrossRef] [PubMed]
  24. De Vries, W.; Veeger, H.; Baten, C.; van der Helm, F. Magnetic distortion in motion labs, implications for validating inertial magnetic sensors. Gait Posture 2009, 29, 535–541. [Google Scholar] [CrossRef] [PubMed]
  25. Kim, Y.; Chon, Y.; Cha, H. Smartphone-based collaborative and autonomous radio fingerprinting. IEEE Trans. Syst. Man Cybern. C Appl. Rev. 2012, 42, 112–122. [Google Scholar] [CrossRef]
  26. Li, F.; Zhao, C.; Ding, G.; Gong, J.; Liu, C.; Zhao, F. A reliable and accurate indoor localization method using phone inertial sensors. In Proceedings of the 14th ACM International Conference on Ubiquitous Computing, Pittsburgh, PA, USA, 5–8 September 2012; pp. 1–10.
  27. Ma, L.; Xu, Y.B. Received signal strength recovery in green WLAN indoor positioning system using singular value thresholding. Sensors 2015, 15, 1292–1311. [Google Scholar] [CrossRef] [PubMed]
  28. Wang, H.; Sen, S.; Elgohary, A.; Farid, M.; Youssef, M.; Choudhury, R.R. No need to war-drive: unsupervised indoor localization. In Proceedings of the 10th International Conference on Mobile Systems, Applications, and Services (MobiSys’12), Low Wood Bay, Lake District, UK, 25–29 June 2012; pp. 197–210.
  29. Chou, J.C.K. Quaternion kinematic and dynamic differential equations. IEEE Trans. Robot. Autom. 1992, 8, 53–64. [Google Scholar] [CrossRef]
  30. Sabatini, A.M. Quaternion-based extended Kalman filter for determining orientation by inertial and magnetic sensing. IEEE Trans. Biomed. Eng. 2006, 53, 1346–1356. [Google Scholar] [CrossRef] [PubMed]
  31. Bar-Shalom, Y.; Li, X.-R.; Kirubarajan, T. Estimation with Applications to Tracking and Navigation; Wiley: New York, NY, USA, 2001. [Google Scholar]
  32. Kourogi, M.; Kurata, T. Personal positioning based on walking locomotion analysis with self-contained sensors and a wearable camera. In Proceedings of the Second IEEE/ACM International Symposium on Mixed and Augmented Reality, Tokyo, Japan, 8–10 October 2003; pp. 103–112.
  33. Brajdic, A.; Harle, R. Walk detection and step counting on unconstrained smartphones. In Proceedings of the 2013 ACM International Joint Conference on Pervasive and Ubiquitous Computing, Zurich, Switzerland, 8–12 September, 2013; pp. 225–234.
  34. Jahn, J.; Batzer, U.; Seitz, J.; Patino-Studencka, L.; Boronat, J.G. Comparison and evaluation of acceleration based step length estimators for handheld devices. In Proceedings of the 2010 International Conference on Indoor Positioning and Indoor Navigation (IPIN), Zurich, Switzerland, 15–17 September 2010; pp. 1–6.
