Article

Smartphone-Based Human Sitting Behaviors Recognition Using Inertial Sensor

by Vikas Kumar Sinha 1, Kiran Kumar Patro 2, Paweł Pławiak 3,4,* and Allam Jaya Prakash 1

1 Department of Electronics and Communication Engineering, National Institute of Technology, Rourkela 769008, India
2 Department of Electronics and Communication Engineering, Aditya Institute of Technology and Management (A), Tekkali 532201, India
3 Department of Computer Science, Faculty of Computer Science and Telecommunications, Cracow University of Technology, Warszawska 24, 31-155 Krakow, Poland
4 Institute of Theoretical and Applied Informatics, Polish Academy of Sciences, Bałtycka 5, 44-100 Gliwice, Poland
* Author to whom correspondence should be addressed.
Submission received: 7 August 2021 / Revised: 22 September 2021 / Accepted: 28 September 2021 / Published: 7 October 2021
(This article belongs to the Section Optical Sensors)

Abstract

At present, people spend most of their time in passive rather than active mode. Sitting at a computer for long periods may lead to unhealthy conditions such as shoulder pain, numbness, and headache. To avoid these problems, sitting posture should be changed at regular intervals. This paper uses the inertial sensor built into a smartphone to monitor and help correct the unhealthy human sitting behaviors (HSBs) of office workers. Six volunteers (four male and two female) within an age band of 26 ± 3 years were considered for monitoring. The inertial sensor was attached to the rear upper trunk of the body, and a dataset was generated for five different activities performed by the subjects while sitting in an office chair. A correlation-based feature selection (CFS) technique and particle swarm optimization (PSO) were jointly used to select the feature vectors. The optimized features were fed to supervised machine learning classifiers, namely naive Bayes, SVM, and KNN, for recognition. Finally, the SVM classifier achieved 99.90% overall accuracy for the different human sitting behaviors using the accelerometer, gyroscope, and magnetometer sensors.

1. Introduction

People spend their lives in three modes from childhood to old age: active, sedentary, and non-active. In childhood, they spend much time in active mode, playing outdoor games, dancing, running, jumping, and so on. In middle age, people spend a lot of time passively, for example sitting on a chair at the office or at home on a sofa or bed while watching television for long periods. In old age, people spend most of their time in non-active mode because of lower physical energy levels. Today, people spend little time in active mode and most of their time sleeping or sedentary. In a sedentary lifestyle, people spend much of the day sitting and sleeping because of busy schedules, and they pay little attention to their sitting posture. Sitting itself is not the issue, but sitting in wrong postures for long durations as part of a daily routine may lead to major health problems.
Nowadays, prolonged sitting is often regarded as the new smoking, as it can lead to many health issues such as back pain [1], sciatica [2], and cervical spondylosis [3]. The traditional technique for analyzing a patient's sitting posture is to have the patient sit on a chair in the hospital in front of an observer, such as a doctor, therapist, or nurse, and answer questions about sitting postures; this approach has since been replaced by smart cushion systems [4]. Such a diagnostic session generally takes approximately an hour, which is too short for a complete observation of the patient's health status. If major issues are found in the patient's spine, doctors prefer an X-ray to check the curvature of the spine and locate the damaged part. However, X-ray, computerized tomography (CT) scans, and magnetic resonance imaging cannot be used on a regular basis.
In [4,5,6,7,8], smart chair-based approaches are applied to improve sitting postures. Cushion-based smart chairs combine pressure sensors and an IMU to monitor sitting behavior at the workplace, in the car, and in a wheelchair. A generated dataset of sitting behavior has been characterized with approximate entropy and standard deviation features to monitor and recognize activities and activity levels [5]. In [4], a smart cushion system containing calibrated e-textile sensors was implemented to monitor sitting activities, and a dynamic time warping-based algorithm was developed to recognize human sitting behaviors. Sitting posture monitoring systems (SPMSs) are used to assess the real-time posture of a seated person and improve sitting behaviors [7,8]. Roh et al. [6] proposed a system to monitor six sitting behaviors by mounting only four low-cost load cells onto the chair's seat plate and one on the backrest. Ergonomic information plays a crucial role in improving the sitting behaviors and the attitude of the seated person [9,10,11,12].
The basic descriptions of the machine learning classifiers used in this work are as follows. KNN is a non-parametric algorithm that is popular for its simplicity and effectiveness [13]. The KNN classifier does not require a training (learning) process [14]; it classifies a given object based on the closest training object(s) [15]. The KNN classifier has two open issues to be addressed [16]: the first is how to measure the similarity between two points, and the other is how to choose an ideal value for K [17]. The authors in [18] presented an embedded system running a KNN classifier that achieved 75% accuracy by applying condensed nearest neighborhood as a prototype selection technique, data balancing with the Kennard-Stone method, and PCA to reduce the dimensionality.
The support vector machine (SVM) [19] is a supervised learning approach that analyzes a dataset and recognizes patterns; it is mainly used for classification and regression analysis [20]. The mathematical formulation of the SVM algorithm is covered in [21,22]. The SVM classifier aims to find the optimal separating hyperplane between two classes for a labeled training dataset [23]. The SVM classifier can also be used for gesture recognition; for example, an SVM classifier was used in [24] for hand gesture recognition on a publicly available dataset. The naive Bayes classifier is another supervised technique, popular for its simplicity and ease of implementation. In the naive Bayes classifier, the input features need to be independent; under this assumption, the conditional likelihood function of each sitting behavior can be expressed as a product of probability density functions [25].
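For illustration, a minimal sketch of how the three classifiers discussed above could be configured is shown below. The study itself was carried out in MATLAB; the scikit-learn API used here, and the placeholder names X_train, y_train, X_test, and y_test, are assumptions made purely for this sketch.

```python
# Hedged sketch: configuring the three classifiers described above with
# scikit-learn (an assumption; the authors used MATLAB R2021a).
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

classifiers = {
    "KNN (K = 5)": KNeighborsClassifier(n_neighbors=5),  # K is an open design choice
    "SVM (RBF)": SVC(kernel="rbf"),                       # separating hyperplane with RBF kernel
    "Naive Bayes": GaussianNB(),                          # assumes independent input features
}

# Typical usage on a labeled feature matrix and label vector (placeholders):
# for name, clf in classifiers.items():
#     clf.fit(X_train, y_train)
#     print(name, clf.score(X_test, y_test))
```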
To improve the analysis of sitting postures, we need to replace the traditional techniques with more advanced ones [26]. We live in the 21st century, and people use smartphones for their convenience. First, we need to define the different sitting postures discussed in the previous literature, such as forward, middle, or backward position; forward sitting posture; reclined sitting posture; slumped sitting posture; laterally tilted (left or right) sitting posture; arm back leaning; right over left; twist left; poking chin; arm leaning; crossed legs (right over left or left over right) sitting posture; and postures in school children [18,27,28,29,30,31,32,33]. The proposed study covers five different sitting postures, illustrated in Figure 1. Based on previous studies, these sitting postures are the major sitting activities performed by employees in office chairs.
The inertial sensors built into a smartphone consist of an accelerometer, gyroscope, and magnetometer. Using the IMU sensor, the movement of the body can be easily tracked. The IMU sensor is attached to the upper rear trunk to track the exact movement of the spine, which helps recognize the postures of the body [28]. Sitting postures are categorized into two groups: correct postures and incorrect postures. This paper considers one correct posture, the straight movement, and four incorrect postures: left, right, front, and back movements.
Some important related papers are summarized in Table 1 in terms of the sensors used, the classifiers, and their accuracy. Some of the methods listed in Table 1 have limitations such as a small number of subjects, a limited number of postures, high complexity, long convergence time, high cost, and low accuracy. This paper uses a smartphone to detect the time spent in correct and incorrect sitting postures during office working hours. The novel contributions of the paper are as follows:
  • A sitting behaviors dataset is self-generated in the current paper.
  • The inertial sensor in a smartphone is helpful in monitoring the sitting behaviors of office workers continuously.
  • Determining the time spent on different sitting behaviors, whether correct or incorrect.
  • The user can detect the correct and incorrect sitting behaviors.
The rest of the paper is organized as follows: Section 2 discusses the framework of the current paper, which is illustrated in Figure 1, and briefly describes data collection, pre-processing, feature extraction, feature selection, and the classifiers. Section 3 presents the experimental setup and a discussion of the results. Section 4 concludes the paper and compares the outcomes.

2. Framework of Smartphone-Based Sitting Detection

Many different ways of monitoring multiple sitting postures in a chair have been discussed in previous studies. In the proposed approach, the inertial sensor of the smartphone is used as the sitting behavior detector. The smartphone is one of the most practical means of monitoring daily sitting activities at home or in the office. In this study, cost, data access, compatibility, unobtrusive use, and ease of system deployment were prioritized.
Our system collects a dataset for five different static movements of the body while sitting in a chair, as illustrated in Figure 1. The smartphone is attached to the rear upper trunk at the second thoracic vertebra (T2) to gather the measurable dataset, as illustrated in Figure 2, which shows the anatomical landmark of the sensor on the human body. All data collected from the inertial measurement unit (IMU) built into the smartphone are classified after the feature extraction and feature subset selection processes. The system can identify when the subject moves from the correct sitting posture to an incorrect one. The collected raw data cannot be fed directly to the classifiers. To recognize the sitting postures properly, five steps are required: raw data collection for the different sitting behaviors, data pre-processing, feature extraction, feature selection, and classification, as shown in Figure 1 and discussed in the remainder of this paper.

2.1. Data Collection and Preprocessing

In this work, a new dataset was generated for detecting human sitting behavior while sitting on a chair. Six subjects (four males and two females) within an age band of 26 ± 3 years participated in generating the dataset with the inertial sensors. The IMU sensor was attached to the rear upper trunk of each subject. The IMU of the smartphone was used because it is user friendly. The entire dataset was recorded at a 50 Hz sampling rate using a sensor data collector Android application.
A total of five general movements were considered while sitting on the chair in the office as follows:
  • Left movement;
  • Right movement;
  • Front movement;
  • Back movement;
  • Straight movement.
The subjects performed the different sitting activities for different time intervals. A sample sequence of the activities performed on the chair is as follows: left movement for 712 s, right movement for 756 s, front movement for 665 s, back movement for 590 s, and straight movement for 549 s. Each activity was performed in the presence of an instructor so that the dataset was generated correctly. The total number of instances for each activity is reported in Table 2 and visualized with a pie chart in Figure 3.

Hardware Platform

The OnePlus 6 smartphone was used as the hardware platform for dataset gathering [37]. It contains the accelerometer, gyroscope, and magnetometer sensors required to recognize physical activities, and it runs the OxygenOS 10.3.4 Android operating system. The small, low-power Bosch BMI160, a low-noise 16-bit IMU designed by Bosch, serves as the accelerometer and gyroscope at resolutions of 0.002 m/s² and 0.03 rad/s, respectively. The AKM AK09915 magnetometer used in the OnePlus 6 has a resolution of 0.6 µT. The smartphone is 155.70 mm high, 75.40 mm wide, 7.80 mm deep, and weighs 177 g.
Data pre-processing is a crucial step after data collection and before feature extraction. First, the noise present in the collected data was removed, and the data were then normalized to the range of −1 to +1 so that all signals share the same scale, which helps in extracting better features from the dataset of the different sitting behaviors in the office chair.
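As a minimal sketch of this step, the snippet below applies a simple per-axis min-max rescaling to the range −1 to +1. The exact normalization formula is not stated in the paper, so this particular choice is an assumption.

```python
import numpy as np

def normalize_signal(x):
    """Rescale a 1-D sensor signal to the range [-1, +1] (min-max sketch)."""
    x = np.asarray(x, dtype=float)
    x_min, x_max = x.min(), x.max()
    if x_max == x_min:               # constant signal: map everything to zero
        return np.zeros_like(x)
    return 2.0 * (x - x_min) / (x_max - x_min) - 1.0

# Example: normalize each accelerometer axis independently
# acc_norm = np.column_stack([normalize_signal(acc[:, k]) for k in range(3)])
```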

2.2. Feature Extraction

To calculate feature vectors from the collected dataset, a window of ω_t seconds (N = ω_t × f_s samples) was considered [13]. Here, f_s is the 50 Hz sampling frequency of the dataset and N is the number of samples per window. The total (resultant) signal of the accelerometer, gyroscope, and magnetometer can be calculated by [14]:
A_T = \sqrt{A_x^2 + A_y^2 + A_z^2} \quad (1)
where A_x, A_y, and A_z are the components along the x, y, and z axes of the accelerometer (and analogously for the gyroscope and magnetometer). For better understanding, the features are divided into two categories: morphological features and entropy-based features. The morphological features include the mean of absolute values [33], harmonic mean [33], variance [37,38], standard deviation [38,39], root mean square, and simple squared integral [40]; the entropy-based features are wavelet entropy and log energy entropy.
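A short sketch of the windowing and resultant-signal computation is shown below. The window length ω_t is not specified in this excerpt, so the value used here is illustrative only, and the NumPy implementation is an assumption (the analysis itself was performed in MATLAB).

```python
import numpy as np

FS = 50            # sampling frequency f_s in Hz, as used for the collected data
WINDOW_SEC = 2.0   # window length w_t in seconds (illustrative value, not from the paper)
N = int(WINDOW_SEC * FS)   # samples per window, N = w_t * f_s

def resultant(xyz):
    """Total signal per Equation (1): sqrt(x^2 + y^2 + z^2) for each sample."""
    return np.sqrt(np.sum(np.square(xyz), axis=1))

def windows(signal, n=N):
    """Split a 1-D signal into consecutive non-overlapping windows of n samples."""
    return [signal[i:i + n] for i in range(0, len(signal) - n + 1, n)]

# a_total = resultant(acc_xyz)       # acc_xyz: array of shape (num_samples, 3)
# feature_windows = windows(a_total)
```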

2.2.1. Morphological Features

The morphological features describe the structure and shape of the sitting-behavior signals and are formulated as follows:
\mathrm{Mean\ Absolute\ Value\ (MAV)} = \frac{1}{N}\sum_{i=1}^{N}|x_i| \quad (2)

\mathrm{Standard\ Deviation\ }(\sigma) = \sqrt{\frac{1}{N}\sum_{i=1}^{N}(x_i-\mu)^2} \quad (3)

\mathrm{Variance\ (VAR)} = \frac{1}{N}\sum_{i=1}^{N}(x_i-\mu)^2 \quad (4)

\mathrm{Skewness} = \frac{1}{N}\sum_{i=1}^{N}\left(\frac{x_i-\mu}{\sigma}\right)^3 \quad (5)

\mathrm{Root\ Mean\ Square\ (RMS)} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}x_i^2} \quad (6)

\mathrm{Simple\ Squared\ Integral\ (SSI)} = \sum_{i=1}^{N}x_i^2 \quad (7)
where μ is the mean of the data, σ is the standard deviation, and x_i are the collected samples. The simple squared integral gives the energy of the signal [33].
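A compact sketch of the morphological feature computations in Equations (2)-(7) is given below. The authors' MATLAB implementation is not available, so this NumPy version is only an illustrative equivalent.

```python
import numpy as np

def morphological_features(x):
    """Morphological features of one window, following Equations (2)-(7)."""
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std()
    return {
        "MAV": np.mean(np.abs(x)),                                    # mean absolute value
        "SD": sigma,                                                  # standard deviation
        "VAR": np.mean((x - mu) ** 2),                                # variance
        "Skewness": np.mean(((x - mu) / sigma) ** 3) if sigma else 0.0,
        "RMS": np.sqrt(np.mean(x ** 2)),                              # root mean square
        "SSI": np.sum(x ** 2),                                        # simple squared integral (energy)
    }

# Example: compute morphological_features(window) for every window of every sensor axis.
```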

2.2.2. Entropy-Based Features

Two kinds of entropy-based features were introduced. Wavelet entropy measures the relative energies in the different signals and is used to determine the degree of disorder [1]:
\mathrm{Wavelet\ Entropy\ (WE)} = \sum_{i=1}^{N} x_i \ln(x_i) \quad (8)
Log energy entropy (LEE): for a time series x(n) of finite length N, the LEE is calculated as [11]:
\mathrm{Log\ Energy\ Entropy\ (LEE)} = \sum_{i=1}^{N} \log_2\left(x_i^2\right) \quad (9)
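The sketch below computes the two entropy-based features. The wavelet entropy here follows the common Shannon-style form over normalized relative energies, and small constants are added to avoid log(0); both details are assumptions, since the paper does not spell them out.

```python
import numpy as np

def wavelet_entropy(x, eps=1e-12):
    """Entropy of the relative energies of a window (sketch of Equation (8))."""
    x = np.asarray(x, dtype=float)
    p = np.square(x) / (np.sum(np.square(x)) + eps)   # relative energy of each sample
    p = p[p > eps]
    return float(-np.sum(p * np.log(p)))

def log_energy_entropy(x, eps=1e-12):
    """Log energy entropy of a window (sketch of Equation (9))."""
    x = np.asarray(x, dtype=float)
    return float(np.sum(np.log2(np.square(x) + eps)))
```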

2.3. Feature Subset Selection

The dimensionality of the feature set also affects the performance of the classifiers. One of the best ways to reduce the dimensionality of the extracted features is to apply a feature selection algorithm [40,41]. In this work, a filter method known as correlation-based feature selection (CFS) is applied [42,43,44]. CFS quickly identifies relevant features and discards redundant, noisy, and irrelevant features based on appropriate correlation measures. This approach enhances the performance of the classifiers while also reducing their computing time. Table 3 shows the subset of features selected according to their contributions. The CFS technique evaluates the predictive power of each feature and the redundancy among features at the same time; hence, a high correlation between the features and the classes is desirable.
Many different search methods are available for selecting the feature subset in the CFS technique. In [33], a scatter search technique with a diversification generation method is used to produce diverse subsets. In the current paper, the particle swarm optimization (PSO) search technique was applied. PSO was proposed by Eberhart and Kennedy in [45]. A PSO-based calibration technique was previously developed to obtain optimized error parameters such as scale factor, bias, and misalignment errors. All the required mathematical expressions and the fitness function are explained in [45,46,47]. By combining the CFS technique with the PSO search, a total of 27 features were selected out of the 85 calculated features, as shown in Table 3.
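To make the selection criterion concrete, the sketch below evaluates the standard CFS merit of a candidate subset; a search method (PSO over binary feature masks in this work) would use such a merit value as its fitness function. Pearson correlation and integer-encoded class labels are simplifying assumptions for this sketch.

```python
import numpy as np

def cfs_merit(X, y, subset):
    """CFS merit of a feature subset: k * r_cf / sqrt(k + k*(k-1)*r_ff), where
    r_cf is the mean feature-class correlation and r_ff the mean pairwise
    feature-feature correlation (absolute Pearson correlations, a sketch)."""
    k = len(subset)
    if k == 0:
        return 0.0
    r_cf = np.mean([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in subset])
    pairs = [(a, b) for i, a in enumerate(subset) for b in subset[i + 1:]]
    r_ff = np.mean([abs(np.corrcoef(X[:, a], X[:, b])[0, 1]) for a, b in pairs]) if pairs else 0.0
    return float(k * r_cf / np.sqrt(k + k * (k - 1) * r_ff))

# A PSO search moves particles over candidate feature subsets and keeps the
# subset with the highest cfs_merit(X, y_encoded, subset).
```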

2.4. Sitting Behavior Recognition Techniques

In this paper, three of the most popular recognition techniques, support vector machine (SVM), K-nearest neighbor (KNN), and naive Bayes, were applied after the feature selection algorithm to recognize the sitting behaviors, and their results were compared. The accuracy of each applied recognition technique is estimated from the confusion matrix [47]. The true and false predicted values are used to calculate the accuracy of the sitting behavior recognition techniques, as shown in Equation (10).
\mathrm{Accuracy} = \frac{TP + TN}{TP + TN + FP + FN} \quad (10)
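Equation (10) is written for the binary TP/TN/FP/FN case; for the five-class confusion matrices reported below, the overall accuracy reduces to the sum of the diagonal over the total number of instances, as in the small sketch here.

```python
import numpy as np

def overall_accuracy(confusion_matrix):
    """Overall accuracy: correctly classified instances (diagonal) / all instances."""
    cm = np.asarray(confusion_matrix, dtype=float)
    return float(np.trace(cm) / cm.sum())

# Example with a 5 x 5 confusion matrix of the five sitting behaviors:
# print(overall_accuracy(cm_svm))
```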

3. Results and Discussion

In this section, the experimental setup and results are discussed. MATLAB R2021a was used to perform all calculations and analyses of the human sitting behaviors dataset collected from office workers in an office environment. CFS and PSO were jointly utilized to select, from the various extracted features, the subset that provides the highest performance of the applied classifiers. The inertial sensor unit of the smartphone, which includes accelerometer, gyroscope, and magnetometer sensors, was employed to collect the raw dataset defining the five sitting behaviors of the office workers. The recognition results for each sitting behavior were obtained using 10-fold cross-validation for training. The dataset was split in a 3:1 ratio: 75% of the dataset (around 123,545 instances) was used for training with cross-validation and 25% (around 39,956 instances) was used as the test set. The feature vectors represent the static sitting movements in the x, y, and z directions.
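A minimal sketch of this evaluation protocol (75%/25% split plus 10-fold cross-validation on the training portion) is shown below. The scikit-learn pipeline and the random placeholder data are assumptions for illustration only; the study itself was performed in MATLAB on the real sitting-behavior features.

```python
import numpy as np
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.svm import SVC

# Placeholder data standing in for the 27 selected features and 5 behavior classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 27))
y = rng.integers(0, 5, size=1000)

# 3:1 split: ~75% training / ~25% held-out test set.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

svm = SVC(kernel="rbf")                                    # RBF ("medium Gaussian") kernel
cv_scores = cross_val_score(svm, X_train, y_train, cv=10)  # 10-fold cross-validation
print("Cross-validation accuracy: %.4f" % cv_scores.mean())

svm.fit(X_train, y_train)
print("Held-out test accuracy:    %.4f" % svm.score(X_test, y_test))
```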

3.1. Performance Analysis of Classifiers with Feature Selection of Accelerometer, Gyroscope, and Magnetometer

The performance evaluation of the KNN, naive Bayes, and SVM classifiers with feature selection using the accelerometer, gyroscope, and magnetometer is presented in Table 4. The evaluation is based on the sitting activities, using the subset of 27 features extracted from the combination of the accelerometer, gyroscope, and magnetometer sensors. All the classifiers were trained under similar conditions. A multiclass SVM (one-against-all) with a medium Gaussian (RBF) kernel classifies the test data. In the case of KNN, the Hamming distance metric is chosen to compute distances in the nearest neighbor search. A comparative analysis of the different K values of 3, 5, 7, and 11 for KNN is also given in Table 4. The overall accuracies of KNN (K = 3), KNN (K = 5), KNN (K = 7), KNN (K = 11), SVM, and naive Bayes are 99.73%, 99.82%, 99.50%, 99.40%, 99.89%, and 97.46%, respectively. From the results, it can be inferred that KNN with K values of 3 and 5 and the SVM classifier outperform the other classifiers. The performance of each classifier was evaluated for all five sitting behaviors, and a comparison of the experimental results is shown in Table 4.
The confusion matrix of all five sitting behaviors for the SVM classifier is shown in Table 5. The confusion matrix in Table 5 shows how instances of the four incorrect sitting behaviors and the straight movement are confused with one another. In the confusion matrix, each row represents instances of the true class, and each column represents instances of the predicted class.

3.2. Performance Analysis of the Classifiers with Feature Selection of Accelerometer and Gyroscope

Table 6 presents the results of the classifiers with feature selection using the accelerometer and gyroscope sensors. Results were obtained for KNN (K = 3, 5, 7, and 11), SVM, and naive Bayes. The naive Bayes classifier performs worst among the classifiers, with 91.60% overall accuracy. KNN with K = 5 and K = 11 performed poorly, with 98.00% overall accuracy, compared to KNN with K values of 3 and 7. KNN with K = 3 has the best accuracy among the KNN variants, at 98.60%. The SVM classifier achieves the overall best performance of 98.80%, and the overall comparison of classifier results for all input postures is shown in Figure 4.
From Figure 4, it can be inferred that the KNN (K = 3, K = 5, and K = 7), SVM, and naive Bayes classifiers achieve their maximum accuracy for the right movement, with 99.78%, 99.76%, 99.73%, 99.88%, and 99.37%, respectively. KNN with K = 11 achieved 96.95% accuracy for the left movement.
The confusion matrix for all the considered sitting behaviors of the office workers, obtained using the accelerometer and gyroscope with the SVM classifier, is shown in Table 7. The confusion matrix in Table 7 shows how the five different sitting behaviors are confused with one another.

3.3. Performance Analysis of the Classifiers with Feature Selection of Accelerometer

The results of all the applied classifiers using only the accelerometer sensor are shown in Table 8. From Table 8, it can be seen that the naive Bayes classifier achieved the lowest overall accuracy of 91.90% compared to the other classifier techniques. The KNN (K = 3), KNN (K = 5), and KNN (K = 7) classifiers performed comparably with about 99.70% accuracy, KNN with K = 11 achieved 99.60%, and the overall accuracy of SVM is 99.5%. The right movement was recognized most accurately by the naive Bayes and all the KNN classifiers, whereas the left movement was most accurately classified by the SVM classifier with 98.94% accuracy. The confusion matrix for all the considered sitting behaviors of the office workers using the accelerometer with the KNN (K = 3) classifier is shown in Table 9; it shows how the five different sitting behaviors are confused with one another.

3.4. Analysis of Results

In this work, we employed a smartphone as the sensing device for analyzing the sitting behavior of the test subjects. This work analyzed the static sitting behaviors of left movement, right movement, front movement, back movement, and straight movement. It was observed that the SVM classifier outperformed the other classifiers with feature selection from the accelerometer, gyroscope, and magnetometer. The performance of each classifier was evaluated for all five sitting behaviors. Even with feature selection using only the accelerometer and gyroscope sensors, the SVM classifier achieved the best performance. With only the accelerometer, the naive Bayes and all the KNN classifiers achieved their best performance for the right movement, whereas the SVM classifier most accurately classified the left movement. After analyzing the overall results, the SVM classifier provided better accuracy with considerable (stable) performance for detecting all the postures from the three sensors, whereas the other classifiers were not as accurate when all the postures were considered.

4. Conclusions

In this paper, five general sitting behaviors of office workers, namely left movement, right movement, front movement, back movement, and straight movement, were recognized using the inertial sensor built into a smartphone with the help of machine learning classification techniques. An efficient smartphone-based framework was developed with the stages of pre-processing, feature extraction, feature selection, and machine learning classification. Popular activity recognition techniques such as naive Bayes, SVM, and KNN were evaluated experimentally to classify the different sitting behaviors, and a comparative analysis among them was performed for the five sitting behaviors of the office worker. The performance of the KNN classifier at different values of K, namely 3, 5, 7, and 11, was also observed for the sitting behaviors on the office chair. Finally, 99.90% accuracy was achieved in the current work for all sitting behaviors on the office chair with the SVM classifier. The experimental results suggest that it is possible to distinguish between the five considered sitting behaviors using only a smartphone with its inertial sensor.

Author Contributions

Conceptualization, V.K.S. and A.J.P.; methodology, V.K.S. and P.P.; software, A.J.P.; validation, V.K.S., A.J.P. and K.K.P.; formal analysis, K.K.P.; investigation, V.K.S.; resources, P.P.; data acquisition, V.K.S.; writing—original draft preparation, K.K.P.; writing—review and editing, K.K.P.; visualization, P.P. and A.J.P.; supervision, P.P.; project administration, P.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Informed consent was taken from the Institutional Review Board/Institutional Ethics Committee.

Informed Consent Statement

Informed consent was obtained from all the volunteers involved in the study.

Data Availability Statement

The data presented in this work are not publicly available due to privacy.

Acknowledgments

The authors would like to thank all volunteers who provided us with valuable time and helped to generate the dataset.

Conflicts of Interest

The authors declare that there is no conflict of interest.

Abbreviations

CT      Computerized tomography
DT      Decision tree
FDA     Flexible discriminant analysis
FSR     Force-sensing resistor
HM      Harmonic mean
HSB     Human sitting behaviors
IMU     Inertial measurement unit
kNN     k-nearest neighbors
MPU     Multi-core processing unit
PCA     Principal component analysis
RBF     Radial basis function
RMS     Root mean square
SD      Standard deviation
SPMS    Sitting posture monitoring system
SSI     Simple squared integral
SVM     Support vector machine

References

  1. Haynes, S.; Williams, K. Impact of seating posture on user comfort and typing performance for people with chronic low back pain. Int. J. Ind. Ergon. 2008, 38, 35–46.
  2. Lis, A.M.; Black, K.M.; Korn, H.; Nordin, M. Association between sitting and occupational LBP. Eur. Spine J. 2007, 16, 283–298.
  3. Magnæs, B. Body position and cerebrospinal fluid pressure. J. Neurosurg. 1976, 44, 687–697.
  4. Xu, W.; Huang, M.C.; Amini, N.; He, L.; Sarrafzadeh, M. Ecushion: A textile pressure sensor array design and calibration for sitting posture analysis. IEEE Sens. J. 2013, 10, 3926–3934.
  5. Ma, C.; Li, W.; Gravina, R.; Cao, J.; Li, Q.; Fortino, G. Activity level assessment using a smart cushion for people with a sedentary lifestyle. Sensors 2017, 17, 2269.
  6. Roh, J.; Park, H.J.; Lee, K.J.; Hyeong, J.; Kim, S.; Lee, B. Sitting posture monitoring system based on a low-cost load cell using machine learning. Sensors 2018, 18, 208.
  7. Ren, X.; Yu, B.; Lu, Y.; Chen, Y.; Pu, P. Health-Sit: Designing posture based interaction to promote exercise during fitness breaks. Int. J. Hum.-Comput. Interact. 2019, 35, 870–885.
  8. Robertson, M.; Amick, B.C., III; DeRango, K.; Rooney, T.; Bazzani, L.; Harrist, R.; Moore, A. The effects of an office ergonomics training and chair intervention on worker knowledge, behavior and musculoskeletal risk. Appl. Ergon. 2009, 40, 124–135.
  9. Chanchai, W.; Songkham, W.; Ketsomporn, P.; Sappakitchanchai, P.; Siriwong, W.; Robson, M.G. The impact of ergonomics intervention on psychosocial factors and musculoskeletal symptoms among office workers. Int. J. Ind. Ergon. 2011, 41, 671–676.
  10. Goossens, R.H.M.; Netten, M.P.; Van der Doelen, B. An office chair to influence the sitting behavior of office workers. Work 2012, 41, 2086–2088.
  11. Menéndez, C.C.; Amick, B.C., III; Robertson, M.; Bazzani, L.; DeRango, K.; Rooney, T.; Moore, A. A replicated field intervention study evaluating the impact of a highly adjustable chair and office ergonomics training on visual symptoms. Appl. Ergon. 2012, 43, 639–644.
  12. Taieb-Maimon, M.; Cwikel, J.; Shapira, B.; Orenstein, I. The effectiveness of a training method using self-modelling webcam photos for reducing musculoskeletal risk among office workers using computers. Appl. Ergon. 2012, 43, 376–385.
  13. Garcia, S.; Derrac, J.; Cano, J.-R.; Herrera, F. Prototype Selection for Nearest Neighbor Classification: Taxonomy and Empirical Study. IEEE Trans. Pattern Anal. Mach. Intell. 2012, 34, 417–435.
  14. Theodoridis, S.; Pikrakis, A.; Koutroumbas, K.; Cavouras, D. Introduction to Pattern Recognition: A Matlab Approach; Academic Press: Waltham, MA, USA, 2010.
  15. Duda, R.O.; Hart, P.E.; Stork, D.G. Pattern classification. Int. J. Comput. Intell. Appl. 2001, 1, 335–339.
  16. Zhang, S. KNN-CF Approach: Incorporating Certainty Factor to kNN Classification. IEEE Intell. Inform. Bull. 2010, 1, 24–33.
  17. Zhang, S.; Li, X.; Zong, M.; Zhu, X.; Wang, R. Efficient kNN Classification with Different Numbers of Nearest Neighbors. IEEE Trans. Neural Netw. Learn. Syst. 2017, 29, 1774–1785.
  18. Rosero-Montalvo, P.D.; Peluffo-Ordonez, D.H.; Batista, V.F.L.; Serrano, J.; Rosero, E.A. Intelligent system for identification of wheelchair users' posture using machine learning techniques. IEEE Sens. J. 2018, 19, 1936–1942.
  19. Chang, C.; Lin, C. LIBSVM: A library for support vector machines. ACM Trans. Intell. Syst. Technol. 2011, 2, 1–27.
  20. Hall, M.A. Correlation-Based Feature Selection for Machine Learning. The University of Waikato. Available online: https://www.cs.waikato.ac.nz/~ml/publications/1999/99MH-Thesis.pdf (accessed on 25 September 2021).
  21. Burges, C.J. A Tutorial on Support Vector Machines for Pattern Recognition. Data Min. Knowl. Discov. 1998, 2, 121–167.
  22. Cortes, C.; Vapnik, V. Support vector networks. Mach. Learn. 1995, 20, 273–297.
  23. Gaglio, S.; Re, G.L.; Morana, M. Human Activity Recognition Process Using 3-D Posture Data. IEEE Trans. Hum.-Mach. Syst. 2014, 45, 586–597.
  24. Miron, C.; Pasarica, A.; Costin, H.; Manta, V.; Timofte, R.; Ciucu, R. Hand Gesture Recognition based on SVM Classification. In Proceedings of the 2019 E-Health and Bioengineering Conference (EHB), Iasi, Romania, 21–23 November 2019.
  25. Attal, F.; Mohammed, S.; Dedabrishvili, M.; Chamroukhi, F.; Oukhellou, L.; Amirat, Y. Physical Human Activity Recognition Using Wearable Sensors. Sensors 2015, 15, 31314–31338.
  26. Vergara, M.; Page, Á. System to measure the use of the backrest in sitting-posture office tasks. Appl. Ergon. 2000, 31, 247–254.
  27. Zemp, R.; Fliesser, M.; Wippert, P.-M.; Taylor, W.R.; Lorenzetti, S. Occupational sitting behaviour and its relationship with back pain: A pilot study. Appl. Ergon. 2016, 56, 84–91.
  28. Grandjean, E.; Hünting, W. Ergonomics of posture: Review of various problems of standing and sitting posture. Appl. Ergon. 1977, 8, 135–140.
  29. Hyeong, J.-H.; Roh, J.-R.; Park, S.-B.; Kim, S.; Chung, K.-R. A Trend Analysis of Dynamic Chair and Applied Technology. J. Ergon. Soc. Korea 2014, 33, 267–279.
  30. Wang, S.J.; Sommer, B.; Cheng, W.; Schreiber, F. The Virtual-Spine Platform: Acquiring, visualizing, and analyzing individual sitting behavior. PLoS ONE 2018, 13, e0195670.
  31. Bortone, I.; Argentiero, A.; Agnello, N.; Sabato, S.S.; Bucciero, A. A two-stage approach to bring the postural assessment to masses: The KISS-Health Project. In Proceedings of the IEEE-EMBS International Conference on Biomedical and Health Informatics (BHI), Valencia, Spain, 1–4 June 2014; pp. 371–374.
  32. Bortone, I.; Argentiero, A.; Agnello, N.; Denetto, V.; Neglia, C.; Benvenuto, M. The PoSE Project: An Innovative Approach to Promote Healthy Postures in Schoolchildren. In Proceedings of the International Conference on E-Learning, E-Education, and Online Training, Bethesda, MD, USA, 18–20 September 2014.
  33. Arif, M.; Ahmed, K. Physical activities monitoring using wearable acceleration sensors attached to the body. PLoS ONE 2015, 10, e0130851.
  34. Özdemir, A.T.; Barshan, B. Detecting falls with wearable sensors using machine learning techniques. Sensors 2014, 14, 10691–10708.
  35. Benocci, M.; Farella, E.; Benini, L. A Context-Aware Smart Seat. In Proceedings of the 2011 4th IEEE International Workshop on Advances in Sensors and Interfaces (IWASI), Savelletri di Fasano, Italy, 28–29 June 2011; pp. 104–109.
  36. Zhang, S.; McCullagh, P.; Nugent, C.; Zheng, H.; Baumgarten, M. Optimal model selection for posture recognition in home-based healthcare. Int. J. Mach. Learn. Cybern. 2010, 2, 1–14.
  37. OnePlus 6 Specs. Available online: https://www.oneplus.in/6/specs (accessed on 21 September 2021).
  38. Arif, M.; Mohsin, B.; Ahmed, K.; Ahamed, S.I. Better physical activity classification using smart phone acceleration sensor. J. Med. Syst. 2014, 38, 95.
  39. Phinyomark, A.; Hirunviriya, S.; Limsakul, C.; Phukpattaranont, P. Evaluation of EMG feature extraction for hand movement recognition based on Euclidean distance and standard deviation. In Proceedings of the ECTI-CON 2010: The 2010 ECTI International Conference on Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology, Chiang Mai, Thailand, 19–21 May 2010; pp. 856–860.
  40. Mannini, A.; Sabatini, A.M. Machine learning methods for classifying human physical activity from on-body accelerometers. Sensors 2010, 10, 1154–1175.
  41. Hall, M.; Frank, E.; Holmes, G.; Pfahringer, B.; Reutemann, P.; Witten, I.H. The WEKA data mining software: An update. ACM SIGKDD Explor. Newsl. 2009, 11, 10–18.
  42. Hall, M.A.; Smith, L.A. Feature Selection for Machine Learning: Comparing a Correlation-Based Filter Approach to the Wrapper. In Proceedings of the FLAIRS Conference, Orlando, FL, USA, 1–5 May 1999; pp. 235–239.
  43. Aha, D.W.; Kibler, D.; Albert, M.K. Instance-based learning algorithms. Mach. Learn. 1991, 6, 37–66.
  44. Eberhart, R.; Kennedy, J. Particle swarm optimization. In Proceedings of the IEEE International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; pp. 1942–1948.
  45. Sinha, V.K.; Maurya, A.K. Calibration of Inertial Sensor by Using Particle Swarm Optimization and Human Opinion Dynamics Algorithm. Int. J. Instrum. Control Syst. 2017, 47, 777–778.
  46. Reddy, S.; Mun, M.; Burke, J.; Estrin, D.; Hansen, M.; Srivastava, M. Using mobile phones to determine transportation modes. ACM Trans. Sens. Netw. 2010, 6, 1–27.
  47. Venkata Phanikrishna, B.; Pławiak, P.; Jaya Prakash, A. A Brief Review on EEG Signal Pre-processing Techniques for Real-Time Brain-Computer Interface Applications. TechRxiv, preprint, 2021. Available online: https://0-doi-org.brum.beds.ac.uk/10.36227/techrxiv.16691605.v1 (accessed on 20 September 2021).
Figure 1. A framework of the proposed system.
Figure 2. Wearable sensor location in the human body.
Figure 3. Pie chart representation for a number of instances.
Figure 4. Different classifier results with feature selection of accelerometer and gyroscope.
Table 1. Literature review on the basis of previous papers.

S. No | Authors | Type of Sensors | Classifiers | Accuracy (%) | Limitations
1 | Xu, Wenyao et al. [4] | Textile sensor array in a smart cushion chair | Naïve Bayes network | 85.90 | The recognition rate is less.
2 | Roh, Jongryun et al. [6] | Low-cost load cells (P0236-I42) | SVM using RBF kernel | 97.20 | No. of subjects used is less, and power consumption is more.
3 | Taieb-Maimon, Meirav et al. [12] | Webcam, Rapid Upper Limb Assessment (RULA) tool | Sliced inverse regression | 86.0 | Analyzed only three symptom scales as back symptoms, arm symptoms, and neck pain severity.
4 | Arif, Muhammad et al. [33] | Colibri wireless IMU | kNN | 97.90 | Dataset tested is small, and the optimal set of sensors need to be placed at the appropriate locations on the body.
5 | Zdemir et al. [34] | The MTw sensor unit, MTw software development kit | Random forest | 90.90 | Cost is high, and the convergence time is more.
6 | Rosero-Montalvo et al. [18] | Ultrasonic sensor, pressure sensor, Arduino nano, LiPo battery | kNN | 75.0 | Accuracy reported is much less.
7 | Benocci et al. [35] | FSR, digital magnetometer, accelerometer | kNN | 92.70 | The number of subjects used in the experiment is less.
8 | Shumei Zhang et al. [36] | HTC smartphone (HD8282) | kNN | 92.70 | A posture-aware reminder system can be attached.
Table 2. Number of instances per activity.

S. No | Physical Activities | No. of Instances | Time (in Seconds)
1 | A1: Left movement | 35,565 | 712
2 | A2: Right movement | 37,757 | 756
3 | A3: Front movement | 33,268 | 665
4 | A4: Back movement | 29,460 | 590
5 | A5: Straight movement | 27,451 | 549
 | Total | 163,501 | 3272
Table 3. Selected features with contribution ratings.

S. No | Selected Features | S. No | Selected Features
1 | Total-acceleration | 15 | Z-magnetometer-SD
2 | Total-magnetometer | 16 | Z-accelerometer-skewness
3 | Y-accelerometer-MAV | 17 | X-gyroscope-skewness
4 | X-gyroscope-MAV | 18 | Y-gyroscope-skewness
5 | Y-gyroscope-MAV | 19 | Z-gyroscope-skewness
6 | Y-magnetometer-MAV | 20 | Y-magnetometer-skewness
7 | X-accelerometer-HM | 21 | Y-accelerometer-LEE
8 | X-gyroscope-HM | 22 | Y-magnetometer-LEE
9 | Y-accelerometer-Var | 23 | X-gyroscope-SSI
10 | Z-accelerometer-Var | 24 | X-accelerometer-WE
11 | Z-magnetometer-Var | 25 | X-gyroscope-WE
12 | X-gyroscope-SD | 26 | X-magnetometer-WE
13 | Y-gyroscope-SD | 27 | Y-magnetometer-WE
14 | X-magnetometer-SD | |
Table 4. Classifier results with feature selection of accelerometer, gyroscope, and magnetometer.

S. No | Activities | KNN (K = 3) | KNN (K = 5) | KNN (K = 7) | KNN (K = 11) | SVM | Naive Bayes
1 | A1: Left movement | 99.20 | 99.91 | 99.91 | 99.92 | 99.99 | 98.51
2 | A2: Right movement | 99.97 | 99.97 | 99.97 | 99.95 | 99.98 | 99.06
3 | A3: Front movement | 99.96 | 99.96 | 99.96 | 99.94 | 99.98 | 99.15
4 | A4: Back movement | 99.67 | 99.72 | 99.70 | 99.68 | 99.76 | 91.89
5 | A5: Straight movement | 99.89 | 99.58 | 98.31 | 97.68 | 99.77 | 98.71
Table 5. Confusion matrix of SVM classifier of selected features of accelerometer, gyroscope, and magnetometer. Rows: true class; columns: predicted class.

S. No | Activity | A1 | A2 | A3 | A4 | A5
1 | A1 | 10,669 | 0 | 0 | 0 | 1
2 | A2 | 1 | 11,476 | 0 | 0 | 0
3 | A3 | 1 | 0 | 10,055 | 1 | 0
4 | A4 | 0 | 0 | 0 | 8762 | 21
5 | A5 | 0 | 0 | 2 | 16 | 8044
Table 6. Classifiers result with feature selection of accelerometer and gyroscope.

S. No | Activities | KNN (K = 3) | KNN (K = 5) | KNN (K = 7) | KNN (K = 11) | SVM | Naive Bayes
1 | A1: Left Movement | 98.24 | 97.88 | 97.56 | 96.95 | 97.10 | 84.55
2 | A2: Right Movement | 99.78 | 99.76 | 99.73 | 99.69 | 99.88 | 99.37
3 | A3: Front Movement | 99.55 | 99.53 | 99.52 | 99.51 | 99.66 | 92.43
4 | A4: Back Movement | 98.95 | 99.08 | 99.17 | 99.26 | 98.78 | 85.66
5 | A5: Straight Movement | 95.87 | 95.12 | 94.57 | 93.51 | 98.31 | 95.31
Table 7. Confusion matrix of SVM classifier of selected features of accelerometer and gyroscope.
S. NoActivityA1A2A3A4A5
True Class1A110,34171088185
2A2311,460325
3A3130210,03812
4A41108746106
5A56941597868
Predicted Class
Table 8. Classifier results with feature selection of accelerometer.

S. No | Activities | KNN (K = 3) | KNN (K = 5) | KNN (K = 7) | KNN (K = 11) | SVM | Naive Bayes
1 | A1: Left movement | 99.63 | 99.57 | 99.50 | 99.46 | 98.94 | 82.18
2 | A2: Right movement | 99.95 | 99.93 | 99.92 | 99.91 | 99.89 | 99.80
3 | A3: Front movement | 99.90 | 99.88 | 99.87 | 99.87 | 99.84 | 90.62
4 | A4: Back movement | 99.68 | 99.72 | 99.70 | 99.75 | 99.60 | 91.06
5 | A5: Straight movement | 99.31 | 99.20 | 99.15 | 98.88 | 99.30 | 96.28
Table 9. Confusion matrix of KNN (K = 3) classifier of selected features of accelerometer.
S. NoActivityA1A2A3A4A5
True Class1A110,634618114
2A2311,467101
3A39010,04901
4A41008769127
5A51510397994
Predicted Class
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
