Studi Literatur Human Activity Recognition (HAR) Menggunakan Sensor Inersia

Literature Study of Human Activity Recognition (HAR) Using Inertial Sensors

  • Humaira Nur Pradani, Institut Teknologi Sepuluh Nopember
  • Faizal Mahananto
Keywords: Human Activity Recognition, inertial sensor, accelerometer, gyroscope


Human activity recognition (HAR) is a widely researched topic owing to its diverse applications in fields such as health, construction, and UI/UX. As MEMS (Micro-Electro-Mechanical Systems) technology evolves, HAR data acquisition can be performed more easily and efficiently using inertial sensors. Processing inertial sensor data for HAR requires a series of steps and a variety of techniques. This literature study aims to summarize the approaches that existing research has used to build HAR models. Published articles were collected from ScienceDirect, IEEE Xplore, and MDPI over the past five years (2017–2021). From the 38 studies identified, the extracted information covers the areas of HAR implementation, data acquisition, public datasets, pre-processing methods, feature extraction approaches, feature selection methods, classification models, training scenarios, model performance, and open research challenges. The analysis shows that there is still room to improve the performance of HAR models. Therefore, future research on HAR using inertial sensors can focus on extracting and selecting more optimal features, considering model robustness, increasing the complexity of the classified activities, and balancing accuracy against computation time.
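The processing chain the abstract refers to (segmentation, then feature extraction, before feature selection and classification) can be illustrated with a minimal sketch. This is not code from any of the reviewed studies; the 128-sample window, 50% overlap, and the particular time-domain feature set are illustrative assumptions, chosen because they recur across the surveyed literature.

```python
import numpy as np

def sliding_windows(signal, window_size, overlap=0.5):
    """Segment an (n_samples, n_axes) signal into fixed-length windows.

    Overlapping sliding windows are the segmentation scheme used by
    most inertial-sensor HAR studies; overlap=0.5 is a common choice.
    """
    step = int(window_size * (1 - overlap))
    return np.stack([signal[i:i + window_size]
                     for i in range(0, len(signal) - window_size + 1, step)])

def time_domain_features(window):
    """A small set of hand-crafted time-domain features, computed per axis."""
    return np.concatenate([window.mean(axis=0),
                           window.std(axis=0),
                           window.min(axis=0),
                           window.max(axis=0)])

# Synthetic tri-axial accelerometer stream standing in for real sensor data.
rng = np.random.default_rng(0)
acc = rng.normal(size=(500, 3))

windows = sliding_windows(acc, window_size=128, overlap=0.5)
X = np.array([time_domain_features(w) for w in windows])  # one feature row per window
print(windows.shape, X.shape)  # (6, 128, 3) (6, 12)
```

The feature matrix `X` is what feature-selection methods and classifiers in the surveyed pipelines then operate on.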





[1] S. Memar, M. Delrobaei, G. Gilmore, K. McIsaac, and M. Jog, “Segmentation and detection of physical activities during a sitting task in Parkinson’s disease participants using multiple inertial sensors,” J. Appl. Biomed., vol. 15, no. 4, pp. 282–290, 2017, doi: 10.1016/j.jab.2017.05.002.
[2] L. Sanhudo et al., “Activity classification using accelerometers and machine learning for complex construction worker activities,” J. Build. Eng., vol. 35, no. October 2020, 2021, doi: 10.1016/j.jobe.2020.102001.
[3] M. Ehatisham-Ul-Haq, M. A. Azam, Y. Amin, and U. Naeem, “C2FHAR: Coarse-to-Fine Human Activity Recognition with Behavioral Context Modeling Using Smart Inertial Sensors,” IEEE Access, vol. 8, pp. 7731–7747, 2020, doi: 10.1109/ACCESS.2020.2964237.
[4] S. Zhang, Z. Wei, J. Nie, L. Huang, S. Wang, and Z. Li, “A Review on Human Activity Recognition Using Vision-Based Method,” J. Healthc. Eng., vol. 2017, 2017, doi: 10.1155/2017/3090343.
[5] F. Demrozi, G. Pravadelli, A. Bihorac, and P. Rashidi, “Human Activity Recognition Using Inertial, Physiological and Environmental Sensors: A Comprehensive Survey,” IEEE Access, vol. 8, pp. 210816–210836, 2020, doi: 10.1109/ACCESS.2020.3037715.
[6] R. San-Segundo, H. Blunck, J. Moreno-Pimentel, A. Stisen, and M. Gil-Martín, “Robust Human Activity Recognition using smartwatches and smartphones,” Eng. Appl. Artif. Intell., vol. 72, no. March, pp. 190–202, 2018, doi: 10.1016/j.engappai.2018.04.002.
[7] F. Tchuente, N. Baddour, and E. D. Lemaire, “Classification of aggressive movements using smartwatches,” Sensors (Switzerland), vol. 20, no. 21, pp. 1–12, 2020, doi: 10.3390/s20216377.
[8] C. Fan and F. Gao, “Enhanced human activity recognition using wearable sensors via a hybrid feature selection method,” Sensors, vol. 21, no. 19, 2021, doi: 10.3390/s21196434.
[9] O. S. Eyobu and D. S. Han, “Feature representation and data augmentation for human activity classification based on wearable IMU sensor data using a deep LSTM neural network,” Sensors (Switzerland), vol. 18, no. 9, pp. 1–26, 2018, doi: 10.3390/s18092892.
[10] I. Arun Faisal, T. Waluyo Purboyo, and A. Siswo Raharjo Ansori, “A Review of Accelerometer Sensor and Gyroscope Sensor in IMU Sensors on Motion Capture,” J. Eng. Appl. Sci., vol. 15, no. 3, pp. 826–829, 2019, doi: 10.36478/jeasci.2020.826.829.
[11] F. Amjad, M. H. Khan, M. A. Nisar, M. S. Farid, and M. Grzegorzek, “A Comparative Study of Feature Selection Approaches for Human Activity Recognition Using Multimodal Sensory Data,” Sensors, pp. 1–21, 2021.
[12] J. Sena, J. Barreto, C. Caetano, G. Cramer, and W. R. Schwartz, “Human activity recognition based on smartphone and wearable sensors using multiscale DCNN ensemble,” Neurocomputing, vol. 444, pp. 226–243, 2021, doi: 10.1016/j.neucom.2020.04.151.
[13] Y. Chen and C. Shen, “Performance Analysis of Smartphone-Sensor Behavior for Human Activity Recognition,” IEEE Access, vol. 5, pp. 3095–3110, 2017, doi: 10.1109/ACCESS.2017.2676168.
[14] A. Ferrari, D. Micucci, M. Mobilio, and P. Napoletano, “Trends in human activity recognition using smartphones,” J. Reliab. Intell. Environ., vol. 7, no. 3, pp. 189–213, 2021, doi: 10.1007/s40860-021-00147-0.
[15] M. J. Page et al., “The PRISMA 2020 statement: an updated guideline for reporting systematic reviews,” BMJ, vol. 372, Mar. 2021, doi: 10.1136/BMJ.N71.
[16] S. Zhao, W. Li, and J. Cao, “A user-adaptive algorithm for activity recognition based on K-means clustering, local outlier factor, and multivariate gaussian distribution,” Sensors (Switzerland), vol. 18, no. 6, 2018, doi: 10.3390/s18061850.
[17] M. Gjoreski et al., “Classical and deep learning methods for recognizing human activities and modes of transportation with smartphone sensors,” Inf. Fusion, vol. 62, no. April 2019, pp. 47–62, 2020, doi: 10.1016/j.inffus.2020.04.004.
[18] S. Zhuo, L. Sherlock, G. Dobbie, Y. S. Koh, G. Russello, and D. Lottridge, “REAL-time smartphone activity classification using inertial sensors—recognition of scrolling, typing, and watching videos while sitting or walking,” Sensors (Switzerland), vol. 20, no. 3, pp. 1–18, 2020, doi: 10.3390/s20030655.
[19] S. Fan, Y. Jia, and C. Jia, “A feature selection and classification method for activity recognition based on an inertial sensing unit,” Inf., vol. 10, no. 10, 2019, doi: 10.3390/info10100290.
[20] J.-L. Reyes-Ortiz, L. Oneto, A. Samá, A. Ghio, X. Parra, and D. Anguita, “Transition-Aware Human Activity Recognition Using Smartphones,” Neurocomputing, vol. 171, pp. 754–767, 2016.
[21] M. Malekzadeh, R. G. Clegg, A. Cavallaro, and H. Haddadi, “Mobile Sensor Data Anonymization,” vol. 10, no. 19, 2019, doi: 10.1145/3302505.3310068.
[22] O. Banos et al., “mHealthDroid: A Novel Framework for Agile Development of Mobile Health Applications.”
[23] M. A. Nisar, K. Shirahama, F. Li, X. Huang, and M. Grzegorzek, “Rank Pooling Approach for Wearable Sensor-Based ADLs Recognition,” Sensors 2020, Vol. 20, Page 3463, vol. 20, no. 12, p. 3463, Jun. 2020, doi: 10.3390/S20123463.
[24] G. M. Weiss, K. Yoneda, and T. Hayajneh, “Smartphone and Smartwatch-Based Biometrics Using Activities of Daily Living,” IEEE Access, vol. 7, pp. 133190–133202, 2019, doi: 10.1109/ACCESS.2019.2940729.
[25] M. Shoaib, S. Bosch, O. D. Incel, H. Scholten, and P. J. M. Havinga, “Complex Human Activity Recognition Using Smartphone and Wrist-Worn Motion Sensors,” Sensors 2016, Vol. 16, Page 426, vol. 16, no. 4, p. 426, Mar. 2016, doi: 10.3390/S16040426.
[26] C. Fan and F. Gao, “A New Approach for Smoking Event Detection Using a Variational Autoencoder and Neural Decision Forest,” IEEE Access, vol. 8, pp. 120835–120849, 2020, doi: 10.1109/ACCESS.2020.3006163.
[27] S. S. Saha, S. Rahman, M. J. Rasna, A. K. M. Mahfuzul Islam, and M. A. Rahman Ahad, “DU-MD: An open-source human action dataset for ubiquitous wearable sensors,” 2018 Jt. 7th Int. Conf. Informatics, Electron. Vis. 2nd Int. Conf. Imaging, Vis. Pattern Recognition, ICIEV-IVPR 2018, pp. 567–572, 2019, doi: 10.1109/ICIEV.2018.8641051.
[28] A. Reiss and D. Stricker, “Creating and benchmarking a new dataset for physical activity monitoring,” ACM Int. Conf. Proceeding Ser., 2012, doi: 10.1145/2413097.2413148.
[29] M. Zhang and A. A. Sawchuk, “USC-HAD: A Daily Activity Dataset for Ubiquitous Activity Recognition Using Wearable Sensors,” Proc. 2012 ACM Conf. Ubiquitous Comput. - UbiComp ’12, 2012, doi: 10.1145/2370216.
[30] B. Bruno, F. Mastrogiovanni, and A. Sgorbissa, “A public domain dataset for ADL recognition using wrist-placed accelerometers,” IEEE RO-MAN 2014 - 23rd IEEE Int. Symp. Robot Hum. Interact. Commun. Human-Robot Co-Existence Adapt. Interfaces Syst. Dly. Life, Ther. Assist. Soc. Engag. Interact., pp. 738–743, Oct. 2014, doi: 10.1109/ROMAN.2014.6926341.
[31] H. Gjoreski et al., “The University of Sussex-Huawei Locomotion and Transportation Dataset for Multimodal Analytics with Mobile Devices,” IEEE Access, vol. 6, pp. 42592–42604, 2018, doi: 10.1109/ACCESS.2018.2858933.
[32] D. Wang, J. Wan, J. Chen, and Q. Zhang, “An Online Dictionary Learning-Based Compressive Data Gathering Algorithm in Wireless Sensor Networks,” Sensors 2016, Vol. 16, Page 1547, vol. 16, no. 10, p. 1547, Sep. 2016, doi: 10.3390/S16101547.
[33] N. Twomey et al., “The SPHERE Challenge: Activity Recognition with Multimodal Sensor Data,” Mar. 2016.
[34] D. Micucci, M. Mobilio, and P. Napoletano, “UniMiB SHAR: A Dataset for Human Activity Recognition Using Acceleration Data from Smartphones,” Appl. Sci. 2017, Vol. 7, Page 1101, vol. 7, no. 10, p. 1101, Oct. 2017, doi: 10.3390/APP7101101.
[35] G. Vavoulas, C. Chatzaki, T. Malliotakis, M. Pediaditis, and M. Tsiknakis, “The MobiAct Dataset: Recognition of Activities of Daily Living using Smartphones.”
[36] Y. Vaizman, K. Ellis, G. Lanckriet, and N. Weibel, “Extrasensory app: Data collection in-the-wild with rich user interface to self-report behavior,” Conf. Hum. Factors Comput. Syst. - Proc., vol. 2018-April, Apr. 2018, doi: 10.1145/3173574.3174128.
[37] E. Casilari, J. A. Santoyo-Ramón, and J. M. Cano-García, “UMAFall: A Multisensor Dataset for the Research on Automatic Fall Detection,” Procedia Comput. Sci., vol. 110, pp. 32–39, 2017, doi: 10.1016/J.PROCS.2017.06.110.
[38] S. B. ud din Tahir, A. Jalal, and K. Kim, “Wearable Inertial Sensors for Daily Activity Analysis Based on Adam Optimization and the Maximum Entropy Markov Model,” Entropy 2020, Vol. 22, Page 579, vol. 22, no. 5, p. 579, May 2020, doi: 10.3390/E22050579.
[39] L. Meng et al., “Exploration of human activity recognition using a single sensor for stroke survivors and able-bodied people,” Sensors (Switzerland), vol. 21, no. 3, pp. 1–18, 2021, doi: 10.3390/s21030799.
[40] I. M. Pires, F. Hussain, N. M. Garcia, P. Lameski, and E. Zdravevski, “Homogeneous data normalization and deep learning: A case study in human activity classification,” Futur. Internet, vol. 12, no. 11, pp. 1–14, 2020, doi: 10.3390/fi12110194.
[41] A. Jalal, M. Batool, and K. Kim, “Stochastic recognition of physical activity and healthcare using tri-axial inertial wearable sensors,” Appl. Sci., vol. 10, no. 20, pp. 1–20, 2020, doi: 10.3390/app10207122.
[42] M. M. Hassan, M. Z. Uddin, A. Mohamed, and A. Almogren, “A robust human activity recognition system using smartphone sensors and deep learning,” Futur. Gener. Comput. Syst., vol. 81, pp. 307–313, 2018, doi: 10.1016/j.future.2017.11.029.
[43] J. Saha, C. Chowdhury, I. R. Chowdhury, S. Biswas, and N. Aslam, “An ensemble of condition based classifiers for device independent detailed human activity recognition using smartphones,” Inf., vol. 9, no. 4, 2018, doi: 10.3390/info9040094.
[44] A. Ferrari, D. Micucci, M. Mobilio, and P. Napoletano, “On the Personalization of Classification Models for Human Activity Recognition,” IEEE Access, vol. 8, pp. 32066–32079, 2020, doi: 10.1109/ACCESS.2020.2973425.
[45] A. Dehghani, O. Sarbishei, T. Glatard, and E. Shihab, “A Quantitative Comparison of Overlapping and Non-Overlapping Sliding Windows for Human Activity Recognition Using Inertial Sensors,” Sensors (Basel)., vol. 19, no. 22, Nov. 2019, doi: 10.3390/S19225026.
[46] N. Ahmed, J. I. Rafiq, and M. R. Islam, “Enhanced human activity recognition based on smartphone sensor data using hybrid feature selection model,” Sensors (Switzerland), vol. 20, no. 1, 2020, doi: 10.3390/s20010317.
[47] K. N. K. A. Rahim, I. Elamvazuthi, L. I. Izhar, and G. Capi, “Classification of human daily activities using ensemble methods based on smartphone inertial sensors,” Sensors (Switzerland), vol. 18, no. 12, 2018, doi: 10.3390/s18124132.
[48] R. Zhu et al., “Efficient Human Activity Recognition Solving the Confusing Activities Via Deep Ensemble Learning,” IEEE Access, vol. 7, pp. 75490–75499, 2019, doi: 10.1109/ACCESS.2019.2922104.
[49] W. Qi, H. Su, and A. Aliverti, “A Smartphone-Based Adaptive Recognition and Real-Time Monitoring System for Human Activities,” IEEE Trans. Human-Machine Syst., vol. 50, no. 5, pp. 414–423, 2020, doi: 10.1109/THMS.2020.2984181.
[50] M. B. Dehkordi, A. Zaraki, and R. Setchi, “Feature extraction and feature selection in smartphone-based activity recognition,” Procedia Comput. Sci., vol. 176, pp. 2655–2664, 2020, doi: 10.1016/j.procs.2020.09.301.
[51] R. A. Bhuiyan, N. Ahmed, M. Amiruzzaman, and M. R. Islam, “A robust feature extraction model for human activity characterization using 3-axis accelerometer and gyroscope data,” Sensors (Switzerland), vol. 20, no. 23, pp. 1–17, 2020, doi: 10.3390/s20236990.
[52] A. Elsts, N. Twomey, R. McConville, and I. Craddock, “Energy-efficient activity recognition framework using wearable accelerometers,” J. Netw. Comput. Appl., vol. 168, no. October 2019, p. 102770, 2020, doi: 10.1016/j.jnca.2020.102770.
[53] Z. Chen, L. Zhang, Z. Cao, and J. Guo, “Distilling the Knowledge from Handcrafted Features for Human Activity Recognition,” IEEE Trans. Ind. Informatics, vol. 14, no. 10, pp. 4334–4342, 2018, doi: 10.1109/TII.2018.2789925.
[54] E. Keogh and A. Mueen, “Curse of Dimensionality,” Encycl. Mach. Learn. Data Min., pp. 314–315, 2017, doi: 10.1007/978-1-4899-7687-1_192.
[55] G. Chandrashekar and F. Sahin, “A survey on feature selection methods,” 2014, doi: 10.1016/j.compeleceng.2013.11.024.
[56] L. Chen, S. Fan, V. Kumar, and Y. Jia, “A method of human activity recognition in transitional period,” Inf., vol. 11, no. 9, pp. 1–17, 2020, doi: 10.3390/INFO11090416.
[57] I. T. Jolliffe, Principal Component Analysis, 2nd ed. New York, NY, USA: Springer, 2002.
[58] Z. Chen, C. Jiang, and L. Xie, “A Novel Ensemble ELM for Human Activity Recognition Using Smartphone Sensors,” IEEE Trans. Ind. Informatics, vol. 15, no. 5, pp. 2691–2699, 2019, doi: 10.1109/TII.2018.2869843.
[59] Z. Chen, Q. Zhu, Y. C. Soh, and L. Zhang, “Robust Human Activity Recognition Using Smartphone Sensors via CT-PCA and Online SVM,” IEEE Trans. Ind. Informatics, vol. 13, no. 6, pp. 3070–3080, 2017, doi: 10.1109/TII.2017.2712746.
[60] A. Jain and V. Kanhangad, “Human Activity Classification in Smartphones Using Accelerometer and Gyroscope Sensors,” IEEE Sens. J., vol. 18, no. 3, pp. 1169–1177, 2018, doi: 10.1109/JSEN.2017.2782492.
[61] C. Ma, W. Li, J. Cao, J. Du, Q. Li, and R. Gravina, “Adaptive sliding window based activity recognition for assisted livings,” Inf. Fusion, vol. 53, no. December 2018, pp. 55–65, 2020, doi: 10.1016/j.inffus.2019.06.013.
[62] X. Zhou, “Wearable health monitoring system based on human motion state recognition,” Comput. Commun., vol. 150, no. October 2019, pp. 62–71, 2020, doi: 10.1016/j.comcom.2019.11.008.
[63] Z. Chen, S. Xiang, J. Ding, and X. Li, “Smartphone sensor-based human activity recognition using feature fusion and maximum full a posteriori,” IEEE Trans. Instrum. Meas., vol. 69, no. 7, pp. 3992–4001, 2020, doi: 10.1109/TIM.2019.2945467.
[64] R. Jansi and R. Amutha, “Hierarchical evolutionary classification framework for human action recognition using sparse dictionary optimization,” Swarm Evol. Comput., vol. 63, no. October 2020, 2021, doi: 10.1016/j.swevo.2021.100873.
[65] M. Webber and R. F. Rojas, “Human Activity Recognition with Accelerometer and Gyroscope: A Data Fusion Approach,” IEEE Sens. J., vol. 21, no. 15, pp. 16979–16989, 2021, doi: 10.1109/JSEN.2021.3079883.
[66] S. Liaqat et al., “Novel Ensemble Algorithm for Multiple Activity Recognition in Elderly People Exploiting Ubiquitous Sensing Devices,” IEEE Sens. J., vol. 21, no. 16, pp. 18214–18221, 2021, doi: 10.1109/JSEN.2021.3085362.
[67] G. Marcus et al., “Deep Learning: A Critical Appraisal,” Jan. 2018.
How to Cite
Pradani, H. N., & Mahananto, F. (2021). Studi Literatur Human Activity Recognition (HAR) Menggunakan Sensor Inersia. Jurnal RESTI (Rekayasa Sistem Dan Teknologi Informasi), 5(6), 1193–1206.