Volume 5, Issue 2, PP: 45-62, 2023 | Full Length Article
Basma K. Eldrandaly 1 *
Doi: https://doi.org/10.54216/JCHCI.050205
Human activity recognition (HAR) from smartphone sensors has gained significant attention due to its potential to enhance user experience (UX) and human-computer interaction (HCI) across many domains. HAR can enable personalized, context-aware, and adaptive interfaces that improve accessibility and promote health and wellness in applications such as healthcare, smart homes, fitness tracking, and context-aware systems. However, evaluating the performance of different machine learning (ML) algorithms on activity recognition tasks remains challenging, primarily because of the lack of standardized benchmark datasets and evaluation protocols. In this paper, we present ActivBench, an end-to-end computational intelligence benchmark designed to facilitate the evaluation and comparison of ML algorithms for human activity inference from smartphone sensors. ActivBench addresses the challenges of benchmarking activity recognition systems by providing a unified evaluation protocol and standardized performance metrics. Through extensive experiments with several state-of-the-art algorithms, we demonstrate the effectiveness of ActivBench in assessing the strengths and limitations of different approaches, yielding insights that support the development of robust and accurate activity recognition systems capable of enhancing human-computer interaction. ActivBench thus serves as a valuable resource for researchers and practitioners in HAR and HCI, enabling fair comparisons and fostering the exploration of novel algorithms, feature engineering techniques, and sensor modalities.
ActivBench, Human Computer Interaction, Computational Intelligence, Benchmark, Human Activity Inference, Smartphone Sensors, Applied Intelligence
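To make the idea of a unified evaluation protocol concrete, the snippet below is a minimal Python sketch, assuming windowed sensor features have already been extracted; the classifier choices, the single stratified split, and the metric set (accuracy and macro-F1) are illustrative assumptions, not ActivBench's actual implementation.

# Minimal sketch of a unified evaluation protocol: every candidate model is
# scored on the same split with the same metrics (illustrative, not the
# paper's code).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score, f1_score

def evaluate_candidates(X, y, seed=42):
    """Apply one fixed split and one fixed metric set to every candidate model."""
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=seed)
    candidates = {
        "svm": SVC(kernel="rbf", C=1.0),
        "random_forest": RandomForestClassifier(n_estimators=200, random_state=seed),
        "knn": KNeighborsClassifier(n_neighbors=5),
    }
    results = {}
    for name, model in candidates.items():
        model.fit(X_tr, y_tr)
        y_pred = model.predict(X_te)
        results[name] = {
            "accuracy": accuracy_score(y_te, y_pred),
            "macro_f1": f1_score(y_te, y_pred, average="macro"),
        }
    return results

if __name__ == "__main__":
    # Synthetic stand-in for windowed accelerometer/gyroscope features.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(600, 64))      # 600 windows, 64 hand-crafted features
    y = rng.integers(0, 6, size=600)    # 6 activity classes (walk, sit, ...)
    for name, scores in evaluate_candidates(X, y).items():
        print(name, scores)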