Full Length Article
Fusion: Practice and Applications
Volume 13, Issue 2, pp. 62-70, 2023


Multi-Sensor Data Fusion for Accurate Human Activity Recognition with Deep Learning

Edmundo Jalon Arias 1*, Luz M. Aguirre Paz 2, Luis Molina Chalacan 3

1  Universidad Regional Autonoma de los Andes (UNIANDES), Ecuador
    (uq.sistemas@uniandes.edu.ec)

2  Universidad Regional Autonoma de los Andes (UNIANDES), Ecuador
    (direccionadmision@uniandes.edu.ec)

3  Universidad Regional Autonoma de los Andes (UNIANDES), Ecuador
    (uq.luismolina@uniandes.edu.ec)


DOI: https://doi.org/10.54216/FPA.130206

Received: April 17, 2023; Revised: July 15, 2023; Accepted: September 16, 2023

Abstract:

In the era of pervasive computing and wearable technology, the accurate recognition of human activities has gained paramount importance across a spectrum of applications, from healthcare monitoring to smart environments. This paper introduces a novel methodology that leverages the fusion of multi-sensor data with deep learning techniques to enhance the precision and robustness of human activity recognition. Our approach commences with the transformation of accelerometer and gyroscope time-series data into recurrence plots, facilitating the distillation of temporal patterns and dependencies. Subsequently, a dual-path convolutional network framework is employed to extract intricate sensory patterns independently, followed by an attention module that fuses these features, capturing their nuanced interactions. Rigorous experimental evaluations, including comparative analyses against traditional machine learning baselines, validate the superior performance of our methodology. The results demonstrate remarkable classification performance, underscoring the efficacy of our approach in recognizing a diverse range of human activities. Our research not only advances the state-of-the-art in activity recognition but also highlights the potential of deep learning and multi-sensor data fusion in enabling context-aware systems for the benefit of society.
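The paper itself does not include code. As a rough illustration of the pipeline the abstract describes, the sketch below builds a binary recurrence plot from a 1-D sensor channel and then fuses two feature vectors with softmax weights. The threshold `eps`, the scalar attention gate, and the mean-pooling stand-in for the two convolutional paths are all assumptions for illustration, not the paper's actual architecture or parameters:

```python
import numpy as np

def recurrence_plot(signal, eps=0.2):
    """Binary recurrence plot: R[i, j] = 1 when |x_i - x_j| < eps."""
    x = np.asarray(signal, dtype=float)
    dist = np.abs(x[:, None] - x[None, :])  # pairwise distance matrix
    return (dist < eps).astype(np.uint8)

def attention_fuse(feat_a, feat_b):
    """Softmax-weighted fusion of two feature vectors (one scalar gate per path)."""
    scores = np.array([feat_a.mean(), feat_b.mean()])  # toy scoring function
    w = np.exp(scores) / np.exp(scores).sum()          # softmax attention weights
    return w[0] * feat_a + w[1] * feat_b

# Toy single-axis accelerometer and gyroscope traces
t = np.linspace(0, 4 * np.pi, 128)
rp_acc = recurrence_plot(np.sin(t))
rp_gyr = recurrence_plot(np.cos(t))

# Stand-in for the two CNN paths: summarize each plot as a feature vector
feat_acc = rp_acc.mean(axis=0)
feat_gyr = rp_gyr.mean(axis=0)
fused = attention_fuse(feat_acc, feat_gyr)

print(rp_acc.shape, fused.shape)  # (128, 128) (128,)
```

In the actual method, each recurrence plot would feed a separate convolutional path, and the attention module would learn the fusion weights rather than compute them from fixed scores.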

Keywords:

Multi-Sensor Data Fusion; Deep Learning; Human Activity Recognition; Sensor Fusion Techniques; Data Fusion Strategies; Cross-Modal Fusion

References:

[1]     Qiu, S., Zhao, H., Jiang, N., Wang, Z., Liu, L., An, Y., ... & Fortino, G. (2022). Multi-sensor information fusion based on machine learning for real applications in human activity recognition: State-of-the-art and research challenges. Information Fusion, 80, 241-265.

[2]     Vidya, B., & Sasikumar, P. (2022). Wearable multi-sensor data fusion approach for human activity recognition using machine learning algorithms. Sensors and Actuators A: Physical, 341, 113557.

[3]     Zhou, H., Zhao, Y., Liu, Y., Lu, S., An, X., & Liu, Q. (2023). Multi-Sensor Data Fusion and CNN-LSTM Model for Human Activity Recognition System. Sensors, 23(10), 4750.

[4]     Zebin, T., Scully, P. J., & Ozanyan, K. B. (2017). Inertial sensor-based modelling of human activity classes: Feature extraction and multi-sensor data fusion using machine learning algorithms. In eHealth 360: International Summit on eHealth, Budapest, Hungary, June 14-16, 2016, Revised Selected Papers (pp. 306-314). Springer International Publishing.

[5]     Adjeisah, M., Liu, G., Nyabuga, D. O., & Nortey, R. N. (2019, December). Multi-Sensor Information Fusion and Machine Learning for High Accuracy Rate of Mechanical Pedometer in Human Activity Recognition. In 2019 IEEE Intl Conf on Parallel & Distributed Processing with Applications, Big Data & Cloud Computing, Sustainable Computing & Communications, Social Computing & Networking (ISPA/BDCloud/SocialCom/SustainCom) (pp. 1064-1070). IEEE.

[6]     Malekzadeh, M., Clegg, R. G., Cavallaro, A., & Haddadi, H. (2019, April). Mobile sensor data anonymization. In Proceedings of the international conference on internet of things design and implementation (pp. 49-58).

[7]     Webber, M., & Rojas, R. F. (2021). Human activity recognition with accelerometer and gyroscope: A data fusion approach. IEEE Sensors Journal, 21(15), 16979-16989.

[8]     Wichit, N., & Choksuriwong, A. (2015). Multi-sensor data fusion model based Kalman filter using fuzzy logic for human activity detection. International Journal of Information and Electronics Engineering, 5(6), 450.

[9]     Zhou, Y., Yang, Z., Zhang, X., & Wang, Y. (2022). A hybrid attention-based deep neural network for simultaneous multi-sensor pruning and human activity recognition. IEEE Internet of Things Journal, 9(24), 25363-25372.

[10]   Ali, A. M., & Abdelhafeez, A. (2022, November). DeepHAR-Net: A novel machine intelligence approach for human activity recognition from inertial sensors. SMIJ, 1.

[11]   Nweke, H. F., Teh, Y. W., Alo, U. R., & Mujtaba, G. (2018, May). Analysis of multi-sensor fusion for mobile and wearable sensor based human activity recognition. In Proceedings of the international conference on data processing and applications (pp. 22-26).

[12]   Nweke, H. F., Teh, Y. W., Mujtaba, G., Alo, U. R., & Al-garadi, M. A. (2019). Multi-sensor fusion based on multiple classifier systems for human activity identification. Human-centric Computing and Information Sciences, 9, 1-44.

[13]   Nafea, O., Abdul, W., & Muhammad, G. (2022). Multi-sensor human activity recognition using CNN and GRU. International Journal of Multimedia Information Retrieval, 11(2), 135-147.

[14]   Yu, Z., Zahid, A., Taha, A., Taylor, W., Le Kernec, J., Heidari, H., ... & Abbasi, Q. H. (2022). An Intelligent Implementation of Multi-Sensing Data Fusion with Neuromorphic Computing for Human Activity Recognition. IEEE Internet of Things Journal, 10(2), 1124-1133.

[15]   San Buenaventura, C. V., Tiglao, N. M. C., & Atienza, R. O. (2019). Deep learning for smartphone-based human activity recognition using multi-sensor fusion. In Wireless Internet: 11th EAI International Conference, WiCON 2018, Taipei, Taiwan, October 15-16, 2018, Proceedings 11 (pp. 65-75). Springer International Publishing.

[16]   Gravina, R., & Li, Q. (2019). Emotion-relevant activity recognition based on smart cushion using multi-sensor fusion. Information Fusion, 48, 1-10.

[17]   Aguileta, A. A., Brena, R. F., Mayora, O., Molino-Minero-Re, E., & Trejo, L. A. (2019). Multi-sensor fusion for activity recognition—A survey. Sensors, 19(17), 3808.

[18]   Patil, B. U., Ashoka, D. V., & Prakash, B. V. A. (2023). Data Integration Based Human Activity Recognition using Deep Learning Models. Karbala International Journal of Modern Science, 9(1), 11.

[19]   Cao, J., Li, W., Ma, C., & Tao, Z. (2018). Optimizing multi-sensor deployment via ensemble pruning for wearable activity recognition. Information Fusion, 41, 68-79.

[20]   Miao, S., Chen, L., Hu, R., & Luo, Y. (2022). Towards a dynamic inter-sensor correlation learning framework for multi-sensor-based wearable human activity recognition. Proceedings of the ACM on interactive, mobile, wearable and ubiquitous technologies, 6(3), 1-25.

[21]   Rodríguez, J. V., Martínez, J. R., & Salazar, F. F. J. (2023). Colorectal Cancer Prediction Using Machine Learning and Neutrosophic MCDM Methodology: A Case Study. International Journal of Neutrosophic Science, 21(2), 118-18.

