<?xml version="1.0"?>
<journal>
 <journal_metadata>
  <full_title>Fusion: Practice and Applications</full_title>
  <abbrev_title>FPA</abbrev_title>
  <issn media_type="print">2692-4048</issn>
  <issn media_type="electronic">2770-0070</issn>
  <doi_data>
   <doi>10.54216/FPA</doi>
   <resource>https://www.americaspg.com/journals/show/2697</resource>
  </doi_data>
 </journal_metadata>
 <journal_issue>
  <publication_date media_type="print">
   <year>2018</year>
  </publication_date>
  <publication_date media_type="online">
   <year>2018</year>
  </publication_date>
 </journal_issue>
 <journal_article publication_type="full_text">
  <titles>
   <title>Hybrid CNN-XGB Framework for Enhancing Human Activity Recognition</title>
  </titles>
  <contributors>
  <organization sequence="first" contributor_role="author">Department of Computer Engineering, College of Engineering, University of Diyala, Iraq</organization>
   <person_name sequence="first" contributor_role="author">
    <given_name>Farah</given_name>
    <surname>Farah</surname>
   </person_name>
  <organization sequence="first" contributor_role="author">Department of Computer Engineering, College of Engineering, University of Diyala, Iraq</organization>
   <person_name sequence="additional" contributor_role="author">
    <given_name>Raniah</given_name>
    <surname>Hazim</surname>
   </person_name>
  <organization sequence="first" contributor_role="author">Department of Computer Engineering, College of Engineering, University of Diyala, Iraq</organization>
   <person_name sequence="additional" contributor_role="author">
    <given_name>Sarah A.</given_name>
    <surname>Hassan</surname>
   </person_name>
   <organization sequence="first" contributor_role="author">Prosthetic Dental Techniques Department, College of Health and Medical Techniques, Ashur University, Baghdad, Iraq</organization>
   <person_name sequence="additional" contributor_role="author">
    <given_name>Qusay</given_name>
    <surname>Saihood</surname>
   </person_name>
  </contributors>
  <jats:abstract xmlns:jats="http://www.jats.nlm.nih.gov" xml:lang="en">
   <jats:p>Human Activity Recognition (HAR) is an important modern research field concerned with studying and analyzing human actions and behaviors. HAR offers great potential for a wide range of applications in various fields that enhance health, safety, and efficiency. Because human activities are diverse and people perform them in different ways, recognizing human activity is difficult. The capabilities provided by Artificial Intelligence (AI) tools for analyzing and understanding hidden patterns in complex data can greatly facilitate the HAR process. Over the past decade there has been a strong trend toward using Machine Learning (ML) and Deep Learning (DL) techniques to analyze and understand big data for HAR. Although many studies employ these techniques, their accuracy still needs further improvement due to several challenges: data complexity, class imbalance, selecting a feature selection technique appropriate to the chosen ML technique, and tuning the hyperparameters of the ML technique used. To overcome these challenges, this study proposes an effective two-stage framework: a data preprocessing procedure that includes data balancing and data normalization, followed by a hybrid CNN-XGB model that combines a Convolutional Neural Network (CNN) with a fine-tuned XGBoost (XGB) classifier for accurate HAR. The CNN-XGB model achieved excellent results when trained and tested on the HCI-HAR dataset, reaching an accuracy of up to 99.0%. Effective HAR enables many applications that contribute to improving the quality of life in various areas of daily life.</jats:p>
  </jats:abstract>
  <publication_date media_type="print">
   <year>2024</year>
  </publication_date>
  <publication_date media_type="online">
   <year>2024</year>
  </publication_date>
  <pages>
   <first_page>196</first_page>
   <last_page>207</last_page>
  </pages>
  <doi_data>
   <doi>10.54216/FPA.150218</doi>
   <resource>https://www.americaspg.com/articleinfo/3/show/2697</resource>
  </doi_data>
 </journal_article>
</journal>
