<?xml version="1.0"?>
<journal>
 <journal_metadata>
  <full_title>Fusion: Practice and Applications</full_title>
  <abbrev_title>FPA</abbrev_title>
  <issn media_type="print">2692-4048</issn>
  <issn media_type="electronic">2770-0070</issn>
  <doi_data>
   <doi>10.54216/FPA</doi>
   <resource>https://www.americaspg.com/journals/show/3663</resource>
  </doi_data>
 </journal_metadata>
 <journal_issue>
  <publication_date media_type="print">
    <year>2025</year>
   </publication_date>
   <publication_date media_type="online">
    <year>2025</year>
  </publication_date>
 </journal_issue>
 <journal_article publication_type="full_text">
  <titles>
   <title>A Novel Deep Learning Approach for Automated Melanoma Classification using Hybrid CNN and Vision Transformer Model</title>
  </titles>
  <contributors>
   <organization sequence="first" contributor_role="author">Research Scholar, SoET, CMR University, Bangalore, India; Sr. Assistant Professor, Department of ECE, New Horizon College of Engineering, Bangalore, India</organization>
   <person_name sequence="first" contributor_role="author">
    <given_name>Hamsalekha</given_name>
    <surname>Hamsalekha</surname>
   </person_name>
   <organization sequence="first" contributor_role="author">Professor, ECE and DORI, CMR University, Bangalore, India</organization>
   <person_name sequence="additional" contributor_role="author">
    <given_name>Glan Devadhas</given_name>
    <surname>George</surname>
   </person_name>
   <organization sequence="first" contributor_role="author">Associate Professor, School of CSE, Reva University, Bangalore, India</organization>
   <person_name sequence="additional" contributor_role="author">
    <given_name>T. Y.</given_name>
    <surname>Satheesha</surname>
   </person_name>
  </contributors>
  <jats:abstract xmlns:jats="http://www.ncbi.nlm.nih.gov/JATS1" xml:lang="en">
   <jats:p>Melanoma is a serious form of skin cancer affecting people worldwide, and detecting it at an early stage is crucial to improving survival rates. Traditional detection methods rely on biopsies, which are time-consuming and involve complex procedures that delay diagnosis; accurate diagnosis from dermatological images is also challenging because of the complexity of the imaging techniques involved. Advances in technology, particularly deep learning techniques such as CNNs, have significantly improved the accuracy and efficiency of melanoma detection. This paper presents a novel hybrid deep learning architecture that combines Convolutional Neural Networks (CNNs) and Vision Transformers (ViTs) for automated classification of skin lesions into two categories: malignant (cancerous) and benign (non-cancerous). The proposed model leverages the CNN's strength in extracting local features alongside the ViT's ability to capture global features. The hybrid architecture was trained and evaluated on the ISIC 2020 challenge dataset of dermatological images and achieved strong performance, with an accuracy of 94%, a precision of 91%, a recall (sensitivity) of 90%, and an F1 score of 91% after 25 epochs. The model's robustness is further supported by confusion matrix analysis, which demonstrates reliable classification across varied melanoma presentations. The proposed hybrid approach offers a more efficient and less complex route to automatic detection and identification of melanoma, increasing the chances of successful early intervention and improving patient outcomes; this makes it suitable for clinical use and lays a foundation for future developments in automated skin cancer detection systems. In comparison with other advanced networks, the model shows superior performance.</jats:p>
  </jats:abstract>
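  <!-- The abstract describes a hybrid CNN + ViT classifier at a high level. The sketch
       below illustrates one way such an architecture could look, assuming a PyTorch/timm
       implementation with a ResNet-50 CNN branch and a ViT-B/16 branch fused by feature
       concatenation; the backbone choices, feature dimensions, and fusion strategy are
       assumptions for illustration, not details confirmed by this record.

       import torch
       import torch.nn as nn
       import timm  # assumed dependency providing standard CNN and ViT backbones

       class HybridCnnVit(nn.Module):
           """Binary skin-lesion classifier: benign vs. malignant (illustrative sketch)."""
           def __init__(self, num_classes: int = 2):
               super().__init__()
               # CNN branch: strong at local texture and border features of a lesion.
               self.cnn = timm.create_model("resnet50", pretrained=True, num_classes=0)
               # ViT branch: self-attention over patches captures global context.
               self.vit = timm.create_model("vit_base_patch16_224", pretrained=True, num_classes=0)
               fused_dim = self.cnn.num_features + self.vit.num_features
               self.classifier = nn.Sequential(
                   nn.LayerNorm(fused_dim),
                   nn.Linear(fused_dim, 256),
                   nn.GELU(),
                   nn.Dropout(0.3),
                   nn.Linear(256, num_classes),
               )

           def forward(self, x: torch.Tensor) -> torch.Tensor:
               local_feats = self.cnn(x)    # (B, 2048) pooled CNN features
               global_feats = self.vit(x)   # (B, 768) pooled ViT features
               fused = torch.cat([local_feats, global_feats], dim=1)
               return self.classifier(fused)

       # Example: logits = HybridCnnVit()(torch.randn(4, 3, 224, 224))
  -->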
  <publication_date media_type="print">
   <year>2025</year>
  </publication_date>
  <publication_date media_type="online">
   <year>2025</year>
  </publication_date>
  <pages>
   <first_page>92</first_page>
   <last_page>101</last_page>
  </pages>
  <doi_data>
   <doi>10.54216/FPA.190207</doi>
   <resource>https://www.americaspg.com/articleinfo/3/show/3663</resource>
  </doi_data>
 </journal_article>
</journal>
