<?xml version="1.0"?>
<journal>
 <journal_metadata>
  <full_title>Fusion: Practice and Applications</full_title>
  <abbrev_title>FPA</abbrev_title>
  <issn media_type="print">2692-4048</issn>
  <issn media_type="electronic">2770-0070</issn>
  <doi_data>
   <doi>10.54216/FPA</doi>
   <resource>https://www.americaspg.com/journals/show/2567</resource>
  </doi_data>
 </journal_metadata>
 <journal_issue>
  <publication_date media_type="print">
   <year>2024</year>
  </publication_date>
  <publication_date media_type="online">
   <year>2024</year>
  </publication_date>
 </journal_issue>
 <journal_article publication_type="full_text">
  <titles>
   <title>Improving Shape Transformations for RGB Cameras Using Photometric Stereo</title>
  </titles>
  <contributors>
   <organization sequence="first" contributor_role="author">College of Computer Science &amp; Information Technology, University of Kerbala, Kerbala, 56002, Iraq</organization>
   <person_name sequence="first" contributor_role="author">
    <given_name>Ravi</given_name>
    <surname>Ravi</surname>
   </person_name>
   <organization sequence="additional" contributor_role="author">Faculty of Basic Education, Department of Mathematics, University of Kufa, Najaf, Iraq</organization>
   <person_name sequence="additional" contributor_role="author">
    <given_name>A. N.</given_name>
    <surname>..</surname>
   </person_name>
   <organization sequence="additional" contributor_role="author">College of Engineering, Al-Iraqia University, Baghdad, 10054, Iraq</organization>
   <person_name sequence="additional" contributor_role="author">
    <given_name>Ahmed L.</given_name>
    <surname>..</surname>
   </person_name>
   <organization sequence="additional" contributor_role="author">Symbiosis Institute of Technology (SIT) Pune Campus, Symbiosis International (Deemed University) (SIU), Pune, 412115, Maharashtra, India</organization>
   <person_name sequence="additional" contributor_role="author">
    <given_name>Ravi</given_name>
    <surname>Sekhar</surname>
   </person_name>
   <organization sequence="additional" contributor_role="author">Symbiosis Institute of Technology (SIT) Pune Campus, Symbiosis International (Deemed University) (SIU), Pune, 412115, Maharashtra, India</organization>
   <person_name sequence="additional" contributor_role="author">
    <given_name>Pritesh</given_name>
    <surname>Shah</surname>
   </person_name>
   <organization sequence="additional" contributor_role="author">Department of Medical Instrumentation Technical Engineering, Medical Technical College, Al-Farahidi University, Baghdad, 10070, Iraq</organization>
   <person_name sequence="additional" contributor_role="author">
    <given_name>Jamal F.</given_name>
    <surname>Tawfeq</surname>
   </person_name>
  </contributors>
  <jats:abstract xml:lang="en" xmlns:jats="http://www.ncbi.nlm.nih.gov/JATS1">
   <jats:p>The emergence of low-cost red, green, and blue (RGB) cameras has significantly impacted various computer vision tasks. However, these cameras often produce depth maps with limited object details, noise, and missing information. These limitations can adversely affect the quality of 3D reconstruction and the accuracy of camera trajectory estimation. Additionally, existing depth refinement methods struggle to distinguish shape from complex albedo, leading to visible artifacts in the refined depth maps. In this paper, we address these challenges by proposing two novel methods based on the theory of photometric stereo. The first method, the RGB ratio model, tackles the nonlinearity problem present in previous approaches and provides a closed-form solution. The second method, the robust multi-light model, overcomes the limitations of existing depth refinement methods by accurately estimating shape from imperfect depth data without relying on regularization. Furthermore, we demonstrate the effectiveness of combining these methods with image super-resolution to obtain high-quality, high-resolution depth maps. Through quantitative and qualitative experiments, we validate the robustness and effectiveness of our techniques in improving shape transformations for RGB cameras.</jats:p>
  </jats:abstract>
  <publication_date media_type="print">
   <year>2024</year>
  </publication_date>
  <publication_date media_type="online">
   <year>2024</year>
  </publication_date>
  <pages>
   <first_page>144</first_page>
   <last_page>156</last_page>
  </pages>
  <doi_data>
   <doi>10.54216/FPA.150112</doi>
   <resource>https://www.americaspg.com/articleinfo/3/show/2567</resource>
  </doi_data>
 </journal_article>
</journal>
