Volume 3, Issue 1, PP: 70-90, 2021

1 **Affiliation:** American University in the Emirates, Dubai, UAE

**Email:** abedallah.abualkishik@aue.ae

2 **Affiliation:** American University in the Emirates, Dubai, UAE

**Email:** rasha.almajed@aue.ae

Received: November 11, 2020; Revised: January 05, 2021; Accepted: March 19, 2021

**Abstract:**

Modern machine learning fusion approaches extract features using one of two techniques: hand-crafted feature engineering or representation learning. Hand-crafted features are time-consuming to design and often insufficient for downstream tasks. Representation learning, in contrast, learns features automatically with minimal time and effort, producing representations well suited to downstream tasks. In this paper, we survey graph neural network methods, detailing classical graph embedding approaches and the main building blocks of graph neural networks, such as graph filtering, graph pooling, and parameter learning, and presenting a general framework or mathematical formulation for each technique in the context of customer satisfaction. To capture customer sentiment, this work employs NLP techniques. We describe adversarial attacks and defenses on graph representation approaches. We also review advanced applications of graph neural networks, such as combinatorial optimization, learning program representations, physical system modeling, and natural language processing. Finally, we discuss open challenges in geometric neural networks and directions for future research.
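As a concrete illustration of the graph filtering operation surveyed in this work, the sketch below implements the well-known GCN propagation rule of Kipf and Welling, H' = ReLU(D^{-1/2}(A+I)D^{-1/2} H W), in NumPy. The tiny path graph, feature matrix, and weight matrix are illustrative assumptions, not data from the paper.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-filtering (GCN) layer: H' = ReLU(D^-1/2 (A+I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    d = A_hat.sum(axis=1)                   # degree of each node
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))  # D^{-1/2}
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)

# Tiny 3-node path graph with 2-dim node features (illustrative only).
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
H = np.random.rand(3, 2)   # node features
W = np.random.rand(2, 4)   # learnable weights
print(gcn_layer(A, H, W).shape)  # (3, 4): 3 nodes, 4 output features
```

Each layer averages normalized neighbor features before the linear transform, which is the filtering step the survey discusses; stacking such layers widens each node's receptive field by one hop per layer.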

**Keywords:**

Machine learning; graph neural networks; graph filtering; graph pooling; optimization; fusion based on NLP; customer satisfaction.


Abedallah Z. Abualkishik, Rasha Almajed