A Data-driven Transfer Learning Method for Radio Map Estimation

Abstract

Estimating accurate radio maps is important for various tasks in wireless communications, such as channel modeling, resource allocation, and network planning. Because propagation characteristics differ across wireless environments, a radio map model learned in one wireless environment cannot be directly applied in a new one. Moreover, learning a new model for every environment generally requires a large amount of data and is computationally demanding. In this work, we design an effective, novel data-driven transfer learning method that transfers and fine-tunes a deep neural network (DNN)-based radio map model, learned in an original wireless environment, to other wireless environments with a certain level of similarity, allowing the radio map to be estimated with a smaller amount of training data. As opposed to other widely used similarity measures that do not take into account the wireless propagation characteristics, we design a dat...
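The transfer-and-fine-tune idea described above can be sketched with a toy NumPy model. Everything here is an illustrative assumption, not the paper's actual method: the two synthetic path-loss fields stand in for "similar" source and target environments, the two-layer network stands in for the DNN, and the transfer step simply freezes the feature layer and re-fits only the output head on a small target-environment sample.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy radio maps: received power (dB) as a function of 2-D location.
# The target environment is a shifted/offset variant of the source field,
# mimicking environments "with a certain level of similarity".
def source_map(xy):
    return -40.0 - 20.0 * np.log10(1.0 + np.linalg.norm(xy - [0.3, 0.3], axis=1))

def target_map(xy):
    return -42.0 - 20.0 * np.log10(1.0 + np.linalg.norm(xy - [0.4, 0.35], axis=1))

def forward(X, W1, b1, w2, b2):
    H = np.maximum(0.0, X @ W1 + b1)       # shared ReLU feature layer
    return H @ w2 + b2, H

def train(X, y, W1, b1, w2, b2, lr=0.05, epochs=2000, freeze_features=False):
    """Gradient descent on 0.5*MSE; optionally update only the output head."""
    for _ in range(epochs):
        pred, H = forward(X, W1, b1, w2, b2)
        err = pred - y
        w2 = w2 - lr * H.T @ err / len(X)
        b2 = b2 - lr * err.mean()
        if not freeze_features:            # full training vs. head-only fine-tuning
            dH = np.outer(err, w2) * (H > 0)
            W1 = W1 - lr * X.T @ dH / len(X)
            b1 = b1 - lr * dH.mean(axis=0)
    return W1, b1, w2, b2

# 1) Train the full model on abundant source-environment measurements.
Xs = rng.uniform(0.0, 1.0, (400, 2))
ys = source_map(Xs)
offset = ys.mean()                          # center targets for stable training
W1 = rng.normal(0.0, 1.0, (2, 32)); b1 = np.zeros(32)
w2 = rng.normal(0.0, 0.1, 32); b2 = 0.0
W1, b1, w2, b2 = train(Xs, ys - offset, W1, b1, w2, b2)

# 2) Transfer: keep the feature layer, fine-tune only the head on a small target set.
Xt = rng.uniform(0.0, 1.0, (40, 2))
yt = target_map(Xt)
Xe = rng.uniform(0.0, 1.0, (500, 2))        # held-out target-environment test points
ye = target_map(Xe)
base_rmse = np.sqrt(np.mean((forward(Xe, W1, b1, w2, b2)[0] + offset - ye) ** 2))
_, _, w2, b2 = train(Xt, yt - offset, W1, b1, w2, b2, freeze_features=True)
ft_rmse = np.sqrt(np.mean((forward(Xe, W1, b1, w2, b2)[0] + offset - ye) ** 2))
print(f"source model on target env: {base_rmse:.2f} dB RMSE; "
      f"after fine-tuning: {ft_rmse:.2f} dB RMSE")
```

The sketch only illustrates why transfer can cut the target-environment data requirement: the transferred feature layer already encodes the shared spatial structure, so the small target sample only has to re-fit the head. It does not implement the paper's propagation-aware similarity measure, which decides when such a transfer is appropriate.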
