A Comparative Study of MobileNet Architecture Optimizers for Crowd Prediction
Abstract
Keywords
References
W. Hongyuan and D. Mingxing, “Overview of the Development of Artificial Intelligence Technology,” Int J Res Eng Technol, vol. 07, no. 08, pp. 92–95, Aug. 2018, doi: 10.15623/ijret.2018.0708011.
Z. Li, Y. Wang, Y. Ji, and W. Yang, “A survey of the development of artificial intelligence technology,” in Proceedings of 2020 3rd International Conference on Unmanned Systems, ICUS 2020, Institute of Electrical and Electronics Engineers Inc., Nov. 2020, pp. 1126–1129. doi: 10.1109/ICUS50048.2020.9274952.
M. Jogin, Mohana, M. Madhulika, G. Divya, R. Meghana, and S. Apoorva, “Feature Extraction using Convolution Neural Networks (CNN) and Deep Learning,” in 2018 3rd IEEE International Conference on Recent Trends in Electronics, Information & Communication Technology (RTEICT), 2018, pp. 2319–2323. doi: 10.1109/RTEICT42901.2018.9012507.
M. Ridley, “Explainable Artificial Intelligence (XAI),” Information Technology and Libraries, vol. 41, no. 2, Jun. 2022, doi: 10.6017/ital.v41i2.14683.
F. Yan, Z. Zhang, Y. Liu, and J. Liu, “Design of Convolutional Neural Network Processor Based on FPGA Resource Multiplexing Architecture,” Sensors, vol. 22, no. 16, Aug. 2022, doi: 10.3390/s22165967.
I. Wunderlich, B. Koch, and S. Schönfeld, “An Overview of Arithmetic Adaptations for Inference of Convolutional Neural Networks on Re-configurable Hardware,” 2020.
N. K. Shaydyuk and E. B. John, “FPGA Implementation of MobileNetV2 CNN Model Using Semi-Streaming Architecture for Low Power Inference Applications,” in 2020 IEEE Intl Conf on Parallel & Distributed Processing with Applications, Big Data & Cloud Computing, Sustainable Computing & Communications, Social Computing & Networking (ISPA/BDCloud/SocialCom/SustainCom), IEEE, Dec. 2020, pp. 160–167. doi: 10.1109/ISPA-BDCloud-SocialCom-SustainCom51426.2020.00046.
R. Shang, J. Wang, L. Jiao, R. Stolkin, B. Hou, and Y. Li, “SAR Targets Classification Based on Deep Memory Convolution Neural Networks and Transfer Parameters,” IEEE J Sel Top Appl Earth Obs Remote Sens, vol. 11, no. 8, pp. 2834–2846, Aug. 2018, doi: 10.1109/JSTARS.2018.2836909.
A. W. Salehi et al., “A Study of CNN and Transfer Learning in Medical Imaging: Advantages, Challenges, Future Scope,” Sustainability (Switzerland), vol. 15, no. 7, Apr. 2023, doi: 10.3390/su15075930.
C. Li, J. Feng, L. Hu, J. Li, and H. Ma, “Review of Image Classification Method Based on Deep Transfer Learning,” in Proceedings - 2020 16th International Conference on Computational Intelligence and Security, CIS 2020, Institute of Electrical and Electronics Engineers Inc., Nov. 2020, pp. 104–108. doi: 10.1109/CIS52066.2020.00031.
T. Ridnik, E. Ben-Baruch, A. Noy, and L. Zelnik-Manor, “ImageNet-21K Pretraining for the Masses,” Apr. 2021, [Online]. Available: http://arxiv.org/abs/2104.10972
Y. Gulzar, “Fruit Image Classification Model Based on MobileNetV2 with Deep Transfer Learning Technique,” Sustainability (Switzerland), vol. 15, no. 3, Feb. 2023, doi: 10.3390/su15031906.
Q. Xiang, G. Zhang, X. Wang, J. Lai, R. Li, and Q. Hu, “Fruit image classification based on Mobilenetv2 with transfer learning technique,” in ACM International Conference Proceeding Series, Association for Computing Machinery, Oct. 2019. doi: 10.1145/3331453.3361658.
J. Kaddour, L. Liu, R. Silva, and M. J. Kusner, “When Do Flat Minima Optimizers Work?,” Feb. 2022, [Online]. Available: http://arxiv.org/abs/2202.00661
N. Fatima, “Enhancing Performance of a Deep Neural Network: A Comparative Analysis of Optimization Algorithms,” ADCAIJ: Advances in Distributed Computing and Artificial Intelligence Journal, vol. 9, no. 2, pp. 79–90, Jun. 2020, doi: 10.14201/adcaij2020927990.
T. Fofana, S. Ouattara, and A. Clement, “Optimal Flame Detection of Fires in Videos Based on Deep Learning and the Use of Various Optimizers,” Open Journal of Applied Sciences, vol. 11, no. 11, pp. 1240–1255, 2021, doi: 10.4236/ojapps.2021.1111094.
H. Z. Ilmadina, N. Muhammad, and W. D. Surono, “Drowsiness Detection Based on Yawning Using Modified Pre-trained Model MobileNetV2 and ResNet50,” MATRIK, vol. 22, no. 3, pp. 419–430, 2023, doi: 10.30812/matrik.v22i3.2785.
S. Shao et al., “CrowdHuman: A Benchmark for Detecting Human in a Crowd,” Apr. 2018, [Online]. Available: http://arxiv.org/abs/1805.00123
X. Shao et al., “Multi-scale feature pyramid network: A heavily occluded pedestrian detection network based on ResNet,” Sensors, vol. 21, no. 5, pp. 1–15, Mar. 2021, doi: 10.3390/s21051820.
M. Sandler, A. Howard, M. Zhu, A. Zhmoginov, and L.-C. Chen, “MobileNetV2: Inverted Residuals and Linear Bottlenecks,” Jan. 2018, [Online]. Available: http://arxiv.org/abs/1801.04381
E. Belcore and V. Di Pietra, “Laying the Foundation for an Artificial Neural Network for Photogrammetric Riverine Bathymetry,” in International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences - ISPRS Archives, International Society for Photogrammetry and Remote Sensing, Aug. 2022, pp. 51–58. doi: 10.5194/isprs-archives-XLVIII-4-W1-2022-51-2022.
B. Kaddar, H. Fizazi, M. Hernandez-Cabronero, V. Sanchez, and J. Serra-Sagrista, “DivNet: Efficient Convolutional Neural Network via Multilevel Hierarchical Architecture Design,” IEEE Access, vol. 9, pp. 105892–105901, 2021, doi: 10.1109/ACCESS.2021.3099952.
W. Sun, Q. Wei, L. Ren, J. Dang, and F. F. Yin, “Adaptive respiratory signal prediction using dual multi-layer perceptron neural networks,” Phys Med Biol, vol. 65, no. 18, Sep. 2020, doi: 10.1088/1361-6560/abb170.
A. M. Javid, S. Das, M. Skoglund, and S. Chatterjee, “A ReLU Dense Layer to Improve the Performance of Neural Networks,” Oct. 2020, [Online]. Available: http://arxiv.org/abs/2010.13572
C. Fang, H. He, Q. Long, and W. J. Su, “Exploring Deep Neural Networks via Layer-Peeled Model: Minority Collapse in Imbalanced Training,” Jan. 2021, doi: 10.1073/pnas.2103091118.
W. Pengtao, “Based on Adam optimization algorithm: Neural network model for auto steel performance prediction,” in Journal of Physics: Conference Series, IOP Publishing Ltd, Nov. 2020. doi: 10.1088/1742-6596/1653/1/012012.
Institute of Electrical and Electronics Engineers, IEEE Computational Intelligence Society, and Victoria University of Wellington, 2019 IEEE Congress on Evolutionary Computation (CEC): Proceedings. IEEE, 2019.
X. Wang, H. Ren, and A. Wang, “Smish: A Novel Activation Function for Deep Learning Methods,” Electronics (Switzerland), vol. 11, no. 4, Feb. 2022, doi: 10.3390/electronics11040540.
J. Yun, “An Efficient Approach to Mitigate Numerical Instability in Backpropagation for 16-bit Neural Network Training,” Jul. 2023, [Online]. Available: http://arxiv.org/abs/2307.16189
L. Wu and C. Ma, “How SGD Selects the Global Minima in Over-parameterized Learning: A Dynamical Stability Perspective,” in 32nd Conference on Neural Information Processing Systems (NeurIPS 2018), 2018.
F. Kunstner, J. Chen, J. W. Lavington, and M. Schmidt, “Noise Is Not the Main Factor Behind the Gap Between SGD and Adam on Transformers, but Sign Descent Might Be,” Apr. 2023, [Online]. Available: http://arxiv.org/abs/2304.13960
L. Hodgkinson and M. W. Mahoney, “Multiplicative noise and heavy tails in stochastic optimization,” Jun. 2020, [Online]. Available: http://arxiv.org/abs/2006.06293
DOI: https://doi.org/10.30591/jpit.v8i3.5703
This work is licensed under a Creative Commons Attribution 4.0 International License.