Leveraging activation and optimisation layers as dynamic strategies in the multi-task fuzzing scheme

  • Sadegh Bamohabbat Chafjiri*
  • Phil Legg
  • Michail-Antisthenis Tsompanas
  • Jun Hong

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Fuzzing is a common technique for identifying vulnerabilities in software. Recent approaches, like She et al.’s Multi-Task Fuzzing (MTFuzz), use neural networks to improve fuzzing efficiency. However, key elements like network architecture and hyperparameter tuning are still not well-explored. Factors like activation layers, optimisation function design, and vanishing gradient strategies can significantly impact fuzzing results by improving test case selection. This paper delves into these aspects to improve neural network-driven fuzz testing. We focus on three key neural network parameters to improve fuzz testing: the Leaky Rectified Linear Unit (LReLU) activation, Nesterov-accelerated Adaptive Moment Estimation (Nadam) optimisation, and sensitivity analysis. LReLU adds non-linearity, aiding feature extraction, while Nadam helps to improve weight updates by considering both current and future gradient directions. Sensitivity analysis optimises layer selection for gradient calculation, enhancing fuzzing efficiency. Based on these insights, we propose LMTFuzz, a novel fuzzing scheme optimised for these Machine Learning (ML) strategies. We explore the individual and combined effects of LReLU, Nadam, and sensitivity analysis, as well as their hybrid configurations, across six different software targets. Experimental results demonstrate that LReLU, individually or when paired with sensitivity analysis, significantly enhances fuzz testing performance. However, when combined with Nadam, LReLU shows improvement on some targets, though less pronounced than its combination with sensitivity analysis. This combination improves accuracy, reduces loss, and increases edge coverage, with improvements of up to 23.8%. Furthermore, it leads to a significant increase in unique bug detection, with some targets detecting up to 2.66 times more bugs than baseline methods.
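The paper's exact network architecture and hyperparameters are not reproduced on this page. As a minimal sketch of the two components the abstract names, here are standard definitions of the LReLU activation and a single Nadam-style parameter update (following Dozat's formulation); the slope `alpha` and the optimiser hyperparameters below are illustrative defaults, not the paper's settings:

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    """LReLU: passes positive inputs unchanged and scales negatives by
    alpha, keeping a small gradient alive where ReLU would output zero."""
    return np.where(x > 0, x, alpha * x)

class Nadam:
    """Single-tensor Nadam step: Adam's adaptive moment estimates plus a
    Nesterov-style look-ahead on the momentum term."""
    def __init__(self, lr=0.002, beta1=0.9, beta2=0.999, eps=1e-8):
        self.lr, self.b1, self.b2, self.eps = lr, beta1, beta2, eps
        self.m = 0.0   # first-moment (momentum) estimate
        self.v = 0.0   # second-moment (variance) estimate
        self.t = 0     # step counter for bias correction

    def step(self, theta, grad):
        self.t += 1
        self.m = self.b1 * self.m + (1 - self.b1) * grad
        self.v = self.b2 * self.v + (1 - self.b2) * grad ** 2
        m_hat = self.m / (1 - self.b1 ** self.t)
        v_hat = self.v / (1 - self.b2 ** self.t)
        # Nesterov look-ahead: blend the bias-corrected momentum with the
        # current bias-corrected gradient, so the update anticipates the
        # next momentum direction rather than using the current one alone.
        update = self.b1 * m_hat + (1 - self.b1) * grad / (1 - self.b1 ** self.t)
        return theta - self.lr * update / (np.sqrt(v_hat) + self.eps)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(leaky_relu(x))              # negatives scaled by alpha, not zeroed
opt = Nadam()
theta = opt.step(theta=1.0, grad=0.4)
print(theta)                      # theta moves against the gradient
```

In a fuzzing context such as MTFuzz, the activation choice matters because gradients over the input bytes guide mutation; LReLU avoids the dead-neuron regions where ReLU gradients vanish entirely.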
Original language: English
Article number: 104011
Journal: Computer Standards and Interfaces
Volume: 94
DOIs
Publication status: Published (VoR) - 15 Apr 2025

Keywords

  • Fuzzing
  • Neural network
  • LReLU
  • Nadam optimisation
  • Sensitivity analysis
