TY - JOUR
T1 - Few-Shot Learning With Prototypical Networks for Improved Memory Forensics
AU - Malik, Muhammad Fahad
AU - Gul, Ammara
AU - Saadia, Ayesha
AU - Alserhani, Faeiz M
PY - 2025/4/30
Y1 - 2025/4/30
N2 - Securing computer systems requires effective methods for malware detection. Memory forensics analyzes memory dumps to identify malicious activity, but it faces challenges such as large and complex datasets, constantly evolving malware threats, and limited labeled data for training algorithms. This research introduces a novel approach to malware detection that combines memory forensics with prototypical networks, a type of few-shot learning algorithm that excels at classifying new categories from minimal examples. To the best of the authors’ knowledge, this is the first application of prototypical networks to the Dumpware10 dataset; our findings highlight the potential of few-shot learning for memory-forensics-based malware detection and open new avenues for research in this domain. Using the publicly available Dumpware10 dataset, which includes ten malware classes and one benign class, we preprocess memory dumps with denoising and A-Hash functions to reduce noise and redundancy. The prototypical network is trained on the first four malware classes and the benign class. It is then tested on a dataset containing one additional class (the first five malware classes and the benign class), and the number of test classes is progressively increased to eleven. Within each training episode, five training images serve as support samples, and all remaining images are designated as query samples. The goal is not to predict exact class labels but to assess the similarity between query images and class prototypes using a distance metric: if a prototype’s label matches that of the query image and the distance falls below a threshold, the result is counted as a true positive. This approach achieves an average accuracy of 92% with eleven classes, the highest across all scenarios, and is comparable to previous work applying machine learning and deep learning algorithms to this dataset.
AB - Securing computer systems requires effective methods for malware detection. Memory forensics analyzes memory dumps to identify malicious activity, but it faces challenges such as large and complex datasets, constantly evolving malware threats, and limited labeled data for training algorithms. This research introduces a novel approach to malware detection that combines memory forensics with prototypical networks, a type of few-shot learning algorithm that excels at classifying new categories from minimal examples. To the best of the authors’ knowledge, this is the first application of prototypical networks to the Dumpware10 dataset; our findings highlight the potential of few-shot learning for memory-forensics-based malware detection and open new avenues for research in this domain. Using the publicly available Dumpware10 dataset, which includes ten malware classes and one benign class, we preprocess memory dumps with denoising and A-Hash functions to reduce noise and redundancy. The prototypical network is trained on the first four malware classes and the benign class. It is then tested on a dataset containing one additional class (the first five malware classes and the benign class), and the number of test classes is progressively increased to eleven. Within each training episode, five training images serve as support samples, and all remaining images are designated as query samples. The goal is not to predict exact class labels but to assess the similarity between query images and class prototypes using a distance metric: if a prototype’s label matches that of the query image and the distance falls below a threshold, the result is counted as a true positive. This approach achieves an average accuracy of 92% with eleven classes, the highest across all scenarios, and is comparable to previous work applying machine learning and deep learning algorithms to this dataset.
UR - https://www.open-access.bcu.ac.uk/16618/
U2 - 10.1109/ACCESS.2025.3565802
DO - 10.1109/ACCESS.2025.3565802
M3 - Article
SN - 2169-3536
VL - 13
JO - IEEE Access
JF - IEEE Access
ER -