Auto-encoder-based technique for effective detection of frauds in social networks
Online publication date: Mon, 05-Sep-2022
by S. Jamuna Rani; S. Vagdevi
International Journal of Information and Computer Security (IJICS), Vol. 18, No. 3/4, 2022
Abstract: The detection of spam accounts in social networks has recently attracted significant attention in the literature. Most spam-account detection techniques employ supervised learning models, which require a sufficiently large set of spam-account samples in the training set to be trained effectively. Obtaining such large sample sizes is a significant challenge, however: in many real-world scenarios, the number of available samples is extremely limited, and this limitation can cause detection techniques to exhibit very poor accuracy. Hence, this paper presents an effective supervised spam-account detection technique that requires only a limited number of spam-account samples in its training set. To achieve this, the dimension of the feature vectors in the training set is reduced with the aid of auto-encoders, and spam accounts are then detected based on their hazard rates, which are generated by a recurrent neural network. An empirical study comparing the proposed technique against a contemporary technique shows that the proposed technique achieves relatively superior classification accuracy.
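The abstract's central idea of reducing the dimension of account feature vectors with an auto-encoder can be sketched in a few lines. The following is a minimal illustrative example, not the authors' implementation (the paper publishes no code): a single-hidden-layer linear auto-encoder in NumPy that compresses hypothetical 20-dimensional account feature vectors to a 5-dimensional code by minimising reconstruction error. The feature data here is random and stands in for real profile/behaviour statistics.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_autoencoder(X, code_dim=5, epochs=500, lr=0.01):
    """Train a tiny linear auto-encoder by gradient descent on the
    mean squared reconstruction error; return the encoder weights
    and the per-epoch loss history."""
    n, d = X.shape
    W_enc = rng.normal(scale=0.1, size=(d, code_dim))  # d -> code_dim
    W_dec = rng.normal(scale=0.1, size=(code_dim, d))  # code_dim -> d
    losses = []
    for _ in range(epochs):
        Z = X @ W_enc          # encode: project features to low-dim code
        X_hat = Z @ W_dec      # decode: reconstruct original features
        err = X_hat - X
        losses.append(float(np.mean(err ** 2)))
        # gradients of the reconstruction error w.r.t. each weight matrix
        grad_dec = Z.T @ err / n
        grad_enc = X.T @ (err @ W_dec.T) / n
        W_dec -= lr * grad_dec
        W_enc -= lr * grad_enc
    return W_enc, losses

# Synthetic stand-in for spam/benign account feature vectors.
X = rng.normal(size=(200, 20))
W_enc, losses = train_autoencoder(X)
codes = X @ W_enc              # reduced 5-dimensional representations
print(codes.shape)             # (200, 5)
print(losses[-1] < losses[0])  # reconstruction error decreased
```

In the paper's pipeline these reduced codes, rather than the raw feature vectors, would feed the downstream hazard-rate model, which is what lets a small training set suffice; the recurrent network generating the hazard rates is a separate component not sketched here.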