Title: A rapid mining model for extracting sparse distribution association semantic link from large-scale web resources

Authors: Shunxiang Zhang; Kui Lu; Xiaobo Yin; Guangli Zhu

Addresses: School of Computer Science and Engineering, Anhui University of Science and Technology, Huainan, 232001, China (all authors)

Abstract: Association semantic links (ASLs) can provide theoretical support for many intelligent web activities. However, when keyword-level association semantic links (k-ASLs) are extracted, sparsely distributed k-ASLs are easily discarded. To solve this problem, this paper proposes a rapid mining model for extracting sparsely distributed k-ASLs from large-scale web resources. First, the time validity of three types of k-ASL is analysed to clarify their semantic characteristics. Second, three existing problems in mining sparsely distributed k-ASLs are presented to explain why this kind of k-ASL is easily discarded. After that, we present the theoretical foundation for rapidly mining sparsely distributed k-ASLs. Furthermore, the rapid mining model for extracting sparsely distributed k-ASLs is proposed, based on the presented theory and on set computations such as difference computation and union computation. Finally, an evaluation method is presented and the correctness of the proposed model is validated by experiments.
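The abstract's core idea (splitting the corpus into time windows and using set union/difference to recover links that a single global threshold would discard) can be illustrated with a minimal sketch. All names, thresholds, and the co-occurrence counting scheme below are assumptions for illustration, not the authors' actual algorithm:

```python
# Hypothetical sketch: a keyword pair may be too sparse to pass a global
# frequency threshold, yet be locally frequent inside one time window.
# Splitting into windows, taking the union of per-window frequent pairs,
# and subtracting the globally frequent ones recovers such sparse links.
from itertools import combinations
from collections import Counter

def frequent_pairs(docs, min_count):
    """Keyword pairs co-occurring in at least min_count documents."""
    counts = Counter()
    for keywords in docs:
        for pair in combinations(sorted(set(keywords)), 2):
            counts[pair] += 1
    return {p for p, c in counts.items() if c >= min_count}

def sparse_links(windows, global_min, local_min):
    """Links frequent in some time window but missed by a global threshold."""
    all_docs = [doc for window in windows for doc in window]
    global_links = frequent_pairs(all_docs, global_min)
    local_links = set()
    for window in windows:                            # split time windows
        local_links |= frequent_pairs(window, local_min)  # union computation
    return local_links - global_links                 # difference computation
```

For example, a pair appearing twice within one window passes `local_min=2` there, yet fails `global_min=3` over the whole corpus, so it is returned as a sparsely distributed link.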

Keywords: association semantic link; ASL; sparse distribution; time validity; mining model; splitting time window.

DOI: 10.1504/IJAHUC.2017.083482

International Journal of Ad Hoc and Ubiquitous Computing, 2017 Vol.25 No.1/2, pp.52 - 64

Received: 01 Jun 2015
Accepted: 13 Jan 2016

Published online: 07 Apr 2017
