Using deep belief networks to extract Chinese entity attribute relation in domain-specific
by Yantuan Xian; Fa Shao; Jianyi Guo; Lanjiang Zhou; Zhengtao Yu; Wei Chen
International Journal of Computing Science and Mathematics (IJCSM), Vol. 7, No. 2, 2016

Abstract: State-of-the-art methods for entity attribute relation extraction are primarily based on statistical machine learning, and their performance depends strongly on the quality of the extracted features. Deep belief networks (DBNs) have been successful in high-dimensional information extraction tasks and require no complicated pre-processing. In this paper, a DBN consisting of one or more restricted Boltzmann machine (RBM) layers and a back-propagation (BP) layer is presented to extract Chinese entity attribute relations in a specific domain. First, word tokens are transformed into vectors by looking up word embeddings. Then, the RBM layers retain as much information as possible as feature vectors are passed from one layer to the next. Finally, the BP layer is trained to classify the features generated by the last RBM layer, using the Levenberg-Marquardt (LM) optimisation algorithm for training. Experimental results show that the proposed method outperforms state-of-the-art learning models on domain-specific entity attribute relation extraction.
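The pipeline described in the abstract (embedding features fed through stacked RBM layers, whose output goes to a supervised classifier) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the layer sizes, learning rate, and CD-1 training loop are all assumptions, and the final BP/LM classifier stage is omitted.

```python
import numpy as np

# Sketch of the abstract's pipeline: embedding features -> stacked RBM
# layers (greedy layer-wise pre-training) -> features for a classifier.
# All dimensions and hyperparameters here are illustrative assumptions.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Restricted Boltzmann machine trained with one-step contrastive
    divergence (CD-1), a standard way to pre-train DBN layers."""

    def __init__(self, n_visible, n_hidden, lr=0.1):
        self.W = rng.normal(0.0, 0.01, (n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible biases
        self.b_h = np.zeros(n_hidden)    # hidden biases
        self.lr = lr

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.b_h)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b_v)

    def cd1_step(self, v0):
        # Positive phase: hidden activations given the data.
        h0 = self.hidden_probs(v0)
        # Negative phase: one Gibbs step (sample hidden, reconstruct).
        h0_sample = (rng.random(h0.shape) < h0).astype(float)
        v1 = self.visible_probs(h0_sample)
        h1 = self.hidden_probs(v1)
        n = v0.shape[0]
        # Gradient approximation from CD-1.
        self.W += self.lr * (v0.T @ h0 - v1.T @ h1) / n
        self.b_v += self.lr * (v0 - v1).mean(axis=0)
        self.b_h += self.lr * (h0 - h1).mean(axis=0)

# Toy batch standing in for word-embedding features of 32 instances
# with 20-dimensional vectors (dimensions are assumptions).
X = rng.random((32, 20))

# Greedy layer-wise pre-training of two stacked RBM layers.
layers = [RBM(20, 12), RBM(12, 6)]
feats = X
for rbm in layers:
    for _ in range(50):
        rbm.cd1_step(feats)
    feats = rbm.hidden_probs(feats)

# `feats` is what the paper's final BP layer would classify.
print(feats.shape)
```

In a full DBN the pre-trained RBM weights would initialise a feed-forward network whose top layer is then trained supervised (the paper's BP layer with LM optimisation); that fine-tuning stage is left out of this sketch.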

Online publication date: Fri, 06-May-2016
