Authors: Jiun-Wei Liou; Wei-Chen Cheng; Jau-Chi Huang; Cheng-Yuan Liou
Addresses: Department of Computer Science and Information Engineering, National Taiwan University, No. 1, Sec. 4, Roosevelt Rd., Taipei 10617, Taiwan; Institute of Statistical Science, Academia Sinica, 128 Academia Road, Section 2, Nankang, Taipei 115, Taiwan; Information and Communication Security Laboratory, Telecommunication Laboratories, Chunghwa Telecom Co., Ltd., No. 99, Dianyan Rd., Yangmei City, Taoyuan County 32601, Taiwan; Department of Computer Science and Information Engineering, National Taiwan University, No. 1, Sec. 4, Roosevelt Rd., Taipei 10617, Taiwan
Abstract: This paper presents a novel training method for the Elman network that encodes the words in literary works. This network has previously been used to study small sets of simple artificial sentences with varying degrees of success; this paper shows how to apply it to real-world works. The method learns both the word codes and the network weights. Each trained code is a distributed representation of its word, and the training error can be drastically reduced by iteratively re-encoding these representations. Several distinct findings and results from the training process are reported.
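To make the abstract's idea concrete, the following is a minimal sketch of an Elman (simple recurrent) network that jointly trains word codes and network weights on a next-word task, with one "re-encoding" pass that replaces each word's code by its mean hidden state. The toy corpus, layer sizes, learning rate, and the specific re-encoding rule are all illustrative assumptions, not the paper's actual settings or implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

corpus = [0, 1, 2, 0, 1, 2, 0, 1, 2]   # toy word-index sequence (assumption)
V, H = 3, 5                             # vocabulary and hidden sizes (assumption)

codes = rng.normal(0, 0.1, (V, H))      # trainable distributed word codes (inputs)
W_xh = rng.normal(0, 0.1, (H, H))       # input  -> hidden weights
W_hh = rng.normal(0, 0.1, (H, H))       # context -> hidden weights
W_hy = rng.normal(0, 0.1, (H, V))       # hidden -> output weights
lr = 0.1

def step(x, h_prev):
    """One Elman step: new hidden state and softmax over the next word."""
    h = np.tanh(x @ W_xh + h_prev @ W_hh)
    z = h @ W_hy
    p = np.exp(z - z.max()); p /= p.sum()
    return h, p

def epoch(update=False, reencode=False):
    """One pass over the corpus; optionally do SGD and/or re-encode codes."""
    global W_xh, W_hh, W_hy
    h = np.zeros(H)
    loss = 0.0
    hidden_sums = np.zeros((V, H)); counts = np.zeros(V)
    for t in range(len(corpus) - 1):
        w, target = corpus[t], corpus[t + 1]
        x, h_prev = codes[w], h
        h, p = step(x, h_prev)
        loss -= np.log(p[target])
        if update:  # truncated one-step gradient descent
            dz = p.copy(); dz[target] -= 1
            dh = dz @ W_hy.T * (1 - h ** 2)
            W_hy -= lr * np.outer(h, dz)
            W_xh -= lr * np.outer(x, dh)
            W_hh -= lr * np.outer(h_prev, dh)
            codes[w] -= lr * dh @ W_xh.T   # word codes are trained too
        hidden_sums[w] += h; counts[w] += 1
    if reencode:  # re-encode each word as its mean hidden state
        mask = counts > 0
        codes[mask] = hidden_sums[mask] / counts[mask][:, None]
    return loss / (len(corpus) - 1)

before = epoch()
for _ in range(200):
    epoch(update=True)
epoch(reencode=True)            # one iterative re-encoding pass
for _ in range(200):
    epoch(update=True)
after = epoch()
print(f"mean cross-entropy: {before:.3f} -> {after:.3f}")
```

On this deterministic toy sequence, training plus one re-encoding pass should drive the prediction error well below the initial value; the paper's claim is that repeating such re-encoding passes reduces the training error drastically on real texts.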
Keywords: Elman network; computational linguistics; content addressable memory; semantic indexing; personalised codes; neural networks; form-based similarity; function-based similarity; compositional representation; intelligent systems; literary works; word codes; network weights.
International Journal of Intelligent Information and Database Systems, 2013 Vol.7 No.4, pp.373 - 386
Available online: 10 Sep 2013