Multi-label legal text classification with BiLSTM and attention
by Liriam Enamoto; Andre R.A.S. Santos; Ricardo Maia; Li Weigang; Geraldo P. Rocha Filho
International Journal of Computer Applications in Technology (IJCAT), Vol. 68, No. 4, 2022

Abstract: Like many other knowledge fields, the legal area faces an information-overload scenario. However, extracting data from legal documents is challenging due to the complexity of legal concepts and terms. This work applies a Bidirectional Long Short-Term Memory (BiLSTM) network to Portuguese legal text classification to address this challenge. The proposed model is a shallow network with one BiLSTM layer and one Attention layer, trained on two small data sets extracted from two Brazilian courts: the Superior Labour Court (TST) and the 1st Region Labour Court. The experimental results show that combining the BiLSTM layer and the Attention layer for long judicial texts helps capture past and future context and extract multiple tags. As the main contribution of this research, the proposed model can quickly process multi-label and multi-class data sets and adapt to new contexts in different languages.
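The architecture described above — per-token BiLSTM outputs pooled by an attention layer, followed by independent sigmoid outputs for multi-label tagging — can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the BiLSTM hidden states are simulated with random values, and all dimensions and weight names are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not from the paper):
T, H, L = 50, 128, 6  # sequence length, BiLSTM output size (fwd+bwd), labels

# Stand-in for the BiLSTM layer's per-token outputs, shape (T, H).
bilstm_out = rng.standard_normal((T, H))

# Attention pooling: score each time step, softmax, weighted sum.
w_att = rng.standard_normal(H)
scores = bilstm_out @ w_att                 # (T,) one score per token
weights = np.exp(scores - scores.max())
weights /= weights.sum()                    # attention distribution over tokens
context = weights @ bilstm_out              # (H,) attention-pooled document vector

# Multi-label head: one independent sigmoid per tag, so several
# tags can fire at once (unlike a softmax multi-class head).
W_out = rng.standard_normal((L, H)) * 0.01
logits = W_out @ context
probs = 1.0 / (1.0 + np.exp(-logits))       # (L,) per-tag probabilities
predicted_tags = (probs > 0.5).astype(int)  # multi-label decision
```

The key design point is the pairing of attention pooling (which lets the model weight informative passages in long judicial texts) with per-label sigmoids (which makes the output multi-label rather than single-class).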

Online publication date: Thu, 01-Sep-2022
