Can Model Fusing Help Transformers in Long Document Classification? : An Empirical Study

Premasiri, Damith and Ranasinghe, Tharindu and Mitkov, Ruslan (2023) Can Model Fusing Help Transformers in Long Document Classification? : An Empirical Study. Other. arXiv.

[thumbnail of 2307.09532v1]
Text (2307.09532v1)
2307.09532v1.pdf - Published Version
Available under License Creative Commons Attribution.

Download (484kB)

Abstract

Text classification is a long-studied area of research in Natural Language Processing (NLP). Adapting NLP to multiple domains has introduced many new challenges for text classification, one of which is long document classification. While state-of-the-art transformer models provide excellent results in text classification, most of them are limited in the maximum length of the input sequence. The majority of transformer models are limited to 512 tokens, and therefore they struggle with long document classification problems. In this research, we explore employing Model Fusing for long document classification while comparing the results with the well-known BERT and Longformer architectures.
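The abstract does not detail the fusing method, so the following is only a hypothetical sketch of one common workaround for the 512-token limit it mentions: split a long document into fixed-size chunks, score each chunk with a classifier (stubbed here in place of a BERT-style model), and fuse the chunk-level probabilities by averaging. All function names and the toy classifier are illustrative assumptions, not the paper's implementation.

```python
# Illustrative sketch only (assumption): chunk a long document, classify each
# chunk, and fuse chunk-level class probabilities by mean pooling.

MAX_TOKENS = 512  # typical transformer input limit noted in the abstract

def chunk_tokens(tokens, size=MAX_TOKENS):
    """Split a token list into consecutive chunks of at most `size` tokens."""
    return [tokens[i:i + size] for i in range(0, len(tokens), size)]

def fuse_predictions(chunk_probs):
    """Mean-pool class probabilities across chunks (one simple fusing rule)."""
    n_classes = len(chunk_probs[0])
    return [sum(p[c] for p in chunk_probs) / len(chunk_probs)
            for c in range(n_classes)]

def classify_chunk(chunk):
    """Toy stand-in for a transformer classifier head (assumption):
    scores a chunk by the fraction of 'good' tokens it contains."""
    positive = sum(1 for tok in chunk if tok == "good")
    p = positive / max(len(chunk), 1)
    return [1.0 - p, p]  # [P(negative), P(positive)]

if __name__ == "__main__":
    document = ["good"] * 600 + ["neutral"] * 600  # longer than 512 tokens
    chunks = chunk_tokens(document)
    fused = fuse_predictions([classify_chunk(c) for c in chunks])
    print(fused)
```

A real system would replace `classify_chunk` with per-chunk transformer inference; mean pooling is just one fusing rule, with max pooling or a learned combiner as common alternatives.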

Item Type:
Monograph (Other)
Additional Information:
Accepted in RANLP 2023
Uncontrolled Keywords:
Research Output Funding: not funded
Subjects:
cs.CL
ID Code:
220372
Deposited By:
Deposited On:
28 May 2024 14:45
Refereed?:
No
Published?:
Published
Last Modified:
15 Jul 2024 08:02