
Challenges in Domain-Specific Abstractive Summarization and How to Overcome Them

Authors: Anum Afzal 1; Juraj Vladika 1; Daniel Braun 2 and Florian Matthes 1

Affiliations: 1 Department of Computer Science, Technical University of Munich, Boltzmannstrasse 3, 85748 Garching bei Muenchen, Germany; 2 Department of High-tech Business and Entrepreneurship, University of Twente, Hallenweg 17, 7522 NH Enschede, The Netherlands

Keyword(s): Text Summarization, Natural Language Processing, Efficient Transformers, Model Hallucination, Natural Language Generation Evaluation, Domain-Adaptation of Language Models.

Abstract: Large Language Models perform well on general-purpose data and many Natural Language Processing tasks. However, they show several limitations when used for a task such as domain-specific abstractive text summarization. This paper identifies three of those limitations as research problems in the context of abstractive text summarization: 1) the quadratic complexity of transformer-based models with respect to the input text length; 2) model hallucination, a model's tendency to generate factually incorrect text; and 3) domain shift, which occurs when the distributions of the model's training and test corpora differ. Along with a discussion of the open research questions, this paper also assesses existing state-of-the-art techniques relevant to domain-specific text summarization that could address these research gaps.
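The quadratic complexity mentioned in point 1 arises because self-attention computes a score for every pair of input tokens, so the score matrix grows as n² in the sequence length n. A minimal single-head sketch (an illustration only, not the paper's implementation; weight matrices and dimensions are arbitrary):

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention.

    The (n, n) score matrix below is what makes transformer
    attention quadratic in the sequence length n.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)  # shape (n, n): O(n^2) time and memory
    # Numerically stable softmax over each row of scores.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # shape (n, d)

# Doubling the input length quadruples the score matrix,
# which is why long domain-specific documents are problematic.
rng = np.random.default_rng(0)
d = 16
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
X = rng.standard_normal((8, d))  # 8 tokens, embedding size 16
out = self_attention(X, Wq, Wk, Wv)
```

Efficient-transformer variants (one of the directions the paper surveys) replace this dense n x n interaction with sparse or low-rank approximations to bring the cost below quadratic.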

CC BY-NC-ND 4.0


Paper citation in several formats:
Afzal, A.; Vladika, J.; Braun, D. and Matthes, F. (2023). Challenges in Domain-Specific Abstractive Summarization and How to Overcome Them. In Proceedings of the 15th International Conference on Agents and Artificial Intelligence - Volume 3: ICAART; ISBN 978-989-758-623-1; ISSN 2184-433X, SciTePress, pages 682-689. DOI: 10.5220/0011744500003393

@conference{icaart23,
author={Anum Afzal and Juraj Vladika and Daniel Braun and Florian Matthes},
title={Challenges in Domain-Specific Abstractive Summarization and How to Overcome Them},
booktitle={Proceedings of the 15th International Conference on Agents and Artificial Intelligence - Volume 3: ICAART},
year={2023},
pages={682-689},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0011744500003393},
isbn={978-989-758-623-1},
issn={2184-433X},
}

TY - CONF

JO - Proceedings of the 15th International Conference on Agents and Artificial Intelligence - Volume 3: ICAART
TI - Challenges in Domain-Specific Abstractive Summarization and How to Overcome Them
SN - 978-989-758-623-1
IS - 2184-433X
AU - Afzal, A.
AU - Vladika, J.
AU - Braun, D.
AU - Matthes, F.
PY - 2023
SP - 682
EP - 689
DO - 10.5220/0011744500003393
PB - SciTePress
ER -