Harnessing Mixture of Experts for Enhanced Abstractive Text Summarization: A Leap Towards Scalable and Efficient NLP Models
Pramod Patil, Akanksha Songire
2025
Abstract
The rapidly growing area of Abstractive Text Summarization (ATS) in Natural Language Processing (NLP) marks a shift from traditional extractive methods, producing more coherent, human-like summaries by generating novel phrases and sentences rather than copying source text. Despite many recent advances, ATS still poses distinct challenges and opportunities in NLP: current models struggle with content preservation, factual inconsistency, and limited semantic understanding. This paper presents an implementation of ATS that adopts the Mixture of Experts (MoE) model, which improves efficiency on complex tasks by combining multiple small expert models and activating only the necessary ones while processing data. This method enhances both the quality of the content and the relevance of the produced output. Experiments show that applying the MoE approach within an ATS framework improves content accuracy and opens the way to more effective and efficient NLP models.
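The core MoE mechanism the abstract refers to — a router that activates only a few expert sub-models per input — can be sketched roughly as follows. This is a generic top-k gating illustration, not the authors' implementation; all names, sizes, and weights here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes -- not taken from the paper.
NUM_EXPERTS = 4   # number of small expert models
TOP_K = 2         # experts activated per token
DIM = 8           # hidden dimension

# Each "expert" is reduced to a single weight matrix for brevity.
expert_weights = [rng.standard_normal((DIM, DIM)) * 0.1 for _ in range(NUM_EXPERTS)]
gate_weights = rng.standard_normal((DIM, NUM_EXPERTS)) * 0.1  # the router

def moe_layer(x):
    """Route one token vector x through only the top-k experts."""
    logits = x @ gate_weights                 # router scores, shape (NUM_EXPERTS,)
    top = np.argsort(logits)[-TOP_K:]         # indices of the k highest-scoring experts
    # Softmax over the selected experts' scores only.
    probs = np.exp(logits[top] - logits[top].max())
    probs /= probs.sum()
    # Weighted sum of the chosen experts' outputs; the other experts stay idle,
    # which is where the computational saving comes from.
    return sum(p * (x @ expert_weights[i]) for p, i in zip(probs, top))

token = rng.standard_normal(DIM)
out = moe_layer(token)
print(out.shape)  # (8,)
```

In a full summarization model this layer would replace the feed-forward block inside each Transformer layer, so capacity grows with the number of experts while per-token compute stays roughly constant.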
Paper Citation
in Harvard Style
Patil P. and Songire A. (2025). Harnessing Mixture of Experts for Enhanced Abstractive Text Summarization: A Leap Towards Scalable and Efficient NLP Models. In Proceedings of the 3rd International Conference on Futuristic Technology - Volume 3: INCOFT; ISBN 978-989-758-763-4, SciTePress, pages 420-426. DOI: 10.5220/0013620600004664
in Bibtex Style
@conference{incoft25,
author={Pramod Patil and Akanksha Songire},
title={Harnessing Mixture of Experts for Enhanced Abstractive Text Summarization: A Leap Towards Scalable and Efficient NLP Models},
booktitle={Proceedings of the 3rd International Conference on Futuristic Technology - Volume 3: INCOFT},
year={2025},
pages={420--426},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0013620600004664},
isbn={978-989-758-763-4},
}
in EndNote Style
TY - CONF
JO - Proceedings of the 3rd International Conference on Futuristic Technology - Volume 3: INCOFT
TI - Harnessing Mixture of Experts for Enhanced Abstractive Text Summarization: A Leap Towards Scalable and Efficient NLP Models
SN - 978-989-758-763-4
AU - Patil P.
AU - Songire A.
PY - 2025
SP - 420
EP - 426
DO - 10.5220/0013620600004664
PB - SciTePress
ER -