Progress, Applications, and Challenges of Large Language Models
Yuxiang Li
2025
Abstract
Large Language Models (LLMs) have significantly reshaped the Natural Language Processing (NLP) landscape, demonstrating unprecedented capabilities in text generation, machine translation, and knowledge extraction. These models leverage massive datasets and advanced neural architectures to achieve high levels of fluency and coherence. This paper provides a comprehensive review of recent advancements in LLMs, analyzing key technological improvements, diverse applications, and persistent challenges. The evolution of model architectures, fine-tuning techniques, and data processing strategies is discussed, along with an evaluation of how LLMs enhance automation and decision-making across industries. While LLMs offer transformative benefits, challenges related to interpretability, ethical concerns, and computational constraints remain pressing. The increasing size of these models raises concerns about efficiency, energy consumption, and accessibility, prompting research into more sustainable AI development. Additionally, addressing biases and ensuring the responsible use of LLMs are crucial for their broader adoption in sensitive domains such as healthcare, finance, and law. This review highlights potential directions for future research, emphasizing the need for efficient, transparent, and responsible AI deployment while balancing innovation with ethical considerations.
Paper Citation
in Harvard Style
Li Y. (2025). Progress, Applications, and Challenges of Large Language Models. In Proceedings of the 2nd International Conference on Data Science and Engineering - Volume 1: ICDSE; ISBN 978-989-758-765-8, SciTePress, pages 645-648. DOI: 10.5220/0013703400004670
in Bibtex Style
@conference{icdse25,
author={Yuxiang Li},
title={Progress, Applications, and Challenges of Large Language Models},
booktitle={Proceedings of the 2nd International Conference on Data Science and Engineering - Volume 1: ICDSE},
year={2025},
pages={645-648},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0013703400004670},
isbn={978-989-758-765-8},
}
in EndNote Style
TY - CONF
JO - Proceedings of the 2nd International Conference on Data Science and Engineering - Volume 1: ICDSE
TI - Progress, Applications, and Challenges of Large Language Models
SN - 978-989-758-765-8
AU - Li Y.
PY - 2025
SP - 645
EP - 648
DO - 10.5220/0013703400004670
PB - SciTePress