Optimization Strategies for Large Model Training in Distributed Cloud Computing Environment

Dayi Wang, Zhe He

2025

Abstract

In the context of large model processing, distributed cloud computing environments have improved, but the complexity of platform processing has also increased. This paper analyzes the distributed cloud computing environment for large model training and proposes optimization strategies. Based on an in-depth analysis of large models, distributed environments, and data shutdown mechanisms, strategies such as efficient allocation of available resources, large model compression, and training optimization are proposed. Practical application shows that these methods support effective large model training and achieve good results: after optimization, training time is shortened by more than 50%, GPU utilization rises to 95%, the distributed environment is reduced by 50%, and overall performance improves significantly. The conclusion is that this strategy can greatly improve training efficiency in distributed cloud environments, bring the advantages of large models into full play, and suit a variety of application scenarios.
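The abstract does not describe how the large model compression step is implemented. As a purely illustrative sketch (an assumption, not the authors' method), the snippet below applies post-training dynamic quantization to the Linear layers of a stand-in PyTorch model, one common way to shrink a large model's weight storage by roughly 4x; the layer sizes are placeholders, since the abstract does not name the actual model.

# Illustrative sketch only: the paper's actual compression method is not
# specified in the abstract; this shows one common alternative.
import torch
import torch.nn as nn

# Stand-in "large" model (placeholder architecture, not from the paper).
model = nn.Sequential(
    nn.Linear(4096, 4096),
    nn.ReLU(),
    nn.Linear(4096, 1024),
)

# Post-training dynamic quantization: Linear weights are stored as int8,
# activations are quantized on the fly at inference time.
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

print(quantized)  # Linear layers are replaced by DynamicQuantizedLinear modules

Dynamic quantization trades a small accuracy cost for a much smaller memory and bandwidth footprint, which is one way a compression step can ease pressure on GPUs and interconnects in a distributed training setup.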

Paper Citation


in Harvard Style

Wang D. and He Z. (2025). Optimization Strategies for Large Model Training in Distributed Cloud Computing Environment. In Proceedings of the 3rd International Conference on Futuristic Technology - Volume 1: INCOFT; ISBN 978-989-758-763-4, SciTePress, pages 97-102. DOI: 10.5220/0013536100004664


in BibTeX Style

@conference{incoft25,
author={Dayi Wang and Zhe He},
title={Optimization Strategies for Large Model Training in Distributed Cloud Computing Environment},
booktitle={Proceedings of the 3rd International Conference on Futuristic Technology - Volume 1: INCOFT},
year={2025},
pages={97-102},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0013536100004664},
isbn={978-989-758-763-4},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 3rd International Conference on Futuristic Technology - Volume 1: INCOFT
TI - Optimization Strategies for Large Model Training in Distributed Cloud Computing Environment
SN - 978-989-758-763-4
AU - Wang D.
AU - He Z.
PY - 2025
SP - 97
EP - 102
DO - 10.5220/0013536100004664
PB - SciTePress