Enhancing BERT with Prompt Tuning and Denoising Disentangled Attention for Robust Text Classification

Qingjun Mao

2024

Abstract

With the increasing complexity of internet data, expanding model parameters and fine-tuning entire models have become inefficient, particularly under limited computational resources. This paper proposes a novel model, Prompt-Enhanced Bidirectional Encoder Representations from Transformers (BERT) with a Denoising Disentangled Attention Layer (PE-BERT-DDAL), together with an efficient fine-tuning strategy, to address these challenges. The approach enhances BERT's robustness on complex data while reducing computational cost. Specifically, the study introduces a dynamic deep prompt tuning technique within BERT and incorporates a Denoising Disentangled Attention Layer (DDAL) that denoises the representations and models content-position interactions. Deep prompt tuning enables the model to adapt rapidly to downstream tasks, while DDAL strengthens content comprehension. Comparative experiments are conducted on three datasets: Twitter entity sentiment analysis, fake news detection, and email spam detection. The results show that PE-BERT-DDAL outperforms the baseline models in both accuracy and loss reduction, achieving an average peak accuracy of 0.9059 and improving accuracy by 6.75%, 5.93%, and 8.97% on the three datasets, respectively. These findings validate the effectiveness of PE-BERT-DDAL, demonstrating rapid task adaptation and robustness in complex data environments.
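The abstract names two mechanisms: deep prompt tuning, in which trainable prompt vectors are injected at every encoder layer while the BERT backbone stays frozen, and a disentangled attention whose scores combine content-to-content, content-to-position, and position-to-content terms. The sketch below is a minimal PyTorch rendering of both ideas, not the paper's implementation: the class and parameter names (DenoisingDisentangledAttention, PromptedBertClassifier, prompt_len, max_rel_pos) are invented for illustration, the prompt injection follows the P-Tuning v2 pattern of refreshing the prompt slots at each layer, and, because the abstract does not spell out the denoising mechanism, it is approximated here by dropout on the attention weights.

    # Minimal sketch of the two mechanisms the abstract describes. All names,
    # sizes, and the dropout-based "denoising" are illustrative assumptions,
    # not the paper's released implementation.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F
    from transformers import AutoModel


    class DenoisingDisentangledAttention(nn.Module):
        """Disentangled attention (DeBERTa-style): scores sum content-to-content,
        content-to-position, and position-to-content terms over clipped relative
        positions. Dropout on the attention weights stands in for the paper's
        unspecified denoising step; the padding mask is omitted for brevity."""

        def __init__(self, dim=768, heads=12, max_rel_pos=128, dropout=0.1):
            super().__init__()
            self.heads, self.dk = heads, dim // heads
            self.q = nn.Linear(dim, dim)
            self.k = nn.Linear(dim, dim)
            self.v = nn.Linear(dim, dim)
            self.out = nn.Linear(dim, dim)
            self.rel_emb = nn.Embedding(2 * max_rel_pos + 1, dim)  # relative-position table
            self.max_rel_pos = max_rel_pos
            self.drop = nn.Dropout(dropout)

        def forward(self, x):
            B, T, D = x.shape
            split = lambda t: t.view(B, T, self.heads, self.dk).transpose(1, 2)
            q, k, v = split(self.q(x)), split(self.k(x)), split(self.v(x))  # (B, H, T, dk)

            # Relative positions j - i, clipped to [-max_rel_pos, max_rel_pos].
            pos = torch.arange(T, device=x.device)
            rel = (pos[None, :] - pos[:, None]).clamp(-self.max_rel_pos, self.max_rel_pos)
            p = self.rel_emb(rel + self.max_rel_pos)            # (T, T, D)
            p = p.view(T, T, self.heads, self.dk)               # (T, T, H, dk)

            c2c = q @ k.transpose(-2, -1)                       # content -> content
            c2p = torch.einsum("bhid,ijhd->bhij", q, p)         # content -> position
            p2c = torch.einsum("bhjd,ijhd->bhij", k, p)         # position -> content
            scores = (c2c + c2p + p2c) / (3 * self.dk) ** 0.5   # scale for three terms
            attn = self.drop(F.softmax(scores, dim=-1))         # dropout as "denoising"
            y = (attn @ v).transpose(1, 2).reshape(B, T, D)
            return self.out(y)


    class PromptedBertClassifier(nn.Module):
        """Deep prompt tuning sketch: a frozen BERT backbone with trainable
        prompt vectors prepended to the hidden states at every encoder layer
        (P-Tuning v2 pattern), followed by the DDAL block and a linear head."""

        def __init__(self, num_labels, prompt_len=16, model_name="bert-base-uncased"):
            super().__init__()
            self.bert = AutoModel.from_pretrained(model_name)
            for param in self.bert.parameters():
                param.requires_grad = False                     # freeze the backbone
            cfg = self.bert.config
            self.prompt_len = prompt_len
            self.prompts = nn.Parameter(                        # one prompt block per layer
                torch.randn(cfg.num_hidden_layers, prompt_len, cfg.hidden_size) * 0.02)
            self.ddal = DenoisingDisentangledAttention(dim=cfg.hidden_size,
                                                       heads=cfg.num_attention_heads)
            self.head = nn.Linear(cfg.hidden_size, num_labels)

        def forward(self, input_ids, attention_mask):
            h = self.bert.embeddings(input_ids)                 # (B, T, D)
            B = h.size(0)
            # Extend the padding mask to cover the prepended prompt slots.
            pad = torch.ones(B, self.prompt_len, dtype=h.dtype, device=h.device)
            mask = torch.cat([pad, attention_mask.to(h.dtype)], dim=1)
            ext = (1.0 - mask)[:, None, None, :] * torch.finfo(h.dtype).min
            for i, layer in enumerate(self.bert.encoder.layer):
                prompt = self.prompts[i].unsqueeze(0).expand(B, -1, -1)
                body = h if i == 0 else h[:, self.prompt_len:]  # replace previous prompts
                h = layer(torch.cat([prompt, body], dim=1), attention_mask=ext)[0]
            h = self.ddal(h)                                    # denoised content/position mixing
            return self.head(h[:, self.prompt_len])             # [CLS] sits right after prompts

Under these assumptions only the prompts, the DDAL block, and the classification head receive gradients, which is what makes the fine-tuning strategy parameter-efficient: the frozen backbone can be shared across downstream tasks while each task trains only a small set of prompt vectors.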

Paper Citation


in Harvard Style

Mao, Q. (2024). Enhancing BERT with Prompt Tuning and Denoising Disentangled Attention for Robust Text Classification. In Proceedings of the 2nd International Conference on Data Analysis and Machine Learning - Volume 1: DAML; ISBN 978-989-758-754-2, SciTePress, pages 438-445. DOI: 10.5220/0013525700004619


in Bibtex Style

@conference{daml24,
author={Qingjun Mao},
title={Enhancing BERT with Prompt Tuning and Denoising Disentangled Attention for Robust Text Classification},
booktitle={Proceedings of the 2nd International Conference on Data Analysis and Machine Learning - Volume 1: DAML},
year={2024},
pages={438-445},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0013525700004619},
isbn={978-989-758-754-2},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 2nd International Conference on Data Analysis and Machine Learning - Volume 1: DAML
TI - Enhancing BERT with Prompt Tuning and Denoising Disentangled Attention for Robust Text Classification
SN - 978-989-758-754-2
AU - Mao Q.
PY - 2024
SP - 438
EP - 445
DO - 10.5220/0013525700004619
PB - SciTePress
ER -