DiT-Head: High Resolution Talking Head Synthesis Using Diffusion Transformers

Aaron Mir, Eduardo Alonso, Esther Mondragón

2024

Abstract

We propose a novel talking head synthesis pipeline called "DiT-Head", which is based on diffusion transformers and uses audio as a condition to drive the denoising process of a diffusion model. Our method is scalable and can generalise to multiple identities while producing high-quality results. We train and evaluate our proposed approach and compare it against existing talking head synthesis methods. We show that our model can compete with these methods in terms of visual quality and lip-sync accuracy. Our results highlight the potential of our proposed approach for a wide range of applications, including virtual assistants, entertainment, and education. For a video demonstration of results and our user study, please refer to our supplementary material.
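
The sketch below illustrates the general idea of audio-conditioned denoising with a Diffusion Transformer (DiT)-style block, as described in the abstract. It is a minimal, hypothetical example only, not the authors' implementation: the module names, dimensions, and the choice of cross-attention over audio tokens as the conditioning mechanism are assumptions made for illustration.

# Minimal sketch of one audio-conditioned DiT-style block (PyTorch).
# NOT the authors' code: names, sizes, and the cross-attention conditioning
# mechanism are assumptions for illustration only.
import torch
import torch.nn as nn


class AudioConditionedDiTBlock(nn.Module):
    """Denoises latent patches conditioned on a diffusion timestep and audio features."""

    def __init__(self, dim: int = 384, heads: int = 6, audio_dim: int = 256):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.self_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm2 = nn.LayerNorm(dim)
        self.audio_proj = nn.Linear(audio_dim, dim)  # map audio features to model width
        self.cross_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm3 = nn.LayerNorm(dim)
        self.mlp = nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
        self.time_mlp = nn.Sequential(nn.Linear(dim, dim), nn.SiLU(), nn.Linear(dim, dim))

    def forward(self, x, audio, t_emb):
        # x: (B, N, dim) noised latent patches; audio: (B, T, audio_dim); t_emb: (B, dim)
        x = x + self.time_mlp(t_emb).unsqueeze(1)                  # inject diffusion timestep
        h = self.norm1(x)
        x = x + self.self_attn(h, h, h, need_weights=False)[0]     # spatial self-attention
        a = self.audio_proj(audio)
        h = self.norm2(x)
        x = x + self.cross_attn(h, a, a, need_weights=False)[0]    # attend to audio tokens
        return x + self.mlp(self.norm3(x))


if __name__ == "__main__":
    block = AudioConditionedDiTBlock()
    x = torch.randn(2, 196, 384)       # e.g. 14x14 grid of latent patches
    audio = torch.randn(2, 50, 256)    # e.g. a short window of audio features
    t_emb = torch.randn(2, 384)        # timestep embedding
    print(block(x, audio, t_emb).shape)  # torch.Size([2, 196, 384])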

Paper Citation


in Harvard Style

Mir A., Alonso E. and Mondragón E. (2024). DiT-Head: High Resolution Talking Head Synthesis Using Diffusion Transformers. In Proceedings of the 16th International Conference on Agents and Artificial Intelligence - Volume 3: ICAART; ISBN 978-989-758-680-4, SciTePress, pages 159-169. DOI: 10.5220/0012312200003636


in BibTeX Style

@conference{icaart24,
author={Aaron Mir and Eduardo Alonso and Esther Mondragón},
title={DiT-Head: High Resolution Talking Head Synthesis Using Diffusion Transformers},
booktitle={Proceedings of the 16th International Conference on Agents and Artificial Intelligence - Volume 3: ICAART},
year={2024},
pages={159-169},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0012312200003636},
isbn={978-989-758-680-4},
}


in EndNote Style

TY - CONF

JO - Proceedings of the 16th International Conference on Agents and Artificial Intelligence - Volume 3: ICAART
TI - DiT-Head: High Resolution Talking Head Synthesis Using Diffusion Transformers
SN - 978-989-758-680-4
AU - Mir A.
AU - Alonso E.
AU - Mondragón E.
PY - 2024
SP - 159
EP - 169
DO - 10.5220/0012312200003636
PB - SciTePress