Transposition Based Blendshape Direct Manipulation

Authors: Ozan Cetinaslan ¹; John Lewis ² and Verónica Orvalho ¹

Affiliations: ¹ Universidade do Porto, Portugal; ² Victoria University, New Zealand

Keyword(s): Blendshape, Facial Expression, Deformation, 3D Mesh, Facial Animation.

Related Ontology Subjects/Areas/Topics: Animation Algorithms and Techniques ; Animation and Simulation ; Computer Vision, Visualization and Computer Graphics ; Facial Animation

Abstract: The blendshape approach is a predominant technique for creating high-quality facial animation. Facial poses are generated by manually adjusting the corresponding weight parameters for each key-frame through traditional slider interfaces. However, authoring a production-quality facial animation with this process requires time-consuming, labor-intensive and iterative work by the artists. Direct manipulation interfaces address this problem with a “pin-and-drag” operation inspired by inverse kinematics approaches. The mathematical frameworks of direct manipulation techniques are mostly based on the pseudo-inverse of the blendshape matrix, which contains all target shapes’ vertex positions. However, pseudo-inverse approaches often give unexpected results during facial pose editing because of their unstable behavior. To this end, we propose a transposition approach that enhances direct manipulation by reducing unexpected movements during weight editing. Our approach extracts the deformation directions from the blendshape matrix and directly maps the sparse constrained point movements to the extracted directions. Our experiments show that, compared with pseudo-inverse based formulations, the transposition based framework produces smoother and more reliable facial poses during weight editing. The proposed approach improves the fidelity of the generated facial expressions by keeping spurious movements to a minimum. It is robust, efficient, easy to implement and operates on any blendshape model.
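A minimal NumPy sketch contrasting the two weight-editing strategies the abstract describes: solving for a weight update through a (regularized) pseudo-inverse of the constrained blendshape matrix versus mapping the dragged vertex's displacement through the matrix transpose. The function names, the regularization weight mu and the step scale alpha are illustrative assumptions based only on the abstract, not the paper's exact formulation.

import numpy as np

# B_c : (3k x n) blendshape delta matrix restricted to the k pinned vertices
#       (one column per target shape); dx : (3k,) stacked drag displacement.

def weights_pseudoinverse(B_c, dx, mu=0.1):
    # Regularized pseudo-inverse solve: min ||B_c dw - dx||^2 + mu ||dw||^2.
    n = B_c.shape[1]
    return np.linalg.solve(B_c.T @ B_c + mu * np.eye(n), B_c.T @ dx)

def weights_transpose(B_c, dx, alpha=1.0):
    # Transposition-style update: project the constrained displacement onto
    # each target's deformation direction (a column of B_c) instead of
    # inverting the often ill-conditioned system.
    return alpha * (B_c.T @ dx)

# Toy usage: 4 vertices, 3 target shapes, pin vertex 0 and drag it upward.
rng = np.random.default_rng(0)
B = rng.normal(size=(12, 3))            # full delta matrix (3*4 x 3)
rows = [0, 1, 2]                        # x, y, z rows of the pinned vertex
B_c = B[rows, :]
dx = np.array([0.0, 0.05, 0.0])         # small upward drag of the pinned vertex
w = np.clip(weights_transpose(B_c, dx), 0.0, 1.0)   # weights usually clamped to [0, 1]

Either update can then be added to the current weight vector and the edited pose reconstructed as f = f0 + B w.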

License: CC BY-NC-ND 4.0

Paper citation in several formats:
Cetinaslan, O.; Lewis, J. and Orvalho, V. (2017). Transposition Based Blendshape Direct Manipulation. In Proceedings of the 12th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2017) - GRAPP; ISBN 978-989-758-224-0; ISSN 2184-4321, SciTePress, pages 105-115. DOI: 10.5220/0006131901050115

@conference{grapp17,
author={Ozan Cetinaslan and John Lewis and Verónica Orvalho},
title={Transposition Based Blendshape Direct Manipulation},
booktitle={Proceedings of the 12th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2017) - GRAPP},
year={2017},
pages={105-115},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0006131901050115},
isbn={978-989-758-224-0},
issn={2184-4321},
}

TY - CONF

JO - Proceedings of the 12th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2017) - GRAPP
TI - Transposition Based Blendshape Direct Manipulation
SN - 978-989-758-224-0
IS - 2184-4321
AU - Cetinaslan, O.
AU - Lewis, J.
AU - Orvalho, V.
PY - 2017
SP - 105
EP - 115
DO - 10.5220/0006131901050115
PB - SciTePress