Copyrights notice
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Yang YANG, Zejian YUAN, Nanning ZHENG, Yuehu LIU, Lei YANG, Yoshifumi NISHIO, "Interactive Facial-Geometric-Feature Animation for Generating Expressions of Novel Faces," IEICE TRANSACTIONS on Information and Systems, vol. E94-D, no. 5, pp. 1099-1108, May 2011, doi: 10.1587/transinf.E94.D.1099.
Abstract: This paper introduces an interactive expression editing system that allows users to design facial expressions easily. Currently, popular example-based methods construct face models from examples of the target face. The shortcoming of these methods is that they cannot create expressions for novel faces: target faces not previously recorded in the database. We propose a solution to overcome this limitation: an interactive facial-geometric-feature animation system for generating expressions of novel faces. Our system is easy to use: by click-dragging control points on the target face displayed on the computer screen, unique expressions are generated automatically. To guarantee natural animation results, our animation model employs prior knowledge learned from the expressions of various individuals. One prior is learned from motion vector fields to guarantee effective facial motions; another is learned from the facial shape space to ensure that the result has a realistic facial shape. The interactive animation problem is formulated in a maximum a posteriori (MAP) framework that searches for optimal results by combining the priors with user-defined constraints. We also extend the Motion Propagation (MP) algorithm to infer facial motions for novel target faces from a subset of the control points. Experimental results on different facial animations demonstrate the effectiveness of the proposed method. Moreover, we exhibit one application of our system, in which users interactively create expressions for facial sketches.
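As an illustrative sketch of the MAP formulation described in the abstract (the notation here is ours, not taken from the paper), let m denote the facial motion field, c the user's click-dragged control-point constraints, and s(m) the resulting facial shape. Assuming the two learned priors factor the motion prior p(m), the system seeks

  \hat{m} = \arg\max_{m} p(m \mid c) = \arg\max_{m} p(c \mid m)\, p(m), \qquad p(m) \propto p_{\mathrm{motion}}(m)\, p_{\mathrm{shape}}(s(m)),

where p_motion is the prior learned from motion vector fields, p_shape is the prior learned from the facial shape space, and the likelihood p(c | m) measures agreement with the user-defined constraints.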
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.E94.D.1099/_p
@ARTICLE{e94-d_5_1099,
author={Yang YANG and Zejian YUAN and Nanning ZHENG and Yuehu LIU and Lei YANG and Yoshifumi NISHIO},
journal={IEICE TRANSACTIONS on Information and Systems},
title={Interactive Facial-Geometric-Feature Animation for Generating Expressions of Novel Faces},
year={2011},
volume={E94-D},
number={5},
pages={1099-1108},
abstract={This paper introduces an interactive expression editing system that allows users to design facial expressions easily. Currently, popular example-based methods construct face models from examples of the target face. The shortcoming of these methods is that they cannot create expressions for novel faces: target faces not previously recorded in the database. We propose a solution to overcome this limitation: an interactive facial-geometric-feature animation system for generating expressions of novel faces. Our system is easy to use: by click-dragging control points on the target face displayed on the computer screen, unique expressions are generated automatically. To guarantee natural animation results, our animation model employs prior knowledge learned from the expressions of various individuals. One prior is learned from motion vector fields to guarantee effective facial motions; another is learned from the facial shape space to ensure that the result has a realistic facial shape. The interactive animation problem is formulated in a maximum a posteriori (MAP) framework that searches for optimal results by combining the priors with user-defined constraints. We also extend the Motion Propagation (MP) algorithm to infer facial motions for novel target faces from a subset of the control points. Experimental results on different facial animations demonstrate the effectiveness of the proposed method. Moreover, we exhibit one application of our system, in which users interactively create expressions for facial sketches.},
keywords={},
doi={10.1587/transinf.E94.D.1099},
ISSN={1745-1361},
month={May},
}
TY - JOUR
TI - Interactive Facial-Geometric-Feature Animation for Generating Expressions of Novel Faces
T2 - IEICE TRANSACTIONS on Information and Systems
SP - 1099
EP - 1108
AU - Yang YANG
AU - Zejian YUAN
AU - Nanning ZHENG
AU - Yuehu LIU
AU - Lei YANG
AU - Yoshifumi NISHIO
PY - 2011
DO - 10.1587/transinf.E94.D.1099
JO - IEICE TRANSACTIONS on Information and Systems
SN - 1745-1361
VL - E94-D
IS - 5
JA - IEICE TRANSACTIONS on Information and Systems
Y1 - May 2011
AB - This paper introduces an interactive expression editing system that allows users to design facial expressions easily. Currently, popular example-based methods construct face models from examples of the target face. The shortcoming of these methods is that they cannot create expressions for novel faces: target faces not previously recorded in the database. We propose a solution to overcome this limitation: an interactive facial-geometric-feature animation system for generating expressions of novel faces. Our system is easy to use: by click-dragging control points on the target face displayed on the computer screen, unique expressions are generated automatically. To guarantee natural animation results, our animation model employs prior knowledge learned from the expressions of various individuals. One prior is learned from motion vector fields to guarantee effective facial motions; another is learned from the facial shape space to ensure that the result has a realistic facial shape. The interactive animation problem is formulated in a maximum a posteriori (MAP) framework that searches for optimal results by combining the priors with user-defined constraints. We also extend the Motion Propagation (MP) algorithm to infer facial motions for novel target faces from a subset of the control points. Experimental results on different facial animations demonstrate the effectiveness of the proposed method. Moreover, we exhibit one application of our system, in which users interactively create expressions for facial sketches.
ER -