What would be the best way of doing facial expressions in jME3?
It would be nice if one could do the expression in Blender and export it to JME3.
Is it possible to transfer facial expressions between different models? I guess you always need at least some tuning per face mesh.
I describe a process for this in the jMonkeyEngine 3.0 Cookbook.
It uses regular animations (I actually went through Jaime’s animations, picked out suitable frames, and turned them into one-frame animations) together with AnimChannels tied to specific bones, to accomplish both a form of lip sync (phonemes) and facial expressions.
As long as it’s tied to the skeleton, you can transfer it between characters (provided they use the same skeleton).
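As a rough sketch of the idea (not code from the Cookbook): you partition the skeleton into channels so the mouth can snap to a phoneme pose while the brows hold an expression pose. The bone and animation names below are made up for illustration; in jME3 3.0 itself the corresponding calls would be `AnimControl.createChannel()`, `AnimChannel.addBone(...)` and `AnimChannel.setAnim(...)`.

```java
import java.util.*;

// Sketch of per-bone animation channels driving phonemes and expressions.
// Channel here stands in for jME3's AnimChannel; bone/animation names
// ("Jaw", "Phoneme_A", ...) are hypothetical examples.
public class FacialAnimSketch {

    // A channel influences only a subset of bones, so different face
    // regions can play different one-frame animations at the same time.
    static class Channel {
        final Set<String> bones = new HashSet<>();
        String currentAnim = null;

        void addBone(String bone) { bones.add(bone); }

        // Playing a one-frame animation effectively snaps its bones
        // to a fixed pose (with LoopMode.DontLoop in real jME3).
        void setAnim(String anim) { currentAnim = anim; }
    }

    final Channel mouthChannel = new Channel();
    final Channel browChannel = new Channel();

    // Hypothetical phoneme -> one-frame-animation lookup for lip sync.
    static final Map<Character, String> PHONEMES = Map.of(
            'a', "Phoneme_A",
            'o', "Phoneme_O",
            'm', "Phoneme_M");

    FacialAnimSketch() {
        mouthChannel.addBone("Jaw");
        mouthChannel.addBone("LipUpper");
        browChannel.addBone("BrowL");
        browChannel.addBone("BrowR");
    }

    void speak(char phoneme) {
        String anim = PHONEMES.get(phoneme);
        if (anim != null) mouthChannel.setAnim(anim);
    }

    void express(String expressionAnim) {
        browChannel.setAnim(expressionAnim);
    }

    public static void main(String[] args) {
        FacialAnimSketch face = new FacialAnimSketch();
        face.speak('a');              // mouth plays Phoneme_A
        face.express("Expr_Frown");   // brows play Expr_Frown independently
        System.out.println(face.mouthChannel.currentAnim); // Phoneme_A
        System.out.println(face.browChannel.currentAnim);  // Expr_Frown
    }
}
```

Because the channels only reference bone names, the same driving code works on any character rigged to the same skeleton, which is what makes the transfer between models possible.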
Sounds like a good technique. Does it require some finesse with blend weights when used with human models, though?