Animation project I am working on

Started by Freddy, December 07, 2013, 20:31:31 PM



I posted this at AiDreams, but I feel chuffed with myself so I thought I would share it here too. Here's the blurb I posted at Dreams...

Got a bit further today. I was up late last night getting my morphing code to combine morphs, as I already had single morphs working. Today I finished it off and optimised the code a little.

In my code I first load a 'neutral' model/mesh. The morph targets are the same model but in a different shape - e.g. with one eye closed or smiling. I take the difference between the neutral mesh and the morph target and add it to a buffer (also accounting for the strength of the morph). This is done for each active morph, so you end up with a compound mesh of all the morphs, which you then display.
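That accumulation step can be sketched roughly like this - a minimal Python/NumPy stand-in for the real vertex buffers, with all the names being illustrative rather than from the actual project:

```python
import numpy as np

def blend_morphs(neutral, targets, weights):
    """Blend several morph targets onto a neutral mesh.

    neutral: (N, 3) array of vertex positions
    targets: list of (N, 3) arrays, same vertex order as neutral
    weights: list of floats, strength of each morph
    """
    result = neutral.copy()
    for target, w in zip(targets, weights):
        # Difference between morph target and neutral mesh, scaled by
        # the morph's strength, accumulated into the compound mesh.
        result += w * (target - neutral)
    return result

# Toy example: a 2-vertex "mesh" with two morphs at half strength each.
neutral = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
smile   = np.array([[0.0, 1.0, 0.0], [1.0, 0.0, 0.0]])
blink   = np.array([[0.0, 0.0, 0.0], [1.0, 1.0, 0.0]])
mesh = blend_morphs(neutral, [smile, blink], [0.5, 0.5])
```

Because each morph only contributes its *difference* from the neutral pose, morphs that move different vertices combine cleanly.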

I have manual sliders off screen to the left (you can see me mousing them about). So this morphing is being done live. When a slider changes, it changes the weight of the morph - in other words, its level of influence on the neutral mesh.

Cracking the multiple morphs was a really good feeling. I see now how much work has gone into things like Haptek. They were ahead of the game a long time back. But finally years later I have maybe learnt enough to start building my own animation and morphing system.  8)



I noticed that the morph target limits are such that you can create facial expressions that exceed realism, making the head a little bit creepy. Not a complaint by any stretch of the imagination here... Just an observation. :) Like Syber said, keep it up! :thumbsup:
Safe, Reliable Insanity, Since 1961!


Smoke me a Kipper I'll be back for breakfast - Ace Rimmer


Thanks chaps, that makes me smile as it took me ages to get working and to understand how to do it.

Dave, yes I know what you mean, this was simply an early demonstration of combining facial expressions. Later on I will be adding limits, fade-ins and fade-outs, as well as durations. These extras should prevent a smile from being fully combined with a grimace, for example.

In my demo I was combining an angry expression fully with a smile, which is why the mouth went to such extremes. With the idea I envisage this should not happen as a smile will be fading out as she goes into anger, which will fade in. So you won't get two fully weighted morphs at the same time.
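The fade-out/fade-in idea can be sketched as a simple crossfade of the two morph weights - a hypothetical illustration, not the project's actual code:

```python
def crossfade(t, duration):
    """Linear crossfade between two expressions.

    Returns (outgoing_weight, incoming_weight) at time t into the
    transition. The weights always sum to 1.0, so the two morphs can
    never both be fully applied at once.
    """
    k = min(max(t / duration, 0.0), 1.0)
    return 1.0 - k, k

# A quarter of the way through a 1-second smile -> anger transition:
smile_w, anger_w = crossfade(0.25, 1.0)
```

With weights constrained like this, the mouth never receives a fully weighted smile and a fully weighted grimace at the same time, which is what pushed it to extremes in the demo.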


When you're done, will this respond to phonemes from, say, a TTS engine, thus adding voice capabilities? That would be rather cool. :)


Yes it will Dave, that's the grand plan. I am working on the in-between morphs at the moment, sometimes called tweening in animation terms. This is being done so morphs change more smoothly between each other.

The reason I am doing this is to give smoother changes in the lips when a phoneme is reached and I show the appropriate viseme.
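The tweening itself boils down to linear interpolation of a morph weight over time - a minimal sketch, with the function names and numbers chosen for illustration:

```python
def lerp(a, b, t):
    """Linearly interpolate between a and b for t in [0, 1]."""
    return a + (b - a) * t

def tween(current, target, elapsed, duration):
    """Move a morph weight smoothly from `current` toward `target`.

    `elapsed` is the time since the transition began; the weight
    arrives at `target` after `duration` seconds and stays there.
    """
    t = min(max(elapsed / duration, 0.0), 1.0)
    return lerp(current, target, t)

# Halfway through an 80 ms transition from mouth closed (0.0)
# to a fully weighted viseme (1.0):
w = tween(0.0, 1.0, 0.04, 0.08)
```

Evaluating this every frame is what turns an abrupt viseme flip into a smooth mouth movement.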

I've already done tweening before in an XNA project, but this was based on someone else's code.  This time around it's all my code  8)

Here's where I got to a couple of years back. It's OK, but I did not really know what I was doing then - it only flips through visemes, has no emotion morphs and did not blend morphs either:


OK here we go, linked to an ALICE bot and with smooth lip-sync. I was up till about 3:00 AM working on her lips  :o

It was funny, because I had my frame rate set really slow so that I did not push it too much. I had it at about 10 fps and it was terrible and jerky, and I was worried .NET just could not handle it. Much to my relief, when I tried upping the frame rate to 30 fps and then 50 fps, it all magically slotted into place. A little more tinkering and I was pleased with the results. For real-time animation it's not too bad.


That lip sync's pretty good.  :thumbsup:
Can you delay the sound slightly or is the delay due to processing?

Shame she's got no rhythm.  ;D
"I cannot remember the books I've read any more than the meals I have eaten; even so, they have made me."

Ralph Waldo Emerson


Thanks mate :) Yeah, no rhythm, but she knows the good songs!

Delay the sound? Hmm, do the lips look too slow for the sound to you? It's certainly something I can tweak. I put settings in for fading in and out of morphs.

Hmm, how to explain it... well, for a mouth movement there is a fade-in and a fade-out duration. I currently have each set at 80 milliseconds, so that's 160 ms per mouth shape change, or about 6 phonemes a second. At a rough guesstimate I think that's the area to be in.
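For anyone following along, the arithmetic behind that figure is just:

```python
fade_in_ms = 80
fade_out_ms = 80

# One complete mouth-shape change is a fade-in followed by a fade-out.
per_change_ms = fade_in_ms + fade_out_ms       # 160 ms per change

# How many phoneme/viseme changes fit into one second of speech.
phonemes_per_sec = 1000 / per_change_ms        # ~6 a second
```

Shortening the fade durations via the slider raises the achievable phoneme rate, at the cost of snappier, less smooth mouth movement.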

I have a slider where I can increase or decrease this duration, so you can easily find the sweet spot or as good as. That's just where I left it last night after a marathon coding day.


Well I for one think it was time well spent. Kudos, Freddy! :thumbsup: