...
The easiest way to use your own Virtual Human is to create a Unity project using the included .unitypackages as a starting point. From there, you can add or change features on the default character, or bring in your own character and customize it using the existing one for reference.
Smartbody initialization
- SmartbodyInit class (Attached to SmartbodyManager gameobject)
- asset paths
- joint mappings
- mapped skeletons/motions
- SmartbodyCharacterInit class (Attached to each character gameobject)
- skeleton name (.sk)
- voice type (prerecorded audio or TTS)
- voice "code" (path to audio files or TTS voice name)
- backup voice type and backup "code" (if a prerecorded audio file is not found, you can use TTS as a backup)
- starting posture
- locomotion information
- SmartbodyFaceDefinition class (Attached to each character gameobject)
- defines visemes and facial expressions for the character.
- visemes and facial expressions are single-pose animations (.skm) used for blended lip-sync and expressions.
- neutral pose, action units and visemes
- SoundNode gameobject
- Empty gameobject named 'SoundNode' attached as a child of the character's gameobject
- Attach a Sound Source script to the SoundNode gameobject
- Manually position the SoundNode gameobject where you want the character's speech to originate (e.g., the character's mouth)
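As an illustration, the SoundNode setup described above can also be done from script. This is a minimal sketch, not the toolkit's own code: it assumes Unity's standard AudioSource component stands in for the Sound Source script mentioned above, and the mouth offset used here is a placeholder value you would adjust per character.

```csharp
using UnityEngine;

// Hypothetical helper: attach this to the character's root gameobject.
// It creates the 'SoundNode' child, positions it near the mouth, and
// adds an audio source so speech originates from that point.
public class SoundNodeSetupExample : MonoBehaviour
{
    void Start()
    {
        // Empty child gameobject named 'SoundNode' under the character.
        GameObject soundNode = new GameObject("SoundNode");
        soundNode.transform.SetParent(transform, false);

        // Placeholder local offset approximating the mouth position;
        // tune this for your character's skeleton.
        soundNode.transform.localPosition = new Vector3(0f, 1.6f, 0.1f);

        // Audio component so the character's speech plays from the SoundNode.
        soundNode.AddComponent<AudioSource>();
    }
}
```

In the editor-based workflow described above you would instead create the child gameobject and attach the component by hand; the script form is just a convenience when instantiating characters at runtime.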
...