Avatar Creation

For true 3D serious game and simulation development using Unity 3D, avatars need to be created to represent non-player characters within scenes.

Although research findings on the importance of fidelity in immersive environments present conflicting conclusions, our current position is that high-fidelity avatars are important in health simulations, as avatars may need to reflect subtle changes in a person’s condition.

Here is a typical breakdown of the process our 3D team follows to create real-time 3D avatars.

Step 1: Photographic reference and scan of actor’s initial pose

Once a character has been decided upon, the team begins by collecting and taking photographic reference material. An actor is then scanned in an initial pose. Watch the useful tutorial on best scanning practice using the Artec Eva.

OSIME Photo Reference

Step 2: Processing the scanned data

To clean up the scanner data, Artec Studio software is used to:

  • Align the data generated from multiple scan sweeps into one data set;
  • Slice away any unneeded data;
  • Convert the remaining data from the aligned sweeps into a mesh or model;
  • Fill any holes in the mesh;
  • Project the texture data from the original scans onto the newly created object;
  • Export the model to ZBrush for topology modification and extra painting.

Step 3: Texturing and topologising models for use with animation

OSIME Scan Touch Up

To further enhance the textures captured by the scanner, we project reference photos taken at the scene onto the model. This helps cover areas the scanner missed because they were obscured during scanning.

Once the texture has been completed, the model is optimised to maximise performance within a live 3D environment. The original 3D scan contains far too much data for real-time use. Optimisation of the scan model is achieved through a process called retopologising, part of which involves reducing the polygon count of the model. This can have a dramatic effect on performance within a live 3D game environment.

OSIME Topology
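
As a rough illustration of why the polygon count matters, here is a minimal Unity C# sketch that logs a mesh’s vertex and triangle counts against a budget. The component name and the 30,000-triangle figure are illustrative assumptions, not values from our pipeline.

    using UnityEngine;

    // Minimal sketch: reports how a mesh compares against a triangle budget.
    [RequireComponent(typeof(MeshFilter))]
    public class MeshBudgetReport : MonoBehaviour
    {
        // Illustrative triangle budget for a single real-time avatar (an assumption).
        public int triangleBudget = 30000;

        void Start()
        {
            Mesh mesh = GetComponent<MeshFilter>().sharedMesh;
            int triangles = mesh.triangles.Length / 3; // index buffer stores 3 indices per triangle
            Debug.Log(mesh.name + ": " + mesh.vertexCount + " vertices, " + triangles +
                      " triangles (" + (triangles <= triangleBudget ? "within" : "over") + " budget)");
        }
    }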

Topologising involves re-ordering the flow of polygons on a model’s surface to ensure that any artefacts have been removed and that edge loops are in sync with, and evenly spaced across, the model’s geometry. An edge loop is a set of connected edges running across a surface.

OSIME Edge Flow

It is important that models are created as cleanly as possible, and good flowing topology should be a priority, especially with organic models. The curves that sweep around the model (edge loops), when evenly spaced, help ensure neatness and reduce distortion when the model animates. Edge loops can also be used to carve out or accentuate details, particularly in areas that potentially have a lot of movement, for example creases in the skin around the eyes, on the elbows and across the face.

Step 4: Lip sync and emotion face shapes

To allow avatars to talk, lip-sync and emotion face shapes are created by duplicating the original scan mesh and morphing each copy into the position needed.

Alongside the main expressions, mouth shapes called visemes (the visible counterparts of spoken phonemes) are created, which contribute to the overall flexibility of the animation. Using these shapes, the 3D modellers can then animate the avatars to speak within the game environment, giving a much more realistic experience when a user interacts with them.

OSIME Face Shapes
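
As a rough sketch of how such face shapes can be driven at runtime, the following Unity C# component sets the weight of a single blend shape on a SkinnedMeshRenderer. The blend-shape name "viseme_AA" is a hypothetical placeholder and must match whatever name was exported from the modelling package.

    using UnityEngine;

    // Minimal sketch: drives one viseme blend shape on a SkinnedMeshRenderer.
    [RequireComponent(typeof(SkinnedMeshRenderer))]
    public class VisemeDriver : MonoBehaviour
    {
        public string visemeName = "viseme_AA";           // hypothetical shape name
        [Range(0f, 100f)] public float weight = 100f;     // Unity blend-shape weights run from 0 to 100

        SkinnedMeshRenderer face;
        int visemeIndex;

        void Start()
        {
            face = GetComponent<SkinnedMeshRenderer>();
            visemeIndex = face.sharedMesh.GetBlendShapeIndex(visemeName);
        }

        void Update()
        {
            if (visemeIndex >= 0)
                face.SetBlendShapeWeight(visemeIndex, weight);
        }
    }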

Step 5: Bones are added for movement animation

Bones are applied to the models so that movement can be added in the form of keyframed animation or motion capture.

OSIME Bone Joints
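
For readers curious about what the bones actually do, here is a conceptual C# sketch of linear blend skinning, the standard technique in which each vertex follows a weighted combination of its influencing bones. Unity’s SkinnedMeshRenderer performs this step internally, so the function below is purely illustrative.

    using UnityEngine;

    // Conceptual sketch of linear blend skinning; not used by the pipeline itself.
    public static class SkinningSketch
    {
        // boneMatrices combine each bone's current pose with the inverse of its
        // bind pose; boneIndices and weights describe the vertex's influences.
        public static Vector3 SkinVertex(Vector3 restPosition,
                                         Matrix4x4[] boneMatrices,
                                         int[] boneIndices,
                                         float[] weights)
        {
            Vector3 result = Vector3.zero;
            for (int i = 0; i < boneIndices.Length; i++)
                result += weights[i] * boneMatrices[boneIndices[i]].MultiplyPoint3x4(restPosition);
            return result; // weights are assumed to sum to 1
        }
    }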

Step 6: Avatar added to Unity and interactions applied

Once the avatar is ready, it is imported into the Unity environment and placed in the scene. Scripted behaviours and triggers are attached to the model in Unity to allow interaction with the avatar. Animation loops can be added so the avatar can move or walk around a room.

OSIME Avatar in Unity
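
A minimal sketch of the kind of scripted trigger described above, assuming the player object carries a "Player" tag and the avatar’s Animator Controller defines a "Wave" trigger parameter (both are assumptions about the scene setup):

    using UnityEngine;

    // Minimal sketch: when an object tagged "Player" enters the avatar's
    // trigger collider, an animation is started via the Animator.
    [RequireComponent(typeof(Animator))]
    public class AvatarInteraction : MonoBehaviour
    {
        Animator animator;

        void Start()
        {
            animator = GetComponent<Animator>();
        }

        void OnTriggerEnter(Collider other)
        {
            if (other.CompareTag("Player"))
                animator.SetTrigger("Wave"); // fires a transition defined in the Animator Controller
        }
    }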

Step 7: Testing the avatar in the game environment

Once inside Unity, the game can be tested to see the avatar in action.

OSIME HUD