Interactions are added to Unity scenes through scripting in C# and JavaScript, languages that align closely with the skills already present within the OSIME team.

Notable work during 2016 included a head-tracking script that links an avatar's eye and head movement to the position of the player-controlled camera. As the player approaches a non-player character, that character's head lifts and turns in line with the player's movement. While answering questions or otherwise engaging with the player, the non-player character's head and eyes follow the player around the room, then revert to their previous behaviour should the player move too far away. Making non-player avatars more reactive in this way adds realism to a scene.
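The behaviour described above can be sketched as a small Unity C# script. This is an illustrative reconstruction, not the OSIME team's actual code: the class, field names and tuning values (`headBone`, `trackingRange`, `turnSpeed`) are assumptions, and it relies on the standard Unity API.

```csharp
using UnityEngine;

// Hypothetical sketch of the head-tracking behaviour: the NPC's head
// turns toward the player-controlled camera while the player is nearby,
// and eases back to its resting pose when the player moves away.
public class NpcHeadTracker : MonoBehaviour
{
    public Transform headBone;        // the NPC's head joint (assumed exposed)
    public Transform playerCamera;    // the player-controlled camera
    public float trackingRange = 4f;  // distance within which the NPC reacts
    public float turnSpeed = 3f;      // smoothing factor for head rotation

    private Quaternion restRotation;  // pose to reset to when the player leaves

    void Start()
    {
        restRotation = headBone.localRotation;
    }

    // LateUpdate runs after the animation pass, so the head override
    // is applied on top of whatever the NPC was already doing.
    void LateUpdate()
    {
        float distance = Vector3.Distance(headBone.position, playerCamera.position);

        if (distance <= trackingRange)
        {
            // Smoothly rotate the head to face the player.
            Vector3 toPlayer = playerCamera.position - headBone.position;
            Quaternion target = Quaternion.LookRotation(toPlayer);
            headBone.rotation = Quaternion.Slerp(headBone.rotation, target,
                                                 turnSpeed * Time.deltaTime);
        }
        else
        {
            // Player has moved too far away: ease back to the original pose.
            headBone.localRotation = Quaternion.Slerp(headBone.localRotation,
                                                      restRotation,
                                                      turnSpeed * Time.deltaTime);
        }
    }
}
```

Using `LateUpdate` rather than `Update` is the usual Unity pattern for this kind of override, since it runs after the animation system has posed the skeleton each frame.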

OSIME Unity 1

Additional avatar behaviours include animations triggered in response to interaction within a scene. Animations used thus far include walking, breathing, removal of an oxygen mask, typing and bed-curtain movement.
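Triggering one of these animations in response to interaction might look like the following sketch. The Animator parameter name ("RemoveMask") and the key binding are assumptions for illustration; the pattern of firing an Animator trigger from a script is standard Unity practice.

```csharp
using UnityEngine;

// Hypothetical sketch: play an avatar animation (e.g. removing an
// oxygen mask) when the player interacts. Assumes the avatar has an
// Animator controller with a "RemoveMask" trigger parameter.
public class PatientInteraction : MonoBehaviour
{
    public Animator avatarAnimator;

    void Update()
    {
        // Fire the mask-removal animation on the interaction key.
        if (Input.GetKeyDown(KeyCode.E))
        {
            avatarAnimator.SetTrigger("RemoveMask");
        }
    }
}
```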

OSIME Unity 2