Sugarcane Island with Alfred

Availability

Sugarcane Island with Alfred is an IS system that was developed as a research prototype. It is not available for download. Nevertheless, most of the technical components used for implementing the system are available.

Technical Description

Sugarcane Island with Alfred is a two-player Interactive Storytelling application that adapts a part of the story of the game book "Sugarcane Island" by Edward Packard. The users find themselves stranded on an unknown island and need to find a way to survive. The story is narrated by an embodied virtual character named Alfred. Decisions between the text parts are realized via Wizard-of-Oz speech input: Alfred asks for a decision and the users have to speak their choice aloud. Full body gestures are used in Quick Time Events, where the users have to perform a specific gesture within a limited amount of time. These gestures are recognized with the Microsoft Kinect sensor using the FUBI Full Body Interaction Framework.

[Image: Setup of a Quick Time Event (QuickTimeSetup.jpg)]
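
As an illustration of how such a Quick Time Event could be driven, the following C++ sketch combines a time limit with one expected gesture for both users in a simple polling loop. The GestureRecognizer interface, the gesture name "RaiseBothArms" and the timing values are assumptions made for this sketch only; they do not reproduce the actual FUBI API or the parameters used in the system.

    #include <chrono>
    #include <iostream>
    #include <string>
    #include <thread>

    // Hypothetical stand-in for the gesture recognition backend (FUBI in the
    // actual system); the interface and names are invented for this sketch.
    struct GestureRecognizer {
        // Returns true once the named full body gesture has been recognized
        // for the given user.
        virtual bool recognized(const std::string& gesture, unsigned userId) = 0;
        virtual ~GestureRecognizer() = default;
    };

    // A Quick Time Event: both users must perform the expected gesture
    // before the time limit runs out.
    bool runQuickTimeEvent(GestureRecognizer& rec,
                           const std::string& gesture,
                           std::chrono::milliseconds timeLimit)
    {
        using clock = std::chrono::steady_clock;
        const auto deadline = clock::now() + timeLimit;
        bool user1Done = false, user2Done = false;

        while (clock::now() < deadline) {
            user1Done = user1Done || rec.recognized(gesture, 1);
            user2Done = user2Done || rec.recognized(gesture, 2);
            if (user1Done && user2Done)
                return true;                      // both users succeeded in time
            std::this_thread::sleep_for(std::chrono::milliseconds(30));
        }
        return false;                             // time limit exceeded
    }

    // Dummy backend for illustration only: reports success after a few polls.
    struct DummyRecognizer : GestureRecognizer {
        int calls = 0;
        bool recognized(const std::string&, unsigned) override { return ++calls > 10; }
    };

    int main() {
        DummyRecognizer rec;
        bool ok = runQuickTimeEvent(rec, "RaiseBothArms", std::chrono::seconds(5));
        std::cout << (ok ? "QTE passed\n" : "QTE failed\n");
    }

In the actual system, the recognition result comes from FUBI processing the Kinect tracking data rather than from a dummy backend.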

The general application runs on the Horde3D GameEngine. In addition, SceneMaker was used to model and execute the story as a hierarchical finite state machine extended with multimodal scene scripts, which consist of the text to be spoken along with additional commands such as animations or sounds.
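
The following minimal C++ sketch illustrates the underlying idea of story nodes with attached scene scripts (the text to be spoken plus embedded commands) and decision-based transitions. It flattens the hierarchy to a single level, and the node names, scene texts and command syntax are invented for illustration; it does not reproduce the actual SceneMaker model or its scene script format.

    #include <iostream>
    #include <map>
    #include <string>

    // Hypothetical, heavily simplified story node: a scene script (text to be
    // spoken with embedded commands) plus transitions keyed by the decision.
    // Names and syntax are invented and do not match SceneMaker's format.
    struct StoryNode {
        std::string sceneScript;
        std::map<std::string, std::string> transitions;  // decision -> next node
    };

    int main() {
        std::map<std::string, StoryNode> story = {
            {"Beach",  {"Alfred: [anim=wave] You wake up on a beach. "
                        "Do you explore the jungle or follow the shore?",
                        {{"jungle", "Jungle"}, {"shore", "Shore"}}}},
            {"Jungle", {"Alfred: [sound=birds] The jungle is dense and loud...", {}}},
            {"Shore",  {"Alfred: You follow the shore and spot a raft...", {}}},
        };

        std::string current = "Beach";
        while (true) {
            const StoryNode& node = story.at(current);
            std::cout << node.sceneScript << "\n";  // play the scene script
            if (node.transitions.empty()) break;    // no further choices here
            std::string decision;
            std::cin >> decision;                   // stands in for the speech decision
            auto it = node.transitions.find(decision);
            if (it != node.transitions.end()) current = it->second;
        }
    }

In the real application, the decision would come from the Wizard-of-Oz speech input rather than a typed word, and the nodes would be grouped into the supernodes of the hierarchical model.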

Video on YouTube


Strong Points

The potential of this ITS application lies in the interaction interface designed for two players. It uses the Microsoft Kinect sensor to recognize full body gestures of the users, who do not have to hold or wear any interaction device. The story itself is adapted from the game book Sugarcane Island. The strength of this approach is that no complex story has to be written; the existing story can be used directly to investigate the interaction interface.

Limitations

The full body gestures are currently applied in so-called Quick Time Events, where the users have to perform a specific gesture within a limited amount of time. A next step would be to provide real decisions by offering a choice of different gestures or even some kind of gesture syntax.

Main Publications

  • Felix Kistler, Dominik Sollfrank, Nikolaus Bee, and Elisabeth André, Full Body Gestures enhancing a Game Book for Interactive Story Telling, Proc. of the 4th Int. Conf. on Interactive Digital Storytelling, 2011

Supporting Narrative Theories

None.

Computational Model

The story is modeled with SceneMaker as a hierarchical finite state machine extended with multimodal scene scripts.

Type of interaction

Interaction using Microsoft Kinect to recognize full body gestures, complemented by Wizard-of-Oz speech input for decisions.