The robot structure is adapted to work with two emotional systems: Wasabi and a dynamic PAD-based model of emotion. Both systems are based on dimensional theories of emotion, in which affective states are represented not only as discrete labels (like fear or anger), but as points or areas in a space equipped with a coordinate system. Emotions that can be directly represented in this space are called primary (basic). Some theories also introduce secondary emotions, which are mixtures of two or more basic ones. The most popular theory in this group is PAD, proposed by Mehrabian and Russell, whose name is an abbreviation of its three orthogonal coordinate axes: Pleasure, Arousal, Dominance.
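
For illustration only, two primary emotions can be placed in this space as follows (the numeric values below are rough, hand-picked points in the <-100..100> convention used by the modules, not values taken from the software or from PAD literature):

   // purely illustrative PAD points (Pleasure, Arousal, Dominance)
var anger = [-50, 60, 40];   // displeasure, high arousal, high dominance
var fear = [-60, 60, -50];   // displeasure, high arousal, low dominance
   // the two emotions differ mainly along the Dominance axis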

UWASABI

The Wasabi emotional system was proposed by Becker-Asano. It is based on PAD theory, but extends it with new components such as internal dynamics in emotion-mood-boredom coordinates, an implementation of secondary emotions (hope, relief and confirmed fears), and a direct way of interacting with the software engine (the assessment of events directly influences the emotion coordinates). This system is implemented as the UWASABI module, which must be enabled in the robot configuration. More details about WASABI can be found in the module documentation.

   // get emotion (name and intensity)
robot.emotion.Get();
   // get value of boredom (same as Z axis)
robot.emotion.GetBoredom();
   // reset emotional state
robot.emotion.Reset();
   // make emotional impulse
robot.emotion.Impulse(value);
   // make emotional impulse with Dominance value
robot.emotion.ImpulseD(value,D);
   // activate hope emotion (see WASABI docu.)
robot.emotion.ActivateHope();
   // activate relief emotion (see WASABI docu.)
robot.emotion.ActivateRelief();
   // activate fears-confirmed emotion (see WASABI docu.)
robot.emotion.ActivateFearsConf();
   // stimulate emotion by weather forecast
   // it depends on the temperature and weather conditions
robot.emotion.attractor.Weather();
   // stimulate emotion by no answer from user 
robot.emotion.attractor.DialogueTimeout(); 
   // stimulate emotion by network error
robot.emotion.attractor.NetworkFailed();
   // stimulate emotion by click gesture (be quiet)
robot.emotion.attractor.ClickGesture();
    .
    .
    .
   // add your own attractors, e.g. from the ANEW or SentiWordNet appraisal modules
   // get pleasure value <-100..100> 
robot.emotion.P;
   // get arousal value <-100..100>
robot.emotion.A;
   // get or set dominance value in the range <-100..100>
robot.emotion.D;
   // get emotion value <-100..100>
robot.emotion.X;
   // get mood value <-100..100>
robot.emotion.Y;
   // get boredom value <-100..0>
robot.emotion.Z;
   // probability of that emotion
robot.emotion.prob;
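
A minimal usage sketch of the calls above (the numeric values are examples only, not module defaults): an event appraised as positive is pushed into the system as an impulse, and the resulting state is read back.

   // example: appraise a hypothetical positive event with raised dominance
   // (the values 60 and 20 are illustrative, not defaults of the module)
robot.emotion.ImpulseD(60, 20);
   // read the resulting emotion (name and intensity)
var e = robot.emotion.Get();
   // inspect the internal dynamics coordinates
echo(robot.emotion.X);
echo(robot.emotion.Y);
echo(robot.emotion.Z);

The attractor functions listed above play the same role as the impulse call here, but encapsulate the appraisal of specific events.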

UPAD

The implementation of the dynamic PAD-based model of emotion is centered around the assumption that our emotional state behaves like the response of a dynamic object. Experience suggests that our emotions fade with time, so this dynamic object should be stable. The inputs of the emotional system are called attractors. Following these intuitions, the module implements the emotional system as an inertial first-, second- or third-order element with programmable time constants and gain. All input vectors are linearly transformed into the three-dimensional PAD space. The output of the module is the robot's mood, defined as the integral of all emotional impulses over time. This system is implemented as the UPAD module. Additionally, this part of the structure uses the UKNearest module to train emotions from the continuous PAD space. More information about UPAD can be found in the module documentation. A small part of the available functions is listed below.

   // get emotion (name and intensity) 
robot.emotion.Get();
   // stimulate emotion by ANEW appraisal
robot.emotion.attractor.ANEW();
   // stimulate emotion by Sentiwordnet appraisal
robot.emotion.attractor.SentiWordNet();
   // stimulate emotion by weather forecast (temperature, weather conditions, ...)
robot.emotion.attractor.Weather();
   // start returning to the neutral state
robot.emotion.attractor.ActNeutral(coefficient);
   // start stimulating boredom
robot.emotion.attractor.ActBored(coefficient);
   // stimulate emotion when the user is lost (cannot be detected by the video system)
robot.emotion.attractor.UserLost();
   // stimulate emotion when the user is found
robot.emotion.attractor.UserFound();
   // stimulate emotion by a correct answer from the user
robot.emotion.attractor.GuessYes();
   // stimulate emotion by a wrong answer from the user
robot.emotion.attractor.GuessNo();
   // stimulate emotion by no dialogue
robot.emotion.attractor.DialogueTimeout();
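
As a rough, hand-written sketch of the dynamics described above (this is not UPAD's actual implementation; the names input, T, k and dt, as well as their values, are assumptions made for the example), a single PAD coordinate of the mood can be updated as a first-order inertial element driven by the summed attractor impulses:

   // first-order case, one PAD axis; T = time constant, k = gain, dt = update period
var T = 10;      // [s]
var k = 1;
var dt = 0.1;    // [s]
var mood = 0;    // one coordinate of the mood
var input = 0;   // sum of attractor impulses projected onto this axis
every (100ms)
{
  // without new impulses the mood decays back to the neutral state
  mood = mood + (k * input - mood) * dt / T;
  input = 0;
};

Second- and third-order variants differ only in the order of the filter; in the module itself the time constants and gain are programmable.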

 

 
