The Core Mood SDK and APIs use the following data input types to determine the user's mood (see the sketch after this list):
- Device Movement (e.g. roll, pitch, yaw)
- Gesture Movement (e.g. taps and swipes, including pressure)
- App Events
- User Events
- Location (optional)
- FERN (optional: facial expression recognition)
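As a minimal sketch of how these input types might be modeled, the enum below is purely illustrative; the identifiers are assumptions, not the SDK's actual API.

```swift
// Illustrative model of the input types above; these names are
// assumptions, not the Core Mood SDK's actual identifiers.
enum MoodInput {
    case deviceMovement   // roll, pitch, yaw
    case gestureMovement  // taps and swipes, including pressure
    case appEvents
    case userEvents
    case location         // optional
    case fern             // optional: facial expression recognition
}

// Opt in to Location while leaving FERN disabled.
let enabledInputs: Set<MoodInput> = [.deviceMovement, .gestureMovement,
                                     .appEvents, .userEvents, .location]
```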
The Boundless Core Mood SDK accesses the device's Core Motion framework to report motion- and environment-related data from the onboard hardware of iOS devices, including the accelerometer and gyroscope as well as the pedometer, magnetometer, and barometer.
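The device-motion signals above can be observed directly with Apple's Core Motion framework; the sketch below simply logs the roll, pitch, and yaw values the SDK samples. This is plain Apple API shown for reference, not Core Mood SDK code.

```swift
import CoreMotion

let motionManager = CMMotionManager()

if motionManager.isDeviceMotionAvailable {
    motionManager.deviceMotionUpdateInterval = 1.0 / 30.0  // sample at 30 Hz
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        // The same roll/pitch/yaw signals listed under Device Movement.
        print("roll: \(attitude.roll), pitch: \(attitude.pitch), yaw: \(attitude.yaw)")
    }
}
```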
Core Affect (CAff)
- Valence (Sad → Happy)
- Arousal (Calm → Excited)
- Focus (Fluttering → Zoned)
- Elevation (Disgust → Elevated)
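A minimal sketch of a CAff result, assuming each dimension is reported as a single numeric score; the struct and field names are illustrative, not the SDK's documented types.

```swift
// Illustrative Core Affect result; the names and the assumption of
// Double scores are ours, not the SDK's actual types.
struct CoreAffect {
    let valence: Double    // Sad → Happy
    let arousal: Double    // Calm → Excited
    let focus: Double      // Fluttering → Zoned
    let elevation: Double  // Disgust → Elevated
}
```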
Ekman Seven (EK7)
EK7 predictions cover Ekman's seven basic emotions: anger, contempt, disgust, fear, happiness, sadness, and surprise.
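The seven labels can be modeled as a simple enum; this is an illustrative sketch, not an SDK type.

```swift
// Ekman's seven basic emotions as an illustrative Swift enum.
enum EkmanEmotion: String, CaseIterable {
    case anger, contempt, disgust, fear, happiness, sadness, surprise
}
```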
Core Mood Stress
Core Mood Stress predicts the user's emotional response to adversity, characterized by fear and indecision. The score expresses confidence, where 1 represents the highest confidence that the user is stressed.
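As a usage sketch, assuming the score arrives as a Double on a 0-to-1 scale: the threshold and function below are illustrative, not part of the SDK.

```swift
// Illustrative handling of a Core Mood Stress score; the 0.8 threshold
// is an arbitrary example, not an SDK recommendation.
func handleStress(score: Double) {
    if score > 0.8 {  // high confidence the user is stressed
        print("Consider suggesting a short break")
    }
}
```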
Semantic Event Score
Semantic event scores are produced by customized predictive models: the API returns a prediction for a particular emotional state related to your application or service. Examples of semantic events include a user's likelihood of needing a particular real-world action (e.g. taking a break) or in-app experience (such as upgrading their account).
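A hedged sketch of fetching a semantic event score over HTTP; the endpoint, parameter names, and response shape below are assumptions for illustration, not the documented Core Mood API.

```swift
import Foundation

// Illustrative response shape; field names are assumptions.
struct SemanticEventScore: Decodable {
    let event: String  // e.g. "needs_break" or "upgrade_likely"
    let score: Double  // predicted likelihood
}

// Placeholder host and path; substitute the real API endpoint and auth.
func fetchSemanticScore(event: String,
                        completion: @escaping (SemanticEventScore?) -> Void) {
    guard let url = URL(string: "https://api.example.com/v1/semantic/\(event)") else {
        completion(nil)
        return
    }
    URLSession.shared.dataTask(with: url) { data, _, _ in
        let result = data.flatMap { try? JSONDecoder().decode(SemanticEventScore.self, from: $0) }
        completion(result)
    }.resume()
}
```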