Feel & Drive
let your feelings drive you

Why Feel & Drive?
Our behaviour is strongly influenced by the emotions we feel, and this becomes particularly relevant while driving a vehicle, since driving requires a very focused and clear-minded approach. What happens instead is that the most diverse emotions we feel during the day take over and push us towards actions that can be quite risky: think of an employee who drives right after a heated argument with his boss, or of the very risky manoeuvres that many attempt when they are nervous about being late.
Feel and Drive is an AmI system designed to help the driver keep his emotional state in check and enjoy a safer and happier drive, by relying on the vehicle's cockpit itself. The vehicle gauges the emotional state of the driver and acts on the cockpit accordingly, in order to encourage the user to take up a more driving-appropriate attitude and become more aware of his emotions. For example, if the driver is bored or unhappy the system can play motivational music, while if he is nervous or angry it can play relaxing songs, changing the cockpit lights to match the emotional situation.
Let's see it in action:
AmI Main Steps

Sensing

The system collects information about the driver's emotional state from a camera

Reasoning

Feel and Drive interprets the data from the sensors to predict the driver's emotion and selects how to modify the cockpit conditions accordingly

Acting

The system can alter the cockpit lights, play music and spray perfumes to adjust the cockpit environment appropriately

Interacting

The system interacts with the driver primarily by means of voice commands and secondarily through a display that shows the music player's features
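The four steps above form a single control loop. The sketch below shows its shape in Python, the project's own language; every name in it (functions, presets, emotion labels) is an illustrative assumption, not the project's actual code:

```python
# Minimal sketch of the sense -> reason -> act cycle.
# All names here are illustrative assumptions, not the project's real API.

COCKPIT_PRESETS = {
    "angry": {"lights": "cool tones", "music": "relaxing"},
    "sad":   {"lights": "warm tones", "music": "motivational"},
    "happy": {"lights": "favourite colour", "music": "joyful"},
}

def sense(camera_frame):
    """Stand-in for the camera-based emotion classifier."""
    return "angry"  # a real system would classify the frame here

def reason(emotion):
    """Select how to modify the cockpit for the detected emotion."""
    return COCKPIT_PRESETS.get(emotion, COCKPIT_PRESETS["happy"])

def act(preset):
    """Stand-in for the actuators: Hue lights, speaker, perfume sprayer."""
    return f"lights -> {preset['lights']}, music -> {preset['music']}"

print(act(reason(sense(camera_frame=None))))
```

In the real system the interaction step (voice commands) would run alongside this loop and override the automatic choices.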

AmI Characteristics

Sensitive

The system is capable of sensing the driver's emotional state and his voice commands

Responsive

Feel & Drive is able to modify the conditions of the cockpit to guide the driver toward the desired emotional state

Adaptive

According to the driver's feelings, Feel & Drive can modify the ambience to encourage him to maintain or change his mood

Transparent

The system is fully integrated with the vehicle: the driver interacts with Feel and Drive as if he were interacting with the vehicle itself

Ubiquitous

No matter what the destination is, if you plan to get there by car, Feel and Drive will be with you

Intelligent

Feel and Drive can perceive the driver's emotions and understand his voice commands, taking up a proactive behaviour that guides him toward a happier state
Let's get more technical

Feel & Drive aims to make car driving as safe and comfortable as possible by counteracting the negative effects that certain emotions can have on the driver. While he is driving, the system continuously analyses his feelings through a camera: if he is detected as upset or angry, the system becomes proactive, tweaking the colour of the cockpit lights, the music and the ambient scent to gently guide him toward a happier and healthier mood.
In particular, the following known effects on mood are exploited:

  • Anger/Stress: cool colour tones and low-BPM chill music are proven to have a calming, softening effect on our mood: RELAX is the keyword here!
  • Sadness: warm colour tones, on the other hand, are renowned for boosting self-confidence and, together with higher-BPM songs, make the entire driving environment motivational for the user: ENERGY is the keyword here!
  • Happiness: lights are set to the driver's favourite colour and joyful songs are played to REWARD him for having reached a positive and healthier attitude.
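The three rules above can be expressed as a small mood-to-ambience table. This is a minimal sketch: the BPM thresholds, labels and field names are illustrative assumptions, not values from the project.

```python
# Mapping from detected mood to cockpit response, following the
# RELAX / ENERGY / REWARD effects listed above.
# Thresholds and names are illustrative assumptions.

AMBIENT_RULES = {
    "anger":     {"keyword": "RELAX",  "colour_tone": "cool", "max_bpm": 90},
    "stress":    {"keyword": "RELAX",  "colour_tone": "cool", "max_bpm": 90},
    "sadness":   {"keyword": "ENERGY", "colour_tone": "warm", "min_bpm": 120},
    "happiness": {"keyword": "REWARD", "colour_tone": "favourite"},
}

def pick_tracks(mood, tracks):
    """Filter a list of (title, bpm) pairs by the rule for this mood."""
    rule = AMBIENT_RULES[mood]
    if "max_bpm" in rule:
        return [t for t, bpm in tracks if bpm <= rule["max_bpm"]]
    if "min_bpm" in rule:
        return [t for t, bpm in tracks if bpm >= rule["min_bpm"]]
    return [t for t, _ in tracks]  # REWARD: any joyful track qualifies

tracks = [("Chill A", 70), ("Pop B", 128), ("Rock C", 140)]
print(pick_tracks("anger", tracks))    # low-BPM tracks for a RELAX response
```

Keeping the rules in a single table like this makes it easy to tune the thresholds from user feedback later on.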


System Features
#    AREA                FEATURE                       DESCRIPTION
#1   Ambient Tuning      Light Tuning                  Lights change colour depending on the mood
#2   Ambient Tuning      Adaptive Music Player         Music changes depending on the mood, to make the user feel more comfortable
#3   Emotion Prediction  Facial Emotion Recognition    The system recognises the user's mood from his facial expressions and adapts to it
#4   User Interaction    Vocal Requests                The user can interact with the system vocally, asking to modify music, lights or perfumes
#5   User Info           User Music Database           A personalised music database is created for each recognised mood
#6   User Info           User Login                    Using a screen, the user can log in to the system
#7   System              Remote Database               The database of user profiles can be made remote, so that the system is shareable
#8   Ambient Tuning      Perfume Tuning                Depending on the mood, a perfume is sprayed to comfort the user
#9   User Interaction    Vocal Interaction             The user can tag/untag his favourite songs through voice commands
#10  User Info           Favourite Colour              During registration, the user chooses the favourite colour to be used when he is happy
#11  Ambient Tuning      Focus Assistant               Two opposite colours alternate to keep the driver focused
#12  Emotion Prediction  Improved Emotion Recognition  A galvanic skin response sensor is used to better gauge the driver's stress and emotional arousal
#13  User Interaction    Hot-Word Trigger              The system continuously listens in the background and is triggered by hot-words
#14  System              User Learning                 The system learns from the user's choices and feedback in order to improve
#15  System              Music Client                  The system relies on an official music server to retrieve a wider music catalogue
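Per-frame facial expression predictions (feature #3) are inherently noisy, so before acting on them it helps to smooth the classifier's output over a short window of frames. The sketch below shows one simple way to do that, a majority vote over a sliding window; the class name and window size are illustrative assumptions, not the project's actual pipeline.

```python
from collections import Counter, deque

# Majority-vote smoothing over a sliding window of per-frame predictions.
# Illustrative sketch: names and window size are assumptions.

class EmotionSmoother:
    def __init__(self, window=15):
        self.window = deque(maxlen=window)

    def update(self, frame_prediction):
        """Add one per-frame prediction and return the smoothed emotion."""
        self.window.append(frame_prediction)
        return Counter(self.window).most_common(1)[0][0]

smoother = EmotionSmoother(window=5)
stream = ["happy", "angry", "happy", "happy", "sad"]
smoothed = [smoother.update(p) for p in stream]
print(smoothed)  # → ['happy', 'happy', 'happy', 'happy', 'happy']
```

Smoothing like this also keeps the actuators from flickering between presets when the classifier wavers for a frame or two.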


System Architecture

Network Architecture

Vehicle network: the on-board Wi-Fi of the vehicle connects the Controller node, to which all the sensors and actuators are physically attached, with the remote lights. The database connection is mapped internally as a Wi-Fi connection
Central node: each vehicle connects to the central node using 4G technology
Z-Wave: the Z-Wave technology is used to remotely activate the perfume sprayer
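Since the central server is REST based and reachable over 4G, a profile sync from the vehicle amounts to posting a small JSON document. The sketch below only builds that payload; the endpoint URL and every field name are hypothetical, as the project does not specify the API schema.

```python
import json

# Sketch of the message a vehicle might send to the central node to sync
# a user profile. Endpoint and field names are assumptions; the project
# only states that the central server is REST based.

CENTRAL_NODE = "https://central.example/api/profiles"  # hypothetical URL

def build_profile_update(user_id, favourite_colour, tagged_songs):
    """Serialise a profile update for the remote user database."""
    return json.dumps({
        "user_id": user_id,
        "favourite_colour": favourite_colour,
        "tagged_songs": tagged_songs,
    })

payload = build_profile_update("driver-001", "blue", ["Song A"])
print(payload)
```

Keeping the profile in plain JSON is what makes the "Remote Database" feature (#7) cheap to add: the same document can live locally or on the central node.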

Selected Components

Hardware Components


Raspberry PI 3 Model B
USB Microphone (PS3 eye)
PiCamera
Music Speaker
Razberry Z-Wave
On/off switch z-wave
Philips Hue lights
Brightness sensor

Software Components


Python 3.7.5
Google text-to-speech API
SnowBoy HotWord Detector
OpenCV library
Tensorflow
Music Player and GUI based on the pygame package
REST-based central server
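The music player combines the mood-based playlists with pygame for playback. The sketch below shows only the selection logic; the playlist contents, mood labels and file names are illustrative assumptions, and actual playback would go through pygame.mixer.music.

```python
import random

# Minimal sketch of the mood-driven track selection behind the player.
# Playback itself would use pygame.mixer.music.load()/play(); the
# playlists and mood labels below are illustrative assumptions.

PLAYLISTS = {
    "relax":  ["chill_01.ogg", "chill_02.ogg"],
    "energy": ["upbeat_01.ogg", "upbeat_02.ogg"],
    "reward": ["favourite_01.ogg"],
}

MOOD_TO_PLAYLIST = {"angry": "relax", "stressed": "relax",
                    "sad": "energy", "happy": "reward"}

def next_track(mood, rng=random.Random(0)):
    """Pick the next file to hand to pygame.mixer.music.load()."""
    playlist = PLAYLISTS[MOOD_TO_PLAYLIST.get(mood, "reward")]
    return rng.choice(playlist)

print(next_track("angry"))
```

Separating selection from playback this way also makes the "Music Client" feature (#15) a drop-in change: only PLAYLISTS would need to come from the remote catalogue.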
About Us

Simone Eandi
Emotion recognition developer
seandi
s235229@studenti.polito.it

Pietro Oricco
Music player and database developer
PietroOricco
s235075@studenti.polito.it

Marco Vinai
Website, audio and lights developer
vina97
s234319@studenti.polito.it

Open issues


It may be difficult to associate, with discrete confidence, a particular cockpit setting with a change in the user's mood.
The state-of-the-art accuracy for facial expression recognition is around 66%; improving it would require training a specific classifier for each user (unfeasible).
Computer vision algorithms for detecting facial expressions may require too much computational power to run in near real time on a single-board computer.
Using an official online music database is currently impossible (possible partnership with Spotify?).