How confident can you be when algorithms tell you “I understand”? When AI learns about emotions, what will we start to lose?
Human relationships are shaped by vulnerabilities, silences, eye contact. An expression of emotion, a hesitation, sometimes just a glance builds a bridge between us. But now we face a question: can machines analyze and cross this bridge? And what do we stand to lose when they try to reach into our emotions?
The Code of Emotions: AI’s Inner Journey
In recent years there have been significant developments in emotional artificial intelligence, known as “Emotion AI” or “Affective Computing”. AI systems now try to infer human emotions by analyzing data such as facial expressions, tone of voice, body language and heart rate. Researchers at Brunel University, for example, are working on models that recognize human emotions from EEG signals, with reported success rates of up to 98%. But is this true “understanding”, or just sophisticated mimicry? A growing body of academic work is shedding light on the risks of emotional AI; dangers such as fake intimate relationships and disruptions to human interaction are increasingly being discussed.
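To make the idea concrete, here is a minimal sketch of the kind of pipeline this research area typically uses: extract frequency-band power features from an EEG-like signal and train a classifier on them. Everything below is illustrative; the data is synthetic, and the choices (classic EEG bands, a support-vector classifier) are my own assumptions, not the Brunel team’s actual method.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
FS = 128  # assumed sampling rate in Hz

def band_power_features(signal):
    """Mean spectral power in the classic delta, theta, alpha, beta bands."""
    freqs = np.fft.rfftfreq(signal.size, d=1 / FS)
    power = np.abs(np.fft.rfft(signal)) ** 2
    bands = [(0.5, 4), (4, 8), (8, 13), (13, 30)]
    return [power[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in bands]

# Synthetic stand-in data: two "emotional states" that differ only in
# how much 10 Hz (alpha-band) activity they contain.
t = np.arange(2 * FS) / FS  # two seconds of signal
X, y = [], []
for label, alpha_gain in [(0, 1.0), (1, 3.0)]:
    for _ in range(200):
        signal = rng.normal(size=t.size) + alpha_gain * np.sin(2 * np.pi * 10 * t)
        X.append(band_power_features(signal))
        y.append(label)

X_train, X_test, y_train, y_test = train_test_split(
    np.array(X), np.array(y), random_state=0
)
clf = make_pipeline(StandardScaler(), SVC())
clf.fit(X_train, y_train)
print(f"held-out accuracy on synthetic data: {clf.score(X_test, y_test):.2f}")
```

A pipeline like this can score very well on clean, well-separated data, which is exactly why a high accuracy number on its own tells us little about whether anything is being “understood”.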
The work being done in this field therefore deserves close attention, and its possible consequences are something all of us, not just specialists, need to discuss.
What Losses Are Possible?
The rapid advance of this technology can certainly bring benefits: potential uses range from improving user experience to strengthening mental health support systems.
But every light has a shadow:
- Sincerity: An algorithm telling you it “empathizes” will never offer the vulnerability of a real person.
- Blurred reality: What an AI says is sometimes so fluent that it sounds true, and this can create false perceptions.
- Manipulation: Emotional data can be used to steer people’s decisions; a brand could analyze your mood and target you with tailored messages.
- Isolation: People may come to spend more time with an AI that looks like a perfect “emotional partner” but offers no true empathy. This can have serious psychological consequences, especially for lonely individuals.
- Devaluation of real relationships: Our emotional ups and downs, our disagreements, our ability to forgive… these are the fabric of who we are as human beings. In relationships with AI, that texture can fade.
In fact, the issue runs deeper: relationships with AI can create a “feedback loop”. The human behaves according to the AI’s reaction; the AI analyzes that behavior and responds to it. It is a kind of mirror reflecting us back at ourselves. Over time, this loop may begin to merge with the person’s own inner voice, and whether our inner voice evolves or atrophies under it is genuinely thought-provoking…
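To see how such a loop can take on a direction of its own, here is a toy numerical sketch. The one-dimensional “emotional state”, the mirroring gain and the adaptation rate are all invented purely for illustration; this reproduces no published model.

```python
# user_state: a crude one-dimensional stand-in for emotional state.
# mirror_gain > 1 means the AI slightly amplifies what it reflects back;
# adaptation is how strongly the user drifts toward the AI's response.
def simulate_loop(user_state=0.2, mirror_gain=1.1, adaptation=0.3, steps=10):
    for step in range(steps):
        ai_response = mirror_gain * user_state                  # AI mirrors the user
        user_state += adaptation * (ai_response - user_state)   # user drifts toward the AI
        print(f"step {step}: user_state = {user_state:+.3f}")

simulate_loop()
```

With a mirroring gain above 1, the state drifts steadily outward; below 1, it decays toward zero. Either way, it is the loop, not the person, that sets the direction.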
The study “Technological folie à deux: Feedback Loops Between AI Chatbots and Mental Illness” suggests that these psychological interactions can form dangerous loops in individuals who develop emotional relationships with AI bots; problems such as interpersonal conflict, hallucinatory thoughts and the risk of addiction have been observed in these interactions.
Final Note:
As a lecturer for many years, I wanted my students to gain not only technical knowledge but also the ability to question. Although I have followed developments in the digital world closely, I never saw my duty as merely relaying the latest information. I wrote this article so that you would see the infiltration of artificial intelligence into our emotions as an alarm. Because if we become alienated from our own feelings while machines claim to understand them, the most valuable legacy of humanity, empathy, vulnerability, the mutual search for meaning, will be lost or set irrevocably on a new path, and that, I believe, is the point at which we must stop and think…