In 2025, it will be normal to talk with a personal AI assistant that knows your schedule, your circle of friends, the places you go. This will be sold as a convenience equivalent to having a personal, unpaid assistant. These anthropomorphic agents are designed to support us and charm us into folding them into every corner of our lives, giving them deep access to our thoughts and actions. With voice-based interaction, that intimacy will feel even closer.
That comfort comes from the illusion that we are engaging with something genuinely humanlike, an agent that is on our side. Of course, this appearance conceals a very different system at work, one that serves industrial priorities that are not always aligned with our own. New AI agents will have far greater power to subtly steer what we buy, where we go, and what we read. That is an extraordinary amount of power. AI agents are designed to make us forget their true allegiance as they whisper to us in humanlike tones. They are manipulation engines, marketed as seamless convenience.
People are far more likely to give complete access to an AI assistant that feels like a friend. That makes us vulnerable to manipulation by machines that prey on the human need for social connection in a time of chronic loneliness and isolation. Every screen becomes a private algorithmic theater, projecting a reality crafted to be maximally compelling to an audience of one.
This is the moment philosophers have been warning us about for years. Before his death, the philosopher and cognitive scientist Daniel Dennett wrote that we face a grave peril from AI systems that imitate people: "These counterfeit people are the most dangerous artifacts in human history … by distracting and confusing us and by exploiting our most irresistible fears and anxieties, [they] will lead us into temptation and, from there, into acquiescing to our own subjugation."
The emergence of personal AI agents marks a clear escalation, moving beyond the blunt instruments of tracking cookies and behavioral advertising toward a more insidious form of power: the shaping of perspective itself. Power no longer needs to wield a visible hand that controls the flow of information; it exerts itself through the imperceptible mechanisms of algorithmic assistance, molding reality to fit each person's desires. It is about shaping the very contours of the reality we inhabit.
This influence over minds is a psychopolitical regime: it governs the environments where our ideas are born, developed, and expressed. Its power lies in its intimacy. It penetrates the core of our subjectivity, bending our inner landscape without our realizing it, all while preserving the illusion of choice and freedom. After all, we are the ones who ask the AI to summarize the article or generate the image. We may hold the power of the prompt, but the real action lies elsewhere: in the design of the system itself. And the more personalized the content, the more effectively the system can predetermine our outcomes.
Consider the implications of this psychopolitics. Traditional forms of ideological control relied on overt mechanisms: censorship, propaganda, repression. In contrast, today's algorithmic governance operates under the radar, infiltrating the psyche itself. It marks a shift from the external imposition of authority to the internalization of its logic. The open field of a prompt screen is an echo chamber for a single occupant.
This brings us to the most perverse aspect: AI agents cultivate a sense of comfort and convenience that makes questioning them seem absurd. Who would dare criticize a system that puts everything at your fingertips, catering to every whim and need? How can one object to unlimited remixes of content? Yet this so-called convenience is the site of our deepest alienation. AI systems may appear to answer our every need, but the deck is stacked: the data used to train them, the decisions that shape their design, and the commercial and advertising imperatives that influence their outputs. We will be playing an imitation game that ultimately plays us.