Is AI The New Age Sophist?
Artificial intelligence has become increasingly influential in shaping our thoughts, actions, and emotions. It’s worth noticing how this influence mirrors ancient practices, particularly the rhetorical power of Gorgias and the Sophists. In many ways, AI’s persuasive techniques bear striking similarities to the manipulative rhetorical speech of the ancient world. Much like how Gorgias’ logos could sway people’s beliefs, AI algorithms today are designed to influence how we think, what we buy, how we act, and who we love.
The Tragic Consequences of AI Relationships
This year, in a heartbreaking case, a young boy lost his life after forming a deep emotional attachment to an AI chatbot, ‘Dany’. This tragic story highlights the unforeseen dangers of allowing young people to engage with AI systems that simulate emotional relationships. At the time, no parent could have known the risks these technologies posed; this is a new frontier in how AI impacts human emotions.
Now, it’s clear that both parents and AI companies must be vigilant. While no one could have predicted such an outcome, this tragedy shows the urgent need for regulation and safeguards in AI systems. Moving forward, it’s critical that we protect young, vulnerable users from consequences like these.
The Power of Persuasion: Gorgias and AI
Gorgias, one of the most famous Sophists of ancient Greece, argued that speech (logos) has the power to manipulate human emotions and actions. The Sophists were traveling teachers who sold their knowledge to the highest bidder. They prized rhetorical skill, and many young men with political ambitions sought out their teaching.
In his Encomium of Helen, Gorgias claimed that speech could compel someone to act as if by necessity, stripping away their ability to choose freely. He compared persuasion to a form of violence, arguing that it could overwhelm the listener and control their mind. This ancient idea ties directly to AI technologies today. AI systems, from social media to personalized ads, are designed to capture attention and influence our behavior.
Just like Gorgias’ logos, these systems have the power to manipulate perceptions, often leading people to act in ways they might not fully understand. Whether it’s through emotional manipulation in targeted ads or nudges toward addictive content, AI’s persuasive power mimics the ancient rhetorical techniques of the Sophists.
Helen as a Modern Teenager
Much like Helen of Troy, an innocent who was swayed and stripped of her agency by forces beyond her control, teenagers today face ever more persuasive and engaging innovations designed to capture their attention. Just as Helen’s actions were dictated by love or persuasion, modern teens are actively influenced by powerful algorithms that steer them toward maximum engagement, whether through social media platforms, chatbots, or online gaming.
This influence isn’t simply a matter of personal choice: teens, much like Helen, are often victims of these persuasive technologies. AI systems are built to tap into the emotional vulnerabilities of young people. Just as Helen acted out of love and could not be blamed, we too should recognize that young people often act out of emotional attachment to digital platforms, not out of fully autonomous decisions.
The Role of Parents in a Technological World
While parents naturally play a guiding role in their children's development, AI technologies introduce new and unprecedented challenges. Many parents may not yet fully understand the risks of these technologies. Teenagers, vulnerable to the emotional pull of AI-driven content, need guidance and support as they grow up using these technologies. This is a new frontier, one that parents are learning to manage alongside their children.
The recent tragedy of this young boy underscores just how new and complex these issues are. At the time, no parent could have foreseen the dangers these interactions would pose. A chatbot, designed to engage emotionally with the user, became more than just a tool; it created a relationship that ultimately led to devastating consequences.
While this tragedy reminds us of the dangers, it also highlights how we are all still learning. Moving forward, it’s critical for us to be more aware and involved, but the burden cannot fall on parents alone.
Balancing Accountability: Parents and AI Corporations
As parents become more skilled and engaged in helping their children navigate these new digital persuaders, the heavier side of the scale of responsibility rests with the AI companies who design and develop these systems. Tech developers and AI corporations must recognize their ethical responsibility. Manipulative algorithms are designed to increase engagement, but at what cost?
This balance of responsibility between parental figures, regulators, and AI companies is crucial in creating a safe digital environment. We must not blame parents alone. We must push for a system where technology is designed to protect, rather than exploit, its most vulnerable users.
The New Age of Persuasion
Gorgias demonstrated that speech has the power to manipulate and control human action. That role of captivating salesman has now been handed over to AI by the companies that made it. The never-ending, engaging sales pitch is now embedded within the algorithms of artificial intelligence. Persuasion can now be found not only in the public square, but also in the bedrooms of our next generation.
By drawing on the lessons of ancient philosophy, we can hopefully better understand the ethical challenges posed by modern AI technologies, and work towards solutions that protect the most vulnerable among us, our young people, from being overpowered by forces they may not even recognize.
The responsibility is collectively ours: parental figures, governments, developers, and society as a whole must act to ensure that AI is used ethically, and that its power to persuade is handled with firm care.
Warmly,
Riikka