Is ChatGPT a Labourite or a Nationalist?
While it may seem absurd to attribute political leanings to an AI like ChatGPT, the underlying biases in these technologies have real and significant implications
At first glance, questioning the political leanings of ChatGPT might seem as absurd as asking about the voting intentions of one's toaster. Indeed, it's a silly notion to attribute political preference to a household appliance. However, the question gains a surprising degree of relevance and complexity when we delve deeper into the implications of large language models like ChatGPT.
First and foremost, it's paramount to remember that ChatGPT is just a tool without personal beliefs, aspirations, or political affiliations. It doesn't harbour sympathies towards political candidates or parties, nor does it aspire to become a delegate for any political movement. In essence, ChatGPT is similar to a highly-sophisticated calculator: you input a question, and it generates a response based on its programming and training.
Yet, here lies a crucial difference: unlike a traditional calculator, which will unwaveringly output '2' in response to '1 + 1', ChatGPT's responses can differ from one query to the next. This is because large language models are non-deterministic: they sample each word from a probability distribution rather than computing a single fixed answer, so the same prompt can yield different replies, and their validity cannot be guaranteed in every instance. This unpredictability, in turn, raises an intriguing question: can such a model exhibit political viewpoints?
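The sampling step behind this non-determinism can be sketched in a few lines of Python. This is a toy illustration, not ChatGPT's actual code: the scores (logits) for the three candidate words are invented, and real models work over vocabularies of tens of thousands of tokens.

```python
import math
import random

def sample_next_token(logits, temperature=1.0, rng=None):
    """Sample a token index from raw model scores (logits).

    With temperature > 0 the choice is stochastic, so repeated calls
    can return different tokens -- the source of the model's
    non-determinism. As temperature approaches 0, the distribution
    sharpens towards the single highest-scoring token.
    """
    rng = rng or random.Random()
    scaled = [score / temperature for score in logits]
    # Softmax with max-subtraction for numerical stability
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return rng.choices(range(len(logits)), weights=probs, k=1)[0]

# Hypothetical scores for three candidate next words
logits = [2.0, 1.5, 0.5]

# At normal temperature, repeated runs can pick different words;
# at near-zero temperature, the top-scoring word wins essentially always.
varied = [sample_next_token(logits, temperature=1.0,
                            rng=random.Random(seed)) for seed in range(20)]
greedy = sample_next_token(logits, temperature=0.01)
```

A calculator, by contrast, behaves like the `temperature=0.01` case for every input: the same question always produces the same answer.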
Whilst they definitely do not hold personal views as we do, their replies typically exhibit biases. Remember that ChatGPT, like all AI models, is shaped by the data it was trained on. If its training data skews towards left-leaning texts, the model will exhibit a leftist bias, and vice versa. Indeed, a study by the Massachusetts Institute of Technology (MIT) suggests that ChatGPT tends to lean more towards Labourite ideologies, indicating a left-leaning bias in its training data.
So, when asking ChatGPT for information, users should be mindful that the model will provide replies with some degree of bias.
But bias isn't unique to this model; it afflicts all large language models. This is inevitable, since they learn from vast amounts of data that inherently reflect societal biases. Because of this, AI biases can become truly problematic, as they can influence real-world decisions and exacerbate societal inequalities.
Let me give you an example. Picture yourself applying for your dream job, equipped with the right qualifications and enthusiasm. However, an unseen barrier stands in your way: an AI recruitment system. A comprehensive Reuters report has shed light on disturbing instances where such systems, driven by biased historical hiring data, have unfairly discriminated against candidates based on gender, age, or ethnicity.
These digital gatekeepers, supposedly neutral, instead enforce outdated prejudices. They make critical decisions about who gets a foot in the door, often overlooking genuinely qualified individuals simply because they don't align with the system's skewed idea of an 'ideal candidate'. This is not just a faceless statistic; it's a reality for many. It could be you, a family member, or a close friend unjustly side-lined in their professional journey, not by a human but by an algorithm.
While it may seem absurd to attribute political leanings to an AI like ChatGPT, the underlying biases in these technologies have real and significant implications. As we continue integrating AI into various aspects of our lives, addressing and mitigating these biases becomes increasingly crucial. Failing to do so risks emphasising existing societal inequalities while undermining the principles of fairness and equality we strive to uphold in a democratic society.
We must actively seek to identify these biases within AI systems and strive relentlessly to mitigate them. It is only through such conscientious effort that we can edge closer towards creating a world that is more equitable and just.
This opinion article first appeared in Business Today on 22 February 2024