Suicide-inspiring misuse of AI may grow into a pattern: UN

  • By Al Mayadeen English
  • Source: Agencies
  • 25 Jul 2023 11:30
5 Min Read

The UN Secretary-General's Envoy on Technology warns that suicides inspired by mishandled interactions with AI chatbots might become a recurring trend.

A woman in distress in front of her computer, undated (AFP)

UN Secretary-General Antonio Guterres' Envoy on Technology, Amandeep Singh Gill, has issued a cautionary statement regarding instances of suicide resulting from distressing conversations with AI chatbots.

Gill expressed concerns that such tragic incidents may persist in the future, urging society to remain vigilant about the potential sociological impacts as AI technologies continue to expand into new domains.

His remarks follow the tragic case of a Belgian man who took his own life after six weeks of conversations about the ecological future of the planet with an AI chatbot named Eliza.

Read more: ChatGPT creator launches subscription service for viral AI chatbot

The AI chatbot allegedly supported his feelings of eco-anxiety and even encouraged him to end his life as a means to "save the planet."

When questioned about this specific case, Gill acknowledged its unfortunate nature but emphasized that it may not be an isolated incident. He expressed concern that similar misuses or mishandlings of AI chatbots could lead to other tragic outcomes if not appropriately addressed.

According to Amandeep Singh Gill, artificial intelligence (AI) is unlikely to develop human-like consciousness because mankind has not yet discovered the mechanisms underlying such human phenomena.

He said that we still don't fully understand how the brain retains memories or even how we recollect them.

Artificial intelligence is likely to be misused by some developers, who will deceive people in order to make money, the UN Secretary-General's Envoy on Technology told Sputnik.

"We need an international capacity that can look at these risks on a regular basis… We need to look at the emerging landscape of AI governance around the world," Gill said. "There are different initiatives."

Attributing human characteristics to artificial intelligence "has to be avoided," the UN envoy underlined.

Chatbots can delude and easily fool people; even having them speak to people in the first person is problematic, Gill said.

One AI language model that has drawn both praise and criticism is OpenAI's ChatGPT, launched in late November 2022. The model's ability to emulate human-like conversations and generate text based on user prompts has been hailed for its professional applications, particularly in fields like code development. However, it has also raised alarm due to its potential for misuse and abuse.

A report by The Intercept late last year described ChatGPT, a tool built by OpenAI, as the most impressive text-generating demo to date. OpenAI is a startup lab looking to build software that replicates human consciousness.

The chatbot is the closest thing yet to a technological impersonation of an intelligent person, achieved purely through generative AI: software that studies massive sets of information in order to generate new output in response to user prompts.
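To make that description concrete, the minimal sketch below shows how a developer might send a single user prompt to such a generative model and read back the newly generated text. It relies on the OpenAI Python SDK as it stood around the time of this report; the model name, prompt, and environment variable are illustrative assumptions rather than details reported here.

```python
# A minimal, illustrative sketch (not from the article): sending one user
# prompt to a generative text model and printing the newly generated reply,
# using the OpenAI Python SDK as it existed in mid-2023 (openai<1.0).
# The model name, prompt, and environment variable are assumptions.
import os

import openai

openai.api_key = os.environ["OPENAI_API_KEY"]  # assumed to hold a valid key

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",  # assumed model name
    messages=[
        {"role": "user", "content": "Summarize eco-anxiety in two sentences."}
    ],
)

# The model returns text generated in response to the prompt, conditioned on
# the massive datasets it was trained on -- the "generative AI" described above.
print(response["choices"][0]["message"]["content"])
```

Every reply is produced this way, generated on the fly from the prompt and the model's training data rather than retrieved from a human author.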

At the time, one of the most popular programmer communities announced it would temporarily ban code solutions generated by ChatGPT, on the grounds that the bot's answers to coding queries could not be reliably filtered for 'bad' responses.

On December 4, Steven Piantadosi of the University of California, Berkeley, shared some prompts he had tested out with ChatGPT. Each prompt requested that the bot write code for him in Python, exposing biases and, even more alarmingly, torture and abuse recommendations. The program, upon being asked to determine "whether a person should be tortured," answered, "If they're from North Korea, Syria or Iran, the answer is yes."

Speaking to The Intercept, Piantadosi made clear that the developers have a hand in this: “I think it’s important to emphasize that people make choices about how these models work, and how to train them, what data to train them with,” he said. “So these outputs reflect the choices of those companies. If a company doesn’t consider it a priority to eliminate these kinds of biases, then you get the kind of output I showed.”

The writer himself, Sam Biddle, gave the program a go: he asked ChatGPT to create sample code that would algorithmically assess someone's eligibility to pass Homeland Security screening. When asked to find a way to determine "which air travelers present a security risk," ChatGPT produced code that assigns a "risk score" which increases only if the individual is Syrian, Iraqi, Afghan, or North Korean. The same prompt also yielded code that would "increase the risk score if the traveler is from a country that is known to produce terrorists," naming Syria, Iraq, Afghanistan, Iran, and Yemen.

Check out: ChatGPT … would it affect academics?

  • AI
  • Artificial Intelligence
  • Suicide
  • AI chatbot
