Content provided by SWI swissinfo.ch. All podcast content, including episodes, graphics and podcast descriptions, is uploaded and delivered by SWI swissinfo.ch or its podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined at https://fi.player.fm/legal.

New wars, new weapons and the Geneva Conventions

24:46
 

Send us a text

In the wars in Ukraine and in the Middle East, new, autonomous weapons are being used. Our Inside Geneva podcast asks whether we’re losing the race to control them – and the artificial intelligence systems that run them.

“Autonomous weapons systems raise significant moral, ethical, and legal problems challenging human control over the use of force and handing over life-and-death decision-making to machines,” says Sai Bourothu, specialist in automated decision research with the Campaign to Stop Killer Robots.

How can we be sure an autonomous weapon will do what we humans originally intended? Who’s in control?

Jean-Marc Rickli from the Geneva Centre for Security Policy adds: “AI and machine learning basically lead to a situation where the machine is able to learn. And so now, if you talk to specialists, to scientists, they will tell you that it's a black box, we don't understand, it's very difficult to backtrack.”

Our listeners asked: could an autonomous weapon show empathy? Could it differentiate between a fighter and a child? Last year, an experiment asked patients to rate chatbot doctors against human doctors.

“Medical chatbots ranked much better in the quality. But they also asked them to rank empathy. And on the empathy dimension they also ranked better. If that is the case, then you opened up a Pandora’s box that will be completely transformative for disinformation,” explains Rickli.

Are we going to lose our humanity because we think machines are not only more reliable, but also kinder?

“I think it's going to be an incredibly immense task to code something such as empathy. I think almost as close to the question of whether machines can love,” says Bourothu.

Join host Imogen Foulkes on the Inside Geneva podcast to learn more about this topic.

Get in touch!

Thank you for listening! If you like what we do, please leave a review or subscribe to our newsletter.
For more stories on international Geneva, please visit www.swissinfo.ch/
Host: Imogen Foulkes
Production assistant: Claire-Marie Germain
Distribution: Sara Pasino
Marketing: Xin Zhang


Chapters

1. New wars, new weapons and the Geneva Conventions (00:00:00)

2. The Ethics of Autonomous Weapons (00:00:07)

3. The Rise of Empathetic Machines (00:15:49)

133 episodes
