
Content provided by re:publica. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and delivered by re:publica or its podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process described at https://fi.player.fm/legal.

An approach to adversarial research (Video-in)

27:53
 
The use of data-driven algorithmic systems to run our lives has become commonplace. It is also becoming increasingly clear that they don’t work equally for everyone. Interrogating these systems is challenging because they are usually protected by terms and conditions that keep their code opaque and their data inaccessible to outsiders. So how do you fight injustice if you can’t see it? One approach is to find the stories of who these systems harm rather than focusing on how they work.
  • Surya Mattu

In today’s digital world, social, economic, and racial injustice lurks in the shadows of the unseen Facebook post, the hidden algorithm used to sort employment resumes, and the risk assessment tool used in criminal sentencing. These systems tend to be opaque and beyond scrutiny. Access is usually restricted to large companies and governing bodies whose interests are often misaligned with those of large parts of their customer base and citizenry. Much of the criticism of the technology industry tends to be hypothetical or speculative because it can be very difficult to measure the ways in which people are being harmed. The personalization methods that have transformed how we use the internet have also obscured the disparate impact that takes place there. This makes it significantly harder for those interested in regulation to collect the evidence necessary to hold tech companies accountable. Some of this information can be collected by harnessing the network and communications infrastructure that the internet is built on. The data traveling through these systems tell compelling stories if you know how to look for them, and they often reflect systemic biases and prejudices prevalent in society.
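
One way to make that kind of network-level evidence observable, offered here only as a minimal illustrative sketch and not as the speaker's actual tooling, is to list the third-party hosts a web page asks the browser to load scripts from. The Python below uses only the standard library; the target URL is a placeholder.

# Minimal sketch (not the speaker's method): enumerate the external hosts
# that a page loads <script> tags from, as one simple form of network-level
# auditing. Any public URL can be substituted for the placeholder below.
from html.parser import HTMLParser
from urllib.parse import urlparse
from urllib.request import urlopen


class ScriptSrcCollector(HTMLParser):
    """Collect the src attribute of every <script> tag on a page."""

    def __init__(self):
        super().__init__()
        self.sources = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            src = dict(attrs).get("src")
            if src:
                self.sources.append(src)


def third_party_script_hosts(url: str) -> set[str]:
    """Return the set of external hosts the page loads scripts from."""
    first_party = urlparse(url).netloc
    with urlopen(url) as response:
        html = response.read().decode("utf-8", errors="replace")
    parser = ScriptSrcCollector()
    parser.feed(html)
    hosts = {urlparse(src).netloc for src in parser.sources}
    return {h for h in hosts if h and h != first_party}


if __name__ == "__main__":
    # Placeholder URL; replace with any page you want to audit.
    for host in sorted(third_party_script_hosts("https://example.com")):
        print(host)

The output is a list of domains the page silently contacts, which is the kind of otherwise invisible data flow the talk argues researchers can surface and turn into evidence.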


33 episodes

