Many people (well, mostly business people) welcome this new era of data analysis and the associated vision of an "intelligent planet". Far fewer seem concerned about the other side of the coin: the ever-growing influence of algorithms on our personal lives and the accompanying shift of decision-making power from humans to machines. In as little as 10 years, algorithms might decide whether you get a new job (or get fired from your current one), how much you pay for your health insurance, whether you are allowed to travel to a given country, and whom you will marry. So it's time to say hi to your new boss: the algorithm.
People often talk either about the consequences of a data-driven society or about its technological aspects, but rarely about the two together. With my talk I want to change that by discussing concrete technologies and algorithms that are used in data analysis today, together with their societal and political implications. I will show how algorithms can be trained to be racist, misogynistic and plenty of other things, and that this actually happens in practice if no care is taken to avoid it. Finally, I will discuss various approaches, both technological and political, to solving this dilemma.
Outline:
* Introduction to "big data" and data analysis,
* Parts of our lives that are already under algorithmic control,
* Parts of our lives that soon will be under algorithmic control,
* Example use case of algorithms in data science,
* How machine learning can discriminate against certain groups of people,
* Example algorithm: Classifying people into good and bad customers (see the sketch after this outline),
* How the bias comes about: Algorithm-based discrimination,
* How we can fix these problems,
* Outlook.
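
To make the customer-classification example from the outline more concrete, here is a minimal sketch of how a classifier trained on biased historical decisions can pick up that bias through a seemingly neutral proxy feature. It assumes a standard scikit-learn setup; the feature names (income, postal_area), the protected attribute and all numbers are invented for illustration and are not taken from the talk.

```python
# Hypothetical sketch: a "good vs. bad customer" classifier trained on biased
# historical decisions. All feature names and numbers are made up.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Protected attribute (e.g. membership in a minority group); never shown to the model.
group = rng.binomial(1, 0.3, n)

# A seemingly neutral proxy feature (e.g. postal code) that correlates with the group.
postal_area = np.where(group == 1,
                       rng.binomial(1, 0.8, n),   # group 1 mostly lives in area 1
                       rng.binomial(1, 0.2, n))   # group 0 mostly lives in area 0

# A genuinely relevant feature, identically distributed in both groups.
income = rng.normal(50, 10, n)

# Historical labels: past (human) decisions were biased against group 1,
# independent of income. The training data itself encodes the discrimination.
past_approval = (income + rng.normal(0, 5, n) - 15 * group) > 45

# Train only on the "neutral" features; the protected attribute is excluded.
X = np.column_stack([income, postal_area])
model = LogisticRegression().fit(X, past_approval)

# The model nevertheless reproduces the bias via the postal-code proxy.
pred = model.predict(X)
print("approval rate, group 0:", pred[group == 0].mean())
print("approval rate, group 1:", pred[group == 1].mean())
```

Although the protected attribute is never given to the model, the predicted approval rates of the two groups diverge, because the postal code acts as a stand-in for group membership; this is the kind of algorithm-based discrimination the outline refers to.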