
● A-Text


Status: archived
Aspects: AI
Team: So Jin Park
Semester: F_2019-20


a text, algorithmic text, artificial text
Algorithms are often considered neutral. They are not: they are tools created by humans, and social, political and personal factors play a significant role in how a tool is produced and used. Moreover, data - the learning material for algorithms - mirrors society, including its biases.

For these reasons, algorithms and AI can perform a task no one expected: discriminating against humans. Not everyone is affected equally, but above all those who are not male, white-skinned, heterosexual, or indigenous. This is called "algorithmic bias."

How, then, would an algorithm "opine" on the texts I collected on the topic of "algorithmic bias"? This collection was transformed into a dataset, and an algorithm (an RNN) generated four texts based on it. Based on the same dataset, I also wrote a text covering the following points: blind trust in technology, non-inclusive and fragile data, examples of and reasons for discriminatory algorithms and AI, and paths to digital democracy. What meanings do the generated texts have? What might the future of books look like? And most importantly, how can we create objective algorithms and use them to build an equal, digitally democratic society?
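The generation step described above - an RNN producing text character by character from a dataset - can be sketched roughly as follows. This is a minimal, hypothetical illustration in NumPy: the weights here are random rather than trained on the project's dataset, so it only shows the shape of the sampling loop, not the actual model used.

```python
import numpy as np

def sample_rnn(seed_char, corpus, n_chars=50, hidden_size=32, rng=None):
    """Sample text from a minimal character-level RNN.

    NOTE: the weights are random (untrained), so the output is gibberish;
    this only sketches the generate-one-character-at-a-time loop.
    """
    rng = rng or np.random.default_rng(0)
    vocab = sorted(set(corpus))               # the dataset defines the alphabet
    char_to_ix = {c: i for i, c in enumerate(vocab)}
    V = len(vocab)

    # Random weights stand in for a model trained on the dataset.
    Wxh = rng.normal(0, 0.01, (hidden_size, V))
    Whh = rng.normal(0, 0.01, (hidden_size, hidden_size))
    Why = rng.normal(0, 0.01, (V, hidden_size))

    h = np.zeros(hidden_size)                 # recurrent hidden state
    ix = char_to_ix[seed_char]
    out = []
    for _ in range(n_chars):
        x = np.zeros(V)
        x[ix] = 1.0                           # one-hot encode current character
        h = np.tanh(Wxh @ x + Whh @ h)        # update hidden state
        logits = Why @ h
        p = np.exp(logits - logits.max())
        p /= p.sum()                          # softmax over the vocabulary
        ix = rng.choice(V, p=p)               # sample the next character
        out.append(vocab[ix])
    return "".join(out)
```

Whatever biases live in the corpus live in the sampled distribution too: the model can only recombine the characters - and, after training, the patterns - that the dataset contains.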



All rights reserved. If you would like to know more about this project or are interested in a collaboration, please let us know by email.