Truth v10.101668
PROJECT INDEX

All content copyright Studiengruppe Informationsdesign (SI)


A-Text

YEAR
F_2019-20
STATUS
Archived

ASPECTS

AI

TEAM

So Jin Park

ABSTRACT

a text, algorithmic text, artificial text
Algorithms are considered neutral. In fact they are not, because they are tools created by humans: social, political, and personal factors play a significant role in both the production and the use of any tool. Moreover, data, the learning material for algorithms, reflects society itself.

For these reasons, algorithms and AI can perform a task no one expected: discriminating against humans. Not everyone is affected equally, however, but above all those who are not male, white, heterosexual, or indigenous. This is called "algorithmic bias."

How, then, would an algorithm "opine" on the texts I collected on the topic of "algorithmic bias"? The collection was transformed into a dataset, and a recurrent neural network (RNN) generated four texts from it. Based on the same dataset, I also wrote a text covering the following points: blind trust in technology, non-inclusive and fragile data, examples of and reasons for discriminatory algorithms and AI, and paths toward digital democracy. What meaning do generated texts carry? What might the future of books look like? And most importantly, how can we create objective algorithms and, through them, build an equal and digitally democratic society?
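The project does not specify which RNN architecture or framework was used. As a rough illustration of the pipeline described above (corpus, then dataset, then sampled text), the following is a minimal character-level RNN sketch in plain numpy; the corpus, network size, and seed are placeholders, and training (backpropagation through time) is omitted, so the sampled text is random rather than corpus-like:

```python
import numpy as np

# Illustrative stand-in for the project's text collection.
corpus = "algorithms are considered neutral. they are actually not. " * 20
chars = sorted(set(corpus))          # character vocabulary
vocab = len(chars)
c2i = {c: i for i, c in enumerate(chars)}
i2c = {i: c for i, c in enumerate(chars)}

hidden = 32                          # hidden state size (placeholder)
rng = np.random.default_rng(0)
Wxh = rng.normal(0.0, 0.01, (hidden, vocab))   # input -> hidden weights
Whh = rng.normal(0.0, 0.01, (hidden, hidden))  # hidden -> hidden weights
Why = rng.normal(0.0, 0.01, (vocab, hidden))   # hidden -> output weights
bh = np.zeros(hidden)
by = np.zeros(vocab)

def step(h, idx):
    """One RNN step: consume a character index, return new state and logits."""
    x = np.zeros(vocab)
    x[idx] = 1.0                     # one-hot encode the input character
    h = np.tanh(Wxh @ x + Whh @ h + bh)
    return h, Why @ h + by

def sample(seed_char, n):
    """Generate n characters by repeatedly sampling from the softmax output."""
    h = np.zeros(hidden)
    idx = c2i[seed_char]
    out = [seed_char]
    for _ in range(n):
        h, logits = step(h, idx)
        p = np.exp(logits - logits.max())
        p /= p.sum()                 # softmax over the vocabulary
        idx = int(rng.choice(vocab, p=p))
        out.append(i2c[idx])
    return "".join(out)

print(sample("a", 80))
```

After a training loop adjusted the weight matrices against the dataset, the same `sample` routine would produce text that imitates the corpus, which is the behavior the abstract describes.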

FILES

a.text_cover_1.jpg
a.text_cover_3.jpg
a.text_cover_4.jpg
a.text_innen_1.png
a.text_innen_2.png
a.text_innen_3.png
a.text_innen_4.png
a.text_innen_5.png
a.text_innen_6.png
a.text_innen_7.png
a.text_innen_8.png
a.text_innen_9.png
a.text_innen_10.png
a.text_innen_11.png