Materializing Data Bias (1) | FMP

Luchen Peng
Sep 23, 2021

14/7/21–18/7/21

🤝 Teammates: Tiana Robison, Jinsong (Sylvester) Liu, Luchen Peng.

Why this project?

The development of the Internet has caused an explosive growth of information and, at the same time, broadened the range of information we are exposed to. The emergence of algorithms has begun to shape our online experience on major social platforms, for better or worse. On the negative side, algorithms increase social injustice and quietly manipulate public opinion. For example, Facebook’s job ad delivery system produces skewed outcomes, reinforcing gender stereotypes and racial discrimination (Ali et al., 2019), and social media filter bubbles and algorithms invisibly influence elections (Hern, 2017).

My undergraduate major, advertising, is closely tied to information dissemination, and I learned basic concepts such as the “filter bubble” and the “information cocoon”, phenomena caused by algorithms. Combining this with what I have learned in MA UX, such as Weaponised Design and Design Justice, I raised the following questions and want to investigate them:

1. “How do algorithms influence our lives and strengthen our biases?”

2. “Because these negative influences are invisible, how do we reveal them to the public and make them tangible?”

3. “Are there other ways to solve or mitigate these problems?”

The plan to make it real.

Tiana, Jinsong (Sylvester) Liu, and I will work as a group to explore this topic together. In this project, the main methods we are going to use are metaphor and co-design. We want to invite our target participants and experts to our workshop to discuss this problem together. We hope they can help us truly understand the present situation and the future that people really want.

Timeline (made by the group)

Discussion and tutorial

In the tutorials, the tutors expressed their excitement and positivity about our team and project. Al reminded us that it is vital to recognise the biases participants will bring when using metaphor, and that it is important to find a common language to explain “metaphor”, a very academic term, and how it is used as a method in design. Wang was worried about the technology side, because none of us has professional knowledge of algorithms. But after expressing her view on the neutrality of technology, she gave us a pathway: when the technology is stripped away, what remains is bias.

References

Ali, M., Sapiezynski, P., Bogen, M., Korolova, A., Mislove, A., & Rieke, A. (2019). Discrimination through optimization. Proceedings of the ACM on Human-Computer Interaction, 3(CSCW), 1–30. https://doi.org/10.1145/3359301

Hern, A. (2017, May 22). How social media filter bubbles and algorithms influence the election. The Guardian. https://www.theguardian.com/technology/2017/may/22/social-media-election-facebook-filter-bubbles (Accessed: 16 June 2021).
