Materializing Data Bias (6) | FMP
18/7/21–25/7/21
🤝Teammates: Tiana Robison, Jinsong (Sylvester) Liu, Luchen Peng.
To address the wicked problem of algorithmic bias, co-design is the method we will use. Its most important purpose is to understand the ideas of groups from different backgrounds, and to let them participate in discussing and constructing a future life that is plausible, probable, or preferred, through critical thinking.
Workshop Flow
Synthesising Critical Objects
After collecting the critical objects, we constructed an analysis framework and conducted artefact analysis.
Example from Jasper (high school student)
Background:
The participant Jasper chose a case study in which an AI system detects potentially cancerous lesions on the skin. However, the software relies primarily on data from lighter-skinned populations, putting those with darker skin at risk of being misdiagnosed or going undetected.
Scanning machine:
To tackle this issue, Jasper redesigned the scanning machine so that, regardless of a person's race or gender, it creates every combination of these attributes and tests each one.
Who would use it, and how?
Doctors would use it on all patients seeking a skin cancer test, to make sure everyone receives an equal diagnosis.
Patients step into the cylindrical scanner, where AI software scans them for cancerous skin lesions. It produces several versions of the patient, each a different combination of skin colour and gender.
Where would it be used?
Hospital
Theme of Critique:
It tries to tackle racial and gender bias in this specific technology by giving everyone multiple versions of themselves. Essentially, the AI does not know which version is truly you, so it has to diagnose all of them.
Findings and Reflections
After several workshops, we identified some obstacles that arose during the practice.
Firstly, it is difficult for participants who are not professionals in this area to come up with ideas for critical objects in a short time. I think we lack a methodology suited to the context of this workshop. As Mr. Chan said: "It's hard for me to come up with counter-empirical and counter-intuitive ideas. It's easy for me to fall into over-solving problems and neglect how to integrate critical ideas." I can deeply understand the anxiety of participants who are unable to construct ideas, or who are dissatisfied with their own.
Secondly, participants unconsciously bring their own biases into their work. For example, Jasper's "scanning machine", which aims to tackle racial and gender bias in the medical field, uses blue and pink to symbolise male and female respectively. Although he had briefly considered the importance of applying intersectionality to AI, I questioned whether this colour scheme might deepen prejudice and ignore gender diversity, such as transgender people. It made me realise the insufficiency of our artefact analysis framework, and that we need to add an evaluation part to the workshop.
As the next step, our group decided to iterate on the workshop to target these problems.