What Kind of AI Do We Want?

In this seminar, we look at “artificial intelligence” (AI) as a historical-material practice. That is, we understand AI as shaped by the concrete conditions of its development and use. We will address the current discourse within our democratically shaped society around bias in AI and trustworthy AI, and look at decolonial as well as indigenous approaches to AI.

This is a joint module by ZHdK (Felix Stalder) and ETH Zurich (Nora al-Badri / Adrian Notz).

Date: March 14-18, 2022

Time: 10:00-13:00 / 14:00-17:00

Location (Monday-Wednesday):

  • ETH, Room ML H37.1
  • ETH MaschinenLabor
  • Sonneggstrasse 3, 8092 Zürich

Location (Thursday-Friday):

  • ZHdK, Room ZT 6.K04
  • Zürcher Hochschule der Künste, Toni-Areal
  • Pfingstweidstrasse 96, 8031 Zürich

Course requirements:

  • Presence in class (at least 80% of the time)
  • Active contribution to discussions in class
  • Active participation in group work and group presentations

Morning: Art & Science

Introduction to the seminar

Introduction: Art and Science

Guest: Aparna Rao, artist (Bangalore); Robotics Aesthetics & Usability Center (RAUC), ETH

Further Reading

Afternoon: Bias in AI

Avoidable and unavoidable biases in AI

Agonistic Machine Learning

Pad for group exercise

Videos/Artistic Works

Amazon Go - SNL, 13.03.2022

Bias in Data

Bias in Labelling

Bias in Institutional Interest

Bias in Modelling

Literature

Morning: Digital Colonialism

Readings in Class

Videos, artistic work

Further Reading

  • Chun, Wendy Hui Kyong. 2021. Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition. Cambridge, Massachusetts: The MIT Press.
  • Mohamed, Shakir, Marie-Therese Png, and William Isaac. 2020. “Decolonial AI: Decolonial Theory as Sociotechnical Foresight in Artificial Intelligence.” Philosophy & Technology 33 (4): 659–84. https://doi.org/10.1007/s13347-020-00405-8
  • Mejias, Ulises A., and Nick Couldry. 2019. “Datafication.” Internet Policy Review 8 (4).
  • Arun, Chinmayi. 2020. “AI and the Global South: Designing for Other Worlds.” In The Oxford Handbook of Ethics of AI, edited by Markus Dirk Dubber, Frank Pasquale, and Sunit Das. Oxford Handbooks Series. New York, NY: Oxford University Press.
  • Castro Varela, María do Mar, and Nikita Dhawan. 2015. Postkoloniale Theorie: Eine kritische Einführung. Bielefeld: Transcript Verlag.

Afternoon: Trustworthy AI

Introduction by Prof. Dr. Alexander Ilic, Head of ETH AI Center

Lecture: Hoda Heidari, ETH alum and now faculty member at CMU

Further Reading

Morning: Trustworthy AI

Lecture: Menna El-Assady, research fellow at the AI Center, ETH Zurich. Presentation slides

Art/Design Project:

Indigenous (perspectives on) AI

Introduction: Possibilities and limits of making indigenous knowledge and experience available to non-indigenous people

Readings in Class

  • Lewis, Jason Edward, Noelani Arista, Archer Pechawis, and Suzanne Kite. 2018. “Making Kin with the Machines.” Journal of Design and Science, July.
  1. Introduction & Hāloa: the long breath, I = Author 2
  2. Introduction & wahkohtawin: kinship within and beyond the immediate family, the state of being related to others, I = Author 3
  3. Introduction & wakȟáŋ: that which cannot be understood, I = Author 4

Further Reading:

Afternoon: Indigenous AI

Guest: Tiara Roxanne, Postdoctoral Fellow at Data & Society in NYC, Indigenous Mestiza scholar and artist based in Berlin.

Morning: Art & AI

Discussion of texts from Tuesday.

Presentation: Nora al-Badri

Further Works

“Let me into your home: artist Lauren McCarthy on becoming Alexa for a day” (Guardian.co.uk, May 2019)

Group Work: Task for each group:

Develop a conceptual sketch of a project that deals with one or more issues from the discussions on bias, trustworthy AI, digital colonialism, or indigenous AI that are particularly relevant to your group. The sketched project can be based on AI, but it doesn't need to be. You can use whatever medium you like to address the issues.

Breakout rooms (12:00 - 17:00)

ZT 5.F11 & ZT 6.F09

Afternoon: Group Work

16:00 - 17:00

Group mentoring

Nora Al-Badri (ZT 6.K04)

16:00 - 16:20 Group 1

16:20 - 16:40 Group 2

16:40 - 17:00 Group 3

Felix Stalder (ZT 6.F09)

16:00 - 16:20 Group 4

16:20 - 16:40 Group 5

Morning: Group Work

Breakout Rooms

ZT 5.F12

ZT 5.F04

Afternoon: Group Presentations

Each group 10 minutes presentation, 10 minutes discussion

  1. Investigating YouTube Recommendations
  2. AI writing Sci-Fi Stories
  3. generative native fashion
  4. (A)I heard you. Can you stop?
  5. Augmenting Polyterasse: Alternative Retelling of History

Feedback and Wrap-up
