
Art and AI: Machine Visions

Date: Week 9, March 1 - 5, 2021

Time: 10:00 - 17:00

The course takes place online: The link is sent to you by email.

Course requirements:

Course Abstract

Artificial intelligence (AI) is all around us, transforming the way we see the world and the way the world sees us. Sometimes we barely notice; sometimes the results of AI can be very impressive, and sometimes so impressively bad that Hito Steyerl and others started to talk about “Artificial Stupidity”. But almost always, the actual procedures of AI remain mysterious, locked away in a black box.

In this module, we will look at a recent wave of artistic and theoretical work that tries to open this black box, creating a language and an aesthetics for critical engagement. In particular, we will look at image recognition AI, what the artist Trevor Paglen calls “predatory vision” and the data scientist Joy Buolamwini calls the “coded gaze”, referring to gender and racial biases in these systems.

Course Works and Notes

Pad: http://pad.vmk.zhdk.ch/machine_visions

Monday Morning

Similarity & Difference: Observing the results of machine visions

How Image Recognition works, technically
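As a rough orientation for this session, the core pipeline of most image-recognition systems is: convolve the image with learned filters, apply a nonlinearity, pool, and feed the resulting features to a classifier that outputs class probabilities. The sketch below illustrates that pipeline in plain NumPy with random, untrained weights (a toy illustration, not a working recognizer and not part of the course material):

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(image, kernel):
    """Valid 2D convolution (strictly: cross-correlation, as used in deep learning)."""
    h, w = kernel.shape
    H, W = image.shape
    out = np.zeros((H - h + 1, W - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + h, j:j + w] * kernel)
    return out

def relu(x):
    """Nonlinearity: keep positive responses, zero out the rest."""
    return np.maximum(x, 0)

def max_pool(x, size=2):
    """Downsample by taking the maximum over non-overlapping size x size windows."""
    H2, W2 = x.shape[0] // size, x.shape[1] // size
    return x[:H2 * size, :W2 * size].reshape(H2, size, W2, size).max(axis=(1, 3))

def softmax(z):
    """Turn raw class scores into probabilities that sum to 1."""
    e = np.exp(z - z.max())
    return e / e.sum()

# Toy 8x8 grayscale "image" and random (untrained) weights.
image = rng.random((8, 8))
kernel = rng.standard_normal((3, 3))              # one filter; real nets learn many
features = max_pool(relu(conv2d(image, kernel)))  # 8x8 -> 6x6 -> 3x3 feature map
W = rng.standard_normal((3, features.size))       # linear classifier over 3 classes
probs = softmax(W @ features.flatten())           # class probabilities
print(probs)
```

In a trained system the filter and classifier weights are fitted to labeled data; the biases discussed this week enter exactly there, through what the training data does and does not contain.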

Monday Afternoon

Tuesday Morning

Bias in Vision

The racist soap dispenser @ Facebook, 2017

PortraitAI: Sarah L. Fossheim's experiment with POC faces

Colour Bias in Analogue Photography

The Shirley Card

Color film was built for white people, 4:39, 2015

Roth, Lorna. 2009. “Looking at Shirley, the Ultimate Norm: Colour Balance, Image Technologies, and Cognitive Equity.” Canadian Journal of Communication 34(1).

* http://colourbalance.lornaroth.com

Osinubi, Ade. 2018. “The Inherent Color Bias of Image Capturing Technology: Colorism in Photography.” Medium.

Ewart, Asia. 2020. “The Shirley Card: Racial Photographic Bias through Skin Tone.” The Shutterstock Blog (June 30).

Joy Buolamwini: The Coded Gaze

Another form of Bias:

Further Reading:

Tuesday Afternoon

Timnit Gebru: Computer Vision: Who is Helped and Who is Harmed? (02.2021), 03-46

Trevor Paglen

Is Photography Over?, 4 Parts, Fotomuseum Winterthur, 03.2014

Wednesday Morning

Trevor Paglen & Kate Crawford

Wednesday Afternoon

Reading your character (and future behavior) from your face

Levin, Sam. 2017. “Face-Reading AI Will Be Able to Detect Your Politics and IQ, Professor Says.” The Guardian (Sept. 12).

Critique of this paper

Agüera y Arcas, Blaise, Margaret Mitchell, and Alexander Todorov. 2018. “Do Algorithms Reveal Sexual Orientation or Just Expose Our Stereotypes?” Medium (Jan. 11).

A fundamental critique of the approach

Agüera y Arcas, Blaise, Margaret Mitchell, and Alexander Todorov. 2017. “Physiognomy’s New Clothes.” Medium (May 7).

Further reading

Wang, Yilun, and Michal Kosinski. 2017. “Deep Neural Networks Are More Accurate than Humans at Detecting Sexual Orientation from Facial Images.” (Original “AI gaydar” paper)

Thursday

Artist Talk and Workshop by Adam Harvey (Full day)

Adam Harvey's Art projects about privacy, computer vision, and surveillance

Friday Morning

Discussion of the readings from Wednesday afternoon

Fairness or Prejudice? Use of Artificial Intelligence in Job Applications Is Questionable. Bayerischer Rundfunk, 2021. English version.

Hito Steyerl. 2016. Keynote Conversation: Anxious to Act. transmediale 2016, 23 min.

Steyerl, Hito. 2016. “A Sea of Data: Apophenia and Pattern (Mis-)Recognition.” e-flux Journal (April).

Salvaggio, Eryk. 2020. “Creative Strategies for Algorithmic Resistance!” Cybernetic Forests (June 29).

Friday Afternoon

Writing of short story

Material added after the course

Schiller, Devon. 2020. On the Basis of Face: Biometric Art as Critical Practice, Its History and Politics. Amsterdam: Institute for Network Cultures.