====== Art and AI. Machine Visions. ======

**Date:** Week 9, March 1 - 5, 2021
**Time:** 10:00 - 17:00

The course takes place online. The link is sent to you by email.

Course requirements:
  * active participation in group activities
  * active individual work (via pad)
  * fictional short story, 2-3 pages, to be handed in by Sunday, March 7 (via email)

===== Course Abstract =====

Artificial intelligence (AI) is all around us, transforming the way we see the world and the way the world sees us. Sometimes we barely notice it; sometimes the results of AI are very impressive, and sometimes they are so impressively bad that Hito Steyerl and others have started to talk about “Artificial Stupidity”. Almost always, however, the actual procedures of AI remain mysterious, locked away in a black box. In this module, we will look at a recent wave of artistic and theoretical work that tries to open this black box, creating a language and an aesthetics for critical engagement. In particular, we will look at image recognition AI: what the artist Trevor Paglen calls “predatory vision” and the data scientist Joy Buolamwini calls the “coded gaze”, referring to the gender and racial biases built into these systems.

===== Course Works and Notes =====

**Pad:** [[http://pad.vmk.zhdk.ch/machine_visions]]

===== Monday Morning =====

**Similarity & Difference: Observing the results of machine visions**

  * !Mediengruppe Bitnik, [[https://wwwwwwwwwwwwwwwwwwwwww.bitnik.org/samesame/|SAME SAME. WATCHING ALGORITHMS - CABARET VOLTAIRE EDITION]], 2015
  * Mario Klingemann & Simon Doury, [[https://experiments.withgoogle.com/x-degrees-of-separation|X Degrees of Separation]], 2018

**How Image Recognition works, technically**

  * [[https://www.youtube.com/watch?v=2-Ol7ZB0MmU|A friendly introduction to Convolutional Neural Networks and Image Recognition]], 2017, 32 min.
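As a rough illustration of the three operations the video above introduces (convolution, ReLU activation, pooling), here is a minimal NumPy sketch. The toy image, the edge-detecting kernel, and all sizes are invented for demonstration and are not taken from the course materials:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution (strictly: cross-correlation, as used in CNNs)."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Keep positive responses, zero out the rest."""
    return np.maximum(x, 0)

def max_pool(x, size=2):
    """Downsample by taking the maximum of each size x size block."""
    h, w = x.shape
    h, w = h - h % size, w - w % size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

# Toy 6x6 image: dark on the left, bright on the right (a vertical edge).
image = np.array([[0, 0, 0, 1, 1, 1]] * 6, dtype=float)

# A kernel that responds to dark-to-bright vertical transitions.
edge_kernel = np.array([[-1, 0, 1],
                        [-1, 0, 1],
                        [-1, 0, 1]], dtype=float)

feature_map = max_pool(relu(conv2d(image, edge_kernel)))
# feature_map is [[3., 3.], [3., 3.]]: the detector fires where the edge is.
```

A real image classifier stacks many such layers with learned (not hand-written) kernels and ends in a classification layer; the Activation Atlas linked below visualizes what those learned layers respond to.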
  * [[https://openai.com/blog/introducing-activation-atlases|Activation Atlas]], 2019
  * [[https://www.youtube.com/watch?v=76jZkqlGIMY|Understanding image models and predictions using an Activation Atlas]], Google AI Adventures, 2019, 7:49

===== Monday Afternoon =====

Graham, Mark, Rob Kitchin, Shannon Mattern, and Joe Shaw. How to Run a City like Amazon, and Other Fables, 2019. [[https://issuu.com/meatspacepress/docs/how_to_run_a_city_like_amazon_and_other_fables|issuu.com]], [[https://meatspacepress.com/go/how-to-run-a-city-like-amazon|PDF]]

  * Select one story and read it. Write a short summary in the pad, note how the story relates to the actual company, and add your name to the entry.
  * Do short research on image recognition. Select one news story that you find interesting, enter the link in the pad, and write why you selected this story. Add your name to the entry.

===== Tuesday Morning =====

**Bias in Vision**

  * [[https://www.youtube.com/watch?v=YJjv_OeiHmo|The racist soap dispenser]] @ Facebook, 2017
  * [[https://portraitai.app|PortraitAI]]
  * [[https://twitter.com/liatrisbian/status/1368960616023666688|Sarah L. Fossheim's experiment with POC faces]]

**Colour Bias in Analogue Photography**

  * [[https://kimon.hosting.nyu.edu/physical-electrical-digital/items/show/1009|The Shirley Card]]
  * [[https://www.youtube.com/watch?v=d16LNHIEJzs&feature=emb_logo|Color film was built for white people]], 4:39, 2015
  * Roth, Lorna. 2009. “[[https://www.cjc-online.ca/index.php/journal/article/view/2196|Looking at Shirley, the Ultimate Norm: Colour Balance, Image Technologies, and Cognitive Equity.]]” Canadian Journal of Communication 34(1).
    * [[http://colourbalance.lornaroth.com]]
  * Osinubi, Ade. 2018. “[[https://medium.com/africana-feminisms/the-inherent-color-bias-of-image-capturing-technology-colorism-in-photography-3e1dff554548|The Inherent Color Bias of Image Capturing Technology: Colorism in Photography.]]” Medium.com
  * Ewart, Asia. 2020. “[[https://www.shutterstock.com/blog/shirley-card-racial-photographic-bias|The Shirley Card: Racial Photographic Bias through Skin Tone.]]” The Shutterstock Blog (June 30).

**Joy Buolamwini: The coded gaze**

  * [[https://www.youtube.com/watch?v=QxuyfWoVV98|AI, Ain't I A Woman?]], 2018, 3:33
  * [[https://www.notflawless.ai]], see the stories there
  * [[http://gendershades.org/overview.html]]
  * [[https://www.ted.com/talks/joy_buolamwini_how_i_m_fighting_bias_in_algorithms?language=en#t-111018|How I'm Fighting Bias in Algorithms]], TEDx Talk, 2017
  * [[https://www.safefacepledge.org|Safe Face Pledge]]
  * [[https://www.ajlunited.org|Algorithmic Justice League]]

Another form of bias:

  * !Mediengruppe Bitnik, [[https://wwwwwwwwwwwwwwwwwwwwww.bitnik.org/sor/|Dada. State of the Reference]], 2017

Further Reading:

  * Eubanks, Virginia. Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. First Edition. New York, NY: St. Martin’s Press, 2017.
  * Noble, Safiya Umoja. Algorithms of Oppression: How Search Engines Reinforce Racism. New York: New York University Press, 2018.

===== Tuesday Afternoon =====

Timnit Gebru: [[https://youtu.be/23MxOh99N54?t=197|Computer Vision: Who is Helped and Who is Harmed?]] (02.2021), 03-46

  * Watch the talk and note the main points in the pad.
  * [[https://twitter.com/BLMPrivacyBot|BLMPrivacyBot]]
  * Rothkopf, Joshua. 2020. “[[https://www.nytimes.com/2020/07/01/movies/deepfakes-documentary-welcome-to-chechnya.html|Deepfake Technology Enters the Documentary World]].” The New York Times.
Trevor Paglen, Is Photography Over?, 4 parts, Fotomuseum Winterthur, 03.2014

  - [[https://www.fotomuseum.ch/en/explore/still-searching/articles/26977_is_photography_over|Is Photography Over?]]
  - [[https://www.fotomuseum.ch/en/explore/still-searching/articles/26978_seeing_machines|Seeing Machines]]
  - [[https://www.fotomuseum.ch/en/explore/still-searching/articles/26979_scripts|Scripts]]
  - [[https://www.fotomuseum.ch/en/explore/still-searching/articles/26980_geographies_of_photography|Geographies of Photography]]

  * Read these four short parts. Select four sentences that you find interesting and copy them to the pad. Please organize them in the pad according to the section.

===== Wednesday Morning =====

**Trevor Paglen & Kate Crawford**

  * [[https://www.hkw.de/en/app/mediathek/video/69622|Datafication of Science]], lecture at HKW, Jan 12, 2019, 32 min.
  * [[https://www.hkw.de/en/app/mediathek/video/69578|Discussion with Kate Crawford, Trevor Paglen, Felix Stalder]], HKW, Jan 12, 2019, 14 min.
  * Kate Crawford and Trevor Paglen, [[https://www.excavating.ai|Excavating AI: The Politics of Images in Machine Learning Training Sets]]
  * [[http://www.fondazioneprada.org/project/training-humans/?lang=en|Training Humans]], Fondazione Prada, Milan, 12 Sep 2019 - 24 Feb 2020; [[https://we-make-money-not-art.com/training-humans-how-machines-see-and-judge-us|Review]], we-make-money-not-art.com, Dec 16, 2019
  * Paglen, Trevor. “[[https://thenewinquiry.com/invisible-images-your-pictures-are-looking-at-you/|Invisible Images (Your Pictures Are Looking at You).]]” The New Inquiry (blog), December 8, 2016.
  * [[https://www.paulekman.com/resources/universal-facial-expressions|Paul Ekman: Seven Universal Facial Expressions]]
  * [[https://www.bbc.com/future/article/20180510-why-our-facial-expressions-dont-reflect-our-feelings|Why our facial expressions don’t reflect our feelings]], BBC, 10.05.2018

===== Wednesday Afternoon =====

**Reading your character (and future behavior) from your face**

Levin, Sam. 2017. “[[http://www.theguardian.com/technology/2017/sep/12/artificial-intelligence-face-recognition-michal-kosinski|Face-Reading AI Will Be Able to Detect Your Politics and IQ, Professor Says.]]” The Guardian (Sept 12).

**Critique of this paper**

Agüera y Arcas, Blaise, Margaret Mitchell, and Alexander Todorov. 2018. “[[https://medium.com/@blaisea/do-algorithms-reveal-sexual-orientation-or-just-expose-our-stereotypes-d998fafdf477|Do Algorithms Reveal Sexual Orientation or Just Expose Our Stereotypes?]]” Medium (Jan 11).

**A fundamental critique of the approach**

Agüera y Arcas, Blaise, Margaret Mitchell, and Alexander Todorov. 2017. “[[https://medium.com/@blaisea/physiognomys-new-clothes-f2d4b59fdd6a|Physiognomy’s New Clothes.]]” Medium (May 7).

**Further reading**

Wang, Yilun, and Michal Kosinski. 2017. “[[https://osf.io/zn79k|Deep Neural Networks Are More Accurate than Humans at Detecting Sexual Orientation from Facial Images.]]” (Original “AI gaydar” paper)

===== Thursday =====

**Artist Talk and Workshop by Adam Harvey** (full day)

Adam Harvey's [[https://ahprojects.com|art projects about privacy, computer vision, and surveillance]]

===== Friday Morning =====

Discussion of the readings from Wednesday Afternoon

[[https://web.br.de/interaktiv/ki-bewerbung/|Fairness oder Vorurteil. Einsatz Künstlicher Intelligenz bei der Jobbewerbung fragwürdig]] (Fairness or Prejudice: Questionable Use of Artificial Intelligence in Job Applications), Bayerischer Rundfunk, 2021. [[https://web.br.de/interaktiv/ki-bewerbung/en|English version]]

Hito Steyerl, [[https://youtu.be/iyEBXewn7ys?t=1571|Keynote Conversation: Anxious to Act]], transmediale 2016, 23 min.

Steyerl, Hito. 2016. “[[https://www.e-flux.com/journal/72/60480/a-sea-of-data-apophenia-and-pattern-mis-recognition/|A Sea of Data: Apophenia and Pattern (Mis-)Recognition.]]” e-flux Journal (April).

Salvaggio, Eryk. 2020. “[[https://www.cyberneticforests.com/news/creative-strategies-for-algorithmic-resistance|Creative Strategies for Algorithmic Resistance.]]” Cybernetic Forests (June 29).

===== Friday Afternoon =====

Writing of the short story

===== Material added after the course =====

Schiller, Devon. 2020. “[[https://networkcultures.org/longform/2020/06/22/on-the-basis-of-face-biometric-art-as-critical-practice-its-history-and-politics|On the Basis of Face: Biometric Art as Critical Practice, Its History and Politics.]]” Amsterdam: Institute of Network Cultures.