===== Art and AI. What Kind of AI Do We Want? Bringing Artistic and Technological Practices Together. =====

Main Teaching Staff: [[https://www.nora-al-badri.de/|Nora al Badri]] & [[https://felix.openflows.com|Felix Stalder]]

At the end of the seminar, interdisciplinary teams will develop concepts for joint practice-related projects in AI and art.

[[https://pad.vmk.zhdk.ch/artaiFS2025|PAD FOR NOTES]]
===== Logistics and Requirements =====

  * Retrieval Augmented Generation (RAG) for LLMs (a minimal sketch follows below)

  * https://huggingface.co/playground
  * https://huggingface.co/chat/assistants

  * **{{ ::ap_what_ai_2025.pdf | Alexandre Puttick, Slides from Presentation, PDF}}**
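To make the RAG bullet above concrete, here is a minimal, illustrative Python sketch of the idea: retrieve the documents most relevant to a question and prepend them to the model prompt. The toy corpus, the word-overlap retrieval, and the ''generate()'' placeholder are invented for illustration only; they do not correspond to any specific tool or model used in the course.

<code python>
# Minimal sketch of Retrieval Augmented Generation (RAG), for illustration only.
# Assumptions: a tiny in-memory "corpus" and a generate() placeholder standing in
# for whatever LLM endpoint is actually used (e.g. a Hugging Face chat model).

from collections import Counter

corpus = [
    "Joseph Weizenbaum built ELIZA, an early chatbot, in the 1960s.",
    "Retrieval Augmented Generation adds retrieved documents to the prompt.",
    "Nefertiti Bot is an artwork by Nora al-Badri.",
]

def score(query: str, doc: str) -> int:
    """Count overlapping words between query and document (toy retrieval)."""
    q = Counter(query.lower().split())
    d = Counter(doc.lower().split())
    return sum((q & d).values())

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents that share the most words with the query."""
    return sorted(corpus, key=lambda doc: score(query, doc), reverse=True)[:k]

def generate(prompt: str) -> str:
    """Placeholder for a real LLM call; here we only describe what would be sent."""
    return f"[model answer based on a prompt of {len(prompt)} characters]"

def rag_answer(question: str) -> str:
    """Assemble retrieved context plus the question into one augmented prompt."""
    context = "\n".join(retrieve(question))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return generate(prompt)

if __name__ == "__main__":
    print(rag_answer("Who made the Nefertiti Bot?"))
</code>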
===== Tuesday: Human and Non-Human Personas =====

  * Introduction to the course and overview of the week
  * **Group Work**
    * Session Zero: What is the persona of the AI you are using? What does it expect from the user?
  * 5-minute presentation by each group

  * Artist presentation & discussion by Nora Al-Badri. [[https://www.nora-al-badri.de/works-index]]
    * The Other Nefertiti
    * Nefertiti Bot
    * Fossil Futures
    * Babylonian Visions

=== Afternoon ===

  * Technological Persona. Input by Felix Stalder

  * **Group Work:** Reading, Presenting, Discussion
    * Weizenbaum, Joseph. Computer Power and Human Reason: From Judgment to Calculation. San Francisco: W.H. Freeman and Company, 1976. [[https://cyborgdigitalculture.files.wordpress.com/2013/09/24-weizenbaum-03.pdf|Introduction, 1-16]]
    * Reeves, Byron, and Clifford Nass. The Media Equation: How People Treat Computers, Television and New Media like Real People and Places. New York: Cambridge University Press, 1996. {{ ::the_media_equation_how_people_treat_computers_television_--_byron_reeves_clifford_ivar_nass.pdf |PDF}}
      * Chapter 15. Voices, 171-180
    * Niederberger, Shusha. “[[https://aprja.net//article/view/140449|Calling the User: Interpellation and Narration of User Subjectivity in Mastodon and Trans*Feminist Servers.]]” A Peer-Reviewed Journal About 12, no. 1 (September 7, 2023): 177–91.
    * Lewis, Jason Edward, Noelani Arista, Archer Pechawis, and Suzanne Kite. “[[https://jods.mitpress.mit.edu/pub/lewis-arista-pechawis-kite/release/1|Making Kin with the Machines.]]” Journal of Design and Science, July 16, 2018. (Read until the end of “Aloha as moral discipline”, and then also “Resisting Reduction: An Indigenous Path Forward”.)

**Art Works**

  * Cornelia Sollfrank: Thoughts of a Server, Sound piece, 2024. [[https://soundcloud.com/purplenoise2018/thouhgts-of-a-server]]

===== Wednesday: Artistic Approaches and Post-Colonial Perspectives =====

=== Morning ===

**Artistic Works**

  * !Mediengruppe Bitnik, Ashley Madison. Angels at Work. 2016, [[https://wwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwww.bitnik.org/a]]
  * Giacomo Meli, Infinite Conversation, 2022
    * [[https://jamez.it/project/the-infinite-conversation/]]
    * [[https://www.infiniteconversation.com/]]
  * Lauren Lee McCarthy
    * Lauren, 2023, [[https://lauren-mccarthy.com/LAUREN]]
    * Someone, [[https://lauren-mccarthy.com/SOMEONE]] (NA)
  * Mikala Hyldig Dal: Eco Oracle, [[https://www.mikala-dal.art/]] (NA)
  * Tega Brain, Alex Nathanson and Benedetta Piantella. Solar Protocol, [[https://solarprotocol.net/index.html]] (FS)
  * Feileacan Kirkbride McCormick and Sofia Crespo. Entangled Others, [[https://entangledothers.studio]] (NA)
  * Nora Al Badri, Nefertiti Bot, [[https://www.nora-al-badri.de/works-index#nefertitibot]]
  * Stephanie Dinkins, Not the Only One, 2018, [[https://www.stephaniedinkins.com/ntoo.html]] (FS)
  * Mimi Ọnụọha. Library of Missing Datasets, Version 1.0, 2016, [[https://mimionuoha.com/the-library-of-missing-datasets]] (NA)
  * Caroline Sinders, Feminist Data Set, 2017 — Current, [[https://carolinesinders.com/feminist-data-set/]]
  * Linda Dounia Rebeiz, Once Upon A Garden, [[https://onceuponagarden.xyz/]]
  * Holly Herndon: holly+ and xhairymutantx, [[https://www.youtube.com/watch?v=yZe5fnFB-ZE]] (FS & NA)
  * AJL: Voicing Erasure, 2020, [[https://www.ajl.org/voicing-erasure]] (FS)
  * Maya Indira Ganesh: A is for another, [[https://aisforanother.net/pages/project.html]]

**Group Work**

  * Introduction
  * First Session

=== Afternoon ===

Input by Nora Al-Badri: **Post-Colonial Perspectives & who is speaking**

**Further reading:**

  * AI Myths: Ethics guidelines will save us (2020), [[https://www.aimyths.org/ethics-guidelines-will-save-us]]
  * Arun, Chinmayi. 2020. “AI and the Global South: Designing for Other Worlds.” In The Oxford Handbook of Ethics of AI, edited by Markus Dirk Dubber, Frank Pasquale, and Sunit Das. Oxford Handbooks Series. New York, NY: Oxford University Press.
  * T.C. Boyle, "Water Music", 1981
  * María do Mar Castro Varela / Nikita Dhawan. 2015. “Postkoloniale Theorie. Eine kritische Einführung” [Postcolonial Theory: A Critical Introduction]. Transcript Verlag.
  * Mejias, Ulises A., and Nick Couldry. 2019. “Datafication.” Internet Policy Review 8 (4).
  * Mohamed, Shakir, Marie-Therese Png, and William Isaac. 2020. “Decolonial AI: Decolonial Theory as Sociotechnical Foresight in Artificial Intelligence.” Philosophy & Technology 33 (4): 659–84. [[https://doi.org/10.1007/s13347-020-00405-8]]
  * George Orwell, "Burmese Days", 1934
  * Shankar, Shreya, Yoni Halpern, Eric Breck, James Atwood, Jimbo Wilson, and D. Sculley. 2017. “No Classification without Representation: Assessing Geodiversity Issues in Open Data Sets for the Developing World.” Presented at the NIPS 2017 Workshop on Machine Learning for the Developing World.
  * Scheuerman, Morgan Klaus, Madeleine Pape, and Alex Hanna. 2021. “Auto-Essentialization: Gender in Automated Facial Analysis as Extended Colonial Project.” Big Data & Society 8 (2): 205395172110537.

**Group Work**

  * Second Session, 120 Minutes

===== Thursday: Subjectivity and Personas =====

=== Morning ===

Short texts: read associatively and present one sentence from each.

https://drive.google.com/drive/folders/12kZiSpMozZaky9dVodt22OnW4pxuGtgm

  * Gayatri Spivak: Can the Subaltern Speak?, 1989
  * Aimé Césaire: Discourse on Colonialism, 1950
  * Tuck/Yang: Decolonization is Not a Metaphor, 2012
  * Frantz Fanon: The Wretched of the Earth, 1961
  * Edward Said: Orientalism, 1978

**Group Work**

  * Session Three

=== Afternoon ===

Guests:

**Dr. [[https://ets.ethz.ch/people/person-detail.MzM5NzY0.TGlzdC80NTY5LDIwMTgyMzYxMDk=.html|Kebene Wodajo]]**, Lecturer at the Department of Humanities, Social and Political Sciences, ETHZ

**Subjectivity, Personhood, and AI Systems: Afro-Communitarian Gaze**

The modernist gaze narrates the story of data and data-driven technologies, such as AI systems, through an individualistic, state-centric, and market-oriented lens. Within this narrative, a person is conceptualised as an individual data subject who exercises control and retains rights over their data, while relying on state-centric frameworks for protection—or so the prevailing rhetoric suggests. But what if we were to shift our perspective, to cast our gaze towards the 'otherwise', towards the non-mainstream, towards pluralistic ways of seeing, being, and becoming? What if we viewed data and data-driven technologies in their full complexity—not merely as products or commodities, but as intricate sociotechnical, material, discursive, and more-than-human phenomena? This lecture invites students to embark together on this journey towards the 'otherwise', guided by Afro-communitarian perspectives. This viewpoint begins with 'we' rather than 'I', emphasises multiplicity over singularity, and foregrounds multiple ontologies and diverse forms of becoming. Adopting this gaze neither erases individuality nor diminishes the pursuit of equality, fairness, and transparency in AI systems. Rather, these concepts are illuminated in views from the 'otherwise'. They are understood not merely as claims grounded in legally prescribed rights but as necessities arising from the inherent multiplicity within individuality itself—captured vividly in various Afro-communitarian principles, from the well-known 'I am because we are' to '//Namummaa//'. This gaze remains deeply attentive to personhood as entangled within plurality and relationality, and to subjectivity as a form of irreducibly emergent becoming.

**Further Readings**:

  * Mhlambi, S. (2020). [[https://perma.cc/Q5ZL-TTD8|From rationality to relationality: Ubuntu as an ethical and human rights framework for artificial intelligence governance]]. //Carr Center for Human Rights Policy//, Discussion Paper Series 2020-009.
  * Law, John (2015). [[https://doi.org/10.1080/1600910x.2015.1020066|What's wrong with a one-world world?]] //Distinktion: Journal of Social Theory//, 16(1), pp. 126–139.

**Dr. [[https://ranjodhdhaliwal.com/|Ranjodh Singh Dhaliwal]]**, Professor, Digital Humanities, Artificial Intelligence and Media Studies, University of Basel

**The Perplexities of Persona and Personhood: On LLMs, Stereotypes, and Design Methods**

Across industry and academia, a curious trend has emerged. From OpenAI to law firms and sociologists to policy analysts, several expert groups have started using large language models (LLMs) to model human perspectives. This lecture seeks to understand, contextualise, and investigate this phenomenon. We will look at the technical substrates of chatbots, which make them ripe for the use of 'personas', a prompt engineering technique that instantiates different 'bots' from the same underlying AI model. What are the social, cultural, political, and affective implications of this world full of bots with personas? Staying with this question, Dhaliwal will then trace the spread of persona-bots in academic and corporate methodologies. Why and how, he asks, do these communities expect (or desire) their bots to stand in for actual humans – be they sociological subjects or democratic actors – and how do such modes of thought succeed and fail?
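As a concrete illustration of the persona technique named in the abstract, here is a minimal Python sketch. The persona definitions and the ''complete()'' placeholder are invented for illustration; a real setup would route the same message list to an actual chat-completion endpoint. The point is that the 'different bots' share one model and differ only in their system prompt.

<code python>
# Minimal sketch of "persona" prompting: the same underlying model is steered into
# different bots purely by the system prompt. complete() is a placeholder standing in
# for any chat-completion endpoint; names and prompts here are invented examples.

PERSONAS = {
    "curator": "You are a museum curator. Answer cautiously and raise provenance concerns.",
    "survey_respondent": "You are a 34-year-old urban commuter answering a mobility survey.",
}

def complete(messages: list[dict]) -> str:
    """Placeholder for a real LLM call; here we only describe what would be sent."""
    system = messages[0]["content"]
    user = messages[-1]["content"]
    return f"[reply shaped by persona: {system[:40]}... to question: {user!r}]"

def ask(persona: str, question: str) -> str:
    """Instantiate a persona-bot from the shared model by prepending a system prompt."""
    messages = [
        {"role": "system", "content": PERSONAS[persona]},
        {"role": "user", "content": question},
    ]
    return complete(messages)

if __name__ == "__main__":
    for name in PERSONAS:
        print(name, "->", ask(name, "Should this artifact be returned?"))
</code>

Seen this way, the lecture's question is what it means to treat the outputs of such thinly differentiated bots as stand-ins for sociological subjects or survey respondents.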

**Further Readings**

  * Bisbee, James, Joshua D. Clinton, Cassy Dorff, Brenton Kenkel, and Jennifer M. Larson. “Synthetic Replacements for Human Survey Data? The Perils of Large Language Models.” //Political Analysis// 32, no. 4 (October 2024): 401–16. [[https://doi.org/10.1017/pan.2024.5]].
  * Blair, Margaret M. “Corporate Personhood and the Corporate Persona Symposium: In the Boardroom.” //University of Illinois Law Review// 2013, no. 3 (2013): 785–820.

===== Friday: Group Work and Presentations =====

**Group Work**

  * Session Four: finishing up & preparing the presentation

**Presentations**: 10-minute presentation and 10-minute discussion per group