Imagining Computing Futures and Mitigating Algorithmic Harm: Conversations Between Artistic Disciplines and Computing. CSCW ’24.

Angela Schöpke-Gonzalez, Justin Wyss-Gallifent, Charli Brissey, Steph Jordan, Libby Hemphill. (2024). Imagining Computing Futures and Mitigating Algorithmic Harm: Conversations Between Artistic Disciplines and Computing. CSCW Companion 2024: 672–674.

Available online at https://dl.acm.org/doi/pdf/10.1145/3678884.3687140


ABSTRACT
This SIG invites interdisciplinary researchers in ethics, computing, and the arts; data professionals; and arts practitioners into a collective reflection on how these disciplines can work together to mitigate algorithmic harm. As we explore this topic, we recognize the historical and ongoing marginalization of artistic disciplines in terms of credit and funding, and we therefore consider not only what computing and data professions can gain from the arts, but also what artistic disciplines can gain from working with computing and data professions toward the shared goal of mitigating algorithmic harm. This SIG invites reflection on how arts-based methods can help answer the following questions:
(1) What agency and responsibility do data professionals, as individuals, have for mitigating algorithmic harm in their day-to-day workflows?
(2) How does a data professional’s individual agency and responsibility relate to the collectives (e.g., institutions, groups of colleagues, families, employers) that they are a part of?
(3) What agency and responsibility do collectives have for mitigating algorithmic harm?
(4) What can artistic disciplines gain from working with computing and data professions to mitigate algorithmic harm?
During this SIG, we will introduce the disciplinary contexts necessary for our collective discussion, facilitate a movement improvisation score we developed called “On The Perils of Poorly Chosen Sorting Algorithms”, and lead a reflective discussion on how arts-based methods can invite data professionals to imagine new data workflows that mitigate algorithmic harm.