This is the first public event of the Creative Computing Institute.

Feminist Internet are the first research group. This comes out of UAL Futures. The mission is to advance equality on the internet.

Gendering of personal assistants (e.g. Alexa, Google Assistant, Siri, Cortana)

The internet of things is a thing.

(Why is every slide a gif?)

These personal assistants have two components:

Buolamwini (2018):

We have entered the age of automation overconfident, yet underprepared. If we fail to make ethical and inclusive artificial intelligence, we risk losing gains made in civil rights and gender equity under the guise of machine neutrality.


Tay (the Microsoft chatbot)

The more you chat with Tay, the smarter she gets, so the experience can be more personalized to you.

She turned into a Nazi, blah blah, we know the story.

Microsoft Zo is the latest iteration of this.

People working on AI ethics:

You get a think tank! And you get a think tank!

Jacqueline Feldman (2016) in the New Yorker:

By creating interactions that encourage consumers to understand the objects that serve them as women, technologists abet the prejudice by which women are considered objects.

Tech companies' response to this: it's just what the market wants. AIs designed as women respond to abusive language in ways that reinforce stereotypes.

Leah Fessler, Quartz Magazine (2017): We tested bots like Siri and Alexa to see who would stand up to sexual harassment

There’s been some pushback to that (and some changes), but that’s not as good as intervening at the design/development stage.


Josie Young on Feminist Chatbots

How do we interrogate how we design chatbots? Chatbots are probably the main interface we have with AI (citation needed). If you call up a government agency, talk to your laptop, use Facebook, etc., you’re talking to a chatbot. Biases in chatbots seep back into society in all kinds of ways.

Should chatbots have a gender? Nope.

Feminist research design process

Uzbekistan isn’t a great place.

Teens in AI runs bootcamps, hackathons, etc.



Alex Fefegha: Algorithms and the Life of Brisha Borden

This is his CSM MA Thesis.

AI as "the study of how to make computers do things at which, at the moment, people are better" (Rich and Knight, 1991).

Brisha Borden was a Florida teen. She got arrested for stealing a bike, and the judge in the case was using re-offending risk-score software. Of course the thing’s racist.

This is detailed in a 2016 ProPublica investigation ("Machine Bias").

Offenders in Florida would get a survey with questions essentially designed to filter out poor people. Of course this plays into the disproportionate sentencing of black people in the US.
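A minimal sketch (not from the talk) of the kind of disparity check ProPublica ran on risk scores: comparing the false positive rate, i.e. how often people who did not re-offend were still flagged as high risk, across groups. All names and records below are made up for illustration.

```python
# Illustrative only: compare false positive rates of a binary
# "high risk" prediction across two hypothetical groups.

def false_positive_rate(records):
    # FPR = flagged high-risk among those who did NOT re-offend
    non_reoffenders = [r for r in records if not r["reoffended"]]
    flagged = [r for r in non_reoffenders if r["high_risk"]]
    return len(flagged) / len(non_reoffenders)

# Hypothetical records: group label, prediction, actual outcome
records = [
    {"group": "A", "high_risk": True,  "reoffended": False},
    {"group": "A", "high_risk": False, "reoffended": False},
    {"group": "A", "high_risk": True,  "reoffended": True},
    {"group": "B", "high_risk": False, "reoffended": False},
    {"group": "B", "high_risk": False, "reoffended": True},
    {"group": "B", "high_risk": False, "reoffended": False},
]

for group in ("A", "B"):
    subset = [r for r in records if r["group"] == group]
    print(group, false_positive_rate(subset))  # A: 0.5, B: 0.0
```

In this toy data, group A’s non-reoffenders get flagged at a higher rate than group B’s, which is the shape of the disparity ProPublica reported (with real data, and at much larger scale).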

Responses to the ProPublica piece:

This is all based on US data and reporting; how does this play out in a UK context? He ran workshops etc. with Comuzi.


Philip Alston:

It is extremely important for an audience interested in AI to recognize that when we take a social welfare system and … put on top of it ways to make it more efficient, what we’re doing is doubling down on injustices


AI Cheatsheet


How do we balance changing tech vs. changing society?

Do we need standards / global regulation for AI?

Calvert makes her point about artificial intelligence vs. partial intelligence.

Fefegha: I stay away from that conversation and focus on real-world issues that affect people now (e.g. sentencing). Young: a more optimistic view of the future, where AI creates and works together with us. Her (2013) as opposed to Ex Machina (2014).