At re:publica this year I will be giving my first ever public talk about my PhD topic, titled ‘The Problem with Trolleys’, in which I will describe what I think the problem with the Trolley Problem is in its application to the development of ethics in self-driving cars.
How science represents the real world can be cute to the point of frustration. In 7th grade mathematics you have problems like:
“If six men do a piece of work in 19 days, how many days will it take for 4 men to do the same work when two men are off on paternity leave for four months?”
Well, of course there was no such thing then as men taking paternity leave. But you can’t help but think about the universe of such a problem. What was the work? Were all the men the same, did they do the work in the same way, wasn’t one of them better than the rest and therefore the leader of the pack who got to decide what they would do on their day off?
Here is the definition of machine learning according to one of the pioneers of machine learning, Tom M. Mitchell:
“A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P if its performance at tasks in T, as measured by P, improves with experience E”
This can be difficult for a social scientist to parse because you’re curious about the experience E and the experiencer of experience E. What is this E? Who or what is experiencing E? What are the conditions that make E possible? How can we study E? And who set the standard of performance P? For a computer scientist, experience E itself is not that important; rather, how E is achieved, sustained and improved on is the important part. How science develops these problem-stories becomes an indicator of its narrativising of the world: a world that needs to be fixed.
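Read as an engineer would, Mitchell’s definition maps directly onto code. Here is a toy sketch of my own (not drawn from Mitchell or any of the texts discussed): the task T is classifying numbers against a threshold the learner cannot see, the experience E is a set of labelled examples, and the performance P is accuracy on a held-out test set, which improves as E grows.

```python
import random

random.seed(0)

# Task T: decide whether a number lies above an unknown threshold.
# Experience E: labelled examples of the form (x, x > THRESHOLD).
# Performance P: accuracy on a held-out test set.

THRESHOLD = 50.0  # hidden from the learner

def experience(n):
    """Generate n labelled examples -- the 'experience E'."""
    return [(x, x > THRESHOLD)
            for x in (random.uniform(0, 100) for _ in range(n))]

def learn(examples):
    """A crude learner: split the line at the midpoint between the
    largest 'below' example and the smallest 'above' example."""
    below = [x for x, is_above in examples if not is_above]
    above = [x for x, is_above in examples if is_above]
    if not below or not above:
        return 0.0  # no information either side: guess blindly
    return (max(below) + min(above)) / 2

def performance(guess, test_set):
    """Performance measure P: fraction of test examples classified correctly."""
    return sum((x > guess) == label for x, label in test_set) / len(test_set)

test_set = experience(1000)
for n in (2, 20, 200):
    guess = learn(experience(n))
    print(f"E = {n:3d} examples -> P = {performance(guess, test_set):.3f}")
```

The point of the sketch is only that, within this framing, “learning” is nothing more than P going up as E grows; the questions the social scientist asks about E have no place in the code at all.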
This definition is the beginning of the framing of ethics in an autonomous vehicle. Ethics becomes an engineering problem to be solved by logical-probabilities executed and analysed by machine learning algorithms. (TBC)
When you’re curating a program for yourself at an event or conference you’re often doing so consciously and conscientiously: there are things you need to see or attend for work, or for something new you need to wrap your head around. Then there are those times when it seems like you have no agenda except entertainment and pleasure, which doesn’t mean, however, that your curated program is serendipitous or magical. This is what this week’s Berlinale is for me. I found myself curating one part of my program with some expected resonances: three films involving female protagonists reconstructing or re-discovering the past, and in doing so visiting the unstable ground between, and in the creation of, fiction and non-fiction:
1. Kate Plays Christine. Robert Greene. 2016.
In 1974 in Sarasota, Florida, a 29 year old newscaster, Christine Chubbuck, shot herself, fatally, on live TV. In 2015, an actress, Kate Lyn Sheil, prepares to recreate that moment and the film follows her journey.
2. A Magical Substance Flows Into Me. Jumana Manna. 2016.
In the 1930s, Robert Lachmann, a German ethnomusicologist, had a radio show featuring “Oriental”, i.e. Palestinian, music. In 2014, Jumana Manna, a Palestinian artist, travels around Israel and Palestine playing recordings from the old shows and recording contemporary versions. What do these songs sound like now when performed by Moroccan, Kurdish, or Yemenite Jews, by Samaritans, members of the urban and rural Palestinian communities, Bedouins and Coptic Christians?
3. The Watermelon Woman. Cheryl Dunye. 1996.
Cheryl is a young black woman working at a video store. She becomes curious about black women playing stereotypical ‘mammies’ in films from the 1930s and 1940s. She sets out to discover one who is known only as the Watermelon Woman, a black lesbian actress who had an affair with a white woman director….
I’m excited to see them all and will be writing about them here.
A name from an email, a name that I recognise as distinctly East African, Kenyan to be exact, buzzes in your head. It goes round and round because it is unusual and melodic. The name belongs to someone who attended a talk I gave at an event some years before. I gave her my email ID and said ‘write to me if you have any questions, or if you want to talk more, I’m great on email and terrible on Facebook’. However, she has only ever reached out to me to ask me to accept her LinkedIn request. [I have an inbox filter set to identify the word LinkedIn in a subject line and push those messages to the trash folder.] It occurs to me that the two most plaintive cries of our times are: “you’re breaking up!” and “Teresa Wambui sent you a LinkedIn request.” I imagine a long line of LinkedIn requests waiting patiently to be accepted, long-suffering and hopeful like unattractive people on a dating site. I ignore all of those requests, because they aren’t really requests; they are intrusions generated en masse by someone else not reading the fine print, or for that matter, what’s on the box itself. I curse her and everyone who doesn’t know what the default means, that there is a default setting on things. Perhaps even on the world as you encounter it. Like the world that seemed too much for Rohith Vemula to struggle on with any further. The stardust of his dreams catches in your throat and you think about every single way that caste privilege and power is casually and not-casually implicated in your ideas of the world, your self.
My mother, after being called North-Indian-Lower-Caste by the maamis in madsaars to the point where her name was changed on her own wedding invitation card to sound more South Indian and Brahmin, has become a naturalised Tam Brahm. I can hear it in her English and her Tamil. For years she was judged and teased for not being able to produce the perfectly set curd or sambar. Personally I applaud her for this, though I know it has been the source of much self-doubt for her. Of course every last tyrannical, Brahminical, madsaar-wearing maami wanted her to be their doctor, and she gently and respectfully helped them reach the end.
There’s the way the Brahminical self is asserted, usually jokingly, about our gradual lapse into modernity. From eating beef in restaurants, to bringing cooked beef into the house, and the granddaughter of the no-beef-in-my-house grandmother producing the finest erchi oliyathatu ever. From rank alcoholism and domestic violence to genteel wine tasting tours of Napa. Marrying lower castes, Christians. No Muslims yet, but who knows. Some never marrying at all.
Then there are the smart “Paapan” genes, shorthand for a combination of privilege, access, pressure and expectation to become doctors and/or engineers who will eventually live The Good Life in America, far away from the heat and dust of Chennai, visiting only to look in on old parents and expose American-born children to their roots. It’s a little perverse, like spitting on your grandmother’s diamond earrings, to choose something else, something outlandish like Cultural Studies, Gender Studies, Activism.
Do young people have to die in India to make a point? First there was Jyoti Pandey and now Rohith Vemula. It seems that they do. The work of politics, however, is harder and more personal, and it’s something that I think you do in private, in the small gestures that no one sees. It is in questioning origin stories and speech, in recognising that what you’ve come to believe are personal choices may really be capitulations to conditioning and pressure. The work on the self doesn’t stop if you want to live a considered, sensitive life.
the act of deliberately attempting to destroy a person’s reputation by defamatory remarks
I could write about the auburn-haired woman who I sit across from at work, the one with the tics and the lazy eye. She is an only child with that peculiar sense of phantom wholeness only children have. People think she is a bureaucrat, and it may be that I am the only one who can sense the evil lurking in her. She doesn’t take risks, which isn’t necessarily a bad thing; the world needs people who can look at things rationally and calmly for a long time before acting. She is someone out of a book written by Lionel Shriver about a family full of broken people. She would be the dark horse – or the roan maybe – quietly spinning lies and deceit in the corner and all the while seeming to be the most gentle. What is it like to get into the head of a character that you dislike and yet feel empathy for? I think I would write this character falling in love with a boy much below her class – these things are very important to the English, did I mention she was English – and doing madcap things with him, like walking naked down the high street and almost getting arrested for it.
I could write the heartbreaking story of my best friend who fell apart from anger following the tragic death of her roommate from breast cancer. The roommate was one of those unlucky young women – 31 when she was diagnosed – who unknowingly harboured a lump like a dark grudge. She was diagnosed and dead within six months. It was six months of my friend visiting her in the hospital, accompanying her to chemotherapy, comforting the boyfriend and the girl’s family. My friend couldn’t bring herself to attend the funeral or the memorial. She was so wrapped up in her own grief, it seemed at the time, that she couldn’t reach out to the roommate’s husband (the boyfriend married her while she was dying in hospital), sister or family. She mourned for weeks, and the decision to stay in their shared apartment took an additional toll on her. Months later, when I couldn’t keep quiet about it any longer, I asked her what she needed to do to get over it and move on from the roommate’s death. A lot of Old Monk later, sprawled across the divan staring at the ceiling fan, a tear rolled out of the corner of her eye and she said that she fucking hated them all – her dead roommate’s family, that is. They didn’t really thank her enough for all that she did and she has never forgiven them for it. She was furious that she had been “passed over” without enough praise and thanks. She felt used. She wasn’t going to get over it until they thanked her properly for everything she had done.
There is the woman with the watery grey eyes and a gaze so steady that I believe it gives her the power of endurance, as if she could stand in a light blizzard in her mustard yellow coat and not move for hours. She arrives at her studio-office every morning, which is on the ground floor of my apartment building. She is an illustrator for school textbooks. Every morning she has müsli and a café latte at the Swiss bakery and then goes to her studio after checking for mail, sometimes pausing to look at the junk mail. She makes herself a second cup of coffee, usually black because the milk has gone bad. She sniffs at the milk every morning. She is at her desk by five minutes to 9 o’clock. She spends a few minutes rearranging her papers, checking on her pencils, scanner, computer. She then gets down to work and does not move for three hours. She refills her coffee cup in a sort of daze and then returns to her desk. She is fixed, but fluid, for those three hours, sitting in one place but appearing to be very far away somewhere inside herself, or her work. At twelve o’clock she goes for a walk, and to eat lunch if she hasn’t brought a sandwich with her. She comes back looking alert, bright-eyed, and flushed as if she has been exerting herself by walking up a hill; the approach to our building, however, is flat. Her afternoon routine is in complete contrast to her morning one in that there is no routine. It’s difficult to know how she will spend her afternoons. Some days she just reads, other days she types furiously at her computer, and some days she browses through what must be clickbait – there’s a sort of glazed look in her eyes as her index finger clicks through at a regular beat. The days she reads she revisits some of the morning’s deep torpor, unmoving, lost in what she is doing. And then there are those days when she lies on the chaise longue and cries.
This is preceded by a lazy pacing of the studio, staring at the floor and then collapsing into the chair with deep sobs that seem to come from somewhere very deep within and wrack her narrow frame. She can cry for hours on end, sustaining herself through a particular rhythm: each long, slow wave of tears builds to a crescendo, as if the memories or feelings come faster and harder like contractions; they take hold of her and she seems possessed, crying at a loud, fevered pace. Then it ebbs and you can see her gasping for breath, realising her own tiredness, eventually stopping with a series of whimpers and falling back till the next fresh wave crashes over her. Hours later, exhausted, she falls into a deep sleep. She leaves the studio every evening at five o’clock.
a state or condition of individuals or society characterized by a breakdown or absence of social norms and values, as in the case of uprooted people.
There is a sound in this city, a soft, constant tattoo of hundreds of thousands of fingertips on keyboards. Ragged bitten grimy short Vietnamese precision manicured false brittle not enough calcium in the diet not enough vitamin D pitted nicotine marked. A global army beating its retreat from some unbearable now. Also, wires, fans, battery heat, dead metal hums that are no language, just pure industrial noise and the perfect background score for falling in between the cracks. Your Etsy-ing, your artisanal gins and lake swimming are cute, but I think the logic of despair entails a long moment of flailing in full view on a super fast connection.
I’m trying to remember when I first heard the phrase ‘the ethics of algorithms’ (TEoA) and why it bothers me. It sounded like the triumph of a branding exercise. TEoA has dogged me; it has been the intellectual equivalent of an adorable little puppy that snaps at your ankles in encouragement to play, and then opens its eyes wide to melt you with love and fake neediness, saying take me home please, Mommy. (Maybe I’m referring to a cat and not a dog; perhaps only cats and toddlers are capable of such machinations? A puppy-cat!). The ‘ethics of algorithms’ rolls off the tongue nicely, sounds important and meaningful, and captures the degrees of concern and outrage we feel about the powerful role that computer algorithms have in society, and will continue to have.
I recently started a DPhil (a kind of PhD) in big data and ethics (longer version here), so I’m somewhat invested in the phrase TEoA because that is what people often assume my work is about. It isn’t. However, there are people working on the ethics of algorithms, and the good people at CIHR recently published a paper on it which I think you should read right after you finish reading my post, because the paper is a good description of the way algorithms work in our quantified society. I’m not referring to any of these things in my work, however. What I’m working on is the algorithms of ethics. By this I mean that I’m going to think about the ethics first and understand how they work, where they come from, and what that ethics [can] mean in the context of big data.
The reason why I think TEoA is a cute but needy puppycat, as described above: it is too atomising, deterministic even, and an outcome rather than a starting point. Why would you start with the algorithm, which is the outcome of long chains of technical, scientific, legal and economic events, and not with a point earlier on in the process of its development? I think the focus on the outcome, the algorithm, is also indicative of how we think of ethics as outcomes, rather than as a series of processes and negotiations. Or the fact that we think about ethics as outcomes has led to a focus on algorithms. Both, possibly.
I don’t think algorithms have ethics; people have ethics. Algorithms govern, perhaps; they make decisions, but I don’t think they have or make ethics. Of course saying ‘the ethics of algorithms’ isn’t to be taken literally; it’s the assertion that algorithms that make decisions have been programmed (to learn) how to do so by humans (who have the capacity for ethical reasoning). However, the phrase is misleading because it makes it seem as if algorithms are in fact making ethical decisions. At the same time, an algorithm cannot function without making some kind of judgment (not moral judgments, though algorithms can do things that have moral implications); it wouldn’t be able to proceed to the next step otherwise. But does this amount to ethics? I suppose it depends on what ethics you’re subscribing to, but I’d say no.
‘Ethics of algorithms’ could also refer to the ethical features or properties of algorithms, not the ethics that algorithms are assumed to produce. Kraemer, van Overveld and Peterson have a paper on this here, based on medical imaging analysis. This work suggests that algorithms have value judgments baked into their functioning, but concludes that ethics is the domain of systems design(ers) and that users should have more control over the outcomes of algorithmic functioning.
My work begins with the hypothesis that principles based on classical ethics (like the oft-quoted Trolley Problem in the context of autonomous vehicles, something that I believe was developed so that journalists could write their stories) are not really appropriate to big data environments (I refer to this as a crisis of ethics), and aims to come up with alternate approaches to thinking about ethics. Along the way I hope to develop methods to study ethics in quantified environments, not just come up with “the answer”. (Thankfully, this is a humanities PhD so there is no “right answer”). I’m also pretty sure I will have many new puppycats snapping at my ankles, excited to play.
I have also discovered that there is a strange hybrid creature called PuppyCat. Here is a weird animated video with puppycats.
On Friday, October 30th, I presented my new doctoral work to a small group of scholars and engaged political people who came together for an evening event around CIHR’s Fellows’ Day. This post is a summary of some of the ideas discussed there.
Every time someone says “but what about the ethics of…” they’re often referring to a personal architecture of how right and wrong stack up, or of how they think accountability must be pursued; or merely surfacing the outrageous, or the potentially criminal or harmful. Then this personal morality is applied to ethical crises and termed “the ethics of” without necessarily applying any ethical rules to it. It’s a combination of truthiness and a sense of fair play, and, if you actually work on info-tech issues, perhaps a little more awareness of the stakes, positions and laws. My doctoral work is about developing a new conceptual framework with which to think about what ethics are in quantified environments.
Most of us can identify the crises in quantified environments – breaches, hacks, leaks, privacy violations, the possible, future implications of devolving control to autonomous or semi-autonomous vehicles – and these result in moral questions. And everyone has a different moral approach to these things, and yet there is an attempt to appeal to some universal logic of safety, well-being, care and accountability. I argue that this is near impossible. Carolyn Culbertson, reflecting on the development of ethics in the work of Judith Butler, says what I’m trying to say, more eloquently:
“Our beginning-points in ethics are, for the most part, not simply our own. And to the extent that they are, we should want to question how effective these foundations will be in guiding our actions and relationships with others in a world that is even less one’s own. Moral philosophy—and I use that term broadly to mean the way that we think through how best to live our lives—is always in some sense culturally and historically situated. This fact haunts the universalist aspirations of moral philosophy—again, broadly understood—which aims to come up with, if not moral absolutes, at least moral principles that are not merely private idiosyncrasies.” 
I argue that human moral reasoning cannot be directly mapped onto resolving ethical crises in quantified and autonomous environments because of the size, number of actors, complexity, dynamism and plastic nature of these environments. How can ethics (by which I’m mostly referring to consequentialist, virtue-ethics and deontological approaches; although there are others, most derive from these) based on individual moral responsibility and virtues manifest in, and be applicable to, distributed networks of the human [agency, intention, affect and action], the post-human and the software/machine?
Ethics are expected to be broad, overarching, resilient guidelines based on norms repeatedly practiced and universally applied. But attempting to achieve this in quantified environments results in what I’m referring to as a crisis of ethics. This crisis of ethics is the beginning of a new conceptual and methodological approach for how to think about the place and work of ethics in quantified environments, not an indefensible set of ethical values for quantified environments. I will start fleshing out these crises: of consciousness, of care, of accountability and of uncertainty. There may be others.
Yet, the feminist philosophers ask: why these morals and ethics anyway? What makes ethics and moral reasoning from patriarchal, Western Judeo-Christian codifications in religion and the law valid? What is the baggage of these approaches, and is it possible to escape the Church and the Father? What are the ethics that develop through affect? Is there an ethics in notions of collectivities, distribution, trust, sharing? I’m waiting to dive into the work of ethicists and philosophers like Sara Ahmed and Judith Butler (for starters) to find out. And, as Alex Galloway might say, the ethics are made by the protocols, not by humans. What then? (Did I say I was going to do this in three years?)
More updates as and when. I’m happy to talk or participate in events; and share the details of empirical work after April 2016. This is a part-time PhD at the Institute of Culture and Aesthetics of Media at Leuphana University in Lüneburg, Germany. I continue to work full-time at Tactical Tech.
‘Big data’ is a term that has general, widespread use and familiarity; however, its ubiquity also makes it opaque. The word ‘big’ is misleading, for it tends to indicate size or speed, which are features of this phenomenon but do not reveal anything about how it came to be either large or fast. ‘Data’ is equally misleading because nearly every technical part of the internet runs on the creation, modification and exchange of data. There is nothing about the phrase ‘big data’ that tells us what it really is. So I use the terms ‘quantification’ and ‘quantified environments’ interchangeably with ‘big data’. ‘Quantified environment’ refers to a specific aspect of digital environments, i.e. quantification, which is made possible through specific technology infrastructures and business and legal arrangements that are both visible and invisible. The use of the phrase ‘quantification’ also indicates a subtle but real shift to the ‘attention economy’, where every single digital action is quantified within an advertising-driven business model. The phrase ‘quantified environments’ is also an entry into discussing the social, political, technical and infrastructural aspects of digital ecosystems through specific case studies.
Culbertson, Carolyn (2013). ‘The ethics of relationality: Judith Butler and social critique’. Continental Philosophy Review 46: 449–463.