HSURAE




I’m currently working on a neural network that categorizes things based on differences instead of similarities. I call this an empathy machine, after the paradox that confines empathy: the paradox of difference and similarity (and its constitutive paradoxes: self/other, interior/exterior, subject/object, mind/body, visible/invisible, life/death, light/dark, masculine/feminine, etc.). Studies have shown that empathy is more fluid when the other resembles us, and more solid when the other appears different.
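
For the curious, here is one minimal, speculative way such a machine might be sketched in code. This is my own illustrative assumption, not the project’s actual architecture: each item is represented not by its own features but by its distances to a small set of reference “others” (a dissimilarity representation), and categories are then drawn within that difference space. All names here (reference_set, difference_representation) are hypothetical placeholders.

    # Speculative sketch: represent items by how they *differ* from a set of
    # reference others, then categorize with a nearest-centroid rule in that
    # difference space. Toy data only; not the project's actual model.
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical toy data: two loose clusters in an 8-dimensional feature space.
    X = np.vstack([rng.normal(0, 1, (50, 8)), rng.normal(3, 1, (50, 8))])
    y = np.array([0] * 50 + [1] * 50)

    # A handful of reference "others" against which every item is measured.
    reference_set = X[rng.choice(len(X), size=10, replace=False)]

    def difference_representation(items, references):
        """Describe each item by its distances to the references,
        i.e. by how it differs from them, not by what it is."""
        return np.linalg.norm(items[:, None, :] - references[None, :, :], axis=-1)

    D = difference_representation(X, reference_set)

    # Group items by proximity to per-category centroids in the difference space.
    centroids = np.stack([D[y == c].mean(axis=0) for c in np.unique(y)])
    predictions = np.argmin(
        np.linalg.norm(D[:, None, :] - centroids[None, :, :], axis=-1), axis=1
    )
    print("agreement with original labels:", (predictions == y).mean())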

In his book Against Empathy, the psychologist Paul Bloom notes that the greatest pitfall of empathy is that it leads to a tribal mentality. Most people can only empathize with others they can imagine themselves into. Nearly a decade before Bloom’s book, Saidiya Hartman theorized that this limit of empathy is in fact a feature and not a bug. In order to empathize, one displaces the other to feel their pain in a form that is legible to oneself. Unsurprisingly, this legibility hinges upon sameness. Hartman’s critique is that empathy requires the negation and obliteration of the other into a phantasy of ourselves.


Our current theories of empathy are inconsistent and contested; we praise it today and cancel it tomorrow. Yet it is a concept we hear all too often, because at the bottom of empathy is the question: how do we care for one another? We cannot dispense with it. We need to find ways to delay, displace, and dub its meaning beyond the absolute individuation of subjectivity, beyond a projection based upon sameness.

At the heart of this project is the desire to hack into the kinds of arrangements (code) of categories (logic) that lead to a proprietary relationship with the earth and with each other. What are the metaphysical assumptions that precondition dispossession? How can we design more equitable systems without careful consideration of the metaphysics that run the systems that run the systems? This work builds upon Deleuzian (and some Derridean) metaphysics of difference in training our prodigy, in the hopes that when an artificial general intelligence finally arrives, it will empathize on much higher orders than we ever could.




© 2023 HSURAE