There’s a lot of emotion involved with social media posts, images, and songs; all of those events have a context, and that’s a context algorithms can’t understand. Humans, however, can. A Social Media Break Up Coordinator goes through a client’s social media accounts with the client, and helps block, unfriend, untag, and ‘mute’ old relationships and bad memories.
“Social Media Break Up Coordinator” will be presented as a lecture on my performance, which is debuting at Babycastles on Nov 21st and 22nd. I am a user researcher and UX designer at IBM Watson. I work in conversational analytics, and I help design systems and software layouts for chat bots. I spend a lot of time thinking about the way systems and AIs ‘think’ about and relate to words, the context of conversations, and emotions. The way our social media systems are structured, there’s a fair amount of outlining and ‘work’ users have to do when it comes to reporting online harassment, changing privacy settings, etc. But in that same vein, there’s a lot the systems do for us: from suggesting users and suggesting content, to the display of content, messaging, images, and events. All of those ‘things’ come from content created by users, and are treated as data. But what is the context of it all? Facebook can see if a user removes a relationship status, but there isn’t a button or an algorithm for break ups. How do you tell a system your child died or your heart was broken?
That’s where the Social Media Break Up Coordinator comes in. I will perform a series of paid events for my customers, who will also sign a legally binding contract. These services range from untagging the client and the ex or chosen person in specific images; moving the ex/person and their friends onto a special list (to mute them, see less of their content, etc.); crafting a series of emotionally neutral messages to alert the other person that they are being unfriended/unfollowed/blocked for specific reasons; to taking the ex’s number, deleting it from the client’s cellphone, and holding onto it for an amount of time agreed upon between the client and myself.
Effectively, the way to emotionally navigate really ‘sticky’ situations on social media is human intervention. Human emotions and relationships are complex and complicated, and they require context, very deep context, to understand. That is something an algorithm simply cannot do. As more and more of users’ lives are lived and shared online, I’m interested in exploring emotional labor and the creation of new digital services and jobs to aid in this area of human relationships that have gone awry or ended in death, especially where algorithms begin to fail with this kind of content.
The talk/lecture at CCC would feature documentation of my performance and research that I’ve done around blocking, muting, and a wide variety of social interactions (from abusive arguments with Gamergaters to navigating spaces with exes: ex-boyfriends, girlfriends, best friends, and co-workers). I will also lecture on the structure of Facebook, Twitter, and Instagram, and how the UI fails users first when it comes to ‘de-coupling’ or ‘un-friending’. I’ll then lecture on how the algorithms fail, because all of the interactions on these platforms are designed toward users interacting, not toward users completely separating from other users they may be closely connected to through friend and professional networks. What is the solution? My hypothesis is a series of newly created human roles, almost like a freelance life coach, to help users get their digital lives in ‘order’.