Rachel Kisela
Algorithmic Autonomy: How to regain control over your social media experience
Updated: Feb 25, 2021
My TikTok feed knows a lot about me. It knows that I like watching videos about cooking, fashion, and tech (coincidentally, that’s how I discovered Herd). By analyzing my likes, shares, and watch time, it also figured out that I was in a romantic relationship. What TikTok couldn’t mathematically detect was when my partner and I hit a rough patch. The next time I opened TikTok, I felt emotionally bombarded with videos of cute couple activities and date ideas. All I wanted to do was turn it off and go back to recipe tutorials to take my mind off what was causing me distress.
What TikTok missed in this key moment was algorithmic autonomy: the user’s ability to decide what kinds of content to consume. Without autonomy, the user hands control of the content they consume entirely to the platform’s algorithm, no matter what positive or negative emotions that content might stir up. And of course, in order to exert control over an algorithm, the user first needs to know how it works, a concept called algorithmic transparency. Transparency allows users to see for themselves how their digital identity is constructed, and how that identity shapes what content appears as they scroll through social media. Providing transparency is not always easy in a world where many of these algorithms are driven by complex artificial intelligence, but it is essential to establishing trust with users and allowing them to regain control over their digital experience.
When control of a user experience is handed over to an algorithm, and the user’s knowledge of how that algorithm operates is restricted, a number of negative experiences can follow. Dr. Taina Bucher, a media researcher at the University of Copenhagen, outlines six of them, ranging from individual to societal consequences.
Profiling identity happens when the algorithm generates a digital identity for you: an assumption about what kinds of content you want to view, attached to your account. For example, if a male user interested in makeup content signs up for a social media account, the algorithm might take longer to recognize that interest than it would for a female user, or fail to recognize it altogether.
Whoa moments are those times when you feel as though social media has profiled you a little too accurately. Have you ever scrolled through social media and seen an advertisement for a product you were literally just discussing with a friend? Without insight into where and how the site produced this advertisement, users may grow distrustful of the site’s data collection policies.
Faulty prediction is when the user is aware, and annoyed, that the algorithm is making an incorrect choice in selecting content to display. In my example, TikTok made a faulty prediction that I wanted to see content about relationships.
The popularity game describes all the ways users attempt to have their content picked up by an algorithm and shown to other people. Working with an algorithm whose inner workings are mostly unknown can induce stress as users try to attain a certain number of likes or followers.
Cruel connections happen when an algorithm makes an inhumane choice in what to display to a user, causing distress. Displaying photos of a recently deceased family member, for example, in a “year in review” photo slideshow could be considered a cruel connection.
Ruined friendships is the idea that social media algorithms actually shape the friendships they mediate. Have you ever been scrolling through social media and seen a post from someone you haven’t talked to in years, inspiring you to reach out? Or maybe an algorithm has neglected to display a certain friend’s posts in favor of more popular ones, leading you to forget about or neglect your friendship. The strength or weakness of your social connections may be strongly influenced by the algorithmic choices behind your favorite social media sites.
As you may have realized, one key step towards mitigating these negative experiences is to build transparency and autonomy into algorithmic design. That’s why the team at Herd has been hard at work developing one of its standout features, Feed Autonomy. Feed Autonomy puts you, the user, fully in control of what kind of content appears on your feed. When you open the Feed Autonomy page, you are presented with a list of topics, each with a frequency slider you can adjust. You can return to this page at any time to retune your feed as your interests and preferences change. You don’t have to turn over control of your mental health and wellbeing to an algorithm; Herd’s Feed Autonomy lets you make healthy choices about how to engage with social media, because you know yourself best.
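Herd hasn’t published the internals of Feed Autonomy, but to make the idea concrete, here is a minimal sketch of how per-topic frequency sliders could re-rank a feed. Everything in it (the names TopicWeights, Post, and rankFeed, and the scoring scheme) is hypothetical and illustrative, not Herd’s actual implementation:

```typescript
// Hypothetical sketch of slider-driven feed ranking; not Herd's real API.
type TopicWeights = Partial<Record<string, number>>; // slider values, 0.0 (never) to 1.0 (often)

interface Post {
  id: string;
  topics: string[]; // topics the post is tagged with
  baseScore: number; // engagement-based relevance from the platform
}

// Re-rank posts so user-chosen slider values, not engagement alone, decide the order.
function rankFeed(posts: Post[], weights: TopicWeights): Post[] {
  return posts
    .map((post) => {
      // Average the user's slider values across the post's topics;
      // topics without a slider default to a neutral 0.5.
      const topicScore =
        post.topics.reduce((sum, t) => sum + (weights[t] ?? 0.5), 0) /
        Math.max(post.topics.length, 1);
      return { post, score: post.baseScore * topicScore };
    })
    .filter(({ score }) => score > 0) // a slider set to 0 removes the topic entirely
    .sort((a, b) => b.score - a.score)
    .map(({ post }) => post);
}

// Example: dial relationships down to zero, keep cooking high.
const feed = rankFeed(
  [
    { id: "a", topics: ["cooking"], baseScore: 0.9 },
    { id: "b", topics: ["relationships"], baseScore: 0.95 },
  ],
  { cooking: 1.0, relationships: 0.0 },
);
// feed contains only post "a": the relationships post is filtered out
// even though its engagement-based score was higher.
```

The design choice that matters in this sketch is that the slider gets the last word: a topic set to zero is filtered out entirely, so an engagement-optimized score can never override the user’s stated preference, which is exactly the autonomy TikTok’s feed lacked in the opening example.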
Tarleton Gillespie, a prominent social media researcher and Cornell professor, notes in a 2016 article that algorithms don’t just choose which cultural products to display; they are active contributors to culture themselves. Algorithms have the power to sway democratic elections through the political content they surface, or to popularize the next viral trend. Social media algorithms have even contributed to the rising problem of eating disorders by denying users the autonomy to avoid triggering content. However, Herd’s Feed Autonomy feature is a bold step in the right direction. Herd’s co-founders Mady Dewey and Ali Howard have re-imagined how you can interact with social media algorithms in a way that heals, energizes, and motivates you to be your best self.
View more of Rachel's work here.