
Demystifying Algorithms

What are their ramifications beyond computational mathematics?

Human fingers controlled by electronic devices. Image provided by Paul Spella of The Atlantic

“Why are algorithms important?” I asked my best friend: Google, of course. The search engine has been my ever-loving, supportive buddy since childhood. It, too, however, has its inadequacies. While it toiled for an extraordinary 0.55 seconds to retrieve hundreds of millions of answers to that question, the insular tool never acknowledged that it is, technically, powered by algorithms itself.


An algorithm is a set of instructions that allows our devices to accomplish both simple and complex tasks. Google uses algorithms to “retrieve data from its search index and instantly deliver the best possible results for a query,” according to the Search Engine Journal, and those results are then ranked by relevance. Humans follow algorithms too, whether carrying groceries from the car to the kitchen countertop or working through an intricate recipe. Algorithms, however, are now subject to much polarization. Once encountered mainly in the form of the quadratic formula, they are now accused of being racist and of plunging social media users into deep and possibly incendiary feedback loops.
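To make the idea concrete, here is a deliberately simplified Python sketch of "ranking by relevance": each document gets a score based on how many query terms it contains, and the results are sorted by that score. The function name, the scoring rule, and the sample documents are all invented for illustration; Google's actual ranking pipeline is vastly more sophisticated.

```python
# A toy illustration of "ranking by relevance": score each document by how many
# query terms it contains, then sort from highest to lowest score.
# This is a simplified sketch, not Google's actual ranking algorithm.

def rank_results(query: str, documents: list[str]) -> list[str]:
    """Return documents ordered by a naive relevance score."""
    query_terms = query.lower().split()

    def score(doc: str) -> int:
        words = doc.lower().split()
        # Count how many times each query term appears in the document.
        return sum(words.count(term) for term in query_terms)

    # Sort from most to least relevant.
    return sorted(documents, key=score, reverse=True)


docs = [
    "Algorithms are step-by-step instructions for solving a problem.",
    "A recipe is an everyday example of an algorithm.",
    "The weather today is sunny and mild.",
]
for result in rank_results("why are algorithms important", docs):
    print(result)
```

Even this tiny example has the two ingredients the article describes: a rule for retrieving candidates and a rule for ordering them by relevance.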


On social media, algorithms are instructions designed to determine what users want to watch. According to an article in the Washington Post, Facebook has moved away from the ubiquitous clickbait articles it once relied on to keep people on the site longer and now prioritizes content that drives interaction, such as viral memes and posts that friends and family make or comment on. These feeds are also highly personalized. If a person holds very particular values, the Facebook algorithm can recognize that and serve them videos and articles that align with their views and already carry significant interaction, with many comments and likes. Suggestions predicated on our interests and biases are not wholly insidious; they introduce users to a fan club of fellow followers of composers, journalists, or socialites. But they also draw users into heated conversations in the comments, inducing stress and anxiety or calcifying our biases.
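As a rough sketch of how an engagement-driven feed might work, the snippet below scores posts by a mix of overall popularity and a single user's topic affinities. Everything in it, including the Post record, the weights, and the rank_feed function, is hypothetical; real recommendation systems rely on far more elaborate machine-learned models. The point is only to show how "more of what you already engaged with" can narrow a feed over time.

```python
# A minimal sketch of an engagement-driven feed. The data model and weights are
# invented for illustration, not taken from any real platform.

from dataclasses import dataclass


@dataclass
class Post:
    topic: str
    likes: int
    comments: int


def rank_feed(posts: list[Post], interest_weights: dict[str, float]) -> list[Post]:
    """Order posts by predicted engagement for one user."""
    def score(post: Post) -> float:
        # Popularity signal: total interactions on the post.
        popularity = post.likes + 2 * post.comments
        # Personalization signal: how often this user engaged with the topic before.
        affinity = interest_weights.get(post.topic, 0.1)
        return popularity * affinity

    return sorted(posts, key=score, reverse=True)


# A user who mostly clicks on political posts sees them ranked first, which in
# turn generates more clicks and an even stronger weight: the feedback loop.
feed = rank_feed(
    [Post("politics", 120, 45), Post("cooking", 300, 10), Post("sports", 90, 5)],
    interest_weights={"politics": 0.9, "cooking": 0.2},
)
print([p.topic for p in feed])
```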


Outside social media, the technology is generating even more controversy. In an interview for an MIT Technology Review article, Yeshimabeit Milner of Data for Black Lives stated, “There’s a long history of data being weaponized against Black communities.” Much as social media algorithms predict what content a user will watch based on data, predictive policing tools hypothesize which areas and people are more likely to be associated with crime. “Location-based algorithms,” the article continues, “draw on links between places, events, and historical crime rates to predict where and when crimes are more likely to happen—for example, in certain weather conditions or at large sporting events.” Other tools are based on personal characteristics: gender, history of substance abuse, and criminal record. The problem arises not in the algorithm itself but in the data it is trained on. Predicting the recidivism (likelihood of returning to jail) of a Black convict is not wholly accurate, as the article explains, because African-Americans are arrested and re-arrested at disproportionately higher rates than any other group in America. In other words, the algorithms entrench bias while diminishing our capability to think and find answers ourselves. The tools are convenient, but they keep us complacent and unwilling to seek or enjoy anything outside of what is familiar or what confirms our notions about people, places, and events.
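The dynamic the article describes can be sketched in a few lines of code. The simulation below is purely hypothetical: the neighborhoods, the arrest counts, and the patrol-allocation rule are invented, and it is not a model of any real predictive policing product. It only illustrates how a tool trained on skewed historical arrest data can keep directing attention back to the same over-policed places, even when the underlying crime rates are identical.

```python
# A toy simulation of a biased-data feedback loop. All numbers are invented
# purely for illustration.

import random

random.seed(0)

# Two neighborhoods with the same true crime rate, but neighborhood A starts
# with more recorded arrests because it was patrolled more heavily in the past.
true_crime_rate = {"A": 0.05, "B": 0.05}
recorded_arrests = {"A": 50, "B": 10}

for year in range(5):
    # "Prediction": allocate 100 patrols in proportion to historical arrest counts.
    total = sum(recorded_arrests.values())
    patrols = {n: 100 * recorded_arrests[n] / total for n in recorded_arrests}

    # More patrols in a neighborhood means more of its (equal) crime gets
    # recorded, so the skew in the data reinforces itself year after year.
    for n in recorded_arrests:
        observed = sum(random.random() < true_crime_rate[n] for _ in range(int(patrols[n])))
        recorded_arrests[n] += observed

    shares = {n: round(p) for n, p in patrols.items()}
    print("Year", year, "patrol share:", shares)
```

Running the sketch, the patrol share stays heavily skewed toward neighborhood A even though both neighborhoods were given identical crime rates: the historical data, not reality, drives the prediction.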

©2024 International Review in STEM (IRIS)
