
Global Social Media Impact Study


Algorithms and homogenization

By Elisabetta Costa, on 8 November 2012

 

The general goal of our project is to investigate the social effects of Facebook in seven different countries. We want to understand how Facebook, as a global phenomenon, is locally appropriated in seven different small towns. It is comparative research, so we are interested in the differences and similarities that emerge across these places. The argument that Facebook is always an invention and creation of its users (Miller) does not imply that the social network lacks an infrastructure or architecture of its own, one that produces some sort of homogenization.

In the field of Internet studies, some scholars aim to find out how technologies shape and constitute people’s everyday lives by studying the algorithms and code that determine the way the technology works. This kind of research can be very intriguing. For example, yesterday I posted something on my Facebook page. Apparently it was not very appealing to my Facebook friends, and for this reason I did not receive any feedback. After almost thirty minutes I published a second post, more provocative and (apparently) more attractive, as within a few minutes I received a lot of comments. At the same time, Facebook kept displaying the first, uninteresting post in my friends’ news feed. I thought: “Facebook is so sweet! It doesn’t want me to think that nobody is interested in what I write. It tries to convince my friends to make some comments, or at least click ‘like’.”

Facebook is built to prompt people to write comments and give feedback to their friends. If I post something, Facebook will help me to receive comments. Facebook has been designed to build networks and create social relationships: the more we connect, the more profit Facebook makes.
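To make this concrete, here is a purely hypothetical sketch, in Python, of the kind of ranking logic I am speculating about: a feed that keeps resurfacing a post that has received no feedback, so that every post eventually attracts some response. None of the names, weights or formulas below come from Facebook; they are invented solely for illustration.

```python
# A toy, invented feed-ranking heuristic. It does NOT describe Facebook's
# actual algorithm; it only illustrates the behaviour speculated about above.

from dataclasses import dataclass


@dataclass
class Post:
    post_id: str
    age_minutes: float  # time since the post was published
    comments: int       # feedback received so far
    likes: int


def feed_score(post: Post) -> float:
    """Score a post for a friend's news feed (hypothetical heuristic).

    Fresher posts rank higher, but a post with no feedback at all gets an
    extra boost so it keeps being shown instead of quietly disappearing.
    """
    freshness = 1.0 / (1.0 + post.age_minutes / 60.0)
    engagement = post.comments * 2 + post.likes
    no_feedback_boost = 1.5 if engagement == 0 else 1.0
    return freshness * no_feedback_boost + 0.1 * engagement


posts = [
    Post("first, ignored post", age_minutes=40, comments=0, likes=0),
    Post("second, provocative post", age_minutes=10, comments=12, likes=5),
]

# Rank the feed: the ignored post is still kept visible rather than buried.
for post in sorted(posts, key=feed_score, reverse=True):
    print(f"{post.post_id}: score {feed_score(post):.2f}")
```

Under this invented scoring rule, the older post with no comments is not simply pushed out of sight by the newer, popular one, which is the pattern I thought I observed in my friends’ news feeds.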

Facebook does lead people to act in certain ways and not in others. If algorithms and code are the central mechanism of social network sites, it is surely very interesting to investigate the intentions of computer scientists and designers. Technologies, material objects and digital platforms always embed the intentions of their producers. However, Facebook is presumably appropriated in ways that were not intended or expected by its designers. This has been the case for every artifact, material object and technology in the course of history.

I am very excited to find out about infrastructures and architectures such as Facebook’s algorithms. But infrastructures are always used differently in different contexts. For this reason, I believe that knowing how the algorithms work does not, by itself, tell us much about the social impact of Facebook. Rather, a comparative research project on the use of social network sites can give us far more insight into the regularities and the cultural homogenization brought by Facebook in different social contexts.

 

One Response to “Algorithms and homogenization”

  • 1
    Brett Creider wrote on 7 May 2016:

    Whatever became of this study? I cannot help but feel the algorithms of the internet are creating homogenization by attempting to quantify what makes people tick and selling them what they presumably want to hear. Is this why everyone is so mean in typical comments? Why are they so extreme in those comments? Because if every ad or suggested page or post, maybe even search results, are tailored to “preach to the choir”, what would ever prompt someone to think differently?

    Pardon my incoherence in proving my point, but if the internet relies on clicks for profit, those clicks will always be tailored to appeal to that particular user’s presuppositions.

    And if the vast majority of internet users are unaware of these algorithms, then they have no reason to believe they are biased, and so it seems all the more objectively true… thus the mean, prescriptive nature of 99% of online commentary on any topic.