SAME SAME, BUT DIFFERENT

Silicon Valley has designed algorithms to reflect your biases, not disrupt them

Literally reflecting your world view. Image: Reuters/Steve Marcus

Silicon Valley dominates the internet—and that prevents us from learning more deeply about other people, cultures, and places. To support richer understandings of one another across our differences, we need to redesign social media networks and search systems to better represent diverse cultural and political perspectives.

The most prominent and globally used social media networks and search engines—Facebook and Google—are produced and shaped by engineers from corporations based in Europe and North America. As a result, technologies used by nearly 2 billion people worldwide reflect the design perspectives of the limited few from the West who have power over how these systems are developed.

The problem goes far beyond filter bubbles: Linguistic and cultural diversity is disappearing around the world. It is estimated that half of the world’s 7,000 languages will vanish in the next 100 years. And as those languages disappear, the cultural meanings and values they carry will be lost with them.

If the internet is to counter rather than aid this trend, we need to imagine technological connectivity in ways that are centered on the beliefs, values, and agendas of diverse peoples—or risk retreating deeper into global homogeneity. The internet can be a tool to support diverse cultural knowledge rather than reinforce the voices of those who already hold power and privilege.

A quick web search reveals the pervasiveness of this issue in a very simple way. For example, ahead of a planned visit to the West African nation of Cameroon, I typed the country name into Google and was disappointed by the results.

Like my Facebook newsfeed, my search results privileged information that Google’s algorithms (incorrectly) thought I would find most relevant to my interests: Anglo-centric, Western takes on Cameroon, far removed from the country itself. These results started with Wikipedia and the CIA’s World Factbook page, both written in English, and most likely by people from parts of the world with far greater economic and political power than Cameroon itself. I could not find a web page from Cameroon in my search results until I reached page three. This is notable because fewer than 10% of users reach page two of a Google search, and fewer still reach page three.

When seeking information about an African nation, my top results were communicated through a Western lens; they were also biased toward the location where I conducted the search, my nationality, and my language preferences. Given the smaller user base and less-accessible internet in Cameroon, there are naturally fewer pages available from the country to answer my query. But shouldn’t Google weigh the value of a place or culture’s own information above that of a foreign body? Just because this option may be the easiest for me to understand doesn’t mean that it should be the perspective I am offered.
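To make that question concrete, here is a minimal sketch, in Python, of what weighting a place’s own information might look like. Everything in it is hypothetical: the Result fields, the rerank function, and the local_boost knob are invented for illustration, and say nothing about how Google’s ranking actually works.

```python
from dataclasses import dataclass

@dataclass
class Result:
    url: str
    relevance: float  # the engine's original topical score, 0 to 1
    country: str      # inferred country of origin (e.g., from a ccTLD like .cm)

def rerank(results, query_country, local_boost=0.5):
    """Re-order results so pages from the queried country rise.

    local_boost is a hypothetical tuning knob: how much weight local
    provenance gets relative to plain topical relevance.
    """
    def score(r):
        return r.relevance + (local_boost if r.country == query_country else 0.0)
    return sorted(results, key=score, reverse=True)

# The rough shape of the Cameroon search described above, reduced to three results.
results = [
    Result("https://en.wikipedia.org/wiki/Cameroon", 0.95, "US"),
    Result("https://www.cia.gov/the-world-factbook/countries/cameroon/", 0.90, "US"),
    Result("https://www.prc.cm/en", 0.60, "CM"),  # a page from Cameroon itself
]

for r in rerank(results, query_country="CM"):
    print(r.url)
```

Even a crude provenance bonus like this would surface Cameroonian pages on page one rather than page three. The hard part, of course, is deciding who gets to set the knob.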

If the perfect search engine is to be “like the mind of God,” as Google co-founder Sergey Brin once put it, should it not support the perspectives of the voiceless in our world? Instead of blindly accepting the internet systems that shape our world, we can examine the cultural and political values that shape the design and engineering of these technologies.

Designing for global equality on the internet

The building blocks of the internet—interfaces, databases, and algorithms—are socially constructed; they represent the values, visions, and belief systems of their designers. Instead of seeing platforms such as Facebook, Google, and Twitter as universal interfaces that serve all equally, it is time to think about how they can be localized and innovated upon in relation to their diverse user communities across the world.

It is time for the cultures of our world to take back the internet. It is worth remembering how important online communities were in the early digital days. The users in these environments had the power to shape how content was shared, between whom, and with what constraints; even Facebook started as a simple network serving Harvard undergraduates before expanding to other universities.

We must learn from cultural differences rather than give in to a misguided narrative of digital universality: the presumption that all people, regardless of place or culture, experience new technology in the same way. Embracing our differences means not taking users from different cultures and places for granted. We should design interfaces that shape how information is circulated and shared in ways that are consistent with the diverse cultures and communities that have begun to use internet technologies.

Having collaborated with indigenous communities across the world for the past 15 years to design and develop technologies that support their voices and objectives, I have learned to question my assumptions about how information is shared. For example, communities I have worked with have explained that the Western philosophy—embraced by hackers and Californian liberals—of making information public or “free” is actually at odds with their own traditions, in which knowledge is carefully guarded and circulated for the good of all people. My Zuni friends have explained to me that if their sacred knowledge falls into the wrong hands, it can bring curses upon the world. They see the protection of this information as part of their goal of helping all people, rather than just their own community.

Technology firms thus have the opportunity to give users the power to represent themselves via the digital platforms that have increasingly entered their lives. A recent Pew study concluded that over 54% of adults in the developing world now have internet access. Designing interfaces with these communities could allow these new users to share their experiences and insights with the wider world.

Algorithms as grassroots cultural expression

Many technologies are now encoded with Western biases. It is therefore of critical importance to open up the black box of personalization and search algorithms that drive platforms such as Google and Facebook.

We mistakenly speak of algorithms as untouchable and illusory. This is because they have been removed from our sight, delegated to carry out incredibly powerful tasks that affect our lives without our knowledge or control. Users must urge Facebook and Google to provide greater transparency around the technologies they use to order and structure how we experience information—and one another.

Instead, algorithms should be vehicles for grassroots cultural expression. User communities can categorize, organize, and design the models by which information is retrieved in various systems. Cultural knowledge often has a structure of its own: how people choose to define concepts and connect them with one another. We can think about algorithms in the same way.
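As a thought experiment, here is a toy sketch, again in Python, of what community-shaped retrieval could look like. The communities, concepts, and documents are all invented for illustration; the point is only that the same query follows different paths when the concept map belongs to the community rather than the platform.

```python
# Two hypothetical communities connect the same concept ("corn") differently.
community_graphs = {
    "community_a": {"corn": ["harvest", "ceremony"], "ceremony": ["elders"]},
    "community_b": {"corn": ["commodity", "export"], "export": ["prices"]},
}

# Invented document titles, indexed by concept.
documents = {
    "harvest": ["How the harvest is timed"],
    "ceremony": ["Preparing the corn ceremony"],
    "elders": ["Teachings of the elders"],
    "commodity": ["Corn futures explained"],
    "export": ["Export logistics"],
    "prices": ["Global corn prices"],
}

def retrieve(community, concept, depth=2):
    """Walk the community's own concept graph and gather documents.

    The same query surfaces different material depending on whose
    definitions and connections drive the walk.
    """
    graph = community_graphs[community]
    seen, frontier, found = {concept}, [concept], []
    for _ in range(depth):
        next_frontier = []
        for c in frontier:
            for neighbor in graph.get(c, []):
                if neighbor not in seen:
                    seen.add(neighbor)
                    next_frontier.append(neighbor)
                    found.extend(documents.get(neighbor, []))
        frontier = next_frontier
    return found

print(retrieve("community_a", "corn"))  # ceremony- and elder-centered results
print(retrieve("community_b", "corn"))  # market-centered results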

We must now turn our gaze to the vast range of communities and cultures implicated in the global technology revolution, and think about how technology can support, rather than blur, our cultural differences.