Algorithms and Counter-Culture

Anders was looking forward to his day in the city; he had meetings booked and lunch with a friend. He pulled on his ball cap, which carried a subtle design meant to confuse store cameras and the retail-surveillance algorithms behind them that analyse shopping habits. His coat did the same, and held a small device that blocked the signals of the iBeacons tracking his movement through a store. On his mobile he used IP-spoofing tools and a VPN.

There is a growing movement seeking to subvert these algorithms. It is not yet a subculture or a mainstream force, but a grassroots one, driven by a dawning understanding of the influence algorithms have across so many aspects of our lives. Some social media platforms are aware of this and often change their algorithms to counter these actions. It creates an intense feedback loop, one where pressure is mounting throughout the systems.

Anti-algorithm behaviour has been around for a long time, but it is becoming more common as time goes by. Where once it was largely relegated to hackers, good and bad, and the more technically knowledgeable, it is slipping into the general population and becoming a part of culture.

Understanding why this is happening helps us understand how culture as a whole is beginning to make subtle, invisible-hand-style moves toward the roles it does and doesn’t want algorithms to play. And as I’ve written before, the ultimate arbiter of all technologies is culture.

Even those who aren’t especially technology literate have come to understand how algorithms can divide society, pushing negative and fear-driven stories, clickbait, and ads that feel uncannily tied to something we just mentioned to a friend. They may not always understand that it is algorithms doing this; people just “know” something isn’t right.

But not all algorithms are nefarious or designed to do harm. Many assist us in our daily lives in ways that benefit us: finding the links we want on search engines, showing us the best routes to drive somewhere, managing complex business systems.
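To make the “best route” example concrete, here is a minimal sketch of Dijkstra’s shortest-path algorithm, the classic building block behind many routing features. The road network, place names and distances below are invented purely for illustration, not taken from any real mapping service.

```python
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra's algorithm: find the lowest-cost route between two nodes.

    `graph` maps each node to a list of (neighbour, cost) pairs.
    """
    # Priority queue of (cost so far, current node, path taken)
    queue = [(0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbour, step_cost in graph.get(node, []):
            if neighbour not in visited:
                heapq.heappush(queue, (cost + step_cost, neighbour, path + [neighbour]))
    return None  # no route found

# A toy road network (distances in km, purely illustrative)
roads = {
    "home":      [("highway", 5), ("back_road", 2)],
    "highway":   [("office", 10)],
    "back_road": [("village", 4)],
    "village":   [("office", 3)],
}

print(shortest_path(roads, "home", "office"))
# -> (9, ['home', 'back_road', 'village', 'office'])
```

The point is simply that the same family of techniques we bristle at in surveillance contexts is also what quietly gets us to work a few minutes faster.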

It is not these beneficial uses of algorithms that get the hairs on the back of our necks tingling and rile up our ire. It is when they are seen to intrude without our consent. Our brains are wired to dislike being watched. This comes from thousands of years of being on some other animal’s dinner menu; humans didn’t like being a snack for a sabre-toothed tiger.

We also seem more aware of the way algorithms, by promoting a meme or some other element of culture such as an idea or aesthetic, can speed up the formation of new cultural norms, norms that can influence larger groups far faster than at any point in history. This is part of the reason we see rapid formations of cultural divides, and why trends can come and go so quickly.

A trend fades so fast in the digital world because, once an algorithm detects that the prior trend is starting to fade away, it picks up on something else and amplifies the new trend instead. This also shapes how we use language and talk to one another, online and in the real world.
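A toy sketch of that logic is below. It is nothing like a real platform’s ranking system, which is far more complex and not public; it only illustrates how a simple engagement-growth rule can make a recommender drop a fading trend and amplify a rising one. All names, numbers and thresholds are invented.

```python
def pick_trend_to_amplify(engagement_history, window=3):
    """Return the trend whose recent engagement is growing fastest.

    `engagement_history` maps a trend name to a list of daily
    engagement counts, oldest first.
    """
    def growth(counts):
        recent = counts[-window:]
        if len(recent) < 2:
            return 0.0
        # Simple slope: change in engagement across the recent window
        return (recent[-1] - recent[0]) / (len(recent) - 1)

    # Amplify the trend with the steepest upward slope; fading trends
    # (negative slope) naturally fall to the bottom of the ranking.
    return max(engagement_history, key=lambda t: growth(engagement_history[t]))

history = {
    "last_week_meme": [900, 700, 400],  # fading
    "new_dance_clip": [50, 300, 800],   # rising
}

print(pick_trend_to_amplify(history))   # -> "new_dance_clip"
```

Even a rule this crude rewards whatever is accelerating right now, which is part of why the churn of trends feels relentless.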

This is how algorithms can change sociocultural power dynamics, from local and global politics to our friend groups and coworkers. If someone feels they have gained a powerful or useful insight surfaced by an algorithm, they may use it for social gain.

Aside from power dynamics, algorithms can influence social values and signalling. Through likes, shares, followers and the like, we come to feel that society has placed a higher, or lower, value on what we have created or shared. This complicates our social interactions, because those signals are also fleeting: what was loved by thousands today may be disliked, or ignored, tomorrow. This makes social dynamics harder to read, even amongst smaller groups.

Social media platforms have enormous power in influencing our morals and values. Even where humans are in the loop for moderation, they too are training the algorithms, just as every other participant does. This is not an area governments have tended to wade into, but it is one they are now having to address.

This is why we are seeing more anti-algorithm and anti-surveillance clothing, increased use of VPNs (Virtual Private Networks) among a growing slice of consumers, and people increasingly turning to more private communication tools such as email, WhatsApp, Signal and Telegram.

These actions put more pressure on those analysing the data and tweaking the algorithms, and on the algorithms that self-learn, which results in even more opacity within the algorithmic black boxes. It will be interesting to see how this plays out over the longer term.

Culture at scale doesn’t have to fully understand the nuances of an issue, especially one felt across multiple cultures. This is the invisible hand of culture at work, one of the cultural mechanisms we use as a survival tactic. There is a sort of tug-of-war going on now between sociocultural systems and algorithms. Even though it’s messy, my bet would be on culture, not the algorithms. But a counter-culture is evolving.
