You get trapped and hooked by algorithms

Companies are trying to keep you on their websites for as long as possible

Have a look at how they use clever algorithms for this.
During your lunch break you're looking up recipes for lasagna. You start watching a cooking video where 91-year-old Maria shares her famous lasagna recipe.
When it's finished, a new video is suggested automatically. It's about a grandmother baking cookies with her grandson. This is cute and funny, so you keep watching.
Automatically, the next video starts playing. It's called 'Baking with Tourette's and my brother'. You're curious and keep watching because it has a few million views.
Your suggestions keep getting more interesting. You start watching an inspiring video about a polio survivor in an iron lung. Something you've never thought about before.
What follows is a viral video about conspiracy theories.

It's getting dark outside, and you realise you've been watching for hours. And these videos have changed how you think about the world.

You just watched hours of random videos.
This is not your fault

The algorithm makes the video suggestions impossible to resist. It's no surprise that social media can feel like an addiction.

The consequence? Disinformation and extremist theories

There are dangerous unintended consequences. We become trapped in a ‘filter bubble’ and don’t get exposed to information that would challenge or broaden our worldview.
Algorithms are programmed to promote content that people are likely to interact with, such as shocking, extreme or hateful posts and videos.
These “network effects” make disinformation, conspiracy theories and extremist content go viral. Whether verified or not, these ideas can shape public opinion and impact the choices of policy-makers.
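To make "promote content that people are likely to interact with" concrete, here is a purely illustrative sketch in Python. The video titles, weights and scoring function are made up for this example and do not describe any real platform's system; they only show the core idea: a ranker that scores videos solely on predicted engagement will naturally push the most provocative content to the top.

```python
# Illustrative sketch only: a toy engagement-based ranker, not any platform's real code.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    predicted_watch_seconds: float  # model's guess at how long you'll watch
    predicted_shares: float         # model's guess at how often it gets shared

def engagement_score(video: Video) -> float:
    # Hypothetical weights: the ranker only rewards interaction, not accuracy.
    return 1.0 * video.predicted_watch_seconds + 30.0 * video.predicted_shares

def recommend(videos: list[Video], top_n: int = 3) -> list[Video]:
    # Sort purely by predicted engagement - shocking or extreme content that
    # keeps people watching and sharing floats to the top.
    return sorted(videos, key=engagement_score, reverse=True)[:top_n]

candidates = [
    Video("Maria's lasagna recipe", 240, 0.5),
    Video("Grandmother bakes cookies", 300, 1.2),
    Video("Viral conspiracy theory", 900, 8.0),
]
for v in recommend(candidates):
    print(v.title)
```

In this toy setup the conspiracy video wins every time, simply because it is predicted to hold attention longest and be shared most.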

What does a future without polarising technology look like?

In a future with the new EU Digital Services Act, users will have a real choice about how they want to see content displayed. They will be given a choice to opt in to personalised recommendation algorithms, which should be deactivated by default.

Users should be able to choose what they get to see, with advanced settings that let them influence the algorithms.

Let's find out more

You can imagine that these practices harm not only individuals, but also society...
Find out about the consequences for society in the next chapter