Young people need supportive adults who can talk about digital platforms and algorithms


(Illustration: Colourbox)

Insight

Published: 26.02.2026
Updated: 26.02.2026

Gilda Seddighi
Rebecca Lynn Radlick
Benedicte Nessa

The vast majority of young people today use TikTok, Instagram, and other algorithmically driven digital platforms as central sources of information, identity formation, and belonging.


However, vulnerable young people in particular may find themselves having to manage both the opportunities and risks of such exposure largely on their own. Adult professionals and volunteers can play an important role in supporting these young people in their encounters with the digital world—but they need to take the initiative to start the conversation.

Digital spaces as escape and community

I’ve used gaming to find friends, and over time I joined servers to meet people who play the same games as me. I’ve actively searched for environments online where I could talk to someone.
(“Aaliyah,” 22 years old)

In our research project DigCapabilities, we interviewed several dozen young people with immigrant backgrounds who are outside employment and education, as well as parents and professionals who work with them. One of them is “Aaliyah.” She experienced bullying and racism throughout her childhood and adolescence because of her appearance, without parents who could support her and without a school that intervened. Video games, online forums, and digital platforms such as TikTok and Instagram therefore became crucial for finding a space where she could feel safe and experience a sense of belonging.

Previous research shows that weak social relationships, experiences of bullying, and family-related challenges can be important factors behind young people dropping out of education and working life. For many young people, digital platforms therefore become a vital arena for safety, community, and belonging, and thus also important for mental health and well-being. At the same time, these platforms can play an ambiguous role: young people may receive support and feel a sense of belonging, but they also risk encountering harmful content without adult support.

Young people are left to figure out platforms on their own

Social media platforms are governed by algorithms. Who you are, whom you “follow,” and whom you interact with all shape the reality you are exposed to. At times, it is also difficult to understand why young people are shown particular types of content.

Young people learn how algorithms work partly through conversations with friends about the content that appears in their feeds. We therefore asked them what they think about the algorithms that shape what they see, which digital spaces they seek out, and who they turn to when they are exposed to offensive or harmful content.

Findings from the DigCapabilities project point to something critically important: young people with limited networks, or who have lost contact with friends and family, are more likely to have to deal with such exposure alone, often through trial and error.

This pattern is clearly visible in the case of “Ali,” another young person we spoke with. He needs help finding a job and updating his skills but does not have parents who can support him. Ali has gradually lost contact with his friends and has low trust in public services. Instead, he tries to navigate the internet on his own. Along the way, he has been scammed several times because he lacks both guidance and an overview of which courses or job opportunities he can actually trust:

I tried to find someone on YouTube who could help me start a business. I had questions that other people might have asked their father about. I found an influencer. Over time, I realized I didn’t agree with his perspective. I didn’t take the course, but I still had to pay him.

“Aaliyah” describes how she has had to cope with challenges on digital platforms on her own:

Since I was 12, I’ve tried to find someone to talk to online. But there were also several people who threatened to kill me if I didn’t get together with them or send them pictures. For a period, I was also exposed to a lot of violent content on Instagram. Every time I saw it, I felt so unwell that I had to cancel my plans for the day. I reported the content, but it continued to affect me.

Safe digital platforms

Sexualized videos, conspiracy theories, and extreme political or violent content are not merely watched; they create new risks that vulnerable young people must navigate alone. It is particularly concerning that none of the young people we spoke with said they had discussed such experiences with a professional adult, even though several of them see a psychologist or participate in low-threshold support services.

For young people who already live with discrimination, mental health challenges, weak social relationships, or low trust in institutions, algorithmically driven content can thus become an extension of their social exclusion.

Digital platforms must be designed to be safe and inclusive for young people

We believe that digital platforms must be designed to be safe and inclusive for young people. At the same time, attention must be directed toward the adults who encounter young people in everyday life. Professionals such as healthcare personnel, counselors, child welfare educators, and other adults in low-threshold and voluntary services with whom these young people have trusted relationships can initiate conversations about young people’s digital lives and how algorithms shape them.

For many of these young people, such adults are often the only ones who can listen, provide advice, and act as a trusted source of support. Without this help, young people are largely left alone to face digital systems that actively shape their worldviews, relationships, and future opportunities.

Original text published on Utrop here: https://www.utrop.no/plenum/ytringer/377092/