The Dangers of Social Media Algorithms: A Look at How TikTok and Other Platforms Are Impacting User Safety and Mental Health

Turning off social recommendation algorithms: A step-by-step guide

The school shooting in Vantaa has sparked a flood of content on social media platforms, particularly on TikTok. Even users who do not actively search for related content may encounter it in their feeds or recommended search terms because of the platforms' recommendation algorithms. These algorithms aim to show content that interests users, and they often prioritize provocative, emotionally charged material.

Recent studies have shown that social media recommendation algorithms can undermine user safety by promoting harmful content. For example, Yle tested what kind of content TikTok shows a depressed user and found that the platform served more mental health topics, videos related to eating disorders, and content encouraging dangerous weight-loss methods.

Social media companies have emphasized parents' responsibility to monitor their children's online activity. However, turning off personalization requires multiple clicks on most platforms, which makes limiting exposure to harmful content difficult. The EU's Digital Services Act aims to improve user safety and hold digital giants accountable for the content on their platforms.

Sitra has published instructions for adjusting the personalized feed on the major social media platforms. However, turning off personalization does not always prevent exposure to harmful content, as user experiences vary across platforms such as LinkedIn, Facebook, and Instagram. Snapchat and Pinterest received poor grades from Sitra for offering little control over their recommendation algorithms.

Overall, social media platforms need to take more responsibility in protecting users, especially underage users, from harmful content. Parents play a crucial role in supervising their children’s online activities and guiding them through the digital landscape. The regulation of digital platforms is necessary to ensure user safety and prevent the spread of illegal and harmful content online.

Social media companies must recognize that their recommendation algorithms have a significant impact on users' mental health and well-being. They should take steps toward creating safer spaces by controlling what kind of content is recommended on the basis of individual preferences.

Furthermore, parents should educate themselves about how these algorithms work so they can better protect their children from harmful or misleading information shared online.

In conclusion, while social media platforms continue to refine their recommendation algorithms, much work remains to be done to ensure user safety and limit the harm caused by dangerous or misleading content.
