We can all agree that Facebook and Instagram have a “rabbit hole” effect. Entering those platforms is easy; exiting them, thanks to the fear of missing out, can be harder than we think. This “rabbit hole” effect has an even greater impact on young minds, which are fed large amounts of information in a short time.
Think about how many times you have sat scrolling through memes, convinced that only 5 minutes had passed, when in reality you had been there for 30 minutes straight. This effect has now caught the attention of the European Commission, which has decided to investigate the problem in depth.
The European Union is concerned that Meta is failing to protect children from the addictive design of Instagram and Facebook. Thierry Breton, the EU commissioner for the internal market, who is leading the investigation, stated:
“We are not convinced that [Meta] has done enough to comply with the DSA obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans (...)”.
The DSA (Digital Services Act) requires online platforms such as those Meta owns to protect minors from prolonged exposure and from access to inappropriate content. Meta must safeguard young people by offering appropriate levels of safety and privacy measures. Failure to comply can result in Meta being held responsible and fined as much as 6% of its global revenue, and/or being forced to change its software.
The Commission also flagged a second issue: “The Commission is also concerned about age assurance and verification methods put in place by Meta.”
Meta has been held to account more and more in recent years. Several US school districts and state attorneys general have filed lawsuits over young users’ mental health, child safety, and privacy.
Meta’s recommendation algorithms are also a concern. A user might click on a single piece of harmful content, after which the algorithms may recommend similar material extensively, raising further worries about children’s mental health. Moreover, the age checks Meta provides are more of a formality than a real safeguard. These concerns are backed by recent studies showing that many children under 13 (the minimum age required by most social media platforms) are using social media, prompting further investigation.