Since 2010, data show a steady rise in suicide rates and anxiety among young people. It’s hard to ignore that this timeline aligns closely with the social media boom, especially the emergence of Facebook.
Mark Zuckerberg created Facebook in 2004, likely without anticipating the massive popularity it would achieve in the years to come, or the lasting impact it would have on our daily lives. Then came Instagram and WhatsApp—both later acquired by his company—followed by Twitter, TikTok, and the rise of likes, reels, tutorials, Get Ready With Me videos, and everyday life itself, now filtered through a screen.
Social media today spans all age groups. Parents who once scolded their kids for using their phones at the dinner table now send them reels on WhatsApp, stirring a pot with one hand while scrolling through TikTok with the other. Kids who used to play outside now often connect over Discord while shooting opponents in Counter Strike or spending entire nights listening to other young people talk on livestreams.
In the span of just twenty years, reality has shifted dramatically. And the consequences of that shift—perhaps long hidden beneath the excitement of new technologies—are only now beginning to surface.
In March 2026, a Los Angeles jury found Meta and YouTube liable in a case involving alleged mental health harm suffered by a young woman who developed a social media addiction from an early age, after beginning to use YouTube at age 6 and Instagram at age 9.
The ruling determined that the issue did not lie in the content itself, but in the platforms’ design, pointing to features like infinite scroll, autoplay, and constant notifications as mechanisms that encourage compulsive use.
As a result, the court ordered $3 million in damages, with Meta responsible for 70% and YouTube covering the remainder, while additional punitive damages are still under consideration.
This case could set an important precedent by establishing that digital platforms can be held accountable for the impact that their design has on users’ mental health—especially minors—and it opens the door to thousands of similar lawsuits already underway across the United States.
It all comes down to how we interact with social media and certain apps. Yes, they can be entertaining, distracting, and sometimes even informative, but it’s hard to ignore the reality: when you look at how many hours we spend on them, or think about the first thing we do when we wake up and the last thing we do before going to sleep, it can feel unsettling.
Recent data show that, globally, the average person spends about 6 hours and 43 minutes per day in front of screens—roughly 40% of their waking hours. In countries like Argentina, Brazil, and South Africa, that number climbs past 50%, meaning more than half of conscious time is spent on devices. Among teenagers, the figure is often even higher.
When it comes specifically to social media, users spend an average of 2 hours and 15 minutes daily—about 13% of their waking time. In places like Brazil, Chile, and South Africa, that number exceeds 20%, with teens again leading the trend.
A study indexed in SciELO found an association between increased time spent online and higher levels of anxiety and depression among adolescents, particularly among those who spend more than four hours a day connected.
At a stage of life defined by identity-building and emotional vulnerability, this level of consumption can directly affect mood, sleep, and social relationships. This makes it clear that the issue isn’t just about time—it’s about impact.
Along the same lines, studies cited by the Pew Research Center show that a growing share of teens report feeling “almost constantly online,” something that directly affects their emotional well-being.
So is it a coincidence that more and more people—regardless of religion, ethnicity, nationality, or age—seem so absorbed, so captivated, so deeply immersed in these platforms? Unfortunately, it’s not. And for those who design these systems, it’s not a surprise either. Evidence suggests that many of them are deliberately engineered to hold our attention.
Far from being a random occurrence, multiple studies point out that these platforms are built around reward systems and constant validation, combined with social and cultural dynamics that encourage intensive use that becomes difficult to interrupt.
The number of hours we spend on our phones is shaped by large teams of designers, UX specialists, developers, marketers, and psychologists. The goal of these apps is simple: to make them nearly impossible to put down. Sorry if you thought you were fully in control of your screen time: you may not be.
And this isn’t some conspiracy theory either: it’s been documented by the very people who run these companies. In a 2015 internal email, Mark Zuckerberg set a goal of increasing user time on the platform by 10%. When questioned about it in court, Zuckerberg acknowledged that teams used to be given those kinds of targets, although he claimed that approach is no longer how the company operates today, and described the email as “very old.” Whether or not those practices continue today, the takeaway is clear: user engagement appears to have been deliberately engineered rather than accidental.
And it’s not just about how long we spend on these platforms, it’s about what we see while we’re on them. This personalized bubble on the device you carry in your pocket is also precisely designed just for you.
The specific mix of content—kittens and makeup, design and food culture, philosophy and storytelling, film and fashion, politics, music and history—is what makes each person unique. And that’s exactly what the algorithm uses to serve you an endless stream of content that keeps you scrolling for hours without even realizing it. That, ultimately, is what convinced the jury: that the deliberate design behind excessive social media use should have legal consequences.
The problem is that, in ways that can resemble the effects of a drug, this can create dependency. An addiction that might seem harmless, but clearly isn’t.
The case of the woman who sued Meta and YouTube—and won $3 million—is reflected in the experiences of countless other children. They spend hours watching other kids play, while their own hands hold not toys, but phones. They spend even more time watching young girls or adult women in makeup tutorials, learning how to be more beautiful, more polished, more grown-up. Girls teaching other girls how to look like women.
Self-esteem, social relationships, and mental health are just a few of the areas deeply affected by social media use among children, teens, and adults.
For years, social media use has been normalized, and that normalization is part of the danger. If we don’t question it, we can’t address its consequences. If we don’t understand social media—and especially the companies that build, design, and constantly update it to keep us hooked—the same way we understand any other product with side effects, then we’re simply turning a blind eye to a problem that will only lead to more dependency.
They’re Not Just Addictive: They Also Put Children’s Safety at Risk
Unfortunately, social media has also been described as facilitating the exchange of sexual content—including, in some cases, content involving minors. While there is no evidence that this is the intent of the platforms’ creators, it does point to serious gaps in systems meant to protect young users.
This isn’t just a perception: it has been a central issue in recent lawsuits against Meta in the United States. During an investigation presented in court, researchers created fake profiles posing as minors on platforms like Instagram and Facebook. Within minutes, those accounts began receiving messages from adults sharing sexually explicit content. For prosecutors, this demonstrated not only failures in platform safety systems, but also how easily underage users can be exposed to situations involving abuse and exploitation within these environments.
As the rulings stack up, Mark Zuckerberg’s company isn’t just on the hook for the $3 million awarded to the woman who developed a social media addiction as a child; it’s now been ordered to pay an additional $375 million for putting the safety and mental health of other minors at risk.
The verdict was issued by a jury in New Mexico, which found that Meta violated consumer protection laws by withholding information about the risks associated with its platforms and failing to take adequate action to prevent minors from being exposed to harmful content.
Prosecutors argued that the company allowed predators to access and contact underage users for years, enabling situations that could escalate into real-world abuse. They also stated that Meta was internally aware of these risks, yet failed to implement effective safeguards or provide transparent warnings to users.
From the prosecution’s perspective, the ruling was described as a “historic victory,” emphasizing that the company prioritized growth and profit over the safety of children and teenagers.
These two cases are just the beginning. Roughly 1,500 families are reportedly preparing to file lawsuits against Meta this year, seeking not only financial compensation but also legal accountability for what they describe as a deliberately addictive design and a lack of meaningful safeguards across its platforms.
Both cases are part of a growing wave of litigation that directly challenges the social media business model, one that some analysts have compared to the historic lawsuits against the tobacco industry.
For the first time, courts are not just examining the content circulating on these platforms, but the structural design that encourages prolonged use and exposure to risk.
Still, Mark Zuckerberg has argued in his defense that the company should not be held responsible: under platform rules, users under 13 are not allowed to access services like Facebook. This places the blame on users who lie about their age to create accounts, while overlooking the company’s ability to implement stricter safeguards to actually prevent children from accessing them.
What was once celebrated as innovation and technological progress has, at the same time, quietly given rise to something darker—outcomes that are now materializing in the form of mental health conditions, from anxiety and depression to suicide.
And yet, there are signs of hope—not just for affected families, but for society as a whole and future generations—beginning to take shape as this issue moves beyond articles and public debate and into the courtroom. The legal system is no longer just examining how we use these platforms; it’s questioning how they were designed to be used.
The question, then, is no longer just what we do with social media, but what social media does to us. Because if the time we spend on our screens isn’t accidental, if the content we consume is carefully curated, and if the risks were known by those who built these platforms, then this is no longer an individual issue but a structural one.
Perhaps the most unsettling part isn’t that these technologies have changed how we live. It’s that we’re only now beginning to understand just how much. And more importantly, who may have been aware of these dynamics from early on.
Cover photo created with AI.


