Social Media Is Putting Profits Over the Safety of Our Children
We want to talk about two articles that recently caught our eye – both tragic stories about our nation’s youth. One discusses a new study that shows American teen girls are dealing with more depression and violence than ever before. The other gets more personal, talking with parents about how social media is directly and negatively affecting young kids across the country, in some cases leading to their deaths.
We can’t help thinking these two things are related. The pandemic and ensuing quarantine put social media use, especially for young people, at an all-time high, which likely had some unintended consequences. And, with so many social media platforms allegedly feeding harmful content to younger and less discerning viewers, vulnerable kids are getting a distorted view of the world. Some are self-harming and others are dying.
The Youth Risk Behavior Survey
On February 13, the CDC released its Youth Risk Behavior Survey covering 2011-2021. The Washington Post reports (emphasis ours):
Nearly 1 in 3 high school girls reported in 2021 that they seriously considered suicide — up nearly 60 percent from a decade ago — according to new findings from the Centers for Disease Control and Prevention. Almost 15 percent of teen girls said they were forced to have sex, an increase of 27 percent over two years and the first increase since the CDC began tracking it.
That means at least one of every 10 teen girls has been raped. Teen girls also had higher rates of drug and alcohol abuse and were bullied electronically at higher rates. In addition, 13% had attempted suicide in the past year, compared to only 7% of boys. The specific reasons for these gender disparities are unclear, but they likely stem from a variety of factors that vary by “race, ethnicity, class, culture and access to mental health resources.”
Per the Survey, “These data make it clear that young people in the U.S. are collectively experiencing a level of distress that calls on us to act.”
Social media won’t protect our kids
Adding insult to injury is the online world our children often inhabit. Did you know there are more than 1,200 families pursuing lawsuits against social media companies, including TikTok, Snapchat, YouTube, Roblox and Meta (Facebook’s parent company)? CBS News reports over 150 lawsuits will be moving forward this year. They spoke to some parents who believe social media had a direct and severe impact on their children’s mental health.
One of those families, the Spences, believes Instagram led their daughter Alexis into an eating disorder and depression at the young age of 12. They gave her a cell phone at age 11 with many restrictions in place. However, Alexis easily got around them, even starting her own Instagram account despite the app’s requirement that users be at least 13; she simply lied about her age and got right in.
Now 20, Alexis told CBS News that an innocent search for diet tips led her deep into a world of eating disorder tips and pro-anorexic websites, pushed directly to her account courtesy of Instagram’s algorithm. Per CBS News:
[A] previously unpublished internal document reveals Facebook knew Instagram was pushing girls to dangerous content.
It says that in 2021, an Instagram employee ran an internal investigation on eating disorders by opening up a false account as a 13-year-old girl looking for diet tips. She was led to content and recommendations to follow ‘skinny binge’ and ‘apple core anorexic.’
Other memos show Facebook employees raising concerns about company research that revealed Instagram made 1-in-3 teen girls feel worse about their bodies and that teens who used the app felt higher rates of anxiety and depression.
As the Spences’ attorney says, “Time and time again, when they have an opportunity to choose between safety of our kids and profits, they always choose profits.”
CBS highlights another tragic story, that of 14-year-old Englyn Roberts, who hanged herself in her bedroom after learning how to do so from an Instagram video. Per CBS: “Nearly a year and a half after Englyn’s death, that hanging video was still circulating on Instagram, with at least 1,500 views. Toney Roberts says it was taken down in December 2021. The Roberts are suing Meta, the parent company to Instagram.”
Asks Toney, “If they so call monitor and do things, how could it stay on that site? Because part of their policies says they don’t allow for self-harm photos, videos, things of that nature. So, who’s holding them accountable?”
What kind of lawsuits are these?
These lawsuits – against Meta, TikTok, YouTube, etc. – would fall under the product liability umbrella of personal injury. One attorney told CBS, “They have intentionally designed a product that is addictive. They understand that if children stay online, they make more money. It doesn’t matter how harmful the material is.” He added that these products are designed to “evade parental authority.”
Product liability law holds manufacturers and sellers responsible for compensating consumers for damages or injuries caused by their defective products. One type of defect that can support a product liability lawsuit is a design defect, which occurs when a product is designed in a way that makes it unsafe or causes it to malfunction. Design defects can arise for a variety of reasons, including insufficient testing, inadequate quality control, or a failure to consider the needs and safety of consumers.
These social media lawsuits center on how the platforms are designed and whether that design gives children access to dangerous content. We’ll be watching these cases with interest as they progress. If your child suffered harm because of social media, we want to hear from you and we want to help. Our attorneys serve the greater Washington, DC area.
Please contact Paulson & Nace, PLLC through this contact form or by calling our office.
Christopher T. Nace works in all practice areas of the firm, including medical malpractice, birth injury, drug and product liability, motor vehicle accidents, wrongful death, and other negligence and personal injury matters.
Read more about Christopher T. Nace.