SAN DIEGO — Through her zombie-like digital search for likes, fueled by the glow radiating from her smartphone, tablet, and computer, 11-year-old "Jennay" lost herself, or found a person she never knew could be inside her.
Jennay, who will remain anonymous because she is a minor, said she considered herself a normal girl who participated in gymnastics and liked to draw.
Within a year all that changed.
Jennay had logged on to social media platforms Instagram, TikTok, and Snapchat. She grew addicted, obsessed with the number of likes, and consumed with images of women that the companies placed in her feed and injected into her timelines.
Her anxiety mounted.
Her depression took root.
A man targeted her online, convincing her to send nude videos and photos of herself. Before long she developed an eating disorder; started to cut herself.
Her grandmother, "Meghan," who was Jennay's legal guardian, was blind to it all, unaware of her young granddaughter's online addiction and of how far the 11-year-old was willing to go down the path of self-destruction.
In February 2022, Jennay stole a handful of her grandmother's hypertension pills and swallowed them in hopes of ending her life.
In August came another suicide attempt; this time Jennay swallowed nine over-the-counter pain relievers, chasing them with half a bottle of cough syrup.
Now, Jennay is waging her own war on the same social media companies that she says destroyed her young life. She is the first young person in San Diego County to sue the companies for luring her down a path of self-destruction in order to maximize profits.
"I felt the pressure to try and make myself look like other people, even though I knew I wasn't them," Jennay said. "I found myself having less and less confidence and more insecurities. I didn't really like going outside anymore. I saw a lot of talk about self-harm and drug use and suicide, and I hadn't really had much knowledge of that yet. Instead of being told by people who would filter it a little bit for me, I was told by social media instead, and it wasn't in the best way."
In September, with the help of the national organization Social Media Victims Law Center, Jennay and her grandmother Meghan filed a lawsuit against Snapchat, TikTok, and Meta, the company that owns and operates Facebook and Instagram.
Jennay's lawsuit is one of hundreds filed nationwide alleging that social media companies knowingly use algorithms to lure as many young people as possible to their platforms, and back again, all in a scramble to maximize profits.
Jennay Logs On To Social Media
It didn't take long for Jennay's curiosity to turn to compulsion.
The more time she spent logged on, Jennay says, the more intense the content in her feed and suggested pages grew.
Jennay soon fixated on her self-image: how she looked, what her body looked like and, more importantly, what it could look like.
She changed her diet and began losing weight.
According to the lawsuit, Instagram inserted content into her timeline that focused on excessive exercise and over-healthy eating.
"I would feel like, you know, why couldn't that be me? Or feeling like, maybe I'm ugly," said Jennay. "Or just because I don't look like that certain person, I felt as if I needed to eat less. I lost 11 or 12 pounds in a week."
At the same time, according to her lawsuit, "TikTok was identifying and pushing harmful content and hashtags like 'What I eat in a day', and '500 calories.' This was content TikTok selected, not because [Jennay] asked for it or even wanted it, but because TikTok chose to program its recommendation technologies in a way that TikTok believed would maximize user engagement."
As the months passed, Jennay says her body dysmorphia worsened.
That's when a man she did not know contacted her on Snapchat.
Convincing her that any images and videos would disappear, the man enticed Jennay to send him explicit photos and videos of herself, including videos taken in the bathroom and videos of her touching herself.
Eventually, Jennay's grandmother discovered the relationship and contacted the police.
Less than three months later, Jennay made the first attempt at taking her life.
The second attempt came just six months later.
Asked how social media influenced her decision to attempt suicide, Jennay says people posted about self-harm and suicide in ways that made it look cool.
"Some people made jokes about it and I didn't know any better," she says. "Some people would even go as far as to say that self-harm is a joke or suicide is a joke. Or people would make, like TikToks and videos about them being in the hospital for attempted suicide. And they would make it seem like it's fun. So I thought maybe it is fun. And then I looked it up and I saw things that I shouldn't have."
Using Litigation to Force the Companies to Stop
Matthew Bergman founded the Social Media Victims Law Center in January 2022, not long after Facebook whistleblower Frances Haugen told members of a Senate committee that the company knowingly peddled products and made suggestions that harm younger users, all in the pursuit of profit.
Bergman says his law center alone represents 50 parents or guardians whose children killed themselves as a result of social media addiction.
If changes are not made and protections put in place, Bergman says, suicide, depression, and anxiety rates will only continue to rise.
Bergman points to studies as well as reports from the United States Surgeon General which find depression, feelings of hopelessness, and suicidal ideation in children and teenagers have skyrocketed since the advent of social media. They have since worsened as those companies develop newer and more effective algorithms to take hold of younger people's minds.
"We are in the midst of what the Surgeon General of the United States has called an epidemic of mental health crises among our young people," Bergman told CBS 8 in an interview. "This epidemic coincides with the proliferation of social media use among young people. These platforms are designed to be addictive. The products are designed to have algorithms to keep people online and to keep people engaged no matter what. And so this carnage that our young people are experiencing is a direct result of the defective and dangerous products that the social media companies have produced."
"We want to hold these companies accountable for designing algorithms so that when a 14-year-old girl is interested in exercise, she's not directed to content that she doesn't want to see, content that promotes a negative body image."
As for critics who say the lawsuits and state lawmakers' efforts to rein in social media companies are an assault on the First Amendment, Bergman argues that the problem is not the content itself but the delivery: the wrong type of content, personalized and tailored to keep young people logged on.
"We're not trying to limit speech. We're not trying to limit content, we're not even trying to restrict what people can look at if they choose to look at it. But the most dangerous content that our children are being subjected to is content that they don't even seek after," says Bergman. "There's nothing about free speech that allows a pedophile to groom a young child online, and convince them to send explicit photos of themselves. The nature of these products fails to protect kids from being preyed on by pedophiles. Again, we're not trying to limit free speech. We just want safer platforms."
Who's Responsible? The Parents and Guardians, or the Social Media Companies?
Jennay's grandmother, "Meghan," says she had no idea about the dangers of social media or that her young granddaughter had lost herself in a dark and dangerous online labyrinth.
Bergman of the Social Media Victims Law Center says parents and guardians should be informed and involved; however, social media companies have designed their platforms to keep them out, and unaware.
"We're all for parental responsibility," said Bergman. "The fact is, these products are designed to evade parental responsibility and designed to thwart parents' exercise of their authority over their children."
"This is not a mystery. This is not an accident or coincidence. This is a direct result of the design of these platforms to maximize user engagement, over safety. And through these lawsuits, we simply want the companies to be safer."
What Can and Should Be Done?
Bergman says social media companies can act now to protect young users from online dangers.
Companies, says Bergman, can implement age and identity verification, preventing sexual predators from posing as children or as someone they are not.
Social media platforms can also turn off algorithms, allowing young users to get only the content they are searching for and not potentially harmful and unwanted material or links.
Bergman says companies can provide videos and training for users to educate them about signs to look for.
"This is not going to be a cakewalk," adds Bergman. "But every one of these parents says the same thing: if, through this advocacy, through this lawsuit, they can prevent one family from going through the pain, the loss, the tragedy that never goes away, then it's worth it. We understand that social media is here to stay. And we understand that there's some good things that come out of social media. But enough is enough, our children need to be the highest priority. And that's what these lawsuits are about."
The Response From Social Media Companies
The companies named in the lawsuit, except for TikTok, which did not respond to CBS 8's request for comment, say they are unable to comment on active litigation but that they have worked to ensure young people are protected.
"Nothing is more important to us than the well-being of our community," said a spokesperson from Snap, the parent company of Snapchat. "The sexual exploitation of minors is abhorrent, and Snap has specific protections in place to make it hard for strangers to contact young people on the platform. In fact, Snapchat helps people communicate with their real friends, without some of the public pressure and social comparison features of traditional social media platforms. We also work closely with many mental health organizations to provide in-app tools and resources for Snapchatters as part of our ongoing work to keep our community safe. We will continue working to keep Snapchatters safe and support their health and well being.”
The spokesperson says the app is intentionally designed to make it difficult for strangers to identify, much less contact, people they don't know, and does not incentivize users to spread harmful content.
A spokesperson for Meta, which owns Facebook and Instagram, said the company can't comment on pending litigation but is working with independent experts to protect and support teens and children.
Meta's initiatives now include preventing adults from messaging teens who don't follow them and new controls that limit the content teens and young users can view. Other initiatives include boosting parental supervision and monitoring the time younger users spend on an Instagram feed.
As for where Jennay is now: "I'm trying to find, you know, what my likes and dislikes are, and what makes me happy and what makes me sad, so I can avoid those things. I just try my best to make life easy for me."