Instagram addiction: Colorado mother sues Facebook parent company over daughter’s social media use

The mother of a 14-year-old Castle Rock, Colorado, girl who became addicted to social media sued the parent company of Facebook and Instagram earlier this month on the grounds that the company deliberately designed addictive, dangerous products and failed to warn users of the potential pitfalls.

The federal lawsuit against Meta, filed Monday in U.S. District Court for the District of Colorado, is one of at least eight lawsuits with similar claims across the country brought this month by an Alabama law firm. Attorney Clinton Richardson alleges Meta is liable for product liability, including design defects, manufacturing defects and a failure to warn users of social media’s dangers.

“Overall, this is really about accountability,” Richardson said. “We want them to be held accountable for what they are doing and what is perpetuating a mental health crisis in the United States. Facebook has put its business model of profit-at-all-costs above the well-being of young people.”

The lawsuit relies on a largely untested legal argument that’s “way out on the frontier,” said Denver attorney Randy Barnhart.

“This is a very unusual and interesting case,” Barnhart said. “Typically when we think of product liability, we think of an object, a thing — a car, a tire, a room heater. Here, it appears Facebook is selling a service. And therefore I think the issue of whether or not it is a proper product liability claim is an open question… I don’t know of a case that has dealt with the issue of whether or not a service can be a product for the purposes of product liability litigation.”

Richardson argues in the complaints that Meta knew teenagers, in particular, were vulnerable to excessive social media use, and yet intentionally designed its platforms to “exploit” young users by encouraging them to spend more and more time on the social media sites, using mechanisms such as “likes,” displaying three dots when another user is typing a message, and curating feeds to keep users logged in.

“All told, Meta’s algorithm optimizes for angry, divisive and polarizing content because it’ll increase its number of users and the time users stay on the platform per viewing session, which thereby increases its appeal to advertisers, thereby increasing its overall value and profitability,” reads the complaint in the Colorado case.

For teenage social media users, platforms like Instagram worsen self-esteem, body image and bullying, the complaint contends. Soon after the 14-year-old Castle Rock girl opened her social media accounts, her “interest in any activity other than viewing and posting on the Meta platforms progressively declined,” the lawsuit alleges.

She slept little as the addiction worsened, and eventually engaged in self-harm, developed an eating disorder and attempted suicide, the complaint claims. The Denver Post is not identifying the girl or her mother because the girl is a minor. The family declined to comment through Richardson.

A spokeswoman for Instagram declined to comment on the case Thursday, but Meta has previously denied that the company put profits over safety, saying last year that it expected to spend $5 billion on safety and security in 2021 and that it employs about 40,000 people focused on user safety.

Fort Collins attorney Tom Metier said the lawsuit raises “viable” arguments.

“There’s a pattern, according to the complaint… (of the company recognizing) what will make Meta more popular and therefore drive more profits in advertising dollars, and at some point, and apparently many points, it’s alleged the choice was made to create harm in exchange for profit,” he said. “And so there’s an intentionality that could be devastating for Meta.”

He added that in most product liability cases, manufacturers of physical products are required to identify the strengths and weaknesses of their products and consider what harm the products could cause. Companies have a duty to make reasonably safe products, and when a product can’t be made physically safe, companies must warn consumers about the “truth of the dangers,” he said.
