Lawsuits allege that Meta’s Instagram algorithm encouraged eating disorders by promoting content featuring very thin models and extreme exercise. The girls on whose behalf the lawsuits seek damages became addicted to the platform, attempted to take their own lives, and were hospitalized on several occasions. One of the girls refused to eat and, as a result, was hospitalized and placed on a feeding tube. The complaints allege that the company did not verify the girls’ ages, allowing them to join when they were as young as 12 years old. The platform also failed to shield them from sexual messages sent by adult Instagram users who sought to exploit them or who suffered from their own mental health issues.
The harm to children caused by social media addiction, particularly addiction to Meta’s platforms, can be significant and unprecedented. Last year, a whistleblower revealed documents showing the substantial harm to teens, especially teen girls, who use Instagram. There have also been claims that the platforms are being used to spread dangerous misinformation. In February of this year, senators introduced a bill to try to curb addiction and to allow the Federal Trade Commission to pass rules restricting social media companies’ power over their users.
Product Liability Lawsuits

If you believe your children have been harmed by Instagram or Facebook, you may have a product liability claim for damages against Meta. Products can be defective in their manufacturing, design, or marketing. There may be design defects in the algorithm that powers Instagram. Age restrictions may not be implemented appropriately, leaving young users exposed to potentially damaging activities before they are mature enough to handle them. Additionally, Meta may be liable for marketing defects: warnings about the harms that can arise from a child’s or teen’s use of Instagram and Facebook may be nonexistent or inadequate.
In addition to product liability causes of action, users may have grounds to sue on the basis of misrepresentation or statutory violations. For example, California may put in place a law that would allow parents to sue for violations that harm their minor children, and other states may follow its lead. Additionally, Meta may have liability under a theory of misrepresentation for failing to warn that its products can be dangerous to kids.
If our Chicago-based lawyers can establish Meta’s liability, we may be able to recover compensatory damages on behalf of you or your child. While nothing can fully make up for the psychiatric and physical harms inflicted by the company’s algorithms, these damages are intended to put you both back in the position you would have been in had your child not become addicted to the platform. To the extent that Meta deliberately designed its algorithm to be addictive to adolescents and tweens, we may also be able to hold it liable for punitive damages.
Call Our Chicago-Based Product Liability Attorneys

Tech companies have profited from automated decision-making by their algorithms. Social media applications appear to be designed to trigger addiction in teens. Your child should not have been harmed by inappropriate and damaging content. If you believe he or she was injured through the use of Meta’s social media platforms, Facebook and Instagram, you should call the seasoned Chicago product liability lawyers of Moll Law Group to discuss whether you have grounds to sue. Our firm represents parents and children around the country, and billions have been recovered in product liability lawsuits in which we’ve been involved. Please complete our online form or call us at 312.462.1700.