Meta’s Mark Zuckerberg seeks to avoid personal liability in lawsuits blaming him for Instagram addiction


A loss for the billionaire who launched Facebook with friends as a Harvard undergraduate two decades ago could encourage claims against other CEOs in mass personal injury litigation.

Zuckerberg faces allegations from young people and parents that he was repeatedly warned that Instagram and Facebook weren’t safe for children, but ignored the findings and chose not to share them publicly.

The cases naming Zuckerberg are a small subset of a collection of more than 1,000 suits in state and federal courts by families and public school districts against Meta along with Alphabet Inc.’s Google, ByteDance Ltd.’s TikTok and Snap Inc. US District Judge Yvonne Gonzalez Rogers in Oakland, who is overseeing the federal cases, recently allowed some claims to proceed against the companies while dismissing others.

Plaintiffs contend that as the face of Meta, Zuckerberg has a responsibility to “speak fully and truthfully on the risks Meta’s platforms pose to children’s health.”

“With great power comes great responsibility,” plaintiffs’ lawyers said in a court filing, quoting the Spider-Man comics in a footnote. “Unfortunately, Mr. Zuckerberg has not lived up to that maxim.”

Zuckerberg, the world’s fourth-richest person, has argued that he cannot be held personally responsible for actions at Meta just because he is the CEO. His lawyers also claim that Zuckerberg did not have a duty to disclose the safety findings that were allegedly reported to him.

“There is ample legal precedent establishing that being an executive does not confer liability for alleged conduct of a corporation,” a Meta spokesperson said in a statement, adding that the claims against Zuckerberg should be dismissed in their entirety.

At a recent hearing, Rogers pressed the plaintiffs about whether Zuckerberg was required to disclose safety information absent a “special relationship” with the users of his products.

Plaintiffs had argued that the Meta CEO had a responsibility to Facebook and Instagram users given his “outsize role in the company,” but Rogers challenged them to point to a specific law that would support their argument.

Rogers appeared more sympathetic to plaintiffs’ arguments that Zuckerberg could be held liable for personally concealing information as a corporate officer at Meta, asking Zuckerberg’s lawyers how he avoids potential personal liability if there’s an understanding that Meta itself had a duty to disclose the safety information.

The judge also discussed with lawyers how laws covering corporate officer responsibility, which vary among states, apply to Zuckerberg.

Zuckerberg, who is Meta’s most significant shareholder and maintains sole voting control at the company, is also at risk of being held personally liable in a separate 2022 lawsuit over the Cambridge Analytica data privacy scandal brought by the attorney general of the District of Columbia in Washington.

Pinning blame on an executive for unlawful conduct typically hinges on showing their involvement in relevant day-to-day decisions or their knowledge of the practices at issue. Assigning executive liability is generally easier at smaller companies, where an individual’s direct participation in decision-making can be clearer; at large companies, it comes down to proving control over decision-making.

Social media companies have come under increased scrutiny for their impact on young people’s mental health and role in spreading sexually explicit content. At a Senate hearing last month, US Senator Josh Hawley, a Missouri Republican, pressed Zuckerberg on whether he should personally compensate victims of sexual exploitation online. Zuckerberg then offered a rare apology to the victims’ families.
