Meta, TikTok and Snap were each hit with a new lawsuit accusing them of fueling mental health disorders in teenage users. The plaintiffs are among a wave of parents and their children who are taking social media platforms to court, arguing that the companies not only hook users but do so knowing the harms they pose.

The lawsuits - the latest in a string of cases linking social media to mental health problems in minors - assert product liability claims to get around Section 230 of the Communications Decency Act, a federal law shielding tech companies from liability arising from content produced by third parties. They advance a theory arguing that platforms like Facebook are essentially defective products that lead to injuries, including eating disorders, anxiety and suicide. At least 20 such lawsuits have been filed across the country citing the Facebook Papers, a trove of internal company documents leaked by whistleblower Frances Haugen last year, with dozens more expected to come.

"This is the business model utilized by all Defendants - engagement and growth over user safety - as evidenced by the inherently dangerous design and operation of their social media products," states one of the complaints filed on Thursday in Los Angeles Superior Court. "At any point any of these Defendants could have come forward and shared this information with the public, but they knew that doing so would have given their competitors an advantage and/or would have meant wholesale changes to their products and trajectory. Defendants chose to continue causing harm and concealed the truth instead."

Rather than centering their claims on the specific content the platforms host, plaintiffs take aim at the platforms' product features; by steering clear of content-based claims, they sidestep potential immunity flowing from Section 230. The law has historically afforded tech companies significant legal protection from liability as publishers of third-party content. The plaintiffs allege that the companies' algorithms amplify dangerous content, prioritizing engagement over safety.

A major ruling concerning the law, which may be reformed, was delivered last year when a federal appeals court found that Snap can't invoke Section 230 to protect itself from a lawsuit claiming that the company's design of a speedometer function contributed to a fatal crash by encouraging speeding.