Can Snapchat Be Held Legally Responsible for Harm Incurred by Snapping and Driving?

By: Briscoe Robinette

Can social media companies, such as Snapchat, be held liable for accidents that occur while users are engaging with their platforms? As social media becomes increasingly popular, especially among younger age groups, danger lies in these companies' drive to exponentially increase their number of daily active users. Social media apps are largely designed to capture the attention and “likes” of others, causing many users to behave differently in order to gain more followers and interact with others on the platforms.

Codified at 47 U.S.C. § 230, Section 230 of the Communications Decency Act (“CDA” or “Section 230”) is one of the most important laws protecting Internet innovation and freedom of expression, according to The Electronic Frontier Foundation. The law shields social media and tech companies from “libel and other civil suits for what people post on sites, regardless of how harmful it may be,” as stated by NPR. Therefore, these hosting platforms generally cannot be held liable for content posted on their platforms. Section 230 is the law that allows YouTube users to post their own videos, Facebook users to post their thoughts, and users of many other platforms to share their own content without the companies being responsible for any harm resulting from a user’s post.

While investigations into the January 6th insurrection are still ongoing, the key role social media played in the events is indisputable. Rioters utilized smaller platforms, such as Parler and Gab, as well as mainstream social media, such as Facebook and Twitter, to gather support, organize, and plan the attack. Smaller platforms were used to speak more freely about the attack and its planning, while rioters used Facebook to create groups, such as the Stop the Steal group, to promote misinformation and create a place for like-minded people to feed off one another.

The use of social media in these events has brought increased attention to the debate over regulating tech companies with respect to speech that incites violence. Recently, Section 230 came under fire for the role platforms played in the attack on the Capitol and in spreading misinformation more broadly. On April 9th, 2021, members of the House held a hearing during which they questioned executives of Google, Facebook, and Twitter on social media’s role in promoting extremism and false claims. The hearing showcased the House members’ desire to change how social media companies operate, with politicians from both parties calling for changes to Section 230. However, House members do not agree on exactly how to change Section 230, and multiple competing proposals are being debated, demonstrating the complexity of the issue.

Social media giants argue that their platforms could not survive without the CDA, primarily because it would be impossible for the companies to effectively monitor millions of daily posts, exposing them unfairly to liability for user posts.

On the other hand, those in favor of a change in the law cite the dire consequences that will continue to mount if social media companies can continue to elude accountability for fostering dangerous activity. Recently, the 9th Circuit Court of Appeals chipped away at the CDA’s armor. In 2017, three young men fatally crashed their car in Wisconsin at an extremely high speed. The 17-year-old driver hit speeds in excess of 123 miles per hour as another of the boys used Snapchat to take a picture with the speed filter, a feature that records the speed the user is traveling at the moment a photo is taken and displays it on the photo. Tragically, shortly after snapping the picture, the three boys crashed into a tree and died instantly. As a result, the parents sued over the deaths of the boys, arguing that Snapchat was liable because the boys drove at such high speeds in order to take the picture and engage with their friends and other users. The district court dismissed the case due to the protection Section 230 provides to social media companies. However, the 9th U.S. Circuit Court of Appeals ruled that the parents did have the legal right to sue Snap Inc. (“Snapchat”), the creator of Snapchat. The appeals court ruled that Section 230 does not apply to this case because the suit was based on the features of the app, as opposed to the content uploaded to it. Therefore, the court held that Snapchat could be held liable because it was the design of the product that spurred the boys into actions that caused them harm, not that their content caused harm or was untrue. NPR reported that Judge Kim McLane Wardlaw of the appeals court further elaborated on the ruling, stating “that manufacturers have a ‘duty to exercise due care in supplying products that do not present unreasonable risk of injury or harm to the public.’” In this case, the 9th Circuit therefore found the immunity provided by Section 230 inapplicable.

However, the 2nd Circuit reached a different outcome in a similar case involving another app, and the split between circuit courts signals the need for the Supreme Court of the United States to settle the law. Snap Inc.’s lawyers spoke out about the case, stating that the court’s decision could lead to a multitude of other companies being held responsible for any harm or injury incurred while people were using their platforms while driving. This outcome is unlikely, though, as the 9th Circuit reasoned that it was Snapchat’s features that led to the harmful consequences, so the company was not protected as a publisher under Section 230. However, the 9th Circuit is tip-toeing along a fine line of distinction: the speed filter is used largely to publish something, and if Snapchat was acting as a publisher in this case, then under Section 230 it cannot be held liable for the boys’ deaths.

The responsibility social media platforms bear for their users’ actions is an increasingly important issue in today’s legal world and one that must be constantly reexamined due to the rapidly evolving nature of technology. A final verdict in the Wisconsin case has not yet been reached, but these two cases have the potential to draw the U.S. Supreme Court into the debate. The reach of Section 230 will be something to watch, especially as social media and tech continue to grow.

Taylor Hastings