By Allison M. Clay

          Tesla recently appeared in the news, yet again, after a 2021 Tesla Model S caused an eight-car pileup on Thanksgiving Day in San Francisco, California.[1] According to the driver, a California attorney, the vehicle was in Autopilot mode and was attempting to change into the far-left lane when it suddenly braked for no apparent reason.[2] This incident is just one example from a growing list of accidents involving Tesla’s advanced driver assistance features.[3]

            Tesla offers several different driver assistance features, including Autopilot, Enhanced Autopilot, and Full Self-Driving Capability.[4] According to Tesla’s website, the Autopilot feature includes Traffic-Aware Cruise Control and Autosteer functions.[5] The Enhanced Autopilot feature offers those same functions and adds features such as Auto Lane Change and Autopark.[6] Full Self-Driving (“FSD”) Beta, the most advanced of the driver assistance features, includes the additional Traffic and Stop Sign Control (Beta) function and is intended to introduce Autosteer on City Streets.[7] At face value, these features appear to make driving easier and less tedious for the driver. However, due to several incidents surrounding the use of Tesla driver assistance features, questions about their safety and effectiveness continue to arise.[8]

            The National Highway Traffic Safety Administration (NHTSA) opened a Preliminary Evaluation of Tesla’s Autopilot function in August 2021 in response to a series of collisions involving Tesla vehicles and stationary first responder vehicles.[9] The investigation centered on sixteen crashes involving first responder vehicles and later expanded to include 191 additional accidents.[10] Although the NHTSA removed several of these incidents from the investigation due to evidence of external factors or insufficient data, it found sufficient patterns in performance and behavior to upgrade the Preliminary Evaluation to an Engineering Analysis in June 2022.[11] The NHTSA’s Office of Defects Investigation (ODI) Resume, released on June 8, 2022, noted several concerns with the findings of the Preliminary Evaluation.[12] First, the circumstances surrounding some of the accidents under review suggested that the crashes occurred despite driver adherence to Tesla’s driver engagement strategy.[13] Additionally, and perhaps more importantly, the Resume explained that even where drivers misused the systems, that fact alone does not mean no design defect exists, especially if the use or misuse was foreseeable.[14]

            More recently, on February 15, 2023, the NHTSA issued a recall of all Tesla vehicles using the FSD Beta features.[15] According to the Safety Recall Report, all 2016–2023 Tesla Model S, 2016–2023 Tesla Model X, 2017–2023 Tesla Model 3, and 2020–2023 Tesla Model Y vehicles that have installed or are set to install the FSD Beta software are subject to the recall.[16] The Report identifies the safety risk posed by the software as the potential that, under certain rare circumstances, the maneuvers performed by the vehicle could violate local traffic laws and increase the risk of collision.[17] Though the Report acknowledges the warnings given by the vehicle in such circumstances and highlights the need for driver intervention, it found the risks sufficient to deem the software defective.[18] In response to this finding, Tesla will be required to issue an over-the-air software update to improve how the software functions in these specific circumstances.[19] Owner notification is set to take place on April 15, 2023.[20]

            Although the investigation into Tesla’s Autopilot system remains ongoing and the FSD Beta defect is being addressed through a mandated software update, the mere existence of these proceedings is enough to raise serious concerns about the safety of driver assistance features. These functions are unique in that they require the cooperation of both the technology and the operator in order to function safely and properly.[21] Tesla’s website acknowledges this fact and notes that the systems are intended for use by attentive drivers with their hands on the wheel who are prepared to take control of the vehicle at any moment.[22] Alongside several disclaimers advising drivers of the need for active supervision while Autopilot is engaged, the website also states that Autopilot is a mechanism for increasing safety and convenience, and that the system “reduces your overall workload as a driver.”[23]

            By their nature, the terms “autopilot” and “self-driving” suggest to drivers that the vehicle will control itself and that they are safe to let their guard down. Regardless of the warnings and notices informing drivers that they are responsible for monitoring the vehicle and correcting potential errors, it is easily foreseeable that a driver using an autopilot system may take longer to notice and correct a malfunction or error than a driver actively operating a vehicle on their own. On a highway with several lanes and numerous cars traveling at high speeds, the consequences of a malfunction, defect, or mistake can be devastating. Under 49 U.S.C. § 30111, vehicle manufacturers must comply with safety standards set by the Secretary of Transportation.[24] Manufacturers are responsible for safety-related defects under 49 C.F.R. § 573.5.[25] A vehicle component may be deemed defective upon a showing of a significant number of failures in normal operation, whether during specified use or caused by reasonably foreseeable owner abuse.[26] This interpretation appears to mean that, with sufficient evidence of regular failures of Tesla vehicles using Autopilot or FSD features, the features may be considered defective and Tesla held responsible, even where such failures are caused by foreseeable driver errors. This is demonstrated, to some extent, by the NHTSA Safety Recall, but, as previously mentioned, that recall is limited to errors occurring under a small number of specific circumstances, and the remedy required is merely a mandated update of the FSD Beta software.[27]

            Self-driving vehicles present a challenging new issue. Determining liability in this context will require a significant amount of research and consideration, especially taking into account the effect of disclaimers and warnings purporting to place responsibility on the driver. If nothing else, Tesla will surely remain in the news as these advanced driver assistance features evolve. If such features continue on their current trajectory, it will certainly be interesting to watch the legal landscape change to address the unique balance between driver and manufacturer liability in the context of autopilot and self-driving vehicles.

[1] Brad Templeton, An 8-Car Pileup Started by a Tesla in Autopilot Opens Up Many Complex Issues, Forbes (Jan. 11, 2023),

[2] Id.

[3] See generally Nat’l Highway Traffic Safety Admin., ODI Resume (2022), (discussing investigation of accidents associated with Tesla’s Autopilot system) [hereinafter NHTSA, ODI Resume].

[4] See Tesla, (last visited Mar. 11, 2023).

[5] Id.

[6] Id.

[7] Id.

[8] See NHTSA, ODI Resume, supra note 3 (discussing investigation of Tesla’s Autopilot system); see also Nat’l Highway Traffic Safety Admin., Part 573 Safety Recall Report (2023), (announcing safety recall of Tesla’s Full Self-Driving Beta software) [hereinafter NHTSA, Recall Report].

[9] NHTSA, ODI Resume, supra note 3.

[10] Id.

[11] Id.

[12] See id.

[13] NHTSA, ODI Resume, supra note 3.

[14] Id. For additional information on the relationship between driver error and defects, see infra note 25 and accompanying text.

[15] See NHTSA, Recall Report, supra note 8.

[16] Id. at 1–2.

[17] Id. at 3.

[18] See id.

[19] NHTSA, Recall Report, supra note 8, at 4.

[20] Id.

[21] See Tesla, supra note 4.

[22] Id.

[23] Id.

[24] 49 U.S.C. § 30111.

[25] 49 C.F.R. § 573.5 (2023).

[26] United States v. Gen. Motors Corp., 518 F.2d 420, 427 (D.C. Cir. 1975).

[27] NHTSA, Recall Report, supra note 8, at 2–4.