

Man Brags On Social Media That His Tesla Drove Him Home When He Was Too Drunk To Drive



A Tesla owner boasted of using his vehicle’s Full Self-Driving feature to get home while intoxicated — a troubling admission that should give everyone pause about the technology’s implications.

During a recent Twitter Spaces chat overheard by Insider, the user told another participant, “I admitted the other day, I was a little bit tipsy after Christmas.”

“I was probably drunk,” he added. “But with FSD, it drove me home, I mean, flawlessly.”

To be quite clear: deliberately driving under the influence of alcohol or drugs, with or without FSD engaged, is not only against the law but also incredibly dangerous, putting the lives of everyone else on the road in jeopardy.

In November, Tesla released an unfinished beta version of its FSD software to a wider audience, despite many well-documented deficiencies. Some argue that this endangers other drivers on the road.

According to the Tesla website, the Full Self-Driving feature is “designed to provide more active guidance and assisted driving under your active supervision.”

However, that language has not stopped some owners from abusing the system. There have been countless examples of drivers finding simple workarounds that trick the feature into registering attention to the road when, in reality, the driver is paying none.

Others have been found fully asleep behind the wheel. Just last week, German police spent an agonizing fifteen minutes pursuing a Tesla driver who was reportedly asleep while traveling on the motorway, according to Futurism.

The fact that people are deliberately getting behind the wheel drunk is yet another reason Tesla should get ahead of a serious problem it has created.

Nearly a third of all road accident deaths in the US involve intoxicated drivers, according to the National Highway Traffic Safety Administration — which is itself examining hundreds of crashes involving Tesla’s Autopilot software.

It’s a glaring case of a Tesla buyer being misled by the company’s deceptive marketing into greatly overestimating the capabilities of his car. As such, it exemplifies why it is a bad idea to deploy such a system on the assumption that end users will behave responsibly.
