YouTube star Mark Rober has gone viral on social media for controversial reasons after he uploaded a video testing Tesla's self-driving features. On March 15, 2025, the content creator uploaded a video titled "Can You Fool A Self Driving Car?" In it, the 45-year-old wanted to see whether a Tesla vehicle could detect a fake wall and whether the Autopilot system could stop the car in time to avoid an accident.
The video has since received some negative reception on social media, with some X users claiming that Mark Rober did not use the Tesla Model Y's Full Self-Driving (FSD) feature.
X user @chrisfirsttt posted:
"Mark Rober posted a video ripping @Tesla’s FSD/AutoPilot technology. The problem? FSD was never engaged, and nothing was on when he crashed into the wall. The whole video was designed to bash Tesla and promote Luminar, a LiDAR company."
Another X user, @BLSmith2112, accused the YouTuber of "misleading" viewers. They elaborated:
"Let's state the obvious. This is an incredibly sad and misleading video. The title, "Can you Fool A Self A Driving Car?" Implies that the Tesla used in this video was using Full Self Driving (FSD), when in fact it was using Autopilot (AP), a completely different software stack. AP's purpose is to act as an advanced cruise control, not meant to be fully self driving. While Mark mentions this no less than 9 minutes into the video, he achieved his mission of baiting the viewer. This is not to suggest that FSD performed differently in the tests, but it most likely (99% certain) would have. It's just stating that this is a very low and shady practice of using misinformation to generate engagement. This creates confusion for would be buyers."
Another community member referred to Mark Rober as a "fraud":
"Mark Rober is a complete fraud. Images 1 and 2 show AP was not engaged until shortly before approaching the wall. Image 3 shows AP was not engaged as it got to the wall, and was going 42 mph, over the 40 mph AP was supposedly set at in image 2," X user @WR4NYGov remarked.
What did Mark Rober say about Tesla's optical camera system after Model Y crashed through a fake wall?
In the 18-minute-53-second video mentioned above, Mark Rober tested whether Tesla's Autopilot function could be "tricked" because it relies on "simple camera tools" for self-driving functionality, as opposed to the "much more expensive tech" used by other car manufacturers.
At the 15-minute mark of the video, Mark drove his Tesla Model Y toward a wall that had been painted to blend in with the road and landscape ahead. The YouTuber described this as a "Wile E. Coyote (from the cartoon series Wile E. Coyote and the Road Runner) style painted wall."
However, the car did not appear to detect the wall, and the self-driving system failed to stop the vehicle, which crashed straight through it.
Commenting on the situation, Mark Rober said:
"So I can definitely say, for the first time in the history of the world, Tesla's optical camera system would absolutely smash through a fake wall without even a slight tap on the brakes."
(Timestamp: 15:50)
Earlier today (March 17, 2025), Mark Rober took to X to share an 18-second clip of "raw footage" showing the Tesla Model Y failing to detect the fake wall. According to him, the Tesla FSD disengaged "17 frames before" hitting the wall. He added that his feet were not on the brake or accelerator.