
Buckle up, folks! The National Highway Traffic Safety Administration (NHTSA) has decided it’s time to investigate Tesla’s Full Self-Driving (FSD) feature. You’re not dreaming; FSD, the flagship feature that promises to turn your car into a modern-day chariot, may actually be causing more chaos than a toddler in a candy store. At least one unfortunate pedestrian is tragically no longer around to confirm this, having been fatally struck in one of the crashes now under the agency’s microscope. Oops.

So, what’s the deal with FSD, you ask? Well, the NHTSA is looking into not just one, but four incidents where Tesla’s shiny software turned roadways into demolition derbies. To be fair, the agency notes that these crashes occurred in reduced-visibility conditions that would challenge even seasoned human drivers: think sun glare, fog, or airborne dust, the holy trifecta of “I can’t see anything.” If only those pesky weather phenomena could learn to follow traffic laws!

Despite the metallic mayhem, Tesla remains hopeful, flashing a grin and selling the FSD feature like it’s a ticket to the golden age of travel, yours for a mere $8,000. Yes, for the price of a small used car, you too can have your very own robot chauffeur that still requires you to stay awake, alert, and ready to wrestle back control at a moment’s notice. Safety first, right? Elon Musk insists FSD is safer than you behind the wheel, but that’s like saying swimming with sharks is safer than cliff diving: it’s all about perspective.

Just last week, Musk showcased a glittering vision of a future filled with self-driving “robotaxis,” because what could go wrong when you take the human out of the loop entirely? Despite his captivating sales pitch, the stock market responded in true Shakespearean fashion, sending Tesla shares tumbling nearly 9% the day after. It seems investors prefer their cars with something called “stability” over flashy promises of profit.

This isn’t the first time the NHTSA has treated Tesla like a high school science project gone awry. Back in February 2023, the agency pressed Tesla into recalling over 360,000 vehicles to tweak the FSD software, because letting FSD overzealously ignore traffic laws apparently posed what regulators call an “unreasonable risk.” It’s almost as if the feature was engineered to treat stop signs like strong suggestions rather than commands. Sorry, not sorry.

To add more fuel to the fire (figuratively, we hope), a December 2023 recall roped in roughly 2 million Teslas over Autopilot, FSD’s older and less ambitious sibling. That flashy name masked a multitude of oopsie-daisies, chief among them a system perfectly content to keep steering while the human behind the wheel stopped paying attention. Nothing says high-tech future like your car potentially auditioning for a role in a horror movie.
