Tesla has recalled nearly 54,000 cars from its Full Self-Driving (FSD) beta program to receive a software update.
The cars are currently programmed to execute rolling stops at intersections under certain conditions.
This is the latest in a long series of flaws making Tesla's ambitious goal of fully autonomous vehicles -- on which it has staked its reputation -- ever more distant. It's ridiculous that Musk had to have such a basic (and illegal) safety problem pointed out to him.
The recall is senseless. The rules were designed with flawed humans in mind, and even then rolling stops usually aren't that dangerous. Besides, there are plenty of unwritten rules and exceptions for human drivers; why shouldn't the same apply to robocars?