New Footage Shows Tesla On Autopilot Crashing Into Police Car After Alerting Driver 150 Times::Six officers who were injured in the crash are suing Tesla despite the fact that the driver was allegedly impaired
That doesn’t solve the problem of Autopilot not making the right choices. What if the driver wasn’t drunk, but had a heart attack? What if someone put a roofie in their drink? What if the driver was diabetic or hypoglycemic and suffered a sudden drop in blood glucose? What if they had a stroke?
Furthermore, what if the driver got drunk BECAUSE the car’s AI was advertised as being able to drive for you? Think of it as false advertising.
If your AI can’t handle one simple case of a driver being unresponsive, that’s negligence on the company’s part.
How could the company be negligent if someone gets drunk or has a heart attack and crashes their car? No company has a Level 5 autonomous vehicle, where no human intervention is ever needed. Tesla is only Level 2. Mercedes has a Level 3 option (in extremely limited conditions). Waymo claims Level 4, but it is geofenced.