#blindfaith

Tesla’s Autopilot beta testing means customers are crash test dummies


Back in April 2016, we wrote about the dangerous legal ramifications facing Tesla due to its overzealous promotion of the Autopilot function. What people tend to forget are the issues surrounding liability. An insurance company covers a driver against accidents: wet roads, poor visibility or being hit by another driver. The insurer pays for that type of damage. Yet the Tesla driver killed in California last week was found to have had the Autopilot function engaged. Why should an insurer pay for damages that result from willful negligence encouraged by the manufacturer itself? This is a design fault. Moreover, how could Elon Musk’s legal team not suggest that he refrain from such promotion? Accidents involving Tesla’s Autopilot are becoming so numerous that it is hard to fathom why people put so much faith in the system, as this video highlights. They are willingly becoming crash test dummies.


Tesla’s own website notes, “Build upon Enhanced Autopilot and order Full Self-Driving Capability on your Tesla. This doubles the number of active cameras from four to eight, enabling full self-driving in almost all circumstances, at what we believe will be a probability of safety at least twice as good as the average human driver. The system is designed to be able to conduct short and long distance trips with no action required by the person in the driver’s seat.”

The video on the Autopilot webpage showcasing the function makes no mention of the need for drivers to keep paying attention to the road even while the system is in use. Sounds to me like the ambulance chasers have plenty of ammunition to launch a class action. It only cost Toyota $1.2bn to settle its runaway-accelerator issue. For a company deeply in debt, with heavy losses, rising interest rates, a falling credit rating and senior departures, Tesla should be careful not to get carried away extolling the virtues of systems that are clearly flawed.


Tricking the autopilot 73% of the time

 

So much faith is placed in the hands of computers nowadays, yet the idea of driverless cars is still fraught with danger. Car & Driver reports, “Researchers at the University of Washington have shown they can get computer vision systems to misidentify road signs using nothing more than stickers made on a home printer. UW computer-security researcher Yoshi Kohno described an attack algorithm that uses printed images stuck on road signs. These images confuse the cameras on which most self-driving vehicles rely. In one example, explained in a document uploaded to the open-source scientific-paper site arXiv last week, small stickers attached to a standard stop sign…using an attack disguised as graffiti, researchers were able to get computer vision systems to misclassify stop signs at a 73.3 percent rate, causing them to be interpreted as Speed Limit 45 signs.”
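For readers curious how a handful of printed pixels can fool a classifier, the sketch below illustrates the general idea using the well-known fast gradient sign method (FGSM) in PyTorch. It is not the University of Washington team’s sticker algorithm (their approach optimizes printable patches that survive real-world viewing angles and lighting); the model, image path and perturbation budget here are placeholder assumptions. It simply shows the underlying weakness the researchers exploited: a tiny, deliberately chosen change to the input pixels can flip a network’s answer.

```python
# Minimal FGSM sketch (illustration only, not the UW sticker attack).
# Assumes torchvision >= 0.13 and a local photo "stop_sign.jpg" (placeholder path).
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.eval()

# ImageNet normalization is omitted for brevity; pixel values stay in [0, 1].
preprocess = T.Compose([
    T.Resize(256),
    T.CenterCrop(224),
    T.ToTensor(),
])

image = preprocess(Image.open("stop_sign.jpg").convert("RGB")).unsqueeze(0)
image.requires_grad_(True)

# Classifier's original answer.
logits = model(image)
original_class = logits.argmax(dim=1)

# Gradient of the loss with respect to the input pixels, then a small step
# in the direction that increases the loss for the current prediction.
loss = torch.nn.functional.cross_entropy(logits, original_class)
loss.backward()

epsilon = 0.03  # perturbation budget; visually almost imperceptible
adversarial = (image + epsilon * image.grad.sign()).clamp(0, 1).detach()

# Often enough to change the predicted label, despite the image looking the same.
adversarial_class = model(adversarial).argmax(dim=1)
print("before:", original_class.item(), "after:", adversarial_class.item())
```

The printed-sticker version described in the quote works on the same principle, except the perturbation is constrained to small, printable regions of the sign so it can be applied in the physical world and mistaken for graffiti.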

Sure, systems will improve over time, but we already have a plethora of people putting too much “blind” faith in systems being foolproof, as this video demonstrates.