2018-03-20

In Soviet San Francisco, Uber Eats YOU

On Sunday night/Monday morning in Arizona, a self-driving car being tested by Uber was involved in a fatal crash in which a 49-year-old woman was struck while walking across the street.

There will be tons of thinkpieces and speculative pieces and blame-assigning pieces coming out about this over the next 168 hours or so, so pace yourself. That includes you, Chief Moir...

Police have viewed footage from two of the vehicle’s cameras, one facing forward toward the street, and the other inside the car facing the driver. Based on the footage, Moir said that the driver had little time to react. “The driver said it was like a flash, the person walked out in front of them,” she said. “His first alert to the collision was the sound of the collision.”

She added, “It’s very clear it would have been difficult to avoid this collision in any kind of mode [autonomous or human-driven] based on how she came from the shadows right into the roadway.”

Safe streets advocates were quick to denounce Moir’s comments as tone deaf, inappropriate, and possibly misinformed. The Tempe Police Department has since walked some of it back, issuing a statement that reads, “Tempe Police Department does not determine fault in vehicular collisions.”
Now what we can do, however, is note some of the curious aspects of the story. Let's start with what the Tempe PD said just now...they do not determine fault in vehicle collisions. Well, who does?

Since Arizona is not a no-fault state, car accident forensics often determine who caused an accident and therefore whose insurance company will pay for damages.

Oh. Well then. But then why is the National Transportation Safety Board investigating? Is this a special squad sent out to protect (or crucify?) companies with driverless cars?

Finally, this bit of #FakeNews by the far-left Globe and Mail accidentally reveals what is probably the biggest part of this story...that nobody after the fact could decide what the speed limit on that road was.

The Arizona collision happened on a road with seven driving lanes and two bicycle lanes. The speed limit there is 35 miles an hour, the equivalent of 55 kilometres an hour.
Sorry, Oliver Moore, lying reporter, but check out some of the other coverage. Here's the Verge article linked at the top of this post:

The vehicle was traveling 38 mph, though it is unclear whether that was above or below the speed limit. (Police said the speed limit was 35 mph, but a Google Street View shot of the roadway taken last July shows a speed limit of 45 mph along that stretch of road.) The driver, 44-year-old Rafaela Vasquez, has given a statement to police.
Now it's possible that the speed limit on that road has changed. That's been known to happen, as many a person has been charged with speeding for following the speed limit shown on a navigation device or app.

But here's where all the media seem to have missed the point. It's probable that this dumb hippie cyclist jumped out in front of the automated car inappropriately: their kind do that a lot; you get numb to it. Meanwhile, the difference in reality between 35 and 45 is going to be pretty much negligible; as readers of this space are well aware, I'm an advocate of driving double the speed limit all the time. But if news agencies after the fact can't agree on a road's speed limit, what chance do driverless cars have? And if you abhor speeding, how do you account for the possibility of it happening ignorantly en masse (without the human factor of knowing when your abilities are better than your current speed allows)? More vitally, why don't these pricks get charged with speeding like the innocent guy who followed his TomTom (no chuckles please), or even just went on what he remembered or assumed the speed limit to be?