Are We Surprised?

It was only a matter of time before an autonomous vehicle killed someone. The only surprise was how soon. Okay, cards on the table here: these things are fucking dangerous and have no place in a chaotic environment such as a public highway. Nothing will ever entice me into one of these contraptions – and, no, I am not a technophobe, for I am always ready to embrace new technology, but this I will not.

Yes, sure, human drivers get it catastrophically wrong, too. However, human drivers do manage to travel millions of miles without mishap and they do it for a very good reason: human senses are proactive, whereas AI is reactive. The autonomous vehicle starts to react after it has sensed a potential collision. A human driver can watch a situation developing, reading body language and little subliminal cues, and react long before the situation has become dangerous – long before the AI has even suspected that a situation is developing.
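
To make that distinction concrete, here is a minimal sketch in Python – all the numbers, thresholds and function names are invented for illustration, not any vendor’s actual control logic. The reactive controller only acts once something is already in its path; the anticipating one compares trajectories and starts easing off well before the paths cross.

```python
# Minimal sketch of reactive vs proactive behaviour. All figures and names
# are made up for illustration; this is not any real vehicle's software.

def reactive_brake(obstacle_in_lane: bool) -> str:
    """React only once an obstacle is already in the vehicle's path."""
    return "BRAKE" if obstacle_in_lane else "CRUISE"


def proactive_brake(ped_lateral_m: float, ped_speed_mps: float,
                    car_speed_mps: float, gap_ahead_m: float) -> str:
    """Anticipate: compare when the pedestrian could reach our lane with
    when we would reach that point, and ease off early if they overlap."""
    if ped_speed_mps <= 0:
        return "CRUISE"
    time_ped_enters_lane = ped_lateral_m / ped_speed_mps           # seconds
    time_car_reaches_point = gap_ahead_m / max(car_speed_mps, 0.1)
    # A human-style margin: start slowing well before the paths could cross.
    return "SLOW" if time_ped_enters_lane < time_car_reaches_point + 2.0 else "CRUISE"


# Pedestrian 3 m from our lane walking at 1.4 m/s; car doing 17 m/s, 40 m away.
print(reactive_brake(obstacle_in_lane=False))  # CRUISE - nothing in the lane yet
print(proactive_brake(3.0, 1.4, 17.0, 40.0))   # SLOW - paths converge in ~2 s
```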

By all means develop this technology, for there is a place for it – but that place is not mixing with people, cars, bicycles, motorcycles and trucks on the public highway.

“It’s just awful,” Tina Marie Herzberg White, a stepdaughter of the victim, told the Guardian on Wednesday. “There should be a criminal case.”

I agree. This woman was unlawfully killed and someone, somewhere should be prosecuted for it. If it had been a human driver, then that would be straightforward. However, now we get into who programmed the damned thing, how it was set up, what failed. But the bottom line here is that someone was criminally negligent and should be doing jail time.

Companies manufacturing the technology have argued that self-driving cars are safer than humans…

Bollocks. As they start to progress out onto the highways, expect more collisions.

…but skeptics have pointed out that the industry is entering a dangerous phase while the cars aren’t yet fully autonomous, but human operators aren’t fully engaged.

Yup. This. With these things on the road, motorcycling is going to become more risky and eventually, some fucktard will decide to ban us as a consequence. I can see it coming.

49 Comments

  1. Idiots.

    So Joe Soap the attendant is supposed to sit there vegetating, letting the robot drive, but simultaneously he’s also supposed to be monitoring progress and ready to take over to avert disaster at a moment’s notice.
    How’s that working out so far?

    Joe is supposed to not be driving at all, and in every way is ever more deskilled, but when the bloody robot bollockses up he is apparently supposed to morph instantly into Lewis bloody Hamilton and correct something said effin robot has already turned into an unavoidable accident.
    Working out better now?

    Can’t wait till we have microprocessed 44-tonners (which will be 50/60-tonners by then) trundling round our roads at 8.30am. What could possibly go wrong?

    This wheeze must be like having a lucrative non-job in the global warming industry; it beats the hell out of working for a living.

    • And ready to take over to avert disaster at a moment’s notice. How’s that working out so far?

      Sadly, many plane crashes fall into that category. So often the report reveals that flight crew have been caught out when the automatics quit, and they are suddenly presented with a very serious situation. This isn’t helped by most airlines INSISTING on flights being conducted in automatic mode, so real “Hands On” skills are becoming a thing of the past. This very subject is frequently brought up on pilots’ forums.

      Now transpose this automation to the roads, where you are dealing with thousands of vehicles in very close proximity, without any of the controlled airspace that airlines normally operate in…

  2. The Tesla one was different. It was a driver assist.

    It is inevitable that we will find the woman did something wrong. She may have stepped on to the road like we all do, but the computers rely on everything following the rules.

    I await the time when all cars, vans, motorbikes and cyclists are banned as too unpredictable, and then any asshole will just stroll across the road and cause traffic jams everywhere. The cars will just stop and, with no human drivers to drive around them, they will hold up everything. Imagine a protest, any protest, that has a single person just sitting in the road till plod comes and then disperses, only for the next person to repeat it a mile away. A few people could bring even an area like London to a halt.

    Some things are ripe for automation – trains and planes – as they have fixed areas to work in with little chance of something else being there. They should prove the technology there first.

  3. There is a video of the incident at http://ninetymilesfromtyranny.blogspot.co.uk/2018/03/police-release-video-of-driverless-uber.html

    Ironically, and I will not comment any further here, it was a woman co-pilot.

    In this video you can see that the car should have detected the woman and at least slowed down. The co-pilot, not paying attention, did not notice until it was too late.

    If her job was to watch the road while the car was being tested then, imo, she should be charged for this. If she was there to act as a passenger then there is no issue with her doing passengery things.

    • My understanding is that the person employed by Uber was born a man and was convicted of several offences as a man; however, he now identifies as a woman and was employed by them as such.

      In the video, it was clearly not paying attention to the road ahead and, while an accident was inevitable, I believe a fatality would have been avoided had it hit the brakes instead of just squealing in horror once it cottoned on to the fact that there was a person pushing a bicycle in the vehicle lane.

      Quite what part massive doses of estrogen in a male body played in this is another subject.

      There were wrongs on both sides: the victim had no lights on her bike and was jaywalking. And I agree the Uber employee was not blameless and should face charges. And the sack.

      • The jaywalking bit occurred to me. However, this happens. It is a part of using the roads – they are full of idiots who think they are bomb-proof, so it is up to drivers to avoid them. Computer technology that cannot cope with such irrational behaviour is not fit for purpose and should not be anywhere near the roads.

        • Wholly agree about self-driving cars. But it’s everything to do with reducing costs. Uber is at the forefront of this because they have the most to gain. Same with their interest in flying people using oversized drones.

          Musk wants his truck to be self-driving for the same reason: fleet operators don’t much like salaries, unions and pensions. And Musk wants it so whole fleets of trucks can be used on long-haul drives with only one human in the first truck, the rest following driverless.

          As a rider of small-capacity scoots I frequently sit in the slipstream of 16-wheelers. Quite how Musk’s software boffins factor in that probability will be interesting.

          Or crash-for-cash nutters.

          • On the one hand, I support Uber and how they are breaking into the taxi market. On the other, I cannot support this obsession with automated cars. Not on the public highway. And the idea of automated trucks in convoys on the motorway fills me with terror. It’s bad enough getting across to the exits and emerging from them as it is with human drivers, without an automated train – for that is what they are planning.

    • From what I understand, this was a test, so the co-pilot should have been paying attention and been ready to intervene, not just trusting the damned thing.

  4. “Are we surprised”

    No, of course we’re not f*cking surprised. It was only ever going to be a matter of time.

    You didn’t need to be a rocket scientist to see this bastard coming, did you?

  5. What is needed is a speed limit of 12mph and a man walking in front with a red flag… because you know… new-fangled, anything could happen, what’s the world coming to, don’t frighten the horses, progress, oh no!

    • As I mentioned, I have no problem with progress. I do have a problem with unthinking, reactive machines mixing it with the chaotic environment of the public highway, and most certainly at such an early stage of development. This was an entirely foreseeable incident. Put them in their own space and it would be fine.

  6. “This woman was unlawfully killed.” No. Definitely not. I’ve read the reports and seen the incident pictures. Ms Herzberg (the cyclist) stepped or fell out into traffic from a central median or reservation, probably tripping over her heavily laden (and unlit) bicycle after scavenging the area for cans or bottles, which carry a five-cent recycle fee that can be redeemed through the CRV scheme. It’s not unreasonable to deduce that the cyclist was collecting discarded cans and bottles for the recycle fee. Having travelled the highways of California in 2016, I saw people engaged in this very activity. If you look at the incident photos, these bags were full-size (and full) black bin bags, and one can clearly be seen close to the damaged bicycle. Not ‘shopping’ bags as reported.

    So not the vehicle’s or anyone’s fault. If it hadn’t been a self-driving car the cyclist fell in front of, this wouldn’t even have made a paragraph on page four.

    • I can’t comment on US law – although it is based upon UK common law. The likelihood was reasonably predictable and we have a duty of care not to cause injury. That an automated vehicle would hit and kill someone – even if they are contributorily negligent – was entirely foreseeable, so negligence will apply and it would apply with a human driver too. And I would expect a prosecution even if that resulted in acquittal.

      Update – just to enlarge on this a bit. If you do something that results in death, knowing beforehand that death was a likely outcome, then manslaughter charges are appropriate. This was so foreseeable there should have been neon lights all over it.

    • I’ve seen the video, and very much doubt the outcome would have been much better if I’d been driving – and I drive 35,000 miles a year, which does give me fairly sharp reactions (most of the time when a passenger I find it quite scary how long it takes for people to react to things like the people in front of them braking).
      The time from it becoming apparent that there was someone walking in the road to the point of impact just wouldn’t have been sufficient to mentally react, physically move one’s foot to the brake pedal and push.

      I’ve few qualms about the actual tech of self-driving vehicles – the current evidence seems to suggest that they are safer than human drivers, and frankly even if all they did was allow me to snooze on the motorway bits of a journey, that would be of great value.

      I do think that the legal side of things will need some serious thought (exactly who is liable when things do go wrong), and I’m not overly keen on the idea of being the “responsible adult” to an entirely self-driving machine, there to override it if it stuffs up – maintaining adequate levels of concentration driving a “fully manual” motor on long stretches of empty motorway is bad enough as things stand.

      My biggest concern is that I’m not keen on trusting my life to networked control software which can (and will) be hacked by all sorts of people engaged in everything from bitcoin mining to espionage…

      • As a human driver, you would have reacted and made some attempt to avoid the collision. You might not have been successful, but you would have tried. Therein lies the difference. This thing didn’t react.

        • You are spot on. As an ex-professional driver I could see this coming and am not surprised at all at how quickly it came about. Most tech that comes to market is very rarely ready and always ends up being tweaked because they discover this, that or the other wrong with it. As for the assertion by Bill above that the accident would have occurred even with him driving, I do believe he is mistaken. Just because you can’t see the woman in the video doesn’t mean that you would miss her looking through the windscreen. The main thing, though, is that even a split second of braking can reduce the speed of a collision down to a survivable impact. This is why people who fall asleep at the wheel often die in crashes: they do not have that split second of braking. It really does make all the difference.
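
          As a rough, back-of-envelope illustration of why that split second matters – all figures here are assumptions for the sake of the arithmetic, not data from the actual incident – a car doing 40 mph covers roughly 18 metres every second, and even half a second of hard braking knocks a meaningful chunk off the impact speed:

          ```python
          # Back-of-envelope sketch: how much a brief spell of hard braking reduces
          # impact speed. All figures are assumed for illustration, not incident data.

          MPH_TO_MPS = 0.44704

          def impact_speed_mph(initial_mph: float, braking_time_s: float,
                               decel_mps2: float = 7.0) -> float:
              """Speed at impact after braking for braking_time_s seconds at decel_mps2
              (hard braking on dry tarmac is commonly taken as roughly 7-9 m/s^2)."""
              v0 = initial_mph * MPH_TO_MPS
              v = max(v0 - decel_mps2 * braking_time_s, 0.0)
              return v / MPH_TO_MPS

          print(round(impact_speed_mph(40, 0.0)))  # no braking at all: 40 mph at impact
          print(round(impact_speed_mph(40, 0.5)))  # half a second of braking: ~32 mph
          print(round(impact_speed_mph(40, 1.0)))  # a full second of braking: ~24 mph
          ```

          Commonly cited road-safety figures suggest a pedestrian’s survival chances fall away steeply above roughly 30 mph, which is why that fraction of a second of braking can be the difference the comment above describes.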

  7. I like the point you make in para 2, though many human drivers do not seem to use this kind of analysis and, e.g., change lanes to avoid a parked car at almost the last moment.

    • You’re quite right. However, I’ve often sat next to a driver who “saw that coming” and reacted in good time without giving it much thought. There are so many subliminal cues that AI just won’t catch. It will only respond after the offending obstacle has started to move into its path.

  8. “This woman was unlawfully killed and someone, somewhere should be prosecuted for it. If it was a human driver, then that would be straightforward.”

    Hang on, that’s not right. Let’s say this wasn’t a computer-driven car, and a human was driving, and the same outcome transpired. Surely the police investigation would determine who was at fault, and if the driver wasn’t at fault in any way, then no prosecution takes place? Not every fatal accident results in a prosecution for manslaughter (or whatever). If the driver is obeying the laws of the road, and the victim does something stupid and gets killed, then it’s not the fault of the driver, whether it’s a human or a computer.

    I know someone who killed a man who was walking on the motorway. The victim was walking in the slow lane at night (he had been drinking all day and taken drugs) and was hit by the car driven by my friend – the inquest absolved him of any responsibility, as there would not have been enough reaction time, from the victim becoming visible in the lights, to swerve and avoid him.

    So just because someone has died does NOT mean that someone is necessarily criminally liable for that death.

    • The reason I am suggesting that manslaughter applies is because an autonomous car hitting and killing someone is entirely foreseeable, and this is not mature technology and they were trying it out with live people – in other words, the general public are being used as an experiment for this immature technology and lives are being risked. That is negligence. So the comparison with a human driver doesn’t apply – unless the human driver was driving in a manner that made a collision likely.

      Just a further thought here – if a human driver failed to make any attempt to slow down, I’d expect a prosecution.

      • “the general public are being used as an experiment for this immature technology and lives are being risked. That is negligence.”

        You don’t know that. It may well be the case, but it’s not fact, yet. It’s entirely possible that no one, human or machine, could have stopped in time in this case. We won’t know until the investigation is complete. Calling it manslaughter just because it’s a computer-driven car and someone died is jumping the gun.

        • No, pointing out that this was foreseeable is not jumping the gun. And it was entirely foreseeable. That’s when it becomes manslaughter – and at the very least, negligence. Of course they are using the public in their experiment – they are road-testing on the public highway. That, frankly, is gross negligence.

          Yes, I know that this accident might have resulted in the same outcome had it been a human driver. It’s not the not being able to stop in time that’s the issue here. This thing didn’t see the obstacle and it should have, and they let it out on the roads. I find that terrifying.

          • I foresee that a motorcyclist will, sometime in 2018, somewhere in the world, kill a pedestrian.

          • But not necessarily through negligence – and that’s a strawman argument there.

            You are missing my point here. This technology is not yet mature – and I remain unconvinced that it ever will be sufficiently mature for our road systems. Taking it out onto the road – a chaotic environment – was clearly a foreseeable risk to other road users. And who the fuck do these people think they are, using the rest of us as guinea pigs for their experiments?

            Negligence is defined as doing something a reasonable man would not do (or vice versa).

            A reasonable man would not road-test immature technology in a live environment, with people moving about in an unpredictable manner, until he was absolutely certain that the vehicle would respond appropriately in all possible scenarios – including idiots stepping into the road at short notice. Because that’s what happens. It is foreseeable.

            That this vehicle completely failed to respond tells me that I am correct – the video confirms it beyond all doubt. Had the vehicle made some attempt to avoid the collision, I would be taking a different view regarding legality in this case. The thing is seriously flawed, yet these cretins still allowed it to be tested on the public highway. That is certainly negligence.

            Taking an unroadworthy vehicle onto the public highway is, rightly, an offence. An automated vehicle that does not see, and therefore does not respond to, an obstacle in the road is clearly unroadworthy. It could just as easily have been a pedestrian crossing at a pedestrian crossing.

          • And who the fuck do these people think they are, using the rest of us as guinea pigs for their experiments?

            Sadly, self-driving cars are just one example – there are many more which we’ve never been consulted on…

          • I told you there would be things we agree on….;-)
            I am wondering, though, what the test would be of whether something is foreseeable. I knew this would happen, as did you, and I bet that the vast majority of drivers would feel the same, but I have no figures to back my assertions. It just seems so damn obvious. I bet if it went to court, though, a judge would probably cite some dodgy bit of research that was funded by the makers…

          • It would actually be an interesting idea to see how well this tech would do on a hazard awareness test. A human, for instance, will see a young person walking on the pavement bouncing a ball and will register that as a potential hazard, yet as far as I’m aware this kind of tech will only realise there is a hazard when the person suddenly flies out into traffic to retrieve the ball. As our host has rightly pointed out, this is reactive, yet we have to pass the hazard awareness test before being allowed out on the road, showing that we are proactive. I could think of dozens of scenarios where this tech would fail that a human would sail through easily.

          • This is its inherent weakness. Even relatively poor drivers will be subliminally noticing hazards and starting to prepare before the hazard becomes a problem. Most people talk of having a sixth sense when driving. This is all it is.

          • There have been loads of tests on private land. It is now time that they actually test them in a real environment, but the co-pilots are sold on how good it is and lulled by hours of boring waiting, so that when something unforeseen like this happens they are way behind the curve in reacting.

            Dark clothing, reflective bags – there are many things that the sensors do not see and, as I said when they were first proposed, the only way they will work is with nothing else on the public road: no cars, no bikes, no people. They are ideal for automated driving in works where people are not able to go and can be ordered about, but of no use where kids can run out after a ball in the middle of the night.

            If everything follows the rules then it is OK. Every day I see dozens of incidents where people don’t.

            Our normal system of justice allows us to make the risks too expensive to experiment in this way. What is so special about cars that the government will offer immunity for these crimes?

          • Dark clothing, reflective bags – there are many things that the sensors do not see and, as I said when they were first proposed, the only way they will work is with nothing else on the public road: no cars, no bikes, no people. They are ideal for automated driving in works where people are not able to go and can be ordered about, but of no use where kids can run out after a ball in the middle of the night.

            This. Precisely this. However, such is the obsession with this technology that the likely outcome is not to give them their own space, but to drive the rest of us off the roads and give them our space.

            Our normal system of justice allows us to make the risks too expensive to experiment in this way.

            And quite right too. I have no problem with people being allowed to take risks provided they are the ones being exposed to them. Placing unwitting and unpaid members of the public at risk for the sake of their geekery is downright cavalier.

          • Everything is foreseeable. It’s foreseeable that any given car driver could have an accident and kill someone. Does that mean they are necessarily criminally liable for every accident they have, regardless of whose fault it was? Even if a madman jumps out in front of them at the last minute on a dark night? Of course not. The criminal liability lies not in the event being foreseeable, but rather in whether it was the fault of the driver and, if not, whether the driver should have reacted sooner. If either of those is the case then yes, a criminal case of some kind would be made against the driver.

            In this case a person stepped out in front of a car. It happens every day in every country, and every day someone dies as a result, with human drivers. Sometimes the driver has some liability; quite often not. The fact the computer was driving makes this case no different at all. If the computer had done something wrong, against the rules of the road – driven on the pavement, for example – then you would have a point: it would have committed an act, rather than a reaction to someone else’s act. Then criminal charges against someone would be in order.

            But here you have nothing but a person walking in front of a speeding car. That is foreseeable to end badly, regardless of whether the driver is human or not.

          • Sigh, you are deliberately missing my point here and being obtuse. You know damned well what I mean when I talk about foreseeable in this context.

            Foreseeable in this instance is clear. Is doing this activity likely to cause harm, and can we predict that it will happen? Yes, of course we can. It was blindingly obvious that this was going to happen sooner or later, because this vehicle was on the roads without being under proper control.

            And this car would have hit a pedestrian on a crossing, because it did not react to the obstruction. If I hit a pedestrian who jaywalks and I have done everything in my power to avoid the collision, then no fault applies and the foreseeability argument doesn’t apply either. If I take to the road in a defective vehicle, it does. It really is that clear.

            The fact the computer was driving makes this case no different at all.

            It makes all the difference. AI cannot and will not be proactive because it cannot anticipate in the same manner that a human driver can: it does not see subtle cues about behaviour, it cannot exercise anticipation and awareness, and therefore it will react – if it reacts at all – after the situation has started to develop. Too fucking late. A good driver anticipates long before that and has already prepared for it.

            This particular vehicle was defective, yet it was out on the public highway dicing with other road users. This accident was entirely foreseeable and to use unwitting road users as guinea pigs in this experiment was downright cavalier.

            That is foreseeable to end badly, regardless of whether the driver is human or not.

            So what part of “the vehicle did not make any attempt to avoid the collision” did you not get? Unless they were asleep at the wheel, a human driver would have made some attempt to avoid the collision and, as Tony mentioned above, that might have made the difference between death and survival for the pedestrian.

          • Human drivers are fallible too. Not all humans are good drivers, yet we let them all on the road. There are probably hundreds of thousands, if not millions, of people out there who an objective observer could say are likely to have an accident at some point. Do we criminalise them if they do have an accident? It was entirely foreseeable, after all.

            If in this case the computer failed to see the person stepping out in front of it, then there may or may not be criminal proceedings required. However, the mere fact that a person died and a computer was in charge at the time does not mean someone is criminally liable; it depends on the facts of the case, just as it would if a human driver made the same error.

            Many things are now entirely computer-controlled – planes, trains, train signalling, subway systems. We accept this quite happily, even though at times this technology may malfunction and end up killing someone, because on average it’s better than humans doing it all the time. Who knows, this car might have avoided accidents at some other points in its life where a human driver would have failed to react in time. We can’t count the accidents that don’t happen, so we are only left with those that do. If computer-controlled cars have fewer accidents than human-controlled ones (even if differently distributed), then it’s got to be an advance.

          • Planes, trains and subway systems are not chaotic like the public highway. These people placed a machine that was not under control on the public highway. That is the fact of the matter. That this was going to happen was entirely predictable and foreseeable. To do this was criminally negligent.

            I repeat, as you seem to be struggling with this – the public highway was being used to test a device that was inherently dangerous precisely because it was not being controlled properly. The general public were being used as unwitting guinea pigs to satisfy a bunch of overgrown schoolboys road-testing their latest computer game. The public highway is no place for such machines. Controlled environments, such as rail systems, yes. The highway, no. That the outcome was going to happen, and that a reasonable man could see this, makes it criminal.

  9. @LR

    human senses are proactive, whereas AI is reactive. The autonomous vehicle starts to react after it has sensed a potential collision. A human driver can watch a situation developing, reading body language and little subliminal cues, and react long before the situation has become dangerous – long before the AI has even suspected that a situation is developing.

    “Can” being the operative word. You’re looking at it from a biker’s perspective – those who survive are those who “watch a situation developing, reading body language and little subliminal cues, and react long before the situation has become dangerous”; the same applies to road conditions.

    Most car drivers live in their bubble and are reactive – often too late.

    • Even a poor driver would have seen this coming. She had to cross a lane before she was directly in front of the vehicle; a human driver would have been reacting by that time. Human drivers, regardless of skill, have eyes and brains. They can anticipate, which is something AI cannot.

      In a predictable, discrete environment AI will be fine. However the public highway is full of constantly changing hazards that even poor drivers will see and anticipate long before an automated vehicle.

      To allow these things onto the road is criminally reckless, and the states that have allowed it have been wooed by a bunch of geeks who think that the highway is just another computer game.

      • However the public highway is full of constantly changing hazards that even poor drivers will see and anticipate long before an automated vehicle.

        Seriously? They don’t.

        An example of “bubble drivers” not noticing what’s occurring ahead:

        Stepfather driving ~30mph. Car ~300 yards in front indicates, pulls over and stops. Car behind follows and crashes into it. Hit vehicle rises about 3 feet then falls.

        Stepfather follows and starts slowing.

        I say: “Tom, don’t pull up, drive around”

        Tom says: “Why?”

        I’m sure you witness similar frequently.

        On automated vehicles: trains/trams, planes & ships – done; capitulation to unions is the problem.

        Roads – do M’way lorry convoys first, inc. join/depart and remove/add driver. Then, when all OK, consider more complex car journeys. Starting with almost the most difficult is, imho, stupid and equivalent to willy-waving. Even DARPA knows that.

        As you say “public highway is full of constantly changing hazards” especially when not on “A” and “M” roads.

        • Trams, planes and ships operate in a highly controlled environment. That’s why automation there is fine.

          Now that I’m up and about, let me expand. Yes, even a poor driver is able to see (sensors don’t see in the sense of observe, anticipate and plan). Your example wasn’t a lack of observation, but poor planning skills. He still started to react, just not with a good reaction.

          As for the idea of a stream of automated trucks on the motorway, that is fucking insane and terrifying for the rest of us. It’s bad enough now trying to exit and enter a motorway when a group of trucks is blocking the exit – at least human truckers can see the developing situation and manoeuvre to allow the flow.

          If you want a train of trucks, put them on the railway.

          • “Your example wasn’t a lack of observation..”

            He didn’t see an RTA happening in front of him – and that wasn’t a lack of observation?

            “He still started to react, just not a good reaction.”

            No, he was on auto-pilot following vehicles in front, not observing and reacting. He’d have followed them off the end of a pier. Like Sat-Nav zombies.

            “… a stream of automated trucks on the motorway…”

            With restrictions on when. I was thinking of my frequent night-time* London-Scotland car/bike journeys.

            *Night-time as less traffic = faster journey, but many HGV convoys.
