Viewing 40 posts - 81 through 120 (of 341 total)
  • Killer cars stalking our streets…
  • aracer
    Free Member

    FuzzyWuzzy wrote:

    For example if you see a pedestrian looking like they might cross without looking properly (on the phone, drifting to kerb edge), do you:

    a). Take over and slow down just in case (as you probably would if driving a normal car)

    😆 – you’re having a laugh. A lot of people on here might, because we’re interested enough in driving safely to care about and pay attention to these things. I’m less than convinced that even an average driver would. Certainly the norm is to treat a pedestrian stepping out as completely unpredictable, and that would be a successful defence.

    Edukator wrote:

    So we make a calculation, adapt speed, and maybe cover the brake, and maybe give a wide berth, and maybe even put the hazards on to warn people behind us.

    We’re so good at it that people who really want to kill themselves generally choose a train rather than a car or a truck. We might stop or slow down enough to only injure them. In future suicidal people will be able to choose any vehicle which will at least give train drivers a less traumatic time.

    and again, good drivers might, but they’re mostly not the ones killing people, and they’re outnumbered. FWIW I also think you’re wrong to assume that autonomous cars won’t make those judgements and act accordingly.

    martinhutch wrote:

    My view is that there is no way an autonomous vehicle can interpret the road and its surroundings as well as a good, experienced, driver. Without knowing it, good drivers often spot hazard cues long before they are identifiable as actual hazards and the brain is a very good filter of useful/irrelevant information.

    I fundamentally disagree there – I don’t see an obvious reason why computers shouldn’t be able to perform that well at processing the information (with the advantage of far superior sensors).

    However, it’s likely that an autonomous vehicle will be significantly better than a large number of the other drivers who grace our roads.

    Which is the fundamental truth

    Expecting these vehicles to cut massively the number of casualties is unrealistic.

    Not even when the vast majority of casualties are caused by poor drivers, or drivers failing to look properly etc.? What is the realistic chance of an autonomous car completely failing to spot a cyclist in front of it and running straight into them, or of one pulling out of a side road in front of a cyclist?

    martinhutch
    Full Member

    What about mathematics? Can a machine be better at calculations than a human? What about chess? Can a machine be better than the best human at chess? What about Go?

    If you can turn my High Street into a board where grannies can only go forwards, not backwards, and schoolkids move three steps forward and one to the side, then you’re onto a winner. Chess is a game with very simple rules and multiple possibilities all deriving from those simple rules. Live traffic is a game with some rules, which are frequently not followed in a variety of ways, and pieces that sprint onto the board mid-game or hide behind roadsigns.

    I’m sure the ability of these autonomous vehicles will improve. The tech is still relatively in its infancy. But unlike chess, where the ability to beat humans simply meant outmatching them in terms of pure calculations, computers have to find a way to get better than a species with millions of years of adaptations designed to detect and predict hazards and sift out the genuine threat from all the foliage.

    aracer
    Free Member

    uponthedowns wrote:

    So what’s stopping it happening? ASLEF?

    I don’t know exactly, but I suspect that’s part of it. Though there’s also the issue that trains are held to a far higher safety standard than roads and there’s the perception that having a driver as well as the autonomous system results in another layer of protection.

    I note that I’m also not suggesting that the drivers do nothing – AFAIK they still do a lot of the easy stuff themselves, it’s just that the autonomous systems would take over in the event of a safety related issue. Certainly there’s not a mainline track in the country where a driver heart attack should result in a crash even without a dead man’s handle.

    ndthornton wrote:

    aracer

    Are you saying that it’s possible, both in terms of hardware and software, to create a system that can resolve….

    a = “Child distracted by social media”

    b = “Child looking carefully at traffic”

    c = “Any one of an infinite number of similar looking scenarios”

    …and be able to prove compliance 100% of the time in 100% of these scenarios with 100% reliability (because that is the level of verification required to get new technology on to a production vehicle)

    I’m saying they can not only do better than the average driver right now (the one who isn’t even paying any attention to the child), but that even if we’re not already there now they can do better than even the best driver. Though there we go again with the attempt to run faster than the bear rather than just faster than the other bloke. The sort of technology you’re applying those rules to is also the sort of thing which adds another point of failure rather than replacing the biggest existing weakness.

    I’ll go back to the trains… automation would be easy – almost trivial. The fact that it hasn’t happened should be a big alarm bell, considering the problem and the risk are many orders of magnitude bigger with cars.

    You appear to be ignoring that they can and are. DLR.

    dissonance
    Full Member

    In almost any critical life or death decision, I’ll take an algorithm over human “judgement”.

    Whose judgement designed the algorithm?

    aracer
    Free Member

    dissonance wrote:

    In almost any critical life or death decision, I’ll take an algorithm over human “judgement”.

    Whose judgement designed the algorithm?

    Somebody* with plenty of time to consider the options from a full set of information rather than trying to decide in a split second from a limited subset.

    *actually we’re talking multiple levels of review here over the course of months

    jimjam
    Free Member

    martinhutch

    What about mathematics? Can a machine be better at calculations than a human? What about chess? Can a machine be better than the best human at chess? What about Go?

    If you can turn my High Street into a board where grannies can only go forwards, not backwards, and schoolkids move three steps forward and one to the side, then you’re onto a winner. Chess is a game with very simple rules and multiple possibilities all deriving from those simple rules.

    That wasn’t really my point. In really simple terms driving is about judging speed and distance. Every conceivable variable that a human is considering will be judged by a computer that isn’t guessing – or if it is, it’s guessing based on better information than the average driver has – and it won’t get distracted, tired or angry.

    Autonomous cars will be anticipating every possible scenario and variable and using information a human doesn’t have access to – a really simple example, people don’t have night vision or 360 degree vision.

    thecaptain
    Free Member

    computers have to find a way to get better than a species with millions of years of adaptations designed to detect and predict hazards and sift out the genuine threat from all the foliage.

    Lol. It’s almost like you think humans are good at driving.

    Edukator
    Free Member

    My main worry with self-driving vehicles is not with whatever new perfectly functioning models are put on the roads, it’s what happens when stuff goes wrong. The cars will be dependent on a mass of sensors feeding signals into a computer. We have examples of extremely well maintained machines where this is the case – planes. And they crash just because of a bit of ice in a tube. And even with two pilots still on the plane it crashes, because the pilots get confused when they have to go back to manual.

    Take a look at current cars: most of the problems are electronic. “Fail safe” will be programmed in, you say. But will it? Will the car stop every time there’s a bit of mud on the lens or the radar signal is jammed? Will it just slow down, stop in the middle of the lane, or continue? If it does something, will all the other cars make the right decisions about what to do? Because we’re talking about lots of them, and maybe they all have their radars messed up by the same parasite. When one car runs a red light because it got confused, will all the others respond appropriately?

    I can see a future for autonomous traffic systems but I think that speed limits will have to be dropped to make it work. Autonomous trams work at low speed because they are slow enough and cautious enough to be safe around pedestrians. Allow autonomous vehicles to go as fast as humans currently do and there will be fatalities – maybe fewer than with humans at the wheel, but a whole lot less acceptable.

    mogrim
    Full Member

    Whose judgement designed the algorithm?

    There isn’t “an algorithm”. It’s machine learning, completely different kettle of fish.

    jimjam
    Free Member

    Edukator

    My main worry with self-driving vehicles is not with whatever new perfectly functionning models are put on the roads, it’s what happens when stuff goes wrong.

    Limp mode.

     Will the car stop every time there’s a bit of mud on the lens or the radar signal is jammed. Will it just slow down or stop in the middle of the lane or will it continue.

    Self cleaning lenses aside, perhaps in the event of a sensor failure the car might prompt the useless meatbag in the driver’s seat to put down his or her phone and drive the car?

    I can see a future for autonomous traffic systems but I think that speed limits will have to be dropped to make it work.

    I know you live in eternal hope of lowered speed limits, Edukator, but AVs will mean speed limits are increased massively – current speed limits are based around general human reaction times and abilities. Once this tech becomes commonplace it’ll only be old, dumb human drivers who have to adhere to current limits. Everyone else will be able to go as fast as their tax band allows.

    dissonance
    Full Member

    It’s machine learning, completely different kettle of fish.

    Not really.

    “Machine learning” covers a whole range of different technologies with varying amounts of direct intervention. The pure machine learning is also mostly restricted to the recognition tools, as opposed to deciding what to do.

    You might have a NN to recognise that it’s a) a traffic light and b) red, but the instructions for what to do will be written into it.

    An example would be Mercedes and their comments about who they would have the system prioritise: the driver/passengers or a bystander.
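    That perception-versus-policy split can be sketched in a few lines. This is a minimal illustration only – the function names, labels and the speed threshold are all hypothetical, not taken from any real driving stack:

```python
# Sketch of the split described above: a learned model handles recognition,
# while the response to what it recognises is hand-written engineering.
# All names and thresholds here are hypothetical, for illustration only.

def recognise(frame):
    """Stand-in for neural-network inference. A real stack would run a
    trained classifier here; for illustration we fake its output."""
    return frame.get("detector_output", "nothing")

def decide(detection, speed_mps):
    """Hand-coded rules: the 'what to do' part is written in, not learned."""
    if detection == "traffic light, red":
        return "brake"
    if detection == "traffic light, amber":
        # Explicit engineering judgement baked into the system, much like
        # the Mercedes prioritisation decision mentioned above.
        return "brake" if speed_mps <= 14 else "proceed"
    return "proceed"

print(decide(recognise({"detector_output": "traffic light, red"}), 10.0))  # prints "brake"
```

    The point of the structure is that even where a neural network supplies the recognition, someone’s judgement is still encoded in the `decide` step.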

    Edukator
    Free Member

    No way will a manufacturer produce a vehicle that goes faster than our current cars, jimjam. The only reason they produce cars that will go far too fast now is that the ethical dilemmas are with the drivers, not the manufacturers.

    ” Here is the nature of the dilemma. Imagine that in the not-too-distant future, you own a self-driving car. One day, while you are driving along, an unfortunate set of events causes the car to head toward a crowd of 10 people crossing the road. It cannot stop in time but it can avoid killing 10 people by steering into a wall. However, this collision would kill you, the owner and occupant. What should it do? ”

    https://www.technologyreview.com/s/542626/why-self-driving-cars-must-be-programmed-to-kill/

    ndthornton
    Free Member

    I’m saying they can not only do better than the average driver right now (the one who isn’t even paying any attention to the child),

    Oh I see, “you’re saying”. Well, I’m convinced 🙂

    Let’s let them loose now –

    who needs facts, evidence and testing when you have sayings

    yourguitarhero
    Free Member

    Most cars can go faster than the speed limit already though. So it’s more a case of raising the limits than producing higher performance cars.

    aracer
    Free Member

    ndthornton wrote:

    who needs facts, evidence and testing when you have sayings

    Wow – so your argument is now reduced to pedantry over wording?

    Are you suggesting that the average SMIDSY driver will be doing a better job of observing the child than the computer with its array of sensors and lack of interest in FB? Do you think the evidence for that isn’t available? That’s where the argument is, not over whether it’s possible for the computer to do better than a driving god.

    aracer
    Free Member

    Edukator wrote:

    there will be fatalities – maybe less than with humans at the wheel but a whole lot less acceptable

    Yes, because it’s a whole lot better to be killed by a human driver. You’re still trying to run faster than the bear here.

    Edukator wrote:

    ” Here is the nature of the dilemma. Imagine that in the not-too-distant future, you own a self-driving car. One day, while you are driving along, an unfortunate set of events causes the car to head toward a crowd of 10 people crossing the road. It cannot stop in time but it can avoid killing 10 people by steering into a wall. However, this collision would kill you, the owner and occupant. What should it do? ”

    Which is something which has to be considered, but it’s an edge case, an edge case which may never ever occur in reality, because as discussed already the autonomous car would judge the situation so that it never found itself speeding towards a group of 10 people. The answer isn’t to solve such dilemmas but to ensure they don’t arise in the first place, which is entirely plausible.

    kerley
    Free Member

    Of course autonomous cars will be better than humans.  The problem is that people expect them never to get anything wrong – not once, not ever.  I would take 10 deaths a year from autonomous vehicles vs the current ~1700 but those 10 will get a lot of coverage.

    Edukator
    Free Member

    because as discussed already the autonomous car would judge the situation so that it never found itself speeding towards a group of 10 people

    Read the whole article I linked. The car may never find itself speeding towards 10 people (but it might), but it’s quite likely to find itself speeding towards one or two, and fail – which is where this thread started. A self-driving car had an accident where, whatever the spin, an alert driver anticipating what a person with a bike dripping with plastic bags might do could have done better than all that fancy software and those sensors.

    aracer
    Free Member

    I did read the whole article – you’re the one who chose to quote the sensationalist edge case in an attempt to make your point.

    As for the real case here, I’m less than convinced from what’s been reported that I’d have done better than the computer there, and it’s possible I may be a little better at paying attention to pedestrians and cyclists than the average driver. You may think you would have done better, and I’ll admit I’d rather take my chances with you driving than with me, but I’m still not convinced of that, despite you probably being way, way better and safer than the average driver.

    There is an interesting point here that it may be possible for the computer to do better and that they need to program better for such scenarios, so that they can avoid those ethical dilemmas in the first place. I very much doubt that it’s impossible for a computer to consider the situation the same way as you.

    ndthornton
    Free Member

    Are you suggesting that the average SMIDSY driver will be doing a better job of observing the child than the computer with its array of sensors and lack of interest in FB?

    Well…

    Yeah – I mean you have massively oversimplified and missed the point as usual, and I refuse to Google what SMIDSY means – but essentially yes.

    You could maybe (and it’s a big maybe) design a system to react better than a human in this very specific scenario that you have described. To do just this one thing. That is what computers are great at: doing one or a small number of very well defined tasks with very well specified boundaries. They can repeat these tasks with high levels of accuracy and repeatability, and they can do them thousands of times faster than a human.

    That’s what they are good at – but unfortunately that’s not the task. The task is to respond to a limitless number of possible scenarios, none of which have clear boundaries. You can’t code for an infinite number of possibilities. What you need is an AI – what you really need is a human brain – and we are nowhere near that (I have doubts it’s even possible).

    So in answer to your question – you might be able to anticipate the child better than a human – but you might kill the old lady on the bicycle in the process.

    aracer
    Free Member

    Ah, because you’re thinking computers are incapable of doing/observing two things at once – something humans are notoriously excellent at?

    No, you don’t have to specifically program it to avoid Gladys who lives at number 42 separately from little Johnny who lives at number 13. I’m not sure you have much idea at all of how these things work.

    ndthornton
    Free Member

    Wait – what – no

    Oh I give up

    Lets see what happens shall we

    If they become available in my lifetime I will buy you one – how about that?

    Edukator
    Free Member

    I’m not sure you have much idea at all of how these things work.

    Pot calling the crystal glass decanter black.

    Northwind
    Full Member

    The chat above about train staff – driverless isn’t the same as staffless. The train’s software can’t stop a fare dodger, a fight or a sexual assault, or intervene in a medical emergency, or escort people out of the train along the track in an accident or similar. I’d rather have a guard and no driver than a driver and no guard.

    maxtorque
    Full Member

    It’s worth noting that in the real world, even “skilled” drivers are not actually very good, because accident scenarios are incredibly rare.

    I spent 5 years at Prodrive, where they ran driver training programmes for emergency-service drivers (and for general “company car” drivers too). Even among trained police drivers, who spend a lot of time driving, in reality probably under 10% could react to a sudden emergency-stop situation with a decent response time and level. And that was on a proving ground, doing advanced driving. How many members of the general driving public, on their average commute, on a wet Tuesday afternoon on a back road in Swindon, do you think could “ace” a brake-and-swerve around a cyclist suddenly appearing from between parked cars into their path? In my experience, something like 75% wouldn’t even have got their foot off the accelerator, let alone nailed a perfect ABS stop or a well judged handwheel input, before they hit said cyclist…

    thecaptain
    Free Member

    Just to be clear, is there anyone here arguing that self-drive cars are not going to be massively safer than human-driven ones?

    Edukator
    Free Member

    No, but the human in Swindon might have decided to give the parked cars a wide berth to improve visibility because there was nothing else on the road, so didn’t feel obliged to stay in lane (unlike the stupid computer car); realised the stopping distances had increased and reduced speed; had most of his attention focused on the parked cars, as that’s where the greatest unknown lay, so covered the brake within half a second of the bicycle tyre showing (well in advance of the computer, which just had it down as environmental noise); hit the brakes as half the bicycle wheel appeared (the computer still had it in the “falling leaves, nothing to worry about” category); and, thanks to ABS (which means zero skill is required for an emergency stop) and an instinctive swerve (which remained controlled thanks to said ABS), avoided the cyclist completely – but the computer wouldn’t have.

    jimjam
    Free Member

    Edukator

    No way will a manufacturer produce a vehicle that goes faster than our current cars, jimjam.

    Of course they will. If all cars are fully autonomous, homogeneous grey boxes, you instantly kill the premium/luxury half of the car industry, which is worth hundreds of billions a year. No one is going to pay £40,000 to sit passively in an Audi at an autonomously driven, GPS-limited 60mph when a £20,000 Skoda will do the same. When people are removed from the equation there’s no logical reason to adhere to speed limits based on human reaction times.

    The only reason they produce cars that will go far too fast now is that the ethical dilemmas are with the drivers not the manufacturers. Here is the nature of the dilemma. Imagine ………….

    Nope. People have always been willing to pay more to travel faster or in more luxury than the next guy. The reason we can buy a 1000bhp car (if money wasn’t a barrier) is nothing to do with ethics – in many ways it’s completely unethical – it’s because there’s a market there. The ethical concerns / trolley tests have been discussed here in multiple threads going back years. It’s a red herring. The reality is that AVs will kill, and when they do it’ll be because they interpreted their data one way, not another, and acted in what they perceived to be the best way. It’ll be a novel headline, which slowly becomes reality. There are much bigger and potentially more insidious implications of AI systems making decisions, which no one is discussing because they’re not as obvious, or as superficially dramatic.

    Edukator
    Free Member

    To answer your question, thecaptain: I think self-driving cars will be no better or worse than driver-driven cars with a GPS tracker/speed limiter and the same level of driver aids as the driverless car (alcotest ignition lock, auto emergency braking, blind-spot warnings etc.). At the same level of technology I’m convinced that the railways and airlines have already shown us the best solution: a human driving or flying, but automatic systems to help them.

    mikewsmith
    Free Member

    No, but the human in Swindon might have decided

    From a random sample of drivers on a round trip from Manchester to Rugby, not many would have done any of that; plenty would have gone through red lights, gambled on amber, passed too close and not seen most of what was around them. Your assessment of the human driver is optimistic to say the least – mostly because if they were doing all that you suggested they would probably miss what was coming from the other side, or drive into the back of the car in front.

    jimjam
    Free Member

    Edukator

    At the same level of technology I’m convinced that the railways and airlines have already shown us the best solution: a human driving or flying but automatic systems to help them.

    That’s a terrible idea.

    Edukator
    Free Member

    Speed limits aren’t based on reaction times, they’re based on the consequences of collisions in different environments. In France at least the limits correspond to:

    50kmh urban limit – most pedestrians survive being hit at less than 50kmh. There are moves in many cities to go down to 30kmh to improve survivability, and not just around schools. Kids have a poor survival rate when hit at over 30kmh, which justifies the lower limits around schools, but there are kids everywhere, so a blanket 30kmh limit in residential areas is being enforced in more and more towns.

    90kmh (soon to be 80kmh) extra-urban limit – survivable collisions between oncoming cars, or cars hitting roadside obstacles

    110kmh non-autoroute dual carriageways – no risk of head-ons and safer run-off

    130kmh – the speed at which most autoroute collisions remain within the barriers. Germany suffers much higher accident rates and many more cross-over accidents on its unlimited sections.

    None of that changes for driverless cars, so there’s no reason to allow them to go faster. Indeed, because they have more trouble picking out potential risks from the rest of the environmental noise than a human, they should be made to go slower.

    aracer
    Free Member

    thecaptain wrote:

    Just to be clear, is there anyone here arguing that self-drive cars are not going to be massively safer than human-driven ones?

    Quite a few it seems.

    Edukator wrote:

    No, but the human in Swindon might have decided to give the parked cars a wide berth…

    but the computer wouldn’t have.

    Your arguments are getting kind of bizarre now – personally I’d put my money on the computer doing all those things (along with having sensors to detect the bicycle before a human could spot it) because that’s how the algorithms will be programmed and very few of the human drivers doing so. I find it strange that you’re arguing here for humans driving so much better than you do on every other driving thread – I mean you are the same chap normally telling everybody to slow down?

    Which also answers your last point – a computer will be continually attentive and have the right attitude to things like this. You can’t compare professional airline pilots to average drivers (notwithstanding that the fully autonomous systems can do a better job than pilots anyway https://en.wikipedia.org/wiki/Autoland).

    FWIW Edu I’ve worked on AI systems, on embedded systems with sensors and self-learning algorithms and alongside people developing very similar technologies, I have a fair idea how these things work.

    aracer
    Free Member

    jimjam wrote:

    Edukator

    At the same level of technology I’m convinced that the railways and airlines have already shown us the best solution: a human driving or flying but automatic systems to help them.

    That’s a terrible idea.

    Not least because studies seem to show that the intermediate stages in autonomous cars have significant issues with the human paying less attention when you give them lots of aids.

    jimjam
    Free Member

    aracer

    jimjam wrote:

    Edukator

    At the same level of technology I’m convinced that the railways and airlines have already shown us the best solution: a human driving or flying but automatic systems to help them.

    That’s a terrible idea.

    Not least because studies seem to show that the intermediate stages in autonomous cars have significant issues with the human paying less attention when you give them lots of aids.

    Exactly. An airline pilot or a train driver is paid specifically to deliver their passengers safely to their destination. Their livelihood and hundreds of lives depend on them paying attention to what they are doing and not say, checking Facebook or texting. Jack or Jill on the other hand, on their way to the gym, coffee shop, work etc are on their own time, in their own space and the only life they are concerned with at any given time is their own. Give them a chance to spend more time on snapchat and they’ll be all over it.

    Edukator

    Speed limits aren’t based on reaction times, they’re based on the consequences of collisions in different environments. In France at least the limits correspond to:

    50kmh urban limit – most

    Please… even if it’s based on the consequences (which it’s not), the determining factor as to whether these consequences are suffered or avoided is a human’s ability to perceive and/or react to a hazard. Are you seriously suggesting that the 60mph national speed limit is based on survivability of an impact at that speed? For who exactly? How many pedestrians will survive a 60mph collision with a car? How many cyclists? How many horses? How many drivers will survive a 60mph collision with a brick wall or a tractor?
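    For what it’s worth, the reaction-time point can be made concrete with the standard stopping-distance split: a thinking component that scales with speed and a braking component that scales with speed squared. The 1.5 s reaction time and 6.5 m/s² deceleration below are illustrative assumptions, not official highway-code figures:

```python
# Stopping distance = thinking distance (linear in speed) + braking distance
# (quadratic in speed). The reaction time and deceleration defaults are
# illustrative assumptions, not official values.

def stopping_distance_m(speed_kmh, reaction_s=1.5, decel_ms2=6.5):
    v = speed_kmh / 3.6                    # convert km/h to m/s
    thinking = v * reaction_s              # distance covered before braking starts
    braking = v * v / (2.0 * decel_ms2)    # v^2 / (2a) under constant deceleration
    return thinking + braking

for kmh in (30, 50, 80, 130):
    print(f"{kmh} km/h: {stopping_distance_m(kmh):.0f} m total")
```

    Note that shrinking only the reaction term (the machine’s advantage) shortens the total markedly at urban speeds, while the braking term, which is pure physics, dominates at autoroute speeds – so the numbers cut both ways in this argument.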

    Edukator
    Free Member

    You really think a driverless car will change lanes and drive on the wrong side of the road to get better visibility, aracer? And make reasonable decisions on the risk of a pedestrian or cyclist doing the unexpected? The incident at the start of this thread shows the car failed to avoid a pedestrian pushing a bike. A motorcyclist was hit by a driverless car that changed its mind about a lane change. A driverless car indicated a lane change through a junction, which was quite rightly interpreted as a turn signal for the junction, so there was a collision. You can’t program a car to cope with every situation it’s going to meet.

    It’ll mistake a dog for a human and provoke a multiple pile-up with an unnecessary swerve/brake; it’ll dismiss as environmental noise something that is important. Think about it. That’s why I linked the article about the ethics of it all. Uber and Google have already proved that cars strictly programmed to abide by the law and stay well within speed limits foul up. Just how slowly are these things going to have to go through urban areas for people to feel safe on their streets?

    Put a GPS tracker and speed limiter in every car to make drivers more responsible and take away the ability to speed, and you’ll do a lot to prevent collisions between vehicles without removing human care and consideration for others.

    I’m still suggesting people slow down. I’m very happy with the new 30kmh and 80kmh limits.

    chakaping
    Free Member

    I don’t understand why some people are so enthusiastic about them when there are so many downsides covered in this thread alone.

    The same dildos who think that self-service checkouts are a sign of progress probably.

    mikewsmith
    Free Member

    You really think a driverless car will change lanes and drive on the wrong side of the road to get better visibility, aracer? And make reasonable decisions on the risk of a pedestrian or cyclist doing the unexpected?

    The majority of drivers don’t. Are you assuming visibility from a driver in the driving seat perspective or from an array of sensors and cameras?

    A driverless car indicated a lane change through a junction, which was quite rightly interpreted as a turn signal for the junction, so there was a collision. You can’t program a car to cope with every situation it’s going to meet.

    Exhibit A, B, C (skip a bit) Z: drivers are making all these mistakes today – will the non-driver make fewer of them?

    It’ll mistake a dog for a human and provoke a multiple pile-up with an unnecessary swerve/brake; it’ll dismiss as environmental noise something that is important. Think about it.

    Bring some evidence and we will examine it. Project an opinion and that is all it is, unless you are working on the AI and learning aspects of the cars or the legislation to accompany them.

    The ethics case is to all intents and purposes an academic exercise in ethics: who would you kill in situations where you have 0.01s to decide? How do most drivers fare?

    Put a GPS tracker and speed limiter in every car to make drivers more responsible and take away the ability to speed, and you’ll do a lot to prevent collisions between vehicles without removing human care and consideration for others.

    Really? There are hundreds of dash-cammed cars out there – has driving improved? Will it stop tailgating? Will it make people stop when they are tired, not chance a drink, not drive when medicated, drive to the conditions, or pay more attention?

    The automated car will not do those things: it won’t close its eyes, retune the radio, check its phone, talk to the passenger, argue with the kids etc. Make drivers perfect and you can hold them up to that standard. The roads would be significantly safer with a lot of drivers removed from them.

    Edukator
    Free Member

    I agree with most of the driver failings you point out in your last two paragraphs, Mike. Some could be countered, but there is resistance to alcotest ignition, hands-free phones, automated safe distance. Arguing with the kids is a tricky one though. Despite all that, humans do quite well and could do a lot better if they weren’t in such a hurry.

    I see resistance coming to cars that will be programmed to go slower than people are used to going. The German resistance to a law to stop them doing 250kmh on the autobahn is nothing compared to the resistance you’ll see to cars programmed to do 130kmh on the autobahn and 25kmh through residential areas. Look at what an insurance company pays out to air-crash victims compared with car-crash victims. In a driverless car, people will sue the manufacturers for aircraft-industry-style sums, so the manufacturers are going to be ultra cautious, if only because they know that speed kills, whoever or whatever is driving.

    I was an initial fan of the driverless idea, but the more I’ve thought about it the more I’ve realised that driverless cars fast enough to satisfy the jimjams will be too much of a liability for the manufacturers and their insurers, so we’ll end up with the worst of both worlds: driverless cars with manual override.


The topic ‘Killer cars stalking our streets…’ is closed to new replies.