• This topic has 88 replies, 48 voices, and was last updated 7 years ago by khani.
Viewing 40 posts - 41 through 80 (of 89 total)
  • Self Driving Car Darwin Award
  • poly
    Free Member

    Zero proof that the 130 million miles have been on full autopilot with no driver taking control to save the day.

    I assume that as Tesla can do over the air updates, and call this active beta testing that they are getting real feedback from the car so know (i) how many miles have been done on auto and (ii) if the driver ever takes emergency evasive action.

    If it really is 130 million miles by an electric car , just how many thousands of Teslas are out there doing mega miles anyway?

    Well they sold roughly 50k vehicles last year, and are planning to significantly increase that this year. On top of those sold in 2014 there must be c. 100k cars around. So each one has driven an average of 1,300 autonomous miles – seems highly plausible!
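
    A rough back-of-envelope check in Python (the 100k fleet size is the assumption above, not an official Tesla figure):

    # Sanity check of the "130 million autopilot miles" claim.
    # fleet_size is the assumption from the post above, not an official number.
    fleet_size = 100_000        # Teslas on the road with Autopilot hardware (assumed)
    autopilot_miles = 130e6     # Tesla's claimed cumulative Autopilot miles

    miles_per_car = autopilot_miles / fleet_size
    print(f"Average autopilot miles per car: {miles_per_car:,.0f}")   # ~1,300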

    sharkbait
    Free Member

    I assume that as Tesla can do over the air updates, and call this active beta testing that they are getting real feedback from the car so know (i) how many miles have been done on auto and (ii) if the driver ever takes emergency evasive action.

    Yes they def. do OTA updates and I’m sure the cars will be feeding data back to Tesla.
    Out of interest I wonder how a Google car would have dealt with that situation?

    The reports that the car systems didn’t “see” the lorry are odd

    Indeed. When driving straight ahead the system may not be that different to the adaptive cruise control that’s available on a number of cars now.
    I’d imagine that there are people working on the data/a fix right now.

    br
    Free Member

    Also worth noting that this accident probably wouldn’t have been fatal had it occurred in Europe where trucks have side underride bars, and possibly wouldn’t have happened at all: it’s possible that the reason the autopilot hit the truck is because it could “see” straight under it.

    This is why I’ve never felt safe driving in the States, and a good example of commerce overriding public safety.

    Does also seem that statistically the Tesla approach is probably safer than the average driver, and also the stats quoted only cover “reported” accidents. I’m pretty sure every Tesla accident is reported; how many normal accidents don’t get reported?

    jambalaya
    Free Member

    Tesla came out very quickly to explain the autopilot error (didn’t see white truck against bright sky) so I assume they had an immediate download perhaps even OTA.

    As I said I am very sceptical.

    FWIW at a Volvo Ocean Race stopover a while ago they had the pedestrian auto-stop feature on display, and a certain individual decided to test this once the public had gone. He drove the car right through the pedestrian and into a wall. The feature didn’t work.

    irc
    Full Member

    Does also seem that statistical the Tesla approach is probably safer than the average driver,

    There isn’t the evidence for that. Firstly as I posted upthread UK drivers are statistically safer than Tesla drivers. Secondly 130 million miles and 1 death is too small a sample. Once Tesla have a couple of billion auto miles logged we’ll start getting an idea. In the UK that would be around 10 deaths.
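
    To put a number on how little one event tells you, here’s a quick sketch (my own illustration, using the figures in this thread) of the exact Poisson 95% confidence interval for 1 fatality in 130 million miles:

    # How tightly does "1 death in 130 million miles" pin down the fatality rate?
    # Exact Poisson 95% CI via the chi-squared relationship (illustration only).
    from scipy.stats import chi2

    k = 1                 # observed fatalities
    miles = 130e6         # exposure in miles

    lower = chi2.ppf(0.025, 2 * k) / 2          # ~0.025 expected events
    upper = chi2.ppf(0.975, 2 * (k + 1)) / 2    # ~5.57 expected events

    print(f"Plausible rate: 1 death per {miles / upper / 1e6:.0f}M "
          f"to 1 death per {miles / lower / 1e6:.0f}M miles")
    # Roughly 1 per ~23M miles up to 1 per ~5,100M miles, which is far too wide
    # to compare against the ~1 per 95M (US) or ~1 per 200M (UK) human baselines.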

    In any case Tesla don’t claim their car can drive itself. Elon Musk states “the responsibility remains with the driver” (at 1:22 in https://www.youtube.com/watch?v=60-b09XsyqU)

    So Tesla think their system can help the driver to be safer, but the stats don’t show that so far. The problem with the current state of Tesla automation is that it isn’t fully auto, but the more it takes over the more drivers switch off, and once every X million miles that results in a fatality.

    phiiiiil
    Full Member

    The feature didn’t work

    Are you talking about this? In that case they (a) misunderstood what the various safety systems were, and (b) didn’t actually have the safety system they were apparently trying to demonstrate installed; it’s an optional extra, not standard equipment. I don’t think that can be blamed on the car.

    poly
    Free Member

    irc – I agree with most of what you say, but to compare Tesla miles to “UK driver miles” is bad analysis. You can’t say “UK drivers are safer”, only possibly “driving in the UK is less likely to result in death than in the US”. They typically drive different vehicles, their road layouts are different, their driver training and law enforcement are different, and potentially even their emergency response is different. It’s just as bad to compare UK and US stats and attribute the difference to the person or system controlling the vehicle as it is to say “1 death in 130M miles to date means the death rate is 1:130M miles”.

    And of course that works the other way round too – just because they prove technology on wide straight roads in sunny California doesn’t mean you can assume the same safety record on the narrow twisty back lanes of Somerset on a misty day.

    irc
    Full Member

    irc – I agree with most of what you say, but to compare Tesla miles to “UK driver miles” is bad analysis.

    Point taken. But the gist of my argument stands. One USA fatal per 95 million miles or thereabouts for human drivers is an accurate stat. 1 fatal for 130 million miles for Tesla isn’t enough data. We can’t say whether the Tesla system as it stands is safer. The advantages of the driver assist features may be outweighed as drivers switch off. We don’t know.

    slowoldman
    Full Member

    oh, and Tesla agree that this system is not fully ready yet

    So how the hell is it allowed on the public roads? Has it passed a driving test?

    mikewsmith
    Free Member

    So how the hell is it allowed on the public roads? Has it passed a driving test?

    It’s there to help and assist the driver (the single most dangerous component in a car); at no point should you be using it as an automatic driving system. The failure yet again is the driver.

    irc
    Full Member

    at no point should you be using it as an automatic driving system. The failure yet again is the driver.

    Or alternatively it’s a Tesla failure for not having some system to ensure the driver is paying attention. A modern version of the dead man’s handle on trains. What should it be? Who knows. Iris scanner? Blink sensor? But a company that semi-automates and then blames the driver for losing concentration is just shifting blame.
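
    Purely as a hypothetical sketch of what a “modern dead man’s handle” could look like (invented for illustration, and saying nothing about how Tesla’s real system behaves): warn, then disengage, if the driver provides no input for too long.

    # Hypothetical driver-attention watchdog, in the spirit of a train's dead
    # man's handle. Thresholds and behaviour are invented for illustration only.
    import time

    HANDS_OFF_WARNING_S = 15      # nag the driver after this long with no input
    HANDS_OFF_DISENGAGE_S = 30    # slow down / demand takeover after this long

    class AttentionWatchdog:
        def __init__(self):
            self.last_input = time.monotonic()

        def driver_input(self):
            """Call whenever wheel torque is felt (or a blink/iris check passes)."""
            self.last_input = time.monotonic()

        def check(self):
            idle = time.monotonic() - self.last_input
            if idle > HANDS_OFF_DISENGAGE_S:
                return "disengage"   # e.g. slow gently, hazards on, force takeover
            if idle > HANDS_OFF_WARNING_S:
                return "warn"        # chime and flash the instrument cluster
            return "ok"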

    The other issue is driver de-skilling. First generation drivers have years or decades of experience. Drivers in an auto car world won’t have that. So full automation – not an issue. Semi auto – the drivers will be worse when they need to take over.

    Manual control is a highly skilled activity, and skills need to be practised continuously in order to maintain them. Yet an automatic control system that fails only rarely denies operators the opportunity for practising these basic control skills. One of the consequences of automation, therefore, is that operators become de-skilled in precisely those activities that justify their marginalised existence. But when manual takeover is necessary something has usually gone wrong; this means that operators need to be more rather than less skilled in order to cope with these atypical conditions. Duncan (1987, p. 266) makes the same point: “The more reliable the plant, the less opportunity there will be for the operator to practise direct intervention, and the more difficult will be the demands of the remaining tasks requiring operator intervention.”

    An experienced driver today is probably competent enough to monitor a self-driving car but what about a driver twenty years from today who will likely not have spent any meaningful amount of time driving a manual car?

    Deskilling and The Cul-de-Sac of Near Perfect Automation

    mikewsmith
    Free Member

    https://www.teslamotors.com/models
    From my reading nothing says you don’t actually need to pay attention…

    irc
    Full Member

    https://www.teslamotors.com/models
    From my reading nothing says you don’t actually need to pay attention…

    Operating manual? Human nature? Which will it be?

    scuzz
    Free Member

    An experienced driver today is probably competent enough to monitor a self-driving car but what about a driver twenty years from today who will likely not have spent any meaningful amount of time driving a manual car?

    An interesting point, but with technology reducing how often a human needs to intervene, the risk from de-skilled drivers should be statistically insignificant compared to the current reality.

    Note also the weasel words “experienced” and “highly skilled” – neither of which is a prerequisite for driving a car.

    skidsareforkids
    Free Member

    At the opposite end of the spectrum, when Ford first introduced their adaptive cruise system in the new F150, the system would slam on the anchors when met with dark shadows under overpasses on bright days… Caused quite a few wrecks there too! We are on our 4th car that has adaptive cruise (2nd with lane-keeping assist steering) and whilst it does work well, it makes me pretty uncomfortable, and I would never pay any less attention and use it as a crutch! My mother-in-law has fallen asleep at the wheel of her Mercedes ML and the system kept her on the road and woke her up 😯

    Orange-Crush
    Free Member

    I’ve used office computer systems for thirty years. Based on that experience there’s no way I would trust my (or anyone else’s) life so directly to a computerised system.

    I know we are forced indirectly to risk our lives (computerised control of hazardous processes etc) but just because “we” can do something by IT does not mean that we should.

    mikewsmith
    Free Member

    I know we are forced indirectly to risk our lives (computerised control of hazardous processes etc) but just because “we” can do something by IT does not mean that we should.

    Missing the elephant in the room here: human beings manage to kill thousands of people through poor, bad, inattentive and under-skilled driving. The most dangerous component in the car is the driver.

    wobbliscott
    Free Member

    It’s a classic case of launching an immature system, full stop. Tesla hiding behind “the driver is always in control and should monitor the system at all times” is a cop-out. The cause of most accidents is drivers not being alert, and if they’re not alert when actually driving then they’re going to be even less alert when simply overseeing an automated system. If the driver is the weak link, the system has to eliminate the weak link or you’re not addressing the danger. The sort of half-way house driver-assist systems that Tesla has introduced are just a stepping stone to a fully automated driverless system, and therefore effectively develop the system in the real world.

    Unfortunately there is no global framework for allowing new car technology to be proved, certified, released and introduced in a safe way, like there is in the aviation industry where global aviation authorities regulate and oversee everything. That hasn’t been an issue before now because we’ve been talking about new tech such as electric windows and sat nav, i.e. technology whose failure will not result in accidents. This is something different and requires a more thorough approach than driving around for a few million miles and declaring “that’ll do”. Where are the thorough global standards that the system has to achieve, other than a testing to-do list that a few engineers within Tesla came up with? Is there testing to ensure the system will “fail in a safe way” when certain critical components within the system fail?

    hora
    Free Member

    Jesus, RIP. It could literally be a combination of the driver part-watching, keeping an eye on the road and using the driving feature. That combined with a tragic traffic move = a sad fatality. I just love news media painting a bloke as 100% irresponsible. You can use a feature for a while and start to trust it, you know.

    On a road with impulsive and sudden lane changes and chain-reaction moves made with no rational thought, a computer cannot beat a driver IMO.

    Last week I witnessed a truck driver monster and almost crush a car driver over a small slight; it was ridiculous. Would a computer spot and respond to that, or throw a spanner/fault?

    I just wish the media would wait for 100% of the facts before slandering a dead man, but then they wouldn’t say “6 months ago a man was killed and here is our follow-up story with all the facts now”.

    richmars
    Full Member

    As wobbliscott says.
    These systems should undergo the same certification that a new passenger plane does, which is a bit more than flying it for a few thousand hours and saying that’ll do.
    Also, people seem to be too trusting of technology. The GPS is always right. There’ll always be a mobile signal. The self drive works and I can read a book on the way to work.

    bamboo
    Free Member

    To those saying that they wouldn’t trust their life to an automated system on the basis of using a Windows PC, or that new tech up to now isn’t safety critical (sat navs etc): how much control do you think you have driving a modern car with an automatic gearbox? The engine is fly-by-wire and the shift lever has no mechanical link to the transmission for selection of neutral.

    There are standards that, if not followed, will leave manufacturers liable. Tesla are in a very grey area by referring to this functionality as beta; in my opinion it should not be present on a production car if it is genuinely beta software.

    richmars
    Full Member

    I agree, the grey area seems to be getting bigger, and no one seems to be regulating it. Engines being drive-by-wire is, I think, just about OK, but combine that with self-parking systems (which must therefore have some degree of steer-by-wire and brake-by-wire) and I’m having to place a lot of trust in the hands of the hardware and software engineers. I’m not sure I’m ready for that yet without someone checking what they’re doing. (Or I’m just getting old.)

    v8ninety
    Full Member

    how much control do you think you have driving a modern car with an automatic gearbox? The engine is fly by wire and the shift lever has no mechanical link to the transmission for selection of neutral.

    Enough to know that I need to pay attention and not watch Frozen on the DVD player, I guess. The throttle may be fly-by-wire but the driver is still putting in command inputs; the gearbox may be computer controlled but it still responds to control inputs. Steering, however, has to have a physical link between the steering wheel and the front wheels so that if (when) hydraulic and electronic systems fail the driver can still steer. Same with brakes. It’s not to do with the gadgetry under the skin of the car, it’s about what makes the control decisions. Even with adaptive cruise control and lane assist, that responsibility lies with the driver. Apparently it is the same for the Tesla, but people are lazy…

    mikewsmith
    Free Member

    Even with adaptive cruise control and lane assist, that responsibility lies with the driver. Apparently it is the same for the Tesla, but, people are lazy…

    +100

    In the end removing the driver is the safest thing; in the meantime, as much assistance as possible to correct the mistakes of the driver is good. But a driver who ignores the road is an idiot and is responsible for what happens.

    bamboo
    Free Member

    The inputs to the engine and gearbox are purely electrical though. The software decides what to do with those inputs; there is no mechanical override other than, obviously, the brakes – though in hybrid and electric cars even those are becoming increasingly computer controlled due to energy regen.

    The point is that we already have a high degree of automation which, if not properly implemented and standards followed, could be dangerous. Tesla’s autopilot system implies that the driver doesn’t need to take any notice (even if that isn’t the case, that’s how it will be interpreted), and this is a step too far.

    trail_rat
    Free Member

    My issue is that it either has to be fully automated or not automated at all.

    Once you take away the steering and the throttle and the brake, the driver’s going to get bored, become inattentive and not pay attention.

    Dead man’s switch like trains?

    I do think it is coming and it’s got potential to be a great thing, but there’s an ethical dilemma in such cases: so I’m in my automated car, which I own, it’s driving automatically and it kills someone.

    Whose responsibility? I reckon I know the answer but there will be a dispute, I’m sure.

    maccruiskeen
    Full Member

    Whenever someone quotes “Darwin Awards” in relation to someone’s death, it demonstrates an appalling lack of empathy and taste.

    Perhaps we can nominate the OP to explain the concept of ‘Darwinism’ to the victim’s children.

    v8ninety
    Full Member

    The inputs to the engine and gearbox are purely electrical though. The software decides what to do with those inputs, there is no mechanical override other than obviously the brakes

    The key thing is who (or what) is generating those inputs. Yes a computer will decide to select a different gear ratio, and a different fuel mixture, but the computer is responding to a driver request. Even with cruise control and lane assist, the driver has requested to maintain a certain speed and course.

    It’s about who is in charge, and who is responsible.

    bamboo
    Free Member

    V8ninety – I see your point about responding to driver inputs: if you press the accel pedal halfway, then you might reasonably expect the software to implement this for you. This should be the case, but don’t be under any illusion that you have direct control; a software bug could prevent this from happening. Fortunately there are standards and procedures followed to mitigate this risk.

    In this example though the driver makes the conscious decision how far to press the accelerator pedal. Same with steer-by-wire and brake-by-wire systems: they should respond to what the driver is asking for. The problem that I see with autopilot is that the computer becomes the driver and makes the combined decision on how far to press the accel pedal, how much to steer and how much to brake. A huge number of subconscious decisions and observations are made during driving which I don’t believe a computer is able to replicate yet, or maybe ever.
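
    A toy illustration of that difference (everything here is made up, not any manufacturer’s actual code): in a drive-by-wire car the software only translates the driver’s pedal position into a torque request, whereas an autopilot generates the request itself.

    # Toy drive-by-wire pedal map: the driver still decides how far to press,
    # software just translates that into a torque request. Numbers are invented.
    def pedal_to_torque_request(pedal_fraction: float, max_torque_nm: float = 400.0) -> float:
        """Map pedal travel (0..1) onto a torque request with a progressive curve."""
        pedal_fraction = min(max(pedal_fraction, 0.0), 1.0)
        return max_torque_nm * pedal_fraction ** 1.5

    # With an autopilot nobody presses the pedal at all; a controller decides:
    def autopilot_torque_request(target_speed_mps: float, current_speed_mps: float) -> float:
        """Crude proportional speed controller standing in for 'the computer as driver'."""
        error = target_speed_mps - current_speed_mps
        return min(max(100.0 * error, -400.0), 400.0)   # clamp to +/- 400 Nm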

    phiiiiil
    Full Member

    Automotive software isn’t just hacked together in someone’s basement like a lot of everyday software is, it is a highly regulated environment that goes to great lengths to demonstrate correctness at every stage of the process. The more complex or safety critical a function, the more nauseating the certification process, so I have no doubt that a huge amount of effort has been put into making the Tesla system as safe as they can make it, and that will continue into the future as issues like this surface.

    While only “beta” this will still be far better designed and tested than 99% of software out there will be. The various agencies such as the US NHTSA wouldn’t let them sell it if it wasn’t.

    bamboo
    Free Member

    Too late – I agree.

    v8ninety
    Full Member

    Perhaps we can nominate the OP to explain the concept of ‘Darwinism’ to the victim’s children.

    Technically, if anyone is explaining Darwinism to the deceased’s children, then it doesn’t actually apply… Jus’ sayin

    wobbliscott
    Free Member

    The software required to drive a gearbox is many orders of magnitude less complicated than that required for an autopilot system. You can predict quite simply the number of different possibilities and situations that an auto gearbox has to deal with – maybe 20 or 30 different scenarios. But the number of potential situations that an autopilot system has to be able to deal with runs into the millions – almost infinite – so many that it is impossible to hard-code into software, so you’re looking at incorporating elements of artificial intelligence. The two are not even comparable.
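
    To illustrate the scale difference: an auto gearbox’s decision can be caricatured as a small lookup over speed and throttle (all thresholds below are made up), whereas an autopilot has to map camera, radar and other drivers’ behaviour – an effectively unbounded scenario space – onto steering, braking and throttle.

    # Crude caricature of an automatic gearbox shift map: a handful of
    # enumerable cases. All thresholds are invented for illustration.
    def choose_gear(speed_kph: float, throttle: float) -> int:
        shift_points = [20, 40, 65, 95, 130]          # upshift speeds for gears 1..5
        if throttle > 0.8:                            # kick-down: hold gears longer
            shift_points = [s * 1.3 for s in shift_points]
        for gear, limit in enumerate(shift_points, start=1):
            if speed_kph < limit:
                return gear
        return 6
    # An autopilot has no equivalent finite table, which is why it leans on
    # learned models rather than hand-written rules.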

    bamboo
    Free Member

    I’m not saying that they are comparable, but what I am saying is that we already have a large amount of automation in cars at this point in time. People on this thread are suggesting that, based on sat navs and Windows computers, they wouldn’t trust automated systems, but the truth is that they already do. A software bug in engine software, gearbox software, etc. could put people in danger, but we do have standards that have to be adhered to, so that this risk is mitigated.

    You are right, the decisions that a gearbox has to make are far fewer than an autopilot’s, but incorrect gearbox software could still result in tragic consequences.

    v8ninety
    Full Member

    if you press the accel pedal halfway, then you might reasonably expect the software to implement this for you…but don’t be under an illusion that you have direct control, a software bug could prevent this from happening

    By the same token, twenty years ago:
    If you press the pedal half way you might reasonably expect the carburettor to implement this for you, but don’t be under the illusion that you have direct control. The fuel air ratio has been carefully set by some very clever boffins at the factory, and the complex and balanced decision making of how much fuel vs air is allowed into the inlet manifold is made by some very clever self regulating mechanisms that take into account engine temp, load, revs, as well as a variety of other factors. Of course, if a little bug got stuck in a jet, it could prevent this from happening…

    Since machines have become increasingly complex, we have had less and less ‘direct’ control. The paradigm shift that is going on now, though, is that we are actually handing fundamental control to the machines. As a driver (responsible for my own errors) I’m not sure I’m ready for that yet.

    incorrect gearbox software could still result in tragic consequences.

    Not convinced of this, to be honest. Expensive, yes, but under normal driving conditions a gearbox would have to REALLY **** up to cause a crash. (Edit; and a pure mechanical box could **** up in just the same way, so electronics are kinda irrelevant in that example)

    phiiiiil
    Full Member

    The Tesla software will be more complicated, but all of it will still have been through the same extremely rigorous certification process; I couldn’t even begin to imagine how time consuming and expensive that must have been, and I’m a software engineer who has worked on much, much simpler safety critical systems in the past. It’s a phenomenal amount of work.

    I bet you’re still more likely to get killed by the failure of some really boring bit of hardware like a bonnet catch or something than any software system.

    hora
    Free Member

    phiiiil, which would make it even more frustrating for the engineers having their work shot down by a journalist who half-bakes stories for a living.

    phiiiiil
    Full Member

    The difficulty in the step up to fully autonomous control is actually why I prefer Tesla’s approach to Google’s; we are generally better at improving what we have in tiny steps than at coming up with a finished product in one go.

    Both are producing extremely valuable results so I’m still glad people are approaching this from many angles, but the incremental approach makes it less of a shock.

    phiiiiil
    Full Member

    phiiiil, which would make it even more frustrating for the engineers having their work shot down by a journalist who half-bakes stories for a living.

    Absolutely. The comparison between control system software and desktop or phone software makes things difficult; the gulf between the two is massive.

    When some VW bigwig a while ago blamed their test cheating on “a couple of rogue software engineers” he was talking absolute bollocks; there is no way in hell you could ever sneak anything in to that kind of software without armies of people knowing what you’re doing.

    poly
    Free Member

    Since machines have become increasingly complex, we have had less and less ‘direct’ control. The paradigm shift that is going on now, though, is that we are actually handing fundamental control to the machines. As a driver (responsible for my own errors) I’m not sure I’m ready for that yet.

    And yet I’d feel a lot safer if you handed control of everyone else’s vehicles to the ‘computer’ rather than let average drivers be in charge.

    I’ve often said that if you invented the car today you’d never be allowed to introduce it onto the roads. The big challenge in autonomous vehicles isn’t driving the vehicle, it’s guessing what the non-autonomous vehicles (and perhaps to a lesser extent animals, peds and bikes) might do. Interestingly, I believe my grandchildren will never actually drive a car (if I have any they won’t be driving for about 25 years at least), and to make that big leap happen it might be necessary for us to do something radical like banning normal cars from some sorts of road, or insisting they all transmit certain data to help make them predictable to their near neighbours. My grandkids will look back at the improved safety statistics and wonder how it took us so long to phase in the technology.

Viewing 40 posts - 41 through 80 (of 89 total)

The topic ‘Self Driving Car Darwin Award’ is closed to new replies.