• This topic has 88 replies, 48 voices, and was last updated 7 years ago by khani.
Viewing 40 posts - 1 through 40 (of 89 total)
  • Self Driving Car Darwin Award
  • jambalaya
    Free Member

    First fatality in an autonomous tech car. Witnesses say he drove at speed straight under an 18-wheeler trailer, which the driving technology didn’t “see”. One witness says the driver was watching a Harry Potter movie. Darwin Award winner.

    Linky

    Have to say I am not sure this technology is ready for the public roads yet

    footflaps
    Full Member

    Good to see you’re now celebrating deaths, having moved on from jeering over Brexit unemployment….

    Nice one.

    jambalaya
    Free Member

    Blah blah blah

    Idiot with car on autopilot watching a DVD. Could have been your family he smashed into.

    scaredypants
    Full Member

    Witness says driver was watching a Harry Potter movie

    or

    The truck driver, who attempted to cross the carriageway in front of the car (and therefore needs a decent excuse), says the driver was watching a Harry Potter movie

    onehundredthidiot
    Full Member

    A whole load of unusual circumstances, including the trailer’s colour and glint, confused the algorithms.

    Same methods used to confuse missile seeker algorithms. Just as well humans never make mistakes.

    corroded
    Free Member

    You stay classy.

    whatgoesup
    Full Member

    Long term it will be massively safer than real human driving. Not sure what the stats are right now in accidents/fatalities per million miles etc., though I’m sure it’s available.

    A major advantage is that misses such as this can be analysed and the resultant algorithm updates then pushed to all cars, whereas when humans have mistakes/near misses only one individual learns (or doesn’t).

    Oh, and Tesla agree that this system is not fully ready yet – the system is disabled by default on a new car, and the owner has to actively enable it with clear disclaimers that it’s a BETA system only. It’s necessary to keep your hands on the wheel (the car slows to a stop if the driver doesn’t) and the driver is fully responsible.

    I.e. yes, Darwin Award candidate.

    ninfan
    Free Member

    “Carroofius Removeska”

    pdw
    Free Member

    Have to say I am not sure this technology is ready for the public roads yet

    Well even in this early state, it’s still beating the humans. 1 fatality in 130 million miles is still ahead of the human average of 1 per 95 million miles.

    And unlike a similar accident involving a human, it’s very likely that the autopilot will now be fixed so that this can’t occur again.

    Also worth noting that this accident probably wouldn’t have been fatal had it occurred in Europe, where trucks have side underride bars, and possibly wouldn’t have happened at all: it’s possible that the autopilot hit the truck because it could “see” straight under it.

    Watty
    Full Member

    Tannoy:
    FOR YOUR SAFETY AND THE SAFETY OF OTHERS, PLEASE DO NOT LEAVE YOUR PLASTIC BUBBLE.

    Drac
    Full Member

    Well even in this early state, it’s still beating the humans. 1 fatality in 130 million miles is still ahead of the human average of 1 per 95 million miles.

    That means nothing. How many human car drivers are there compared to Teslas?

    pdw
    Free Member

    Why is that meaningless? Fatalities per mile driven seems directly comparable to me. Obviously 130 million miles isn’t a huge sample size for something that occurs on average once every 100 million miles or so, but it’s not a bad start.

    And, of course, we can expect the stats to improve dramatically as the proportion of cars that are self driving increases.

    AdamT
    Full Member

    I’m certain that in the long run the cars will end up being safer than human drivers. Tesla are pioneering in both technology and their approach. What I mean by this is that I don’t remember seeing “beta” safety features being brought out by other, more traditional manufacturers. It’s certainly causing a big shift in the industry (I work on AI for autonomous car tech).

    Del
    Full Member

    Same tech the poor casualty said saved his life a short time before this incident.
    sauce

    hebdencyclist
    Free Member

    Whenever someone quotes “Darwin Awards” in relation to someone’s death, it demonstrates an appalling lack of empathy and taste.

    jambalaya
    Free Member

    The technology didn’t see the truck as it was white and it was a sunny day; it’s not ready. Watching a DVD – well, we can be sure he wasn’t looking where he was going.

    Kamakazie
    Full Member

    Already safer than humans driving.
    Of course it’s ready, and anyone who commutes by bike should be hoping it gets mainstream adoption as soon as possible.

    thisisnotaspoon
    Free Member

    Why is that meaningless? Fatalities per mile driven seems directly comparable to me. Obviously 130 million miles isn’t a huge sample size for something that occurs on average once every 100 million miles or so, but it’s not a bad start.

    With only one crash it’s meaningless though. If a second crash happened tomorrow that wouldn’t make the last 130 million miles less safe; it’d just be an incrementally better data set, but still statistically insignificant.
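
    To put rough numbers on how insignificant, here’s a back-of-the-envelope sketch (my own working, using only the figures quoted in this thread and a standard exact Poisson interval, nothing official) of the uncertainty on a rate estimated from a single fatality:

        # Rough sketch: how uncertain is a fatality rate estimated from one event?
        # Uses the figures quoted in this thread (1 fatality in ~130M Autopilot miles);
        # purely illustrative, not official Tesla or regulator numbers.
        from scipy.stats import chi2

        fatalities = 1
        miles = 130e6

        # Exact (Garwood) 95% confidence interval for a Poisson count
        lower = 0.5 * chi2.ppf(0.025, 2 * fatalities)
        upper = 0.5 * chi2.ppf(0.975, 2 * (fatalities + 1))

        # Express as "one fatality per N million miles"
        print(f"point estimate: 1 per {miles / fatalities / 1e6:.0f}M miles")
        print(f"95% CI: 1 per {miles / upper / 1e6:.0f}M to 1 per {miles / lower / 1e6:.0f}M miles")

    That interval runs from roughly 1 per 23 million miles at worst to 1 per 5 billion at best, which comfortably brackets both human figures quoted in this thread, so the comparison can’t settle anything yet.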

    irc
    Full Member

    1 fatality in 130 million miles is still ahead of the human average of 1 per 95 million miles.

    No it isn’t. UK fatalities in 2013 were 5.6 per billion miles. Or 1 per 178 million miles. So humans are safer.

    https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/448037/road-fatalities-2013-data.pdf
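
    The conversion is just the reciprocal of the quoted rate; as a quick sanity check on the arithmetic (only the figures already given above):

        # Quick check on the 2013 GB figure quoted above; illustrative arithmetic only.
        uk_fatalities_per_billion_miles = 5.6
        uk_miles_per_fatality = 1e9 / uk_fatalities_per_billion_miles
        print(f"1 fatality per {uk_miles_per_fatality / 1e6:.1f} million miles")  # ~178.6, the "1 per 178 million" above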

    cloudnine
    Free Member

    😆 ninfan

    andyl
    Free Member

    No it isn’t. UK fatalities in 2013 were 5.6 per billion miles. Or 1 per 178 million miles. So humans are safer.

    That’s in the UK, and we [unfortunately] need a bigger set of statistics for self-driving cars to get an accurate figure.

    It would be nice to never get a high enough sample of deaths to base the statistics on, but we will sure enough see the miles stack up. The next death might occur in 1,000 miles’ time or in 300 million. Either would skew the stats too much at the moment.

    Sad incident, but I hope it doesn’t set back Tesla and self-driving tech, and that lessons are learnt from it.

    Drac
    Full Member

    With only one crash it’s meaningless though. If a second crash happened tomorrow that wouldn’t make the last 130 million miles less safe; it’d just be an incrementally better data set, but still statistically insignificant.

    Precisely. I do think the technology is advancing fast and hope to see it common in my lifetime, but that figure doesn’t mean anything just now.

    choppersquad
    Free Member

    If the truck was in front of him, how did the truck driver know he was watching a film?

    zokes
    Free Member

    jambalaya – Member
    Blah blah blah

    You know, that’s the most intelligent thing you’ve posted on here in the last few weeks, if ever.

    CharlieMungus
    Free Member

    The tech will get there. Anyone see the AI fighter jet beating the experienced pilot?

    janesy81
    Free Member

    Although it’s no consolation for the poor guy and his family, if it’s anything like aerospace then they’ll learn from this and improve the software – so each accident should in theory improve future safety. And, unlike people, software doesn’t suffer from the complacency of “yeah, but that’ll never happen to me…”
    Don’t think the 1 in 130 million miles should be written off as insignificant – that’s a massive distance covered for a first go. But equally, if he’d had a car full of people that stat would be looking a lot worse.

    chambord
    Free Member

    Do they just use RGB cameras? No time of flight cameras or something similar?

    richmars
    Full Member

    I find it a bit odd that Google are spending years and millions testing their self-drive cars, but Tesla can release something with a fraction of the testing.

    mikewsmith
    Free Member

    I find it a bit odd that Google are spending years and millions testing their self-drive cars, but Tesla can release something with a fraction of the testing.

    It’s all assistance stuff; it’s not a fully self-driving car as such, and you are supposed to actually remain in overall control.

    chambord
    Free Member

    The Tesla isn’t properly self-driving though; it just does motorway driving, including lane changes, as far as I know.

    richmars
    Full Member

    I understand the Tesla isn’t fully self driving, but that hasn’t stopped loads of people doing it like that, just look on YouTube. I think Tesla must bear some responsibility for thinking that people wouldn’t use it like this.

    mikewsmith
    Free Member

    I think Tesla must bear some responsibility for thinking that people wouldn’t use it like this.

    No people need to take responsibility.

    ChannelD
    Full Member

    Apparently, and I find it a little hard to believe but then maybe, Tesla have a ‘red button’ that says “Do not push for autonomous driving”. They then collect the data to improve their data set and algorithms. Easy. The best testing sometimes is to use a human.

    I have momentarily been inside a Tesla and I did not see a ‘button’. Could have been a menu option though.

    I did see a driving mode for Insane. The word insane was a little surprising.

    Lifer
    Free Member

    Insane refers to the acceleration.

    As the article says autonomous driving is disabled by default. You have to make an active choice to enable it.

    allan23
    Free Member

    Blah blah blah

    Idiot with car on autopilot watching a DVD. Could have been your family he smashed into.

    Really? Bit of a rash judgement. Were you there? The Tesla system means you still have to hold the steering wheel.

    From the reports I’ve seen, the accident was the car driving under the trailer from the side. In the UK we had to have guards put on lorry trailers to stop exactly that, as humans, supposedly paying attention, kept doing it.

    The reports that the car systems didn’t “see” the lorry are odd; it’s a mix of sensors including ultrasonic. There have been criticisms by some in the field that the Tesla system has blind spots, unlike the similar Mercedes system.

    Something has gone wrong, but the autopilot system has still logged more miles with fewer accidents than the meat bags who are supposedly in control of vehicles manage. Does that mean people aren’t ready for public roads yet?

    Sadly I know the answer, I know I’d trust an autopilot system configured correctly to give me more room as a cyclist than the idiots on the road at the moment. 🙁

    codybrennan
    Free Member

    If he was watching a movie, then it wasn’t on the Tesla’s entertainment system- it won’t let you do that whilst driving.

    He may have been viewing it on an iPad or something, so it’s (IMO) only partly Tesla’s fault.

    allthepies
    Free Member

    How do the videos showing people not holding the wheel work, then? And in one case, sitting in the back seat with no-one in the front seat 🙂

    singletrackmind
    Full Member

    Really?
    130 million miles?
    How many Teslas do you see?
    Are these real-world miles, or simulated PlayStation miles?
    Zero proof that the 130 million miles have been on full autopilot with no driver taking control to save the day.
    If it really is 130 million miles by an electric car, just how many thousands of Teslas are out there doing mega miles anyway?

    captainsasquatch
    Free Member

    Isn’t this from the same country where the fella set cruise control on his Winnebago, then got up to make himself a drink, just before it crashed too?

    thisisnotaspoon
    Free Member

    130 million miles is nothing if you think about driving as a job. It’s (at 60 mph) 1,000 cars driven as a job (8-hour days) for a year.
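
    As a rough check on those sums (the 60 mph, 8-hour day and ~270 working days are just my own round numbers):

        # Back-of-the-envelope check on the "1,000 cars driven as a job for a year" claim.
        # The 60 mph / 8-hour day / ~270 working days figures are illustrative round numbers.
        speed_mph = 60
        hours_per_day = 8
        working_days = 270
        cars = 1000

        total_miles = speed_mph * hours_per_day * working_days * cars
        print(f"{total_miles / 1e6:.0f} million miles")  # ~130 million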

    And I’m sure they have ‘proof’; I’d be surprised if some of it isn’t in scientific papers if you look. Or is this one of those ‘there’s no proof’ statements where people say something and then expect everyone else to go off and do the digging, as if somehow the onus is on them to prove your negative?


The topic ‘Self Driving Car Darwin Award’ is closed to new replies.