First fatality in an autonomous tech car. Witness says he drove at speed straight under an 18-wheeler trailer which the driving technology didn't "see". Witness says driver was watching a Harry Potter movie. Darwin Award winner.
[url= https://www.theguardian.com/technology/2016/jul/01/tesla-driver-killed-autopilot-self-driving-car-harry-potter ]Linky[/url]
Have to say I am not sure this technology is ready for the public roads yet
Good to see you're now celebrating deaths, having moved on from jeering over Brexit unemployment....
Nice one.
Blah blah blah
Idiot with car on autopilot watching a DVD. Could have been your family he smashed into.
Witness says driver was watching a Harry Potter movie
or
[s]Witness[/s] [b]Truck driver who attempted to cross the carriageway in front of the car (and therefore needs a decent excuse)[/b] says driver was watching a Harry Potter movie
Whole load of unusual circumstances including colour and glint confused the algorithms.
Same methods used to confuse missile seeker algorithms. Just as well humans never make mistakes.
You stay classy.
Long term: it will be massively safer than real human driving. Not sure what the stats are right now in accidents/fatalities per million miles etc, though I'm sure it's available.
A major advantage is that misses such as this can be analysed and the resultant algorithm updates then pushed to all cars, whereas humans have mistakes/near misses etc and only one individual learns (or doesn't).
Oh, and Tesla agree that this system is not fully ready yet - the system is disabled by default on a new car, the owner has to actively enable it with clear disclaimers that it's a BETA system only. It's necessary to keep your hands on the wheel (the car slows to a stop if the driver doesn't) and the driver is fully responsible.
I.e. yes Darwin award candidate.
[i]"Carroofius Removeska"[/i]
Have to say I am not sure this technology is ready for the public roads yet
Well even in this early state, it's still beating the humans. 1 fatality in 130 million miles is still ahead of the human average of 1 per 95 million miles.
And unlike a similar accident involving a human, it's very likely that the autopilot will now be fixed so that this can't occur again.
Also worth noting that this accident probably wouldn't have been fatal had it occurred in Europe where trucks have side underride bars, and possibly wouldn't have happened at all: it's possible that the reason the autopilot hit the truck is because it could "see" straight under it.
Tannoy:
FOR YOUR SAFETY AND THE SAFETY OF OTHERS, PLEASE DO NOT LEAVE YOUR PLASTIC BUBBLE.
Well even in this early state, it's still beating the humans. 1 fatality in 130 million miles is still ahead of the human average of 1 per 95 million miles.
That means nothing; how many human car drivers are there compared to Tesla drivers?
Why is that meaningless? Fatalities per miles driven seems directly comparable to me. Obviously 130 million miles isn't a huge sample size for something that occurs on average once every 100million miles or so but, it's not a bad start.
And, of course, we can expect the stats to improve dramatically as the proportion of cars that are self driving increases.
I'm certain that in the long run, the cars will end up being safer than human drivers. Tesla are pioneering in both technology and their approach. What I mean by this is that I don't remember seeing "beta" safety features being brought out by other, more traditional manufacturers. It's certainly causing a big shift in the industry (I work on AI for autonomous car tech)
same tech the poor casualty said saved his life a short time before this incident.
[url= http://www.theregister.co.uk/2016/06/30/tesla_autopilot_crash_leaves_motorist_dead/ ]sauce[/url]
Whenever someone quotes "Darwin Awards" in relation to someone's death, it demonstrates an appalling lack of empathy and taste.
The technology didn't see the truck as it was white and it was a sunny day; it's not ready. Watching a DVD - well, we can be sure he wasn't looking where he was going
Already safer than humans driving.
Of course it's ready and anyone who commutes by bike should be hoping it gets mainstream adoption as soon as possible.
Why is that meaningless? Fatalities per miles driven seems directly comparable to me. Obviously 130 million miles isn't a huge sample size for something that occurs on average once every 100million miles or so but, it's not a bad start.
With only one crash it's meaningless though. If a second crash happened tomorrow that wouldn't make the last 130 million miles less safe, it'd just be an incrementally better data set, but still statistically insignificant.
1 fatality in 130 million miles is still ahead of the human average of 1 per 95 million miles.
No it isn't. UK fatalities in 2013 were 5.6 per billion miles. Or 1 per 178 million miles. So humans are safer.
😆 ninfan
No it isn't. UK fatalities in 2013 were 5.6 per billion miles. Or 1 per 178 million miles. So humans are safer.
That's in the UK and we [unfortunately] need a bigger set of statistics for self driving cars to get the accurate figure.
It would be nice never to see a death toll high enough to base the stats on, but we will sure enough see the miles stack up. The next death might occur in 1000 miles' time or could be 300 million miles away. Either would skew the stats too much at the moment.
Sad incident but I hope it doesn't set back Tesla and self-driving tech, and lessons are learnt from it.
With only one crash it's meaningless though. If a second crash happened tomorrow that wouldn't make the last 130 million miles less safe, it'd just be an incrementally better data set, but still statistically insignificant.
Precisely, I do think the technology is advancing fast and hope to see it common in my lifetime but that figure doesn't mean anything just now.
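To put a rough number on how little one data point tells you, here's a quick sketch (using only the figures quoted in this thread - 130 million Autopilot miles and 1 fatality - so treat it as a back-of-envelope illustration, not an official analysis):
[code]
# Exact (Garwood) 95% confidence interval for a Poisson count of 1 event,
# applied to the "1 fatality in 130 million Autopilot miles" figure above.
from scipy.stats import chi2

miles = 130e6   # Autopilot miles quoted upthread
deaths = 1      # fatalities observed so far

lower = chi2.ppf(0.025, 2 * deaths) / 2         # lower bound on expected deaths
upper = chi2.ppf(0.975, 2 * (deaths + 1)) / 2   # upper bound on expected deaths

print(f"95% CI: 1 death per {miles / upper:.3g} to {miles / lower:.3g} miles")
# Roughly: 1 death per 2.3e+07 to 5.1e+09 miles - i.e. the data so far are
# consistent with anything from much worse than the human averages quoted
# here (1 per 95M US, 1 per 178M UK) to much better. Hence "statistically
# insignificant" for now.
[/code]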
If the truck was in front of him, how did the truck driver know he was watching a film?
jambalaya - Member
Blah blah blah
You know, that's the most intelligent thing you've posted on here in the last few weeks, if ever.
The tech will get there. Anyone see the AI fighter jet beating the experienced pilot?
Although it's no consolation for the poor guy and his family, if it's anything like aerospace then they'll learn from this and improve the software - so each accident should in theory improve future safety. And, unlike people, software doesn't suffer from the complacency of "yeah, but that'll never happen to me..."
Don't think the 1 in 130million miles should be written off as insignificant - that's a massive distance covered for a first go. But equally, if he'd had a car full of people that stat would be looking a lot worse
Do they just use RGB cameras? No time of flight cameras or something similar?
I find it a bit odd that google are spending years and millions testing their self-drive cars, but Tesla can release something with a fraction of the testing.
I find it a bit odd that google are spending years and millions testing their self-drive cars, but Tesla can release something with a fraction of the testing.
It's all assistance stuff, it's not a fully self-driving car as such; you are supposed to actually remain in overall control.
The Tesla isn't properly self driving though, it just does motorway driving including lane changes as far as I know..
I understand the Tesla isn't fully self driving, but that hasn't stopped loads of people doing it like that, just look on YouTube. I think Tesla must bear some responsibility for thinking that people wouldn't use it like this.
I think Tesla must bear some responsibility for thinking that people wouldn't use it like this.
No people need to take responsibility.
Apparently (I find it a little hard to believe, but then maybe) Tesla have a 'red button' that says "Do not push for autonomous driving". They then collect the data to improve their data set and algorithms. Easy. The best testing sometimes is to use a human.
I have momentarily been inside a Tesla and I did not see a 'button'. Could have been a menu option though.
I did see a driving mode for Insane. The word insane was a little surprising.
Insane refers to the acceleration.
As the article says autonomous driving is disabled by default. You have to make an active choice to enable it.
Blah blah blah
Idiot with car on autopilot watching a DVD. Could have been your family he smashed into.
Really, bit of a rash judgement. Were you there? The Tesla system means you still have to hold the steering wheel.
The accident was the car driving under the trailer from the side from the reports I've seen. In the UK we had to have guards put on lorry trailers to stop that as humans, supposedly paying attention, kept doing it.
The reports that the car systems didn't "see" the lorry are odd, it's a mix of sensors including ultra-sonic. There have been criticisms by some in the field that the Tesla system has blind spots unlike the similar Mercedes system.
Something has gone wrong, but the autopilot system has still logged more miles with less accidents than the meat bags who are supposedly in control of vehicles manage. Does that mean people aren't ready for public roads yet?
Sadly I know the answer, I know I'd trust an autopilot system configured correctly to give me more room as a cyclist than the idiots on the road at the moment. 🙁
If he [i]was[/i] watching a movie, then it wasn't on the Tesla's entertainment system- it won't let you do that whilst driving.
He may have been viewing it on an iPad or something, so it's (IMO) only partly Tesla's fault.
[quote=allan23]The Tesla system means you still have to hold the steering wheel.[/quote]
How do the videos work then showing people not holding the wheel ? And in one case, sitting in the back seat with no-one in the front seat 🙂
Really
130million miles?
How many Teslas do you see ?
Are these real world miles, or simulated PlayStation miles?
Zero proof that the 130 million miles have been on full autopilot with no driver taking control to save the day.
If it really is 130 million miles by an electric car, just how many thousands of Teslas are out there doing mega miles anyway?
Isn't this from the same country where the fella set cruise control on his Winnebago, then got up to make himself a drink, just before it crashed too?
130million is nothing if you think about driving as a job. It's (at 60mph) 1000 cars driven as a job (8hr day) for a year.
And I'm sure they have 'proof', I'd be surprised if some of it isn't in scientific papers if you look. Or is this one of those 'there's no proof' statements where people say something then expect everyone else to go off and do the digging as somehow the onus is on them to prove your negative?
Zero proof that the 130 million miles have been on full autopilot with no driver taking control to save the day.
I assume that as Tesla can do over the air updates, and call this active beta testing that they are getting real feedback from the car so know (i) how many miles have been done on auto and (ii) if the driver ever takes emergency evasive action.
If it really is 130 million miles by an electric car, just how many thousands of Teslas are out there doing mega miles anyway?
Well they sold roughly 50k vehicles last year, and are planning to significantly increase that this year. On top of those sold in 2014 there must be c. 100k cars around. So each one has driven an average of 1300 autonomous miles - seems highly plausible!
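For what it's worth, both back-of-envelope sums in the last couple of posts roughly check out (all figures are the rough guesses above, not official Tesla numbers):
[code]
# Plausibility checks on the "130 million Autopilot miles" claim.
AUTOPILOT_MILES = 130e6

# ~1000 cars driven as a job: 8 h/day at 60 mph over roughly a working year
full_time_cars = 1000
miles_per_car_year = 8 * 60 * 270          # ~130,000 miles per car
print(full_time_cars * miles_per_car_year)  # ~1.3e8, i.e. about 130 million

# Or ~100,000 cars on the road, each doing a modest amount on Autopilot
fleet_size = 100_000
print(AUTOPILOT_MILES / fleet_size)         # 1300 autonomous miles per car
[/code]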
I assume that as Tesla can do over the air updates, and call this active beta testing that they are getting real feedback from the car so know (i) how many miles have been done on auto and (ii) if the driver ever takes emergency evasive action.
Yes they def. do OTA updates and I'm sure the cars will be feeding data back to Tesla.
Out of interest I wonder how a Google car would have dealt with that situation?
The reports that the car systems didn't "see" the lorry are odd
Indeed. When driving straight ahead the system may not be that different to the adaptive cruise control that's available on a number of cars now.
I'd imagine that there are people working on the data/a fix right now.
[i]Also worth noting that this accident probably wouldn't have been fatal had it occurred in Europe where trucks have side underride bars, and possibly wouldn't have happened at all: it's possible that the reason the autopilot hit the truck is because it could "see" straight under it.[/i]
This is why I've never felt safe driving in the States, and a good example of commerce overriding public safety.
Does also seem that statistically the Tesla approach is probably safer than the average driver, and also the stats quoted only cover "reported" accidents. I'm pretty sure every Tesla accident is reported; how many normal accidents don't get reported?
Tesla came out very quickly to explain the autopilot error (didn't see white truck against bright sky) so I assume they had an immediate download perhaps even OTA.
As I said I am very sceptical.
FWIW at a Volvo Ocean Race stopover a while ago they had the pedestrian auto-stop feature on display; a certain individual decided to test this once the public had gone. He drove the car right through the pedestrian and into a wall. The feature didn't work.
Does also seem that statistically the Tesla approach is probably safer than the average driver,
There isn't the evidence for that. Firstly as I posted upthread UK drivers are statistically safer than Tesla drivers. Secondly 130 million miles and 1 death is too small a sample. Once Tesla have a couple of billion auto miles logged we'll start getting an idea. In the UK that would be around 10 deaths.
In any case Tesla don't claim their car can drive itself. Elon Musk states "the responsibility remains with the driver" (at 1.22 in the video).
So Tesla think their system can help the driver to be safer but stats don't show that so far. The problem with the current state of Tesla automation is that it isn't fully auto but the more it takes over the more drivers switch off and once every X number of million miles that results in a fatal.
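As a rough sanity check on that "couple of billion miles" figure (a sketch using the UK rate quoted upthread, nothing more):
[code]
# Expected fatalities over 2 billion miles at the 2013 UK human-driver rate.
uk_rate = 5.6 / 1e9      # fatalities per mile (5.6 per billion miles)
target_miles = 2e9       # "a couple of billion" Autopilot miles

print(uk_rate * target_miles)   # ~11 expected deaths at the UK rate
# Only once the Autopilot mileage is into the billions would a clear
# shortfall or excess against ~10 deaths start to mean something.
[/code]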
The feature didn't work
Are you talking about [url= http://www.independent.co.uk/life-style/gadgets-and-tech/news/self-parking-volvo-plows-into-journalists-after-owner-neglects-to-pay-for-extra-feature-that-stops-10277203.html ]this[/url]? In that case they (a) misunderstood what the various safety systems were, and (b) didn't actually have the safety system they were apparently trying to demonstrate installed; it's an optional extra, not standard equipment. I don't think that can be blamed on the car.
irc - I agree with most of what you say, but to compare Tesla miles to "UK driver miles" is bad analysis. You can't say "UK drivers are safer", only possibly "driving in the UK is less likely to result in death than driving in the US". They typically drive different vehicles, their road layouts are different, their driver training and law enforcement is different, and potentially even their emergency response is different. It's just as bad comparing UK and US stats and attributing the difference to the person or system controlling the velocity of the vehicle as it is to say "1 death in 130M miles to date means the death rate is 1:130M miles".
And of course that works the other way round too - just because they prove the technology on wide, straight roads in sunny California doesn't mean you can assume the same safety record on the narrow, twisty back lanes of Somerset on a misty day.
irc - I agree with most of what you say, but to compare Tesla miles to "UK driver miles" is bad analysis.
Point taken. But the gist of my argument stands. One USA fatal per 95 million miles or thereabouts for human drivers is an accurate stat. 1 fatal for 130 million miles for Tesla isn't enough data. We can't say whether the Tesla system as it stands is safer. The advantages of the driver assist features may be outweighed as drivers switch off. We don't know.
oh, and Tesla agree that this system is not fully ready yet
So how the hell is it allowed on the public roads? Has it passed a driving test?
So how the hell is it allowed on the public roads? Has it passed a driving test?
It's there to help and assist the driver (the single most dangerous component in a car); at no point should you be using it as an automatic driving system. The failure yet again is the driver.
at no point should you be using it as an automatic driving system. The failure yet again is the driver.
Or alternatively it's a Tesla failure for not having some system to ensure the driver is paying attention. A modern version of the dead man's handle on trains. What should it be? Who knows. Iris scanner? Blink sensor? But a company that semi-automates then blames the driver for losing concentration is just shifting blame.
The other issue is driver de-skilling. First generation drivers have years or decades of experience. Drivers in an auto car world won't have that. So full automation - not an issue. Semi auto - the drivers will be worse when they need to take over.
Manual control is a highly skilled activity, and skills need to be practised continuously in order to maintain them. Yet an automatic control system that fails only rarely denies operators the opportunity for practising these basic control skills. One of the consequences of automation, therefore, is that operators become de-skilled in precisely those activities that justify their marginalised existence. But when manual takeover is necessary something has usually gone wrong; this means that operators need to be more rather than less skilled in order to cope with these atypical conditions. Duncan (1987, p. 266) makes the same point: “The more reliable the plant, the less opportunity there will be for the operator to practise direct intervention, and the more difficult will be the demands of the remaining tasks requiring operator intervention.”
An experienced driver today is probably competent enough to monitor a self-driving car but what about a driver twenty years from today who will likely not have spent any meaningful amount of time driving a manual car?
http://www.macroresilience.com/2013/05/09/deskilling-and-the-cul-de-sac-of-near-perfect-automation/
https://www.teslamotors.com/models
From my reading nothing says you don't actually need to pay attention...
[quote]https://www.teslamotors.com/models
From my reading nothing says you don't actually need to pay attention...[/quote]
Operating manual? Human nature? Which will it be?
An experienced driver today is probably competent enough to monitor a self-driving car but what about a driver twenty years from today who will likely not have spent any meaningful amount of time driving a manual car?
An interesting point, but with technology decreasing the rate of human intervention needed, it will be statistically insignificant compared to the current reality.
Note also the weasel words "experienced" and "highly skilled" - neither of which is a prerequisite for driving a car.
At the opposite end of the spectrum, when Ford first introduced their adaptive cruise system in the new F150, the system would slam on the anchors when met with dark shadows under overpasses on bright days... Caused quite a few wrecks there too! We are on our 4th car that has adaptive cruise (2nd with lane-keeping assist steering) and whilst it does work well, it makes me pretty uncomfortable, and I would never pay any less attention and use it as a crutch! My mother-in-law has fallen asleep at the wheel of her Mercedes ML and the system kept her on the road and woke her up 😯
I've used office computer systems for thirty years. Based on that experience there's no way I would trust my (or anyone else's) life so directly to a computerised system.
I know we are forced indirectly to risk our lives (computerised control of hazardous processes etc) but just because "we" can do something by IT does not mean that we should.
I know we are forced indirectly to risk our lives (computerised control of hazardous processes etc) but just because "we" can do something by IT does not mean that we should.
Missing the elephant in the room here: human beings manage to kill 1000s of people through poor, bad, inattentive and under-skilled driving. The most dangerous component in the car is the driver.
It's a classic case of launching an immature system, full stop. Tesla hiding behind the fact that 'the driver is always in control and should monitor the system at all times' is a cop-out. The cause of most accidents is drivers not being alert - well, if they're not alert when actually driving then they're going to be even less alert when simply overseeing an automated system. If the driver is the weak link, the system has to eliminate the weak link or you're not addressing the danger. The sort of half-way house driver-assist systems that Tesla has introduced are just a stepping stone to a fully automated driverless system, so Tesla are effectively developing the system in the real world.
Unfortunately there is no global framework for allowing new technology for cars to be proved, certified, released and introduced in a safe way, like there is for the aviation industry where global aviation authorities regulate and oversee everything. That hasn't been an issue before now because we've been talking about new tech such as electric windows and sat nav - i.e. technology whose failure will not result in accidents. This is something different and requires a more thorough approach than driving around for a few million miles and declaring "that'll do". Where are the thorough global standards that the system has to achieve, other than a testing to-do list that a few engineers within Tesla come up with? Is there testing to ensure the system will 'fail in a safe way' when certain critical components within the system fail?
Jesus, RIP. It could literally be a combination of the driver part-watching a film, keeping an eye on the road and using the driving feature. This combined with a tragic traffic move = a sad fatality. I just love news media painting a bloke as 100% irresponsible. You can use a feature for a while and start to trust it, you know.
On a road with impulsive and sudden lane changes and chain-reaction moves with no rational thought behind them, a computer cannot beat a driver IMO.
Last week I witnessed a truck driver monster and almost crush a car driver for a small slight - it was ridiculous. Would a computer spot and respond to that, or throw a spanner/fault move?
I just wish the media would wait for 100% facts before slandering a dead man but then they wouldn't say '6 months ago a man was killed and here is our follow up story with all the facts now'.
As wobbliscott says.
These systems should undergo the same certification that a new passenger plane has to, which is a bit more than flying it for a few thousand hours and saying "that'll do".
Also, people seem to be too trusting of technology. The GPS is always right. There'll always be a mobile signal. The self drive works and I can read a book on the way to work.
To those saying that they wouldn't trust their life to an automated system on the basis of using a Windows PC, or that new tech up to now isn't safety critical (sat navs etc): how much control do you think you have driving a modern car with an automatic gearbox? The engine is fly by wire and the shift lever has no mechanical link to the transmission for selection of neutral.
There are standards that, if not followed, will leave the manufacturers liable. Tesla are in a very grey area by referring to this functionality as beta; in my opinion it should not be present on a production car if it is genuine beta software.
I agree, the grey area seems to be getting bigger, and no-one seems to be regulating it. Engines being drive by wire is, I think, just about ok, but combine that with self parking systems (which must therefore have some degree of steer by wire and brake by wire) and I'm having to place a lot of trust in the hands of the hardware and software engineers. I'm not sure I'm ready for that yet without someone checking what they're doing. (Or I'm just getting old.)
how much control do you think you have driving a modern car with an automatic gearbox? The engine is fly by wire and the shift lever has no mechanical link to the transmission for selection of neutral.
Enough to know that I need to pay attention and not watch Frozen on the DVD player, I guess. Throttle may be fly by wire but the driver is still putting in command inputs; the gearbox may be computer controlled but it still responds to control inputs. Steering, however, has to have a physical link between steering wheel and front wheels so that if (when) hydraulic and electronic systems fail the driver can still steer. Same with brakes. It's not to do with the gadgetry under the skin of the car, it's about what makes the control decisions. Even with adaptive cruise control and lane assist, that responsibility lies with the driver. Apparently it is the same for the Tesla, but, people are lazy...
Even with adaptive cruise control and lane assist, that responsibility lies with the driver. Apparently it is the same for the Tesla, but, people are lazy...
+100
In the end removing the driver is the safest thing; as much assistance as possible to correct the mistakes of the driver is good. But a driver who ignores the road is an idiot and responsible for what happens.
The inputs to the engine and gearbox are purely electrical though. The software decides what to do with those inputs, there is no mechanical override other than obviously the brakes - but in hybrid and electric cars they are becoming increasingly computer controlled due to energy regen.
The point is that we already have a high degree of automation which, if not properly implemented and standards followed, could be dangerous. Tesla's autopilot system implies that the driver doesn't need to take any notice (even if that isn't the case, that's how it will be interpreted), and this is a step too far.
My issue is it either has to be fully automated or not automated.
Once you take away the steering and the throttle and the brake, the driver's going to get bored, become inattentive and not pay attention.
Dead man's switch like trains?
I do think it is coming and it's got potential to be a great thing, but there's an ethical dilemma in such cases - so I'm in my automated car that I own, it's driving automatically and it kills someone.
Whose responsibility is it? I reckon I know the answer but there will be a dispute I'm sure.
Whenever someone quotes "Darwin Awards" in relation to someone's death, it demonstrates an appalling lack of empathy and taste.
Perhaps we can nominate the OP to explain the concept of 'Darwinism' to the victim's children.
The inputs to the engine and gearbox are purely electrical though. The software decides what to do with those inputs, there is no mechanical override other than obviously the brakes
The key thing is who (or what) is generating those inputs. Yes a computer will decide to select a different gear ratio, and a different fuel mixture, but the computer is responding to a driver request. Even with cruise control and lane assist, the driver has requested to maintain a certain speed and course.
It's about who is [i]in charge, [/i]and who is [u]responsible.[/u]
V8ninety - I see your point about responding to driver inputs: if you press the accel pedal halfway, then you might reasonably expect the software to implement this for you. This should be the case, but don't be under an illusion that you have direct control, a software bug could prevent this from happening. Fortunately there are standards and procedures followed to mitigate this risk.
In this example though the driver makes the conscious decision how far to press the accelerator pedal. Same with steer-by-wire and brake-by-wire systems - they should respond to what the driver is asking for. The problem that I see with autopilot is that the computer becomes the driver and makes the combined decision on how far to press the accel pedal, how much to steer, how much to brake. A huge amount of subconscious decisions and observations are made during driving which I don't believe a computer is able to replicate yet, or maybe ever.
Automotive software isn't just hacked together in someone's basement like a lot of everyday software is, it is a highly regulated environment that goes to great lengths to demonstrate correctness at every stage of the process. The more complex or safety critical a function, the more nauseating the certification process, so I have no doubt that a huge amount of effort has been put into making the Tesla system as safe as they can make it, and that will continue into the future as issues like this surface.
While only "beta" this will still be far better designed and tested than 99% of software out there will be. The various agencies such as the US NHTSA wouldn't let them sell it if it wasn't.
Too late - I agree.
Perhaps we can nominate the OP to explain the concept of 'Darwinism' to the victim's children.
Technically, if anyone is explaining Darwinism to the deceased's children, then it doesn't [i]actually[/i] apply... Jus' sayin
The software required to drive a gearbox is many orders of magnitude less complicated than that required for an autopilot system. You can predict quite simply the number of different possibilities and situations that an auto gearbox has to deal with - maybe 20 or 30 different scenarios. But the number of potential possibilities that an autopilot system has to be able to deal with will add up to millions - almost infinite - so many that it is impossible to hard code into software, so you're looking at incorporating elements of artificial intelligence. The two are not even comparable.
I'm not saying that they are comparable, but what I am saying is that we already have a large amount of automation in cars at this point in time. People on this thread are suggesting that based on sat navs and windows computers that they wouldn't trust automated systems, but the truth is that they already do. A software bug in engine software, gearbox software, etc could put people in danger, but we do have standards that have to be adhered to, so that this risk is mitigated.
You are right, the decisions that a gearbox has to make are many many fewer than autopilot, but incorrect gearbox software could still result in tragic consequences.
if you press the accel pedal halfway, then you might reasonably expect the software to implement this for you...but don't be under an illusion that you have direct control, a software bug could prevent this from happening
By the same token, twenty years ago;
If you press the pedal half way you might reasonably expect the carburettor to implement this for you, but don't be under the illusion that you have direct control. The fuel air ratio has been carefully set by some very clever boffins at the factory, and the complex and balanced decision making of how much fuel vs air is allowed into the inlet manifold is made by some very clever self regulating mechanisms that take into account engine temp, load, revs, as well as a variety of other factors. Of course, if a little bug got stuck in a jet, it could prevent this from happening...
Since machines have become increasingly complex, we have had less and less 'direct' control. The paradigm shift that is going on now though is that we are actually handing [i]fundamental control[/i] to the machines. As a driver (responsible for my own errors) I'm not sure I'm ready for that yet.
incorrect gearbox software could still result in tragic consequences.
Not convinced of this, to be honest. Expensive, yes, but under normal driving conditions a gearbox would have to REALLY **** up to cause a crash. (Edit: and a pure mechanical box could **** up in just the same way, so electronics are kinda irrelevant in that example.)
The Tesla software will be more complicated, but all of it will still have been through the same extremely rigorous certification process; I couldn't even begin to imagine how time consuming and expensive that must have been, and I'm a software engineer who has worked on much, much simpler safety critical systems in the past. It's a phenomenal amount of work.
I bet you're still more likely to get killed by the failure of some really boring bit of hardware like a bonnet catch or something than any software system.
phiiiil - which would make it even more frustrating for the engineers having their work shot down by a journalist who writes half-baked stories for a living
The difficulty in the step up to fully autonomous control is actually why I prefer Tesla's approach than Google's; we are generally better at improving what we have in tiny steps than coming up with a finished product in one go.
Both are producing extremely valuable results so I'm still glad people are approaching this from many angles, but the incremental approach makes it less of a shock.
phiiiil - which would make it even more frustrating for the engineers having their work shot down by a journalist who writes half-baked stories for a living
Absolutely. The comparison between control system software and desktop or phone software makes things difficult; the gulf between the two is massive.
When some VW bigwig a while ago blamed their test cheating on "a couple of rogue software engineers" he was talking absolute bollocks; there is no way in hell you could ever sneak anything in to that kind of software without armies of people knowing what you're doing.
Since machines have become increasingly complex, we have had less and less 'direct' control. The paradigm shift that is going on now though is that we are actually handing fundamental control to the machines. As a driver (responsible for my own errors) I'm not sure I'm ready for that yet.
And yet I'd feel a lot safer if you handed control of everyone else's vehicles to the 'computer' rather than let average drivers be in charge.
I've often said that if you invented the car today you'd never be allowed to introduce it onto the roads. The big challenge in autonomous vehicles isn't driving the vehicle, it's guessing what the non-autonomous vehicles (and perhaps to a lesser extent animals, peds and bikes) might do. Interestingly I believe my grandchildren will never actually drive a car (if I have any they won't be driving for about 25 years at least), and to make that big leap happen it might be necessary for us to do something radical like banning normal cars from some sorts of road, or insisting they all transmit certain data to help make them predictable to their near neighbours. My grandkids will look back at the improved safety statistics and wonder how it took us so long to phase in the technology.
