Tesla's Master Plan, Part Deux
    DezB
    Free Member

    I wonder how they’ll change the liability laws when driverless cars are out there. If a cyclist gets killed due to a fault with the software… would it mean that they could never be fully driverless, so that the “user” is liable for any incidents?

    Notter
    Free Member

    Technological sophistication or software quality aside, the issue as I see it between Google (and presumably other auto manufacturers) and Tesla is that Google’s system is building knowledge through controlled testing with engineers overseeing it, with the downside for them being that they can’t rack up as many test miles as Tesla can because…

    The owners / drivers of Teslas are the “engineers” in the above scenario, which is where the controversy comes from. The engineers are absolutely paying attention to what the car’s doing, because it’s their job. The owners of Teslas are probably not paying attention to what the car’s doing, because, well Facebook / texting / watching a movie / viewing pron, whatever….

    Don’t get me wrong, I love tech (and work in it in true STW stereotype fashion!) but I do have concerns about the contrasting methods by which these systems gain their experience.

    wobbliscott
    Free Member

    Driverless cars are the future – I didn’t mean to suggest in my previous post that we should abandon them. I just think that Tesla’s approach is sloppy. What we first need are some industry standards to define what a driverless system is, what its capabilities are, what the limits of its abilities are, what the rules are for redundancy, and what the system should do if something goes wrong or fails. Without that it’s a free-for-all.

    Developing a car that can drive itself is relatively easy – what makes it difficult is dealing with unpredictable situations that we see on the roads all the time. We can address that simply by not mixing driverless cars and driven cars on the road – it’s the human element that introduces the unpredictability.

    I don’t agree with the suggestion that driverless cars only have to be better than human driven cars. They have to be flawless. No company is going to risk liabilities of their car autopilot systems killing people.

    nickc
    Full Member

    bollox to all that, where’s the bloody jet-pack I was promised

    Northwind
    Full Member

    Wobbliscott, your position’s contradictory; practically no complex tech is flawless and no new tech ever is. So you’re saying it’s the future then imposing a condition on its use that it will never meet.

    verses
    Full Member

    I have a concern that as these things come in, and start obeying traffic rules such as speed limits and not overtaking on hatched boxes etc, it will allow the numpties another chance to prove their mastery of driving – by intimidating the computer and carrying on, knowing the automated car won’t give them road rage back… When we have enough automated cars, then yes it will improve, but getting there could be interesting.

    Hopefully the number of sensors and cameras on the automated cars will make prosecuting numpties easier.

    The owners of Teslas are probably not paying attention to what the car’s doing, because, well Facebook / texting / watching a movie / viewing pron, whatever….

    If they are doing that then they need prosecuting as that’s not what they’re being sold. If the car detects you’re not holding the wheel it shouts at you. When starting “autopilot” you’re told not to use it carelessly. It’s currently a driving aid like cruise control, not full-on KITT-like self driving.
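
    As an aside, here is a minimal sketch of how that kind of hands-on-wheel nag can work in general; the thresholds, names and escalation steps below are illustrative assumptions, not Tesla’s actual implementation:

    ```python
    import time

    # Minimal sketch of a hands-off-wheel nag (assumed thresholds, not Tesla's real values).
    WARN_AFTER_S = 15        # assumption: first warning after 15 s without wheel torque
    DISENGAGE_AFTER_S = 60   # assumption: disengage and slow down after 60 s

    def check_driver_attention(torque_on_wheel: bool, last_hands_on: float):
        """Return (action, updated_last_hands_on) based on how long the wheel has been untouched."""
        now = time.monotonic()
        if torque_on_wheel:
            return "ok", now                        # hands detected: reset the timer
        elapsed = now - last_hands_on
        if elapsed >= DISENGAGE_AFTER_S:
            return "disengage_and_slow", last_hands_on
        if elapsed >= WARN_AFTER_S:
            return "warn_driver", last_hands_on
        return "ok", last_hands_on
    ```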

    I think the worst thing they did was call it Auto Pilot rather than something like cruise-control-plus…

    prawny
    Full Member

    DezB – Member
    I wonder how they’ll change the liability laws when driverless cars are out there. If a cyclist gets killed due to a fault with the software… would it mean that they could never be fully driverless, so that the “user” is liable for any incidents?

    Volvo have said that they’ll take responsibility for a crash caused by their driverless cars. Insurance-wise it will come down to a product liability claim, like any other mechanical failure that causes injury or damage. It’s quite interesting; there were a few articles in the insurance press (it is a thing) when the Google car thing first kicked off. It’s gone quiet recently though, it’s all Brexit at the mo.

    DezB
    Free Member

    bollox to all that, where’s the bloody jet-pack I was promised

    or…

    Del
    Full Member

    new york avenue, washington dc 😉

    Notter
    Free Member

    If they are doing that then they need prosecuting as that’s not what they’re being sold. If the car detects you’re not holding the wheel it shouts at you. When starting “autopilot” you’re told not to use it carelessly. It’s currently a driving aid like cruise control, not full-on KITT-like self driving.

    Which is exactly my point – a quick YouTube search unearths all sorts of cockwombles who are doing exactly that, i.e. ignoring the car. This is the problem: people blindly believing that it works because, well, why would Tesla allow me to use something that’s called “autopilot” but then expect me to still pay attention?

    So in other words, problem = humans 😉

    Edit – not all humans, but humans who are not being paid to test the system.

    fingerbike
    Free Member

    new york avenue, washington dc

    Elon’s words, not mine, guilty of Copy/Pasta, that’s all. 🙂

    ghostlymachine
    Free Member

    Yeah. That’s called a test program. Where you do testing. Like everyone else is doing.
    What Tesla is doing is getting people to drive the same route at the same time of day, day in, day out. Of their claimed test mileage, I would doubt that more than 5% is actually “testing” as opposed to just accumulating miles.

    footflaps
    Full Member

    What Tesla is doing is getting people to drive the same route at the same time of day, day in, day out.

    As the traffic is always different, they are still testing and therefore honing the algorithms.

    ghostlymachine
    Free Member

    Yeah. That’ll be the 5%.

    It’s far from impressive.

    AlexSimon
    Full Member

    That 5% would still be 150,000 miles every day though. I’m impressed, even if you or your industry friends aren’t.
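
    For what it’s worth, the arithmetic behind that figure; the 3 million total is simply what the 5% and 150,000 numbers above imply, not a quoted Tesla statistic:

    ```python
    # Back-of-envelope check of the figures in the posts above (implied, not official numbers).
    testing_share = 0.05              # ghostlymachine's "no more than 5% is real testing"
    testing_miles_per_day = 150_000   # AlexSimon's figure for that 5%
    implied_total = testing_miles_per_day / testing_share
    print(implied_total)              # 3,000,000 fleet miles logged per day, implied
    ```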

    Musk plays a clever game and I can see how it would rub competitors up the wrong way, but these industry disrupters often play a completely different game to the one that the established players think they’re in.

    chakaping
    Free Member

    Volvo have said that they’ll take responsibility for a crash caused by their driverless cars.

    I was gonna say this, it seems to be emerging that carmakers will accept liability for accidents caused by the vehicles in self-driving mode. And rightly so.

    Volvo are trialing self-driving cars in Gothenburg this year, with another pilot study scheduled for London.

    I’ve been writing a reasonably in-depth B2B document about connected cars recently; learned a lot, and the technology is both closer and further away than I thought.

    chakaping
    Free Member

    these industry disrupters often play a completely different game to the one that the established players think they’re in.

    Some of the analysis suggests that the real disruptors in the car industry could be Google, Amazon and Apple, though obviously they’ll be working in conjunction with traditional manufacturers.

    ghostlymachine
    Free Member

    Musk plays a clever game and I can see how it would rub competitors up the wrong way, but these industry disrupters often play a completely different game to the one that the established players think they’re in.

    “Industry Disruptors” – is that the latest buzzword for completely ignoring some pretty hefty legislation? And if you are going to play a game of football, using the rules from ice hockey doesn’t really do you (or your business) any favours.

    aracer
    Free Member

    Computers are actually already very good at this sort of thing – given proper development (which is here or close to being here right now, not years in the future) they should be better than humans. I find it interesting that a lot of the issues mentioned – for example fitting into gaps only just wider than the car – are the sort of things which computers are far better at than humans.

    I don’t agree with the suggestion that driverless cars only have to be better than human driven cars. They have to be flawless. No company is going to risk liabilities of their car autopilot systems killing people.

    So you’re happy for lots of needless deaths at the hands of human drivers when driverless cars can do better? There’s a whole load of dodgy risk perception going on here – not just you, but many if not most of the posters on this thread as well as most people in the wider world – even many of those who tend to agree with me. Because we’re conditioned to accept carnage due to motor vehicles which we don’t find acceptable in any other situation.
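
    A rough worked example of that “better, not flawless” point; the human fatality rate below is an approximate US figure and the driverless rate is a purely hypothetical improvement, so treat both as assumptions for illustration:

    ```python
    # Illustrative only: assumed figures, not measured driverless-car data.
    annual_miles = 3.2e12             # assumption: roughly US annual vehicle miles travelled
    human_rate = 1.2 / 1e8            # assumption: ~1.2 deaths per 100 million miles driven
    driverless_rate = human_rate / 2  # hypothetical: half as deadly, still far from flawless

    human_deaths = annual_miles * human_rate            # ~38,400 per year
    driverless_deaths = annual_miles * driverless_rate  # ~19,200 per year
    print(human_deaths - driverless_deaths)             # ~19,200 fewer deaths despite imperfection
    ```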

    fatmax
    Full Member

    Big old long interview with Elon Musk, well worth reading:

    http://waitbutwhy.com/2015/05/elon-musk-the-worlds-raddest-man.html

    Two pages in and no one has mentioned the sustainability focus of Musk, Tesla or his solar company. He / they might not be perfect but I find him quite inspiring.

    chakaping
    Free Member

    Aracer – they have to be perfect (not just better than you or me) so that the manufacturers don’t get sued, and so that driverless cars don’t get banned.

    I don’t know if they will be, they will probably get close enough though.

    The real life studies being carried out are as much about seeing how other road users interact with them as testing the actual functionality.

    mikewsmith
    Free Member

    So I think the one thing we can all agree on is that the biggest problem for self-driving cars is the unpredictable nature of the human-operated cars… which one should we be getting rid of? 😉

    aracer
    Free Member

    It’s the same old story mike – when interaction between cars (human driven) and other road users results in casualties, I don’t think removing the cars is ever considered as an option.

    Perfect isn’t possible – no other technology is perfect. Hence manufacturers will be sued – for which they’ll have insurance (in reality they’ll probably self-insure). However most of the liability will still be with human drivers, because the computers are so much better, so the risk won’t be huge. For the reasons I’ve set out, banning driverless cars due to them not being perfect would be ridiculous – who is going to campaign for, or legislate for, increasing the death rate on the roads?

    DezB
    Free Member

    Survey on IAM – https://www.iamroadsmart.com/media-and-policy/polls/pathway-to-driverless-cars

    Brings up some interesting questions, I think.

    breatheeasy
    Free Member

    Which is exactly my point – a quick YouTube search unearths all sorts of cockwombles who are doing exactly that, i.e. ignoring the car. This is the problem: people blindly believing that it works because, well, why would Tesla allow me to use something that’s called “autopilot” but then expect me to still pay attention?

    Didn’t somebody successfully sue Winnebago a few years ago because they didn’t make it clear in the manual that turning cruise control on in their van didn’t mean you could wander into the back to make a cup of coffee and (funnily enough) it crashed?

    packer
    Free Member

    The tesla thing is poor engineering, weasel words and marketing spin.
    The sooner he gets called out on it the better.

    I’m no expert in this field, but I’ve read quite a lot about it and about Tesla, and I really don’t see how anyone could arrive at these conclusions.
    Are you sure you haven’t just taken an irrational dislike to the CEO??

