Anyone working across this space in here - there must be a few of us?
Anyone care to speculate where you think things will be going in the next few years, been reading some interesting stuff around the sides but nobody seems to have gone all in yet 🙂
Trying to write some "Going Forward" stuff and it'd be good to get an idea beyond the Gartner/IBM press release
Expecting the usual buzzword haterz to show up 🙂
Happy to chat off the forum too
Marketing **** that Siemens et al are using to try and tie you into their proprietary systems. Making simple shit out to be complicated.
Thanks for your helpful insights.... which presentation from Siemens did you go to?
The price point of IOT devices is so low you can pretty much put them on anything:
The Dairy Monitor solution from Connecterra (www.connecterra.io) uses high-tech pedometers mounted on cows to very accurately sense their movements (Pretz, 2016). The solution is scalable and directly connected to a cloud services platform with advanced algorithms for predictive analytics. Connecterra actually creates Digital Twins of cows, which are used to remotely monitor each animal, detect when a cow is in estrus (in heat) and track its health. The Dairy Activity Monitor provides multiple behaviour detections and predictions, including heat and estrus cycles and health analysis, and also gives a forward-looking prediction of the next cycle's start dates. The devices learn and tune their behaviour based on the individual movements of each animal. Furthermore, Connecterra provides location services that track and trace the movements of dairy cows, giving you an accurate measurement of the free-grazing time per animal.
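Purely to illustrate the kind of analytics involved (not Connecterra's actual algorithm, which isn't described here), an estrus flag based on a cow's own activity baseline could be sketched in a few lines:

```python
from statistics import mean, stdev

def flag_estrus(daily_steps, window=10, sigma=2.5):
    """Flag days where activity spikes well above this animal's own
    rolling baseline. Illustrative threshold rule only; a real system
    would learn a per-animal model. Returns indices of flagged days."""
    flags = []
    for i in range(window, len(daily_steps)):
        baseline = daily_steps[i - window:i]
        mu, sd = mean(baseline), stdev(baseline)
        if sd > 0 and daily_steps[i] > mu + sigma * sd:
            flags.append(i)
    return flags

# Ten quiet days, then a big activity spike on day 10 (a classic estrus signal).
steps = [5200, 5100, 5350, 4900, 5000, 5150, 5300, 5050, 4950, 5100, 9800]
print(flag_estrus(steps))  # [10]
```

The "learn and tune per animal" bit in the blurb is really just this idea taken further: compare each cow against her own history rather than a fixed threshold.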
What's Digital Twins?
EDIT ah, read the post above..
Cheers Shinton, loving the Per Cow/Per Month pricing option. Is that something you're involved in?
https://www.gartner.com/smarterwithgartner/prepare-for-the-impact-of-digital-twins/
We're doing lots around IoT and Industry 4.0. Digital Twins wasn't a term I was familiar with however it does look like we're doing a fair bit of that as well. I work for an IT consultancy that is part of a larger group that also manufactures a lot of IoT relevant devices.
We're starting to see a lot of use cases for IoT now, with a fair few proof of concept type projects spinning up.
@epicsteve, kind of what I'm seeing here. Coming from a simulation background, we are partly the brain/engine behind the twin part, but it does feel like 3 techs (data collection, analytics & simulation) colliding at the moment - obviously the big players are trying to offer across it all
I do get the feeling that all those days spent asking for data that didn't arrive or wasn't available are going to come back to haunt us as we start talking in petabytes!!
Heh - strange that a cow feeder would come up. Creating a cow feeder system was the first project Sophie Wilson worked on, before co-inventing the BBC Micro and, eventually, the instruction set for the ARM processor...
Rachel
which presentation from Siemens did you go to?
Sales rep coming round and showing us stuff, telling me it was revolutionary (it wasn't), had not been done before (it had). The usual hyperbole.
There is interesting stuff out there but the big players kill it with closed systems and lies.
There is interesting stuff out there but the big players kill it with closed systems and lies.
Just to correct you there - one of the integral parts of the Siemens offering, MindSphere, is open: https://siemens.mindsphere.io/
Although parts of this have been done before I've not seen any truly integrated solutions out there
Lots of this in the Asset Management space regarding smart infrastructure.
Lots of conversations and some projects going on in my area - I'm a technology strategist for Microsoft so we have platforms and capability to support people building these solutions, and a few that are being built by Microsoft Services too.
The company I work for has been into IoT and big data for decades, though of course it's seen an exponential increase over recent years. It's given us unbelievable insight into the operation of our products, which has transformed the nature of the services we can sell to our customers - and transformed our business from a company that designs, manufactures and sells machines to one that receives half of its revenue from value-added services. So we're just as much a service delivery company as a supplier now.

99% of it is nothing to do with connecting things to the internet or to our own network - that is the easy bit. It's what to do with the sheer volume of data you start getting back. We only actually do something with about 20% of the data our products generate out in service. We're bringing new data-driven services online all the time, but the real challenge is how to process all the data you're receiving - you just can't employ enough people to crunch numbers manually. It's fine for things you have already established - once you've got an algorithm set up it'll run automatically just fine, drawing on whatever data source it needs - but it's the potential of what you could do if you could exploit the 80% of data that you're not doing anything with routinely. It's just sat there until there is a problem, and then you have the data around you to help fix it... but we really want to use the data proactively, to anticipate problems before they actually happen. For that we're looking to invest heavily in AI and machine learning.
Digital twins are very much in their infancy - we've been doing it on a very, very small scale for a while, but they have a long way to go yet before they're of significant use for large, complex machines. It is the future for companies like ours, though.
Beej - we're on the Microsoft platform for our IT platform. But unfortunately like most IT projects we've touched, we've taken a solution that works fine and managed to completely screw it up during the implementation.
We do a lot on the topic at my place of work (Manufacturing Technology Centre) but I'm not involved directly.
We have an annual Digitalising Manufacturing event too:
http://www.the-mtc.org/event-items/digitalising-manufacturing-conference-2018
Which is a bit of a shame because for some reason I find it hard to say "digitalising"
Yeah, the twin bit is going to be the fun part - it's much closer to where I come from with the simulations, but the live part is the stumbling block at times. We did a small-scale thing on a hospital relocation project back in Oz which was interesting and pushed us a fair bit, but getting the project team to think of it as a live scratch pad for decision making too was a step too far for them.
I guess the popularity of the thread and the woolliness of a lot of the articles highlight where we are with this in terms of maturity and implementation.
Just to correct you there - one of the integral parts of the Siemens offering, MindSphere, is open: https://siemens.mindsphere.io/
That does look like a neat environment, which is very un-Siemens-like. The website is a bit detail-lite without signing up, which is annoying. I still stand by my original statement that it is nothing new or complicated. Nicely packaged, for sure.
Yeah, I reckon your salesman would get you a free look around anyway 😉 (that's not me by the way)
Be good to see some examples of what you have seen before as an all-in solution. I've done bits of this over the years but never pulled it into one; the last tender I saw was huge on scope and ambition
Digital Twins wasn’t a term I was familiar with however it does look like we’re doing a fair bit of that as well.
Now you are familiar - that’s another £150 on the charge out rate! 😂
Moving from milestone based maintenance to condition based maintenance is a massive cost saving, and IOT/Digital twin, yada... is an enabler.
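To make the milestone vs condition comparison concrete, here's a toy sketch. The numbers are made up for illustration; the 7.1 mm/s vibration limit is borrowed from the ISO 10816 severity zones purely as an example threshold:

```python
def maintenance_due(hours_run, vibration_mm_s,
                    milestone_hours=500, vibration_limit=7.1):
    """Compare the two maintenance triggers for one asset.
    milestone_hours and vibration_limit are example values only."""
    milestone_due = hours_run >= milestone_hours       # usage/calendar based
    condition_due = vibration_mm_s >= vibration_limit  # actual health based
    return milestone_due, condition_due

# Healthy machine past its milestone: milestone says strip it down, condition says keep running.
print(maintenance_due(520, 2.3))  # (True, False)
# Machine degrading early: condition catches it long before the milestone would.
print(maintenance_due(300, 8.4))  # (False, True)
```

The cost saving comes from the first case (avoided unnecessary services) and the avoided failures from the second - and the IoT part is simply what gets you a trustworthy live `vibration_mm_s`.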
To Mike's point about petabytes of data, it doesn't have to be that way. You have the choice to keep that data, and if you keep it on S3 or even Glacier on AWS, or Blob storage on Azure, it should be relatively cheap. But 99% of data from an IoT device is going to be in an acceptable range, so you can actually build a case for only keeping the data that matters.
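A sketch of that "only keep the data that matters" idea - this is the classic report-by-exception / deadband pattern, with made-up thresholds:

```python
def keep_sample(value, low=20.0, high=80.0, last_kept=None, deadband=1.0):
    """Persist a reading only if it's out of the acceptable range, or has
    moved by more than `deadband` since the last stored value.
    Thresholds here are invented; real deployments tune these per signal."""
    if value < low or value > high:
        return True            # out of range: always keep
    if last_kept is None:
        return True            # nothing stored yet
    return abs(value - last_kept) >= deadband

# Six raw readings shrink to four stored ones; steady-state noise is dropped.
readings = [50.0, 50.2, 50.1, 55.0, 90.0, 50.3]
stored, last = [], None
for r in readings:
    if keep_sample(r, last_kept=last):
        stored.append(r)
        last = r
print(stored)  # [50.0, 55.0, 90.0, 50.3]
```

On a boring signal the compression ratio is far better than this toy run suggests, which is where the storage-cost case comes from.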
And there will only be a small number of use cases where you get into petabytes. I just looked at my Garmin and a 3-hour ride has a file of 116k, so I would need to do 8,620,689,655 of those rides to fill up a petabyte.
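The ride arithmetic above checks out:

```python
# Back-of-envelope check of the Garmin example,
# using 1 PB = 1e15 bytes and a 116 kB file per 3-hour ride.
file_bytes = 116_000
petabyte = 10**15
rides = petabyte // file_bytes
print(f"{rides:,}")  # 8,620,689,655
```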
AWS and Azure will mop up most of the infrastructure behind this, as it can be a real hassle to worry about device management, security etc. Google have some pretty cool stuff around AI and analytics, but you can run TensorFlow (great product) on AWS anyway, so why bother with Google Cloud Platform?
The data point was more that we used to struggle on the modelling and simulation side to get decent data; now we can get a lot more if people collect it - again, the stuff that should work for us is integration down to a machine/sensor level
So basically less big data and more what if/decision making and operational planning
Beej – we’re on the Microsoft platform for our IT platform. But unfortunately like most IT projects we’ve touched, we’ve taken a solution that works fine and managed to completely screw it up during the implementation
I work in the Enterprise area, so big UK/multi-nationals/globals. Some do things really well, some... less so. If you're a big UK manufacturer, you might well be looked after by someone in my team.
I've spent the last 10 years in engineering consultancy with a lot of background in simulation. 'Digital twins' and 'data analytics' seem to be the buzzwords of the moment.
As a few people have picked up on above, there's nothing new in the whole 'digital twin' idea. Simplified / reduced order system models have been used for years in things where it's hard to pin down the operating conditions - military jet engines for example.
The change as far as I can see is that the simulation and analysis needed to build/validate such models is no longer as niche as it was and so is more accessible, cheaper processing power and better software has helped too.
The change as far as I can see is that the simulation and analysis needed to build/validate such models is no longer as niche as it was and so is more accessible, cheaper processing power and better software has helped too.
From the simulation point of view (discrete event and even agent-based), the tech has advanced, but not that much in the last 5 years in terms of capability, and it's still as niche as it ever was - just ask Tesla, who bought software then trawled the globe for experienced users. Processing of DES is not much faster than it was 5 years ago as it's strictly single-core; we had experiments with multicore sorted 8+ years ago.
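For anyone outside the niche, the core of a DES engine is genuinely tiny, and you can see why it's hard to parallelise: every event can change state that later events depend on. A minimal sketch (toy handler, not any vendor's engine):

```python
import heapq

def run(events, horizon):
    """Minimal discrete-event simulation core: a time-ordered event heap.
    Entries are (time, seq, name) tuples; seq breaks ties deterministically.
    The loop is strictly sequential because each event may schedule, or
    change the meaning of, every event after it."""
    log = []
    heap = list(events)
    heapq.heapify(heap)
    seq = len(heap)
    while heap:
        t, _, name = heapq.heappop(heap)
        if t > horizon:
            break
        log.append((t, name))
        if name == "arrive":          # toy rule: each arrival spawns a departure
            seq += 1
            heapq.heappush(heap, (t + 2.0, seq, "depart"))
    return log

print(run([(0.0, 0, "arrive"), (1.0, 1, "arrive")], horizon=10))
# [(0.0, 'arrive'), (1.0, 'arrive'), (2.0, 'depart'), (3.0, 'depart')]
```

Everything a commercial DES package adds (resources, queues, animation, stats) sits on top of a loop like this, which is why raw event throughput hasn't moved much with multicore hardware.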
Nobody who has launched into this space in the last 5 years has really changed the balance at all.
For me this is the part that is finally getting hooked up. We were advocating having a simulation model in the control room, linked up live, 10 years ago, which everyone thought was wonderful - but the planning was still in Excel (it still is in a lot of places), the reporting was desperate/disparate, and the level of integration was zero.
After nearly 15 years in this industry, the first quote we did for a proper digital twin was last year. As a concept we are pitching, it's gaining a lot of traction now, and it's not something people in most places we deal with have ever considered.
As someone who doesn't have a clue about this sort of thing but can see massive potential for my industry, who would I need to talk to?
What's your industry, Andy? (Happy to chat via messages too, but would probably just point you in the direction of somebody else)
Biggest question is what are you trying to solve/improve
Sales rep coming round and showing us stuff, telling me it was revolutionary (it wasn't), had not been done before (it had). The usual hyperbole.
There is interesting stuff out there but the big players kill it with closed systems and lies.
Well, that is marketing bullshit vs reality though.
Working on the technical end I'm amazed by the lack of understanding the sales guys have... selling something made generic by marketing aimed at the CxO level.
Quite often the technical guys do have something new, but the marketing and sales people either don't understand it or don't judge it generic enough... and there's no funding to do the non-generic work until a client commits to buy.
At least in my organisation it's often killed by the wrong people talking to the wrong people on the client side....
