Moving from milestone-based maintenance to condition-based maintenance is a massive cost saving, and IoT/digital twins, yada… are an enabler.
To Mike’s point about petabytes of data: it doesn’t have to be that way. You have the choice to keep all that data, and if you keep it on S3 or even Glacier on AWS, or Blob Storage on Azure, it should be relatively cheap. But 99% of the data from an IoT device is going to be in an acceptable range, so you can actually build a case for only keeping the data that matters.
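A minimal sketch of that filtering idea, with made-up sensor names and an assumed acceptable band of 20–80; the point is just that readings inside the band can be discarded (or summarized) at the edge, and only the anomalies kept:

```python
# Hypothetical sketch: keep only IoT readings outside an acceptable range.
# The band (20-80) and the reading format are illustrative assumptions.

def worth_keeping(reading, low=20.0, high=80.0):
    """Return True if the reading falls outside the acceptable range."""
    return not (low <= reading["value"] <= high)

readings = [
    {"sensor": "temp-01", "value": 22.5},   # normal -> discard
    {"sensor": "temp-01", "value": 95.3},   # out of range -> keep
    {"sensor": "temp-01", "value": 41.0},   # normal -> discard
]

kept = [r for r in readings if worth_keeping(r)]
print(len(kept))  # only the anomaly survives
```

In practice you would likely also keep periodic aggregates (min/max/mean per hour, say) so trends are still recoverable, but the storage footprint drops by orders of magnitude.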
And there will only be a small number of use cases where you get into petabytes. I just looked at my Garmin: a 3-hour ride produces a file of about 116 KB, so I would need to do 8,620,689,655 of those rides to fill up a petabyte.
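A quick back-of-envelope check of that figure, taking 116 KB as 116,000 bytes and a petabyte as 10^15 bytes:

```python
# Rides needed to fill a petabyte at ~116 KB per 3-hour ride.
petabyte = 10**15      # decimal petabyte, in bytes
ride_file = 116_000    # one ride file, in bytes

rides = petabyte // ride_file
print(rides)  # 8620689655 -- about 8.6 billion rides
```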
AWS and Azure will mop up most of the infrastructure behind this, as it can be a real hassle to work out device management, security, etc. Google has some pretty cool stuff around AI and analytics, but you can run TensorFlow (great product) on AWS anyway, so why bother with Google Cloud Platform?