The Paradox of ‘Big Cloud’ Hyperscalers and Scaling IIoT Solutions

  • The security-centric pitch of hyperscalers is compelling, but it often fails to address the need for sustainability and scalability in the development of IIoT solutions.

  • Security should be weighted equally against the need to deliver valuable data insights that empower an organization.

  • Solutions built with general-purpose software technologies and tools are difficult to sustain and scale, and those same tools create friction during solution development.

  • Scalability needs to be designed into a solution at an early stage, yet in initial projects built with general-purpose software technologies and tools, it is often not considered until it is too late.

Just because you can create a scalable cloud infrastructure, does that mean you can also successfully create scalable IIoT solutions?

Hyperscalers believe the answer to this question is ‘yes’. An examination of the topic raises several issues and questions.

The hyperscaler pitch is usually directed at the C-suite and goes something like this:

“Security is concern number one. Your manufacturing facilities and/or assets could be hacked, potentially shutting down and destroying your business. To avoid this disaster, you need a locked down cloud infrastructure with a single pipe going to the cloud, a pipe that will be secure and impenetrable. Of course, that must be our (hyperscaler) pipe. Then you use our suite of products, tools, and services to build the solutions you need from the ground up. This way, you will guarantee security and be aligned with our leading-edge technologies.”

Sounds great, right? 

Possibly, but it might also result in a long, costly digital transformation journey with solutions that are difficult to sustain and scale, leaving a manufacturer or field equipment provider competitively disadvantaged.

Let’s start by examining the security issue highlighted by hyperscalers. Yes, security is a big issue; I think there is unanimous agreement on that. But what has really changed over the years with respect to data security? Security has always been addressed through network configuration and management. Has the work required to secure a network fundamentally changed?

Everyone has benefited from advancements in cloud security over the last decade, but should security be the singular issue driving an IIoT architecture? Isn’t it just as important to ensure that the right people within the organization get the right insights from the data collected, and that those insights are valuable? Empowering and enabling the people who use the data matters as much as keeping it secure. If you achieve the objective of keeping the bad guys out but are left with unusable data, is there any actual benefit?

The ‘security’ sales strategy of the hyperscaler is, in many ways, a Trojan horse. The real objective is to drive incremental cloud computing consumption.

What happens when a manufacturing company decides to push all their unstructured data into a data lake through that one ‘safe’ pipe? The structuring and querying of the data will occur in the data lake and, to achieve acceptable performance, the required computing power will be orders of magnitude higher. Is the customer aware of this early in the sales process or is this left as a ‘surprise’ to be discovered somewhere down the road?
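
To make that compute claim concrete, here is a minimal, hypothetical sketch in Python; the payload shapes and names are invented for illustration and do not come from any particular vendor. With raw, unstructured payloads, every query re-pays the parsing and normalization cost for every row (schema-on-read); with curated, structured data, that cost is paid once at ingest.

```python
# Hypothetical sketch: schema-on-read over raw lake payloads vs. a
# pre-structured table. Payload shapes are invented for illustration.
import json

# In practice this would be billions of rows in vendor-specific shapes.
raw_payloads = [
    '{"dev": "press-07", "vals": {"temp_f": 412.0}, "ts": 1678060800}',
    '{"device_id": "PRESS_07", "temperature": 211.2, "unit": "C",'
    ' "time": "2023-03-06T00:00:01Z"}',
]

def temps_c_schema_on_read(payloads):
    """Every query re-pays the parse and normalization cost for every row."""
    out = []
    for p in payloads:
        doc = json.loads(p)                    # parsing cost, paid per query
        if "vals" in doc:                      # vendor A: Fahrenheit, nested
            out.append((doc["vals"]["temp_f"] - 32) * 5 / 9)
        elif doc.get("unit") == "C":           # vendor B: Celsius, flat
            out.append(doc["temperature"])
    return out

# With curated, structured data the same cost is paid once at ingest;
# queries then scan typed columns instead of re-parsing JSON every time.
structured_rows = [("press-07", 211.1), ("press-07", 211.2)]  # (asset_id, temp_c)

print(temps_c_schema_on_read(raw_payloads))   # [211.11..., 211.2]
print([temp for _, temp in structured_rows])  # [211.1, 211.2]
```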

The alternative is to curate data through a digital twin infrastructure. An industrial digital twin architecture can normalize and curate data across an enterprise. Highly structured digital twin data is then put into the data lake to enable reporting, analytics, and enterprise workflows. This is a more rapid, cost-effective, and sustainable approach to enabling a digital transformation journey.
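
As a rough illustration of what that curation step might look like, the sketch below uses assumed names (TwinRecord, normalize) rather than any specific product’s API. The point is that heterogeneous, vendor-specific payloads are mapped onto one canonical, enterprise-wide schema before they ever reach the data lake.

```python
# A minimal sketch of twin-based curation at ingest, using assumed names
# (TwinRecord, normalize); this is not any specific product's API.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class TwinRecord:
    """One canonical, enterprise-wide shape for a sensor reading."""
    asset_id: str   # normalized asset identity
    metric: str     # canonical metric name, e.g. "temperature_c"
    value: float    # value converted to the canonical unit
    ts: datetime    # timestamp normalized to UTC

def normalize(vendor_payload: dict) -> TwinRecord:
    """Map one vendor-specific payload onto the canonical twin schema.
    A real system would drive this from per-device-class mapping config."""
    return TwinRecord(
        asset_id=vendor_payload["dev"].lower(),
        metric="temperature_c",
        value=(vendor_payload["vals"]["temp_f"] - 32) * 5 / 9,
        ts=datetime.fromtimestamp(vendor_payload["ts"], tz=timezone.utc),
    )

record = normalize({"dev": "PRESS-07", "vals": {"temp_f": 412.0}, "ts": 1678060800})
print(record)  # only this structured, curated record lands in the data lake
```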

Hyperscalers are very good at convincing potential customers that they are the safe choice. But the base technology infrastructure and tooling promoted by the hyperscaler lead to friction when it comes to solution implementation, which means more customization, time, and money. In the end, this comes down to a ‘build or buy’ decision for a manufacturing customer.

An analogy from the earlier days of industrial automation and data visualization is instructive. When Windows was first introduced into manufacturing, some companies chose to purchase applications from vendors, while others chose to build their own on Windows. If you went down the ‘build’ path, you needed your own team to develop, sustain, extend, scale, and support whatever you built with the tools provided. If you purchased an application, the vendor who supplied it was responsible for all of that. In the end, most manufacturers concluded that their core competency did not lie in the development and management of software applications. Has anything really changed since then?

Today, the hyperscaler provides general-purpose technologies and tools that serve both end-customer solutions and the partners in its extended ecosystem. If, for some reason, a manufacturing customer can’t be convinced that the ‘safe choice’ hyperscaler can meet every requirement, the hyperscaler may bring in components from that partner ecosystem. Either way, friction and variability create revenue opportunities for the hyperscaler.

How much organic innovation occurs at a hyperscaler, or at any other large industrial or enterprise software company, for that matter? In my experience, most of the time, energy, and resources at these companies are spent maintaining legacy software.

Innovation, instead, is driven through acquisitions. New technology, software, and products are acquired and injected into a mature marketing, sales, and execution engine with a large existing customer base. This is the formula that drives growth.

The challenge, however, is the same as discussed in the first two installments of this blog series - you are mashing together several acquired technologies, tools, and product components, along with some organic development sprinkled in, to facilitate solutions. Technologies and products not specifically designed to enable an industrial digital twin infrastructure result in long, costly solution deployments that are difficult to sustain and impossible to scale.

When it comes to solution scalability, the following are questions that should be asked of any vendor being considered for an industrial digital transformation initiative:

  • Is the digital twin infrastructure designed primarily around individual assets, or does it also model more complex manufacturing processes that incorporate multiple assets and material flows? How, specifically, does the platform address these differences?

  • Does the digital twin infrastructure include a ‘class’ concept that captures the differences among similar assets and processes to facilitate scaling of solutions? (See the sketch following this list.)

  • Does the platform include packaged applications that are designed to accelerate time to value for the most common manufacturing use cases? Do these applications scale and how?

  • Does the digital twin infrastructure support algorithms that are automatically optimized to incorporate differences and changes across similar assets and processes?

  • Has the platform been proven to scale? For an individual customer, to how many assets, processes, lines, digital twins, connected devices and sensors, physical locations, and predictive models have platform-derived solutions been scaled?

  • Are actionability and operationalization through alerts, notifications, and workflows designed into the platform? How much coding or customization is required? Is scalability designed in and how?
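
On the ‘class’ question above, the following minimal sketch uses invented names and is not any particular platform’s model. It shows the basic idea: an asset class captures what similar assets share, each instance carries only its differences, and rolling a solution out to a new asset becomes instantiation rather than a rebuild.

```python
# A minimal sketch of a digital twin 'class' concept, with invented names:
# the class captures what similar assets share; instances capture differences.
from dataclasses import dataclass, field

@dataclass
class AssetClass:
    """Template shared by all assets of one type (e.g., a CNC mill)."""
    name: str
    metrics: list[str]
    default_limits: dict[str, float]

@dataclass
class AssetInstance:
    """A concrete asset: inherits the class, overrides only what differs."""
    asset_id: str
    cls: AssetClass
    limit_overrides: dict[str, float] = field(default_factory=dict)

    def limit(self, metric: str) -> float:
        return self.limit_overrides.get(metric, self.cls.default_limits[metric])

cnc = AssetClass("cnc_mill", ["spindle_temp_c"], {"spindle_temp_c": 80.0})
mill_a = AssetInstance("mill-a", cnc)                            # stock config
mill_b = AssetInstance("mill-b", cnc, {"spindle_temp_c": 70.0})  # older unit

# Scaling to a new asset means instantiating the class, not rebuilding
# the solution, which is what makes the approach scale.
print(mill_a.limit("spindle_temp_c"), mill_b.limit("spindle_temp_c"))  # 80.0 70.0
```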

These are just a few of the questions and issues that should be considered when evaluating the ability of an IIoT platform to generate sustainable and scalable solutions. When it comes to selecting an IIoT platform, the ‘safe’ option may not always be the best option.

Catch up on the first two parts of this series: The Stack and the Twin and The Ubiquitous Connect and Visualize Platform.

Post by Dave Westrom
March 6, 2023
Dave Westrom is the current Vice President of Sales at TwinThread. With over 30 years in executive roles at well-regarded IIoT and software companies such as GE, Wonderware (now AVEVA), Lighthammer Software, and MachineMetrics, Dave is known as a growth accelerator of cutting-edge technologies. As an industrial leader, he has formed and successfully exited multiple startup ventures. Most notably, Dave headed Business Development at ThingWorx, an IoT and AR platform for the industrial enterprise, continuously expanding its partner ecosystem until PTC’s 2016 acquisition.

Dave’s a seasoned and savvy business strategist, a builder of global partner/customer networks, and a revenue driver. His successful career is a testament to his keen sense of navigating the complexities of the tech world, consistently identifying opportunities and pushing innovation.

Dave received his BS degree in mechanical engineering from the University of Maryland. Today, he continues to shape TwinThread’s growth and deepen his expertise from his desk in the Greater Philadelphia Area.