How Tignis Digital Twins Power The Future Of AI-Based Process Control

SEPTEMBER 28, 2021  Author: Christopher J. Rust   Category: Thought Leadership

The Internet of Things and the Industrial Internet of Things continue to develop with advances in Artificial Intelligence and Machine Learning. AI and ML have been widely deployed for consumer internet applications with great effect, but they will have to be substantially reimagined for the industrial world. A new approach that adapts AI/ML to the world of the physical sciences is required to enable solutions offering cost savings, efficiency gains, and improved yields that were not possible before. Startups like Tignis are pioneering these advances to optimize industrial operations and make Artificial Intelligence and Machine Learning available to almost any application.
The benefits to the Internet economy of big data analytics using Artificial Intelligence and Machine Learning (AI/ML) cannot be overstated. Today's leading Internet companies have pushed the frontier of AI/ML as a foundational enabling technology to power their advertising revenues. The monetization of the "attention economy" requires AI/ML to aggregate and analyze massive data sets. Similarly, large retailers use AI/ML to power their merchandising strategies, personalized commerce, and upsell/cross-sell opportunities (i.e., "if you like this, you will like that"). To date, this has largely been confined to the purely digital domain.
The world is in the early days of a major leap forward. AI/ML is moving from the "digital domain" of big data analytics into the "analog domain" of controlling real-world physical machines and the complex systems built from those machines. We are on the verge of centralized computers using AI/ML to actively control machines at massive scale. To enable this leap, investment in AI innovation has surged over the last five to six years: per CB Insights, over $50 billion of venture funding has gone into AI/ML companies in the last four quarters alone.
There has been much written over the past half-decade about the Internet of Things (IoT) and the Industrial Internet of Things (IIoT). A global race is underway to instrument the world with tens of billions of new network-connected sensors and data collection probes. This tsunami of new data sources is laying the foundation for a not-too-distant future where computers using AI/ML will control and optimize the performance of the highly complex, large-scale industrial processes that make our daily lives possible.
Industries such as energy production, manufacturing, healthcare, and transportation are being re-imagined with AI/ML running in centralized cloud computing and operations centers controlling industrial processes that have historically been managed by human experts.  The legions of skilled process and operations engineers will not be replaced by AI/ML, but they will be given “superpowers” to do more work, in less time, with greater accuracy.  This article introduces foundational frontier technologies such as “Digital Twins” and “real-time analytics” that are emerging as the key enablers for a new era of industrial efficiency and productivity gains.
Typically, machine learning requires terabytes or even petabytes of data to train a massive ML model. The Internet economy is based on aggregating massive data sets from user clickstreams to determine likes, dislikes, interests, affinities, and so on. In the industrial space, it is common that a data scientist has only megabytes or gigabytes of data for training. To make AI/ML useful for the industrial world, AI/ML models need to be aware that it is governed by the physical sciences: statics, dynamics, thermodynamics, fluid mechanics, biology, chemistry, etc. Only then can the benefits of machine learning be realized for real-time AI/ML-based Process Control (AIPC) while filling in the gaps in small data sets. A nuanced understanding of the physical sciences that govern the behavior and performance of an industrial process, or a system within that process, is critical to tracking, controlling, and optimizing its performance.
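One common way to fill the gaps in a small industrial data set is a hybrid "physics plus ML" model: let a first-principles equation carry most of the signal, and fit a simple ML correction to the residual. The sketch below is purely illustrative (the cooling scenario, sensor bias, and all names are assumptions, not the Tignis method), but it shows why a dozen samples can suffice when physics does the heavy lifting.

```python
import numpy as np

# Physics baseline: Newton's law of cooling, T(t) = T_env + (T0 - T_env) * exp(-k t)
def physics_model(t, t_env=25.0, t0=90.0, k=0.1):
    return t_env + (t0 - t_env) * np.exp(-k * t)

# Tiny "industrial" data set: a few noisy observations with an
# unmodeled effect (e.g., a sensor bias plus a slow secondary heat source).
rng = np.random.default_rng(0)
t_obs = np.linspace(0, 30, 12)                    # only 12 samples
true_offset = 2.0 + 0.05 * t_obs                  # what the physics misses
y_obs = physics_model(t_obs) + true_offset + rng.normal(0, 0.2, t_obs.size)

# ML step: fit a simple linear model to the *residual* only.
# The physics carries most of the signal, so a small data set suffices.
residual = y_obs - physics_model(t_obs)
A = np.vstack([np.ones_like(t_obs), t_obs]).T     # design matrix [1, t]
coef, *_ = np.linalg.lstsq(A, residual, rcond=None)

def hybrid_model(t):
    return physics_model(t) + coef[0] + coef[1] * t

# The hybrid model should beat physics alone on held-out points.
t_test = np.linspace(0, 30, 100)
y_true = physics_model(t_test) + 2.0 + 0.05 * t_test
err_physics = np.abs(physics_model(t_test) - y_true).mean()
err_hybrid = np.abs(hybrid_model(t_test) - y_true).mean()
print(f"physics-only MAE: {err_physics:.2f}, hybrid MAE: {err_hybrid:.2f}")
```

The same residual-learning idea scales up: swap the linear fit for any regressor, and the physics model for whatever governs the process at hand.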
Innovation at the collaborative boundary between physical sciences (broadly defined) and AI/ML is where the next major wave of innovation will be found. We call this “AI+X” for short.

Industrial machines are increasingly becoming "cyber-physical": they are being infused with a proliferation of new sensors, data collection probes, and network connectivity to transport the data they generate to a centralized location for analysis. However, the first step in using these new sources of machine-generated data is to check their validity. AI/ML is fundamentally about analyzing large data sets to search for and identify patterns, and to classify elements of a data set into groups based on pre-determined characteristics. This pattern matching is typically done by minimizing an error function between what you are observing and what you are seeking to find. Sensor data from industrial processes is often inaccurate due to machine calibration issues, tolerances, or failures somewhere along the chain from data collection to analytics. Computers do only, and exactly, what they are told, and traditional AI/ML from the purely digital domain analyzes whatever data it is fed. The physical sciences provide the rules to validate the veracity of real-time IoT/IIoT sensor data before it is analyzed by AI/ML control planes. This interdisciplinary collaboration between physics, broadly defined, and AI/ML is the key enabler for machines to control complex industrial processes.
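A minimal sketch of such a physics-based plausibility gate might look like the following. The rules here (a fixed physical range and a maximum rate of change limited by thermal mass) and all names are hypothetical illustrations, not any particular vendor's implementation:

```python
# Hypothetical plausibility check: before feeding sensor readings to an
# ML model, reject values that violate simple physical constraints.
def validate_readings(readings, lo, hi, max_rate, dt=1.0):
    """Flag readings outside the physical range [lo, hi] or changing
    faster than max_rate units/second (e.g., limited by thermal mass)."""
    flags = []
    prev = None
    for r in readings:
        ok = lo <= r <= hi
        if ok and prev is not None:
            ok = abs(r - prev) / dt <= max_rate
        flags.append(ok)
        if ok:
            prev = r  # only trust plausible values as the next baseline
    return flags

# A water-loop temperature sensor: 0-100 degrees C, at most 2 degrees C/s change.
samples = [20.1, 20.3, 20.2, 85.0, 20.4, -5.0, 20.5]
flags = validate_readings(samples, lo=0.0, hi=100.0, max_rate=2.0)
print(flags)  # [True, True, True, False, True, False, True]
```

The 85.0 spike is rejected not because it is out of range but because water in a loop cannot physically heat 65 degrees in one second; that is exactly the kind of rule a purely digital-domain model would never apply on its own.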
Consider the evolution of the modern aircraft engine to understand this industry trend toward machines that are 1) heavily instrumented, 2) purpose-built, and 3) cloud-managed, with AI+X optimizing their performance.
The aircraft engine industry has gone through a major transformation over the last two decades. Aircraft engine manufacturers have gone from selling a discrete product (i.e., jet engines) to new business models (i.e., leasing/renting of jet engines) and more recently to a service model (e.g., selling horsepower). To support this transition, aircraft engine manufacturers such as General Electric have enormously improved their instrumentation, increasing the number of sensors in an engine from two hundred two decades ago to four thousand today. This new level of instrumentation provides vastly better situational awareness, helps the manufacturer improve product quality, and opens up creative new ways of monetizing their investments.
The automobile industry is another example of how the power of AI/ML is being combined with vast amounts of sensor data and the physical sciences to actively control a complex real-world process. In this case, the car is a purpose-built physical appliance/robot that exists to provide transportation-as-a-service. Automobiles began as relatively simple machines that could be maintained by a reasonably handy person with some simple tools. Over the past decade, cars evolved with onboard computers that perform advanced system diagnostics for basic maintenance tasks. Something as basic as tire pressure and tire wear is now continuously tracked with purpose-built wireless sensors. Modern cars have effectively become data centers on wheels. Driver assist via 360-degree camera arrays, collision avoidance systems, integrated navigation controls, and even "pet modes" that sense the presence of a pet after a car has been turned off and proactively control the climate in the vehicle are commonplace. Every major auto manufacturer has a major R&D initiative for autonomous vehicles. AI/ML is foundational to the emerging systems for computer vision, radar, lidar, precision geo-location, collision avoidance, and more that will make autonomous vehicles increasingly common over the next decade. Each autonomous vehicle will be actively managed by a "Digital Twin" running in the cloud, where sensor data can be ingested, rationalized, verified as viable, aggregated, and put to good use.
According to Strategy Analytics, about 22 billion devices were already connected to the Internet as of 2018, a number expected to grow to about 50 billion by 2030. This growth will be driven primarily by more connected machines, and it enables ML in the industrial space.
As the need for bringing AI and ML to the industrial world emerges, it makes sense to focus on high-value industrial processes which are labor-intensive and/or hazardous.  Examples include, but are not limited to:

•    Energy (both traditional and renewable)
•    Semiconductor manufacturing
•    Automotive/transportation
•    Healthcare

These industries have highly talented process and control engineers, but they do not have the tools to make sense of the deluge of data coming from the connected machines that they are now tasked with managing.
To create the factories of the future envisioned by Industry 4.0 (a term popularized by Klaus Schwab, founder of the World Economic Forum), the following basic steps need to happen:
1.    The flood of data needs to be ingested in a cost-efficient manner.
2.    The data needs to be put into a form that is useful, relevant and actionable.
3.    Complex processes and end-to-end workflows need to be understood.
4.    Representative digital models of the industrial workflows need to be built.
5.    AI/ML techniques need to be applied to enhance workflow outcomes.
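The five steps above can be sketched as a single pipeline around a digital model of one machine. Everything here, from the class name to the threshold policy, is an illustrative assumption rather than the Tignis API; the point is only the shape of the flow: ingest, normalize, track state, then apply analytics.

```python
# Minimal sketch of the five steps as a pipeline (all names illustrative).
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    """Step 4: a representative digital model of one machine/workflow."""
    name: str
    state: dict = field(default_factory=dict)

    def ingest(self, record):
        """Step 1: cost-efficient ingest; step 2: normalize to a usable form."""
        cleaned = {k: float(v) for k, v in record.items()}
        self.state.update(cleaned)          # step 3: track the process state

    def recommend(self):
        """Step 5: apply ML/analytics (placeholder threshold policy here)."""
        return {k: "inspect" for k, v in self.state.items() if v > 100.0}

twin = DigitalTwin("pump-7")
twin.ingest({"flow_lpm": "88.4", "vibration_hz": "120.5"})
print(twin.recommend())  # {'vibration_hz': 'inspect'}
```

In a real deployment, `recommend` would be a trained model and `ingest` would sit behind a streaming pipeline, but the division of labor across the five steps stays the same.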
With current technologies, this work requires a cross-functional set of skills and can thus be prohibitively expensive.
As a precursor, IT operations management has evolved over the last decade from stage 1 (monitoring and observability) to roughly stage 3 (recommendations) today, as shown in the graphic below.

Industrial process control will have to go through a similar transformation if it is to deliver superior business outcomes.  AI and ML will be the foundational technology to enable the evolution along this continuum.
Based in Seattle and founded by former VMware Cloud Management CTO Dr. Jonathan Herlocker, Tignis has built a revolutionary platform that is uniquely positioned to address the AI-based Process Control (AIPC) challenges and opportunities outlined above.
Tignis recently launched the world’s first comprehensive solutions suite to quickly build, monitor, deploy and optimize industrial processes with Digital Twins based on AI/ML and physics. The Tignis PAICe Product Suite has three components:

•    PAICe Builder, an analytics tool easy enough for anyone to use
•    PAICe Monitor, which allows for easy deployment to the private or public cloud
•    PAICe Maker, which deploys ML-based algorithms that improve over time

The key capabilities of the suite can be summarized as follows: 

•    Modeling a complex industrial process
•    Web-scale data ingestion and rationalization so the data is useful
•    Simple, efficient creation and deployment of ML models to glean insights from data
•    ML-based closed-loop control of the processes to optimize performance
•    Interactive data analysis powered by a new intuitive language

Simply put, the Tignis platform enables Internet-scale big data aggregation and validation across all the connected elements in any high-value industrial process. Tignis validates the machines' logs and sensor data against physical science parameters, then provides an intuitive representation of the entire end-to-end process. In other words, it creates a "Digital Twin" of the physical system that runs in the cloud but is continuously fed with, and powered by, real-time data from the physical system.
What makes the Tignis approach different is the introduction of a brand-new, low-code programming language called Digital Twin Query Language (DTQL), the first language designed specifically to build machine analytics on digital twins. This puts the power of AI/ML in the hands of process engineers and helps them achieve process improvements not previously possible, without requiring them to write a line of software. Engineers can now utilize surrogate machine learning models that are more accurate and up to one million times faster than traditional physics-based simulations. Thanks to DTQL and the broader PAICe product suite from Tignis, process engineers can now leverage all the historical data they have collected and convert their deep expertise into machine learning-based predictive models that can be easily deployed and managed across their entire process ecosystem. The outcome is quicker production, better quality control, and faster time to market. In essence, the Tignis PAICe product suite gives the world's process and operations engineers superpowers so they can do their job of keeping factories working at peak performance better than ever before.
To put the power of the Tignis platform into practice, let's take the example of the semiconductor manufacturing industry. The ongoing global shortage of semiconductor chips is impacting a host of industries, from phones to laptops to cars to smart appliances to equipment in hospitals. According to Goldman Sachs, the chip shortage could reduce U.S. GDP by as much as 1% in 2021. To make matters worse, the U.S. share of global semiconductor manufacturing capacity has dropped to 12% today from 37% in 1990, according to a study by the Boston Consulting Group. To make the U.S. competitive in this industry, in addition to the financial capital expected from the bill recently passed by the U.S. Senate, the semiconductor industry will need new technologies that accelerate operational efficiency. Tignis' PAICe Maker can be a key enabler of the next phase of innovation in semiconductor manufacturing. It is therefore not surprising that Synopsys and Tokyo Electron have elected to use the Tignis platform to optimize their processes. However, the applicability of Tignis' technology is not limited to the semiconductor industry, as evidenced by the use of the Tignis PAICe Product Suite at energy companies such as Optimum Energy.
To date, the IoT and IIoT have largely focused on connecting more devices to the Internet. This has very little actual business value unless the IoT/IIoT data can be ingested, validated, then used to drive measurable business value from productivity gains, better yields, improved uptime, etc.
Tignis, the emerging field of AI+X (in this case, AI plus the laws of physics), and Digital Twins that run in the cloud while controlling real-world systems can together move the full potential of AI/ML from the digital domain to the analog/physical domain.
To unlock this potential, the Tignis approach is anchored on the following key principles: 

•    Instrumenting the physical world to enable data collection and aggregation
•    Creating a high-precision state model for a complex system
•    Building a graph of how the machines are interconnected end-to-end
•    Ingesting massive amounts of sensor data from each machine in the process
•    Enriching the raw data with metadata and validating its plausibility
•    Leveraging the “Digital Twin” to enable a new level of AI-based process control
•    Using Tignis DTQL to enable this entire flow in a no-code/low-code approach
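The "graph of how the machines are interconnected" principle deserves a concrete illustration. Once a process is represented as a directed graph, even a simple traversal answers operational questions, such as which downstream machines a faulty sensor can affect. The machine names and fault-propagation policy below are invented for illustration only:

```python
# Illustrative sketch: represent the end-to-end process as a directed
# graph, then trace which downstream machines are affected by a fault.
from collections import defaultdict, deque

edges = [("boiler", "heat_exchanger"), ("heat_exchanger", "turbine"),
         ("pump", "heat_exchanger"), ("turbine", "generator")]

downstream = defaultdict(list)
for src, dst in edges:
    downstream[src].append(dst)

def affected_by(machine):
    """Breadth-first walk over everything downstream of a faulty machine."""
    seen, queue = set(), deque([machine])
    while queue:
        for nxt in downstream[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

print(sorted(affected_by("pump")))  # ['generator', 'heat_exchanger', 'turbine']
```

The same graph also gives plausibility checks more context: a reading at the turbine can be cross-validated against what the boiler and pump upstream of it report.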

In essence, the Tignis platform has put Machine Learning in the hands of process engineers who have never had access to it until now. A variety of industries, from semiconductor manufacturing to energy to automotive and transportation systems, will now be able to leverage Machine Learning-based control algorithms to outperform classic process control and deliver process outcomes previously out of reach. Some of the harshest, most labor-intensive, backbreaking tasks, such as weed abatement and crop harvesting in farming, will be automated by AI/ML controlling fleets of heavily instrumented, purpose-built cyber-physical robots. Each of these robots will have a "digital twin" ingesting its real-time data, tracking its performance, and providing control stimulus, with human operators addressing anomalies. Much work remains to realize this grand vision of increased automation and productivity, but the Tignis AIPC suite has taken a great leap forward to democratize the full potential of AI-based process control with Digital Twins for the IIoT. Please visit www.tignis.com for more reading.


About the author: Chris Rust is the founding partner of Clear Ventures. Rust earned a BS and an MS in electrical engineering at the University of Lowell, then an MS in telecommunications engineering and an MS in engineering management from the University of Colorado. Rust held engineering and product management roles at MITRE, US West, and broadband pioneer Roadrunner, where he was a co-founder and lead architect. After that, Rust spent 14 years at Sequoia Capital and USVP as an early-stage technology investor. He co-founded Clear Ventures in 2014, where the firm helps founders win in business technology and services. Rust is a seed investor in and board member of Tignis.