As a Tignis team member, I am continually excited and surprised by the tools our team puts together. Internally, these tools have helped us deliver incident reports that save our customers time by quickly guiding their attention to incidents of interest. One such tool, Tignis's interactive data analysis application powered by DTQL, has become so compelling that it now stands out in its own right. DTQL stands for Digital Twin Query Language: it models the interactions between physical components or assets in a system or process, then navigates those relationships to strengthen analysis techniques.
The exciting news for you is that we are releasing this previously internal-only functionality for use by any team looking to get advanced analytical value out of the sensor data from their physical equipment. We are already seeing strong interest from automation and maintenance teams. Tignis's analysis application powered by DTQL is a line-by-line interpreter and graphing explorer that lets you declare rules using logic, mathematical, statistical, or ML functions, then plot the results or use them to highlight portions of interest. My experience with this tool has been pleasant and illuminating: it continually surprises me with its capabilities, and I am amazed at the anomalies it picks up that would otherwise have escaped me. I think this application would be of immediate help to many people. Allow me to tell you why I love it so much.
Currently, many maintenance teams use Excel for data analysis. Don't get me wrong: Excel has its place in the world and is an impressive tool in its own right. For data analysis, however, it can be cumbersome: you wait for the program to load all your data, make sure you have selected the right rows and columns, deal with missing or incorrect data, find and navigate to the plotting feature, check the settings used to plot your data, select more rows and columns, and end up with graphs that have pre-configured time ranges and little ability to navigate within them or zoom in and out. Worse, all of this gets repeated for every analysis you want to do, you run into data-size limitations on top of it, and you can hit formatting issues when exporting to CSV or importing data. All of these issues go away with Tignis's analysis application.
Tignis's analysis application powered by DTQL aims to connect to existing infrastructure. The application was initially developed to be used exclusively by Tignis team members on Tignis's platform, but we have recently added an abstraction interface that allows us to create connectors to pull data in from other sources. Presently, the interpreter can pull data from CSV files and OSIsoft PI servers, but we plan to expand to more data sources in the future. Our aim here is to make it easy to work with your data. In the case of PI, you can simply specify your server connection details and Tignis takes care of the rest.
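To make the idea of a connector abstraction concrete, here is a minimal Python sketch of the pattern: one abstract interface, with each data source implementing the same fetch method. The class and method names are illustrative assumptions, not Tignis's actual API.

```python
import csv
from abc import ABC, abstractmethod


class DataConnector(ABC):
    """Hypothetical connector abstraction: every source implements fetch()."""

    @abstractmethod
    def fetch(self, sensor, start, end):
        """Return (timestamp, value) pairs for one sensor over [start, end]."""


class CsvConnector(DataConnector):
    """Example source: a CSV file with sensor, timestamp, and value columns."""

    def __init__(self, path):
        self.path = path

    def fetch(self, sensor, start, end):
        rows = []
        with open(self.path, newline="") as f:
            for row in csv.DictReader(f):
                # Keep only readings for the requested sensor and time window.
                if row["sensor"] == sensor and start <= row["timestamp"] <= end:
                    rows.append((row["timestamp"], float(row["value"])))
        return rows
```

With this shape, a PI connector (or any future source) would slot in as another `DataConnector` subclass, and the interpreter would not need to know which source it is talking to.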
Setting up a connection with Tignis's application is extremely easy. Once you provide your data historian credentials, the application synchronizes data by timestamp. There are no restrictions on sampling rate, and missing data is handled flexibly. For example, the application makes it easy to carry prior values forward to fill in gaps, which is useful when your system only stores changes in values. If you have sparse data, where some points are simply missing and you don't know the correct value, a more sophisticated approach is to use ML to predict the sensor values. This may sound complicated, but Tignis makes it easy. I'm confident anyone can do it.
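The gap-filling idea is easiest to see in plain Python. This is a generic sketch of carrying the last known value forward (the simple case described above, not Tignis's implementation); the ML-based prediction mentioned for sparse data is beyond this small example.

```python
def forward_fill(values):
    """Carry the last known reading forward into gaps (None entries)."""
    filled, last = [], None
    for v in values:
        if v is not None:
            last = v
        filled.append(last)
    return filled


# A sensor that only records changes leaves gaps between stored values.
readings = [21.5, None, None, 22.0, None, 21.8]
print(forward_fill(readings))  # [21.5, 21.5, 21.5, 22.0, 22.0, 21.8]
```

This behavior matches a change-of-value historian: between recorded changes, the true reading really was the last stored value, so carrying it forward reconstructs the signal exactly.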
We have implemented the capability to work with templates. If you are familiar with OSIsoft PI's element templates, you will understand what this means. If not, I'll briefly explain: templates let you write rules on an asset class, which are then applied to every individual entity (think asset) or element that fits within that template, saving a lot of time and effort. You can then select any individual entity that fits the template from a drop-down menu to validate your rule against it. If you don't have templates and only have a single entity or element you would like to work with, DTQL accommodates that as well.
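The template pattern described above can be sketched in a few lines of Python. This is a hypothetical illustration of the concept (the entity structure, field names, and rule are all made up for this example, and DTQL itself is a query language rather than Python): write one rule against the asset class, then apply it to every entity that fits the template.

```python
def high_temp_rule(entity):
    """One rule, written once for the template: flag readings above the limit."""
    return [t for t, v in entity["readings"] if v > entity["limit"]]


# Two entities that share the same (hypothetical) "pump" template.
pumps = [
    {"name": "pump-01", "limit": 80.0, "readings": [("08:00", 75.0), ("09:00", 85.0)]},
    {"name": "pump-02", "limit": 90.0, "readings": [("08:00", 88.0), ("09:00", 91.5)]},
]

# Apply the single rule across every entity of the template.
violations = {p["name"]: high_temp_rule(p) for p in pumps}
print(violations)  # {'pump-01': ['09:00'], 'pump-02': ['09:00']}
```

Validating against a single entity, as the drop-down menu allows, is just calling the rule on one element instead of mapping it over all of them.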
Tignis's analytics application powered by DTQL has context-based intelligence baked in. If you are writing a rule on a template or entity and click the search button, it pulls up all the entities or templates available in your system. Once you have a target entity or template, clicking the search button again pulls up a list of the sensors or features available to that target. In addition to this context-based search, the search button also lists all of the logic, mathematical, statistical, and ML functions available for use.
Now that I've explained why I personally love the new analysis application powered by DTQL from Tignis, let me tell you some of the things you can do with it. I've had a lot of fun with it, and I think you will too!
All of this makes it possible to notify yourself or your team of a problem after, during, or even before it occurs.
I am excited to see our customers get as much out of this tool as we have internally. Tignis onboarded its first external users to the interactive data analysis application powered by DTQL this quarter and plans to release it more broadly over the coming months. If you're interested in being an early adopter or trying the application, please visit info.tignis.com/dtql-app. Also, keep an eye on our website and social media for product release information!
Steven Herchak has worked at Tignis since January 2020. He is an Integration Engineer with an M.A.Sc. degree from the University of Victoria. Previously, Steven worked with a building automation and controls company and with Schneider Electric. At Tignis he is responsible for working directly with customers to interpret their needs and translate them into live analytics using DTQL.