
Getting More from OPC A&E

Easily access OPC A&E from multiple network sources, or convert it to OPC DA, UA and other protocols using DataHub middleware.

Air Liquide uses conditional bridging for robust control

A recent article in CONTROL magazine tells how engineers at Air Liquide now have a standard way to connect the company’s air separation units.  These units need to work with a variety of distributed control systems in a number of locations. The challenge was to build a system flexible enough to connect to the different control systems without having to reconfigure each of them.

Using the Bridging interface in Skkynet’s DataHub industrial middleware, along with a custom DataHub script created in collaboration with Software Toolbox, Air Liquide has successfully installed their first system, and now plans to expand to more locations in a standard way.

“When Air Liquide shared their requirements with us, the DataHub came immediately to mind as the easiest and most reliable approach,” said Win Worrall of Software Toolbox, who worked closely with the engineering team.

Digital Twins Thrive on Data Integration

Digital twins. The term was coined only ten years ago, but the concept is rapidly becoming a must-have in the manufacturing sector. Last year a Gartner poll found that 62 percent of respondents expected to be using digital twin technology by the end of this year, although only 13 percent were actually using it at the time. A key factor in this sudden interest is that “digital twins are delivering business value and have become part of enterprise IoT and digital strategies.”

What exactly are digital twins, and why are they getting so much attention lately? A digital twin is made up of three basic components: a physical system, a virtual representation of it, and the data that flows between them. The physical system could be an individual device, a complex machine, a whole production line, or even an entire factory. The virtual representation can be as complex as necessary to represent the system. The data connection keeps the virtual twin as closely in sync as possible with the physical twin, often tracking and updating changes in real time.
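
As a rough illustration of those three components (not any particular product's API; every name here is invented), a virtual twin can be modeled as the last known state of its physical counterpart, kept current over a data connection:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

# Purely illustrative model of a digital twin's three components:
# a physical system, a virtual representation, and the data
# connection that keeps them in sync. All names are hypothetical.

@dataclass
class VirtualTwin:
    """Virtual representation: the last known state of the physical system."""
    name: str
    state: dict = field(default_factory=dict)   # point name -> latest value
    last_update: Optional[datetime] = None

    def apply_update(self, point: str, value: float) -> None:
        """The data connection: apply one change reported by the physical side."""
        self.state[point] = value
        self.last_update = datetime.now(timezone.utc)

# Updates arriving from a (simulated) physical pump:
pump = VirtualTwin("pump-01")
pump.apply_update("flow_rate", 42.7)
pump.apply_update("bearing_temp", 61.3)
print(pump.state)   # {'flow_rate': 42.7, 'bearing_temp': 61.3}
```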

The Value and Challenge of Data Integration

A digital twin operating in isolation is useful, but the real rewards come through making connections. Data integration between multiple sub-components of a digital twin, or between multiple digital twins, is key when advancing beyond simple pilot projects. “The ability to integrate digital twins with each other will be a differentiating factor in the future, as physical assets and equipment evolve,” says the Gartner report.

There are at least three types of relationships (a short code sketch follows this list):

  • Hierarchical, in which digital twins can be grouped together into increasingly complex assemblies, such as when the digital twins for several pieces of equipment are grouped into a larger digital twin for a whole production line.
  • Associational, where a virtual twin for one system is connected to a virtual twin in another system, in the same way that their physical counterparts are interrelated, such as wind turbines connected to a power grid.
  • Peer-to-peer, for similar or identical equipment or systems working together, like the engines of a jet airplane.
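
Extending the hypothetical VirtualTwin sketch above, the three relationship types can be expressed as groupings and links between twins; again, every name here is invented for illustration:

```python
# Hierarchical: equipment twins grouped into a production-line twin.
class CompositeTwin:
    def __init__(self, name: str, children: list):
        self.name = name
        self.children = children   # sub-twins of this assembly

line = CompositeTwin("extrusion-line-1",
                     [VirtualTwin("pump-01"), VirtualTwin("press-02")])

# Associational: twins linked the way their physical counterparts are.
associations = [("wind-turbine-07", "power-grid-north")]

# Peer-to-peer: similar or identical units working together.
engine_peers = ["engine-left", "engine-right"]
```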

Making these connections is not always easy. A recent publication from the Industrial Internet Consortium (IIC), titled A Short Introduction to Digital Twins, puts it this way: “Since the information comes from different sources, at different points in time and in different formats, establishing such relations in an automatic way is one of the major challenges in designing digital twins.”

The IIC article briefly discusses some of the technical aspects of this kind of integration, such as:

  • Connectivity, the necessary first step for data integration.
  • Information synchronization, which keeps a virtual twin in sync with its physical twin, and among multiple connected twins, maintaining a history and/or real-time status as required (see the sketch after this list).
  • APIs, which allow digital twins to interact with other components of a system, and possibly with other digital twins as well.
  • Deployment between the edge and the cloud, which pushes data beyond the OT (Operations Technology) domain to the IT domain, that is, from the physical twin to the virtual twin.
  • Interoperability between systems from different vendors, which may be necessary to gain a more complete picture of the total system functionality.
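
To make the synchronization item concrete, here is a minimal, hypothetical sketch of maintaining a real-time status together with a bounded history as updates arrive:

```python
from collections import deque
from datetime import datetime, timezone

# Hypothetical synchronization record: a real-time status plus a
# bounded history of timestamped values, as the IIC list describes.

class SyncedPoint:
    def __init__(self, name: str, history_size: int = 1000):
        self.name = name
        self.value = None                            # real-time status
        self.history = deque(maxlen=history_size)    # (timestamp, value)

    def update(self, value: float) -> None:
        """Apply an update from the physical twin, keeping history."""
        self.value = value
        self.history.append((datetime.now(timezone.utc), value))

temp = SyncedPoint("reactor_temp")
for reading in (98.2, 98.6, 99.1):
    temp.update(reading)
print(temp.value, len(temp.history))   # 99.1 3
```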

Another useful resource, Digital Twin Demystified from ARC Advisory Group, identifies data connectivity, collection, tracking volume & fidelity, and ensuring the quality of real-time data as being “key challenges associated with using real-time and operational data” in digital twins.

A Good Fit

Skkynet’s software and services are well-positioned to provide the kind of data integration that digital twins require. Most data on an industrial system is available to an OPC client like the DataHub, which ensures robust connectivity. Virtually any other connection to or between digital twins, such as from legacy hardware or custom software, is possible using the DataHub’s open APIs.

Real-time data mirroring between DataHubs can handle the synchronization needed for tight correlation between the physical and virtual systems. The secure-by-design architecture of DHTP provides a proven way to connect twins across insecure networks or the Internet, even through a DMZ, to ensure the highest level of security for both the physical twin on the OT side and the virtual twin on the IT side.
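
The DMZ pattern can be pictured with a toy relay: both sides make only outbound connections to an intermediary host in the DMZ, so no inbound firewall ports are opened on either network. The following is a generic sketch of that idea, not DHTP itself; all ports and names are assumptions:

```python
import socket
import threading

# Toy TCP relay illustrating the DMZ pattern: both the OT side and the
# IT side dial OUT to this intermediary, so neither network opens an
# inbound firewall port. A generic sketch only -- not DHTP itself.

def pipe(src: socket.socket, dst: socket.socket) -> None:
    """Copy bytes from one connection to the other until EOF."""
    while True:
        data = src.recv(4096)
        if not data:
            break
        dst.sendall(data)

def relay(ot_port: int = 9001, it_port: int = 9002) -> None:
    listeners = []
    for port in (ot_port, it_port):
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        s.bind(("0.0.0.0", port))
        s.listen(1)
        listeners.append(s)

    ot_conn, _ = listeners[0].accept()   # outbound connection from OT side
    it_conn, _ = listeners[1].accept()   # outbound connection from IT side

    threading.Thread(target=pipe, args=(ot_conn, it_conn), daemon=True).start()
    pipe(it_conn, ot_conn)

if __name__ == "__main__":
    relay()
```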

By supporting the most popular industrial communications protocols, and through secure, real-time data mirroring, Skkynet software and services are often used to build fully integrated systems out of components from different vendors. A recent example is the TANAP project, in which DataHub software was used to integrate OPC A&E (Alarm and Event) data from ABB systems with systems from other suppliers, effectively creating a virtual digital twin of the entire 1,800 km pipeline.

Digital twinning can be seen as one aspect of the whole area of digital transformation in industry. As companies move towards digitizing their operations, the ability to create a virtual twin of each component, machine, production line, or plant, and to connect that twin to their IT systems, will put better control of production into the hands of managers and executives, leading to greater efficiencies. The success of this undertaking, at every step of the way, depends on secure data integration among the digital twins.

Case Study: TEVA API Pharmaceuticals, Hungary

TEVA combines tunnelling and aggregation to network OPC data through a firewall

Laszlo Simon is the Engineering Manager for the TEVA API plant in Debrecen, Hungary. He had a project that sounded simple enough. Connect new control applications through several OPC stations to an existing SCADA network. The plant was already running large YOKOGAWA DCS and GE PLC control systems, connected to a number of distributed SCADA workstations. However, Mr. Simon did face a couple of interesting challenges in this project:

  • The OPC servers and SCADA systems were on different computers, separated by a company firewall. This made it extremely difficult to connect OPC over a network, because of the complexities of configuring DCOM and Windows security permissions.
  • Each SCADA system needed to access data from all of the new OPC server stations. This meant Mr. Simon needed a way to aggregate data from all the OPC stations into a single common data set.
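
The aggregation half of the problem can be pictured in a few lines of code. This is a generic sketch, not DataHub configuration; the station and tag names are invented:

```python
# Hypothetical aggregation step: merge the point sets from several OPC
# stations into one common data set, using the station name as a prefix
# so identical tag names stay distinct.

def aggregate(stations: dict) -> dict:
    combined = {}
    for station, points in stations.items():
        for tag, value in points.items():
            combined[f"{station}.{tag}"] = value
    return combined

stations = {
    "OPC1": {"TI100": 78.4, "PI101": 2.31},
    "OPC2": {"TI100": 65.0, "FI205": 12.8},
}
print(aggregate(stations))
# {'OPC1.TI100': 78.4, 'OPC1.PI101': 2.31, 'OPC2.TI100': 65.0, 'OPC2.FI205': 12.8}
```

With DataHub software this merging is a configuration step rather than custom code, but the effect is the same: every SCADA workstation sees one combined point set.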

After searching the web, Mr. Simon downloaded and installed DataHub® software. Very quickly he was able to connect a DataHub instance to each of his OPC servers, and determine that he was reading live process data from TEVA’s new control systems. He was also able to easily set up OPC tunnelling links between the OPC server stations and the SCADA workstations, by simply installing another DataHub instance on each SCADA computer and configuring it to connect to the OPC server stations. This unique combination allows him to view data from both OPC servers on either SCADA system.

“I wanted to reduce and simplify the communication over the network because of our firewall. It was very easy with DataHub software,” said Mr. Simon after the system was up and running. Currently about 7,000 points are being transferred across the network in real time. “In the future, the additional integration of the existing or new OPC servers will be with DataHub technology.”

Case Study: Plastics Manufacturer, Scandinavia

Leading plastics manufacturer uses live process data to optimize production, saving time and materials

One of Scandinavia’s leading plastics manufacturers has chosen DataHub® software from Cogent Real-Time Systems (a subsidiary of Skkynet) to extract data and interact with their state-of-the-art plastic manufacturing equipment. The firm can now access any desired process data for the purposes of engineering analysis and enterprise-level resource planning.  DataHub software was the only additional piece of software required to realize substantial savings of time, materials, and production costs.

“The DataHub application is exactly the kind we needed,” said the project coordinator. “Our system is extensive, and we need to visualize a lot of production parameters. We looked at other solutions but they were too expensive and more complicated.”

When the company installed new equipment recently, the necessary system integration grew very complex. Progress was slow. After almost a year they were facing a deadline and had little to show for their time and effort. The goal was to pull together data from 15 machinery units and feed it in real time into the company’s business processing systems, and, if possible, to enable plant engineers to view and work with the live data as well. When they found DataHub software they were pleased to learn that most of the work had already been done.

The first test was to connect a DataHub instance to an OPC server and put live data into ODBC databases, Excel spreadsheets, and web browsers, as well as to aggregate OPC servers and tunnel data across a network. DataHub technology proved to be easy to use and reliable, and it performed remarkably well. The next step was to set up a test system.

The test system connected all of the OPC servers for the plant’s plastics production machines to a central DataHub instance. Another DataHub instance, at a network node in the engineering department, was connected to the central instance by a mirroring connection, tunnelling data across the network. This second DataHub instance was then connected to an Excel spreadsheet to give a live display of the data in real time. When a machine starts up on the production line, the chart comes to life: cells spontaneously update values and bar charts spring into existence.

The engineering department developed a custom TCP application that uses the DataHub C++ API to make a direct connection from the DataHub instance to their SQL Server database. Once connected, that database is updated within milliseconds of any change in the plastics-manufacturing machinery (a generic sketch of this approach follows the list below). From the SQL Server database the data is accessed by the company’s ERP and accounting software. Using DataHub software in these ways allows the company to:

  • Aggregate the data from all machinery into one central location.
  • Distribute the data across the network to various users.
  • Do decimal conversions of the data as it passes through the DataHub instance.
  • Put selected subsets of data into Excel for engineers to view and run calculations on.
  • Feed values into a SQL Server database in the company’s IT and business processing system. The OPC points are read-only to ensure a clean separation between the management and production areas.
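
The article does not show the plant’s actual code, and the sketch below is not the DataHub C++ API. It is a generic stand-in for the idea: a small TCP listener receives "point,value" updates and records each change in a database, with sqlite3 substituting for SQL Server to keep the example self-contained:

```python
import sqlite3
import socketserver
from datetime import datetime, timezone

# Hypothetical stand-in for the plant's custom application: receive
# "point,value" lines over TCP and record each change in a database.
# sqlite3 substitutes for SQL Server to keep the sketch self-contained.

db = sqlite3.connect("process_data.db")
db.execute("""CREATE TABLE IF NOT EXISTS updates
              (stamp TEXT, point TEXT, value REAL)""")

class UpdateHandler(socketserver.StreamRequestHandler):
    def handle(self):
        for line in self.rfile:                      # one update per line
            point, value = line.decode().strip().split(",")
            db.execute("INSERT INTO updates VALUES (?, ?, ?)",
                       (datetime.now(timezone.utc).isoformat(),
                        point, float(value)))
            db.commit()                              # change is visible at once

if __name__ == "__main__":
    with socketserver.TCPServer(("0.0.0.0", 9000), UpdateHandler) as srv:
        srv.serve_forever()
```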

“This system pays for itself,” said a company spokesman, “and we save money in many ways. We have seen substantial gains in productivity and performance because we can monitor our processes far more effectively. Our accounting and planning departments have, for the first time ever, an up-to-the-second record of actual production variables and statistics. At the same time, our engineering staff can use real-time data in their calculations, and feed the results directly back into the process.”

DataHub technology also saved substantial programming costs. The time saved on development work alone has paid for the system many times over. With a single tool the project coordinator has met the various needs of both the engineers and company managers. “The software is easy to install and it works well,” he said. “It’s at the correct level for our needs.”

Case Study: City of Montreal, Canada

The City of Montreal uses DataHub software for data connectivity in a 10-billion-dollar project to upgrade the quality of drinking water production and distribution.