Getting More from OPC A&E

Easily access OPC A&E data from multiple network sources, or convert it to OPC DA, OPC UA, and other protocols using DataHub middleware.

Air Liquide uses conditional bridging for robust control

A recent article in CONTROL magazine tells how engineers at Air Liquide now have a standard way to connect the company’s air separation units.  These units need to work with a variety of distributed control systems in a number of locations. The challenge was to build a system flexible enough to connect to the different control systems without having to reconfigure each of them.

Using the Bridging interface in Skkynet’s DataHub industrial middleware, along with a custom DataHub script created in collaboration with Software Toolbox, Air Liquide has successfully installed their first system, and now plans to expand to more locations in a standard way.

“When Air Liquide shared their requirements with us, the DataHub came immediately to mind as the easiest and most reliable approach,” said Win Worrall of Software Toolbox, who worked closely with the engineering team.

Digital Twins Thrive on Data Integration

Digital twins. The term was coined only ten years ago, but the concept is rapidly becoming a must-have in the manufacturing sector. Last year a Gartner poll found that 62 percent of respondents expect to be using digital twin technology by the end of this year, although only 13 percent of them were actually using it at the time. A key factor in this sudden interest is that “digital twins are delivering business value and have become part of enterprise IoT and digital strategies.”

What exactly are digital twins, and why are they getting so much attention lately? A digital twin is made up of three basic components: a physical system, a virtual representation of it, and the data that flows between them. The physical system could be an individual device, a complex machine, a whole production line, or even an entire factory. The virtual representation can be as complex as necessary to represent the system. The data connection keeps the virtual twin as closely in sync as possible with the physical twin, often tracking and updating changes in real time.
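The three components can be sketched in a few lines of illustrative Python. Everything here is invented for the example (a real twin would mirror a real device over a live data connection), but it shows how the physical system, the virtual representation, and the data flow between them fit together:

```python
# A minimal illustration of the three components of a digital twin:
# a physical system, its virtual representation, and the data flow
# that keeps the two in sync. All names here are illustrative.

class PhysicalPump:
    """Stands in for a real device reporting sensor values."""
    def __init__(self):
        self.state = {"rpm": 0, "temperature_c": 20.0}

    def read_sensors(self):
        return dict(self.state)

class VirtualPump:
    """The virtual twin: mirrors the last known physical state."""
    def __init__(self):
        self.mirrored_state = {}

    def update(self, snapshot):
        # The data connection: each change on the physical side
        # is propagated to the virtual representation.
        self.mirrored_state.update(snapshot)

physical = PhysicalPump()
twin = VirtualPump()

physical.state["rpm"] = 1450          # the real device changes
twin.update(physical.read_sensors())  # the sync step keeps the twin current
print(twin.mirrored_state["rpm"])     # -> 1450
```

In practice the sync step runs continuously, often in real time, which is what keeps the virtual twin closely correlated with its physical counterpart.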

The Value and Challenge of Data Integration

A digital twin operating in isolation is useful, but the real rewards come through making connections. Data integration between multiple sub-components of a digital twin, or between multiple digital twins, is key when advancing beyond simple pilot projects. “The ability to integrate digital twins with each other will be a differentiating factor in the future, as physical assets and equipment evolve,” says the Gartner report.

There are at least three types of relationships:

  • Hierarchical, in which digital twins can be grouped together into increasingly complex assemblies, such as when the digital twins for several pieces of equipment are grouped into a larger digital twin for a whole production line.
  • Associational, where a virtual twin for one system is connected to a virtual twin in another system, in the same way that their physical counterparts are interrelated, such as wind turbines connected to a power grid.
  • Peer-to-peer, for similar or identical equipment or systems working together, like the engines of a jet airplane.

Making these connections is not always easy. A recent publication from the Industrial Internet Consortium (IIC), titled A Short Introduction to Digital Twins, puts it this way: “Since the information comes from different sources, at different points in time and in different formats, establishing such relations in an automatic way is one of the major challenges in designing digital twins.”

The IIC article briefly discusses some of the technical aspects of this kind of integration, such as:

  • Connectivity is the necessary first step for data integration.
  • Information synchronization keeps a virtual twin in sync with its physical twin, and among multiple connected twins, maintaining a history and/or real-time status, as required.
  • APIs allow digital twins to interact with other components of a system, and possibly with other digital twins as well.
  • Deployment between the edge and the cloud pushes data beyond the OT (Operations Technology) domain to the IT domain, that is, from the physical twin to the virtual twin.
  • Interoperability between systems from different vendors may be necessary to gain a more complete picture of the total system functionality.

Another useful resource, Digital Twin Demystified from ARC Advisory Group, identifies data connectivity, collection, tracking volume and fidelity, and ensuring the quality of real-time data as “key challenges associated with using real-time and operational data” in digital twins.

A Good Fit

Skkynet’s software and services are well-positioned to provide the kind of data integration that digital twins require. Most data on an industrial system is available to an OPC client like the DataHub, which ensures robust connectivity. Virtually any other connection to or between digital twins, such as from legacy hardware or custom software, is possible using the DataHub’s open APIs.

Real-time data mirroring between DataHubs can handle the synchronization needed for tight correlation between the physical and virtual systems. The secure-by-design architecture of DHTP provides a proven way to connect twins across insecure networks or the Internet, even through a DMZ, ensuring the highest level of security for both the physical twin on the OT side and the virtual twin on the IT side.

By supporting the most popular industrial communications protocols, and through secure, real-time data mirroring, Skkynet software and services are often used to build fully integrated systems out of components from different vendors. A recent example is the TANAP project, in which DataHub software was used to integrate OPC A&E (Alarm and Event) data from ABB systems with systems from other suppliers, effectively creating a virtual digital twin of the entire 1800 km pipeline.

Digital twinning can be seen as one aspect of the broader digital transformation under way in industry. As companies move toward digitizing their operations, the ability to create a virtual twin of each component, machine, production line, or plant, and to connect that twin to their IT systems, will put better control of production into the hands of managers and executives, leading to greater efficiencies. The success of this undertaking, at every step of the way, depends on secure data integration among the digital twins.

Case Study: TEVA API Pharmaceuticals, Hungary

TEVA combines tunnelling and aggregation to network OPC data through a firewall

Laszlo Simon is the Engineering Manager for the TEVA API plant in Debrecen, Hungary. He had a project that sounded simple enough: connect new control applications through several OPC stations to an existing SCADA network. The plant was already running large YOKOGAWA DCS and GE PLC control systems, connected to a number of distributed SCADA workstations. However, Mr. Simon did face a couple of interesting challenges in this project:

  • The OPC servers and SCADA systems were on different computers, separated by a company firewall. This made it extremely difficult to connect OPC over the network, because of the complexities of configuring DCOM and Windows security permissions.
  • Each SCADA system needed to access data from all of the new OPC server stations. This meant Mr. Simon needed a way to aggregate data from all the OPC stations into a single common data set.
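The aggregation requirement can be sketched in a few lines of Python. The station and tag names below are invented for illustration, and the DataHub performs this kind of merging natively, but the sketch shows the basic idea: tag sets from several server stations are combined into one common data set, with each tag qualified by its station name so identical tag names cannot collide:

```python
# Illustrative sketch of OPC-style aggregation: tag sets from several
# server stations are merged into one common data set, with each tag
# prefixed by its station name. Station and tag names are invented.

station_data = {
    "OPC1": {"Reactor.Temp": 81.5, "Reactor.Pressure": 2.1},
    "OPC2": {"Reactor.Temp": 79.8, "Line.Speed": 12.0},
}

def aggregate(stations):
    combined = {}
    for station, tags in stations.items():
        for tag, value in tags.items():
            # Qualify each tag with its source station so that
            # "Reactor.Temp" from OPC1 and OPC2 remain distinct.
            combined[f"{station}.{tag}"] = value
    return combined

common = aggregate(station_data)
print(common["OPC1.Reactor.Temp"])  # every SCADA node sees one data set
```

Each SCADA workstation then needs only a single connection to the combined data set, rather than one connection per OPC station.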

After searching the web, Mr. Simon downloaded and installed the DataHub®. Very quickly he had connected the DataHub to his OPC servers and determined that he was reading live process data from TEVA’s new control systems. He was also able to easily set up the OPC tunnelling link between the OPC server stations and the SCADA workstations, by simply installing another DataHub on the SCADA computer and configuring it to connect to the OPC server stations.

“I wanted to reduce and simplify the communication over the network because of our firewall. It was very easy with the DataHub,” said Mr. Simon after the system was up and running. Currently about 7,000 points are being transferred across the network in real time. “In the future, the additional integration of the existing or new OPC servers will be with the DataHub.”

Case Study: Plastics Manufacturer, Scandinavia

Leading plastics manufacturer uses live process data to optimize production, saving time and materials

One of Scandinavia’s leading plastics manufacturers has chosen the DataHub® from Cogent Real-Time Systems (a subsidiary of Skkynet) to extract data and interact with their state-of-the-art plastic manufacturing equipment. The firm can now access any desired process data for the purposes of engineering analysis and enterprise-level resource planning. The DataHub was the only additional piece of software required to realize substantial savings of time, materials, and production costs.

“The DataHub is exactly the kind of application we needed,” said the project coordinator. “Our system is extensive, and we need to visualize a lot of production parameters. We looked at other solutions but they were too expensive and more complicated.”

When the company installed new equipment recently, the necessary system integration grew very complex. Progress was slow. After almost a year they were facing a deadline and had little to show for their time and effort. The goal was to pull together data from 15 machinery units, feed it in real time into the company’s business processing systems, and, if possible, enable plant engineers to view and work with the live data as well. When they found the DataHub they were pleased to learn that most of the work had already been done.

The first test was to connect the DataHub to an OPC server and put live data into ODBC databases, Excel spreadsheets, and web browsers, as well as to aggregate OPC servers and tunnel data across a network. The DataHub proved to be easy to use and reliable, and it performed remarkably well. The next step was to set up a test system.

The test system connected all of the OPC servers for the plant’s plastics production machines to a central DataHub. Another DataHub, at a network node in the engineering department, was connected to the central DataHub by a mirroring connection, tunnelling data across the network. This second DataHub was then connected to an Excel spreadsheet to give a live display of the data in real time. When a machine starts up on the production line, the chart comes to life—cells spontaneously update values and bar charts spring into existence.

The engineering department developed a custom TCP application that uses the DataHub C++ API to make a direct connection from the DataHub to their SQL Server database. Once connected, the database is updated within milliseconds of any change in the plastics-manufacturing machinery. From the SQL Server database the data is accessed by the company’s ERP and accounting software. Using the DataHub in these ways allows the company to:

  • Aggregate the data from all machinery into one central location.
  • Distribute the data across the network to various users.
  • Do decimal conversions of the data as it passes through the DataHub.
  • Put selected subsets of data into Excel for engineers to view and run calculations on.
  • Feed values into a SQL Server database in the company’s IT and business processing system. The OPC points are read-only to ensure a clean separation between the management and production areas.
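The last three items follow one pattern: changes flow from production points through a conversion step into a read-only business-side view. A minimal Python sketch of that pattern is below. All names are hypothetical, and the real system used the DataHub C++ API and SQL Server rather than in-memory objects:

```python
# Sketch of the pattern described above: production changes pass
# through a decimal conversion on their way to a read-only view on
# the business side. All names here are illustrative.

def to_decimal(raw, scale=10):
    """Convert a raw integer reading to an engineering value."""
    return raw / scale

class ReadOnlyView:
    """Business-side view: accepts updates from production only."""
    def __init__(self):
        self._points = {}

    def push(self, name, value):   # called by the data layer
        self._points[name] = value

    def get(self, name):           # managers and ERP software may read...
        return self._points[name]
    # ...but no write-back method is exposed, keeping a clean separation
    # between the management and production areas.

view = ReadOnlyView()
raw_reading = 815                  # raw value from a machine
view.push("Extruder1.Temp", to_decimal(raw_reading))
print(view.get("Extruder1.Temp"))  # -> 81.5
```

Keeping the business-side points read-only is what guarantees that nothing in the IT or accounting systems can write back into the production process.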

“This system pays for itself,” said a company spokesman, “and we save money in many ways. We have seen substantial gains in productivity and performance because we can monitor our processes far more effectively. Our accounting and planning departments have, for the first time ever, an up-to-the-second record of actual production variables and statistics. At the same time, our engineering staff can use real-time data in their calculations, and feed the results directly back into the process.”

The DataHub also saved substantial programming costs. The time alone saved on development work has paid for the system many times over. With a single tool the project coordinator has met the various needs of both the engineers and company managers. “The software is easy to install and it works well,” he said. “It’s at the correct level for our needs.”

Case Study: City of Montreal, Canada

DataHub used for connectivity and integration on $10 billion project

Situated on an island in the St. Lawrence River, the City of Montreal in Quebec, Canada has been blessed with an abundant supply of water. Yet ensuring that clean, fresh water reaches the city’s millions of residents every day requires constant attention. In 2004, the City of Montreal embarked on a 20-year, 10 billion dollar project to upgrade the quality of drinking water production and distribution in the city. This initiative includes better metering, infrastructure repair, new purification systems, and plant upgrades. The goal is to improve efficiency throughout the system.

As part of this project, water resource engineers at the Charles J. Des Baillets plant’s head office were recently given the job of integrating the production data from all of the city’s seven pumping stations. Their task was to provide a reliable and secure way to bring key data from those satellite plants into a central control location for storage and analysis.

The data is available on SCADA systems at each pumping plant, accessed through OPC servers. However, networking this vital data proved to be a challenge. Networking OPC DA using DCOM was neither reliable nor secure, so the engineering team decided to use OPC tunnelling. They tried several popular OPC tunnelling products, and the only one that worked well was the Cogent DataHub®.

The data collection and redistribution architecture that the project planners had in mind was quite complex. Primarily, they needed to collect data from all of the remote stations in a highly secure way, and log it at their central control location. Neither the central client nor anyone else should be able to write back to the OPC servers. They also needed to send the collected data to a third location for the company’s IT staff, and bridge to other OPC servers there. In addition, each pumping station needed to receive some of the data collected from the other pumping stations. And finally, some of the pumping stations were running fully redundant SCADA systems, so they needed redundancy built into the system at those locations.

“We started by connecting the Pierrefonds plant and the central location in Atwater for logging the data, with a second tunnel to the IT office for analytical use of the data,” said the project manager. “We had a few initial issues related to configuration and network addresses, and Cogent’s quick response was very helpful to resolve them. After this first experience with the DataHub, we were very enthusiastic to apply this solution to the rest of the plants in Montreal.”

As each location came online and the tunnelling to the central office was being configured, the team realized that they now had the necessary tool to share the data securely between satellite locations. On the DataHub at the central office they established a separate data domain for each plant, and created a read-only tunnel to receive its data. Then at each plant, they created a read-only tunnel from the local DataHub to the central DataHub to get the data from each of the other plants. This gave the operators at each plant a complete picture of what was going on throughout the system.

“To make intelligent decisions at a satellite plant, it is very helpful to know what’s happening across the city,” the project manager said. “Since all the data was there in the DataHub anyway, we decided to use it.”

With data logging and secure tunnelling in place, the next feature to implement was redundancy. Several locations had completely redundant SCADA systems, each with its own OPC server. With help from Cogent, the team was able to establish a connection to the redundant OPC servers such that if one server failed for any reason, the DataHub would start receiving data from the second OPC server.
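The failover behaviour can be sketched as follows. The servers here are simulated objects with invented names; a real setup would monitor actual OPC connections, but the selection logic is the same: read from the primary, and fall back to the secondary when the primary stops responding:

```python
# Illustrative failover logic: read from the primary OPC server, and
# switch to the secondary if the primary stops responding. The servers
# are simulated; all names are invented for the example.

class SimulatedServer:
    def __init__(self, name, healthy=True):
        self.name, self.healthy = name, healthy

    def read(self, tag):
        if not self.healthy:
            raise ConnectionError(f"{self.name} is down")
        return f"{tag}@{self.name}"

def read_with_failover(tag, primary, secondary):
    try:
        return primary.read(tag)
    except ConnectionError:
        # The primary failed for some reason: fall back to the
        # redundant server so data keeps flowing.
        return secondary.read(tag)

primary = SimulatedServer("OPC-A", healthy=False)   # simulate a failure
secondary = SimulatedServer("OPC-B")
print(read_with_failover("Pump1.Flow", primary, secondary))
# -> Pump1.Flow@OPC-B
```

From the client's point of view the switchover is transparent: the same tags keep updating, regardless of which redundant server is supplying them.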

“The system has been running for months without any problems, logging the data we need to stay efficient,” said the project manager. “We are very pleased with the high quality of the DataHub, its flexibility to do what we need, and with Cogent’s excellent technical support at every point of the way. The data integration aspect of the City of Montreal’s water system upgrade project is meeting or exceeding its goals.”