
Case Study: TEVA API Pharmaceuticals, Hungary

TEVA combines tunnelling and aggregation to network OPC data through a firewall

Laszlo Simon is the Engineering Manager for the TEVA API plant in Debrecen, Hungary. He had a project that sounded simple enough: connect new control applications through several OPC stations to an existing SCADA network. The plant was already running large YOKOGAWA DCS and GE PLC control systems, connected to a number of distributed SCADA workstations. However, Mr. Simon faced a couple of interesting challenges in this project:

  • The OPC servers and SCADA systems were on different computers, separated by a company firewall. This made it extremely difficult to connect OPC over the network, because of the complexity of configuring DCOM and Windows security permissions.
  • Each SCADA system needed to access data from all of the new OPC server stations. This meant Mr. Simon needed a way to aggregate data from all the OPC stations into a single common data set.

After searching the web, Mr. Simon downloaded and installed the DataHub®. Very quickly he had connected the DataHub to his OPC servers and confirmed that he was reading live process data from TEVA’s new control systems. He was also able to set up the OPC tunnelling link between the OPC server stations and the SCADA workstations simply by installing another DataHub on the SCADA computer and configuring it to connect to the OPC server stations.

“I wanted to reduce and simplify the communication over the network because of our firewall. It was very easy with the DataHub,” said Mr. Simon after the system was up and running. Currently about 7,000 points are being transferred across the network in real time. “In the future, the additional integration of the existing or new OPC servers will be with the DataHub.”

Case Study: Plastics Manufacturer, Scandinavia

Leading plastics manufacturer uses live process data to optimize production, saving time and materials

One of Scandinavia’s leading plastics manufacturers has chosen the DataHub® from Cogent Real-Time Systems (a subsidiary of Skkynet) to extract data and interact with their state-of-the-art plastic manufacturing equipment. The firm can now access any desired process data for the purposes of engineering analysis and enterprise-level resource planning. The DataHub was the only additional piece of software required to realize substantial savings of time, materials, and production costs.

“The DataHub is exactly the kind of application we needed,” said the project coordinator. “Our system is extensive, and we need to visualize a lot of production parameters. We looked at other solutions but they were too expensive and more complicated.”

When the company installed new equipment recently, the necessary system integration grew very complex. Progress was slow. After almost a year they were facing a deadline and had little to show for their time and effort. The goal was to pull together data from 15 machinery units and feed it in real time into the company’s business processing systems, and, if possible, to enable plant engineers to view and work with the live data as well. When they found the DataHub they were pleased to learn that most of the work had already been done.

The first test was to connect the DataHub to an OPC server and put live data into ODBC databases, Excel spreadsheets, and web browsers, as well as to aggregate OPC servers and tunnel data across a network. The DataHub proved to be easy to use and reliable, and it performed remarkably well. The next step was to set up a test system.

The test system connected all of the OPC servers for the plant’s plastics production machines to a central DataHub. Another DataHub, at a network node in the engineering department, was connected to the central DataHub by a mirroring connection, tunnelling data across the network. This second DataHub was then connected to an Excel spreadsheet to give a live, real-time display of the data. When a machine starts up on the production line, the chart comes to life: cells spontaneously update their values and bar charts spring into existence.

The engineering department was able to develop a custom TCP application that uses the DataHub C++ API to make a direct connection from the DataHub to their SQL Server database. Once connected, that database gets updated within milliseconds of any change in the plastic-manufacturing machinery (a sketch of this change-driven bridge follows the list below). From the SQL Server database the data is accessed by the company’s ERP and accounting software. Using the DataHub in these ways allows the company to:

  • Aggregate the data from all machinery into one central location.
  • Distribute the data across the network to various users.
  • Do decimal conversions of the data as it passes through the DataHub.
  • Put selected subsets of data into Excel for engineers to view and run calculations on.
  • Feed values into a SQL Server database in the company’s IT and business processing system. The OPC points are read-only to ensure a clean separation between the management and production areas.
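To illustrate the change-driven bridge described above, here is a minimal Python sketch of the same pattern: each point change triggers a write so the database always holds the latest value per point. The real application was built with the DataHub C++ API against SQL Server; in this sketch, sqlite3 stands in for SQL Server, and the names `PointChange` and `simulated_change_feed` are illustrative assumptions rather than any part of the DataHub API.

```python
# Sketch of a change-driven bridge: every point update becomes a row upsert.
# Assumptions: sqlite3 stands in for SQL Server, and simulated_change_feed()
# stands in for the stream of updates the real DataHub C++ API would deliver.
import sqlite3
import time
from dataclasses import dataclass

@dataclass
class PointChange:            # hypothetical record for one point update
    name: str
    value: float
    timestamp: float

def simulated_change_feed():
    """Stand-in for the DataHub connection: yields a few point changes."""
    for i in range(5):
        yield PointChange("line1.extruder.temp", 210.0 + i, time.time())
        time.sleep(0.1)

def run_bridge(db_path="production.db"):
    db = sqlite3.connect(db_path)
    db.execute("""CREATE TABLE IF NOT EXISTS machine_data (
                      point_name TEXT PRIMARY KEY,
                      value      REAL,
                      updated_at REAL)""")
    # Each change event is written immediately, giving the "updated within
    # milliseconds of any change" behaviour described in the case study.
    for change in simulated_change_feed():
        db.execute("INSERT OR REPLACE INTO machine_data VALUES (?, ?, ?)",
                   (change.name, change.value, change.timestamp))
        db.commit()
    db.close()

if __name__ == "__main__":
    run_bridge()
```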

“This system pays for itself,” said a company spokesman, “and we save money in many ways. We have seen substantial gains in productivity and performance because we can monitor our processes far more effectively. Our accounting and planning departments have, for the first time ever, an up-to-the-second record of actual production variables and statistics. At the same time, our engineering staff can use real-time data in their calculations, and feed the results directly back into the process.”

The DataHub also saved substantial programming costs. The time saved on development work alone has paid for the system many times over. With a single tool the project coordinator has met the needs of both the engineers and the company managers. “The software is easy to install and it works well,” he said. “It’s at the correct level for our needs.”

Case Study: City of Montreal, Canada

DataHub used for connectivity and integration on $10 billion project

Situated on an island in the St. Lawrence River, the City of Montreal in Quebec, Canada has been blessed with an abundant supply of water. Yet ensuring that clean, fresh water reaches the city’s millions of residents every day requires constant attention. In 2004, the City of Montreal embarked on a 20-year, 10 billion dollar project to upgrade the quality of drinking water production and distribution in the city. This initiative includes better metering, infrastructure repair, new purification systems, and plant upgrades. The goal is to improve efficiency throughout the system.

As part of this project, water resource engineers at the Charles J. Des Baillets plant’s head office were recently given the job of integrating the production data from all of the city’s seven pumping stations. Their task was to provide a reliable and secure way to bring key data from those satellite plants into a central control location for storage and analysis.

The data is available on SCADA systems at each pumping plant, accessed through OPC servers. However, networking this vital data proved to be a challenge. Networking OPC DA using DCOM was neither reliable nor secure, so the engineering team decided to use OPC tunneling. They tried several popular OPC tunneling products, and the only one that worked well was the Cogent DataHub®.

The data collection and redistribution architecture that the project planners had in mind was quite complex. They needed to:

  • Collect data from all of the remote stations in a highly secure way, and log it at the central control location. Neither the central client nor anyone else was to be able to write back to the OPC servers.
  • Send the collected data to a third location for the city’s IT staff, and bridge it to other OPC servers there.
  • Deliver to each pumping station some of the data collected from the other pumping stations.
  • Build redundancy into the system at the pumping stations that were running fully redundant SCADA systems.

“We started by connecting the Pierrefonds plant and the central location in Atwater for logging the data, with a second tunnel to the IT office for analytical use of the data,” said the project manager. “We had a few initial issues related to configuration and network addresses, and Cogent’s quick response was very helpful to resolve them. After this first experience with the DataHub, we were very enthusiastic to apply this solution to the rest of the plants in Montreal.”

As each location came online and the tunneling to the central office was configured, the team realized that they already had the tool they needed to share data securely between satellite locations. On the DataHub at the central office they established a separate data domain for each plant and created a read-only tunnel to receive its data. Then at each plant they created a read-only tunnel from the local DataHub to the central DataHub to get the data from the other plants. This gave the operators at each plant a complete picture of what was going on throughout the system.
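The data-domain arrangement can be pictured with a small sketch: the central hub keeps one tag dictionary per plant, and each plant reads back everyone else's data over its read-only tunnel. The real system used the DataHub's built-in data domains and tunnelling rather than custom code, so the class and method names below (`CentralHub`, `receive`, `snapshot_for`) are purely illustrative.

```python
# Illustrative sketch of the per-plant data domains described above.
# The real system used the DataHub's own data domains and read-only tunnels;
# CentralHub here is a hypothetical name for the same idea.
class CentralHub:
    def __init__(self, plant_names):
        # one data domain (tag dictionary) per pumping station
        self.domains = {name: {} for name in plant_names}

    def receive(self, plant, tag, value):
        """Read-only tunnel from a plant: the central hub only receives data."""
        self.domains[plant][tag] = value

    def snapshot_for(self, plant):
        """What a plant sees over its own read-only tunnel: the other plants' data."""
        return {p: dict(tags) for p, tags in self.domains.items() if p != plant}

hub = CentralHub(["Pierrefonds", "Atwater", "DesBaillets"])
hub.receive("Pierrefonds", "pump1.flow", 1234.5)
hub.receive("Atwater", "reservoir.level", 8.2)
print(hub.snapshot_for("DesBaillets"))
# {'Pierrefonds': {'pump1.flow': 1234.5}, 'Atwater': {'reservoir.level': 8.2}}
```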

“To make intelligent decisions at a satellite plant, it is very helpful to know what’s happening across the city,” the project manager said. “Since all the data was there in the DataHub anyway, we decided to use it.”

With data logging and secure tunneling in place, the next feature to implement was redundancy. Several locations had completely redundant SCADA systems, each with its own OPC server. With help from Cogent, the team was able to establish a connection to the redundant OPC servers such that if one server failed for any reason, the DataHub would start receiving data from the second OPC server.
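The redundancy behaviour described here can be sketched as a simple failover read: try the primary server, and fall back to the backup whenever the primary stops responding. In the actual installation the switchover was configured in the DataHub itself rather than hand-coded, so the simulated servers and function names below are assumptions for illustration only.

```python
# Minimal sketch of the failover behaviour described above.
# primary()/backup() are hypothetical stand-ins for reads from the two
# redundant OPC servers; the real switchover was configured in the DataHub.
import random
import time

class ServerDown(Exception):
    pass

def primary():
    # Simulated read that occasionally fails, mimicking a server outage.
    if random.random() < 0.3:
        raise ServerDown("primary OPC server not responding")
    return {"pump1.flow": 1250.0, "source": "primary"}

def backup():
    return {"pump1.flow": 1250.0, "source": "backup"}

def read_with_failover():
    """Try the primary server first; fall back to the backup on failure."""
    try:
        return primary()
    except ServerDown:
        return backup()

if __name__ == "__main__":
    for _ in range(5):
        data = read_with_failover()
        print(data["source"], data["pump1.flow"])
        time.sleep(0.5)
```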

“The system has been running for months without any problems, logging the data we need to stay efficient,” said the project manager. “We are very pleased with the high quality of the DataHub, its flexibility to do what we need, and with Cogent’s excellent technical support at every point of the way. The data integration aspect of the City of Montreal’s water system upgrade project is meeting or exceeding its goals.”

Case Study: Citect (Schneider Electric), USA

Citect optimizes OPC-based system using the DataHub

A major battery manufacturing plant in the United States was recently faced with an interesting data integration challenge. Management needed access to data coming from a large number of different processes. Over 220 OPC-enabled field devices across the plant had to be connected to a single Citect MES system. The many OPC servers used for these connections are unique in that their data set is very dynamic. From one minute to the next any of the 220 devices may be present or absent in the data set.

“Our challenge was to provide data from our dynamically changing OPC servers to a Citect system that is designed to work with a fixed data set,” said the company project leader. They decided to bring in a team from Citect to come up with a solution.

Citect, part of Schneider Electric, is well known in the industrial process control world for its line of automation and control software, particularly its MES systems. Dan Reynolds, the team leader for Citect, had heard about the DataHub® through his support department, and thought it might work. They configured the DataHub for OPC tunnelling, to communicate across the network without the hassles of DCOM. And, thanks to the DataHub’s unique approach to OPC tunnelling, Dan found that it also solved the problem of providing a fixed data set.


“The DataHub mirrors data across the tunnel,” said Dan, “so the Citect system sees a constant data set. When a device goes offline, the tag remains in the DataHub. Just the quality changes from ‘Good’ to ‘Not Connected’.” Confident in their approach, the Citect team moved the testing from their location to the battery plant. But they soon found themselves faced with another challenge.

The production system is designed so that a field device can add or remove OPC items at any time. So not only can OPC servers appear and disappear, but individual tags can also suddenly appear or disappear from the system. When a new tag comes online, the server updates its tag count, but does not announce that a new value is available, because the OPC specification does not require a server to notify clients when a new point is created. This looked like a show-stopper for the configuration team. As far as they knew, no OPC product on the market could deal with that kind of behavior. Continually re-reading the data set was not an option, because new points might be added during the read. So Dan got in touch with Cogent (a subsidiary of Skkynet), and working together they came up with a plan.

The solution was twofold. First, the device behavior was modified to confine the tag add/delete cycle to a limited time window. Then Cogent wrote a DataHub script that monitors a few OPC server tags; when these tags change, a time-delayed function in the script re-reads the server’s data set. The scripted time delay allows all the new points to be added before the data set is re-read, so the DataHub discovers all of the new data as soon as it becomes available.
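The logic of that script amounts to a debounced, time-delayed re-read: each change to a monitored tag cancels any pending re-read and schedules a new one, so the full data set is only re-read once the burst of tag additions has settled. The actual script was written in the DataHub's own scripting language; the Python sketch below, including the names `on_monitored_tag_change` and `reread_data_set` and the two-second settling time, is only an illustration of the same idea.

```python
# Sketch of the delayed re-read logic described above, using a debounce timer.
# The real implementation was a DataHub script; these names are illustrative.
import threading
import time

SETTLE_SECONDS = 2.0          # assumed settling time for a burst of tag changes
_pending_timer = None
_lock = threading.Lock()

def reread_data_set():
    """Stand-in for re-reading the OPC server's full data set."""
    print("re-reading data set at", time.strftime("%H:%M:%S"))

def on_monitored_tag_change(tag, value):
    """Called whenever a monitored tag (e.g. the server's tag count) changes.
    Each change restarts the delay, so the re-read happens only once the
    burst of additions and deletions has finished."""
    global _pending_timer
    with _lock:
        if _pending_timer is not None:
            _pending_timer.cancel()
        _pending_timer = threading.Timer(SETTLE_SECONDS, reread_data_set)
        _pending_timer.start()

if __name__ == "__main__":
    # Simulate a burst of tag-count changes; only one re-read should follow.
    for count in range(220, 225):
        on_monitored_tag_change("server.tag_count", count)
        time.sleep(0.2)
    time.sleep(3)
```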

“We are pleased with the performance of the DataHub for this application,” said Dan Reynolds. “There is no way we could have done this project with any other OPC tunneling product, or combination of products.”

“The Skkynet software has become an integral part of our MES solution,” said the project leader. “Without the DataHub, we would not be getting reliable data. If we hadn’t had it, our MES integration project would probably have come to a halt.”

Case Study: Biomass Biotechnology Bio, Japan

Monitoring Nature’s Wonder Workers

Finally, someone has found a good use for pesky flies—let them eat manure! The BBB company (Biomass Biotechnology Bio) in the Chiba prefecture near Tokyo, Japan, has developed a technology that uses fly larvae to convert manure from pigs and other farm animals into organic fertilizer and high-protein fish food. And they are using the Cogent DataHub running on a cloud server to provide real-time monitoring of their production powerhouse—swarms of flies.

The process is quite simple. BBB keeps thousands of specially bred flies in captivity, collects their eggs, and sells them to local pork farmers. The farmers put the fly eggs on their pig manure, and when they hatch, the fly larvae feed off the manure. Enzymes in the larvae's saliva break down the manure into rich, organic fertilizer, doing the job in one week that normally takes up to four months using conventional composting techniques. When the larvae are finished, they don’t need to be separated from the finished fertilizer: they crawl out by themselves, seeking a dry environment. At this point, before they can turn into flies, the larvae are collected, dried, and processed as fish food.

An English-language video of the whole BBB process, produced by the Japanese news agency NHK, is available for viewing on the Internet.

The benefits of producing fertilizer from waste material this way are substantial, but until recently costs have been high. The company plans to expand their services to large numbers of farms, and to do so they need an inexpensive, automated way to monitor their production environment. Unlike most of us who use window screens to keep flies out, BBB has special screened rooms to keep flies in. To ensure the flies stay healthy and lay large numbers of eggs, the air temperature and humidity in these rooms must be maintained at optimal levels, and monitored around the clock.

To automate the monitoring, Cogent and their partner, Nissin Systems Co., Ltd. of Kyoto, Japan, provided a real-time, cloud-based system using the Cogent DataHub® and WebView™. At the BBB facility they installed a Wi-Fi-enabled environmental sensor module from TOA Musendenki to measure the temperature and humidity, and connected it directly to a Cogent DataHub running on a cloud server. Using WebView, they then created a monitoring page to track key environmental variables such as temperature and humidity in the flies’ living quarters.

“Monitoring our system on the web is very convenient,” said Mr. Yamaguchi, President of BBB. “We have been able to reduce our costs significantly, which will be even more important as we expand our operation.”

Case Study: RWTH Aachen University, Germany

Closed-Loop Control of Weaving Machines using the Cogent DataHub

From ancient times people have been using looms to weave cloth, canvas, and carpets. As the centuries passed, weaving became one of the first tasks to be mechanized in the industrial revolution. The various repetitive tasks, such as raising and lowering rows of threads (the warp) and passing a shuttle with the cross-thread (the weft) back and forth between them, were a good fit for simple machines. Today, high-speed air-jet weaving machines are fully automated, and capable of inserting a weft thread between the warp strands at a rate of 2,000 times per minute.

One of the challenges of automating a weaving machine is maintaining proper tension on the threads. With each pass of the weft, a certain amount of tautness must be applied to the warp to keep the woven fabric uniform at all times. Early looms used the weaver’s body weight or hanging stones to keep the warp taut, but an air-jet machine needs a more sophisticated technology.

This challenge has been addressed by the students and faculty of the Institut für Textiltechnik (ITA) der RWTH Aachen University in Aachen, Germany, who are investigating how to optimize the tension of warp threads in an air-jet weaving machine.

“Our main research goal is self-optimization of the warp tension,” said Dr. Ing. Yves-Simon Gloy, Research Manager at ITA, “to enable the loom to set the warp tension automatically at a minimum level without reducing the process stability.” Keeping the proper tension maximizes the speed of the process, while yielding the highest possible quality of fabric.

“We started by creating an automated sequence routine, with the help of regression models for a model-based setting of the loom, and implemented it in the weaving process,” said Dr. Gloy. “The automated sequence routine was implemented using the ibaPADU-S-IT as a fast, stand-alone control system and the software ibalogic from iba AG in Fürth, Germany.”

Once the necessary hardware was in place, the team needed to choose a way to monitor and control the loom. They got in touch with Logic Park, in Heimberg, Switzerland, who recommended the Cogent DataHub® as the ideal solution. Connecting the DataHub to the iba System OPC server, Dr. Gloy and his team were able to use WebView™ to quickly build a web HMI.

“The DataHub was the perfect tool to develop the new HMI – easy to install, easy to handle. I got very fast results and the control of the loom via the web browsers is totally stable,” said Dr. Gloy. “Our students are very impressed by the DataHub and its functionality. We can even view the HMI on a tablet, which is beyond state-of-the-art for a textile machine. Now we are investigating new applications for other textile machines in our Institute.”