
Case Study: Citect (Schneider Electric), USA

Citect optimizes OPC-based system using the DataHub

A major battery manufacturing plant in the United States was recently faced with an interesting data integration challenge. Management needed access to data coming from a large number of different processes. Over 220 OPC-enabled field devices across the plant had to be connected to a single Citect MES system. The many OPC servers used for these connections are unique in that their data set is very dynamic. From one minute to the next any of the 220 devices may be present or absent in the data set.

“Our challenge was to provide data from our dynamically changing OPC servers to a Citect system that is designed to work with a fixed data set,” said the company project leader. They decided to bring in a team from Citect to come up with a solution.

Citect, part of Schneider Electric, is well known in the industrial process control world for its line of automation and control software solutions, particularly its MES systems. Dan Reynolds, the team leader for Citect, had heard about the DataHub® through his support department and thought it might work. The team configured the DataHub for OPC tunneling, to communicate across the network without the hassles of DCOM. And thanks to the DataHub’s unique approach to OPC tunneling, Dan found that it also solved the problem of providing a fixed data set.

[Image: Citect battery manufacturing system]

“The DataHub mirrors data across the tunnel,” said Dan, “so the Citect system sees a constant data set. When a device goes offline, the tag remains in the DataHub. Just the quality changes from ‘Good’ to ‘Not Connected’.” Confident in their approach, the Citect team moved the testing from their location to the battery plant. But they soon found themselves faced with another challenge.

The production system is designed so that a field device can add or remove OPC items at any time. So not only the OPC servers, but individual tags can suddenly appear or disappear from the system. When a new tag comes online, the server updates its tag count, but it does not announce that a new value is available, because the OPC specification does not require a server to notify clients when a new point is created. This looked like a show-stopper for the configuration team: they knew of no OPC product on the market that could deal with that kind of behavior. Continually re-reading the data set was not an option, because new points could be added during the read. So Dan got in touch with Cogent (a subsidiary of Skkynet), and working together they came up with a plan.

The solution was two-fold. First, the device behavior was modified to compact the tag add/delete cycle into a limited time window. Then Cogent wrote a DataHub script that monitors a few OPC server tags; when these tags change, a time-delayed function in the script re-reads the server’s data set. The scripted delay allows all of the new points to be added before the data set is re-read, so the DataHub discovers all of the new data as soon as it becomes available.
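
In outline, the script behaves like a debounced rescan: each change to a monitored trigger tag restarts a short timer, and the full data set is only re-read once the burst of additions has settled. The actual script runs in the DataHub’s own scripting environment and is not reproduced here; the Python sketch below only illustrates the timing pattern, and the trigger tag name, the rebrowse callback and the five-second delay are assumptions made for the example.

```python
import threading

RESCAN_DELAY_SECONDS = 5.0  # assumed delay, long enough to cover the tag add/delete cycle


class DelayedRescan:
    """Re-read the OPC server's data set shortly after the last trigger-tag change."""

    def __init__(self, reread_data_set):
        self._reread_data_set = reread_data_set  # callback that re-browses the server
        self._timer = None
        self._lock = threading.Lock()

    def on_trigger_tag_change(self, tag_name, new_value):
        # Each change restarts the timer, so the re-read fires only once the
        # burst of tag additions and removals has finished.
        with self._lock:
            if self._timer is not None:
                self._timer.cancel()
            self._timer = threading.Timer(RESCAN_DELAY_SECONDS, self._reread_data_set)
            self._timer.start()


# Usage sketch: the monitoring layer would call on_trigger_tag_change() whenever
# a watched tag (here, a hypothetical tag count) changes value.
def rebrowse_opc_server():
    print("Re-reading the server's data set now that all new points exist")


rescan = DelayedRescan(rebrowse_opc_server)
rescan.on_trigger_tag_change("Server.TagCount", 221)
```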

“We are pleased with the performance of the DataHub for this application,” said Dan Reynolds. “There is no way we could have done this project with any other OPC tunneling product, or combination of products.”

“The Skkynet software has become an integral part of our MES solution,” said the project leader. “Without the DataHub, we would not be getting reliable data. If we hadn’t had it, our MES integration project would probably have come to a halt.”

Case Study: Biomass Biotechnology Bio, Japan

Monitoring Nature’s Wonder Workers

Finally, someone has found a good use for pesky flies—let them eat manure! The BBB company (Biomass Biotechnology Bio) in the Chiba prefecture near Tokyo, Japan, has developed a technology that uses fly larvae to convert manure from pigs and other farm animals into organic fertilizer and high-protein fish food. And they are using the Cogent DataHub running on a cloud server to provide real-time monitoring of their production powerhouse—swarms of flies.

The process is quite simple. BBB keeps thousands of specially bred flies in captivity, collects their eggs, and sells them to local pork farmers. The farmers put the fly eggs on their pig manure, and when they hatch, the fly larvae feed off the manure. Enzymes in the larvae’s saliva break down the manure into rich, organic fertilizer, doing in one week a job that normally takes up to four months with conventional composting techniques. When the larvae are finished, they don’t need to be separated from the finished fertilizer; they crawl out by themselves, seeking a dry environment. At this point, before they can turn into flies, the larvae are collected, dried, and processed as fish food.

There is an English-language video of the whole BBB process which was produced by the Japanese news agency NHK and is available for viewing on the Internet.

The benefits of producing fertilizer from waste material this way are substantial, but until recently costs have been high. The company plans to expand their services to large numbers of farms, and to do so they need an inexpensive, automated way to monitor their production environment. Unlike most of us who use window screens to keep flies out, BBB has special screened rooms to keep flies in. To ensure the flies stay healthy and lay large numbers of eggs, the air temperature and humidity in these rooms must be maintained at optimal levels, and monitored around the clock.

To automate the monitoring, Cogent and their partner, Nissin Systems Co. Ltd. of Kyoto, Japan, provided a real-time, cloud-based system using the Cogent DataHub® and WebView™. At the BBB facility they installed a Wi-Fi-enabled environmental sensor module from TOA Musendenki to measure temperature and humidity, and connected it directly to a Cogent DataHub running on a cloud server. Using WebView, they then created a monitoring page to track key environmental variables such as temperature and humidity in the flies’ living quarters.

“Monitoring our system on the web is very convenient,” said Mr. Yamaguchi, President of BBB. “We have been able to reduce our costs significantly, which will be even more important as we expand our operation.”

Case Study: RWTH Aachen University, Germany

Closed-Loop Control of Weaving Machines using the Cogent DataHub

From ancient times people have been using looms to weave cloth, canvas, and carpets. As the centuries passed, weaving became one of the first tasks to be mechanized in the industrial revolution. The various repetitive tasks, such as raising and lowering rows of threads (the warp) and passing a shuttle with the cross-thread (the weft) back and forth between them, were a good fit for simple machines. Today, high-speed air-jet weaving machines are fully automated, and capable of sending a weft thread between the warp strands up to 2,000 times per minute.

One of the challenges of automating a weaving machine is maintaining proper tension on the threads. With each pass of the weft, a certain amount of tautness must be applied to the warp to keep the woven fabric uniform at all times. Early looms used the weaver’s body weight or hanging stones to keep the warp taut, but an air-jet machine needs a more sophisticated technology.

This challenge has been addressed by the students and faculty of the Institut für Textiltechnik (ITA) der RWTH Aachen University in Aachen, Germany, who are investigating how to optimize the tension of warp threads in an air-jet weaving machine.

“Our main research goal is self-optimization of the warp tension,” said Dr.-Ing. Yves-Simon Gloy, Research Manager at ITA, “to enable the loom to set the warp tension automatically at a minimum level without reducing the process stability.” Keeping the proper tension maximizes the speed of the process, while yielding the highest possible quality of fabric.

“We started by creating an automated sequence routine, with the help of regression models for a model-based setting of the loom, and implemented it in the weaving process,” said Dr. Gloy. “The automated sequence routine was implemented using the ibaPADU-S-IT as a fast, stand-alone control system and the ibaLogic software from iba AG in Fürth, Germany.”
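
The regression models themselves are not described in the article, so the following Python sketch is only a hypothetical illustration of what a model-based setting can look like: fit a regression of a measured stability indicator against warp tension, then choose the lowest tension whose predicted stability still meets a required threshold. The trial data, the quadratic model and the 0.90 threshold are all assumptions made for the example.

```python
import numpy as np

# Hypothetical trial data: warp tension (cN) vs. a measured stability indicator
# (for example, a normalized score derived from warp-stop or weft-insertion faults).
tension = np.array([150.0, 200.0, 250.0, 300.0, 350.0, 400.0])
stability = np.array([0.62, 0.78, 0.88, 0.93, 0.95, 0.96])

# Fit a simple quadratic regression of stability as a function of tension.
model = np.poly1d(np.polyfit(tension, stability, deg=2))

# Model-based setting: the lowest tension whose predicted stability still
# meets the required threshold (the threshold is an assumed value).
REQUIRED_STABILITY = 0.90
candidates = np.linspace(tension.min(), tension.max(), 500)
feasible = candidates[model(candidates) >= REQUIRED_STABILITY]
setpoint = feasible.min() if feasible.size else tension.max()
print(f"Suggested warp tension setpoint: {setpoint:.0f} cN")
```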

Once the necessary hardware was in place, the team needed to choose a way to monitor and control the loom. They got in touch with Logic Park, in Heimberg, Switzerland, who recommended the Cogent DataHub® as the ideal solution. Connecting the DataHub to the iba System OPC server, Dr. Gloy and his team were able to use WebView™ to quickly build a web HMI.

“The DataHub was the perfect tool to develop the new HMI – easy to install, easy to handle. I got very fast results and the control of the loom via the web browsers is totally stable,” said Dr. Gloy. “Our students are very impressed by the DataHub and its functionality. We can even view the HMI on a tablet, which is beyond state-of-the-art for a textile machine. Now we are investigating new applications for other textile machines in our Institute.”

Secure Remote Monitoring and Supervisory Control

New technologies such as Software as a Service, the Internet of Things and cloud computing bring new challenges to industrial process temperature control, but there are solutions.

Interest in using cloud computing, often delivered as Software as a Service (SaaS), to provide remote access to industrial systems continues to rise. Vendors and company personnel alike point to potential productivity improvements and cost savings, as well as convenience. Operators and plant engineers may want to receive alarms and adjust heating controls while moving around the plant. Managers would like to see production data in real time, not just in end-of-shift or daily reports. Hardware vendors could benefit from getting live readings from their installed equipment for maintenance and troubleshooting.

Some industrial processors are attempting to provide this kind of window into their production systems. Yet many question the wisdom of opening up a plant’s mission-critical control network to the possibility of malicious attack, or even well-meaning mistakes. With a proper understanding of what is at stake, what is being proposed and how it can best be implemented, you can better decide whether remote access to your production data could benefit your company.

Security First for Industrial Networks

When talking about remote access to plant data, the first concern is security. Any approach that exposes the control system to unauthorized entry should be off the table. One popular strategy is to secure the network against potential intruders and open it only to trusted parties. Connections into the plant typically originate from smartphones, tablets, laptops or desktop computers, usually running a human-machine interface (HMI), remote desktop application, database browser or other proprietary connector.

In most cases, the plant engineering staff or IT department can grant client access to the network via a virtual private network (VPN), so authorized users can get the data they need. However, a typical VPN connection provides link-layer integration between network participants, which means that once on the network, an outsider has access to all the other systems on it. Thus, the company must either fully trust each person who is granted access to the network, or task the IT manager with securing and protecting the resources within it.

It would be unwise to risk giving visitors full access to everything a VPN exposes. Using a VPN this way is a little like having a visitor come into your plant. Suppose a service technician arrives at the gate saying he needs to check a piece of equipment. You could just tell the guard to check his credentials and, if he checks out, hand him a hardhat and directions and send him in. That is the limited-security approach. A better way would be to provide a guide who ensures that the technician finds his destination, does his work and leaves with only the information he came to get. It takes more effort and planning, but if you are going to allow someone onto the premises, that effort is necessary to ensure security.

Better than VPN

An even better approach is to only allow access to the data itself. Consider this: the user of the data — be it vendor, customer or even corporate manager — does not need access to the whole network. Instead, they just need the data. So, rather than allowing a client to log on via a VPN connection while the IT manager works to secure confidential areas of the network from the inside, wouldn’t it be better to provide access to the data outside of the network altogether?

To continue our analogy, this would be like the guard handing the service technician exactly the data he needs the moment he arrives at the gate. There is no need to open the gate and no need to let him into the plant. In fact, the service company, vendor or other authorized party could request that the data be sent to their own location, so they do not even have to come to the plant in the first place. This approach to remote monitoring is far more secure.

Is such a scenario realistic? Yes, if you use the right technology in the right way. For example, WebSocket is a protocol that supports communication over TCP, similar to HTTP. But unlike HTTP, once a WebSocket connection is established, client and server can exchange data indefinitely. The protocol also supports SSL/TLS encryption, a well-tested security layer. Thus, WebSocket technology can be used to open and maintain a secure data tunnel over TCP from a plant to a cloud server without opening any inbound ports in the plant’s firewall. Once the tunnel connection is established, data can flow bi-directionally.
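
As a rough illustration of that pattern, and not of any particular product’s tunnel protocol, the sketch below opens an outbound secure WebSocket connection from the plant side and streams simple name/value/timestamp updates to a cloud endpoint. It uses the third-party Python websockets package; the cloud.example.com URL, the tag name and the read_tag() helper are hypothetical.

```python
import asyncio
import json
import random
import time

import websockets  # third-party package: pip install websockets

CLOUD_URL = "wss://cloud.example.com/tunnel"  # hypothetical cloud endpoint


def read_tag(name):
    # Placeholder for a real plant-side read (OPC, Modbus, etc.).
    return 350.0 + random.uniform(-1.0, 1.0)


async def stream_plant_data():
    # The connection is opened outbound from the plant over TLS (wss://),
    # so no inbound firewall ports need to be opened.
    async with websockets.connect(CLOUD_URL) as ws:
        while True:
            update = {
                "name": "Furnace1.Temperature",
                "value": read_tag("Furnace1.Temperature"),
                "timestamp": time.time(),
            }
            await ws.send(json.dumps(update))
            await asyncio.sleep(1.0)


# asyncio.run(stream_plant_data())
```

Because the connection is initiated from inside the plant, the cloud server never needs to connect inward, which is what keeps the plant invisible to the Internet.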

Isolating the Industrial Process Data

Such a data-centric approach to remote monitoring and supervisory control has several benefits. One key advantage is that the process can run in complete isolation from the remote client. Low-level control — and, in fact, all systems within the plant — remain completely invisible to the remote clients. The only point of contact for the remote client is the selected data set being streamed from the plant, and that data resides in the cloud.

While nobody seriously imagines making low-level control changes over a cloud connection, a solution based on WebSocket technology could allow both read-only and read/write client connections for those applications where remote changes are deemed acceptable. Authorized personnel then would have the ability to effect change in plant processes for diagnostic or maintenance purposes via a secure connection. This approach would not require any open firewall ports, so the plant remains invisible to the Internet.

Regardless of the intended use of the data, a correctly provisioned WebSocket connection to the cloud provides the process isolation needed to provide access to data without jeopardizing your in-plant systems.

Any Data Protocols

Another advantage to this approach is that it can be protocol-agnostic. Ideally, the system would carry only the raw data over TCP in a simple format: name, value and timestamp for each change in value. The connector would convert the plant protocol, such as OPC or Modbus, to a simple data feed to the cloud. Requiring a minimum of bandwidth and system resources, the data would flow in real time to all registered clients.
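
A minimal sketch of such a protocol-agnostic feed message, assuming a JSON encoding and illustrative tag names, since the article does not specify an actual wire format:

```python
import json
import time
from dataclasses import dataclass, asdict


@dataclass
class DataPoint:
    """One protocol-agnostic update: a single change in a single value."""
    name: str         # e.g. "Line3.Oven.Temperature"
    value: float
    timestamp: float  # seconds since the epoch


def to_feed_message(point: DataPoint) -> str:
    """Serialize a change for the cloud feed, regardless of the source protocol."""
    return json.dumps(asdict(point))


# A plant-side connector would map each source protocol onto the same message shape:
opc_reading = DataPoint("Line3.Oven.Temperature", 212.4, time.time())
modbus_reading = DataPoint("Pump7.FlowRate", 14.8, time.time())
for reading in (opc_reading, modbus_reading):
    print(to_feed_message(reading))
```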

Each client, in turn, can convert the data into whatever format is most convenient and appropriate for their application. Options include spreadsheets, databases, web pages or custom programs.

Better yet, this approach to remote monitoring is not necessarily limited to in-plant connections. Custom-developed WebSocket connectors small enough to fit on embedded devices such as temperature sensors or flowmeters could be placed at remote locations any distance from the plant. Then, using wired or cellular connections to the Internet, the devices could connect directly to the cloud via WebSocket tunnels, bypassing the traditional SCADA system if need be. Such high-performance connectivity would support secure, real-time M2M communications and meet essential requirements of the industrial Internet of Things (IoT).

Changes and Challenges

However you look at it, change is on the horizon for industrial process control systems. The current state of the art for networked control systems was made possible by dramatic technical breakthroughs in the 1980s and 1990s. Many industry experts say that we are now on the verge of similar breakthroughs in remote monitoring and supervisory control. Whether they call it cloud computing, Software as a Service (SaaS), Industry 4.0 or the Industrial Internet of Things (IIoT), most will agree that the biggest challenge right now is security.

New technology provides new capabilities, and it also presents new demands that may challenge our way of thinking. Accessing data from a plant or remote sensor halfway across the world requires a different approach to security than the one our current models were designed for. Yet there is no need to remain attached to the status quo if it does not truly meet those needs. These are engineering problems, and there are engineering solutions.

Bob McIlvride is the director of communications with Skkynet Cloud Systems Inc., Mississauga, Ontario, Canada. Skkynet provides secure, cloud-based remote monitoring services and can be reached at 888-628-2028 or at http://skkynet.com.
