Automation.com, a leading online publisher of automation-related content, recently ran an article on the value of pairing OPC UA with a good IIoT protocol like DHTP. The article discusses how OPC UA was initially expected to serve as an IIoT protocol, but more recently the trend seems to be towards using OPC UA at the plant level only. Other protocols, such as MQTT and AMQP, are being offered as candidates for connecting outside the plant, but they are not ideally suited to IIoT. This article explains why, and introduces 9 criteria for good IIoT data communication.
Since the beginning of the industrial revolution, automation has been a steadily growing trend for the manufacturing and process industries, to the joy of some and the dismay of others. On the one hand, automation is synonymous with lower production costs and higher quality, providing more consistent output with less physical labor. On the other hand, from time to time there is concern about job loss as machines replace unskilled labor, and put people out of work. As far back as 1779, so the story goes, a young weaver’s apprentice named Ned Ludd vandalized a couple of knitting machines, thus becoming the namesake of the Luddite movement, a group of skilled workers who violently protested one of the world’s first industrial automation initiatives.
Now there is a new automation revolution taking place that may have an even greater social impact. Thanks to new digital technologies like artificial intelligence, big data, robotics, satellite geopositioning, and others, jobs that we once thought only humans could do are now seen as potential targets for automation.
“In the past, automation was largely restricted to simple manual or procedural tasks,” said Carolyn Wilkins, Senior Deputy Governor of the Bank of Canada, in a recent speech to the Toronto Board of Trade. “Today’s technology makes it possible to automate an increasing number of cognitive and non-routine tasks across a wide range of industries.”
The impact of automation on virtually every employment sector for the near future was the subject of The Future of Employment by Carl Benedikt Frey & Michael Osborne of the University of Oxford. “According to our estimates around 47 percent of total US employment is in the high risk category,” the paper states in its conclusion. “We refer to these as jobs at risk – i.e. jobs we expect could be automated relatively soon, perhaps over the next decade or two.”
Some of the jobs most at risk were in categories like “Machine Setters, Operators, and Tenders” in various industries. This is what we might expect, given the recent robotics trend in manufacturing. More surprising were job categories like hotel desk clerk, agricultural inspector, bill collector, animal breeder, restaurant cook, and legal secretary. Twenty years ago, who would have imagined these occupations being automated? Yet most of them will be within the next twenty years, according to the study.
Where does that leave us? “What we need to do is embrace the technologies in areas where we can make a difference and promote productivity,” recommends Carolyn Wilkins. She mentioned in particular the STEM subjects (science, technology, engineering, and math) as “solid foundations that provide a platform for future learning.” Perhaps she is right. The Oxford study lists a number of occupational areas with a low chance of replacement, and engineering is among them. And for those with a more humanistic interest, health care, education, the arts, and entertainment are other options, as they also are not expected to be automated any time soon.
At Skkynet we are doing our part to make automation easy to embrace, by making our products and services convenient and affordable. And internally, we are always looking for ways to streamline our workflow. The more we automate the boring and repetitive jobs here in the office, the more time we have to do the cool, fun, and interesting stuff that keeps us at the leading edge.
There will be live demonstrations of DataHub, SkkyHub, and the ETK in two different areas of the Automate show at the McCormick Place in Chicago next week. The Automate show is one of the largest industrial automation shows in North America, with displays of robotics, vision and motion control, and other cutting-edge technologies that attract automation and control engineers, managers, and researchers from across the world.
A Renesas demo at the Renesas pavilion, Booth #866, is being powered by Skkynet’s SkkyHub service and ETK. The demo lets show attendees monitor the movement of a Festo linear piston from their mobile phones. The base-level control of the piston is through a PLC that is connected to a Renesas Synergy S7 chip running on a development board. The S7 chip has the Skkynet ETK loaded on it, which makes a connection to SkkyHub to provide the data and a user interface. Anyone can call up a URL on their smartphone and then view the data in a seamless connection.
“This demo makes the Industrial IoT come alive,” said Paul Thomas, President of Skkynet. “Everyone attending the Automate show has probably heard about the IIoT, and now they will have a chance to experience a secure-by-design implementation of it, first-hand.”
The Cogent demo will be shown at the OPC Foundation pavilion, Booth #2265. We will be demonstrating the latest features of the DataHub, in addition to an integrated solution using Red Lion’s mobile gateway and an embedded demo using Renesas Synergy S7 running Cogent’s beta implementation of OPC UA. Attendees will be able to control LEDs on the S7 demo board itself, as well as control a bank of lights on the booth. Additionally, they will be able to see output from the board’s light and motion sensors in their mobile displays.
Backing up the demos with insight, Xavier Mesrobian, Cogent’s VP of Sales and Marketing, will be presenting a talk, Share your Data Not your Network, at the Future of Automation Theater on Tuesday afternoon. “Both of our demos at this show rely on our secure-by-design technology,” said Mesrobian, “but few realize how revolutionary it is. When you are talking about security for the IIoT, most people think ‘VPN’. But that’s the wrong technology, by far. We want people to know that there is a better, safer, and more affordable alternative.”
Come and meet us, hear the talk, and see the demos. Members of the Skkynet and Cogent team will be at the Cogent area in the OPC Foundation pavilion, Booth #2265. Don’t forget to bring your smartphone!
State-of-the-art Coca-Cola plant uses DataHub scripts to integrate alarm data and reports.
One of the largest soft drink manufacturing plants in the world, Coca-Cola’s Ballina Beverages facility, recently installed the DataHub® from Cogent Real-Time Systems (Skkynet’s subsidiary), to log alarm data and create end-of-shift reports. The 62,000 square meter plant, located in Ballina, Ireland, uses the most up-to-date manufacturing automation systems available, and management is constantly looking for ways to improve them.
Some of the equipment used at Ballina Beverages is designed and manufactured by Odenberg Engineering. Odenberg, in turn, relies on their subsidiary, Tricon Automation, to handle the process control of the machinery.
In a recent upgrade to the system, the Odenberg/Tricon team chose the DataHub to construct custom log files to track and archive their alarms. They wanted to combine the live data from each triggered alarm with a text description of the alarm, and then log the results to a file. The alarms were being generated by an Allen-Bradley system from Rockwell Automation Inc., and the 1500 alarm descriptions were stored in an Excel spreadsheet. Each row of the final log would have to combine the time, date, and code of a triggered alarm with the corresponding description of that alarm.
After considering several different scenarios, the team found the most effective approach was to connect the DataHub to Rockwell Automation’s RSLinx using its OPC server, and then to read in the alarm condition strings from a text file (instead of from the spreadsheet), using a DataHub script. The same script writes the data to the log file. This worked so well that they decided to use another script to create end-of-shift reports.
“We got the basic system up and running in a few hours,” said Gus Phipps, team member from Odenberg, “which was good, because we were working under a tight deadline. Cogent helped us out with the DataHub scripting, but we were able to do most of the work ourselves. It went surprisingly quickly.”
“Using the DataHub’s scripting language let us customize it to exactly meet our needs,” said George Black, Tricon’s project manager. “It is very flexible, and yet completely robust. It is months now since the project was completed, and the DataHub continues working away merrily every day, just doing its job. We plan to use it again in other projects very soon.”
Closed-Loop Control of Weaving Machines using the Cogent DataHub
From ancient times people have been using looms to weave cloth, canvas, and carpets. As the centuries passed, weaving became one of the first tasks to be mechanized in the industrial revolution. The various repetitive tasks such as raising and lowering rows of threads (the warp), and passing a shuttle with the cross-thread (the weft) back and forth between them, were a good fit for simple machines. Today, high-speed air-jet weaving machines are fully automated, and capable of inserting a weft thread between warp strands at a rate of 2000 times per minute.
One of the challenges of automating a weaving machine is maintaining proper tension on the threads. With each pass of the weft, a certain amount of tautness must be applied to the warp to keep the woven fabric uniform at all times. Early looms used the weaver’s body weight or hanging stones to keep the warp taut, but an air-jet machine needs a more sophisticated technology.
This challenge has been addressed by the students and faculty of the Institut für Textiltechnik (ITA) der RWTH Aachen University in Aachen, Germany, who are investigating how to optimize the tension of warp threads in an air-jet weaving machine.
“Our main research goal is self-optimization of the warp tension,” said Dr. Ing. Yves-Simon Gloy, Research Manager at ITA, “to enable the loom to set the warp tension automatically at a minimum level without reducing the process stability.” Keeping the proper tension maximizes the speed of the process, while yielding the highest possible quality of fabric.
“We started by creating an automated sequence routine, with the help of regression models for a model-based setting of the loom, and implemented in the weaving process,” said Dr. Gloy. “The automated sequence routine was implemented using the ibaPADU-S-IT as a fast, stand-alone control system and the software ibalogic from iba AG in Fürth, Germany.”
Once the necessary hardware was in place, the team needed to choose a way to monitor and control the loom. They got in touch with Logic Park, in Heimberg, Switzerland, who recommended the Cogent DataHub® as the ideal solution. Connecting the DataHub to the iba System OPC server, Dr. Gloy and his team were able to use WebView™ to quickly build a web HMI.
“The DataHub was the perfect tool to develop the new HMI – easy to install, easy to handle. I got very fast results and the control of the loom via the web browsers is totally stable,” said Dr. Gloy. “Our students are very impressed by the DataHub and its functionality. We can even view the HMI on a tablet, which is beyond state-of-the-art for a textile machine. Now we are investigating new applications for other textile machines in our Institute.”
New technologies such as Software as a Service, the Internet of Things, and cloud computing bring new challenges to industrial process control, but there are solutions.
Interest in using cloud computing — also known as Software as a Service (SaaS) — to provide remote access to industrial systems continues to rise. Vendors and company personnel alike point to potential productivity improvements and cost savings as well as convenience. Operators and plant engineers may want to receive alarms and adjust heating controls while moving around the plant. Managers would like to see production data in real time — not just in end-of-shift or daily reports. Hardware vendors could benefit from getting live readings from their installed equipment for maintenance and troubleshooting operations.
Some industrial processors are attempting to provide this kind of window into their production systems. Yet, many question the wisdom of opening up a plant’s mission-critical control network to the possibility of malicious attack or even misguided errors. With a proper understanding of what is at stake, what is being proposed and how it can best be implemented, you can better decide whether remote access to your production data could benefit your company.
Security First for Industrial Networks
When talking about remote access to plant data, the first concern is security. Any approach that exposes the control system to unauthorized entry should be off the table. One popular approach is to secure the network against any potential intruders and open it only to trusted parties. Connections into the plant typically originate from smartphones, tablets, laptops or desktop computers. These systems usually are running a human-machine interface (HMI), remote desktop application, database browser or other proprietary connector.
In most cases, the plant engineering staff or IT department can grant client access to the network via a virtual private network (VPN), so authorized users can get the data they need. However, a typical VPN connection provides link-layer integration between network participants. This means that once on a network, an outsider has access to all other systems on the network. Thus, the company must either fully trust each person who is granted access to the network, or the company must task the IT manager with securing and protecting the resources within the network.
It would be unwise to risk giving visitors full access to everything that a VPN exposes. Using a VPN this way is a little like having a visitor come into your plant. Suppose a service technician arrives at the gate saying he needs to check a piece of equipment. You could just tell the guard to check his credentials, and if he checks out, give him a hardhat and directions, and send him in. That is the limited-security approach. A better way would be to provide a guide to ensure that the technician finds his destination, does his work, and leaves with only the information he came to get. It takes more effort and planning, but if you are going to allow someone to enter the premises, such effort is necessary to ensure security.
Better than VPN
An even better approach is to only allow access to the data itself. Consider this: the user of the data — be it vendor, customer or even corporate manager — does not need access to the whole network. Instead, they just need the data. So, rather than allowing a client to log on via a VPN connection while the IT manager works to secure confidential areas of the network from the inside, wouldn’t it be better to provide access to the data outside of the network altogether?
To continue our analogy, this would be like the guard handing the service technician exactly the data he needs when he arrives at the gate. There is no need to open the gate and no need to let him into the plant. In fact, the service company, vendor or other authorized party could request the data be sent to their own location, so they do not even have to go to the plant in the first place. This approach to remote monitoring is far more secure.
Is such a scenario realistic? Yes, if you use the right technology in the right way. For example, WebSocket is a protocol that provides full-duplex communication over TCP. A WebSocket connection begins as an ordinary HTTP request, but unlike HTTP, once the connection is established, client and server can exchange data indefinitely. The protocol also supports SSL/TLS encryption, a well-tested security layer. Thus, WebSocket technology can be used to open and maintain a secure data tunnel over TCP from a plant to a cloud server without opening any inbound ports in the plant firewall. Once the tunnel connection is established, data can flow bi-directionally.
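The key to this pattern is connection direction: the plant dials out to the cloud server, and thereafter data can move both ways on that one socket, so no inbound firewall port is ever opened. The sketch below demonstrates just that connection-direction idea with plain TCP in Python; a real deployment would use a WebSocket library to add the HTTP handshake and TLS on top, and the echoing “cloud server” here is only a stand-in.

```python
import asyncio

async def cloud_stub(reader, writer):
    # Stand-in for the cloud service: receive one message from
    # the plant and send one back on the same connection.
    data = await reader.readline()
    writer.write(data)
    await writer.drain()
    writer.close()

async def demo():
    # Start the stand-in "cloud" server on an ephemeral local port.
    server = await asyncio.start_server(cloud_stub, "127.0.0.1", 0)
    port = server.sockets[0].getsockname()[1]

    # The "plant" side initiates the OUTBOUND connection...
    reader, writer = await asyncio.open_connection("127.0.0.1", port)
    writer.write(b"temp=72.5\n")      # ...pushes a reading out...
    await writer.drain()
    reply = await reader.readline()   # ...and can still receive data back.
    writer.close()

    server.close()
    await server.wait_closed()
    return reply

result = asyncio.run(demo())
```

Because the plant end only ever makes outgoing connections, the plant network stays invisible from the Internet while the data tunnel stays open.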
Isolating the Industrial Process Data
Such a data-centric approach to remote monitoring and supervisory control has several benefits. One key advantage is that the process can run in complete isolation from the remote client. Low-level control — and, in fact, all systems within the plant — remain completely invisible to the remote clients. The only point of contact for the remote client is the selected data set being streamed from the plant, and that data resides in the cloud.
While nobody seriously imagines making low-level control changes over a cloud connection, a solution based on WebSocket technology could allow both read-only and read/write client connections for those applications where remote changes are deemed acceptable. Authorized personnel then would have the ability to effect change in plant processes for diagnostic or maintenance purposes via a secure connection. This approach would not require any open firewall ports, so the plant remains invisible to the Internet.
Regardless of the intended use of the data, a correctly provisioned WebSocket connection to the cloud provides the process isolation needed to provide access to data without jeopardizing your in-plant systems.
Any Data Protocols
Another advantage to this approach is that it can be protocol-agnostic. Ideally, the system would carry only the raw data over TCP in a simple format: name, value and timestamp for each change in value. The connector would convert the plant protocol, such as OPC or Modbus, to a simple data feed to the cloud. Requiring a minimum of bandwidth and system resources, the data would flow in real time to all registered clients.
Each client, in turn, can convert the data into whatever format is most convenient and appropriate for their application. Options include spreadsheets, databases, web pages or custom programs.
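As a concrete sketch, each change record in such a feed could be a single JSON object carrying just the point name, value, and timestamp. This format is illustrative only, not a documented protocol, but it shows how little needs to cross the wire per change.

```python
import json
import time

def encode_change(name, value, timestamp=None):
    """Encode one data-point change as a compact JSON record:
    name, value, and timestamp in seconds since the epoch."""
    return json.dumps({
        "name": name,
        "value": value,
        "ts": timestamp if timestamp is not None else time.time(),
    }, sort_keys=True)

def decode_change(line):
    """Parse a record back into a (name, value, ts) tuple, which the
    client can then map to a spreadsheet row, database insert, etc."""
    rec = json.loads(line)
    return rec["name"], rec["value"], rec["ts"]
```

A protocol converter at the plant would call the equivalent of `encode_change` for each OPC or Modbus value change, and every registered client would decode the same stream into whatever form suits its application.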
Better yet, this approach to remote monitoring is not necessarily limited to in-plant connections. Custom-developed WebSocket connectors small enough to fit on embedded devices such as temperature sensors or flowmeters could be placed at remote locations any distance from the plant. Then, by wired or cellular connections to the Internet, the devices would connect directly to the cloud via WebSocket tunnels, without going through the traditional SCADA system, if need be. Such high-performance connectivity would support secure, real-time M2M communications and meet essential requirements of the industrial Internet of Things (IoT).
Changes and Challenges
However you look at it, change is on the horizon for industrial process control systems. The current state of the art for networked control systems was made possible by dramatic technical breakthroughs in the 80s and 90s. Many industry experts say that we are now on the verge of similar breakthroughs in remote monitoring and supervisory control. Whether they call it cloud computing, Software as a Service (SaaS), Industry 4.0 or the Industrial Internet of Things (IIoT), most will agree that the biggest challenge right now is security.
New technology provides new capabilities, and it also presents new demands that may challenge our way of thinking. Accessing data from a plant or remote sensor halfway across the world needs a different approach to security than our current models were designed for. Yet, there is no need to remain attached to the status quo if it does not truly meet the needs. These are engineering problems, and there are engineering solutions.
Bob McIlvride is the director of communications with Skkynet Cloud Systems Inc., Mississauga, Ontario, Canada. Skkynet provides secure, cloud-based remote monitoring services and can be reached at 888-628-2028 or at http://skkynet.com.