Real-Time Manufacturing Trends

The world of industrial automation is changing rapidly, generating a need for real-time manufacturing.  Most industrialized nations are seeing their economies shift from labor-rich to labor-scarce, forcing plants to automate to keep costs down.  At the same time, consumers are demanding more customized products and sustainable use of resources, which requires smarter and more versatile production lines.  Adding to the challenge, obtaining raw materials and parts has become less predictable since the start of the pandemic, creating a need for more dynamic and flexible supply chains.

Responding to these circumstances, executives and managers are increasingly adopting new ways of managing their businesses, according to Bill Lydon at Automation.com.  In a recent report, The Digitalization Dozen, he wrote: “The foundations of manufacturing and production are being reshaped by their integration into a comprehensive real-time business system, creating more efficient and responsive production to increase sales and profits.”

Real-time data

Real-time business systems rely on real-time data.  ERP (Enterprise Resource Planning) systems of the past were not directly synchronized with operations, providing data that was weeks or months old.  That led to the use of MES (Manufacturing Execution Systems), which are quicker but add a layer of cost, complexity, and fragility.  What is needed, according to Lydon, is to rebuild the enterprise as a real-time manufacturing business.

A few pioneering companies have read the writing on the wall and are now looking at ways to implement the necessary changes.  Melanie Kalmar, spokesperson for Dow Corporation, said, “We are really focused on being a real-time company, using and leveraging the data we have to drive better decisions, be a more sustainable company, and a favored company.”

Many others will follow, says Lydon.  He explains how digital communication in real time unifies the corporate vision by providing accurate and timely data for interested parties throughout the enterprise, as well as among suppliers and customers.  This data transparency keeps employees at all levels well informed, improving their decisions, which leads in turn to greater success.

Closed-loop operations

Lydon envisions a digital manufacturing architecture that is real-time, synchronized, and optimized through the use of “closed loop operations of IT and Operational Technology (OT) groups.”  By this he means that data from sensors, field equipment, edge devices, and plant or process operations is passed in real time to business systems such as digital twin models and analytical tools, including artificial intelligence engines.  These systems pass commands back to the OT systems in a closed loop, all in real time.
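To make the shape of that loop concrete, here is a minimal sketch in Python.  The function names, tag names, and threshold values are purely illustrative assumptions, not any vendor's API: live values flow from the plant to an analytic model, and a computed setpoint flows back.

    import time

    def read_sensors():
        """Placeholder for real-time values arriving from OT (sensors, edge devices)."""
        return {"reactor.temp": 81.4, "reactor.flow": 12.7}

    def analyze(values):
        """Placeholder for the IT side: a digital twin or AI engine computing a new setpoint."""
        # Simple illustrative rule: trim flow if the temperature runs high.
        return {"reactor.flow.setpoint": 11.0 if values["reactor.temp"] > 80.0 else 13.0}

    def write_setpoints(setpoints):
        """Placeholder for commands passed back to the OT systems."""
        print("applying", setpoints)

    for _ in range(3):                 # a real system would be event-driven, not a timer loop
        live = read_sensors()          # OT -> IT: real-time data
        commands = analyze(live)       # IT: digital twin / analytics
        write_setpoints(commands)      # IT -> OT: closed-loop command
        time.sleep(1)

In practice such a loop would run over a secure, real-time communication layer rather than a simple timer, but the direction of the data flow is the same: plant data out, commands back.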

Needless to say, this must all be based on secure, bidirectional real-time data communications.  Security is essential because plant and operations networks must be kept isolated, completely separate from business networks.  And robust, bidirectional real-time communication is necessary for closed-loop performance.  Otherwise it would be like driving a car with a three-second lag between the steering wheel, brake pedal, and tires―a recipe for disaster.

Other trends

Two other trends in industrial automation are helping make real-time manufacturing work.  The first is the widespread use of open standards like TCP and OPC (Open Platform Communications).  Open data communication standards like these give multiple vendors a chance to compete and contribute, which brings new ideas and more product choices for system designers and integrators.  Industrial systems are complex, with a wide variety of sensors, devices, tools, machines, and other components that need to be connected seamlessly.  Standard protocols make these connections possible.

The second is a move towards less programming, using off-the-shelf software and services.  These make it easier, faster, and cheaper for a system integrator to build, test, and deliver a working automation system. A generation of engineers who had to build solutions from scratch is retiring, just as systems are growing more complex.  The new generation understands the value of using ready-made tools to quickly implement solutions, rather than starting from the ground up on each new project.

From our perspective, these trends all point towards a need for products and services that provide secure, real-time industrial data communications.  Our latest release, DataHub 10, runs either on-site or in the cloud, connects OT to IT securely through DMZs, and supports real-time networking of live and historical data. It is well positioned to lead the way for digital and real-time manufacturing.

White House Pushes for Security

Since the ransomware attack on the Colonial Pipeline last month, the US government has become more vocal on the need for industrial cybersecurity. A recent memo from the White House to corporate executives and business leaders across the country urges them to protect their companies against hackers. Among the action items is the need to segment networks, to isolate OT from IT.

“It’s critically important that your corporate business functions and manufacturing/production operations are separated,” the memo states, “and that you carefully filter and limit internet access to operational networks, identify links between these networks and develop workarounds or manual controls to ensure ICS networks can be isolated and continue operating if your corporate network is compromised.”

The memo says that although the government is leading the fight against cyber attacks of all kinds, the private sector is also expected to play its part. Companies are urged to back up data, update systems, and test their response plans. The memo also lists five best practices from the president’s Improving the Nation’s Cybersecurity Executive Order, including:

  1. Multifactor authentication
  2. Endpoint detection
  3. Response to an incursion
  4. Encryption
  5. A capable security team

Isolate Control Networks

Most of the recommendations could apply to any system or network exposed to the Internet, but the White House also included one directly related to industrial systems: Segment your networks to protect operations. Industrial control system networks, it says, should be isolated so they can continue operating even when the management network is compromised.

This was the case with the Colonial Pipeline incident last month. Although the hack caused turmoil in the company and a week of problems for the whole East Coast of the US, it could have been much worse. If the hackers had been able to take control of the pipeline itself, we might have witnessed physical damage both to property and the environment.

To avoid such problems, isolating control networks is critical. This is best accomplished using a DMZ, a “demilitarized zone” that separates control systems from management systems. Using a DMZ ensures that there is no direct link between corporate networks and control networks, and that only known and authenticated actors can enter the system at all.

Skkynet recommends using a DMZ for OT/IT networking, and provides the software needed to seamlessly pass industrial data across a DMZ-enabled connection. Most industrial protocols require opening a firewall to access the data, but Skkynet’s patented DataHub architecture keeps all inbound firewall ports closed on both the control and corporate sides, while still allowing real-time, two-way data communication through the DMZ.
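As a rough illustration of the idea (this is not Skkynet's implementation, and the hostnames and ports are hypothetical), the following Python sketch shows how a relay process inside the DMZ can accept outbound connections from both sides, so that neither the control network nor the corporate network ever opens an inbound firewall port:

    import socket, threading

    def relay(port_ot=9001, port_it=9002):
        """Runs inside the DMZ: wait for one outbound connection from each side,
        then forward bytes in both directions."""
        def accept_one(port):
            srv = socket.create_server(("", port))   # the only listener lives in the DMZ
            conn, _ = srv.accept()
            return conn

        ot_conn = accept_one(port_ot)   # plant side dials out to the DMZ
        it_conn = accept_one(port_it)   # corporate side dials out to the DMZ

        def pump(src, dst):
            while (data := src.recv(4096)):
                dst.sendall(data)

        threading.Thread(target=pump, args=(ot_conn, it_conn), daemon=True).start()
        pump(it_conn, ot_conn)

    def plant_side_client():
        """Runs on the OT network: an outbound-only connection to the DMZ relay."""
        s = socket.create_connection(("dmz-relay.example.local", 9001))  # hypothetical host
        s.sendall(b"reactor.temp=81.4\n")

A production setup would wrap these sockets in TLS and authenticate both ends; the point here is simply that every connection originates from inside a protected network and terminates in the DMZ.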

We are pleased to see support for securing industrial control systems coming from the White House and US government, as well as governments and agencies throughout the industrialized world. A more secure environment will keep costs down and production running smoothly by keeping hackers out of our control systems.

Emergency at Colonial Pipeline

Another ransomware attack hit the headlines last week.  This time it’s Colonial Pipeline, by some estimates the largest fuel pipeline in the USA, 8,850 km long, with a carrying capacity of up to 3 million barrels of petroleum products per day.  The attack has prompted the US Department of Transportation to issue an emergency declaration, easing restrictions on overland transport of fuel by truck, a necessary but high-cost alternative for the company.

Colonial is wisely reluctant to release details, so we might never know exactly who did this or how it happened.  But that’s not the point.  One way or another, a malicious actor may have compromised a node on the IT network, which could then have been used as a staging ground to launch an attack on the OT (Operational Technology) network.

What we do know is how to prevent that kind of attack from spreading.  There should be no need for emergency declarations.  As we have discussed previously, most people in the know―from government regulators and standards agencies to top management and on-site engineering staff―understand that you must isolate your networks.  In this age of cloud, IoT, and digital transformation, when it is becoming possible to connect everything together, we also need to implement ways to keep things separate.

A Well-Known Solution

Isolating a control network from an IT network is not difficult.  The technology has been around for decades.  It involves inserting a defensive layer, a DMZ (Demilitarized Zone), between the two networks, and using firewalls to protect them.

The challenge lies in moving production data securely across the DMZ in real time.  This is where Skkynet’s DataHub technology shines.  The DataHub can connect to equipment and SCADA systems on the industrial side, and pass that data through the DMZ to the IT side, without opening any firewall ports on either side.

We hope Colonial Pipeline recovers quickly from this emergency, and that oil and gas will soon begin to flow again up the East Coast of the USA.  Meanwhile, we encourage others to heed this wake-up call.  The attack surface of an entire company is huge.  Persistent hackers are bound to find their way in, eventually.  The best way to prevent damage to the production systems is to isolate the corporate network from the control network and insert a DMZ.  They may get that far, but no farther.

OPC Attack Surface Exposed

Industrial systems, once of little interest to hackers, are now targeted on a regular basis, making security an ever-growing concern.  At the same time, as more companies update and add to their control systems, the OPC industrial protocol continues to grow in popularity. So it would make sense to ask the question, how vulnerable to attack is an industrial system that uses OPC?

A recent white paper by Claroty, Exploring the OPC Attack Surface, discusses a number of security vulnerabilities in the products of three well-known suppliers of OPC software.  These issues, reported to the Industrial Control System Cyber Emergency Response Team (ICS-CERT), could “expose organizations to remote code execution, denial-of-service conditions on ICS devices, and information leaks,” according to the report.

The companies involved have isolated the bugs, fixed them, and issued upgrades to their software, but the underlying problem remains.  All software has bugs, and OPC software is no exception.  Every connection to the Internet risks exposing an attack surface that could be exploited.

Unforeseen requirements

Like most industrial protocols, OPC was conceived and developed before the advent of Industrie 4.0 and the Industrial IoT.  Back then, nobody seriously considered connecting their process control systems to the Internet.  All production equipment and networks were entirely disconnected (“air-gapped”) from the outside world, or at least secured behind closed firewalls.

Connecting a factory or industrial process to an IT department or cloud service introduces risk.  The design of OPC requires an open firewall port to make a connection.  Most companies are currently using workarounds to overcome this Achilles heel, but none of them are adequate.  Using a VPN simply expands the security perimeter of a control network to the outside world of phishing emails and ransomware attacks. Using an IoT gateway to connect an OPC server to a cloud service still requires connecting the plant network to the Internet in some way.

The most secure approach

Instead, the most secure way to get data from OPC servers running on a plant network is by using one or more DMZs.  According to a recent NIST report, “The most secure, manageable, and scalable control network and corporate network segregation architectures are typically based on a system with at least three zones, incorporating one or more DMZs.”

Using a DMZ makes it possible to isolate the plant from the Internet. Although OPC alone cannot connect through multiple hops across a DMZ, adding Skkynet’s DataHub technology makes it possible.  A DataHub tunnel for OPC can establish secure, real-time data flow across the connection, without opening any inbound firewall ports.  This effectively cuts the attack surface to zero.  Even if there is an undiscovered bug lurking somewhere in an OPC server, there is much less risk.  After all, hackers cannot attack what they can’t see.

NIS 2 Raises the Bar for Network Security

Key directive: One or more DMZs are needed for the most secure, manageable, and scalable segregation of control and corporate networks.

The recent adoption of a new NIS 2 Directive by the European Commission is a sign of the times.  Beset by a worldwide pandemic, many companies across the EU have turned to digital technologies to keep their workforce productive and to facilitate access to valuable production data.  This has led to unprecedented levels of industrial data being passed between company networks and across the Internet, increasing the risk of exposure to malicious intruders.

To combat the threat, the European Commission has accepted revisions to the Directive on Security of Network and Information Systems (NIS), now calling it NIS 2. Among other things, this document mandates a number of basic security elements, including standards for networking data between the production and corporate levels of a company.

The Commission has tasked ENISA, the European Union Agency for Network and Information Security, with implementing the standards.  In pursuit of this mandate, ENISA relies on the expertise of three well-known bodies (NIST, ISO, and ISA) to provide detailed descriptions of how network security should be implemented, as published in its Mapping of OES Security Requirements to Specific Sectors document.

Using DMZs

For example, the recommended way to bring process data into the corporate office is summed up in NIST document SP 800-82.  It says: “The most secure, manageable, and scalable control network and corporate network segregation architectures are typically based on a system with at least three zones, incorporating one or more DMZs.”

These three zones are the control zone (OT), the corporate zone (IT), and the DMZ itself.  The document describes the value and use of firewalls to separate these zones, and to ensure that only the correct data passes from one to the other. Using a DMZ ensures that there is no direct link between corporate networks and control networks, and that only known and authenticated actors can enter the system at all.
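As an illustration of the three-zone principle only (the zone names and rules below are our own assumptions, not taken from the NIST document), the policy can be reduced to a small rule table: each zone may exchange data with the DMZ, and nothing passes directly between the control and corporate zones.

    # Hypothetical three-zone firewall policy: flows are only permitted between a
    # zone and the DMZ, never directly between control (OT) and corporate (IT).

    ALLOWED_FLOWS = {
        ("control", "dmz"),     # OT equipment publishes outward to the DMZ
        ("corporate", "dmz"),   # IT applications connect outward to the DMZ
    }

    def is_permitted(src_zone: str, dst_zone: str) -> bool:
        """Return True only if the segregation policy allows this flow."""
        return (src_zone, dst_zone) in ALLOWED_FLOWS

    assert is_permitted("control", "dmz")
    assert not is_permitted("corporate", "control")   # no direct IT -> OT path
    assert not is_permitted("control", "corporate")   # no direct OT -> IT path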

Skkynet recommends using a DMZ for OT/IT networking, and provides the software needed to seamlessly pass industrial data across a DMZ-enabled connection.  Most industrial protocols require opening a firewall to access the data, but Skkynet’s patented DataHub architecture keeps all inbound firewall ports closed on both the control and corporate sides, while still allowing real-time, two-way data communication through the DMZ.

Unlike MQTT, which cannot reliably daisy-chain connections across the three zones as ENISA recommends, the DataHub maintains a complete copy of the data and connection status from the source to the final destination.  Thus it provides accurate indicators of data reliability at each zone in the system, while making the data itself available.
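To illustrate what that means in practice (the structure and field names below are our own sketch, not the DataHub's internal data model), each hop in the chain can attach its own connection status to the data it forwards, so a consumer in the corporate zone can tell whether a value is live or stale:

    from dataclasses import dataclass, field

    @dataclass
    class DataPoint:
        name: str
        value: float
        timestamp: float
        hops: list = field(default_factory=list)   # connection status recorded per zone

    def forward(point: DataPoint, zone: str, link_ok: bool) -> DataPoint:
        """Each relay appends its own connection status before passing the point on."""
        point.hops.append({"zone": zone, "connected": link_ok})
        return point

    p = DataPoint("reactor.temp", 81.4, timestamp=1718000000.0)
    p = forward(p, "dmz", link_ok=True)        # control zone -> DMZ
    p = forward(p, "corporate", link_ok=True)  # DMZ -> corporate zone

    is_live = all(h["connected"] for h in p.hops)   # consumer can tell live from stale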

We applaud the European Commission for its no-nonsense stance on cybersecurity with NIS 2, and encourage all EU members, indeed any company expanding its use of corporate networking, Industrie 4.0, or Industrial IoT technologies to adhere closely to the guidance of ENISA, and to implement three-zone security using one or more DMZs.

A Sustainable Future

After a year of uncertainty, confusion, and disruption, suddenly people are talking about sustainability.  Public opinion, government policy, and economic realities seem to be converging on this theme, pushing us towards creating a more sustainable future.  The pandemic has underscored the need for us to be more effective and resilient in many ways.  For industry, this points to digitization.

At a recent AVEVA World Digital event, Craig Hayman, AVEVA’s CEO, said, “As business leaders, it’s our duty to go further and faster than we believed possible to realize a sustainable future through digital transformation.”  Summing it up, he calls this the “Decade to Deliver.”

He is talking about delivering on the promises of the Industrial IoT, Industrie 4.0, and digitization.  We are at a critical juncture.  Many people are looking for jobs, and there is plenty of work to be done.  Thankfully, the necessary technologies have already been developed and are ready for use.

Staying connected

“Covid-19 is a massive catalyzer of digital adoption, because people want to be more efficient, we want to be more resilient. Therefore, things have to be connected,” said Jean-Pascal Tricoire, Chairman and CEO of Schneider Electric, in that same online conference.  “We want to operate things from remote, to enforce social distancing.”

The need is there, and so are the resources.  “All of this has been accelerated,” continued Tricoire, “by the massive recovery packages like no other in history, put in place by countries, and a large part of those are about digital and green.  Because countries have understood that you can’t dissociate a step-change to sustainability from digitization.”

In the AVEVA World Digital presentations, the links from digital to sustainable were made clear through example after example.  Emerging technologies for carbon capture, plant optimization, circular systems, remote access, solar parks, wind farms, decentralized power grids, and more―all rely heavily on digitized data communications.  Since most CO2 emissions come from industry, transportation, and buildings, securely connecting them in real time to IT platforms empowered by AI promises to make them greener and more sustainable.

The Great Acceleration

Mike Walsh, futurist, author, and CEO of Tomorrow, explained how the pandemic has opened new opportunities.  It has unleashed, in his words, “the great acceleration.”  We are now living a full decade ahead of the predictions, he says, in a space that we could only imagine twelve months ago. He sees three rules in play:

  1. It is no longer “digital disruption”; now it’s digital delivery. We are all disruptors now. If you are not a digital business, you are no longer in business.
  2. There is no such thing as “remote work”, just work. Each of us has more mobility and autonomy than ever before, and that will require documenting our decision-making, relying on data, and acting on it more quickly.
  3. AI will not destroy jobs, but it will change them. Rather than doing work, we will be increasingly called upon to design work. We will need to bring more of our humanity to the table.

Who would have guessed that a global pandemic would accelerate the need for digitization?  Whatever the reason, all this digital data needs to be connected, and this is where Skkynet shines.  We have the tools and experience to meet current and future demand for the secure, real-time data communications used in remote access and in the convergence of OT, IT, and the cloud.  We are playing our part to help ensure a more sustainable future for industry and for the planet.