
Will Low Oil & Gas Prices Prod an IIoT Embrace?

There's no doubt about it: oil prices have dropped over the past couple of years. Even if you don't follow the news or the markets, you can tell by prices at the pump. Now averaging below $50 per barrel, the price of oil is a far cry from its heady climb to $140 per barrel in 2008, or even the $80 – $100 range of 2010 to 2014.

As good as this news is to anyone who drives a car or takes the occasional flight, and as helpful as it might be in easing pressure on the economy as a whole, oil and gas companies have had to scramble to cut costs. That's OK for the short term, according to Craig Resnick at the ARC Advisory Group, but over the medium to long term they need to find a new and different way of working. And this, he says, means embracing the Industrial IoT.

In a recent blog, The Oil and Gas Industry “New Normal” Pricing Justifies Greater IIoT Investment, Resnick said, “To increase capital efficiency and profitability; reduce marginal costs; minimize downtime; improve health, safety and environmental conditions; and capture the knowledge of the retiring workforce and productivity gains hidden in data and workflow silos; the oil & gas industry must embrace IIoT and digital transformation fully, from assets to the oil field; and from design to process and operations.”

Whew! That's quite a list of benefits. And look how deeply it needs to penetrate the industry. As Resnick points out later in the article, that level of integration is impossible without a corresponding convergence of IT and OT. Done well, this combination results in a complete end-to-end solution that connects the sophisticated data processing and analytical experience of IT to the hard data coming from production machinery—be it legacy equipment that has been hammering away for decades or newly installed systems with the latest digital technologies.

Resnick’s vision of a full embrace of IIoT may seem far-fetched to old-timers, indeed to anyone who was responsible for designing industrial automation systems 20 or even 10 years ago. But “new normal” pricing in the oil & gas sector has set the bar to a point where people have to pull themselves out of their old mindsets. The rewards are tempting—the benefits of IIoT may bring the industry into a new era of prosperity. Who knows? Five or ten years from now people may wonder how anyone was able to turn a profit without it.

At Skkynet, the shift to IIoT has been fast-paced, and yet still somehow evolutionary. Sure, there are technical challenges, and security is a real issue. But whenever we step back from our work and take a look at what we are creating, we realize that it really can, in the words of Resnick, “support end-to-end process excellence, with enterprise integration and visibility that leverages existing systems and the strengths of industrial products.”

What is Edge Processing anyway?

Part 12 of Data Communication for Industrial IoT

Edge processing refers to the execution of aggregation, data manipulation, bandwidth reduction and other logic directly on an IoT sensor or device.  The idea is to put basic computation as close as possible to the physical system, making the IoT device as “smart” as possible.

Is this a way to take advantage of all of the spare computing power in the IoT device?  Partially.  The more work the device can do to prepare the data for the cloud, the less work the cloud needs to do.  The device can convert its information into the natural format for the cloud server, and can implement the proper communication protocols.  There is more, though.

Data Filter

Edge processing means not having to send everything to the cloud.  An IoT device can deal with some activities itself.  It can’t rely on a cloud server to implement a control algorithm that would need to survive an Internet connection failure.  Consequently, it should not need to send to the cloud all of the raw data feeding that algorithm.

Let’s take a slightly contrived example.  Do you need to be able to see the current draw of the compressor in your smart refrigerator on your cell phone?  Probably not.  You might want to know whether the compressor is running constantly – that would likely indicate that you left the door ajar.  But really, you don’t even need to know that.  Your refrigerator should recognize that the compressor is running constantly, and it should decide on its own that the door is ajar.  You only need to know that final piece of information, the door is ajar, which is two steps removed from the raw input that produces it.
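To make the idea concrete, here is a minimal sketch of what that on-device logic might look like. It is not taken from any real appliance; the sensor and uplink functions are hypothetical stubs, and the thresholds are invented for illustration.

```python
import time

CURRENT_THRESHOLD_AMPS = 0.5       # above this draw, treat the compressor as running
DOOR_AJAR_AFTER_SECONDS = 15 * 60  # continuous running long enough to suggest an open door

def read_compressor_current():
    """Hypothetical sensor read: compressor current draw in amps."""
    raise NotImplementedError

def publish_to_cloud(event):
    """Hypothetical uplink: only the final, derived event is ever sent."""
    raise NotImplementedError

def monitor():
    running_since = None
    reported = False
    while True:
        running = read_compressor_current() > CURRENT_THRESHOLD_AMPS
        if not running:
            running_since, reported = None, False
        elif running_since is None:
            running_since = time.time()
        elif not reported and time.time() - running_since > DOOR_AJAR_AFTER_SECONDS:
            # Two steps removed from the raw current samples, which never leave the device.
            publish_to_cloud({"event": "door_ajar"})
            reported = True
        time.sleep(10)
```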

Privacy

This has privacy and information security implications.  If you don’t send the information to the Internet, you don’t expose it.  The more processing you can do on the device, the less you need to transmit on the Internet.  That may not be a big distinction for a refrigerator, but it matters a lot when the device is a cell tower, a municipal water pumping station or an industrial process.

Bandwidth

Edge processing also has network bandwidth implications.  If the device can perform some of the heavy lifting before it transmits its information, it has the opportunity to reduce the amount of data it produces.  That may be something simple, like applying a deadband to a value coming from an A/D converter, or something complex, like performing motion detection on an image.  In the case of the deadband, the device reduces bandwidth simply by not transmitting every little jitter from the A/D converter.  In the case of the motion detection, the device can avoid sending the raw images to the cloud and instead just send an indication of whether motion was detected.  Instead of requiring a broadband connection, the device could use a cellular connection and never get close to its monthly data quota.
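As an illustration of the simple case, here is one way a deadband filter might be written. The deadband width is arbitrary; the point is that only readings that move beyond it are ever transmitted.

```python
class DeadbandFilter:
    """Suppress A/D jitter: pass a value through only when it moves beyond the deadband."""

    def __init__(self, deadband):
        self.deadband = deadband
        self.last_sent = None

    def filter(self, value):
        """Return the value if it should be transmitted, otherwise None."""
        if self.last_sent is None or abs(value - self.last_sent) > self.deadband:
            self.last_sent = value
            return value
        return None

# Jitter around 20.0 produces only two transmissions: 20.00 and 21.00.
f = DeadbandFilter(0.25)
for reading in [20.00, 20.05, 19.95, 20.10, 21.00, 21.10]:
    value = f.filter(reading)
    if value is not None:
        print("transmit", value)
```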

Data Protocol

There is just one thing to watch for.  In our example of the motion detection, the device probably wants to send one image frame to the cloud when it detects motion.  That cannot be represented as a simple number.  Generally, the protocol being used to talk to the cloud server needs to be rich enough to accept the processed data the device wants to produce.  That rules out most industrial protocols like Modbus, but fits most REST-based protocols as well as higher-level protocols like OPC UA and MQTT.
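For example, a motion event with a single captured frame might be packaged as a JSON document, which any sufficiently rich protocol (a REST POST body, an MQTT message, or an OPC UA structured value) can carry. This is only a sketch, and the field names are invented.

```python
import base64
import json
import time

def build_motion_event(jpeg_bytes):
    """Package a detected-motion event plus one captured frame as a JSON payload.

    A register-oriented protocol like Modbus has nowhere to put a structure like
    this; a REST body or an MQTT message can carry it as opaque text.
    """
    return json.dumps({
        "event": "motion_detected",
        "timestamp": time.time(),
        "frame_jpeg_base64": base64.b64encode(jpeg_bytes).decode("ascii"),
    })
```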

Continue reading, or go back to Table of Contents

Is MQTT the Answer for IIoT?

Part 8 of Data Communication for Industrial IoT

MQTT, originally short for MQ Telemetry Transport, is a publish/subscribe messaging protocol that was created for resource-constrained devices over low-bandwidth networks. It is being actively promoted as an IoT protocol because it has a small footprint, is reasonably simple to use, and features "push" architecture.

MQTT works by allowing data sources like hardware devices to connect to a server called a "broker" and publish their data to it under a named topic. Any device or program that wants to receive the data can subscribe to that topic. Programs can act as both publishers and subscribers simultaneously. The broker does not examine the data payload itself, but simply passes it as a message from each publisher to all subscribers.
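In code, the pattern looks roughly like this. The sketch uses the open-source paho-mqtt client library (1.x-style constructor); the broker address and topic name are placeholders.

```python
import paho.mqtt.client as mqtt

BROKER = "broker.example.com"          # placeholder broker address
TOPIC = "plant1/boiler/temperature"    # placeholder topic name

# Publisher: a device connects out to the broker and pushes a reading.
pub = mqtt.Client()                    # paho-mqtt 1.x style constructor
pub.connect(BROKER, 1883)
pub.publish(TOPIC, "72.5")             # the payload is opaque bytes to the broker

# Subscriber: any program interested in that topic registers for it.
def on_message(client, userdata, msg):
    print(msg.topic, msg.payload.decode())

sub = mqtt.Client()
sub.on_message = on_message
sub.connect(BROKER, 1883)
sub.subscribe(TOPIC)
sub.loop_forever()                     # the broker forwards each published message to all subscribers
```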

The publish/subscribe approach has advantages for general IoT applications. "Push" architecture is inherently more secure because devices make only outbound connections to the broker, so no inbound firewall ports need to be opened at the device. And, by using a central broker, it is possible to establish many-to-many connections, allowing multiple devices to connect to multiple clients. MQTT seems to solve the communication and security problems I have identified in previous posts.

Despite these architectural advantages, though, MQTT has three important drawbacks that raise questions about its suitability for many IIoT systems and scenarios.

MQTT is a messaging protocol, not a data protocol

MQTT is a messaging protocol, not a data communications protocol. It acts as a data transport layer, similar to TCP, but it does not specify a particular format for the data payload. The data format is left up to each client that connects, which means there is no built-in interoperability between applications. For example, if data publisher A and subscriber B have not agreed on their data format in advance, it's not likely that they'll be able to communicate. They might send and receive messages via MQTT, but they'll have no idea what those messages mean.

Imagine an industrial device that talks MQTT, say a chip on a thermometer. Now, suppose you have an HMI client that supports MQTT, and you want to display the data from the thermometer. You should be able to connect them, right? In reality, you probably can't. This is not OPC or some other industrial protocol that has invested heavily in interoperability. MQTT is explicitly not interoperable. It specifies that each client is free to use whatever data payload format it wants.

How can you make it work? You must either translate data formats for each newly connected device and client, or source all devices, programs, HMIs, etc. from a single vendor, which quickly leads to vendor lock-in.
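A tiny example of the mismatch: suppose two vendors' thermometers report the same 72.5 degree reading, one as JSON and one as a packed binary value (both formats are invented here). An HMI written against the first format simply cannot interpret the second, even though both arrive as perfectly valid MQTT messages.

```python
import json

# The same 72.5 degree reading, as two hypothetical vendors might publish it.
vendor_a_payload = json.dumps({"temp": 72.5, "units": "F"})
vendor_b_payload = b"\x00\x02\xd5"   # e.g. a scaled integer, 725, in a packed binary format

def hmi_parse(payload):
    """An HMI built for vendor A's format."""
    return json.loads(payload)["temp"]

print(hmi_parse(vendor_a_payload))   # 72.5
try:
    print(hmi_parse(vendor_b_payload))
except Exception as err:
    print("cannot interpret vendor B's payload:", err)
```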

The broker cannot perform intelligent routing

MQTT brokers are designed to be agnostic to message content. This design choice can cause problems for industrial applications communicating over the IoT. Here are a few reasons why:

1) The broker cannot be intelligent about routing based on message content. It simply passes along any message it gets. Even if a value has not changed, the message gets sent. There is no damping mechanism, so values can "ring" back and forth between clients, leading to wasted system resources and high bandwidth use.

2) The broker cannot distinguish messages carrying new values from those repeating previously transmitted values. To maintain consistency, the only option is to send every message to every client, consuming extra bandwidth in the process.

3) There is no supported discovery function, because the broker is unaware of the data it is holding. A client cannot simply browse the data set on the broker when it connects. Rather, it needs to obtain a list of the topics from the broker or the data publisher before making the connection (see the sketch after this list). This leads to duplication of configuration in every client. In small systems this may not be a problem, but it scales very poorly.

4) Clients cannot be told when data items become invalid. In a production system a client needs to know when the source of its data has been disconnected, whether due to a network failure or an equipment failure. An MQTT broker does not have enough knowledge to do that. When a publisher disconnects, the broker would need to synthesize messages, as if they had originated from that publisher, indicating that the data in certain topics is no longer trustworthy. The broker does not know which topics those are, and since it does not know the message format, it would not know what to put in such messages. For this reason alone MQTT is a questionable choice in a production environment.

5) There is no opportunity to run scripts or otherwise manipulate the data in real time to perform consolidation, interpretation, unit conversion, etc. Quite simply, if you don’t know the data format you cannot process it intelligently.
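To make point 3 concrete, here is a minimal sketch (again using paho-mqtt, with made-up broker and topic names). The subscriber has no way to ask the broker what data it holds, so every topic it cares about must be configured into it ahead of time, and that configuration is repeated in every other client.

```python
import paho.mqtt.client as mqtt

# There is no browse or discovery call to ask the broker what topics exist.
# Every client must carry its own copy of a list like this, configured in advance.
KNOWN_TOPICS = [
    "plant1/boiler/temperature",
    "plant1/boiler/pressure",
    "plant1/pump3/status",
]

def on_message(client, userdata, msg):
    print(msg.topic, msg.payload)

client = mqtt.Client()                 # paho-mqtt 1.x style constructor
client.on_message = on_message
client.connect("broker.example.com", 1883)
for topic in KNOWN_TOPICS:
    client.subscribe(topic)
# A wildcard subscription ("plant1/#") would deliver everything published under
# plant1, but still would not tell the client in advance which topics will
# appear or what their payloads mean.
client.loop_forever()
```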

No acceptable quality of service

MQTT defines three levels of quality of service (QoS), none of which is right for the IIoT. This is an important topic, and one that I have covered in depth in a previous post (see Which Quality of Service is Right for IIoT?). MQTT might work for small-scale prototyping, but its QoS failure modes make it impractical at industrial scale.

In summary, although the MQTT messaging protocol is attracting interest for IoT applications, it is not the best solution for Industrial IoT.

Continue reading, or go back to Table of Contents

Growing IIoT Security Risks

As the Industrial Internet of Things (IIoT) grows, the security risks grow as well, according to a recent article by Jeff Dorsch in Semiconductor Engineering. His sources report that use of the IIoT is expanding, both in the number of new implementations and in how the data is being used. In addition to the traditional SCADA-like applications of machine-to-machine (M2M) connectivity, monitoring, and remote access, it seems that more and more the IIoT is being used to power a data-driven approach to increasing production efficiency. Using big data tools and technologies, companies can apply better and more sophisticated analytics to industrial process data, thereby enhancing operational performance based on real-time data.

With the increase in use of the IIoT comes a corresponding increase in the potential for risk.  Looking at the big picture, Robert Lee, CEO of Dragos and a national cybersecurity fellow at New America, commented, "There are two larger problems that have to be dealt with. First, there are not enough security experts. There are about 500 people in the United States with security expertise in industrial control systems. There are only about 1,000 worldwide. And second, most people don't understand the threats that are out there because they never existed in the industrial space."

Both of these problems are real, and need to be addressed.  And as is often the case in matters of security, the human factor is closely intertwined with both. On the one hand, there is a crying need for security experts worldwide; on the other hand, the man on the street, or in our case the person on the factory floor, in the control room, or in the corporate office, needs to quickly get up to speed on the unique security risks and challenges of providing data from live production systems over the Internet.

Addressing the Problems

As we see it, correctly addressing the second problem can help mitigate the first one.  When we understand deeply the nature of the Internet, as well as how the industrial space may be particularly vulnerable to security threats from it, then we are in a position to build security directly into control system design.  A secure-by-design approach provides a platform on which a secure IIoT system can run.

Like any well-designed tool, from electric cars to smart phones, the system should be easy to use.  When the platform on which a system runs is secure by design, it should not require someone with security expertise to run it.  The expertise is designed in.  Of course, the human factor is always there.  Users will need to keep their guard up—properly handling passwords, restricting physical access, and adhering to company policies.  But they should also have the confidence of knowing that security has been designed into the system they are working on.

Thus, the most effective use of our world’s limited security manpower and resources is to focus them on understanding the unique security challenges of the IIoT, and then on designing industrial systems that address these challenges. This has been our approach at Skkynet, and we find it satisfying to be able to provide a secure IIoT platform that anyone can use.  We are confident that through this approach, as the IIoT continues to grow, the security risks will actually diminish for our users.

Secure by Design for IIoT

Securing the Industrial IoT is a big design challenge, but one that must be met. Although the original builders of industrial systems did not anticipate a need for Internet connectivity, companies now see the value of connecting to their plants, pipelines, and remote devices, often over the Internet. The looming question: How to maintain a high level of security for a mission-critical system while allowing remote access to the data?

As you can imagine, the answer is not simple.  What's called for is a totally new approach, one that is secure by design.  This blog entry, published on the ARC Advisory Group's Industrial IoT/Industrie 4.0 Viewpoints blog, gives an overview of why standard industrial system architecture is not adequate to ensure the security of plant data on the Internet, and introduces the two main considerations that must go into creating a more secure design.

Don’t WannaCry on your Industrial IoT System

Pretty much anyone who has a computer or listens to the news has heard about the WannaCry virus that swept across the world a few days ago, installing itself on computers in businesses, hospitals, government agencies, and homes, encrypting hard drives and demanding ransom payments.  After scrambling to ensure that our operating systems are up-to-date and protected against this latest threat, the question soon comes up: How can we protect ourselves against similar threats in the future?

“How?” indeed.  That would seem difficult.  Our reliance on networked computers for business and personal use is fully entrenched, and business/personal PCs will remain vulnerable for the foreseeable future.  In the industrial arena, some may conclude this latest attack is yet another reason to hold off on their IoT strategy.  Or, at least: “You should use a VPN to keep it safe.”

And yet neither of these instincts is necessarily correct, because (i) it is possible to build a secure Industrial IoT ("IIoT") system, and (ii) a VPN is not the way to do it.  Industrial control systems may use the same underlying operating systems as PCs, but they are different in one critical respect: they exchange real-time control data, not files and emails.

How WannaCry Got In

WannaCry comes in two parts – an email “bomb” that exploits your anti-virus software and a “worm” that propagates throughout your network by exploiting configuration weaknesses and operating system bugs.  The special danger of WannaCry is that it can infect a computer through email even if you never open the email message.  Once WannaCry arrives through email, the worm takes over to attack the rest of the computers on your network.

The worm portion of the virus spreads itself by finding other machines on the network.  According to analysis of the code by Zammis Clark at Malwarebytes Labs, "After initializing the functionality used by the worm, two threads are created. The first thread scans hosts on the LAN. … The scanning thread tries to connect to port 445, and if so creates a new thread to try to exploit the system using MS17-010/EternalBlue" (the vulnerability that the virus exploits).

If there is no open port on the other computer, the virus cannot spread.  But the VPN is not much help here.  If anyone on the VPN is struck by the virus, then every machine on the LAN is exposed.  Suppose you have an IIoT system connecting a corporate office to a process control system over a VPN.  If the virus activates on any of the connected machines in the IT department, it can easily propagate itself to any of the connected machines on the industrial LAN.

How to Keep WannaCry Out

The tongue-in-cheek answer is “don’t use email”.  More seriously, industrial systems and IT systems should be separated from one another.  There is no need to read email from the industrial LAN.  Don’t install email software on your industrial computers, and don’t allow email traffic through your firewall.

But industrial systems still need to communicate their data.  How can you reach the data without exposing the industrial network?  The solution is spelled out in detail in the latest white paper from Cogent (a Skkynet company) titled: Access Your Data, Not Your Network. This paper explains why the traditional architecture of industrial systems is not suitable for secure Industrial IoT or Industrie 4.0 applications, and discusses the inherent risks of using a VPN.  But most important, it introduces the best approach for secure IIoT and Industrie 4.0, which is to provide access to industrial data without exposing the network at all.

Specifically, the Skkynet-provisioned devices and the DataHub can make outbound connections to SkkyHub without opening any firewall ports.  These connections are robust channels that support bidirectional, real-time communications for monitoring and supervisory control.  The WannaCry virus, or anything similar, cannot spread into this system because it can't see anything to infect.  The devices on the network are completely invisible.  Skkynet's approach provides access to the data only, not to the network.
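As a rough illustration of the outbound-only pattern described above (this is not Skkynet's actual protocol or API; the endpoint and message format are invented), the device inside the firewall dials out, and both data and supervisory messages then travel over that single device-initiated connection. No inbound port is ever opened.

```python
import json
import socket

HUB = ("hub.example.com", 443)   # placeholder cloud endpoint; a real system would use TLS

def run_device():
    # The device initiates the connection, so the plant firewall can block all inbound traffic.
    with socket.create_connection(HUB) as conn:
        # Data flows out...
        conn.sendall(json.dumps({"tag": "pump1.pressure", "value": 87.2}).encode() + b"\n")
        # ...and supervisory messages can ride back on the same established connection.
        reply = conn.recv(4096)
        print("received:", reply.decode(errors="replace"))

run_device()
```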