Posts

IIoT Protocol Comparison

What Makes an Ideal IIoT Protocol?

A good IIoT protocol is the basis for effective IIoT data communication. Without a secure, robust IIoT protocol, data can be late, missing, inconsistent, or dangerously incorrect, leading to costly errors and wasted time.

With the IIoT still in its infancy, companies have turned first to familiar, well-tested data communication and messaging protocols such as MQTT, AMQP, REST, and OPC UA. Valid as these may be for their designed purposes, they were never intended to support IIoT data communication. Thus, when evaluated against the criteria for a robust, secure Industrial IoT implementation, they all come up somewhat short.

Skkynet’s software and services are designed for the IIoT, and meet all of the criteria for effective data communication. Here we provide a comparison of how well MQTT, AMQP, REST, OPC UA, and Skkynet’s own DHTP (DataHub Transfer Protocol) meet the criteria summarized in the table above for an ideal IIoT protocol. Each criterion is explained in further detail in the sections that follow.

DHTP Protocol Comparison - Closed Firewalls

Keeps all inbound firewall ports closed for both data sources and data users.


Keeping all inbound firewall ports closed at the plant resolves many security issues for Industrial IoT. MQTT, AMQP, REST and DHTP meet this criterion. OPC UA does not because it has a client/server architecture, which requires at least one firewall port be open on the server side (typically the plant) to allow for incoming client connections. This is an unacceptable risk for most industrial systems. Skkynet’s DataHub and ETK connect locally to servers and clients in the plant, and make outbound connections via DHTP to SkkyHub running on a cloud server, or to another DataHub running on a DMZ computer. This outbound connection keeps all inbound firewall ports closed and hides the plant from the outside world.
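The outbound-only pattern can be sketched in a few lines of plain sockets. This is a toy localhost stand-in for the plant-to-cloud link, not actual DHTP: the endpoint names, framing, and payload are invented for illustration. The point is structural: only the "cloud" side listens; the "plant" side never calls bind() or listen(), so its inbound firewall can stay fully closed.

```python
import socket
import threading

# Toy illustration of the outbound-only pattern: the "cloud" side listens,
# while the "plant" side only ever dials out. The plant opens no listening
# port, so its inbound firewall can remain fully closed.
# (Hypothetical names and framing; not the actual DHTP protocol.)

received = []

def cloud_server(sock):
    conn, _ = sock.accept()          # cloud accepts the plant's outbound call
    with conn:
        received.append(conn.recv(1024).decode())

# Cloud side: the only listening socket in the whole setup.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))      # ephemeral port stands in for the cloud
listener.listen(1)
port = listener.getsockname()[1]
t = threading.Thread(target=cloud_server, args=(listener,))
t.start()

# Plant side: outbound connection only -- no bind(), no listen().
with socket.create_connection(("127.0.0.1", port)) as plant:
    plant.sendall(b"temperature=21.5")

t.join()
listener.close()
print(received[0])   # temperature=21.5
```

Because the plant initiates the connection, data can still flow in both directions over that single outbound socket once it is established, which is what lets the plant remain invisible to inbound probes.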

DHTP Protocol Comparison - Low Bandwidth

Consumes minimal bandwidth, while functioning with the lowest possible latency.


One goal of any industrial communication or IIoT protocol is to consume as little bandwidth as possible while functioning with the lowest possible latency. MQTT and AMQP do this well. REST does not, because every transaction incurs the full socket set-up time and communication overhead. OPC UA only partially meets this criterion, because it uses a smart polling mechanism that trades bandwidth for latency. Skkynet software and services maintain a persistent connection and transmit only the data via DHTP, consuming very little bandwidth at very low latencies.
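The bandwidth gap between polling and event-driven transfer can be illustrated with a toy message count for a single, slowly-changing data point (the sample values are invented). A poller asks every cycle regardless of change; an event-driven link transmits only when the value changes, and with REST each of those polls would additionally carry full connection set-up overhead:

```python
# Toy comparison of polling vs. event-driven transfer for one data point.
# A poller sends one request/response per cycle regardless of change;
# an event-driven link pushes a message only when the value changes.

samples = [20.0, 20.0, 20.0, 21.0, 21.0, 21.0, 21.0, 19.5, 19.5, 19.5]

polling_msgs = len(samples)          # one request/response per cycle

event_msgs = 0
last = None
for v in samples:
    if v != last:                    # push only on change
        event_msgs += 1
        last = v

print(polling_msgs, event_msgs)      # 10 3
```

With realistic industrial data, where most points change rarely relative to the scan rate, the ratio is often far more lopsided than this 10-to-3 example.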

DHTP Protocol Comparison - Ability to Scale

Can support hundreds or thousands of interconnected data sources and users.


An important aspect of the Internet of Things is the vision of connecting hundreds, thousands, and even millions of things via the Internet, and providing access to the data from any single thing, or groups of things to any number of clients. Event-driven protocols like MQTT and AMQP allow for this kind of scaling up, while REST’s polling model prevents it. OPC UA is also event-driven, and so theoretically can scale up, but its underlying polling model does not allow for very large numbers of simultaneous connections. DHTP abstracts the data from the protocol across the connection, and also implements an event-driven model, which allows it to scale up well.

DHTP Protocol Comparison - Real-Time

Adds virtually no latency to the data transmission.


Any kind of remote HMI or supervisory control system is much more effective when functioning in at least near-real time. Propagation delays of one or more seconds may be tolerable under certain conditions or for certain use cases, but they are not ideal. AMQP and MQTT offer real-time behavior only if they are not operating with a delivery guarantee. That is, if you choose the “guaranteed delivery” quality of service then a slow connection will fall further and further behind real-time. By contrast, DHTP guarantees consistency, not individual packet delivery, and can sustain that guarantee in real time on a slow connection. REST simply has too much connection overhead to allow real-time performance in most circumstances. OPC UA, being an industrial protocol, meets this criterion well.
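The trade-off described above can be modeled in a few lines. In this toy model (the numbers are invented), a producer generates 100 updates in an interval during which a slow link can move only 10 messages. Under guaranteed delivery every message is queued and the backlog grows without bound, so the receiver falls ever further behind real time; under a consistency guarantee only the newest value per point is pending, so the receiver stays current at the cost of dropped intermediate values:

```python
from collections import deque

# Toy model of a slow link: 100 updates produced, capacity for only 10.
# "Guaranteed delivery" queues everything and falls behind; "guaranteed
# consistency" keeps only the latest value, so the receiver stays current.

produced = list(range(100))   # updates 0..99 for one data point
link_capacity = 10            # messages the link can carry this interval

# Guaranteed delivery: every message queued, backlog grows.
queue = deque(produced)
delivered = [queue.popleft() for _ in range(link_capacity)]
backlog = len(queue)

# Guaranteed consistency: only the newest value is pending at any moment.
latest = produced[-1]

print(backlog, delivered[-1], latest)   # 90 9 99
```

After one interval the guaranteed-delivery receiver has only reached update 9 with 90 messages still queued, while the consistency-based receiver already holds update 99.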

DHTP Protocol Comparison - Interoperable Data Format

Encodes the data so that clients and servers do not need to know each other’s protocols.


A well-defined data format is essential for interoperability, allowing any data source to communicate seamlessly with any data user. Interoperability was the primary driving force behind the original OPC protocols, and is fully supported by the OPC UA data format. Any Industrial IoT software or service should support at least one, if not multiple interoperable data formats. Skkynet’s DataHub software and ETK support several, and allow for real-time interchange between them and DHTP. MQTT, AMQP and REST do not support interoperability between servers and clients because they do not define the data format, only the message envelope format. Thus, one vendor’s MQTT server will most likely not be able to communicate with another vendor’s MQTT client, and the same is true for AMQP and REST.
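The point about undefined payload formats is easy to demonstrate. Both payloads below are perfectly valid MQTT message bodies for the same reading, but in two invented vendor encodings (JSON and CSV; both formats are hypothetical, chosen only for illustration). A client written for one cannot parse the other:

```python
import json

# MQTT defines the message envelope but not the payload format, so two
# vendors can encode the same reading incompatibly. Both payloads below
# are "valid MQTT"; a client built for vendor A's JSON cannot parse
# vendor B's CSV. (Both encodings are invented for illustration.)

payload_a = b'{"tag": "temp1", "value": 21.5}'   # vendor A: JSON
payload_b = b"temp1,21.5"                        # vendor B: CSV

def vendor_a_client(payload):
    msg = json.loads(payload)
    return msg["tag"], msg["value"]

assert vendor_a_client(payload_a) == ("temp1", 21.5)

try:
    vendor_a_client(payload_b)       # same data, different encoding
    interoperable = True
except json.JSONDecodeError:
    interoperable = False

print(interoperable)   # False
```

A protocol with a defined data format, by contrast, lets any conforming client parse any conforming server's data without vendor-specific knowledge.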

DHTP Protocol Comparison - Intelligent Overload Handling

A messaging broker responds appropriately when a data user is unable to keep up with the incoming data rate.


Overload handling refers to how the broker responds when a client is unable to keep up with the incoming data rate, or when the server cannot keep up with data arriving from a client. MQTT and AMQP respond in one of two ways: either they block, effectively becoming inoperative for all clients, or they drop new data in favor of old, which leads to inconsistency between client and server. REST saturates its web server and becomes unresponsive. OPC UA attempts to drop old data in favor of new data, but consumes massive amounts of CPU resources to do so. When needed, Skkynet’s DataHub and SkkyHub can drop old data efficiently, and using DHTP they guarantee consistency between client and server even over multiple hops. Data coming from or going to overloaded clients remains consistent, and all other clients are unaffected.
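The "drop old data in favor of new" strategy can be sketched as a per-point latest-value buffer. This is a deliberately simplified model (real brokers also track timestamps and quality, and this class is invented for illustration): a slow client that drains the buffer late still sees a state consistent with the source, and other clients are unaffected because each has its own buffer.

```python
# Simplified sketch of per-point latest-value buffering: each update for
# a point silently replaces any stale, undelivered value for that point.
# A slow client draining late still gets a state consistent with the
# source. (Illustrative only; real brokers track more than values.)

class LatestValueBuffer:
    def __init__(self):
        self.pending = {}            # point name -> newest undelivered value

    def update(self, point, value):
        self.pending[point] = value  # replaces any stale pending value

    def drain(self):
        out, self.pending = self.pending, {}
        return out

buf = LatestValueBuffer()
for v in [1, 2, 3, 4, 5]:
    buf.update("flow", v)            # client too slow to keep up
buf.update("level", 7.2)

state = buf.drain()
print(state)   # {'flow': 5, 'level': 7.2}
```

Note that memory use is bounded by the number of points, not the update rate, which is why this approach does not saturate the way an ever-growing delivery queue does.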

DHTP Protocol Comparison - Propagation of Failure Notification

Each client application knows with certainty if and when a connection anywhere along the data path has been lost, and when it recovers.


Most protocols do not provide failure notification information from within the protocol itself, but rather rely on clients to identify that a socket connection is lost. This mechanism does not propagate when there is more than one hop in the communication chain. Some protocols (such as MQTT) use a “last will and testament” that is application-specific and thus not portable, and which is only good for one connection in the chain. Clients getting data from multiple sources would need to be specifically configured to know which “last will” message is associated with which data source. In MQTT, AMQP, REST and OPC UA alike, the protocol assumes that the client will know how many hops the data is traversing, and that the client will attempt to monitor the health of all hops. That is exceptionally fragile, since knowledge about the data routing must be encoded in the client. In general, this cannot be made reliable. DHTP propagates not only the data itself, but information about the quality of the connection. Each node is fully aware of the quality of the data, and passes that information along to the next node or client.
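Propagating connection quality along with the data, as described above, can be sketched with a chain of nodes that each store and forward a (value, quality) pair. When an upstream link drops, the downstream node re-marks its mirrored points so every further hop, and ultimately the client, knows the data is stale. The structure and the quality names here are illustrative, not the actual DHTP vocabulary:

```python
# Sketch of propagating data quality through a multi-hop chain. Each node
# mirrors (value, quality) and forwards both; a lost upstream link
# degrades quality at that node, and the degraded quality propagates
# onward. (Quality names are illustrative, not actual DHTP terms.)

class Node:
    def __init__(self):
        self.points = {}                       # name -> (value, quality)
        self.downstream = None

    def receive(self, name, value, quality="Good"):
        self.points[name] = (value, quality)
        if self.downstream:                    # propagate data AND quality
            self.downstream.receive(name, value, quality)

    def link_lost(self):
        # Upstream link failed: degrade quality for all mirrored points.
        for name, (value, _) in list(self.points.items()):
            self.receive(name, value, "Not Connected")

plant, dmz, cloud = Node(), Node(), Node()
plant.downstream, dmz.downstream = dmz, cloud

plant.receive("pressure", 4.2)
dmz.link_lost()                                # plant-to-DMZ link drops

print(cloud.points["pressure"])   # (4.2, 'Not Connected')
```

The client at the end of the chain never needs to know how many hops exist or monitor any of them; it simply reads the quality attached to each point.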

DHTP Protocol Comparison - Quality of Service

Guarantees consistency of data, preserved through multiple hops.


An important goal of the IIoT is to provide a consistent picture of the industrial data set, whether for archival, monitoring, or supervisory control. MQTT’s ability to guarantee consistency of data is fragile because its Quality of Service options only apply to a single hop in the data chain. And within that single hop, delivery can be guaranteed only at the expense of losing real-time performance. Real-time performance can be preserved, but only by dropping messages and allowing data to become inconsistent between client and server. AMQP’s ability to guarantee consistency of data is fragile because like MQTT it only applies to a single hop in the chain. Additionally, its delivery guarantee blocks when the client cannot keep up with the server and becomes saturated. REST provides no Quality of Service option, and while OPC UA guarantees consistency it cannot work over multiple hops. DHTP guarantees consistency, and the guarantee is preserved through any number of hops.

DHTP Protocol Comparison - Can Daisy Chain?

Brokers can connect to other brokers to support a wide range of collection and distribution architectures.


The requirements of the IIoT take it beyond the basic client-to-server architecture of traditional industrial applications. To get data out of a plant and into another plant, corporate office, web page or client location, often through a DMZ or cloud server, typically requires two or more servers chained together. The OPC UA protocol is simply too complex to reproduce in a daisy chain; information will be lost in the first hop. Attempts to daisy chain some aspects of the OPC UA protocol would result in synchronous multi-hop interactions that would be fragile on all but the most reliable networks, and would result in high latencies. Nor would OPC UA chains provide access to the data at each node in the chain. REST servers could in theory be daisy chained, but would be synchronous, and would not provide access to the data at each node in the chain. MQTT and AMQP can be chained, but doing so requires each node in the chain to be aware that it is part of the chain, and to be individually configured. The QoS guarantees in MQTT and AMQP cannot propagate through the chain, so daisy chaining makes data at the ends unreliable. Skkynet’s DataHub and SkkyHub both support daisy-chained servers because DHTP allows them to mirror the full data set at each node, and provide access to that data both to qualified clients and to the next node in the chain. The DHTP QoS guarantee states that any client or intermediate point in the chain will be consistent with the original source, even if some events must be dropped to accommodate limited bandwidth.
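The mirroring idea behind the daisy chain can be sketched as a list of nodes that each hold the full data set, serve it to local clients, and pass updates to the next hop. This is a hypothetical structure for illustration only; the real DataHub/SkkyHub chain also carries timestamps, quality, and QoS bookkeeping:

```python
# Toy daisy chain: every hop mirrors the full data set, so each node can
# serve local clients while forwarding updates onward. (Illustrative
# structure; a real chain also propagates timestamps and quality.)

def make_chain(length):
    return [dict() for _ in range(length)]    # one mirror per node

def publish(nodes, name, value):
    for node in nodes:        # value ripples down the chain, hop by hop
        node[name] = value

chain = make_chain(4)         # plant -> DMZ -> cloud -> corporate office
publish(chain, "rpm", 1750)
publish(chain, "rpm", 1810)   # later update overwrites at every mirror

# Every node in the chain holds the same, current picture of the data.
assert all(node["rpm"] == 1810 for node in chain)
print(chain[2]["rpm"])   # 1810
```

Because each mirror always converges to the latest value, a client attached anywhere in the chain, not just at the ends, sees a picture consistent with the original source.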

In Conclusion

Though far from exhaustive, this overview of effective IIoT data communication introduces the subject and highlights some key concepts, sharing what we have found to be essential criteria for evaluating some of the protocols currently on offer. Because none of MQTT, AMQP, REST, or OPC UA was designed specifically for use in the Industrial IoT, it is not surprising that they do not fulfill these criteria. DHTP, on the other hand, was created specifically to meet the needs of effective industrial and IIoT data communication, making it an ideal choice for an IIoT protocol.

Top 10 IoT Technology Challenges for 2017 and 2018

Gartner, Inc., the IT research firm based in Stamford, Connecticut, recently published a forecast for the top ten IoT technology challenges for the coming two years.  The list covers a lot of ground, from hardware issues like optimizing device-level processors and network performance to such software considerations as developing analytics and IoT operating systems to abstract concepts like maintaining standards, ecosystems, and security.

“The IoT demands an extensive range of new technologies and skills that many organizations have yet to master,” said Nick Jones, Gartner vice president analyst. “A recurring theme in the IoT space is the immaturity of technologies and services and of the vendors providing them.”

Heading the list of needed expertise is security.  “Experienced IoT security specialists are scarce, and security solutions are currently fragmented and involve multiple vendors,” said Mr. Jones. “New threats will emerge through 2021 as hackers find new ways to attack IoT devices and protocols, so long-lived ‘things’ may need updatable hardware and software to adapt during their life span.”

To anyone considering the IoT, and particularly the Industrial IoT (IIoT) or Industrie 4.0, this should be a wake-up call.  As the recent power-grid hack in Ukraine shows us, old-school approaches like VPNs will not be sufficient when an industrial system is exposed to the Internet. In the IoT environment, Skkynet’s secure-by-design approach not only addresses the security issues that many are aware of today, but also takes a forward-looking stance to meet future challenges.

Having taken security into consideration, there are other items on the list that we see as significant challenges, and for which we provide solutions.  Among these are:

  • IoT Device Management – Each device needs some way to manage software updates, do crash analysis and reporting, implement security, and more. This in turn requires some kind of bidirectional data flow, such as that provided by SkkyHub, along with a management system capable of working with huge numbers of devices.
  • Low-Power Network Support – Range, power, and bandwidth are among the constraints of IoT networks.  The data-centric architecture of SkkyHub and the Skkynet ETK ensures the most efficient use of available resources.
  • IoT Processors and Operating Systems – The tiny devices that will make up most of the IoT demand specialized hardware and software that combine the necessary capabilities of low power consumption, strong security, tiny footprint, and real-time response.  The Skkynet ETK was designed specifically for this kind of system, and can be modified to meet the requirements of virtually any operating system.
  • Event-Stream Processing – As data flows through the system, some IoT applications may need to process and/or analyze it in real time.  This ability, combined with edge processing in which some data aggregation or analysis might take place on the device itself, can enhance the value of an IoT system with little added cost.  Skkynet’s unique architecture provides this kind of capability as well.

According to Gartner, and in our experience, these are some of the technical hurdles facing the designers and implementers of the IoT for the coming years.  As IoT technology continues to advance and mature, we can expect other challenges to appear, and we look forward to meeting those as well.

Recent IoT Attack on Dyn Calls for Secure By Design

The recent denial of service attack on Dyn, a DNS service company for a huge chunk of the Internet, sure woke up a lot of people.  Somehow when it happens to you, you tend to feel it more.  Twitter, Netflix, Reddit, eBay, and Paypal users certainly felt it when they couldn’t access those sites.  Now that most of us are awake, what can we do about it?

In the short term, not a lot, apparently.  In a recent article about the attack, Vulnerability Is the Internet’s Original Sin, Fred Kaplan, Internet security expert and author of Dark Territory: The Secret History of Cyber War, points out that from the beginning, designing security into the Internet from the ground up was considered too difficult and costly.

Kaplan tells how, back in 1967, Willis Ware, the head of the Rand Corporation’s computer science department and an NSA scientific advisory board member, wrote a paper warning the ARPANET team and others that “once you put information on a network—once you make it accessible online from multiple, unsecure locations—you create inherent vulnerabilities … You won’t be able to keep secrets anymore.”

The Dyn attack was simple in concept and easy to execute.  The devices used were accessible household appliances and electronics, configured out of the box with simple default user names and passwords like “username”, “password”, and “12345”.  The virus cycled through these default credentials to recruit thousands of devices into a giant collective, which was then coordinated to flood Dyn with traffic.

To prevent this kind of hack, device manufacturers may start updating their devices to ensure more secure usernames and passwords.  But that ignores the elephant in the room.  The fundamental problem is that these IoT devices are available (they are always on, ready to communicate over the internet), they are accessible (they can be seen on the internet), and they are numerous (with numbers growing exponentially).  This combination of availability and accessibility, multiplied by the huge numbers, makes IoT devices perfect for coordinated attacks.  We can be sure that the bad actors are already working hard on defeating username/password protection on IoT devices.

Considering the first of these three critical factors, IoT functionality requires that IoT devices are available for communication.  There is not a lot we can do about availability.  Secondly, the business opportunities and economic promise make device proliferation unstoppable.  We have to expect continued rapid growth.  But we can do something about the third critical factor: accessibility.

No IoT device should be sitting on the Internet with one or more open ports, waiting for something to connect to it.  The device can and should be invisible to incoming probes and requests to connect.  A hacker or bot should not even see the device, let alone be given the chance to try a username or password.  That technology exists, is easy and inexpensive to implement, and has been proven in thousands of industrial installations for over a decade.  Governments and manufacturers need to be employing it across the full range of IoT applications.

IBM Realizes the Value of the Industrial IoT

A recent report in Fortune magazine claims that one of the key areas for growth at IBM this year has been its Industrial IoT (“IIoT”) business.  In the past 9 months alone, the number of their IIoT customers shot up 50%, to 6,000.  The area of IIoT is one of IBM’s “strategic imperatives”, which contributed an overall increase in growth of 7% for the company.  In contrast, the more traditional hardware and services areas experienced a 14% decline year-on-year.

The report quotes a survey released last month from IDC (International Data Corporation) that found the trend towards IIoT implementation is increasing industry-wide. Over 30% of the companies participating in the survey have already launched IoT initiatives, and another 43% expect to do so in the coming year.  “This year we see confirmation that vendors who lead with an integrated cloud and analytics solution are the ones who will be considered as critical partners in an organization’s IoT investment,” said Carrie MacGillivray, Vice President, Mobility and Internet of Things at IDC.

Results of the IDC survey of 4,500 managers and executives from a wide range of industries in over 25 countries suggest that many companies have completed proof-of-concept projects, and are now moving towards pilot implementations and scalable IoT deployments.  This trend is acknowledged by Bret Greenstein, IBM’s vice president for IoT platforms, who commented in the Fortune interview, “There was so much tire-kicking a year ago. Now you are seeing adopters in every single industry actually building solutions.”

What is driving this demand for IoT among IBM’s customers?  The Fortune article didn’t say, but the IDC survey found that much of the value of the IoT is seen to be internal to the company itself, to become or stay more competitive.  Respondents cited boosting productivity, streamlining procedures, and cutting costs as reasons for implementing the IoT, rather than any direct services or other benefits for customers.

Although the IDC survey was for the IoT in a broad range of industries, including manufacturing, retail, utilities, government, health, and finance, its results correlate with the experience of IBM in the Industrial IoT.  The company plans to bring on 25,000 new people for IIoT-related projects and services worldwide, with 1,000 of them in their Munich global IoT headquarters alone. As we see it, both the survey results and the experience of IBM point to a common reality: the Industrial IoT is quickly moving into the mainstream.

Industrial IoT, Big Data & M2M Summit―Takeaways

Last week several of us here at Skkynet had the pleasure to attend and present a case study at the Industrial IoT, Big Data & M2M Summit in Toronto.  IoT specialists representing a wide range of industries, from mining, manufacturing, and energy to telecom and software gathered to share insights and learn from collective experience how to get the most out of Industrial IoT.

Challenges to IoT adoption were a key topic of discussion.  There was considerable agreement among summit participants that one of the primary challenges is not technical, but cultural.  Switching from software ownership to data as a service requires a new mind-set, which not everyone is willing to adopt.  Speaker after speaker underlined the need to communicate value and get buy-in from all concerned parties: start with a small pilot project, with minimal investment, and demonstrate ROI.  Other challenges discussed included incompatible protocols and security risks.

Summit Theme: Partnerships

A common theme that prevailed in presentations and comments throughout the summit was that the IoT casts such a wide net that nobody can do all of it well.  We need to work together.

“IoT is all around partnerships,” said Christopher Beridge, Director of Business Development – IoT and Business Solutions at Bell Mobility.

“A lot of people have a part to play when you are talking IoT,” according to Matthew Wells, Senior Product General Manager at GE Digital.

“Smartness depends on how interconnected you are,” commented Steven Liang, Associate Professor at the University of Calgary, and conference chair.

Above all, there was agreement that the IoT is here to stay. “Our focus is to make things more efficient, reliable, affordable, and convenient, and the IoT is a way to do it,” said Michael Della Fortune, Chief Executive Officer of Nexeya Canada.  “It powers and upholds the 4 Vs—Variety, Volume, Velocity, and Veracity—of Big Data.”

Perhaps Timon LeDain, Director, Internet of Things at Macadamian summed it up best when he said, “IoT will be done by you, or done to you.”

Survey: Valuable Lessons from IoT Early Adopters

A recent survey by Machina Research (Lessons Learned from Early Adopters of the IoT: A Global Study of Connected Businesses) suggests that the IoT is moving quickly from novelty to necessity. Nearly two thousand management-level employees took part, from companies earning $15 million and up per year in the USA, UK, Japan, Australia, and Brazil, representing all major sectors of industry.  About 20% of the respondents have started some kind of IoT initiative, and close to 30% expect to do so in the next 6 months to 2 years.

Focusing on the innovators and early adopters of the IoT, the survey gleaned some useful information which may be helpful for those who have not yet implemented a strategy—and in many cases, those who have.  It seems that the majority of early adopters of the IoT took a do-it-yourself approach, and most of them found the IoT more complicated to implement than they expected. Future adopters say they will not repeat that mistake.

“When asked about primary concerns around IoT, adopters have some insight that non-adopters just don’t yet have,” states the report. “Adopters point to ‘complexity of the IoT solution’ as the largest concern around IoT, a concern that non-adopters have yet to consider fully.” Among those who have taken IoT initiatives, over half of them mentioned concerns about complexity, compared to only a quarter of those who have not yet taken the first step.

Other top concerns included security, ease of integration with existing systems, and the expense of implementation. These commonly-held concerns are undoubtedly part of the reason for the reluctance of others to undertake IoT projects on their own.  The majority of them responded that they are planning to work with an IoT-capable partner.

“Based on past experience of our adopters, companies who haven’t yet adopted IoT initiatives should not go it alone,” the report recommends. “Instead they should focus on finding partners whose core competency is connecting products securely.”

The report suggests that an ideal partner should not only have a technology platform, but should be able to simplify the complexity of the IoT.  Such a partner ensures that security is not an ad-hoc afterthought, but is inherent to the design of the system itself.  The partner should also be able to integrate the IoT solution easily with existing and legacy systems, and offer significant cost savings and ROI.