Skkynet Opens OPC UA Lab in Yokohama

Hardware and software vendors and users now have a facility for testing OPC UA products.

Mississauga, Ontario, July 12, 2018 – Skkynet Cloud Systems, Inc. (“Skkynet” or “the Company”) (OTCQB: SKKY), a global leader in real-time cloud information systems, announces the opening of the Skkynet UA Lab in Yokohama, Japan.  Working in close cooperation with Skkynet partner BellChild and OPC development company Puerto Co., the Skkynet UA Lab will provide testing, development, and advisory services for users and vendors of OPC UA-enabled hardware and software.

“We are pleased to offer this center of excellence to the OPC UA community in Japan and worldwide,” said Paul Thomas, President of Skkynet.  “OPC UA is a robust and secure industrial protocol, and we expect to see a steady increase in implementations.”

The Skkynet UA Lab offers three kinds of support for its clients.  For developers who need to test OPC UA projects, the UA Lab provides powerful OPC UA servers with a large assortment of evaluation software.  For vendors who have a product on the market, the UA Lab offers an interoperability test environment, allowing each vendor’s product to connect with and test against all other vendors’ products.  In addition, the UA Lab has inaugurated its Worldwide OPC UA Network DEvelopment and Research (WONDER) program to test various ways of connecting OPC UA systems.  These include testing on untrusted networks using push technologies, utilizing proxy-enabled DMZs, tunnelling through firewalls, and incorporating the OPC UA Pub/Sub specification when implementations become available.

“OPC UA is a sophisticated protocol, demanding much attention to detail,” said Minoru Yamazaki, advisor to Skkynet Japan and project organizer.  “Real-world experience is the ultimate evaluation criterion, and we are grateful for the support we have been receiving from a number of OPC UA product distributors, including MOXA, Contec, Comtrol, and Hi-Flying, as well as BellChild, and Puerto.”

Skkynet’s DataHub middleware, SkkyHub service, and ETK provide secure access to industrial data through OPC UA and other protocols, allowing users to fully integrate OT (operations technology) with IT systems and other applications anywhere in the world. Secure by design, it requires no VPN, no open firewall ports, no special programming, and no additional hardware. Secure integration of embedded devices, on-premise systems, and remote locations through seamless, end-to-end connectivity in real time lets users derive maximum value from Industrial IoT and Industrie 4.0.

About Skkynet

Skkynet Cloud Systems, Inc. (OTCQB: SKKY) is a global leader in real-time cloud information systems. The Skkynet Connected Systems platform includes the award-winning SkkyHub™ service, DataHub®, WebView™, and Embedded Toolkit (ETK) software. The platform enables real-time data connectivity for industrial, embedded, and financial systems, with no programming required. Skkynet’s platform is uniquely positioned for the “Internet of Things” and “Industry 4.0” because unlike the traditional approach for networked systems, SkkyHub is secure-by-design. For more information, see

Safe Harbor

This news release contains “forward-looking statements” as that term is defined in the United States Securities Act of 1933, as amended and the Securities Exchange Act of 1934, as amended. Statements in this press release that are not purely historical are forward-looking statements, including beliefs, plans, expectations or intentions regarding the future, and results of new business opportunities. Actual results could differ from those projected in any forward-looking statements due to numerous factors, such as the inherent uncertainties associated with new business opportunities and development stage companies. Skkynet assumes no obligation to update the forward-looking statements. Although Skkynet believes that any beliefs, plans, expectations and intentions contained in this press release are reasonable, there can be no assurance that they will prove to be accurate. Investors should refer to the risk factors disclosure outlined in Skkynet’s annual report on Form 10-K for the most recent fiscal year, quarterly reports on Form 10-Q and other periodic reports filed from time-to-time with the U.S. Securities and Exchange Commission.

What Makes an Ideal Protocol for IIoT?

If you want to ship goods, you need infrastructure.  Trucks, trains, ships, and planes rely on highways, tracks, ports, and airports.  In a similar way, a key element of the Industrial IoT (IIoT) infrastructure is the data protocol.  Just as there are many transportation modes to choose from (some better than others), there are a number of IIoT protocols on offer, and they are not all the same.

Since the IIoT is still quite new, it has been an ongoing question as to what makes an ideal IIoT protocol.  With limited experience in this new sphere, many early adopters have looked to existing protocols.  For example, companies are currently using or considering the MQTT or AMQP messaging protocols, the REST web services approach, or the OPC UA industrial protocol.  Each of these works fine in its own application space, and each seems like it could work as an IIoT protocol.  But are any of these really suited to the task?  Or is there something better out there?

9 Criteria for an Ideal Protocol

To answer that question, we did a comparison.  We distilled over 20 years of hands-on experience in industrial data protocols and TCP networking into 9 criteria for what makes an ideal protocol for IIoT.  The results are summarized in a new white paper, IIoT Protocol Comparison.

These 9 criteria cover all of the essential areas of high-quality industrial data communication, like real-time performance and interoperability.  They also cover the broader arena of the Internet, with its greater security risks, variations in bandwidths and latencies, and multi-node architectures.  The white paper considers specific criteria for each of these in turn, and provides a simple explanation of how each of the protocols does or does not meet them.

If you’ve been following the growth and development of Skkynet over the years, the results of the comparison should come as no surprise.  The only protocol we are aware of that was designed from the ground up to provide secure networking of industrial data both on-premise and over the Internet is DHTP.  DHTP is what our products and services have been using for over 20 years, and it is one of the keys to their success.  We invite you to read the white paper, consider the criteria, and see for yourself what makes an ideal protocol for IIoT.

IIoT Protocol Comparison

What Makes an Ideal IIoT Protocol?

A good IIoT protocol is the basis for effective IIoT data communication. Without a secure, robust IIoT protocol, data can be late, missing, inconsistent, or dangerously incorrect, leading to costly errors and wasted time.

With the IIoT still in its infancy, companies have turned first to familiar, well-tested data communication and messaging protocols such as MQTT, AMQP, REST and OPC UA for an IIoT protocol. Valid as these may be for their designed purposes, they were never intended to support IIoT data communication. Thus, when evaluated according to criteria for a robust, secure Industrial IoT implementation, they all come up somewhat short.

Skkynet’s software and services are designed for the IIoT, and meet all of the criteria for effective data communication. Here we provide a comparison report on how well MQTT, AMQP, REST, OPC UA, and Skkynet’s own DHTP (DataHub Transfer Protocol) meet the criteria summarized in the above table for an ideal IIoT protocol.  Each of the criteria enumerated above is explained in further detail in subsequent sections.

DHTP Protocol Comparison - Closed Firewalls

Keeps all inbound firewall ports closed for both data sources and data users.

Keeping all inbound firewall ports closed at the plant resolves many security issues for Industrial IoT. MQTT, AMQP, REST and DHTP meet this criterion. OPC UA does not because it has a client/server architecture, which requires at least one firewall port be open on the server side (typically the plant) to allow for incoming client connections. This is an unacceptable risk for most industrial systems. Skkynet’s DataHub and ETK connect locally to servers and clients in the plant, and make outbound connections via DHTP to SkkyHub running on a cloud server, or to another DataHub running on a DMZ computer. This outbound connection keeps all inbound firewall ports closed and hides the plant from the outside world.
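
The outbound-only pattern described above can be sketched in a few lines of Python. This is an illustrative model, not Skkynet’s actual API; the `Relay` and `PlantAgent` names are invented for the example. The key point is that the relay never initiates a connection to the plant: commands travel back over the session the plant itself opened, so no inbound port at the plant needs to be open.

```python
class Relay:
    """Cloud/DMZ relay: accepts connections, never initiates them."""
    def __init__(self):
        self.sessions = {}

    def accept(self, agent_id, send_callback):
        # The agent initiated this connection; the relay only
        # remembers how to reply over it.
        self.sessions[agent_id] = send_callback

    def push_command(self, agent_id, command):
        # Commands to the plant ride the agent's existing outbound session.
        self.sessions[agent_id](command)


class PlantAgent:
    """Plant-side agent: makes a single outbound connection."""
    def __init__(self, agent_id, relay):
        self.inbox = []
        relay.accept(agent_id, self.inbox.append)  # one outbound dial


relay = Relay()
agent = PlantAgent("plant-1", relay)
relay.push_command("plant-1", {"tag": "valve7", "value": "close"})
```

Even bidirectional supervisory control works this way: the direction of the TCP connection and the direction of the data flow are independent.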

DHTP Protocol Comparison - Low Bandwidth

Consumes minimal bandwidth, while functioning with the lowest possible latency.

One goal of any industrial communication or IIoT protocol is to consume as little bandwidth as possible, while functioning with the lowest possible latency. MQTT and AMQP do this well. REST does not, because every transaction includes all of the socket set-up time and communication overhead. OPC UA meets this criterion only partially, because it uses a smart polling mechanism that trades bandwidth for latency. Skkynet software and services maintain a persistent connection and transmit only the data via DHTP, consuming very little bandwidth at very low latencies.
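
A toy calculation makes the difference concrete. All of the message sizes and rates below are assumed example numbers, not measurements: per-request polling pays request plus response overhead every cycle even when nothing has changed, while an event-driven connection sends a small message only when a value actually changes.

```python
# Assumed example numbers: one poll per second for 10 minutes,
# during which the monitored value changed only 12 times.
poll_cycles = 600
changes = 12
request_bytes, response_bytes, event_bytes = 300, 250, 40

# Polling: every cycle costs a full request/response round trip.
polling_total = poll_cycles * (request_bytes + response_bytes)

# Event-driven: traffic only when the value changes.
event_total = changes * event_bytes
```

Under these assumptions the polling approach moves hundreds of times more bytes for the same information, which is the bandwidth cost the paragraph above attributes to REST.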

DHTP Protocol Comparison - Ability to Scale

Can support hundreds or thousands of interconnected data sources and users.

An important aspect of the Internet of Things is the vision of connecting hundreds, thousands, and even millions of things via the Internet, and providing access to the data from any single thing, or groups of things to any number of clients. Event-driven protocols like MQTT and AMQP allow for this kind of scaling up, while REST’s polling model prevents it. OPC UA is also event-driven, and so theoretically can scale up, but its underlying polling model does not allow for very large numbers of simultaneous connections. DHTP abstracts the data from the protocol across the connection, and also implements an event-driven model, which allows it to scale up well.

DHTP Protocol Comparison - Real-Time

Adds virtually no latency to the data transmission.

Any kind of remote HMI or supervisory control system is much more effective when functioning in at least near-real time. Propagation delays of one or more seconds may be tolerable under certain conditions or for certain use cases, but they are not ideal. AMQP and MQTT offer real-time behavior only if they are not operating with a delivery guarantee. That is, if you choose the “guaranteed delivery” quality of service then a slow connection will fall further and further behind real-time. By contrast, DHTP guarantees consistency, not individual packet delivery, and can sustain that guarantee in real time on a slow connection. REST simply has too much connection overhead to allow real-time performance in most circumstances. OPC UA, being an industrial protocol, meets this criterion well.
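
A small simulation illustrates the tradeoff, using assumed rates rather than benchmark data: when a source emits samples faster than a slow link can deliver them, a “guaranteed delivery” queue grows without bound, and latency grows with it, whereas a consistency guarantee holds at most the newest sample and stays current.

```python
# Assumed example rates: source emits 10 samples/sec,
# the slow link delivers only 4/sec, for 60 seconds.
produced_per_sec, delivered_per_sec, seconds = 10, 4, 60

# Guaranteed delivery: every undelivered sample stays queued.
backlog_guaranteed = 0
for _ in range(seconds):
    backlog_guaranteed += produced_per_sec - delivered_per_sec

# Consistency guarantee: superseded samples are dropped,
# so at most the single newest sample waits for the link.
backlog_consistent = 1

# After one minute, the guaranteed-delivery client is this far
# behind real time (backlog divided by delivery rate):
lag_seconds = backlog_guaranteed / delivered_per_sec
```

After just one minute of this assumed load, the guaranteed-delivery client is a minute and a half behind real time and falling further behind, while the consistency-based client is effectively current.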

DHTP Protocol Comparison - Interoperable Data Format

Encodes the data so that clients and servers do not need to know each other’s protocols.

A well-defined data format is essential for interoperability, allowing any data source to communicate seamlessly with any data user. Interoperability was the primary driving force behind the original OPC protocols, and is fully supported by the OPC UA data format. Any Industrial IoT software or service should support at least one, if not multiple interoperable data formats. Skkynet’s DataHub software and ETK support several, and allow for real-time interchange between them and DHTP. MQTT, AMQP and REST do not support interoperability between servers and clients because they do not define the data format, only the message envelope format. Thus, one vendor’s MQTT server will most likely not be able to communicate with another vendor’s MQTT client, and the same is true for AMQP and REST.
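
The distinction between envelope and payload is easy to see in code. MQTT or AMQP will deliver the payload bytes faithfully, but says nothing about their structure; for two vendors to interoperate, they must agree on a payload layout out of band. The JSON layout below is a hypothetical example of such an agreement, not an OPC UA or DHTP format.

```python
import json

def encode_point(name, value, quality, timestamp):
    """Serialize one data point into an agreed-upon JSON layout.
    The field names here are a hypothetical shared schema."""
    return json.dumps({"name": name, "value": value,
                       "quality": quality, "t": timestamp}).encode()

def decode_point(payload):
    """Any client that knows the schema can recover the fields."""
    return json.loads(payload.decode())

wire = encode_point("pump1.flow", 42.7, "good", 1530000000)
point = decode_point(wire)
```

Without some such agreed format, one vendor’s publisher and another vendor’s subscriber exchange bytes successfully yet understand nothing, which is the interoperability gap described above.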

DHTP Protocol Comparison - Intelligent Overload

A messaging broker responds appropriately when a data user is unable to keep up with the incoming data rate.

Overload handling refers to how the broker responds when a client is unable to keep up with the incoming data rate, or when the server is unable to keep up with the incoming data rate from the client. MQTT and AMQP respond in one of two ways. Either they block, effectively becoming inoperative and blocking all clients. Or they drop new data in favor of old data, which leads to inconsistency between client and server. REST saturates its web server and becomes unresponsive. OPC UA attempts to drop old data in favor of new data, but consumes massive amounts of CPU resources to do so. When needed, Skkynet’s DataHub and SkkyHub can drop old data efficiently, and using DHTP they guarantee consistency between client and server even over multiple hops. Data coming from or going to overloaded clients remains consistent, and all other clients are unaffected.
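
One way to drop old data efficiently while preserving consistency is to keep at most one pending sample per tag, so a reader that falls behind receives the current value of every tag rather than a growing backlog. The sketch below illustrates the idea only; it is not DHTP’s actual mechanism.

```python
class ConsistencyQueue:
    """Keep at most one pending sample per tag: a slow reader sees
    the current value of every tag, not an ever-growing backlog."""
    def __init__(self):
        self.pending = {}  # insertion-ordered in Python 3.7+

    def put(self, tag, value):
        self.pending.pop(tag, None)   # supersede any queued sample
        self.pending[tag] = value

    def drain(self):
        """Deliver everything pending, in arrival order of the
        surviving samples, and reset the queue."""
        items = list(self.pending.items())
        self.pending = {}
        return items


q = ConsistencyQueue()
for tag, v in [("a", 1), ("b", 2), ("a", 3), ("a", 4)]:
    q.put(tag, v)
# A reader that finally catches up gets only the latest state:
# intermediate values of "a" were superseded, never delivered.
```

Dropping a superseded sample is a constant-time dictionary operation, so this style of overload handling stays cheap even under heavy load.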

DHTP Protocol Comparison - Propagation of Failure Notification

Each client application knows with certainty if and when a connection anywhere along the data path has been lost, and when it recovers.

Most protocols do not provide failure notification information from within the protocol itself, but rather rely on clients to identify that a socket connection is lost. This mechanism does not propagate when there is more than one hop in the communication chain. Some protocols (such as MQTT) use a “last will and testament” that is application-specific and thus not portable, and which is only good for one connection in the chain. Clients getting data from multiple sources would need to be specifically configured to know which “last will” message is associated with which data source. In MQTT, AMQP, REST and OPC UA alike, the protocol assumes that the client will know how many hops the data is traversing, and that the client will attempt to monitor the health of all hops. That is exceptionally fragile, since knowledge about the data routing must be encoded in the client. In general, this cannot be made reliable. DHTP propagates not only the data itself, but information about the quality of the connection. Each node is fully aware of the quality of the data, and passes that information along to the next node or client.
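
The propagation idea can be sketched as follows. This is an invented model, not DHTP’s wire format: instead of bare values, each hop forwards a value together with a quality flag, and a node that loses its upstream link re-forwards its last value with a degraded quality. Every downstream client then learns of the failure without knowing anything about the route.

```python
class Node:
    """One hop in a data chain that forwards (value, quality) pairs."""
    def __init__(self, downstream=None):
        self.downstream = downstream
        self.latest = None

    def receive(self, value, quality):
        self.latest = (value, quality)
        if self.downstream:
            self.downstream.receive(value, quality)

    def upstream_lost(self):
        # Keep the stale value but mark it, and propagate the mark,
        # so clients know the reading is no longer live.
        if self.latest:
            self.receive(self.latest[0], "not-connected")


client = Node()
middle = Node(downstream=client)
middle.receive(98.6, "good")
middle.upstream_lost()
```

The client never monitored the upstream hop, yet it now holds the last known value marked as disconnected, which is exactly the information an operator display needs.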

DHTP Protocol Comparison - Quality of Service

Guarantees consistency of data, preserved through multiple hops.

An important goal of the IIoT is to provide a consistent picture of the industrial data set, whether for archival, monitoring, or supervisory control. MQTT’s ability to guarantee consistency of data is fragile because its Quality of Service options only apply to a single hop in the data chain. And within that single hop, delivery can be guaranteed only at the expense of losing real-time performance. Real-time performance can be preserved, but only by dropping messages and allowing data to become inconsistent between client and server. AMQP’s ability to guarantee consistency of data is fragile because like MQTT it only applies to a single hop in the chain. Additionally, its delivery guarantee blocks when the client cannot keep up with the server and becomes saturated. REST provides no Quality of Service option, and while OPC UA guarantees consistency it cannot work over multiple hops. DHTP guarantees consistency, and the guarantee is preserved through any number of hops.

DHTP Protocol Comparison - Can Daisy Chain?

Brokers can connect to other brokers to support a wide range of collection and distribution architectures.

The requirements of the IIoT take it beyond the basic client-to-server architecture of traditional industrial applications. To get data out of a plant and into another plant, corporate office, web page or client location, often through a DMZ or cloud server, typically requires two or more servers, chained together. The OPC UA protocol is simply too complex to reproduce in a daisy chain, and information would be lost in the first hop. Attempts to daisy chain some aspects of the OPC UA protocol would result in synchronous multi-hop interactions that would be fragile on all but the most reliable networks, and would incur high latencies. Nor would OPC UA chains provide access to the data at each node in the chain. REST servers could in theory be daisy chained, but the chain would be synchronous, and would not provide access to the data at each node. MQTT and AMQP can be chained, but each node must be aware that it is part of the chain, and must be individually configured. The QoS guarantees in MQTT and AMQP cannot propagate through the chain, so daisy chaining makes data at the ends unreliable.

Skkynet’s DataHub and SkkyHub both support daisy-chained servers because DHTP allows them to mirror the full data set at each node, providing access to that data both to qualified clients and to the next node in the chain. The DHTP QoS guarantee states that any client or intermediate point in the chain will be consistent with the original source, even if some events must be dropped to accommodate limited bandwidth.
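
The mirroring idea can be sketched simply. This is an illustrative model, not DHTP itself, and the node names are invented: each node in the chain holds a full copy of the data set, so any node can serve its local clients while forwarding updates to the next hop.

```python
class MirrorNode:
    """One hop in a daisy chain that mirrors the full data set."""
    def __init__(self, next_hop=None):
        self.data = {}
        self.next_hop = next_hop

    def update(self, tag, value):
        self.data[tag] = value          # local clients read from here
        if self.next_hop:
            self.next_hop.update(tag, value)


# Plant -> DMZ -> cloud: three chained nodes.
cloud = MirrorNode()
dmz = MirrorNode(next_hop=cloud)
plant = MirrorNode(next_hop=dmz)

plant.update("boiler.temp", 180)
plant.update("boiler.pressure", 2.5)
# Every hop now holds the same data set, so a client can attach
# at the plant, the DMZ, or the cloud and see identical values.
```

Because each node holds the whole set, a client attaching mid-chain needs no knowledge of where the data originated, which is what makes the architecture scale across DMZs and cloud servers.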

In Conclusion

This overview of effective IIoT data communication is far from exhaustive. It introduces the subject and highlights some key concepts by sharing what we have found to be essential criteria for evaluating the protocols currently on offer. Because none of MQTT, AMQP, REST, or OPC UA was designed specifically for use in the Industrial IoT, it is not surprising that they do not fulfill these criteria. DHTP, on the other hand, was created specifically to meet the needs of effective industrial and IIoT data communication, making it an ideal choice for an IIoT protocol.

Developing DHTP, the Ideal Protocol for IIoT

Ever since the concept of the Industrial IoT (IIoT) became popular, people have been trying to find the ideal protocol for it.  After all, IIoT is something new.  As the “Internet of Things,” it clearly involves data travelling across the Internet.  But because it is also “Industrial”, it requires more than the common Internet protocols like FTP or HTTP to do the job.  The best choice for an IIoT protocol is one that has been designed from the ground up to fulfill both industrial and Internet requirements.

Here at Skkynet we use such a protocol every day—DHTP (DataHub Transfer Protocol).  From its inception over 20 years ago, DataHub technology involved connecting disparate systems in real time over a network and the Internet.  It all started back in the ’90s with a product called Cascade Connect that exchanged data between programs running on a QNX real-time operating system, and the InTouch HMI running in Windows.  Cascade Connect used two connectors, precursors of DataHub, one running in QNX and the other in Windows.  Each of these connected to programs running on their respective operating systems using standard industrial protocols, and they also connected to each other using TCP over a network.  The protocol they used to connect over TCP way back then has evolved into what we now call DHTP.

An Open Protocol

DHTP was made open from the start, with a published Cogent API.  Each subsequent Cogent product, such as the Cascade DataHub, the Gamma scripting language, and the Cascade Historian, was accessible through the Cogent API.  As the DataHub product evolved to become the OPC DataHub and then the Cogent DataHub, more commands were added, and the API was made available in Windows.  Today DHTP consists of the DataHub API and DataHub Command Set.

Meeting the Needs

Each step of this evolutionary process took place within an industrial context, in response to the needs of specific projects.  As our customers demanded more robust and secure data communication over TCP, we improved DHTP capabilities by adding SSL and other features.  Nowhere is that more obvious than in the success of the Cogent DataHub for OPC tunnelling applications.  The DataHub DA Tunneller and DataHub UA Tunneller are unrivalled for their ability to connect OPC servers and clients across a network or the Internet.

Cloud and Embedded

As one of the first companies to recognize the value of industrial communications via the cloud, Skkynet enhanced DHTP with WebSocket capability for DataHub-to-SkkyHub connectivity.  DHTP’s unique, patented ability to support secure, outbound connections from industrial systems for bidirectional communication without opening any firewall ports is key to Skkynet’s secure-by-design architecture.  The introduction of the ETK for embedded systems a few years later completed the picture. DHTP is now the standard protocol used by DataHub, SkkyHub, and the ETK, the three core components of Skkynet’s IIoT products and services.

In our next blog we will explain in more detail why DHTP is the ideal protocol for the IIoT.  We will provide an overview of the criteria for effective IIoT data communications, and show how DHTP meets all of them.  As you learn more about DHTP, keep in mind that its success as an IIoT protocol is due to how it was developed—in the challenging environment where industrial and Internet communications meet.

IIoT Security: Attacks Grow More Likely, Users Unaware

A few weeks ago hackers of industrial systems reached a new milestone. For the first time in history, someone was able to break into the safety shutdown system of a critical infrastructure facility. Roaming undetected through the system for an unknown amount of time, the hackers were finally stopped when they inadvertently put some controllers into a “fail-safe” mode that shut down other processes, alerting plant staff that something was wrong.

The danger was not just in the safety mechanisms themselves, but for the whole plant. “Compromising a safety system could let hackers shut them down in advance of attacking other parts of an industrial plant, potentially preventing operators from identifying and halting destructive attacks,” said cyber experts interviewed by Reuters.

Plan Ahead

That facility was lucky this time around. What about next time? What about the next plant? Rather than relying on luck, it is better to plan for the future. As attacks grow more likely, those systems that are secure by design, that offer zero attack surface, that are undetectable on the Internet, stand a much better chance. This has always been Skkynet’s approach, and as the threats increase, it makes more and more sense.

In fact, the industrial world is largely unprepared for these kinds of attacks. Having evolved for decades cut off from the Internet, until recently there has been little need to change. And a surprising number of users seem unwilling to acknowledge the risks. According to a recent article in Ars Technica, hundreds of companies across Europe are running a popular model of Siemens PLC (programmable logic controller) with TCP port 102 open to the Internet. “It’s an open goal,” commented security researcher Kevin Beaumont.

Government Mandates

The situation has attracted the attention of governments, who realize the need to protect critical infrastructure for the sake of their citizens. The United Kingdom has issued a new directive authorizing regulators to inspect cyber security precautions taken by energy, transport, water and health companies, reports the BBC. The National Cyber Security Centre has published guidelines, and companies that fail to comply are liable for fines of up to 17 million pounds. “We want our essential services and infrastructure to be primed and ready to tackle cyber-attacks and be resilient against major disruption to services,” said Margot James, Minister for Digital.

IT to OT Challenges

What has brought all of this into focus over the past few years has been the increased awareness of a need for process data outside of the production facility. Companies are recognizing the value of the data in their OT (operational technology) systems, and want to integrate it into their IT systems to help cut costs and improve overall efficiency for the company as a whole. What they may not realize is that the tools of IT were not designed for the world of OT, and the security practices of OT are not adequate for the Internet.

The WannaCry virus that affected many companies worldwide last year is a case in point. Companies using VPNs to protect their IT-to-OT connections found out first-hand that a VPN merely extends the security perimeter of the plant out into an insecure world. A breach in an employee email can expose the whole plant to the threat of a shutdown. “WannaCry is the personification of why computers on the corporate networks should not be directly connected to OT networks,” according to Gartner Analyst Barika Pace in a recent report, Why IIoT Security Leaders Should Worry About Cyberattacks Like WannaCry, January 30, 2018. “It is also the reflection of the inevitable convergence of IT and OT. Based on your risk tolerance and operational process, segmentation, where possible, is still critical.”

Segment Your Systems

By segmentation, Pace means dividing networks into security zones, and maintaining security between each zone through the use of firewalls, DMZs, data diodes and other similar technologies to ensure that if one system gets hacked, it cannot affect others. Segmentation is part of a secure-by-design approach that Skkynet endorses and provides. Our software and services offer a way to connect IT and OT systems through DMZs or the cloud without opening any inbound firewall ports.

A Siemens PLC in this kind of segmented system could be accessed by authorized parties, and exchange data in both directions, without opening TCP port 102 to the Internet. Managers of critical infrastructure that implement this secure-by-design approach to segmentation are not only ready for government inspection, they have taken the best precaution against those who would intrude, hack, and attack their mission-critical systems.

As attacks on critical infrastructure become more likely, users must become aware, and prepare. The acknowledged benefits of IIoT need not entail unnecessary risk—securing an industrial system can be done, and done well. A big step is to segment your OT system through a secure-by-design approach, such as that offered by Skkynet.

Wider Adoption of IIoT Forecast for 2018

With the New Year upon us, now is the time to look back at 2017 to see how far we’ve come, and look ahead to see what’s on the horizon.  After sifting through a number of predictions, it seems that most of the pundits agree that the forecast is good.  The Industrial IoT continues to grow steadily in popularity, as it becomes one of the leading application spaces for the IoT.

“There’s no question the industrial side of IoT is growing rapidly,” said Bret Greenstein, VP of IBM’s Watson IoT Consumer Business.  “In a way, it’s kind of supercharging manufacturing operators and people who do maintenance on machines by providing real-time data and real-time insights.”

“It’s clear that the internet of things is transforming the business world in every industry,” says Andrew Morawski, President and Country Chairman of Vodafone Americas. “As the technology has evolved over time, adoption among businesses has skyrocketed.”

Finding business cases

As part of this growth, the forecast is to see companies begin to apply the knowledge they have gained from small-scale test implementations and pilots to build solid use cases for IIoT technology.  “The focus is shifting from what the IoT could do to what it does, how it fits in business goals and how it generates value,” said J-P De Clerck, technology analyst at i-SCOOP.  We have seen this among our customers here at Skkynet, and we plan to share some of their experiences and use cases later this year.

Edge computing becoming a necessity

Most analysts foresee growth of edge computing as part of an overall IIoT solution.  As we explain in a recent Tech Talk, edge computing means doing some data processing directly on an IoT sensor or device, as close as possible to the physical system, to reduce bandwidth and processing on cloud systems. Daniel Newman, a Forbes contributor, says, “Edge networking will be less of a trend and more of a necessity, as companies seek to cut costs and reduce network usage.” He sees IT companies like Cisco and Dell supporting the move to edge computing in IIoT hardware, as well as the industrial providers that you would expect, such as GE and ABB.
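
A minimal edge-filtering sketch shows the idea: process readings on the device and publish only significant changes (a “deadband”), cutting the bandwidth and cloud-side processing described above. The threshold and sample values are assumed examples.

```python
def deadband_filter(readings, threshold=0.5):
    """Keep only readings that differ from the last published
    reading by more than the threshold (an assumed example value)."""
    published = []
    last = None
    for r in readings:
        if last is None or abs(r - last) > threshold:
            published.append(r)
            last = r
    return published


# Six raw temperature samples on the device; only the
# significant changes are published upstream.
samples = [20.0, 20.1, 20.2, 21.0, 21.1, 19.9]
sent = deadband_filter(samples)
```

Half the samples never leave the device in this example, and the saving grows with sample rate, which is why analysts call edge filtering a necessity rather than a trend.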

Security remains a fundamental challenge

There is one thing that pretty much every analyst and pundit agrees on: security is still a challenge.  Various ideas are being discussed.  One commentator suggested that companies making large investments in IIoT have gained or eventually will gain the expertise and resources needed to meet the challenge.  Others suggest that an altogether new model might be necessary.  “We have reached a point in the evolution of IoT when we need to re-think the types of security we are putting in place,” said P.K. Agarwal, Dean of Northeastern University’s Silicon Valley campus, in a recent Network World article. “Have we truly addressed the unique security challenges of IoT, or have we just patched existing security models into IoT with hope that it is sufficient?”

As we see it, patching up existing models is not the answer.  Providing secure access to industrial data in real time over the Internet is not something that traditional industrial systems were designed to do.  As more and more IIoT implementations come online, and as companies search for robust systems that can scale up to meet their production needs, we believe they will come to that realization as well.  Our forecast for 2018 is that an increasing number of those companies will begin to realize the value of an IIoT system that is secure by design.