The Benefits of Harnessing Live Data

The data is pouring in.  The flow started as a mere trickle of hand-written records on clipboards in the early days of mechanical and pneumatic automation.  It grew to a steady stream with the introduction of PLCs (programmable logic controllers) and SCADA (Supervisory Control and Data Acquisition) systems pooling data automatically.  Now, with the advent of IoT and digital transformation, live data is gushing through industrial systems in a mighty torrent.

As with the flow of water, this flow of live data has power. Harnessing it can mean more efficient operations, savings in labor and material costs, and overall improvements in quality.  What’s needed is software to facilitate the collection and analysis of data, and the distribution of results, in real time.

This is what a recent survey of 500 mid-level manufacturing professionals suggests.  The Plutoshift report, The Challenge of Turning Data Into Action, says over three quarters of respondents agreed that “in order to take immediate action based on collected data, they need software solutions that analyze data in real-time.”

Problem: Manual data entry

Summing up the report’s findings: despite well-known benefits of digital transformation, the adoption rate has been low.  Only 12% of those surveyed have configured their systems to respond automatically to incoming data.  The common feeling is that data inputs are not reliable enough for automated response.  About half of the respondents are still using manual data entry.  This in itself can introduce errors, and perhaps worse, the data almost immediately goes stale until the next manual entry is made.  The more stale the data gets, the more likely it will be incorrect.  And an automated response to stale data could be catastrophic.
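The stale-data risk described above can be guarded against in code before any automated response fires. Here is a minimal sketch; the data-point structure with a `timestamp` field and the 60-second freshness threshold are illustrative assumptions, not part of any specific product:

```python
# Hypothetical sketch: refuse automated action on stale data.
# The data-point shape and the max_age_s default are illustrative assumptions.
import time

def is_fresh(point: dict, max_age_s: float = 60.0, now: float = None) -> bool:
    """A reading is usable for automated response only if it is recent enough."""
    now = time.time() if now is None else now
    return (now - point["timestamp"]) <= max_age_s

# A value keyed in by hand an hour ago on a clipboard walk-through:
reading = {"value": 42.0, "timestamp": time.time() - 3600}
print(is_fresh(reading))  # False: too old to act on automatically
```

An automated system would fall back to alerting an operator, rather than acting, whenever the check fails.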

For example, a machine may only be checked by an operator once per day on a plant floor walk-through.  If it develops an irregular vibration, it could be hours before anyone notices.  An automated system relying on manual data input might keep it running, possibly damaging the equipment.  On the other hand, an inexpensive IoT sensor on the machine could send a notification as soon as a problem is detected, and trigger an alarm or an automatic speed adjustment until an operator can take remedial action.
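The scenario above amounts to a simple threshold check on each incoming reading. In this sketch everything is illustrative: the vibration limit, the reduced speed, and the function name are assumptions, not part of any real sensor or controller API:

```python
# Hypothetical sketch: automated response to a live vibration reading.
# The limit and reduced speed below are illustrative values only.
VIBRATION_LIMIT_MM_S = 7.1
REDUCED_SPEED_RPM = 900

def on_sensor_update(vibration_mm_s: float, current_rpm: int) -> dict:
    """Decide what to do when a new vibration reading arrives."""
    if vibration_mm_s > VIBRATION_LIMIT_MM_S:
        # Alert the operator and slow the machine until someone intervenes.
        return {"alarm": True, "set_rpm": min(current_rpm, REDUCED_SPEED_RPM)}
    return {"alarm": False, "set_rpm": current_rpm}

print(on_sensor_update(9.4, 1800))  # exceeds limit: alarm and slow down
print(on_sensor_update(2.1, 1800))  # normal operation, no change
```

In practice the limit would come from the machine's specifications or a vibration standard, and the response would be issued through the plant's control system rather than returned as a dictionary.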

Once the data is streaming in, there are many companies out there like Plutoshift that can help manage it.  Skkynet’s focus is the data stream itself—to ensure it is secure, reliable, and up to date—to the millisecond.  This will allow those who use the data to take full advantage of automated response mechanisms, to actively participate in digital transformation. Like the human nervous system relaying data from the outside world, effective digital transformation depends on harnessing live data.  After all, you can only know as much about your world, or your system, as the data tells you.

Collecting Big Data in Real Time

It was bound to happen.  The two titans meet.  The gargantuan grasp of Big Data turns its ever-open hands towards the firehose stream of real-time data.  “The next evolution of the big data phenomenon has turned out to be real time streaming of data,” says Big Data pundit Rick Delgado in a recent blog: What Real Time Streaming Means for Big Data.  “Organizations have an increased need to gather and analyze their data at the same time, making real time data streaming a must if big data is going to keep up with demand.”

Will Big Data ever be satisfied?  Not as long as the demand for informed action continues to grow.  Will we ever run out of real-time data?  Not as long as stuff keeps happening.  The only thing necessary to complete this marriage is to make the connection, and stream real-time data into the welcoming, capable hands of Big Data.

This is what we are keen on.  With our established track record in real-time industrial data communications, we anticipated this need for real-time analytics years ago, along with other thought leaders.  In a blog post back in 2011 we quoted Paul Maritz, then President and CEO of VMware, from his keynote address on the future of cloud computing at VMWorld 2011: “People are going to have to be able to react to information coming in, in real time.”  Since then we’ve been putting the vision into action, and it’s great to now see the Big Data people coming on board.

Real-Time Analytics from Big Data

The advantage of live connectivity to Big Data is that you can now do your analytics in real time.  Delgado sees this clearly.  Real-time inputs to Big Data, he says, can fuel near-real-time outputs.  Rather than a two-stage process of storing the data and then analyzing it, the analysis can take place on the fly, and your system can function like the mind of an athlete, jazz musician, pilot, or soldier.  Insights become more spontaneous, and reactive responses are replaced by proactive initiatives.  The competitive advantage goes to those who can better anticipate and immediately meet customer demands, increasing customer satisfaction and establishing greater loyalty.

Delgado lists a number of areas where real-time streaming to Big Data could have a significant impact. For example, certain types of fraudulent or suspicious patterns of trading in the financial sector that don’t show up in the aggregate could be spotted in real time.  Businesses could monitor customer behavior on websites and social media to provide people with exactly what they need, at the moment they want it.

Additional Benefits – Industrial Sector

Delgado mentions various application spaces, but leaves out a significant one: streaming real-time Big Data for industrial users.  Imagine the operator of a machine where an alarm light is flashing.  Looking at his smart phone or tablet, he gets not only the alarm and raw data from the machine, but also a real-time analysis of what could be wrong.  And along with that, he may receive suggested action steps based on comparing that data in real time to technical specs, historical records, and even live recommendations from the machine's manufacturer, who is also connected to it and monitoring it in real time.

Companies like GE are investing millions in such systems.  They collect and analyze in real time the Big Data coming from power turbines, jet engines, and other equipment during operation.  As the Industrial IoT gains acceptance, we see other companies, big and small, follow suit.  The value inherent in real-time data for making instantaneous decisions is too great to pass up.  The industrial sector, a large and long-time user of real-time data, stands to benefit significantly by connecting to Big Data.

Skkynet Technology Featured in IEEE Paper and Presentation

The feasibility and value of cloud-based data communications for power generation smart grid testbeds presented at IEEE General Meeting.

Mississauga, Ontario, July 19, 2016 – Skkynet Cloud Systems, Inc. (“Skkynet”) (OTCQB: SKKY), a global leader in real-time cloud information systems, announces that its SkkyHub™  technology supported research leading to a published paper presented at the IEEE Power and Energy Society General Meeting in Boston yesterday. The paper, “Cloud Communication for Remote Access Smart Grid Testbeds” by Mehmet H. Cintuglu and Osama A. Mohammed of Florida International University, concludes that “cloud communication can be successfully implemented for actual smart grid power systems test beds.”

“We are pleased that the IEEE has accepted this paper for publication,” said Paul Thomas, President of Skkynet. “This is a significant milestone in demonstrating the value of cloud-based, real-time data connectivity for industrial and infrastructure applications.”

The object of the research was to determine the effectiveness of cloud-based communication for integrating data coming from diverse, heterogeneous electrical system testbeds.  These testbeds allow students and researchers to quickly test and verify innovations and proof-of-concept systems. While networked testbeds are useful for testing large deployments of smart devices, traditional WAN approaches are costly.  “In cloud based systems operational costs are significantly reduced compared to dedicated high bandwidth wide area links which was previously a pre-requisite for creating successful networking test beds,” the paper states.

The cloud communications technology used for the research was Skkynet’s SkkyHub service, which the paper describes as “a SaaS platform providing secure end-to-end networking for smart grid devices such as IEDs and PMUs,” which can be “implemented on virtually any new or existing system at a low cost capital and provides a web-based human-machine-interface (HMI) for remote access and supervisory control.”

The SkkyHub service allows industrial and embedded systems to securely network live data in real time from any location. It enables bidirectional supervisory control, integration and sharing of data with multiple users, and real-time access to selected data sets in a web browser. The service is capable of handling over 50,000 data changes per second per client, at speeds of just microseconds over Internet latency. Secure by design, it requires no VPN, no open firewall ports, no special programming, and no additional hardware.

About Skkynet

Skkynet Cloud Systems, Inc. (OTCQB: SKKY) is a global leader in real-time cloud information systems. The Skkynet Connected Systems platform includes the award-winning SkkyHub™ service, DataHub®, WebView™, and Embedded Toolkit (ETK) software. The platform enables real-time data connectivity for industrial, embedded, and financial systems, with no programming required. Skkynet’s platform is uniquely positioned for the “Internet of Things” and “Industry 4.0” because unlike the traditional approach for networked systems, SkkyHub is secure-by-design. Customers include Microsoft, Caterpillar, Siemens, Metso, ABB, Honeywell, IBM, GE, BP, Goodyear, BASF, E·ON, Bombardier and the Bank of Canada. For more information, see

Safe Harbor

This news release contains “forward-looking statements” as that term is defined in the United States Securities Act of 1933, as amended and the Securities Exchange Act of 1934, as amended. Statements in this press release that are not purely historical are forward-looking statements, including beliefs, plans, expectations or intentions regarding the future, and results of new business opportunities. Actual results could differ from those projected in any forward-looking statements due to numerous factors, such as the inherent uncertainties associated with new business opportunities and development stage companies.  Skkynet assumes no obligation to update the forward-looking statements. Although Skkynet believes that any beliefs, plans, expectations and intentions contained in this press release are reasonable, there can be no assurance that they will prove to be accurate. Investors should refer to the risk factors disclosure outlined in Skkynet’s annual report on Form 10-K for the most recent fiscal year, quarterly reports on Form 10-Q and other periodic reports filed from time-to-time with the U.S. Securities and Exchange Commission.

Will Time-Sensitive Networking (TSN) Improve the IIoT?

Is current Internet technology sufficient for the needs of Industry 4.0 or the IIoT?  Or could it be better?  How can we enhance Ethernet to improve real-time data communications? These are the kinds of issues that some key players in Industrial IoT plan to address by developing the world’s first time-sensitive networking (TSN) infrastructure.

TSN has been defined as “a set of IEEE 802 standards designed to enhance Ethernet networking to support latency-sensitive applications that require deterministic network performance,” according to Mike Baciodore in a recent article in Control Design titled “How time-sensitive networking enables the IIoT.”

Put simply, the goal of TSN is to provide the IoT with the same kind of real-time performance that is now limited to individual machines like cars and airplanes, or to distributed control systems in industrial applications.  The Industrial Internet Consortium (IIC) has joined forces with Intel, National Instruments, Bosch Rexroth, Cisco, Schneider Electric, and others to achieve this goal and enable a truly real-time IoT.

TSN is Good News for Skkynet

This collaboration to develop TSN comes as good news to us here at Skkynet.  Since we currently provide secure, bidirectional, supervisory control capabilities over TCP, we understand how much more effective our software and services will be when supported by TSN.

With TSN, our latencies of a few milliseconds on top of Internet latency would be reduced to simply a few milliseconds, total.  Data dynamics would be better preserved, and system behavior would be more deterministic.  This effort to develop TSN validates our thinking that the IIoT works best with low-latency, high-speed networking.  Unlike those who operate on the assumption that web communication technology (REST) is the way forward, the TSN approach means that networked data communications can approximate or equal in-plant speeds and latencies.

Several participants and commentators on the TSN project point out that typical cloud architectures are not ideal counterparts for TSN.  Something fundamentally different is required.  Putting their individual ideas and suggestions together, what they envision for an architecture is remarkably close to what Skkynet currently provides.  It should be secure by design, fully integrate edge computing, and keep the system running without interruption during any network outages.  Above all, it must provide secure, selective access to any process data, in real time.

“One of the cool concepts out there is that people will want to have a cyberphysical representation of the equipment in the cloud,” said Paul Didier, solutions architect manager at Cisco. “That doesn’t mean the physical plant will be controlled in the cloud. Optimization and maintenance can be done in the cloud and will filter its way back to the machine.”

Our recent case study showcasing DataHub and SkkyHub technology illustrates this “cyberphysical representation.”  During the deployment and test of a mineral processing system, developers thousands of miles away monitored the machine logic and tweaked the system in real time. “It was as if we were sitting beside them in the control room,” said one of the team, “and through live monitoring, we were able to continue developing the application, thanks to the real-time connectivity.”

It’s a small step from this to machine control, and time-sensitive networking will be a welcome technology in that direction.  To the Industrial Internet Consortium (IIC) and everyone else involved in this project, we say keep up the great work!  We’re ready to put TSN to good use when it becomes available.

Case Study: PowerData, Caribbean

Caribbean resort facilities and power stations use DataHub to monitor system output and performance

Even on the lush tropical beaches of St. Maarten, Suriname, St. Kitts, and Antigua, where the sunshine sparkles on the deep turquoise waters of the Caribbean, access to real-time data is vital. While tourists lounge on white sand beaches, the managers and engineers at resorts, shopping centers, and power plants work round the clock behind the scenes to ensure a smooth experience. Operators and managers in the public institutions and private facilities at these remote destinations need to know what their processes are doing at any given time, from any location. They must be able to react quickly to changing conditions and make key decisions.

To meet this need, PowerData Limited of St. Maarten provides real-time and historical online data reporting services. They supply managers and engineers in power plants, resorts, and commercial facilities in the Caribbean islands with the data they need to monitor their power generation equipment, instrumentation, and other machinery. Recently, PowerData started using the DataHub® to give their customers a real-time data display using a standard web browser.

“Now our clients can open a web browser from wherever they are, and see exactly what is going on,” said Mr. Cameron Burn, CEO of PowerData. “The DataHub’s Java applets let us feed large quantities of data to a page at high speeds, with no refresh necessary.”

Cameron is using the DataHub’s Table applet to display multiple DataHub points. His web server provides the page and loads the DataHub Table applet. The applet then creates a direct TCP link to the DataHub, which is connected to the PowerData monitoring equipment’s OPC server. The DataHub streams the data from the PowerData equipment to the web page in real time. The processing load on the web browser is minimal, since there is no need for screen refresh, and the data is always up-to-the-second accurate.
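The data path described here (OPC server, to DataHub, to applet, to web page) boils down to a client continuously parsing a stream of named point updates. The line-based name=value format in this sketch is an illustrative assumption, not the DataHub's actual wire protocol:

```python
# Hypothetical sketch of a client receiving streamed point updates.
# The "name=value" line format is an illustrative assumption only.
def parse_update(line: str) -> tuple:
    """Parse one streamed update, e.g. 'engine1.rpm=1480.5'."""
    name, _, value = line.strip().partition("=")
    return name, float(value)

# In a real client these lines would arrive continuously over a TCP
# socket; here we simulate a short burst of updates.
stream = ["engine1.rpm=1480.5\n", "engine1.temp=88.2\n"]
points = dict(parse_update(line) for line in stream)
print(points)  # latest value per point, no page refresh needed
```

Because each update replaces the previous value for its point, the display always shows the latest state without reloading the page.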

“Remote monitoring of our engine installations has been one of the most valuable aspects of this new system,” said Mr. Jeff Close, MAN Support Services Engineer at Needsmust Electricity Power Station in St. Kitts. “This is so much easier, and much more reliable, than the manual monitoring and logging methods we were using in the past. It gives us the ability to combine all engine data for the station into one page, therefore making it easier to assess the station status.”

“We are very pleased with the convenience of obtaining our data reading automatically from the PowerData web site,” said Terrence Simmon, Power Station Operations Manager of the Sonesta Maho Beach Hotel in St. Maarten. “It has increased our reliability significantly.”

As the benefits of real-time data monitoring from a web browser become more apparent, Cameron Burn expects to see a growing demand for this use of the DataHub.

Industrial SaaS Whitepaper

We just posted a new whitepaper discussion on “What is a Good Approach to Industrial SaaS.”  Software as a Service (“SaaS”) provides access to hosted software over a network, typically the Internet, and is closely related to the concepts of smart factories, cloud computing, industrial Internet, machine-to-machine (M2M), and the Internet of Things (IoT).

“A chain is only as strong as its weakest link,” goes the old saying. How true that is in industrial control systems. … Factory automation, power generation, resource extraction and processing, transportation and logistics are all supported by chains of mechanisms, hardware, and software, as well as operators and engineers that must each carry out their mission to produce the expected output, product, or service.

The whitepaper goes into a discussion of the key qualities necessary for widespread acceptance of industrial SaaS, such as:

  1. Security: industrial systems require the highest possible level of security, and achieving it over unsecured networks involves a comprehensive approach from the design stage of the overall system.
  2. Robustness: industrial software as a service should provide performance as close to real time as the network or Internet infrastructure will support, with millisecond updates and thousands of data changes per second, and should support redundant connections with hot-swapover capability.
  3. Adaptability: industrial SaaS should be able to connect seamlessly to any new or installed system at any number of locations with no changes to hardware or software, using open data protocols and APIs, and readily scale up or down depending on user needs.
  4. Convenience: industrial SaaS should be convenient to use, from ease of demoing and sign-up to configuration, usage monitoring, and low cost.  It should offer off-the-shelf tools to get your data to and from the cloud with no programming, provide the ability to easily integrate data from multiple sources, and include options like data storage and HMI displays, all without disrupting the industrial process in any way.

Read the Whitepaper.