Ransomware Attacks – Choosing an Easy Way Out?

What would you do—right now—if your computer screen locked up and a message appeared, “Your files and data have been encrypted with a strong military algorithm. You have 3 days to pay for our decoder to get your data back.” What if it wasn’t your personal computer at all, but a company computer? What if you owned the company?

In a recent BBC video, reporter Joe Tidy describes the bold response that Norsk Hydro of Norway made to that kind of ransomware attack. Rather than succumbing to the hackers’ demands, the company’s 35,000 employees switched over to paper-based operations for days and weeks until the computers could come back online. Salespeople had to work on the factory floor and finance staff made sandwiches, but production in the 170 plants worldwide continued almost unabated.

“I think in general it’s a very bad idea to pay,” Jo De Vliegher, a company spokesperson, told the BBC. “It fuels an industry. It’s probably financing other sorts of crimes.”

Much as we may admire Norsk Hydro’s strong response, the attack and its after-effects cost the company over $50 million. With the stakes that high, it is small wonder that ransomware attacks on businesses have increased by 500% in the past year, according to some sources, and that ransom demands can run to seven figures.

Pay or Perish?

Unfortunately, these circumstances leave some companies with little choice: it’s pay or perish. A survey conducted by Small Business Trends shows that 55% of all SMBs (small and medium-sized businesses) would pay the ransom. It is hard to blame them when another recent report shows that 60% of small companies that sustain a cyber attack go out of business within six months. On the other hand, experts point out that paying the ransom may not solve the problem, since the attacker still may not release the data, or may release part of it and demand more money for the rest.

A Better Solution

Of course, a better solution is to secure your system against ransomware attacks. For a company’s IT department, all of the standard security guidelines apply, along with maintaining backups of any data needed to run the company. OT (Operational Technology) systems, which are increasingly being accessed from outside, require special attention. Threats like ransomware that may have seemed irrelevant to an air-gapped system years ago take center stage when OT gets connected to IT. Even with a VPN, any virus that can propagate within IT can make its way into OT.

Strong, closed firewalls are essential, and DMZs can be very useful. In this environment, Skkynet’s secure-by-design software and services allow companies to access their production data without compromising on security. The easiest and most cost-effective way to deal with a ransomware attack is not to wait until one has occurred, but to prevent it from happening in the first place.
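To make that idea a little more concrete, here is a minimal Python sketch of the outbound-only connection pattern that this kind of secure-by-design approach relies on. The relay host, port, and data points are hypothetical placeholders, and the sketch illustrates the principle rather than Skkynet’s actual implementation: the plant-side process dials out to a relay in the DMZ over TLS, so no inbound firewall port into the OT network ever needs to be opened.

    # Outbound-only data push: the OT-side process never listens for connections.
    # Host, port, and data values below are hypothetical placeholders.
    import json
    import socket
    import ssl
    import time

    DMZ_RELAY_HOST = "relay.dmz.example.com"   # hypothetical relay in the DMZ
    DMZ_RELAY_PORT = 8883                      # hypothetical TLS port

    def read_plant_values():
        """Placeholder for real data collection (e.g. from a PLC or OPC server)."""
        return {"line1.temperature": 72.4, "line1.pressure": 1.8, "ts": time.time()}

    def push_values_outbound():
        context = ssl.create_default_context()   # verifies the relay's certificate
        with socket.create_connection((DMZ_RELAY_HOST, DMZ_RELAY_PORT), timeout=10) as raw:
            with context.wrap_socket(raw, server_hostname=DMZ_RELAY_HOST) as tls:
                payload = json.dumps(read_plant_values()).encode("utf-8")
                tls.sendall(payload + b"\n")

    if __name__ == "__main__":
        push_values_outbound()

Because the connection is initiated from inside, the DMZ side only ever sees an outgoing TLS session, and the OT firewall can remain closed to all inbound traffic.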

Academic Achievements

Around this time of year, as the warm spring breezes blow through open classroom windows, students and faculty alike in colleges and universities around the world look forward to graduation and summer holidays. This time of anticipation is also a time to look back, and review the accomplishments of the past academic year.

Now is a good time to recognize some of the year’s outstanding achievements of Dr. Pascal Vrignat and groups of his students at the Polytechnic School of the University of Orleans in Chateauroux, France. Using Skkynet software, they have conducted several sophisticated research projects and studies related to industrial data communication and the Internet of Things.

Back in September last year, Dr. Vrignat was the keynote speaker at ECAR2018, the International Conference on Electrical, Control, Automation and Robotics in Xiamen, China. This annual conference invites professors, doctoral students, and other scientists to present their latest research findings. Dr. Vrignat’s presentation, OPC UA: Examples of Digital Reporting Applications for Current Industrial Processes, showed how an OPC UA data feed connected to the Cogent DataHub can be shared among SCADA reporting tools and Excel spreadsheets, can populate emails and SMS messages, and can power MATLAB OPC Toolbox diagnostics and analysis.
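For readers curious about what such an OPC UA data feed looks like at the code level, here is a minimal sketch using the open-source python-opcua client library. The endpoint URL and node ID are hypothetical, and this is not necessarily the toolchain used in the coursework; it simply illustrates reading one OPC UA value and appending it to a CSV file that a spreadsheet or reporting tool can pick up.

    # Read one value from an OPC UA server and append it to a CSV report file.
    # Endpoint and node ID are hypothetical placeholders.
    import csv
    import datetime

    from opcua import Client  # python-opcua package (assumed installed)

    ENDPOINT = "opc.tcp://192.168.1.10:4840"   # hypothetical OPC UA endpoint
    NODE_ID = "ns=2;s=Machine1.Temperature"    # hypothetical node identifier

    client = Client(ENDPOINT)
    client.connect()
    try:
        value = client.get_node(NODE_ID).get_value()
        with open("report.csv", "a", newline="") as f:
            csv.writer(f).writerow([datetime.datetime.now().isoformat(), NODE_ID, value])
    finally:
        client.disconnect()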

In December, Dr. Vrignat made a similar presentation, titled Examples of technological building blocks in the context of an application, at the Journee Pedagogique GDR SoC2 Club EEA in Paris. At that event he also presented two student projects. In the first, seven teams of Dr. Vrignat’s students competed in the contest The Industry of the Future, Internet of Things, is Now! Their entry was a web-based control system for a plastic forming machine, connected to the Cogent DataHub for integration with Excel, emails, and SMS messages. The second project was an entry in the international challenge Xplore New Automation 2018, where Dr. Vrignat’s students were finalists in the Environment category for an IoT project that remotely controlled the deployment of irrigation tubing by cell phone.

This past January, some student teams and Dr. Vrignat received awards in the category of Pedagogical Innovation at the PEPS Soumission 2019 – Passion for Teaching and Pedagogy in Higher Education. “Offering this challenge for the 1st time in this format has been a complete success in several ways: for the students, for external professionals, in the results achieved, and in student motivation,” said Dr. Vrignat. “We have seen a significant increase in individual and collective skills, and we have shown that ‘project’ and ‘active’ pedagogies can be a very good strategy in teaching.”

Skkynet congratulates Dr. Vrignat on the work he has done, the academic awards earned, and most important, on the valuable contributions he is making to the lives and future careers of his students. We are pleased that he has chosen the Cogent DataHub as a basis for secure, real-time data communications in his projects, and we look forward to supporting his work in the years to come.

The Benefits of Harnessing Live Data

The data is pouring in. The flow started as a mere trickle of hand-written records on clipboards in the early days of mechanical and pneumatic automation. It grew to a steady stream with the introduction of PLCs (programmable logic controllers) and SCADA (Supervisory Control and Data Acquisition) systems pooling data automatically. Now, with the advent of IoT and digital transformation, live data is gushing through industrial systems in a mighty torrent.

As with the flow of water, this flow of live data has power. Harnessing it can mean more efficient operations, savings in labor and material costs, and overall improvements in quality. What’s needed is software to facilitate the collection and analysis of the data, and the distribution of the results, in real time.

This is what a recent survey of 500 mid-level manufacturing professionals suggests. The Plutoshift report, The Challenge of Turning Data Into Action, says over three quarters of its respondents agreed that “in order to take immediate action based on collected data, they need software solutions that analyze data in real-time.”

Problem: Manual data entry

Summing up the report’s findings: despite the well-known benefits of digital transformation, the adoption rate has been low. Only 12% of those surveyed have configured their systems to respond automatically to incoming data. The common feeling is that data inputs are not reliable enough for automated response. About half of the respondents are still using manual data entry. This in itself can introduce errors, and perhaps worse, the data goes stale almost immediately, and stays stale until the next manual entry is made. The staler the data gets, the more likely it is to be incorrect. And an automated response to stale data could be catastrophic.

For example, a machine may only be checked by an operator once per day, on a plant floor walk-through. If it develops an irregular vibration, it could be hours before anyone notices. An automated system relying on manual data input might keep it running, possibly damaging the equipment. On the other hand, an inexpensive IoT sensor on the machine could send a notification as soon as a problem is detected, and trigger an alarm or an automatic speed adjustment until an operator can take remedial action.
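As a rough illustration of that kind of automated response, here is a minimal Python sketch of a threshold check on a streaming vibration reading. The sensor read, alarm limit, and speed-adjustment call are hypothetical placeholders; the point is simply that each reading is evaluated as it arrives, rather than once a day.

    # Evaluate each vibration reading as it arrives and react immediately.
    # Sensor read, limit, and actions below are hypothetical placeholders.
    import time

    VIBRATION_LIMIT_MM_S = 7.1   # hypothetical alarm limit (mm/s RMS)

    def read_vibration_mm_s():
        """Placeholder for a real sensor read (e.g. over OPC UA or MQTT)."""
        return 3.2

    def raise_alarm(value):
        print(f"ALARM: vibration {value:.1f} mm/s exceeds {VIBRATION_LIMIT_MM_S} mm/s")

    def reduce_speed():
        print("Requesting automatic speed reduction until an operator responds")

    while True:
        reading = read_vibration_mm_s()
        if reading > VIBRATION_LIMIT_MM_S:
            raise_alarm(reading)
            reduce_speed()
        time.sleep(1)   # check once per second instead of once per day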

Once the data is streaming in, there are many companies like Plutoshift that can help manage it. Skkynet’s focus is the data stream itself: ensuring it is secure, reliable, and up to date, to the millisecond. This allows those who use the data to take full advantage of automated response mechanisms, and to actively participate in digital transformation. Like the human nervous system relaying data from the outside world, effective digital transformation depends on harnessing live data. After all, you can only know as much about your world, or your system, as the data tells you.

Redefining Middleware

Harry Forbes, lead analyst for the Distributed Control System (DCS) market at ARC Advisory Group, makes an insightful observation: Skkynet redefines middleware. In a recent paper, Middleware’s Changing Role – to Serve Industrial IoT, Forbes draws on 30 years of hands-on experience in industrial automation and control to examine Skkynet’s software and services and show how they fit together to provide middleware for the Industrial IoT.

In a way, you could say that Skkynet got its start in middleware. The first large-scale implementation of the DataHub, installed in a chocolate-making plant in Toronto, provided a fast and reliable connection between a Wonderware application running on Windows and supervisory control software running on QNX. According to Wikipedia, the term “middleware” is “most commonly used for software that enables communication and management of data in distributed applications.”

Among other things, this is what the DataHub, the ETK, and SkkyHub do.  They enable communication and data management between applications.  What’s new is that they expand this middleware functionality across industrial networks and the Internet, in the Industrial IoT. As Forbes explains, “[T]he Industrial IoT requires applications and services that span across enterprises, as well as reaching assets deployed in field locations, many of them quite remote.”

It seems that we are, in a sense, redefining middleware.  Maybe that’s what we’ve been doing all along, but just never saw it that way.  “Middleware” was originally defined as software that intermediated between the operating system and software applications.  In the 1980s that definition was expanded to include linking between older and newer applications.  With the advent of networking, distributed systems and client/server architectures, middleware is now commonly referred to as simply the “glue” that holds things together.

The DataHub architecture combines several features that support the traditional understanding of middleware:

  • It does protocol conversion, allowing software using one protocol to connect to software using a different protocol.
  • It supports TCP communication, allowing connected applications to communicate over a network or the Internet.
  • It has an API that allows custom connections from various software packages.

In addition, the DataHub offers features that may or may not be considered traditional middleware:

  • It aggregates the data from all connected clients into a single, universal data set, to which any client can subscribe (see the sketch after this list).
  • Its built-in scripting language allows for data to be manipulated or transformed as it flows through the system.
  • It provides an HMI that supports visualization of the data.
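To illustrate that aggregation idea, here is a minimal Python sketch of a toy hub that collects named data points from any number of clients into one shared data set and notifies subscribers of every change. It is a sketch of the middleware pattern only, not the DataHub’s actual API or implementation; all names and values are hypothetical.

    # A toy "hub": clients write named points into one shared data set,
    # and every subscriber is notified of each change. Illustration only.
    from typing import Any, Callable, Dict, List

    class MiniHub:
        def __init__(self) -> None:
            self._points: Dict[str, Any] = {}                         # aggregated data set
            self._subscribers: List[Callable[[str, Any], None]] = []

        def write(self, name: str, value: Any) -> None:
            """A client (e.g. a protocol connector) updates one point."""
            self._points[name] = value
            for callback in self._subscribers:
                callback(name, value)                                 # push the change

        def subscribe(self, callback: Callable[[str, Any], None]) -> None:
            self._subscribers.append(callback)

        def snapshot(self) -> Dict[str, Any]:
            return dict(self._points)

    hub = MiniHub()
    hub.subscribe(lambda name, value: print(f"update: {name} = {value}"))
    hub.write("plant1.line2.temperature", 71.8)   # hypothetical point names
    hub.write("plant1.line2.flow", 12.4)
    print(hub.snapshot())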

The combination of all of these features, and possibly others not mentioned here, makes DataHub technology an ideal middleware solution for the Industrial IoT. The ETK provides a basic subset of this functionality at the embedded device level, and SkkyHub offers a different subset, along with additional functionality in the cloud. As Forbes describes the Skkynet software and services, “These three components can be assembled quickly and in different configurations to fit various Industrial IoT application requirements,” providing the security, scalability, and performance that are critical for any industrial application.

Data Sharing Needed for Sustainable Energy

Sustainable energy can be profitable. That, in a nutshell, is the finding of a GreenBiz Research survey presented in the 2019 Corporate Energy & Sustainability Progress Report from Schneider Electric. And an important key to those profits is sharing data.

“Companies agree that sharing data is important, with those that share the most seeing significant benefit,” the report said. This importance of data sharing stands out in the context of the overall report findings, which are broken up into 5 main topics:

  • Funding: Executives that demonstrate ROI (return on investment) and provide strong leadership can overcome perceived obstacles, such as insufficient capital.
  • Data: The challenge is to ensure the quality of collected data, and to share it effectively.
  • Goals: Setting public targets or goals for energy conservation and sustainability drives motivation and success.
  • Energy: Strategic sourcing optimizes usage, yielding significant cost savings in a volatile energy landscape.
  • Technology: Energy efficiency and renewables, based on data-driven technologies, are a leading source of ROI.

Ultimately, for a sustainable energy project to succeed, it must provide a solid return on investment. This report affirms the experience of our customers in wind and solar that the better the quality of their data, and the more they are able to share it, the higher their ROI.

For example, a wind farm doesn’t operate in isolation. In addition to the electrical power it sends to the grid, each wind turbine also sends data on its rotor speed, operating state, power output, and more to control engineers and automated systems that optimize performance. This data can also be integrated with other data arriving in real time. Weather and climate conditions can be introduced, along with real-time market pricing, to generate live, real-time cost/benefit analyses.
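As a simple illustration of what one such live figure might look like, here is a minimal Python sketch that combines a turbine’s real-time power output with a real-time market price to estimate revenue per hour. All of the input values are hypothetical placeholders.

    # Combine two live inputs (power output and market price) into one live metric.
    # The numbers below are hypothetical placeholders for real-time feeds.
    def live_revenue_per_hour(power_output_kw: float, price_per_mwh: float) -> float:
        return (power_output_kw / 1000.0) * price_per_mwh   # kW -> MW, then MW x $/MWh

    turbine_output_kw = 2300.0      # hypothetical real-time reading from one turbine
    market_price_per_mwh = 42.50    # hypothetical real-time market price

    print(f"Estimated revenue: ${live_revenue_per_hour(turbine_output_kw, market_price_per_mwh):.2f}/hour")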

Seeking ways to share data

Sharing data like this takes both cooperation and technology. The various players involved have to agree on what to share and how. Reviewing last year’s survey, the report noted that “respondents indicated that 80% of their companies had energy and sustainability data collection projects underway.” And this year “the research finds that more companies are now seeking the most efficient ways to share the data that has been collected.”

We are pleased to see this growing level of awareness of the need for data sharing. At the same time, we actively encourage executives, managers and engineers who are looking for more efficiency in their data sharing practices to consider our approach. It could be just what they need to boost the ROI of their sustainable energy projects.

Embarking on the Journey of Digital Transformation

A Skkynet team attended the 23rd Annual ARC Industry Forum last week in Orlando, Florida, themed “Driving Digital Transformation in Industry and Cities”. They came back with a vision of how the digital transformation journey is shaping up—for those who are in the driver’s seat.

The main takeaway was that among this group of C-level executives, VPs, directors, and managers from some of the top manufacturing and process industries worldwide, everyone is on the journey. Some are just starting out, others are well underway, and still others have already passed significant milestones. “People are realizing that this is something they have to do,” said Michael Quartarone, Skkynet’s Director of Channel Sales. “Everyone was interested in two things: What does the future state look like? And what can we learn from others that will help us on the journey?”

Expert guidance

Guiding them on this journey were digital transformation experts from major corporations like AVEVA, Ford, Schneider Electric, BP, Dow, and GE. In a series of keynotes on the first day, these seasoned veterans of IoT implementation shared their experiences of how they turned a lofty vision into concrete action. They told their stories of how they got started, where they got stuck, who helped them, what resources they tapped into, and what business cases validate their efforts.

David Kramer of Ford gave a particularly compelling description of how the company is taking a long view of digital transformation, the steps it is taking, and how it looks for quick wins as it executes on its vision. Steve Beamer of BP shared the challenges of ensuring that digital transformation efforts deliver key process improvements in areas where they can have an impact, such as worker safety.

Forum Focus

The bulk of the conference consisted of forums that covered topics like IoT data communications, security, edge processing, OT-IT convergence, and more. The Skkynet team remarked on how receptive the industry leaders they met were to our technology, and how well they grasped the value of what we are doing. “There is a noticeable change in perception of IoT and its value,” said Xavier Mesrobian, Skkynet’s VP of Sales and Marketing. “People now understand more clearly the challenges of providing secure, remote access to OT systems in real time, and they appreciate what we offer.”

“Some big vendors are taking this journey,” said Quartarone. “Microsoft, Intel, SAP, GE, and other players are actively engaged in the IoT space, delivering innovation and solutions. They all had booths at the conference, demonstrating their level of commitment. We had some very fruitful and enlightening conversations.”

The forum was hosted by ARC Advisory Group, which offers advisory services, marketing analytics, and technology selection services at the corporate level for the manufacturing and process industries. Skkynet has been working closely with ARC for years now, and will continue to build our relationship, sharing our perspective and expertise in this journey of IoT and digital transformation.