Who Owns the Factory?

My local Toyota dealer owns my car.  My name may appear on the ownership papers, but I know better.  The dealership tells me when I’m due for maintenance, what each thing will cost, and why it’s important to repair or replace it.  Sometimes I think they care more about my car than I do.  Of course, they get paid for this service, but it is also in their best interest to keep my car running in tip-top shape, because a satisfied customer is a repeat customer.

It wasn’t always this way.  In younger days when money was scarce and time was free, and I could do anything I put my mind to, I got a few books and set about doing my own car repairs.  After some trial and error, I was able to do normal maintenance, and even undertake a few more complicated repairs, such as changing a radiator core or rebuilding a carburetor.  But over the years cars have gotten more complex, and time has become more valuable.  Now I’m more than happy to turn the whole project over to the experts.  As far as I’m concerned, the dealership owns the car.

Who owns the project?

Seems like factories may be going in the same direction.  To get the most out of “smart” manufacturing, the IIoT, and Industrie 4.0, factory owners and operators are relying more and more on outside expertise.  System integrators are stepping in to fill the gap, and some of them are realizing that they can provide the most value to their customers by taking ownership.  Maybe not the factory itself, but the projects they implement.  The question, “Who owns the project?” really boils down to, “Who takes responsibility for it?”

Robert Lowe, co-founder and CEO of Loman Control Systems Inc., a certified member of the Control System Integrators Association (CSIA), recently suggested this idea in an Automation World blog, End-User Asset ‘Owned’ by a System Integrator. He sees a need for system integrators to take on more responsibility by supporting their clients “beyond the project.”  He proposes a new acronym, SIaaS, for System Integration as a Service.  Providing “service and support for maintenance, machine monitoring, machine performance, process performance, reporting, technology upgrades, cybersecurity and so forth” frees the end-user to “focus on making its product and not be dependent on inside resources for sustainable performance.”

Lowe goes on to explain how system integrators are in a unique position to partner with companies on a project they have completed, because they understand well how it works.  Not only did they build it, but they have more experience monitoring, maintaining, and upgrading similar systems.  Rather than finding, training, and maintaining specialized staff to keep the system running, the plant owner can keep his or her people focused on the bigger picture of getting their product out the door.  And the system integrator who owns the asset will ensure that it performs well, because a satisfied customer is a repeat customer.

Skkynet supports system integrators who want to provide their expertise as a service.  On the one hand, our technical solutions—DataHub, SkkyHub, and ETK—are all available “as a Service”. More significantly, research and experience have shown that many IoT projects run into unexpected difficulties.  Rather than expending the resources to build and maintain a secure and reliable IIoT system on their own, plant management and system integrators can hand that responsibility over to those with the expertise, and cut their costs as well.

Manufacturers and Machine Builders Weigh In on IIoT

With all the conversation swirling around about Industry 4.0 and the Industrial IoT, you sometimes have to wonder what’s actually trickling down to those people who are expected to buy in, like manufacturers and machine builders.  The bottom line is that someone is going to have to invest in the IIoT, and they expect to get a return on that investment. IIoT proponents are counting on manufacturing companies and OEMs to put some skin in the game.  But who is talking to them?

At least one person is.  Larry Asher, Director of Operations at Bachelor Controls Inc., a certified member of the Control System Integrators Association (CSIA), has been meeting with long-term customers in a number of industrial fields, and asking them for their thoughts on the IIoT. Their responses indicate an overall positive view of the potential.

Asher first reiterates a growing understanding that the IIoT is not just a new term for industrial networking, or SCADA as usual.  He says, “Though it is true that networking has existed as part of industrial control solutions for many years, traditional isolated control networks will not support the level of integration required for large-scale data and analytics, nor will they support the number of connected devices that will be a part of IIoT-based solutions. IIoT-based solutions demand connectivity, accessibility and security, making the network infrastructure critical.”

He then shares the insights garnered from his conversations, organized into four areas that the IIoT is expected to impact: data analysis, mobile/remote access, supply chain integration, and preventative maintenance.

Summary of Insights

Here is a summary of how the manufacturers and machine builders he met with view the impact of the IIoT:

Data and Analytics: Everyone agrees that investing in IIoT to enhance data collection and develop more sophisticated and powerful analytics is a good thing.  Applying this higher level of analysis is already impacting procedures and control implementation on the plant floor. Some manufacturers are even revising company organizational structures to bring in people who can maximize performance and profit using IIoT data.

Mobile/Remote Access: Access to data via mobile devices and/or from remote locations has seen less interest, but that is expected to change.  Right now adoption is fairly low, despite the significant number of products and options available, perhaps due to a perception of high cost.  But, as Asher reports, “mobility remains as a central theme and poised for rapid growth with a change in the value proposition.”

Supply Chain Integration: As to supply chain integration, there was a wide range of experience.  Some saw little or no difference between current practices and what the IIoT has to offer, while others reported that the integration is so complete that suppliers now effectively have direct access to user inventory levels.

Preventative Maintenance: Manufacturers and OEMs alike appreciate the value of IIoT-based preventative maintenance.  With machines and equipment connected directly to the vendor, manufacturers can automatically generate maintenance work orders or request spare parts.  Vendors gain a competitive advantage when they are able to monitor and remotely service their equipment 24/7, which also provides them with a source of recurring revenue.

Overall, the views of those at manufacturing plants responsible for ensuring ROI validate the practicality and cost-effectiveness of the Industrial IoT.  As word gets out, and more decision-makers understand the benefits, we expect to see increased levels of adoption.

System Integrators Defend Their IIoT Readiness

A clear sign of a growing opportunity is when people start staking their claims.  Here’s a case in point.  A recent blog in Automation World has caught the attention of system integrators, and from their comments it seems to have rubbed some of them the wrong way.  The blog, The IIoT Integrators Are Coming, by Senior Editor Stephanie Neil, claims that automation system integrators may lose out on IIoT opportunities if they don’t keep up with the technology, leaving the space open for non-industrial IoT companies from the IT world.

Several control system integrators, members of the Control System Integrators Association (CSIA), have responded saying that Neil and the people she quotes are mistaken.  They explain the differences between consumer or business IoT and Industrial IoT, and point out that it is easier for a company that knows industrial automation to add IoT to its portfolio than for an IoT company to learn industrial process control. For example, in the counter-blog We Are Ready for IIoT, Jeff Miller of Avid Solutions makes the case that his company, at least, is ready.

If nothing else, this conversation provides a useful window into what these potentially key players in the Industrial IoT space are thinking.  On the one hand, some realize that IIoT can be a valuable service to offer their customers, and are gearing up for it.  Others are holding back, questioning the value, reluctant to test the waters, and wondering whether it is mostly hype that will evaporate in a year or two.  But, according to Neil, if they wait too long, someone else will swoop in and steal their lunch.  And that person or company may be completely outside the traditional world of industrial system integration.

Who is right?

Our take on this is simple.  Both are right.  First, anyone from the IT realm working in IoT needs to know that there is a real difference between regular IoT and Industrial IoT.  An industrial user of the IoT will have special requirements, different from and in many cases far beyond what someone might need for a general business or consumer application. At the same time, system integrators must understand that the knowledge required for building an IoT application is highly specialized. It takes a deep understanding of TCP networking and of working with unstructured data, in addition to a solid grasp of the critical issue of Internet security.  Above all, we encourage system integrators to keep an open mind, and treat the IIoT as a new opportunity to better serve their customers.

As to the best approach to take, we see at least two: do it yourself, or partner with someone who provides good tools. We won’t stand in the way of the DIY’ers in the crowd, but for those who value tools, we have an easy and cost-effective way to implement the Industrial IoT that works. It does not require integrators to learn new protocols or build security models. It simply connects to in-plant systems and provides the remote data access that automation engineers expect: secure, bi-directional, and real-time, with no open firewalls, no VPNs, and no programming. And it has a revenue-share model for system integration companies that want to enjoy the financial benefits of the IIoT.

Case Study: Coca-Cola Bottler, Ireland

State-of-the-art Coca-Cola plant uses DataHub scripts to integrate alarm data and reports.

One of the largest soft drink manufacturing plants in the world, Coca-Cola’s Ballina Beverages facility, recently installed the DataHub® from Cogent Real-Time Systems (Skkynet’s subsidiary), to log alarm data and create end-of-shift reports. The 62,000 square meter plant, located in Ballina, Ireland, uses the most up-to-date manufacturing automation systems available, and management is constantly looking for ways to improve them.

Some of the equipment used at Ballina Beverages is designed and manufactured by Odenberg Engineering. Odenberg, in turn, relies on their subsidiary, Tricon Automation, to handle the process control of the machinery.

In a recent upgrade to the system, the Odenberg/Tricon team chose the DataHub to construct custom log files to track and archive their alarms. They wanted to combine the live data from each triggered alarm with a text description of the alarm, and then log the results to a file. The alarms were being generated by an Allen-Bradley system from Rockwell Automation Inc., and the 1500 alarm descriptions were stored in an Excel spreadsheet. Each row of the final log would have to combine the time, date, and code of a triggered alarm with the corresponding description of that alarm.

After considering several different scenarios, the team decided that the most effective approach was to connect the DataHub to Rockwell Automation’s RSLinx using its OPC server, and then to read in the alarm condition strings from a text file (instead of from the spreadsheet), using a DataHub script. The same script writes the data to the log file. This worked so well that they decided to use another script to create end-of-shift reports.
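The join the script performs can be sketched in ordinary Python (the DataHub’s own scripting language differs, and the file names and tab-separated format here are illustrative assumptions): each triggered alarm’s code is looked up in a description table loaded from the text file, and the combined result is appended as one row of the log.

```python
# Illustrative sketch, in Python rather than DataHub script syntax, of the
# alarm-logging logic described above: look up each triggered alarm's code
# in a description table, then append a combined row to the log file.
from datetime import datetime

def load_descriptions(path):
    """Load alarm descriptions from a text file, one 'code<TAB>text' per line."""
    table = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            code, _, desc = line.rstrip("\n").partition("\t")
            if code:
                table[code] = desc
    return table

def log_alarm(code, descriptions, log_path="alarms.log"):
    """Append one row combining time, date, alarm code, and its description."""
    now = datetime.now()
    desc = descriptions.get(code, "unknown alarm")
    with open(log_path, "a", encoding="utf-8") as log:
        log.write(f"{now:%Y-%m-%d},{now:%H:%M:%S},{code},{desc}\n")
```

In practice the description table would be loaded once at startup, and the logging function would be triggered by each alarm change event.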

“We got the basic system up and running in a few hours,” said Gus Phipps, team member from Odenberg, “which was good, because we were working under a tight deadline. Cogent helped us out with the DataHub scripting, but we were able to do most of the work ourselves. It went surprisingly quickly.”

“Using the DataHub’s scripting language let us customize it to exactly meet our needs,” said George Black, Tricon’s project manager. “It is very flexible, and yet completely robust. It is months now since the project was completed, and the DataHub continues working away merrily every day, just doing its job. We plan to use it again in other projects very soon.”

Case Study: Wind Turbine Farm, USA

DataHub Scripting solution calms the conflict of bats vs. blades

Required by law to protect a rare species of bat, a major wind power generation company finds a solution using the Cogent DataHub®.

A rapid expansion of wind farms across the Eastern and Central United States has been checked in the past couple of years due to growing concerns for wildlife. An endangered bat species lives in that area, and is protected by law. Fears that the whirring blades of wind turbines could be harmful to this species of bat were sufficient to halt construction of a wind farm in West Virginia in 2009, and the discovery of a dead bat near a wind turbine in Pennsylvania in 2011 caused the power company to shut down the whole 35-turbine system for several weeks.

Although wind turbines are known to cause a few fatalities among common tree-dwelling bats, the endangered bat was thought to be largely safe, as it lives in caves, hibernates for more than half the year, and is seldom found in the vicinity of wind turbines. However, in the fall these bats migrate from their feeding grounds to their home caves for the winter. During this time, the chances of them passing through a wind farm are greatly increased.

In March a few years ago, a major power company in the USA was informed by the US Fish & Wildlife Service that a number of turbines on the bat migration routes would need to be shut down while the bats are migrating. This caused quite a stir. The migration period for the bats is two months long, from mid-August to mid-October. Shutting down the whole system for that length of time would be very costly, not to mention the loss of clean energy which would need to be replaced by fossil fuels.

To maximize uptime, the company gained permission to let the turbines run during the times that the bats were not flying: all daylight hours, and at night when air temperatures drop below a specific temperature setpoint or when the wind is fairly strong. The challenge was to implement a complete solution. A single bat fatality could mean full shut-down, legal penalties, and even lawsuits.

Top management at the company immediately took action, contacting the wind turbine manufacturer, who also provides the control systems. After several months of emails and meetings, it became apparent that the manufacturer would not have anything ready in time for the mid-August deadline.

“With three weeks to go, they told us there was no solution in sight,” said the SCADA engineer responsible for the project, “and we would need to go to manual operation, and reconfigure the cut-in speed on every turbine, twice a day.”

Most wind turbines are designed to “cut in”, or start turning to produce energy, when the wind is blowing at a certain speed. For these turbines, the normal cut-in speed is 3.5 meters per second. As the bats are active in low to moderate wind speeds, the company would need to raise that to 7 meters per second each night, and then drop it back down to 3.5 the following morning. This would mean manually reconfiguring the PLCs for 100 turbines, twice a day.

A better way

“I thought there must be a better way,” the project manager continued. “We’d been using the DataHub for years, and knew the potential was there to leverage this asset further. I gave Cogent a call, and told them what we were up against. They delivered by helping us to develop a very efficient program using the native scripting language of the DataHub. The code ran right on the SCADA interface of the OEM system – so it’s as reliable as you can get.”

“Working together with Cogent, we came up with a DataHub script that doesn’t change the cut-in speed of the turbines at all. We just blocked them from starting. The script tells each turbine to stay off, and keeps measuring wind speed. When it picks up to 7 meters per second, the script releases the turbine to start, and it ramps right up to the operating state. At the end of the day, we have a complete audit trail of every turbine controlled, including a history of critical parameters, such as rotational and wind speeds, and energy curtailed.”

“The script also has a temperature component. On cool nights in September and October, when the temperature drops below the dew point, it uses the same algorithm for starting and stopping the wind turbines.”
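The decision rule described in these quotes can be summarized in a short sketch, written here in Python rather than the DataHub’s native scripting language; the function name, parameter handling, and setpoint values are illustrative assumptions, not the actual script:

```python
# A minimal sketch of the curtailment rule described above (assumed names
# and values, not the actual DataHub script). Turbines are held off only
# at night, and only when conditions suggest the bats may be flying:
# air warm enough (above the setpoint) and wind light enough (below 7 m/s).
RELEASE_WIND_SPEED = 7.0  # m/s; bats avoid flying in stronger winds

def should_curtail(is_night, air_temp, wind_speed, temp_setpoint):
    """Return True when a turbine should be blocked from starting."""
    return (is_night
            and air_temp > temp_setpoint
            and wind_speed < RELEASE_WIND_SPEED)
```

A script along these lines would evaluate the rule for each turbine on every scan, blocking the start command while it returns True and releasing the turbine, which then ramps up to its operating state, as soon as it returns False.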

By the first week of August a test script was written, and after a few days of testing and last-minute tweaks, it was ready. The system went live on August 15th, and is meeting all expectations. Every night, whenever the air temperature is above the setpoint and the wind speed falls below 7 meters per second, the wind turbines stop, allowing the endangered bats to return safely to their caves for a long winter hibernation.

“I call the DataHub the Canadian Swiss Army Knife,” said the project manager. “We are able to accomplish a host of required functions with a single product solution. The ability to provide sophisticated logic and control algorithms with the built-in functionality of this product is the game changer. Being able to securely deliver real-time data between a site and the control center system allows the dispatch team to monitor the control process and maximize the production of clean, renewable, energy sources. Talk about a smart grid – who would have thought we’d be doing this type of thing in real time?”

Case Study: University of California, Berkeley, USA

DataHub is used to integrate data for distributed control of unmanned aerial vehicles

For the past several years, students and faculty at the Vehicle Dynamics Lab (VDL) of the University of California, Berkeley, have been developing a system of coordinated distributed control, communications, and vision-based control among a group of several unmanned aircraft. A single user can control the fleet of aircraft, and command it to carry out complex missions such as patrolling a border, following a highway, or visiting a specified location. Each airplane carries a video camera and an on-board computer, and communicates with the ground station and the other aircraft in the formation. The control algorithms are so sophisticated that the fleet can carry out certain missions completely autonomously—without any operator intervention.

The control system for each aircraft runs on a PC 104 computer with a QNX6 operating system. Control is divided into three kinds of processes: communication, image processing, and task control. All of these processes interact through the DataHub running in QNX. The DataHub® is a memory-resident, real-time database that allows multiple processes to share data on a publish-subscribe basis. For this application, each process writes its data to the DataHub, and subscribes to the data of each other process on a read-only basis. In this way, each process gains access to the data it needs from the other processes, while avoiding problems associated with multi-processing data management.
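The data-sharing pattern described here, where each process writes only the points it owns and subscribes read-only to everyone else’s, can be illustrated with a toy sketch in Python. The class and method names below are assumptions for illustration, not the DataHub API:

```python
# A toy publish-subscribe point store illustrating the pattern described
# above (illustrative only, not the DataHub API): single-writer ownership
# per point, with read-only change notifications for subscribers.
from collections import defaultdict

class PointStore:
    def __init__(self):
        self._values = {}
        self._owners = {}
        self._subscribers = defaultdict(list)

    def write(self, owner, name, value):
        # The first process to write a point becomes its owner; any other
        # process attempting to write it is rejected. This avoids the usual
        # multi-writer conflicts of shared multi-process data.
        self._owners.setdefault(name, owner)
        if self._owners[name] != owner:
            raise PermissionError(f"{owner} does not own point {name!r}")
        self._values[name] = value
        for callback in self._subscribers[name]:
            callback(name, value)

    def subscribe(self, name, callback):
        """Read-only subscription: callback fires on every change to the point."""
        self._subscribers[name].append(callback)

    def read(self, name):
        return self._values.get(name)
```

In this sketch, a process such as the one handling aircraft control would write its own points, while the image-processing and task-control processes would subscribe to them and receive each update as it happens.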

For example, the communication software comprises three separate processes: The Piccolo process controls the aircraft, the Payload process communicates with users on the ground, and the Orinoco process handles communications with the other aircraft. Needless to say, each of these three programs needs information from the other two, as well as from the video and task control packages. All of this data is transferred seamlessly through the DataHub.

“The DataHub has contributed a great deal to our software integration,” said Brandon Basso, one of the VDL team members. “Its ability to restrict write privileges to each shared variable of the owner processes avoids many of the difficulties associated with multi-process management.”

For task control, there are two primary software packages: Waypoint controls visits to specified locations, while Orbit handles the orbiting “patrol” of a group of locations. These processes are monitored by a third, supervisory process called Switchboard. Beyond this coordination, the aircraft must also decide among themselves which plane will take on which task. The complex calculations needed for this decentralized task allocation are mediated through the DataHub.

Waypoint and Orbit use input from the vision control and the vision process. Prior to takeoff, algorithms are applied to previously recorded video to create a visual profile of the area, which is maintained by the vision control. In the air, this data must be compared to what the plane is currently flying over. A camera on the wing of the plane feeds data to the vision process, which analyzes the content and generates meaningful information about objects on the ground, such as waypoints on a river or road. This live content, along with the stored visual profile in the vision control, is fed through the DataHub to Waypoint and Orbit.

According to the paper A Modular Software Infrastructure for Distributed Control of Collaborating UAVs, published by the University of California, Berkeley, which describes the project in detail, this work marks “a major milestone in UAV cooperation: decentralized task allocation for a dynamically changing mission, via onboard computation and direct aircraft-to-aircraft communication.” Skkynet is pleased that the DataHub has played an important role in the success of this endeavour.