Case Study: BP Pipelines, USA

Integrating legacy and new systems

BP Pipelines operates one of the largest networks of pipelines in the United States, transporting over 450 million barrel-miles of petrochemicals per day. Their control center in Tulsa, Oklahoma is responsible for the transport of oil and natural gas from South-Central and Midwest oil fields to locations nationwide.

Recently the management at the Tulsa control center decided to add leak detection to their SCADA system. The SCADA system, by Telvent, gathers data from production systems and stores it in a Sybase database that has been modified for real-time applications. The challenge was to feed process data from the Sybase database to the leak detection system, which had an OPC server available.

“We tried for months to find an OPC server that would communicate via ODBC to the real-time Sybase product,” said Chuck Amsler, Team Leader for SCADA Applications at BP Pipelines. “It was an old version of ODBC, and we just couldn’t get at the data. None of the applications we tried could do it.”

Finally Chuck called Cogent to see if there was some way the Cogent DataHub® could be used to make the connection. After a few hours of consulting with Cogent’s technical staff, he had a DataHub script that connects to the Telvent system and queries the Sybase database. With the process data reaching the OPC DataHub, it was just a matter of bridging the data to the leak detection system’s OPC server. Now the data flows from Telvent to the leak detection system reliably and consistently.

“Once we saw how easy it was for the DataHub to make the connection,” said Chuck, “we decided to use it to log the results.” With Cogent’s help he wrote another script to transfer the leak detection calculations back to an Oracle database for eventual re-use by the SCADA system.

The DataHub scripts allow a large degree of flexibility for customization. On the Sybase side there are actually two servers running: one hot, the other a backup. The system can switch from hot to backup at any time. For every query, the script tests which server is hot, and always reads from the correct server.
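The hot/backup selection logic can be sketched as follows. This is a minimal Python illustration of the approach, not the actual DataHub script (which uses the DataHub's own scripting language); the status query and table names are hypothetical.

```python
# Sketch of hot/backup server selection before each query.
# The "server_state" status query is a hypothetical stand-in for
# however the real system reports which server is currently hot.

def is_hot(conn):
    """Ask a server whether it is currently the hot (active) one."""
    cur = conn.cursor()
    cur.execute("SELECT status FROM server_state")  # hypothetical query
    return cur.fetchone()[0] == "hot"

def query_hot_server(connections, sql):
    """Run a query against whichever server reports itself as hot."""
    for conn in connections:
        if is_hot(conn):
            cur = conn.cursor()
            cur.execute(sql)
            return cur.fetchall()
    raise RuntimeError("No hot server available")
```

Testing each connection immediately before the query, rather than caching the result, is what lets the system tolerate a hot/backup switchover at any time.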

On the Oracle side, dynamic scripting allows members of Chuck’s team to modify the logging process even while the system is running. They can add, delete, or change data points that qualify the basic pipeline data, without breaking the connection or interfering with the logging.

“We are very impressed with the overall quality of the DataHub” said Chuck, “and with the level of support from Cogent. We look forward to working with them as we move from data gathering to the next stages of the project.”

Case Study: Coca-Cola Bottler, Ireland

State-of-the-art Coca-Cola plant uses DataHub scripts to integrate alarm data and reports.

One of the largest soft drink manufacturing plants in the world, Coca-Cola’s Ballina Beverages facility, recently installed the DataHub® from Cogent Real-Time Systems (Skkynet’s subsidiary) to log alarm data and create end-of-shift reports. The 62,000 square meter plant, located in Ballina, Ireland, uses the most up-to-date manufacturing automation systems available, and management is constantly looking for ways to improve them.

Some of the equipment used at Ballina Beverages is designed and manufactured by Odenberg Engineering. Odenberg, in turn, relies on its subsidiary, Tricon Automation, to handle the process control of the machinery.

In a recent upgrade to the system, the Odenberg/Tricon team chose the DataHub to construct custom log files to track and archive their alarms. They wanted to combine the live data from each triggered alarm with a text description of the alarm, and then log the results to a file. The alarms were being generated by an Allen-Bradley system from Rockwell Automation Inc., and the 1,500 alarm descriptions were stored in an Excel spreadsheet. Each row of the final log would have to combine the time, date, and code of a triggered alarm with the corresponding description of that alarm.

After considering several different scenarios, the team found the most effective approach was to connect the DataHub to Rockwell Automation’s RSLinx using its OPC server, and then to read in the alarm condition strings from a text file (instead of from the spreadsheet) using a DataHub script. The same script writes the data to the log file. This worked so well that they decided to use another script to create end-of-shift reports.
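The merge step described above can be sketched as follows. This is a minimal Python illustration of the idea (the actual work was done with a DataHub script); the description-file format and field names are assumptions for the sake of the example.

```python
# Sketch: combine a triggered alarm's time, date, and code with its
# text description, loaded once from a file into a lookup table.
import csv
from datetime import datetime

def load_alarm_descriptions(path):
    """Read 'code,description' lines into a dict.
    (The file format is a hypothetical stand-in for the ~1,500-entry list.)"""
    descriptions = {}
    with open(path, newline="") as f:
        for code, text in csv.reader(f):
            descriptions[code.strip()] = text.strip()
    return descriptions

def format_log_row(code, descriptions, when=None):
    """Build one log row: date, time, alarm code, description."""
    when = when or datetime.now()
    text = descriptions.get(code, "UNKNOWN ALARM")
    return f"{when:%Y-%m-%d},{when:%H:%M:%S},{code},{text}"
```

Loading the descriptions into a lookup table once, rather than re-reading the file per alarm, keeps the per-event logging cost to a single dictionary lookup.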

“We got the basic system up and running in a few hours,” said Gus Phipps, team member from Odenberg, “which was good, because we were working under a tight deadline. Cogent helped us out with the DataHub scripting, but we were able to do most of the work ourselves. It went surprisingly quickly.”

“Using the DataHub’s scripting language let us customize it to exactly meet our needs,” said George Black, Tricon’s project manager. “It is very flexible, and yet completely robust. It is months now since the project was completed, and the DataHub continues working away merrily every day, just doing its job. We plan to use it again in other projects very soon.”

Fitting In with Industrial IoT

“It all sounds fine on paper, but will it work for me?” That’s a question that engineers and system integrators often ask when the topic of Industrial IoT comes up. There are so many ways it has to fit. Industrial systems are like snowflakes: every one is unique. Each facility, factory, pipeline, or power plant was built for a particular purpose, in a different part of the world, at a specific time in plant automation history, when technology had advanced to a certain level. We see a wide range of machines, tools, sensors, and other equipment used with endless combinations of proprietary and home-grown software and data protocols. Over time, plant modifications and expansions, along with hardware and software upgrades, bring still more variety.

If this diversity isn’t challenge enough, new questions are now popping up about the Industrial IoT itself: How to get started? What service provider to use? What approach or platform is best to take? What are the cost benefits?

Putting all this together, it becomes clear that a good Industrial IoT solution should be a comfortable fit. It should connect to virtually any in-plant system with a minimum of fuss, and provide links to remote systems as well. It should be compatible with multiple data protocols and legacy systems, and yet also integrate seamlessly with future hardware and software. Like putting on a new suit, the ideal is to ease into the IoT without disrupting anything.

Working towards that goal, here’s what a good system should do:

  • Support diverse data communication protocols: OPC, both OPC “Classic” and UA, plays an important role in simplifying and unifying industrial data communications. Any Industrial IoT platform should support OPC, along with common industrial fieldbuses like Modbus, Profibus, HART, DeviceNet, and so on. It should also support more specialized standards like IEC 61850, CAN, ZigBee, and BACnet. In addition to these, Industrial IoT should be compatible with non-industrial standards like HTML and XML for web connectivity, ODBC for database connectivity, DDE for connecting to Excel if needed, as well as the ability to connect to custom programs.
  • Connect to embedded devices: The “of Things” part of the Internet of Things refers primarily to embedded devices. Sensors, actuators, and other devices are getting smaller, cheaper, and more versatile every day. They should be able to connect, either directly or via a wired or cellular gateway, to the cloud. This is an area where SCADA can provide a wealth of experience to the Industrial IoT, and in turn benefit significantly from the expanded reach that Internet connectivity can provide.
  • Work with new or legacy equipment and facilities: Since the introduction of the DCS and PLC in the 1970s, digital automation has been growing and evolving. While new technologies are constantly being adopted or adapted, many older systems continue to run. With so much engineering, effort, and capital invested in each project, plant management is often reluctant to make changes to a working system. To be accepted in the “If it ain’t broke, don’t fix it” world, an Industrial IoT system should be able to connect to, but not intrude upon, legacy systems. Of course, for new systems, it should do likewise.
  • Use existing tools, or better: The Industrial IoT doesn’t need to reinvent the wheel. Most industrial automation systems have a solid, working set of tools, which might include DCS and SCADA systems; HMIs; MES, ERP and other kinds of databases; data historians, and more. A compatible Industrial IoT implementation should work as seamlessly as possible with all of these tools, using the appropriate protocols. At the same time, it would do well to offer connections to improved tools, if required or desired.
  • Meet Big Data requirements: Among the new tools, the ability to connect existing or future industrial systems with Big Data is one of the main attractions of the Industrial IoT. A compatible Industrial IoT solution should provide connectivity and the performance necessary to feed whatever Big Data engine may be chosen.
  • Allow for gradual implementation: Automation experts and proponents of the Industrial IoT are quick to point out that there is no need to implement this all at once. They often recommend a gradual, step-by-step implementation process. Start with a small data set, an isolated process or system, and build from there. Bring in users as needed. Once you are comfortable with the tools and techniques, you can build out. Naturally, you’ll need an IoT platform that supports this approach.

How Skkynet Fits

With Skkynet, compatibility for the Industrial IoT comes in three components that work seamlessly together: DataHub®, Embedded Toolkit (ETK), and SkkyHub™.

The Cogent DataHub® connects directly to in-plant systems via OPC, Modbus, ODBC and DDE, and is fully integrated with the Red Lion Data Station Plus, giving access to 300 additional industrial protocols. The DataHub supports data aggregation, server-to-server bridging, database logging, redundancy, and other data integration functionality. It also offers WebView, a flexible, web-based HMI.

The Embedded Toolkit (ETK) is a C library that provides the building blocks for embedded systems to connect and communicate with SkkyHub or the DataHub. It has been compiled to run on gateways from Red Lion, B+B SmartWorx, NetComm, and SysLINK, as well as devices from Renesas, Lantronix, Raspberry Pi, Arduino, ARM, and more.

These two components can be connected to and integrated with virtually any industrial system. They can be used separately or together, and can serve as the first stage of evolution towards the cloud at any time, by connecting to SkkyHub.

The SkkyHub™ service collects and distributes real-time data over networks, both locally and remotely. Connecting to the DataHub or any ETK-enabled device, SkkyHub provides secure networking of Industrial IoT data between remote locations, and remote monitoring and supervisory control through WebView.

Skkynet’s Industrial IoT software and services are in wide use today. You can find them connecting manufacturing facilities, wind and solar farms, offshore platforms, mines, pipelines, production lines, gauges, pumps, valves, actuators, and sensors. Their unique combination of security, speed, and compatibility with virtually any industrial system makes the DataHub, ETK, and SkkyHub well-fitting components of the Industrial IoT.