Case Study: Wind Turbine Farm, USA

DataHub Scripting solution calms the conflict of bats vs. blades

Required by law to protect a rare species of bat, a major wind power generation company finds a solution using the Cogent DataHub®.

A rapid expansion of wind farms across the Eastern and Central United States has been checked in the past couple of years due to growing concerns for wildlife. An endangered bat species lives in that area, and is protected by law. Fears that the whirring blades of wind turbines could be harmful to this species of bat were sufficient to halt construction of a wind farm in West Virginia in 2009, and the discovery of a dead bat near a wind turbine in Pennsylvania in 2011 caused the power company to shut down the whole 35-turbine system for several weeks.

Although wind turbines are known to cause a few fatalities among common tree-dwelling bats, the endangered bat was thought to be largely safe, as it lives in caves, hibernates for more than half the year, and is seldom found in the vicinity of wind turbines. However, in the fall these bats migrate from their feeding grounds to their home caves for the winter. During this time, the chances of them passing through a wind farm are greatly increased.

A few years ago in March, a major power company in the USA was informed by the US Fish & Wildlife Service that a number of turbines on the bat migration routes would need to be shut down while the bats were migrating. This caused quite a stir. The migration period for the bats is two months long, from mid-August to mid-October. Shutting down the whole system for that length of time would be very costly, not to mention the loss of clean energy, which would need to be replaced by fossil fuels.

To maximize uptime, the company gained permission to let the turbines run whenever the bats were not flying: during all daylight hours, and at night when air temperatures dropped below a specific setpoint or when the wind was fairly strong, conditions in which the bats do not fly. The challenge was to implement a complete solution in time. A single bat fatality could mean full shut-down, legal penalties, and even lawsuits.

Top management at the company immediately took action, contacting the wind turbine manufacturer, who also provides the control systems. After several months of emails and meetings, it became apparent that the manufacturer would not have anything ready in time for the mid-August deadline.

“With three weeks to go, they told us there was no solution in sight,” said the SCADA engineer responsible for the project, “and we would need to go to manual operation, and reconfigure the cut-in speed on every turbine, twice a day.”

Most wind turbines are designed to “cut in”, or start turning to produce energy, when the wind reaches a certain speed. For these turbines, the normal cut-in speed is 3.5 meters per second. Because the bats are active at low to moderate wind speeds, the company would need to raise that to 7 meters per second each night, and then drop it back down to 3.5 the following morning. This would mean manually reconfiguring the PLCs for 100 turbines, twice a day.

A better way

“I thought there must be a better way,” the project manager continued. “We’d been using the DataHub for years, and knew the potential was there to leverage this asset further. I gave Skkynet a call, and told them what we were up against. They delivered by helping us to develop a very efficient program using the native scripting language of the DataHub. The code ran right on the SCADA interface of the OEM system – so it’s as reliable as you can get.”

“Working together with Skkynet, we came up with a DataHub script that doesn’t change the cut-in speed of the turbines at all. We just blocked them from starting. The script tells each turbine to stay off, and keeps measuring wind speed. When it picks up to 7 meters per second, the script releases the turbine to start, and it ramps right up to the operating state. At the end of the day, we have a complete audit trail of every turbine controlled, including a history of critical parameters, such as rotational and wind speeds, and energy curtailed.”

“The script also has a temperature component. On cool nights in September and October, when the temperature drops below the dew point, it uses the same algorithm for starting and stopping the wind turbines.”
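To make this logic concrete, here is a minimal sketch in Python of the decision rule described above. The actual script was written in the DataHub’s native scripting language; the temperature setpoint, function names, and turbine interface below are illustrative assumptions, not the production code.

    # Illustrative sketch of the curtailment rule: run in daylight, or at
    # night when it is too cold or too windy for the bats to fly.
    WIND_RELEASE_MS = 7.0      # release turbines at or above 7 m/s
    TEMP_SETPOINT_C = 9.5      # hypothetical night-time temperature setpoint

    def turbine_may_run(is_daylight, wind_speed_ms, air_temp_c):
        """Return True if a turbine may produce power without risk to bats."""
        if is_daylight:
            return True                      # bats do not fly in daylight
        if air_temp_c < TEMP_SETPOINT_C:
            return True                      # too cold for bats to fly
        if wind_speed_ms >= WIND_RELEASE_MS:
            return True                      # too windy for bats to fly
        return False                         # warm, calm night: hold off

    def control_cycle(turbines, conditions, audit_log):
        """One pass over the farm: hold or release each turbine, and log it."""
        for t in turbines:
            allowed = turbine_may_run(conditions.is_daylight,
                                      conditions.wind_speed_ms,
                                      conditions.air_temp_c)
            t.release() if allowed else t.hold_off()
            audit_log.append((t.name, allowed, conditions.wind_speed_ms,
                              conditions.air_temp_c))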

By the first week of August a test script was written, and after a few days of testing and last-minute tweaks, it was ready. The system went live on August 15th, and is meeting all expectations. Every night, whenever the air temperature is above the setpoint and the wind speed falls below 7 meters per second, the wind turbines stop, allowing the endangered bats to return safely to their caves for a long winter hibernation.

“I call the DataHub the Canadian Swiss Army Knife,” said the project manager. “We are able to accomplish a host of required functions with a single product solution. The ability to provide sophisticated logic and control algorithms with the built-in functionality of this product is the game changer. Being able to securely deliver real-time data between a site and the control center system allows the dispatch team to monitor the control process and maximize the production of clean, renewable energy sources. Talk about a smart grid – who would have thought we’d be doing this type of thing in real time?”

Case Study: ABB Energy Automation, Italy

Secure OPC tunnelling between power plants and company offices

In two recent projects, Italy’s ABB Energy Automation has developed a control solution that feeds data from power plant facilities directly to corporate offices – in real time – using the Cogent DataHub®. A key requirement was to provide a highly secure means of data transmission, with minimal risk of break-ins. The DataHub tunnelling solution establishes a secure, reliable connection between the power plant and corporate networks.

ABB Energy Automation implements software and control systems for power plants to ensure that equipment operates at optimum speed and efficiency. For this project, it became clear that several Italian power companies would benefit substantially by monitoring the performance of the plant directly from the company offices. Mr. Michele Mannucci, ABB Project Engineer, began looking for a way to make the connection, using the most reliable and secure means available.

“Customers are very sensitive about security these days since they need to exchange information on the web,” he said. “We had OPC DA servers on our equipment, but found that using DCOM for networking was too risky. It required us to open too many ports in our firewalls. We had to find a way to avoid using DCOM.”

A search on the web brought Mr. Mannucci to the DataHub. For the first test, he connected the DataHub to the plant’s DigiVis Freelance 2000 OPC server, and then connected to an OPC client, tunnelling through the plant firewall using just one open port. With that working, he installed another DataHub on the corporate network, and then created a mirroring connection between the two DataHubs.

For the production system, the company decided to use ABB’s own proprietary OPC server on the secure LAN in the plant, and connect that to the DataHub. From the DataHub the data flows out through a single port on the plant firewall via SSL-encrypted TCP to a DataHub in the corporate offices, which is connected to the corporate LAN. The two DataHubs mirror the data, so that every data change on the plant LAN is immediately received on the corporate LAN.
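The general idea of the tunnel can be pictured with a short Python sketch: every data change is forwarded as a small message over a single TLS-encrypted TCP connection, so only one firewall port needs to be open. This is a simplified illustration with assumed host, port, and message-format names; it is not the DataHub’s actual wire protocol.

    import json, socket, ssl

    # The single port opened on the plant firewall (hypothetical address).
    CORPORATE_HUB = ("corporate-hub.example.com", 4600)

    def send_updates(updates):
        """Forward (point, value, timestamp) changes over one TLS socket."""
        ctx = ssl.create_default_context()
        with socket.create_connection(CORPORATE_HUB) as raw:
            with ctx.wrap_socket(raw, server_hostname=CORPORATE_HUB[0]) as tls:
                for point, value, ts in updates:
                    msg = json.dumps({"point": point, "value": value, "t": ts})
                    tls.sendall((msg + "\n").encode("utf-8"))  # one change per line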

“For us, this OPC tunnel is very good, because we only need to open one port, and we are secure from DCOM break-ins,” said Mannucci. “We are considering installing this same solution in our top plants.”

It took only a few days for Mannucci to go from initial testing to a working system in the first power plant. The second system was up and running in a similar time frame. Both systems have been running 24/7 since installation, with no breaches in security.

Case Study: Gazprom, Russia

Gazprom integrates SCADA, HMI modules, RTUs, data processing and historical archiving

Gazprom, the largest gas producing company in the world and responsible for 8% of Russia’s GDP, is using the DataHub® to monitor and control pumps, valves, consumption control units, cranes, and other equipment along 23,000 kilometers of pipeline spanning much of western Russia. The control system was developed by the Federal State Unitary Enterprise and is called the Unified Remote-Control Complex, or UNK TM. Software sales and support were provided by SWD Software Ltd., a QNX and Cogent distributor in St. Petersburg, Russia.

“The DataHub was the perfect tool for the job,” said Mr. Leonid Agafonov, Managing Director of SWD. “It is easy to use and provides robust connectivity for the whole control system. Our customer is very pleased with the project, particularly the reliability of the software.”

The system is an open, distributed-information control system with modular hardware architecture running on the QNX 4 operating system. A DataHub operates in each Control Room, and is connected to a number of Remote Terminal Units (RTUs), which in turn are connected to valves, pumps, and other hardware. The DataHub is also connected to a SCADA system, various HMI modules, and the Cascade Historian, which stores data to disk.

The system provides real-time operation, a multi-window graphical user interface, data processing components, and archival disk storage of data. Workstation devices and services, such as electrochemical protection and operational service, can be added or removed at any time. There is also teletext communication between the Control Room and the RTUs, through the DataHub.
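As a rough illustration of this data flow, the Python sketch below shows values passing through a central in-memory point table that both an HMI display and a disk archive observe. The class and point names are hypothetical stand-ins for the UNK TM components described above.

    import time

    class PointTable:
        """Central in-memory store; callbacks mimic hub-style distribution."""
        def __init__(self):
            self.points = {}
            self.subscribers = []

        def write(self, name, value):
            self.points[name] = (value, time.time())
            for notify in self.subscribers:      # fan out to HMI, historian
                notify(name, value)

    hub = PointTable()
    archive = []                                 # stand-in for the historian
    hub.subscribers.append(lambda n, v: archive.append((time.time(), n, v)))
    hub.subscribers.append(lambda n, v: print(f"HMI update: {n} = {v}"))

    hub.write("pipeline.valve12.position", 87.5) # e.g. a value polled from an RTU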

The Unified Remote-Control Complex has successfully passed tests administered by the Interdepartmental State Testing Commission and has been recommended for use at OAO “Gazprom” units and facilities. The system was developed by the Federal State Unitary Enterprise “FNPZ Y.E.Sedakov NIIIS”. It has a Measurement Instrumentation Approval Certification #6398 and is listed as #18430-99 in the State Measurement Instrumentation Register.

Case Study: TEVA API Pharmaceuticals, Hungary

TEVA combines tunnelling and aggregation to network OPC data through a firewall

Laszlo Simon is the Engineering Manager for the TEVA API plant in Debrecen, Hungary. He had a project that sounded simple enough: connect new control applications through several OPC stations to an existing SCADA network. The plant was already running large YOKOGAWA DCS and GE PLC control systems, connected to a number of distributed SCADA workstations. However, Mr. Simon did face a couple of interesting challenges in this project:

  • The OPC servers and SCADA systems were on different computers, separated by a company firewall. This made it extremely difficult to connect OPC over a network, because of the complexities of configuring DCOM and Windows security permissions.
  • Each SCADA system needed to access data from all of the new OPC server stations. This meant Mr. Simon needed a way to aggregate data from all the OPC stations into a single common data set.

After searching the web, Mr. Simon downloaded and installed the DataHub®. Very quickly he had connected the DataHub to his OPC servers and determined that he was reading live process data from TEVA’s new control systems. He was also able to easily set up the OPC tunnelling link between the OPC server stations and the SCADA workstations, by simply installing another DataHub on the SCADA computer and configuring it to connect to the OPC server stations.
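The aggregation step can be pictured with a small Python sketch: points from several OPC server stations are merged into one common data set by qualifying each point name with its source station. The station and point names here are invented for illustration.

    def aggregate(stations):
        """Merge per-station point dictionaries into one flat namespace."""
        merged = {}
        for station, points in stations.items():
            for name, value in points.items():
                merged[f"{station}.{name}"] = value  # e.g. "OPC1.Reactor.Temp"
        return merged

    combined = aggregate({
        "OPC1": {"Reactor.Temp": 74.2, "Reactor.Pressure": 1.8},
        "OPC2": {"Dryer.Speed": 120.0},
    })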

“I wanted to reduce and simplify the communication over the network because of our firewall. It was very easy with the DataHub,” said Mr. Simon after the system was up and running. Currently about 7,000 points are being transferred across the network in real time. “In the future, the additional integration of the existing or new OPC servers will be with the DataHub.”

Case Study: Plastics Manufacturer, Scandinavia

Leading plastics manufacturer uses live process data to optimize production, saving time and materials

One of Scandinavia’s leading plastics manufacturers has chosen the DataHub® from Cogent Real-Time Systems (a subsidiary of Skkynet) to extract data and interact with their state-of-the-art plastic manufacturing equipment. The firm can now access any desired process data for the purposes of engineering analysis and enterprise-level resource planning. The DataHub was the only additional piece of software required to realize substantial savings of time, materials, and production costs.

“The DataHub is exactly the kind of application we needed,” said the project coordinator. “Our system is extensive, and we need to visualize a lot of production parameters. We looked at other solutions but they were too expensive and more complicated.”

When the company installed new equipment recently, the necessary system integration grew very complex. Progress was slow. After almost a year they were facing a deadline and had little to show for their time and effort. The goal was to pull together data from 15 machinery units and feed it in real time into the company’s business processing systems and, if possible, to enable plant engineers to view and work with the live data as well. When they found the DataHub, they were pleased to learn that most of the work had already been done.

The first test was to connect the DataHub to an OPC server and put live data into ODBC databases, Excel spreadsheets, and web browsers, as well as to aggregate OPC servers and tunnel data across a network. The DataHub proved to be easy to use and reliable, and it performed remarkably well. The next step was to set up a test system.

The test system connected all of the OPC servers for the plant’s plastics production machines to a central DataHub. Another DataHub, at a network node in the engineering department, is connected to the central DataHub by a mirroring connection, tunnelling data across the network. This second DataHub is then connected to an Excel spreadsheet to give a live display of the data in real time. When a machine starts up on the production line, the chart comes to life: cells spontaneously update their values and bar charts spring into existence.

The engineering department developed a custom TCP application that uses the DataHub C++ API to make a direct connection from the DataHub to their SQL Server database. Once connected, the database is updated within milliseconds of any change in the plastic-manufacturing machinery. From the SQL Server database the data is accessed by the company’s ERP and accounting software. Using the DataHub in these ways allows the company to:

  • Aggregate the data from all machinery into one central location.
  • Distribute the data across the network to various users.
  • Do decimal conversions of the data as it passes through the DataHub.
  • Put selected subsets of data into Excel for engineers to view and run calculations on.
  • Feed values into a SQL Server database in the company’s IT and business processing system, as sketched below. The OPC points are read-only to ensure a clean separation between the management and production areas.
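As a rough sketch of that update path, the Python fragment below (using sqlite3 as a stand-in for SQL Server) shows a point change arriving, a decimal conversion being applied, and a row being written to the database. The production application used the DataHub C++ API; the point names and scale factors here are hypothetical.

    import sqlite3, time

    db = sqlite3.connect("production.db")
    db.execute("CREATE TABLE IF NOT EXISTS readings "
               "(ts REAL, point TEXT, value REAL)")

    SCALE = {"Extruder1.Temp": 0.1}     # raw tenths of a degree -> degrees

    def on_point_change(point, raw_value):
        """Handle one read-only point update from the production side."""
        value = raw_value * SCALE.get(point, 1.0)   # decimal conversion
        db.execute("INSERT INTO readings VALUES (?, ?, ?)",
                   (time.time(), point, value))
        db.commit()

    on_point_change("Extruder1.Temp", 742)          # stored as 74.2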

“This system pays for itself,” said a company spokesman, “and we save money in many ways. We have seen substantial gains in productivity and performance because we can monitor our processes far more effectively. Our accounting and planning departments have, for the first time ever, an up-to-the-second record of actual production variables and statistics. At the same time, our engineering staff can use real-time data in their calculations, and feed the results directly back into the process.”

The DataHub also saved substantial programming costs. The time alone saved on development work has paid for the system many times over. With a single tool the project coordinator has met the various needs of both the engineers and company managers. “The software is easy to install and it works well,” he said. “It’s at the correct level for our needs.”

Case Study: University of California, Berkeley, USA

DataHub is used to integrate data for distributed control of unmanned aerial vehicles

For the past several years, students and faculty at the Vehicle Dynamics Lab (VDL) of the University of California, Berkeley, have been developing a system of coordinated distributed control, communications, and vision-based control among a group of several unmanned aircraft. A single user can control the fleet of aircraft, and command it to carry out complex missions such as patrolling a border, following a highway, or visiting a specified location. Each airplane carries a video camera and an on-board computer, and communicates with the groundstation and the other aircraft in the formation. The control algorithms are so sophisticated that the fleet can carry out certain missions completely autonomously—without any operator intervention.

The control system for each aircraft runs on a PC/104 computer with the QNX 6 operating system. Control is divided into three kinds of processes: communication, image processing, and task control. All of these processes interact through the DataHub running in QNX. The DataHub® is a memory-resident, real-time database that allows multiple processes to share data on a publish-subscribe basis. For this application, each process writes its data to the DataHub, and subscribes to the data of each other process on a read-only basis. In this way, each process gains access to the data it needs from the other processes, while avoiding problems associated with multi-processing data management.
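This ownership rule can be illustrated with a small Python sketch: each process may write only the variables it owns, while any process can subscribe to the rest on a read-only basis. It is a toy model of the pattern, not the DataHub’s actual API.

    class SharedStore:
        """Toy publish-subscribe store with owner-only writes."""
        def __init__(self):
            self.values, self.owners, self.subs = {}, {}, {}

        def register(self, name, owner):
            self.owners[name] = owner

        def subscribe(self, name, callback):
            self.subs.setdefault(name, []).append(callback)

        def write(self, name, value, writer):
            if self.owners.get(name) != writer:
                raise PermissionError(f"{writer} does not own {name}")
            self.values[name] = value
            for cb in self.subs.get(name, []):   # push to subscribers
                cb(name, value)

    store = SharedStore()
    store.register("piccolo.airspeed", owner="piccolo")
    store.subscribe("piccolo.airspeed",
                    lambda n, v: print(f"task control sees {n} = {v}"))
    store.write("piccolo.airspeed", 23.4, writer="piccolo")  # allowed
    # store.write("piccolo.airspeed", 0.0, writer="payload") # would raise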

For example, the communication software comprises three separate processes: The Piccolo process controls the aircraft, the Payload process communicates with users on the ground, and the Orinoco process handles communications with the other aircraft. Needless to say, each of these three programs needs information from the other two, as well as from the video and task control packages. All of this data is transferred seamlessly through the DataHub.

“The DataHub has contributed a great deal to our software integration,” said Brandon Basso, one of the VDL team members. “Its ability to restrict write privileges for each shared variable to the owner process avoids many of the difficulties associated with multi-process management.”

For task control, there are two primary software packages: Waypoint controls visits to specified locations, while Orbit handles the orbiting “patrol” of a group of locations. These processes are monitored by a third, supervisory process called Switchboard. In addition to this coordination among processes, the aircraft themselves must decide which plane will take on which task. The complex calculations needed for this decentralized task allocation are mediated through the DataHub.
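As a toy illustration of decentralized allocation, the Python sketch below assigns each task to the aircraft that publishes the lowest bid, for example its distance to the task. This is only one simple way such an allocation could work; it is not the VDL team’s actual algorithm.

    def allocate(tasks, bids):
        """bids[aircraft][task] -> cost; give each task to its cheapest bidder."""
        assignment = {}
        for task in tasks:
            winner = min(bids, key=lambda a: bids[a][task])
            assignment[task] = winner
        return assignment

    print(allocate(["patrol_A", "visit_B"],
                   {"uav1": {"patrol_A": 2.0, "visit_B": 5.0},
                    "uav2": {"patrol_A": 3.5, "visit_B": 1.2}}))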

Waypoint and Orbit use input from the vision control and the vision process. Prior to takeoff, certain algorithms are applied to previously recorded videos to create a visual profile of the area, which is maintained by the vision control. In the air, this data must be compared to what the plane is currently flying over. A camera on the wing of the plane feeds data to the vision process, which analyzes the content and generates meaningful information about objects on the ground, such as waypoints on a river or road. This live content, along with the stored visual profile in the vision control, is fed through the DataHub to Waypoint and Orbit.

According to the paper A Modular Software Infrastructure for Distributed Control of Collaborating UAVs, published by the University of California, Berkeley, which describes the project in detail, this work marks “a major milestone in UAV cooperation: decentralized task allocation for a dynamically changing mission, via onboard computation and direct aircraft-to-aircraft communication.” Skkynet is pleased that the DataHub has played an important role in the success of this endeavour.