Posts

Industry Embraces Big Data

We blogged about Big Data six years ago. Back then, pushing industrial data to the cloud in real time was a novel idea. Collecting industrial data within the plant for on-site use had been going on for decades, but few companies were integrating that data with enterprise IT or analytical systems.

Today, all that is changing. IoT and Industrie 4.0 are ideal for connecting industrial processes to Big Data. Progressive companies routinely use digital transformation to feed analytical systems and improve performance across the enterprise. Others are taking notice and trying to catch up. A recent research project by Automation World points to the growing acceptance and adoption of Big Data among system integrators and end users, and shows how they are leveraging it.

Half of the system integrators in the study report that most or all of their clients collect production data for improvement analysis. A quarter of the end users surveyed say they collect data from over 76% of their systems and devices.

While most of the data being collected is used for in-plant improvements in equipment and maintenance operations, somewhere between 40% and 54% is also being used for Industry 4.0, smart manufacturing, or digital transformation initiatives. Pulling Big Data from the shop floor has become that important in just a few years' time.

Data collection technologies

Despite the move towards Big Data, the most widely used approaches to collecting data are still hand-written notes entered into a spreadsheet, along with on-site data historians, according to the report. So for many users, the technology hasn't changed significantly since the 1980s. However, cloud and edge technologies are gaining acceptance, and are now used at some level in about a quarter of the facilities surveyed.

The survey didn’t specifically address it, but we see that some technologies originally developed for in-plant use—most notably data historians—are now widely used in edge and cloud scenarios. Some of the most well-known real-time data historians have cloud equivalents, or can be run on cloud servers. As a result, there is no clear line between traditional data collection and IoT-based systems, and there doesn’t need to be.

What is needed is secure, real-time data communication between the plant and the office or cloud. As high-quality data communication is more widely adopted, and as companies implement digital transformation in more areas, we can expect to see a huge growth in Big Data applications to optimize resource use, increase production efficiencies, and bolster the profits of the enterprise.

Case Study: Minera San Cristobal, Bolivia – 1

Connecting corporate and control systems

Minera San Cristobal, owned by Apex Silver and Sumitomo Corporation, is one of the largest silver-zinc-lead mining projects in the world. The mine, located in the Potosi district of southwestern Bolivia, is expected to produce approximately 450 million ounces of silver, 8 billion pounds of zinc, and 3 billion pounds of lead. In the San Cristobal mill, the ore extracted from the mine is crushed, ground, and refined through a flotation process to yield concentrates of silver, zinc, and lead, which are then shipped abroad for final smelting. These processes are monitored and controlled using the DeltaV Professional Plus SCADA system. When the system was first installed, managers at the San Cristobal mill initiated a project called “DeltaV External Historian”. The goal of the project was to store vital process data in a SQL Server database, for three reasons:

  1. To maintain an external backup, outside the process control servers, of the most important process data (more than 3600 points).
  2. To provide access to the plant information from the corporate network, while avoiding the risk of having office personnel connected to the control network.
  3. To interface with corporate ERP systems like JD Edwards.

To achieve all three of these goals, Sr. Mario Mendizabal chose Cogent DataHub® software and used it to connect his DeltaV system to SQL Server. First, he connected a DataHub instance to the DeltaV OPC server on the control network. He then installed a second DataHub instance on the SQL Server machine, which is on the corporate network. Finally, he connected these two DataHub instances over TCP, using DataHub tunnel/mirroring. This connection passes through firewalls, eliminates the need to configure DCOM, and provides a secure link between the corporate and control systems. The tunnelling connection mirrors the data between the two DataHub instances, putting a complete set of data on both machines. To ensure that the control system is completely independent of any input from the corporate side, Sr. Mendizabal configured the connection to be one way only: from DeltaV to the External Historian. This avoids any overwrite problems.
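The one-way tunnel/mirror pattern described above can be sketched in a few lines of Python. This is only an illustration of the concept, not DataHub's actual protocol or API: the `MirrorSource` and `MirrorSink` classes, the JSON-lines wire format, and the point names are all invented for this example. Data flows strictly from the source (control side) to the sink (corporate side), which keeps a mirrored copy of every point.

```python
import json
import socket
import threading

# Minimal sketch of one-way data mirroring over TCP, in the spirit of the
# DataHub tunnel described above. All class and point names here are
# illustrative assumptions, not part of any DataHub API.

class MirrorSink:
    """Corporate side: receives point updates and keeps a mirrored copy."""
    def __init__(self, host="127.0.0.1", port=0):
        self.points = {}
        self._server = socket.create_server((host, port))
        self.port = self._server.getsockname()[1]
        self._thread = threading.Thread(target=self._serve, daemon=True)
        self._thread.start()

    def _serve(self):
        # Accept one connection and apply each update as it arrives.
        conn, _ = self._server.accept()
        with conn, conn.makefile("r") as f:
            for line in f:
                update = json.loads(line)
                self.points[update["name"]] = update["value"]

class MirrorSource:
    """Control side: pushes point updates one way, and never reads back."""
    def __init__(self, host, port):
        self._sock = socket.create_connection((host, port))

    def send(self, name, value):
        msg = json.dumps({"name": name, "value": value}) + "\n"
        self._sock.sendall(msg.encode())

    def close(self):
        self._sock.close()

# Mirror two process points from the "control" side to the "corporate" side.
sink = MirrorSink()
source = MirrorSource("127.0.0.1", sink.port)
source.send("Mill.FlotationCell1.Level", 4.2)
source.send("Mill.Crusher.Load", 87.5)
source.close()
sink._thread.join(timeout=2)
print(sink.points)
```

Because the source only ever writes to the socket, nothing on the corporate side can push a value back into the control network, which is the same property the one-way DataHub configuration provides.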

“The system has been performing very well,” said Sr. Mendizabal. “The backup data log is perfectly accurate, and the connection to the corporate network is functioning just as we had planned. The managers and accounting staff are very pleased to have up-to-the-second access to the most critical data coming out of our control system. We couldn’t have done it so easily or so well without DataHub software.”

Case Study: ABB, Colombia

Electrical substation upgrade: connecting ABB MicroSCADA suite to Oracle using DataHub

A key task for ABB in Colombia is upgrading electrical substations. The ABB development team is always looking for new tools for their substation automation systems to make operation easier, and to provide as much information as possible for the system operators.

One of the latest developments was the addition of a database to their Substation Automation System. The substation automation equipment used by ABB is designed according to the IEC 61850 standard. The idea of adding a database was to allow the operator to access valuable information stored over long periods of time (2–4 years).

“As with most SCADA systems, the trend graphics and the historical data are stored only temporarily,” said one member of the development team. “Your typical substation equipment is designed to have no moving parts. It uses a small, solid-state disk to store data, which is not big enough to store information for long periods of time.”

(Figure: the ABB Colombia substation automation system)

The ABB substation automation system uses ABB's MicroSCADA suite. One of the functions of that software is to provide a gateway between the IEC 61850 protocol and OPC. Using the data available via OPC, the development team chose DataHub® software from Cogent (a subsidiary of Skkynet) to interface between MicroSCADA and Oracle, storing process information on a network disk.

“We found the DataHub to be a powerful, user-friendly software package that allows us to bridge between our OPC server and Oracle,” said a member of the development team. “The support of the Cogent team was great. The DataHub is a very good complement to the IEC 61850 technology. We can save data in the Oracle database, and also monitor live data in the DataHub data browser. The first prototype is now in a testing period, and is running well.”