Welcome Note

Welcome, readers. I would like to share my knowledge in the fascinating field of system integration: the integration of ERP systems with Planning & Scheduling, Manufacturing Execution Systems (MES), and shop floor control systems.

Wednesday, June 30, 2010

What is the purpose of a data compression algorithm in a Process Historian?

Author - Roger Palmen
IT Consultant - MES at Logica

The advantages are clear: a data historian can process immense quantities of data. Any good historian can process thousands of data points with sampling rates of up to once per millisecond. When you do the maths, sampling once every millisecond amounts to 31.5 billion samples per year. At a small storage size of about 2 bytes per sample, that is roughly 58.7 GB per year for a single data point. A small server will have 1,000 points, but I have seen systems running 60,000 to 70,000 points. You would spend your days adding disk arrays. And storing is one thing, but how do you access these huge amounts of data? For reference: the most-used historian (OSIsoft PI Server) requires a 'simple' quad-core server with 8 GB of memory and 2.2 TB of drive space to power a 1-million-point server capturing 1 year of data. And that server will cost you much less than $10,000 in hardware and OS.

So that brings us to the disadvantages. As far as I'm concerned there aren't any. Why? Because you can throw data away without losing any of the INFORMATION contained in that data. Let's go to the practical examples. Compression algorithms generally compress on the amplitude and frequency of the points that need to be stored.

Let's look at amplitude first. Suppose you have a temperature gauge in your process that effectively measures at one-degree accuracy, but is connected to a system that indicates the temperature with 3 decimal digits. If we then throw away all differences smaller than 0.5 degrees, we lose a lot of data but no information, right?

The same applies to frequency. Let's look at the fuel gauge in your car. If you hit the brakes, the fuel will slosh through the tank and make the reading bounce up and down vividly. But is that really relevant? Looking at the general trend, it should go down only a little every minute (except when refuelling, of course). So there is no need to capture all the details, because you're just interested in the general trend. The theory behind this is the Nyquist–Shannon sampling theorem; take a look at Wikipedia for some details about that.

In the real world there are a few rules of thumb you can use to define the compression settings for each point. Using those, you can easily reduce the data volume by 90% or more without losing any information. To summarize: 1) Any substantial system cannot work well or cost-effectively without compression algorithms. 2) When set up right, there are no theoretical drawbacks to using compression algorithms. 3) One exception: if you don't know what is relevant in your data, don't use compression. But then you're looking at research applications, where you do not know beforehand what is relevant.
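The amplitude idea above can be sketched as a simple deadband (exception) filter. Real historians such as PI use more sophisticated swinging-door compression, so treat this as a minimal illustration only:

```python
def deadband_filter(samples, deadband=0.5):
    """Keep a sample only if it differs from the last kept value
    by more than the deadband (e.g. 0.5 degrees)."""
    kept = []
    last = None
    for t, value in samples:
        if last is None or abs(value - last) > deadband:
            kept.append((t, value))
            last = value
    return kept

# A temperature that only wiggles in the third decimal compresses away:
raw = [(0, 20.001), (1, 20.003), (2, 20.002), (3, 21.000), (4, 21.004)]
print(deadband_filter(raw))  # [(0, 20.001), (3, 21.0)]
```

Five raw samples reduce to two stored ones, yet anyone reading the archive still knows the temperature to within the gauge's real accuracy.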

What is the difference between a Process Historian and a Relational Database?

Good article from other RSS :
Relational Databases vs. Plant Data Historians—Which One Is Right For You?
July 19, 2007 By: Jack Wilkins


Today, sensors are everywhere. They do everything from counting parts on an assembly line to measuring the quality of products. But some of the biggest challenges occur after measurements have been made. At that point, you have to decide: Where do I collect the data, and how can I use it to improve my operations by decreasing variability and improving quality?
Choices
In the manufacturing arena, real-time operations require fast data collection for optimal analysis. Generally, manufacturing companies approach data collection in one of two ways: with a traditional relational database or with a plant data historian.
Each offers distinct advantages. A relational database is built to manage relationships, but a plant data historian is optimized for time-series data. For example, relational databases are great at answering a question such as: "Which customer ordered the largest shipment?" A plant data historian, however, excels at answering questions such as: "What was today's hourly unit production standard deviation?"
Relational Databases
This type of database is an ideal option for storing contextual or genealogical information about your manufacturing processes. The relational nature of the database provides a flexible architecture and the ability to integrate well with other business systems. When extending the functionality of a relational database for manufacturing applications, companies leverage its openness by creating and managing custom tables to store data that comes from multiple sources, such as other databases, manually entered values via forms, and XML files.
As relational databases mature, I see vendors improving their system's performance in transactional manufacturing applications, such as capturing data from an RFID reader. When capturing contextual information or time-series data from a small number of sensors, a relational database may work best.
Plant Data Historians
On the other hand, plant data historians are a perfect choice when you must capture data from sensors and other real-time systems because this type of repository uses manufacturing standards, such as OPC, that facilitate communications. With plant data historians, you can streamline implementation by using standard interfaces.
With most of these systems, there is little or no management or creation of data schema, triggers, stored procedures, or views. You can usually install and configure a plant data historian quickly without specialized services, such as custom coding or scripting for the installation.
Plant data historians are also designed to survive the harshness of the production floor and feature the ability to continue capturing and storing data even if the main data store is unavailable. Another feature typically found in a plant data historian is the ability to compress data, reducing the amount of drive space required. When capturing time-series data rapidly (with a re-read rate of less than 5 s) for several thousand sensors, a plant data historian may work best.
The Best of Both Worlds
When relational databases and plant data historians are deployed in concert, companies can collect and analyze the tremendous volumes of information generated in their plants, improve performance, integrate the plant floor with business systems, and reduce the cost of meeting industry regulations. As stated by many Six Sigma quality experts, "You can't improve what you don't measure." Plantwide data collection can make this possible.
By using analysis tools, such as Microsoft Excel or other off-the-shelf reporting solutions, you can increase the quality and consistency of your products by comparing past production runs, analyzing the data prior to a downtime event, and plotting ideal production runs against in-process runs. Today's analysis tools make it easy to aggregate data, prepare reports, and share information using standard Web browsers.
Plantwide in-process data collection also serves as the vital link between plant processes and business operations, providing business systems with the data they need to gain a clear, accurate picture of current production status or historical trends.
Ultimately, the best decision is to use both a relational database and a plant data historian. The combined power of the two provides the detailed information that yields numerous benefits, internally for the company and externally for customers.
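The historian-style question quoted above ("What was today's hourly unit production standard deviation?") is just an aggregation over time-bucketed samples. A minimal sketch, using made-up production numbers:

```python
import statistics
from collections import defaultdict

# (hour, units_produced) samples -- illustrative data only
samples = [(9, 100), (9, 104), (10, 98), (10, 110), (11, 102), (11, 96)]

# Bucket production counts by hour to get a per-hour total
hourly = defaultdict(int)
for hour, units in samples:
    hourly[hour] += units

totals = list(hourly.values())    # [204, 208, 198]
print(statistics.pstdev(totals))  # population std dev of the hourly totals
```

A historian performs exactly this kind of time-bucketed aggregation natively and at scale, which is why it answers such questions so much faster than a general-purpose relational query.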

Wednesday, June 23, 2010

Option C : Real time data for Silo management (SAP ODA)

The third option would be to use existing components available in the SAP PP module. ODA stands for OPC Data Access. The SAP ODA tool has two parts, a server and a client. The ODA server component needs to be installed on the TIS system, and the ODA client resides in the SAP ECC system. This approach, however, needs ABAP development for building the business logic, plus some simple UI development.
Merits:
- Simple interface
Demerits:
- Setting up the remote OPC connection involves DCOM hurdles.

Monday, June 21, 2010

Option B : Real time data for Silo management (SAP XI)

The second option would be to use the SAP PI / XI module. XI is a repository for all enterprise services and is capable of handling B2B (Business to Business) integration. Since XI can also handle a large volume of messages, it is a viable option for integrating the Tankfarm Information System (TIS). Before that, however, the TIS must have an add-on component known as the OPC Xi interface (source: http://www.opcfoundation.org/). Xi stands for eXpress Interface (not related to SAP XI). This interface exposes OPC real-time data in the form of web services. Once the web services are available, they can be registered in the SAP XI repository and accessed by other SAP modules such as IS-Oil, PP, PM, etc.
Merits:
1. SOA enabled
2. Lower TCO
3. As SAP XI can be a central instance, maintaining the system landscape is easier
4. Remote OPC access is free from DCOM configuration problems and can work across a firewall

Tuesday, June 15, 2010

Option A : Real time data for Silo management

The physical inventory update for SAP IS-Oil silo management is done manually (using transaction O4_TIGER). This can be automated using SAP MII (Manufacturing Integration and Intelligence). The data flow starts with the tank farm inventory system (TIS). TIS has the real-time data of storage tank levels, which can be accessed through its underlying OPC server. The OPC client adapter in MII collects data from the TIS OPC server and transfers it to the IS-Oil system. The MII instance can be located either at corporate or at a remote site (near the storage location). If it is a corporate instance, then the MII Universal Data Source (UDS) needs to be installed on the TIS system. UDS creates a TCP/IP tunnel to access the tank data in a patented binary format.
Merits:
a. Ability to build additional business logic (business processes specific to the customer)
b. Scalability
c. Custom user interface
In the latest versions of MII, UDS can be replaced with PCo (Plant Connectivity), a .NET-based tool.
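The UDS wire format is patented and binary, so the sketch below substitutes a hypothetical plain-text request/reply over TCP purely to illustrate the tunnel idea. The tag name `TK101.LEVEL` and the message format are my own assumptions, not the real UDS protocol:

```python
import socket
import threading

# Hypothetical plain-text stand-in for the (proprietary) UDS tunnel:
# the client sends a tag name, the server replies with its current value.
TANK_LEVELS = {"TK101.LEVEL": "4.273"}  # metres, illustrative only

def serve_once(server_sock):
    conn, _ = server_sock.accept()
    with conn:
        tag = conn.recv(1024).decode().strip()
        conn.sendall(TANK_LEVELS.get(tag, "?").encode())

server = socket.socket()
server.bind(("127.0.0.1", 0))  # ephemeral port, stands in for the TIS host
server.listen(1)
threading.Thread(target=serve_once, args=(server,), daemon=True).start()

client = socket.create_connection(server.getsockname())
client.sendall(b"TK101.LEVEL\n")
level = float(client.recv(1024).decode())
client.close()
print(level)  # 4.273
```

In the real deployment, the MII data server plays the client role and the UDS agent on the TIS box plays the server role; only the framing differs.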

Thursday, June 10, 2010

Real time data for Silo management (SAP IS-Oil)

In the oil and gas industry, accounting and data reconciliation play a critical role, and more frequent, real-time data is a good starting point for inventory management and oil accounting. Almost all oil companies and storage locations are already equipped with a tank farm information system (TIS) from major vendors such as Invensys, Honeywell (Enraf) and Emerson (SAAB).
There are three ways we can retrieve real-time inventory data from TIS systems:
1. Use the OPC server of the TIS itself with a suitable OPC client
2. Enable OPC by using the Kepware UCON tool (all the TIS vendors support the TRL/2 protocol)
3. SOA (Service Oriented Architecture): install the OPC Xi client/server tool on top of the OPC server of the TIS and access the OPC tag values as a web service

Option 3 would be the best one, given its easier setup, better maintainability and lower TCO.
In the next post we will look at the various architectures for getting real-time data.

Friday, April 9, 2010

Handling of Device data

The data from the devices/controllers needs to be validated before it can be used. The validation can be part of the device driver itself (the best choice), or it can be performed by the layer above it. The validation involves health-check diagnostics: checking whether the device is alive and communicating, whether the feedback is readable (no junk characters), and so on.
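A minimal sketch of such a validation layer, assuming a plain-ASCII numeric reading; the exact checks and reply format will of course depend on the device:

```python
def validate_reading(raw: bytes):
    """Basic health checks on a raw device response:
    non-empty (device alive), printable ASCII (no junk), numeric payload."""
    if not raw:
        return None                      # device not responding
    try:
        text = raw.decode("ascii").strip()
    except UnicodeDecodeError:
        return None                      # junk characters on the line
    if not text.replace(".", "", 1).lstrip("+-").isdigit():
        return None                      # not a numeric reading
    return float(text)

print(validate_reading(b"42.5\r\n"))   # 42.5
print(validate_reading(b"\xff\xfe"))   # None (junk characters)
print(validate_reading(b""))           # None (no response)
```

Returning `None` instead of raising lets the layer above decide whether to retry the poll or raise a communication alarm.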

The validated device data has to be processed by a logic-building application, e.g.:
1. PI ProcessBook from OSIsoft http://www.osisoft.com/software-support/products/PI_ProcessBook.aspx
2. InTouch from Wonderware / Invensys http://global.wonderware.com/EN/Pages/WonderwareInTouchHMI.aspx
3. Business Logic Editor in the xMII module of SAP http://help.sap.com/saphelp_xmii115/helpdata/en/Introduction/IllumSystemOverview.htm

The above tools provide user-friendly logic-building features, and they come with industry-standard data connectors or adapters. The following are the industry-standard connectors:
1. OPC (OLE for Process Control) - connects to time-series data from shop floor control systems - http://www.opcfoundation.org/
2. OLE DB - connects to different types of Relational Database Management Systems (RDBMS); ISAM (Indexed Sequential Access Method) for accessing flat files

Thursday, April 8, 2010

Device drivers for communication between Controllers and PC

Once the protocol for a controller is available, communication between the controller and a PC can be enabled using device drivers. Device drivers are sets of instructions developed to communicate with the controllers, traditionally written in low-level languages. Long-standing programming tools are C, C++, VC++ and embedded Java. The key parameters required are:

1. Baud rate of the device (e.g. 9600)
2. Data bits
3. Parity
4. Stop bits
5. Flow control

Once the above parameters are available, we can open the port using the programming tool. Here the port is the COM port (9-pin serial port) of the PC, and opening it establishes a communication channel with the controller. After opening the port, the command or instruction has to be sent based on the protocol document.

More information:
Sample protocol document for batch controllers: http://info.smithmeter.com/literature/docs/mn06069l.pdf
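The last step, framing a command according to the protocol document, can be sketched as follows. The frame layout (STX, address, two-letter command, ETX, XOR checksum) is hypothetical, not the actual Smith Meter protocol; consult the linked manual for the real format:

```python
def build_frame(address: int, command: str) -> bytes:
    """Build a hypothetical request frame: STX, unit address, two-letter
    command, ETX, then a one-byte XOR checksum over all preceding bytes."""
    assert len(command) == 2, "batch-controller commands are two letters"
    body = bytes([0x02, address]) + command.encode("ascii") + bytes([0x03])
    checksum = 0
    for b in body:
        checksum ^= b
    return body + bytes([checksum])

frame = build_frame(1, "RT")   # e.g. a hypothetical "read totalizer" on unit 1
print(frame.hex())             # 020152540306
```

The bytes returned here are what the driver would write to the opened COM port before waiting for the controller's reply.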

Tuesday, April 6, 2010

Controllers in detail

Controllers are the heart of the sensor-to-boardroom integration scenario. Controllers are basically microprocessors, and the processing capability of these microprocessors varies based on the application. For example, in the case of flow controllers or batch controllers, the microprocessor is capable of processing two-letter commands, with a command set of fewer than 100 commands. The configuration parameters are stored in PROM-type memory (there is no hard disk). There will be an Input/Output (I/O) circuit board; using the I/O board, users can connect digital or analog input and output devices. The input devices in this case are solenoid valves, flow measurement devices such as Positive Displacement (PD) meters, and valve position sensors. The output devices are display units and printers.

The controllers can be communicated with through a personal computer connected to them. The communication takes place in serial mode (RS-232/422/485). The communication protocol plays a major role in establishing the connection. Most controllers developed in earlier days had proprietary protocols (e.g. the Smith protocol in AccuLoad controllers). Recent developments in the industry have pushed the controller vendors to standardize on the MODBUS protocol, which will likely be the standard for controller communication in the near future.

Also, the recent influence of Information Technology (IT) developments has added TCP/IP data transmission to the industrial automation network as well. Controller research and development has progressed along the following modes of communication:
1. Modbus protocol support
2. TCP/IP
Today's advanced controllers come with a Modbus communication port and a TCP/IP port, apart from the traditional RS-232/422/485 ports.
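As an illustration of what a Modbus driver actually sends on the wire, here is a sketch that builds a "read holding registers" request with the CRC-16 defined in the Modbus protocol guide linked below:

```python
def crc16_modbus(data: bytes) -> bytes:
    """CRC-16 as specified for Modbus RTU (polynomial 0xA001, init 0xFFFF),
    returned low byte first, as it is transmitted on the wire."""
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return bytes([crc & 0xFF, crc >> 8])

def read_holding_registers(slave: int, start: int, count: int) -> bytes:
    """Function code 0x03: read 'count' holding registers from 'start'."""
    pdu = bytes([slave, 0x03, start >> 8, start & 0xFF, count >> 8, count & 0xFF])
    return pdu + crc16_modbus(pdu)

print(read_holding_registers(1, 0, 1).hex())  # 010300000001840a
```

For Modbus TCP the same PDU is carried inside a TCP segment with an MBAP header instead of the CRC, which is what today's dual-port controllers exploit.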

Hence, if you are interested in connecting a set of similar devices to a Manufacturing Execution System (MES), or directly to an ERP or other system, your best bet would be serial-to-TCP/IP converters: integrate the sensors with the controllers in serial mode, and the controllers with the external system over TCP/IP. These serial-to-TCP/IP converters come with an optional USB port and a fibre-optic port. The fibre-optic port can be used if the controller needs to be located in a remote place, far from the communicating computer.

Also, remote controllers can be connected to a TCP/IP converter, which in turn can be connected to a wireless LAN device or a built-in GPRS-enabled device. This helps in troubleshooting controllers situated in remote locations, e.g. Remote Terminal Units (RTUs) in the cross-country pipelines of the oil & gas industry.

More information:
Batch controllers: http://www.ryanhercoflowsolutions.com/Markets/VendorArticles/Signet/BatchControls.pdf
Modbus protocol: http://www.modbus.org/docs/PI_MBUS_300.pdf
Serial-to-TCP/IP converters: http://www.moxa.com/product/NPort_S8000.htm

Let's Begin with Sensors

Sensors are the basic elements in any controller configuration. In a typical process control scenario, a sensor is a device that measures and transmits a process value to the controller (e.g. RTD - temperature, diaphragm - pressure, load cell - mass, orifice - flow, household security sensors, proximity card readers, etc.). The sensor must be connected to a controller, and the type of connection and communication method differs based on the controller. For example, a proximity card reader needs to be connected by a serial cable, which indirectly limits the distance between the sensor and its controller to a maximum of 100 to 200 metres.
In hydrocarbon handling areas, proximity-type sensors need to be of the intrinsically safe type, and only a few vendors (such as Honeywell and HID) are available. For oil & gas industry applications, take extra care while selecting sensors.

The key points when selecting sensors:
1. Type of industry (intrinsically safe or not)
2. Sensor sensitivity
3. Method of communication between the sensor and the controller
For more information on proximity-type sensors and cards, see http://www.hidglobal.com/technology.php?tech_cat=4&subcat_id=9

Sunday, April 4, 2010

Web services for accessing OPC data from ERP systems

The OPC XML-DA standard will be the future for accessing shop floor system data from ERP systems. The main advantages are:

1. It avoids DCOM-related OPC configuration settings when setting up OPC client systems

2. There is no need to install an OPC client on user systems

3. Current ERP systems ship with very good web-service consumption modules (e.g. SAP PI (XI))

4. Security services are standard and easy to configure, since everything is a web service

Sample OPC XML-DA gateway servers and demo servers are available at http://www.advosol.com/pc-6-5-xml-da-server-side-gateway.aspx
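As a sketch of what an OPC XML-DA call looks like on the wire, the following builds a minimal SOAP Read request using only the Python standard library. The namespace follows the OPC XML-DA 1.0 specification, while the tag name and the posting details are illustrative assumptions:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
OPC_NS = "http://opcfoundation.org/webservices/XMLDA/1.0/"  # OPC XML-DA 1.0

def build_read_request(item_name: str) -> bytes:
    """Build a minimal SOAP envelope for an OPC XML-DA Read of one item."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    read = ET.SubElement(body, f"{{{OPC_NS}}}Read")
    item_list = ET.SubElement(read, f"{{{OPC_NS}}}ItemList")
    ET.SubElement(item_list, f"{{{OPC_NS}}}Items", {"ItemName": item_name})
    return ET.tostring(envelope)

request = build_read_request("TK101.LEVEL")  # hypothetical tag name
print(request.decode())
# This body would be POSTed to the gateway URL with an HTTP client such as
# urllib.request, with the SOAPAction header set per the service's WSDL.
```

Because the request is plain XML over HTTP, any ERP web-service consumer can issue it without DCOM configuration, which is precisely advantage 1 above.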

Monday, March 29, 2010

Sensor to Boardroom

Today it is quite simple to transfer data from the sensor to the boardroom to support the decision-making process. This is possible because of:
1. Standardization of protocols in field-level control systems (OPC)
2. Standardization of interfaces in database systems (OLE DB)
3. Major developments in the MES space (the Manufacturing 2.0 initiative)