281289 A Pragmatic Approach to Creating a Digital Information Hub for Plant Asset Information

Wednesday, October 31, 2012: 2:36 PM
327 (Convention Center)
Philip Simon, Global Business Consulting, AVEVA, Houston, TX

PTP Abstract:

Information management used to be, and in some cases still is, synonymous with document management. This is understandable, because documents have always been used to create, manage, and store information in context. However, documents are limited as an information store in their ability to serve up relevant and selective data. With the advent of databases and other data management technologies, it has become possible to create and manage information down to the specific plant component and attribute value. This has created a new paradigm for information management with enormous value over document-based information sources, because information can be presented selectively and for a specific purpose. As a result, companies are slowly warming to the ideal in which all the information needed for their work is available in context, accurate, and presented in a format relevant to the specific user, at the click of a button. This goal carries significant business value, including but not limited to safe and efficient operations and effective configuration control.

However, amidst all the latest information management technologies, a fundamental challenge has also emerged, ironically fueled by information technology itself. On the one hand, every attempt to access current, trusted, and relevant information implicitly assumes that all such information exists, or is known, under one umbrella or centralized system rather than in disparate sources and formats. On the other hand, with the increase in new technologies and best-of-breed, discipline-specific applications, the number of disparate sources and formats of information is increasing rather than decreasing. In addition, in an almost instinctive manner, every system attempts to become the de facto central system for all information, simply because it recognizes how important access to other information is for its end users' work efficiency and presumes that its discipline is clearly the center of the information hub. In this tug-of-war among applications that are best of breed in one discipline yet vie to be the central source for all the rest of the information, two facts emerge: first, information will continue to be created in multiple, disparate sources; second, there is no simple way to bring it all together, with context, in any one system. The second fact poses a serious challenge that manifests itself in the process required to create the data standard and to transform the data to comply with that standard in the information hub. We will look at this challenge in more detail and identify a solution to overcome it.

The process of creating the data standard requires the concerted knowledge, attention, and time of subject matter experts from multiple disciplines, away from their day jobs. Most significantly, it is often done as a theoretical exercise in a vacuum, without visibility into the actual data in the field. Faced with this challenge, vocalized by one Facilities Integrity Manager as “I don't know where to begin to get such a system defined and populated…”, many companies either postpone this lofty goal for a ‘later time’ or go to their shareholders and request a major company undertaking. The challenge would be less immense if we had standards such as those proposed by the Semantic Web or by industry organizations such as Fiatech, POSC Caesar, ISO, EPRI, etc. The reality, though, is that even with the progress made by these bodies, no comprehensive, decisive standard is available to propose for adoption. To add to the complexity, very often in the early phases of a capital project these standards are either yet to be defined or cannot be adhered to because of engineering uncertainty at those early stages. Inconsistencies are expected and are intended to be resolved as the project progresses.

If and when the standard has been defined and approved, there is considerable delay in getting sufficient information validated and loaded into the system. Typically, information that does not comply with the standard is simply rejected, pending further analysis and evaluation by humans to rectify or transform the data. Depending on how closely the standard matches the real information, as much as 80% of the information may be rejected, especially if the standard was developed in a vacuum. A partially loaded system remains a work in progress and is not useful in daily operations. Several iterations of adjusting the standard and transforming the data will need to occur before a critical mass of information becomes available in the centralized system. These activities also require input and attention from the very same subject matter experts, whose time is generally at a premium. All of these factors, including the need to make hard-and-fast decisions without visibility into the impact on real data, make this undertaking time consuming and expensive, often holding up regular work and delaying the rollout of the system.
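
To make the bottleneck concrete, the following is a minimal sketch, not taken from the presentation, with a hypothetical record shape and hypothetical rules, of the conventional validate-and-reject load step described above, in which any record that does not fully satisfy the pre-defined standard is dropped for manual rework:

    # Hypothetical illustration of a "validate and reject" load into the hub.
    from dataclasses import dataclass, field

    @dataclass
    class TagRecord:
        tag: str                                        # e.g. "P-101A"
        tag_class: str                                  # e.g. "CentrifugalPump"
        attributes: dict = field(default_factory=dict)  # attribute name -> value

    # Stand-in for a class library agreed by the subject matter experts.
    STANDARD = {
        "CentrifugalPump": {"DesignPressure", "DesignTemperature", "Manufacturer"},
        "ControlValve": {"ValveSize", "FailPosition", "Manufacturer"},
    }

    def strict_load(records):
        """Accept only fully compliant records; queue everything else for manual rework."""
        accepted, rejected = [], []
        for rec in records:
            required = STANDARD.get(rec.tag_class)
            if required is not None and required.issubset(rec.attributes):
                accepted.append(rec)
            else:
                rejected.append(rec)  # sent back for human rectification
        return accepted, rejected

When the standard is written without sight of the real data, the rejected list can easily dominate, which is exactly the 80% scenario described above.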

One solution to this problem is a system that can accommodate significant uncertainty as it takes a set of inconsistent, disparate sources from a state of relative chaos (whether due to project maturity or to poor data quality) to a state of order, without holding up the work of the business or impacting existing systems. Such a system should provide visibility into the chaos so that appropriate and comprehensive standards can be developed against the real data rather than in a vacuum. It must then be able to progressively enforce those standards and guide the reconciliation of the data. The progressive nature of the process is critical in that it allows the development of the standard and the cleansing of the data to be iterative. A setup of this nature provides immediate returns on the investment while reducing the risk of the undertaking through multiple iterative steps, steadily growing the quality of the information even as the knowledge and the standard mature.
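
As an illustration of this progressive posture, again hypothetical and not the actual design of any product, a load step could accept every record and attach a compliance score against the current draft of the standard, so that the standard itself can be refined against real data on each iteration:

    # Hypothetical illustration of a progressive load: records are plain dicts with
    # "tag", "class" and "attributes" keys; nothing is rejected outright.

    def profile(records):
        """Surface the 'chaos': which classes and attributes actually occur in the sources."""
        seen = {}
        for rec in records:
            seen.setdefault(rec["class"], set()).update(rec["attributes"])
        return seen

    def progressive_load(records, standard):
        """Load every record, attaching a compliance score and the list of gaps."""
        loaded = []
        for rec in records:
            required = standard.get(rec["class"], set())
            missing = required - set(rec["attributes"])
            score = 1.0 if not required else 1.0 - len(missing) / len(required)
            loaded.append({"record": rec, "score": score, "missing": sorted(missing)})
        return loaded

The design choice is that non-compliance is measured and made visible rather than used as a reason for exclusion, so each iteration of the standard can be judged against the whole data set and the average score tracked as the hub matures.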

In this presentation, we will compare and contrast this pragmatic approach with other current approaches, and discuss the key ingredients of a system that can deliver value from the start.


