Friday, July 30, 2010

Plant Intelligence as Glue for Dispersed Data?

Plant Intelligence as Glue

A series of TEC tutorials has examined manufacturing plant-level systems, addressing both the importance of plant-level systems (see The Importance of Plant-Level Systems) and the difficulty of developing systems that integrate the needs of the shop floor and the back office (see The Challenges of Integrating ERPs and MESs). One approach to addressing these challenges is to use plant or manufacturing intelligence systems.

Given the problematic communication between manufacturing execution systems (MES), plant automation, and enterprise applications, a new breed of applications is emerging from the likes of the former Lighthammer (now part of SAP), Kinaxis (formerly Webplan), Activplant, and Informance. They offer middleware analytical applications, called manufacturing intelligence or plant intelligence, that complement other applications used to generate corporate-wide visibility of key performance indicators (KPIs). These plant portal applications consolidate data drawn from a wide range of computing sources (plant floors, enterprise systems, databases, and elsewhere) and organize the data into meaningful, role-based information, aggregating it from disparate sources for analysis and reporting. Connections can be made through extensible markup language (XML) or open database connectivity (ODBC) standards, with communications managed by a protocol layer in the portal's Web server architecture.
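
As a rough illustration of what such a plant portal does under the hood, the following Python sketch pulls quality records over ODBC and machine counters from an XML feed, then merges both into a single supervisor-level view. The connection string, table, column, and XML tag names are hypothetical, as is the supervisor view itself; a real product would configure these through templates rather than code.

```python
# Minimal sketch of a plant-portal style aggregation layer: pull quality data
# over ODBC and machine counters from an XML feed, then merge them into one
# role-based view. Connection strings, table names, and XML tags are
# hypothetical -- real plant intelligence products configure these via templates.
import pyodbc
import xml.etree.ElementTree as ET

def fetch_quality_records(conn_str: str, line_id: str):
    """Read quality results for one production line from a plant database."""
    with pyodbc.connect(conn_str) as conn:
        cursor = conn.cursor()
        cursor.execute(
            "SELECT batch_id, defect_count, sample_size FROM quality_results "
            "WHERE line_id = ?", line_id)
        return [dict(zip(("batch_id", "defects", "samples"), row))
                for row in cursor.fetchall()]

def fetch_machine_counters(xml_payload: str):
    """Parse cycle counters published by plant-floor equipment as XML."""
    root = ET.fromstring(xml_payload)
    return {m.get("id"): int(m.findtext("good_parts", default="0"))
            for m in root.iter("machine")}

def build_supervisor_view(quality, counters):
    """Combine both sources into the KPI summary a line supervisor would see."""
    total_defects = sum(r["defects"] for r in quality)
    total_good = sum(counters.values())
    produced = total_good + total_defects
    return {"good_parts": total_good,
            "defects": total_defects,
            "first_pass_yield": total_good / produced if produced else None}
```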

Near real-time visibility and transactional exchanges have to be created between enterprise applications and the plant floor, with appropriate drill-downs to contextualize and understand the impact of specific manufacturing events. These products are applied to critical plant processes, where they monitor production and provide the input required for calculating key metrics, such as overall equipment effectiveness (OEE). To improve OEE, data generated by equipment on a production line is acquired and aggregated (preferably automatically; see The Why of Data Collection).
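
The OEE metric mentioned above is conventionally defined as availability × performance × quality. A minimal sketch of that calculation from aggregated line data might look as follows; the field names and the worked shift figures are illustrative only.

```python
# A minimal OEE calculation from aggregated equipment data, using the standard
# definition OEE = availability x performance x quality. Field names are
# illustrative; real systems acquire these counters automatically from the line.
from dataclasses import dataclass

@dataclass
class ShiftData:
    planned_time_min: float      # scheduled production time
    downtime_min: float          # unplanned stops
    ideal_cycle_time_min: float  # design cycle time per part
    total_parts: int             # all parts produced
    good_parts: int              # parts passing quality checks

def oee(d: ShiftData) -> float:
    run_time = d.planned_time_min - d.downtime_min
    availability = run_time / d.planned_time_min
    performance = (d.ideal_cycle_time_min * d.total_parts) / run_time
    quality = d.good_parts / d.total_parts
    return availability * performance * quality

# Example: 480 min shift, 45 min downtime, 0.5 min ideal cycle, 800 parts, 760 good
print(round(oee(ShiftData(480, 45, 0.5, 800, 760)), 3))  # ~0.792
```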

Information is contextualized using business rules and user roles to create and maintain consistent functional and operational relationships between data elements from these disparate sources. For example, these products can demonstrate the relationship between allowed process variables and ranges of time series-based quality and yield data. They can also analyze information by using business rules to transform raw process data into meaningful KPIs. Data can be filtered for noise and outliers; visualized with context-based navigation and drill-down capabilities; and presented or propagated to determine the factors and root-cause disturbances that slow production or impact quality. Ultimately, plant-level systems allow decisions to be made that speed up throughput and increase first-run production.
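
A minimal sketch of this kind of rule-based contextualization, assuming hypothetical process limits and a simple median-based noise filter, might look like this: raw temperature readings are cleaned of outliers, checked against the allowed range for the step, and rolled up into an in-specification ratio KPI.

```python
# Sketch of contextualizing raw process data with simple business rules:
# filter outlier readings, check values against the allowed range, and roll
# the results up into an in-spec KPI. The limits and tag names are assumptions,
# not a vendor data model.
from statistics import median

ALLOWED_RANGE = {"oven_temp_C": (180.0, 220.0)}   # hypothetical process limits

def drop_outliers(readings, cutoff: float = 3.5):
    """Discard readings whose modified z-score (based on the median absolute
    deviation) exceeds the cutoff -- a common robust noise filter."""
    if len(readings) < 3:
        return list(readings)
    med = median(readings)
    mad = median(abs(r - med) for r in readings)
    if mad == 0:
        return list(readings)
    return [r for r in readings if 0.6745 * abs(r - med) / mad <= cutoff]

def in_spec_ratio(tag: str, readings) -> float:
    """KPI: fraction of (cleaned) readings inside the allowed process range."""
    low, high = ALLOWED_RANGE[tag]
    cleaned = drop_outliers(readings)
    return sum(low <= r <= high for r in cleaned) / len(cleaned)

print(in_spec_ratio("oven_temp_C", [199, 201, 203, 500, 185, 240]))  # 0.8
```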

How It Works

Configuring data sources for integration can be done through templates, much as one selects a printer in an office application. The real trick, however, is in having sound plant-level models within the application sets: frameworks that portray accurate plant-level context and data management. In manufacturing, even small changes to a master plan can create a so-called "reality gap." Historically, these gaps are addressed by last-minute panicking and scrambling, while the business protagonists are not always (if ever) conscious of the impact, or even the validity, of their "educated guess" decisions.
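
To make the printer analogy concrete, the sketch below shows what template-driven source configuration could look like: each source type is a reusable template, and adding a new plant system is just a matter of filling in a few parameters. The source types and parameters are illustrative assumptions, not any vendor's actual template set.

```python
# Sketch of template-driven data source configuration: each source type is a
# reusable template, so wiring up a new plant system is just filling in a few
# parameters -- much like picking a printer from a list.
from typing import Callable, Dict

SOURCE_TEMPLATES: Dict[str, Callable[..., dict]] = {
    "odbc":     lambda dsn, table: {"kind": "odbc", "dsn": dsn, "table": table},
    "opc":      lambda server, tags: {"kind": "opc", "server": server, "tags": tags},
    "xml_feed": lambda url, poll_s=60: {"kind": "xml", "url": url, "poll_s": poll_s},
}

def configure_source(template_name: str, **params) -> dict:
    """Instantiate a data source definition from a named template."""
    return SOURCE_TEMPLATES[template_name](**params)

# Registering two plant-floor sources against the same templates
sources = [
    configure_source("odbc", dsn="QualityDB", table="quality_results"),
    configure_source("opc", server="opc.tcp://line1:4840", tags=["good_parts"]),
]
```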

Thus, these new software applications make it possible to model the cascading consequences of anything users do in response to an unplanned event (like a customer doubling an order or a machine breaking down), which in turn makes it possible to understand how the other, intertwined parts of the user organization and supply chain will be impacted by a change (see Bridging the Reality Gap Between Planning and Execution).
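
A toy sketch of such cascading-consequence modeling, using a hypothetical dependency graph of downstream operations, is shown below; real products would of course model capacity, inventory, and schedules rather than a simple worst-case delay pass-through.

```python
# Toy "reality gap" analysis: propagate the delay caused by an unplanned event
# (here, a machine breakdown) through a dependency graph of downstream
# operations, so the knock-on impact on orders is visible instead of guessed at.
# The graph and delays are illustrative assumptions.
from collections import deque

# operation -> operations that depend on it
DOWNSTREAM = {
    "press_line_2": ["assembly_cell_A"],
    "assembly_cell_A": ["packaging", "order_4711_ship"],
    "packaging": ["order_4712_ship"],
}

def propagate_delay(start: str, delay_hours: float) -> dict:
    """Return the cumulative delay each downstream step inherits from `start`."""
    impact = {start: delay_hours}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for nxt in DOWNSTREAM.get(node, []):
            if impact[node] > impact.get(nxt, 0.0):
                impact[nxt] = impact[node]   # simple worst-case pass-through
                queue.append(nxt)
    return impact

print(propagate_delay("press_line_2", 6.0))
# e.g. {'press_line_2': 6.0, 'assembly_cell_A': 6.0, 'packaging': 6.0, ...}
```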

When users have information about unplanned events and about how their responses will impact the company, they have manufacturing intelligence that can guide them through the forking paths of exception-based decision-making. The value of plant-level information indeed changes when enterprises use it to support higher-level strategic and tactical business processes. For example, data generated for a department supervisor or for management purposes has one value, and the same data used for Sarbanes-Oxley Act (SOX) compliance has another (see Attributes of Sarbanes-Oxley Tool Sets). Moreover, the value of quality assurance (QA) information increases substantially when it is used to support enterprise-wide warranty issues.

Inventory information takes on a different look when viewed across an entire supply chain with synchronized schedules that are based on real demand. This greater value comes from moving from the data-centric view of separate manufacturing applications to a process-centric view of integrated systems that can support higher-impact company processes. Manufacturing intelligence cannot work without its backbone enterprise resource planning (ERP) and MES systems, where all the transactional information resides; but ultimately, the actual mechanism for pushing changes through this backbone is manufacturing intelligence. It is therefore necessary to have a way to address both the planned and unplanned components of manufacturing in the same extended system.

User Recommendations

Examples of the potential benefits of intrinsically integrating ERP with the plant floor and of achieving near real-time information are numerous, and can be seen in The Importance of Plant-Level Systems and The Challenges of Integrating ERPs and MESs.

Common issues pertaining to interfaces in a best-of-breed environment might be mitigated (if not completely eliminated) with applications such as SAP's composite application SAP xApp Manufacturing Integration and Intelligence (SAP xMII, formerly Lighthammer CMS [Collaborative Manufacturing Suite]) or Oracle's innovative approach seen in Oracle Discoverer and Daily Business Intelligence. In Oracle's case, the application can seamlessly integrate performance metrics with transactional data, incrementally updating performance metrics as transactions occur. When the data is stored in the same database, there is no need to create and manage ungainly interfaces, because there is only one master application. Data visibility becomes inherent, since, by using the proper links, data can be gathered and disseminated in multiple ways without delay.

Yet, in most cases, the multiple databases on the shop floor (quality management data, production and warehousing real-time transactions, plant maintenance data, ERP master data, etc.) are rarely in sync, making timely decision-making difficult and often inaccurate. This holds true any time information is kept in more than one location: without a highly advanced method of synchronization, the chances of having accurate data stored in more than one location are small indeed.

If data is only synchronized in batch mode daily, or even by shift, managers have a difficult time making timely, accurate decisions, and this impacts all functions, such as production planning, shipping, inventory control, and purchasing. It also handicaps customer service representatives as they attempt to answer customer requests about the status of their orders. In the worst cases, some data is never synchronized to the master ERP system, which creates a serious communication void and promotes the dreaded "islands of automation".
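
The staleness problem can be made concrete with a small sketch: if each source records its last successful synchronization, anything older than the decision window can at least be flagged as suspect. The source names and timestamps below are hypothetical.

```python
# Sketch illustrating why batch-mode synchronization hurts decisions: flag any
# source whose last successful sync is older than the decision window, so a
# planner at least knows which numbers are stale.
from datetime import datetime, timedelta

LAST_SYNC = {
    "erp_master_data":       datetime(2010, 7, 30, 6, 0),    # nightly batch
    "quality_db":            datetime(2010, 7, 30, 14, 0),   # per shift
    "plant_floor_historian": datetime(2010, 7, 30, 15, 58),  # near real time
}

def stale_sources(now: datetime, max_age: timedelta):
    """Names of sources whose data is older than the allowed decision window."""
    return [name for name, ts in LAST_SYNC.items() if now - ts > max_age]

print(stale_sources(datetime(2010, 7, 30, 16, 0), timedelta(hours=1)))
# ['erp_master_data', 'quality_db'] -- decisions based on these are hours old
```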

Users should, however, understand that the current generation of plant intelligence software mainly brings visibility, whether through composite applications or through a single vendor natively providing all the functional pieces. Its biggest asset is that it features real-time event detection, workflow management, and interoperability. Nonetheless, it does not yet produce the true intelligence required for trustworthy corporate performance management (CPM). These solutions often still lack business context and sophisticated production scenario analysis, as delivered by out-of-the-box data models for certain industries: for example, how to unlock constrained capacity, improve quality, reduce the cost of more frequent and lengthy changeovers, and improve profit margins. They may not have data life cycle management tools either, which allow for data reconciliation, data modeling, data caching, yield accounting applications, and so on. Analytics that add intelligence to the near real-time data that enterprise systems can access through the portal toolsets may also be absent.

Thus, while the likes of SAP xMII can connect to and use real-time data, many companies with intricate manufacturing plant-level processes will still, at least for the foreseeable future, need to leverage multi-vendor, best-of-breed solutions to create their own data management discipline and plant-level intelligence. They will also likely have to invest a lot of their own skills, experience, and intellectual property in constructing their manufacturing performance management systems.

When selecting these applications, in addition to using customary checklists of Web enablement and platform support, users should ensure that the products use industry-standard frameworks, interfaces, and terminology, such as business to manufacturing markup language (B2MML), XML/Web services, ISA-95 and ISA-88 (the latter for batch process industries), and Microsoft's Component Object Model (COM) and OPC for application interfacing. The candidate products should also feature documented, "out-of-the-box" interfaces to popular ERP and plant automation applications (such as SAP, Oracle, Siemens, ABB, Rockwell, etc.) and to other control systems such as distributed control systems (DCS) and programmable logic controllers (PLC). Given the dominant mantra of "intelligence", these products should logically include a strong and intuitive reporting function (ideally based on an "open" or pervasive product such as Business Objects' Crystal) and provide an integrated workflow management engine to model the production process.
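
As a small illustration of what consuming an ISA-95 style interface involves, the sketch below parses a simplified, B2MML-like production performance snippet and extracts the reported quantities. The XML is a hand-written simplification for illustration, not a schema-valid B2MML document.

```python
# Minimal illustration of consuming an ISA-95 style message: parse a simplified,
# B2MML-like production performance snippet and extract reported quantities.
import xml.etree.ElementTree as ET

SAMPLE = """
<ProductionPerformance>
  <ProductionResponse>
    <SegmentResponse>
      <ProductionData><Id>GoodQuantity</Id><Value>760</Value></ProductionData>
      <ProductionData><Id>ScrapQuantity</Id><Value>40</Value></ProductionData>
    </SegmentResponse>
  </ProductionResponse>
</ProductionPerformance>
"""

def read_quantities(xml_text: str) -> dict:
    """Map each reported production data item to its numeric value."""
    root = ET.fromstring(xml_text)
    return {d.findtext("Id"): float(d.findtext("Value"))
            for d in root.iter("ProductionData")}

print(read_quantities(SAMPLE))  # {'GoodQuantity': 760.0, 'ScrapQuantity': 40.0}
```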

SOURCE:
http://www.technologyevaluation.com/research/articles/plant-intelligence-as-glue-for-dispersed-data-18318/
