Thursday, August 19, 2010

Mainstream Enterprise Vendors Begin to Grasp Content Management Part Three: Challenges

Can PCM Address the Challenges?

With buyers keen on seeing payback for their investments in e-procurement and PTX, and with suppliers being urged to participate in online sales channels, in great part by distributors and aggregators that have constructed their own e-commerce and collaboration platforms (and that repackage the product data with catalogs from many like suppliers and resell the content to companies undertaking e-procurement initiatives), PCM vendors have naturally emerged from many sides.

Yet the all-encompassing content management solution is still in the ever-evolving design stage, as vendors try to piece together comprehensive systems. Therefore, as mentioned earlier, there has been a proliferation of (and subsequent confusion about) pertinent terms like enterprise content management (ECM), product content management (PCM), catalog management, product information management (PIM), records management (RM), product data management (PDM), enterprise data repositories (EDR), document management (DM), knowledge management (KM), web content management (WCM), digital asset management (DAM), enterprise information management (EIM), digital rights management (DRM), document imaging, workflow management (WM), business process management (BPM), and more.

As also said earlier, generally speaking, PCM or PIM refers to a system for managing all types of information about finished products; it is a further evolutionary step of catalog content management, backed by workflow management. This is different from ECM, which focuses more on document management and on unstructured editorial and web content, whereas PCM is more granular around individual data elements and focuses on highly structured product content. ECM encompasses many of the above-cited technologies used to capture, manage, store, preserve, and deliver content and documents related to organizational processes. In other words, it allows the management of an organization's unstructured information (e.g., e-mails, photos, spreadsheets, documents, etc.), wherever that information exists, whether stored in repositories, shuttled across networks, or managed over the course of its existence or life cycle.

However, regardless of name and purpose, all of the above software categories aim at the same goal: being the sole trustworthy, master source of product information for the enterprise. Still, the business purposes for such systems fall roughly into the following three groups:

1. Product information related to the design, development, and introduction of products. This belongs to PLM and to its subset PDM, which captures and manages product data generated during the product design and development process.

2. Information related to the buying (procurement) or selling side of product e-commerce, where PIM systems come into play later in the product life cycle, after a product is manufactured and introduced to the market. PIM vendors aim to help manufacturers and distributors create centralized product information repositories that can be used for multiple purposes by people in various roles throughout the company and the supply chain.

3. The information about already owned products, equipment, and facilities, that is, assets.

Yet, to conduct collaborative processes, businesses need embedded intelligence, and business intelligence (BI) or analytics applications focused on structured data offer only part of the total solution. In other words, businesses also need content management for unstructured data and content, which can contain the majority of business information, given that many decision makers collaborate via e-mail or voicemail, vast stores of unstructured information that currently reside outside of business processes and beyond the reach of ERP and BI systems. Also, while PIM addresses data synchronization, ECM, bundled with strong storage management systems, has been bolstered by abounding record retention regulations like the Sarbanes-Oxley Act (SOX), the Health Insurance Portability and Accountability Act (HIPAA), and the Department of Defense Requirements for Records Management Applications (DoD 5015.2). Owing to well-publicized accounting and document-shredding scandals at the likes of Enron, enterprises must be capable of retrieving unstructured data quickly, at least in case of a court appearance.

This is part three of a three-part note.

Part one defined PCM system attributes.

Part two presented background information and lessons learned.

Unified System Needed

Therefore, ideally, one unified system of record should support all of the above purposes, as it would thereby vouch for efficiency, accuracy, control, and agility. For example, while there are some compelling reasons to deploy PIM in isolation, the real benefits come when PIM complements PLM in an integrated fashion during the release (introduction) of a product's attributes and specifications to the market, a step that is often treated separately, as an addendum, and handled by marketing or sales folks working with isolated spreadsheets or custom databases. By contrast, the majority of current UCCnet data synchronization solutions focus on processes such as aggregating product data in a product catalog designed to meet UCCnet's specifications, encapsulating the data in the correct message format, and establishing connectivity for data synchronization with the retailer through UCCnet or via third-party e-commerce network services providers (e.g., Transora or the Worldwide Retail Exchange [WWRE]).

While these solutions may address the basic requirements for UCCnet compliance, they fail to address the fundamental business issue of having all parties operate on accurate, current data, rather than merely on the same (but possibly outdated) data. Hence, enterprises should expand their UCCnet compliance efforts to include product data integrity processes, enabled by PLM tools, that ensure there is a single version of the truth throughout the extended enterprise and that changes are reflected in all systems immediately, improving data quality and the effectiveness of internal and external business processes. Only then will the data shared with business partners be accurate, which is the true value, given that synchronizing bad data will still result in fines from the likes of Wal-Mart.

Because of these same initiatives, PLM vendors are also increasingly being asked by their customers to include more commercially oriented product information in their PLM systems. A few high-profile case studies have shown how the data in a PLM system can serve as the source of valid, consistent, and up-to-date product information for synchronization and syndication to supply chain partners. However, most PLM systems lack a PIM system's ability to perform secure, trusted synchronization of information to data pools like UCCnet. For more information, see The Role of PIM and PLM in the Product Information Supply Chain: Where Is Your Link?

Enterprise Vendors Move to ECM

While the notion of a single source for all product content remains a tall order at this stage, given a highly distributed, multi-application, heterogeneous environment, it also remains a sensible vision to pursue. Namely, the desire to reach unstructured sources too (e.g., paper records, faxes, electronic documents, web pages, e-mails, multimedia objects, etc.) and pull them closer to collaborative business processes will likely drive ERP vendors to pull ECM into their domain, in a manner similar to their encroachment into the land of BI and PLM (see BI Approaches of Enterprise Software Vendors and PLM Coming of Age: ERP Vendors Take Notice).

True, such a single-source PCM or ECM system would challenge many notions about what is structured and what is unstructured information, given that it would have to cover a wide variety of information formats and types, whereby the information has to be granular and systematically ascribed in order to serve various audiences and to integrate or synchronize with other systems. Further, not only does the classification differ across vertical industries, but so do the taxonomy and protocol standards that allow companies to do business with one another. For that reason, most pure-play PCM vendors have been very industry-specific, managing to survive on a large percentage of consulting services revenue versus software license fees. Recently, though, some of these providers have been expanding into versatile, scalable systems with the ability to work across industries, vertical markets, and business functions.

On the other hand, ECM vendors, the likes of Documentum (now a part of EMC), FileNet, Stellent, Vignette, and Open Text to name some, are used effectively as repositories for PCM, most often in design, engineering, or other PLM roles. But the repository functions of content management are quickly being commoditized, since the standard DBMSs from Oracle, IBM, and Microsoft will soon offer support for functions like storing, versioning, and tracking most kinds of content, although the ECM vendors will respond by adding value on top of their repositories, again offering vertical solutions and support for key business processes. The particular danger might be coming from IBM, which has not been hiding its interest in the ECM market either, with a slew of recent acquisitions, including Tarian in 2002 for records management; Aptrix and Green Pastures in 2003 for WCM and for collaborative authoring and version control, respectively; and most recently Venetica for content integration. The potential for new business and a broadened offering (e.g., combining portal frameworks, ECM components, storage systems, etc.) has prompted several intra-market acquisitions, the most notable being Documentum's acquisition of the former askOnce's search technology (before it was acquired by EMC), Vignette's acquisition of TOWER Software, and Open Text's acquisition of IXOS Software.

Therefore, as many companies are already realizing value from the recent convergence and consolidation of BI with ERP and of PLM with ERP (see Has Consolidation Made the PLM Market More Agile? and BI Market Consolidation Compared to ERP Market Consolidation), the eventual convergence of ECM and PCM with ERP should also yield benefits, allowing companies to take advantage of knowledge and content hidden within enormous, untapped pools of unstructured content.

The likes of SAP will gladly serve as a convergence point that brings ECM within collaborative business processes. Even now, the SAP Records Management (SAP RM) component of SAP NetWeaver offers more than tools, as it includes predefined, industry-specific records management layouts and scenarios, such as SAP Public Sector Records Management, mySAP Financials Dispute Management, and mySAP CRM Case Management. Also, SAP's business processes (or even processes outside SAP applications) can share content gathered at any SAP NetWeaver layer through open standards. For example, participants can publish BI content in the SAP Knowledge Management (SAP KM) repository, which then enables those KM objects, along with the documents associated with BI, business processes, and product models, to exploit all SAP KM services, like subscriptions, discussions, collaboration, information aggregation, and so on. Also, through the collaboration engine, cFolders, users can make the analytic model available to participating teams without requiring them to open an SAP Business Warehouse (SAP BW) developer workbench or to study the metadata or databases.

On the other hand, a flexible workflow management facility, which is instrumental in allowing users to build collaborative processes that reflect their unique product development issues, is another critical underlying PLM technology element that Oracle has long mastered and offered as a stand-alone option to Oracle applications users. BI or analytics also goes without saying, given that Oracle has sold stand-alone OLAP tools for years. Oracle has also been making strides in document management and ECM, with a project code-named Tsunami that is slated for the end of 2004, which will provide a major upgrade to Oracle Collaboration Suite and will also come in handy for managing unstructured data within PLM systems. Oracle has already helped some customers with the basic document management functions of Oracle Application Server 10g, which features the Oracle Content Management SDK (software development kit). Formerly known as Internet File System (iFS), the SDK is a collection of tools that handles tasks such as managing multiple versions of a document, check-in and check-out capabilities, and so on. Thus, Oracle's moves seem to counteract those of Microsoft, whose SharePoint document-sharing and collaboration technology has been bundled within recent server versions of the Windows operating system, and which will eventually morph into WinFS (Windows File System) within the next version of Windows, code-named Longhorn.

Manufacturers' Perspective

From the perspective of manufacturers, data synchronization is an important part of maintaining good relationships with retailers like Wal-Mart, and one solution is internal data integration via a catalog product such as xCat, offered by the former A2i (now part of SAP MDM), with GDS capability expected in late 2004. However, for the above ECM or PCM "bigger picture" discussion, SAP is interested not only in GDS, but also in the greater value of PCM, with tie-ins for trade promotions management, marketing resource management, financial applications, CRM applications, solutions dealing with product design, and even sourcing and procurement. None of these can be dealt with successfully without knowing what one is buying and from whom. However, without PCM functionality delivered in unison with PLM solutions, changes to product information outside of the catalog do not automatically update the catalog, even though PLM should control product content throughout its life cycle, ensuring that all product content is current and accurate wherever it exists within the organization and its supply chain.

As said earlier, one of the reasons why e-commerce was slow to take off was that companies did not have the product content and the publishing tools to make it useful on an ongoing basis (i.e., through the ability to continually change and modify offerings and to accommodate alterations for different markets). Collecting content is difficult and expensive, given that enterprises need not only a process for collecting content but also the tools in which to place and structure the content properly. Poorly structured content cannot be published to paper, and cannot be web-enabled either. That has been A2i's focus: creating a system that allows users to structure content in a way that it can be used repeatedly. Further, since it is such an expensive process, users need to be able to leverage that investment across multiple media and to publish to the web in a fast, rich, and searchable way, in terms of not just transactional data but also the parametric information that makes it possible to search by product relationship.



SOURCE:
http://www.technologyevaluation.com/research/articles/mainstream-enterprise-vendors-begin-to-grasp-content-management-part-three-challenges-17615/

More Data is Going to the Cleaners

Event Summary

"WESTBORO, Mass., November 29, 1999 - Ardent Software, Inc. (Nasdaq: ARDT), a leading global data management software company, today announced a strategic partnership with Firstlogic, Inc., the developer of i.d.Centric data quality software that helps companies cleanse and consolidate data in database marketing, data warehousing, and e-business applications. Under the partnership agreement Firstlogic will develop and support a link between its industry-leading customer data quality tools and Ardent's DataStage Suite."

"Organizations implementing complex database marketing programs and e-business strategies depend on extensive data quality assurance. Together Ardent and Firstlogic are meeting this market need," said Mikael Wipperfeld, vice president of data warehouse marketing at Ardent Software. "This partnership allows our joint customers to take advantage of Firstogic's address verification, name parsing and extensive matching and consolidation capabilities inside the DataStage suite, the most comprehensive Business Intelligence (BI) Infrastructure solution available."

"As data has become increasingly strategic to corporate growth, monitoring and maintaining the highest level of data quality are essential," said Art Petty, Firstlogic vice president of marketing. "Our customer data cleansing and matching tools work with Ardent's DataStage Quality Assurance solutions to provide powerful, complete data quality management." .

Market Impact

As we predicted in "Data Warehouse Vendors Moving Towards Application Suites" (September 29, 1999) and "Oracle Buys Carleton Corporation to Enhance Warehouse Offering" (November 10, 1999), data cleansing tool vendors continue to be acquired or partnered with, and we expect this trend to continue. Ardent now has very strong data cleansing capabilities: it already had cleansing software acquired with the purchase of Prism Solutions (Prism had previously purchased QDB Solutions, creator of QDB/Analyze, a tool for complex data cleansing), and this partnership will give it additional capabilities.



SOURCE:
http://www.technologyevaluation.com/research/articles/more-data-is-going-to-the-cleaners-15471/

Continuous Data Quality Management: The Cornerstone of Zero-Latency Business Analytics Part 2: One Solution

Implementing A CDQM Application

It is impossible to improve that which cannot be measured. A CDQM (Continuous Data Quality Management) tool provides a real-time, up-to-date scorecard to measure data quality within the enterprise. By checking data quality in real-time, "data fires" can be detected when they are just starting, before any real damage has occurred. Most enterprises fight fires with axes, fire hoses, trucks, and hordes of firemen, but the CDQM approach is a smoke detector. It's far less expensive to put a fire out when it's just smoldering, rather than to extinguish a blazing house fire and then remodel the entire house.

Data quality must be a constant commitment. Most companies, when implementing a data quality initiative, look at it as a massive data-cleansing project that scrubs data as part of a system upgrade or new system implementation. That approach is a lot like taking a shower at the beginning of the month and saying, "Now I'm clean!"

Without metrics and constant measurement, there is no way to verify data quality on an ongoing basis and keep the data in the system clean. And if the quality of data is not fully understood, one cannot be confident in the decisions made based upon that data. Considering that US enterprises are expected to spend upwards of $22B by 2005 on business intelligence initiatives, doesn't it make sense to implement a system to track the quality of that data?

This is Part Two of a two-part article on the importance of maintaining data quality, based on the author's experience.

Part One defined the problem of maintaining data quality to an enterprise.

When Good Data Goes Bad

The classic "garbage in, garbage out" scenario becomes all too real when there are quality problems with the data on which important decisions are based. At Metagenix, we like to tell the story of a fictional electronic parts manufacturing company we call Huntington Corp. We pieced together Huntington from several real-world companies whose data quality issues almost did them in, and whom, for obvious reasons, we cannot name. We like to tell the Huntington story because it so clearly illustrates the need for a continuous data quality tool. Take, for example, the data issues that arose when Huntington acquired rival company, Systron, and began the process of combining the two companies' customer accounts into one customer database. What started as an innocent attempt to merge this data became a mess that almost spiraled out of control.

The problem began with the two companies' customer account numbers: Huntington's were eight digits long; Systron's were ten alphanumeric characters long. When Huntington's IT department began merging the account data, it decided that Huntington's eight-digit account number would be the default format. All of Systron's accounts were converted to the eight-digit format, changing all alpha characters to zeroes and truncating the ten-character format to eight, eliminating the validity of Systron account numbers in one stroke, with all the attendant downstream angst and confusion for employees and customers. Having a continuous data quality tool in place during this ETL process would have prevented this data nightmare from occurring.

Another problem occurred during the Systron integration when IT combined the two companies' parts and inventory tables. As with the customer account number fiasco, the formats clashed: Systron used alphanumeric part numbers in relational tables from each of its divisions, while Huntington had only one master table, with numeric part listings. When Huntington's IT department combined Systron's multiple alphanumeric tables into Huntington's master table, it failed to include the system division number assigned to each division table in the new Huntington part number. As a result, customers calling to order Systron parts and familiar with a Systron part number were befuddled by Huntington's new part numbers, a disaster not fixed until it was called to IT's attention.

Other data quality issues arose even before the Systron acquisition. Huntington's accounting department was still batch processing journal and other general ledger entries on a nightly and weekly basis. Over time, Huntington's accounting department was spending more and more time researching and reconciling erroneous entries. Though Huntington Corp. accounts were clearly defined in a chart of accounts by department and type of account, the general ledger and other systems were allowing invalid account numbers, thus causing subsequent delays in invoicing and payment processing. A continuous data quality tool set up to check business rules on account numbers would have ensured that account entries were going to valid account numbers.

The same type of problem happened when Huntington installed a new CRM software system. While customer service representatives were doing their best to get the customer data needed to build effective communications with the company's customers, they were frequently not capturing addresses and telephone numbers. When Huntington's marketing or customer relations departments decided to conduct campaigns or send follow-up messages to customers, this customer contact data was missing—unbeknownst to these departments and to the detriment of these campaigns and messages. A continuous data quality tool would have produced an error report by detecting that contact information was not being captured, thereby allowing this information to be obtained in time for these customer communications.

The downside of the data quality issues in these situations is fairly obvious: poor data quality negatively impacts the value of the data used to support decision making and operations. From something as simple as an incorrect general ledger account that throws off accounts payable analysis and reconciliation, to a missing e-mail address that results in failure to notify a multi-million dollar customer of a backorder, and everything in between, data quality plays a crucial role in the support structure of today's business.

The Solution

Businesses clearly need a framework for implementing and monitoring data quality on a continuous basis as part of any business intelligence initiative. At Metagenix, we have developed such a framework: a continuous data quality management tool that works in concert with enterprise business intelligence and database systems. Built with an interface as easy to use as a simple address checker, yet developed for robust enterprise use across departments, data warehouses, and silos, Metagenix's CDQM tool works by applying specific business rules as a continuous check of data quality. The CDQM framework consists of several interconnected processes, which we have outlined below.

Business Rules Capture Repository and Interface

This system provides a meta-data repository to capture business rules and knowledge about data across the enterprise. Business rule assertions are expressed as algorithms and functions that indicate the validity of data. These assertions can be in the context of a field, a virtual record, or an entire data source. One example of a rule is that zip codes must be five or nine digits for US addresses, and must match the expected city and state fields. Another example is that the schema for the Customers table is not expected to change.
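
As a rough, hypothetical sketch (not the Metagenix repository itself), a field-level rule assertion of the kind described above can be expressed as a small validation function; the rule names, field names, and the toy zip-to-state lookup below are invented for illustration:

    import re

    # Hypothetical registry of field-level rule assertions; each returns True when the data is valid.
    RULES = {}

    def rule(name):
        def register(fn):
            RULES[name] = fn
            return fn
        return register

    @rule("us_zip_format")
    def us_zip_format(record):
        # US zip codes must be five or nine digits (ZIP+4 written as 12345-6789 or 123456789).
        return bool(re.fullmatch(r"\d{5}(-?\d{4})?", record.get("zip", "")))

    @rule("zip_matches_state")
    def zip_matches_state(record):
        # Toy prefix lookup standing in for a real zip-to-city/state reference table.
        zip_prefix_to_state = {"100": "NY", "606": "IL", "941": "CA"}
        expected = zip_prefix_to_state.get(record.get("zip", "")[:3])
        return expected is None or expected == record.get("state")

    def check(record):
        """Return the names of the rules that the record violates."""
        return [name for name, fn in RULES.items() if not fn(record)]

    print(check({"zip": "10001", "state": "NY"}))   # []
    print(check({"zip": "ABCDE", "state": "NY"}))   # ['us_zip_format']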

Metrics Repository

This tracks the results of processes within the CDQM system and provides historical information. A complete data quality scorecard can be constructed based upon the information stored in the Metrics Repository. The interface allows a user to view the results, and slice and dice the results data. Depending upon the job function of the user, the interface will provide different views of the scorecard. For example, the CIO might be interested in which systems are generating the most quality problems, while a DBA might be interested in table schema changes that have occurred.

Event Processor

This is an object interface that allows external applications to communicate events of special interest to the CDQM framework. For example, an ETL job might inform the system that a transfer of a file of 225,003 records, time-stamped from yesterday, took place at 12:03 AM into the CUSTOMERS table and took 14 seconds.

The CDQM framework could then be used to track execution speeds, check sums on the data, and check the timeliness of the transfers. Problems such as loading the same file twice could be immediately recognized. Likewise, recognizing an incremental increase in transfer times over the last month could spur investigation into potential difficulties in the ETL process. The Event Processor is not limited to monitoring data movements; external applications could signal a variety of events to be tracked as part of the data quality monitoring effort.
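
A minimal sketch of how an external load job might signal such an event, and how a double load could be flagged; the class names, fields, and checks are assumptions for illustration, not the actual Metagenix object interface:

    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class LoadEvent:
        # Fields mirror the example above: target table, record count, and elapsed seconds.
        table: str
        record_count: int
        elapsed_seconds: float
        occurred_at: datetime = field(default_factory=datetime.now)

    class EventProcessor:
        """Collects events signaled by external applications so the framework can track them."""
        def __init__(self):
            self.events = []

        def signal(self, event):
            self.events.append(event)
            # Example check: flag a possible double load of the same table/file on the same day.
            same_day = [e for e in self.events
                        if e.table == event.table
                        and e.record_count == event.record_count
                        and e.occurred_at.date() == event.occurred_at.date()]
            if len(same_day) > 1:
                print(f"WARNING: {event.table} may have been loaded twice today")

    processor = EventProcessor()
    processor.signal(LoadEvent(table="CUSTOMERS", record_count=225_003, elapsed_seconds=14))
    processor.signal(LoadEvent(table="CUSTOMERS", record_count=225_003, elapsed_seconds=15))  # flagged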

Transaction Server

This system allows an enterprise to centralize data validation. Instead of multiple applications each implementing their own validation logic, a central set of business rules is used to determine the validity of a datagram.

External applications use an object interface to transmit data to the Transaction Server, which determines the validity of the data according to the rules stored in the repository and returns a result indicating potential problems with the data. At the same time, the Transaction Server updates the Metrics Repository with the available information about the transaction. Thus, decision makers can determine the sources and causes of faulty data and adjust the business processes accordingly. Just as easily, implementing new business rules merely requires adjusting the meta-data in the Business Rules Capture Repository, rather than recoding potentially hundreds of applications that deal with data in a slightly different fashion.
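
A simplified sketch of that idea: applications submit records to one central service, which applies a shared rule set and records pass/fail counts per source for the scorecard. The rule functions and source names are invented; this is not the actual Transaction Server interface:

    from collections import defaultdict

    def required_fields_rule(record):
        # Both fields must be present and non-empty.
        return all(record.get(f) for f in ("account_number", "amount"))

    def positive_amount_rule(record):
        return isinstance(record.get("amount"), (int, float)) and record["amount"] > 0

    class TransactionServer:
        def __init__(self, rules):
            self.rules = rules                               # central rule set shared by every application
            self.metrics = defaultdict(lambda: {"passed": 0, "failed": 0})

        def validate(self, source, record):
            failures = [r.__name__ for r in self.rules if not r(record)]
            self.metrics[source]["failed" if failures else "passed"] += 1   # feeds the scorecard
            return failures

    server = TransactionServer([required_fields_rule, positive_amount_rule])
    print(server.validate("order_entry", {"account_number": "12345678", "amount": 250.0}))  # []
    print(server.validate("crm", {"account_number": "", "amount": -5}))                     # both rules fail
    print(dict(server.metrics))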

Rules Checker

The Rules Checker mirrors the Transaction Server, but operates on a macro level. Instead of providing a real-time service, the Rules Checker is run periodically to verify compliance with the business rules against a variety of data sources. For instance, the Rules Checker might be run every night against all records in the order entry database to verify that all orders reference part numbers that are in the catalog. Another example would be a nightly check that the domain of the Customers->Type field matches what was expected based on the values stored in the repository.

The Rules Checker updates the Metrics Repository, and can also generate events such as an email to a responsible manager when certain rules are violated. Imagine being able to come into the office each morning and receive an email indicating which records were loaded incorrectly last night and what's wrong with them!
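
A sketch of such a nightly batch check, using an in-memory SQLite database as a stand-in data source; the table and column names are invented for the example:

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE catalog (part_number TEXT PRIMARY KEY);
        CREATE TABLE orders (order_id INTEGER, part_number TEXT);
        INSERT INTO catalog VALUES ('A-100'), ('A-200');
        INSERT INTO orders VALUES (1, 'A-100'), (2, 'A-999');
    """)

    # Find orders that reference part numbers missing from the catalog.
    violations = con.execute("""
        SELECT o.order_id, o.part_number
        FROM orders o LEFT JOIN catalog c ON o.part_number = c.part_number
        WHERE c.part_number IS NULL
    """).fetchall()

    if violations:
        # In a real deployment this result would update the metrics repository and trigger an e-mail.
        print("Orders with unknown part numbers:", violations)   # [(2, 'A-999')]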

Scheduler

The Scheduler activates the Rules Checker processes according to schedules and dependencies determined by the user. For example, a user could specify that the check of the master customer name file should be run every night, immediately following the successful completion of an ETL job.
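
A naive sketch of that dependency rule, where the customer-name check runs only after the ETL job it depends on reports success; the job names and functions are hypothetical:

    def etl_load_customer_names():
        print("ETL load complete")
        return True   # report success

    def check_customer_name_file():
        print("Running rules check on master customer name file")
        return True

    # Each job maps to (callable, list of jobs it depends on); dict order stands in for the nightly order.
    jobs = {
        "etl_load_customer_names": (etl_load_customer_names, []),
        "check_customer_name_file": (check_customer_name_file, ["etl_load_customer_names"]),
    }

    def run_nightly(jobs):
        completed = set()
        for name, (fn, deps) in jobs.items():
            if all(d in completed for d in deps):
                if fn():
                    completed.add(name)
            else:
                print(f"Skipping {name}: dependencies not met")

    run_nightly(jobs)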

The Metagenix CDQM framework employing these components is extremely scalable and user-friendly. All interfaces are delivered via a web browser. The repositories are implemented in standard, ODBC-compliant relational databases. The Rules Checker, Transaction Server, and Event Processor are implemented from conception as massively parallel, high-performance systems capable of handling the large volumes of data required.

Editor note: The information presented here is the opinion of the author, based on his experience in using a continuous data quality management tool. TEC does not endorse this specific product per se. This article is published by TEC because it contains some useful information for companies concerned with data quality management issues.

CDQM: In Summary

With an ever-increasing dependence on data for near and real-time decision-making and with more connectedness of databases, data warehouses, marts, and silos across the enterprise, we at Metagenix believe the call for CDQM is loud and clear.

Data becomes information when it is used in analysis upon which decisions are made. Data quality problems result in bad information, necessarily leading to bad decisions. At Metagenix, we strongly believe CDQM is the answer to the data quality problem, a problem that our technology will solve.



SOURCE:
http://www.technologyevaluation.com/research/articles/continuous-data-quality-management-the-cornerstone-of-zero-latency-business-analytics-part-2-one-solution-16778/

Attaining Real Time, On-demand Information Data: Contemporary Business Intelligence Tools

Pure-Play BI Vendors

Some pure-play business intelligence (BI) vendors have long been providing BI platforms, which offer complete sets of tools for the creation, deployment, support, and maintenance of BI applications. Pure-play vendors attempt to sell these platforms to original equipment manufacturers (OEM) and independent software vendors (ISV), and even to information technology (IT) organizations and end users that are IT-savvy enough to build their own applications on top of them. These BI platforms logically combine many database access capabilities, like structured query language (SQL), online analytical processing (OLAP) data manipulation, modeling functions (what-if analysis), statistical analysis, and graphical presentation of results (charting), to create data-rich applications. The applications have customized user interfaces (UI), and are organized around specific business problems that target business analyses and models.

Part Three of the Business Intelligence Status Report series.

Most BI platform vendors also offer their own BI applications as validation of the BI platform. These enterprise BI application suites are the descendants of basic query-and-reporting tools, which they tend to displace or extend. The suites provide support for varying levels of users, with a variety of query, reporting, and OLAP capabilities, all available with the idea of minimal training.

The BI market is currently crowded with a number of vendors with adept product suites. Some players in the market include MicroStrategy, Informatica, Information Builders, Oracle, IBM, SAP, Microsoft, Teradata, and Ascential, which was reunited with its short-term foster parent Informix when Ascential was acquired by IBM (see IBM Buys What's Left of Informix). Other vendors include Applix, arcplan, ProClarity, Siebel Systems, OutlookSoft, CombineNet, and SPSS. Obviously, these vendors have different origins: some are traditional database vendors and enterprise application vendors; others were BI suite vendors or pure players in certain niches (such as enterprise reporting); and some have evolved to cover multiple bases.

Lately, most of these vendors have updated their client/server tools with Web-based UIs. Many BI software providers create standards-based portlets, sometimes using Web services, to expose BI functionality and information when more than viewing is required, and to be consistent with other portal add-ins. The advantage of portlets over the simple hyperlinks that were initially leveraged is a richer, more interactive experience and a more uniform integration approach. Otherwise, a portlet is a Web-based component that processes requests and generates dynamic content. The end user essentially sees a portlet as a specialized content area within a Web page that occupies a small window in the portal page. The portlet gives users the capability to customize its content, appearance, and position.

Incidentally, current portlet standardization might help Web enablement and portal environment integration efforts, as most BI vendors are currently and cautiously developing products that will adhere to JSR 168, a standard that enables portlet interoperability. Since a JSR 168-compliant portal should be compatible with all Java-based portal solutions, it should alleviate the development burden on BI vendors. As a result, vendors can then focus on more advanced BI functionality rather than on porting their solutions to the individual, commercially available enterprise portal products. Thus, because of a strong Web similarity associated with BI application suites, some vendors describe their offerings as BI portals, whereby these portal offerings typically provide a subset of the counterpart client/server functionality via a Web browser. However, the vendors have been steadily increasing this functionality to come closer to that provided by rich, Microsoft Windows-like client desktop tools.

This is Part Three of a seven-part note.

Part One detailed history and current status.

Part Two looked at contemporary BI tools.

Part Four will describe the BI/CPM market landscape.

Part Five will discuss Geac and Point Solutions vendors.

Part Six will compare direct access to a data warehouse for the mid-market.

Part Seven will make recommendations.

Is Real Time, On-Demand BI Attainable?

There are always those who want more, given the many pressures that demand investment in data management and integration. The trend of massive growth in data volumes continues with no end in sight, while enterprises have to manage and share large amounts of data across diverse regions and lines of business (LOB). The introduction of new data-generating technologies, such as radio frequency identification (RFID), will only accelerate this growth and the subsequent need for real time BI. Traditional BI systems use a large volume of static data that has been extracted, cleansed, and loaded into a data warehouse (DW) to produce reports and analyses. However, the need is not just for reporting, since users need business monitoring, analysis, and an understanding of why things are happening.

The demand for instant, on-demand access to dispersed information has grown as the need to close the gap between operational data and strategic objectives has become more pressing. As a result, a category of products called real time BI applications has emerged. These can serve users who need to know, virtually in real time, about changes in data or the availability of relevant reports, with alerts and notifications regarding events and emerging trends delivered via Web, e-mail, or instant messaging (IM) applications. In addition, business applications can be programmed to act on what these real time BI systems discover. For example, a supply chain management (SCM) or enterprise resource planning (ERP) application might automatically place an order for more "widgets" when real time inventory falls below a certain threshold, or a customer relationship management (CRM) application might automatically prompt a customer service representative and a credit control clerk to check on a customer who has placed an online order larger than $10,000.
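
A minimal sketch of such threshold-driven triggers; the thresholds, field names, and handler functions are illustrative assumptions rather than any particular vendor's API:

    REORDER_POINT = 100
    LARGE_ORDER_THRESHOLD = 10_000

    def reorder_widgets(item, on_hand):
        print(f"Placing replenishment order for {item} (on hand: {on_hand})")

    def flag_large_order(order):
        print(f"Routing order {order['id']} for credit review (value: {order['value']})")

    def on_inventory_update(item, on_hand):
        # SCM/ERP-style rule: reorder automatically when stock falls below the reorder point.
        if on_hand < REORDER_POINT:
            reorder_widgets(item, on_hand)

    def on_online_order(order):
        # CRM-style rule: route unusually large online orders for a credit check.
        if order["value"] > LARGE_ORDER_THRESHOLD:
            flag_large_order(order)

    on_inventory_update("widgets", 42)
    on_online_order({"id": 7, "value": 12_500})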

The first approach to real time BI uses the DW model of traditional BI systems. In this case, products from innovative BI platform providers like Ascential or Informatica provide a service-oriented, near real time solution that populates the DW much faster than the typical nightly extract/transform/load (ETL) batch update does. The second approach, commonly called business activity monitoring (BAM), is adopted by pure-play BAM or hybrid BAM-middleware providers such as Savvion, Iteration Software, Vitria, webMethods, Quantive, Tibco (particularly after the acquisitions of Staffware and Praja), and Vineyard Software. It bypasses the DW entirely and uses Web services or other monitoring means to discover key business events. These software monitors or agents can be placed on a separate server in the network or on the transactional application databases themselves, and they can use event- and process-based approaches to proactively and intelligently measure and monitor operational processes.

For more on these diagnostic BI tools, see Business Activity Monitoring—Watching the Store for You. The most advanced of these applications not only optimize users' time and the information they receive, but also provide the context for them to take appropriate action.

EII Complements or Renders DW Obsolete?

Enterprise information integration (EII) is an emerging category of software that confronts the longstanding challenge of integrating enterprise data across diverse data sources in scattered enterprise systems. Companies that have overcome the problem of scaling and managing data are now pondering how to unify their data sources and leverage them to solve near real time business problems. To that end, EII aims to provide unified views of multiple, heterogeneous data sources through a distributed (federated) query. One way to think of EII is as a virtual database layer that allows user applications to access and query data as if it resided in a single database. In other words, the concept takes an existing database capability to merge a query across different tables, but does so on a virtual basis, shielding users from the underlying complexities of locating, querying, and joining data from varied data source systems.
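
A conceptual sketch of a federated query, using two in-memory SQLite databases to stand in for, say, a CRM system and an order system: the data stays in its sources, and a thin layer joins it on demand instead of copying it into a warehouse. The table names and the join are invented for illustration:

    import sqlite3

    crm = sqlite3.connect(":memory:")
    crm.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
    crm.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "Acme"), (2, "Globex")])

    orders = sqlite3.connect(":memory:")
    orders.execute("CREATE TABLE orders (customer_id INTEGER, total REAL)")
    orders.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 500.0), (1, 250.0), (2, 75.0)])

    def unified_customer_view():
        # The "virtual layer": pull current rows from each source and join them in memory.
        names = dict(crm.execute("SELECT id, name FROM customers"))
        query = "SELECT customer_id, SUM(total) FROM orders GROUP BY customer_id ORDER BY customer_id"
        return [(names.get(cid, "unknown"), total) for cid, total in orders.execute(query)]

    print(unified_customer_view())   # [('Acme', 750.0), ('Globex', 75.0)]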

EII is a fundamentally different approach from other data integration technologies such as enterprise application integration (EAI), which provides data- or process-level integration, and enterprise portals, which merely integrate data at the presentation level. To refresh our memory, EAI is the unrestricted sharing of data and business processes throughout networked applications or data sources in an organization.

Early enterprise applications in areas such as inventory control, human resources (HR) and payroll management, sales force automation (SFA), and database management systems (DBMS) were designed to run independently, with no interaction between the systems. They were custom built with the technology of the day for a specific need, and were often proprietary systems. As enterprises grew, they recognized the need for information and applications to be transferred across and shared between systems. As a result, enterprises often invested in EAI in order to streamline processes and keep all the elements of the enterprise interconnected.

There are four major categories of EAI:

1. Database linking, whereby databases share information and duplicate information as needed.

2. Application linking, whereby the enterprise shares business processes and data between two or more applications.

3. Data warehousing, whereby data is extracted from a variety of data sources and channeled into a specific database or DW for analysis.

4. Common virtual system, which would be the peak of EAI, whereby all aspects of enterprise computing are tied together so that they appear as a unified application.

EII is also differentiated from conventional ETL tools for data warehousing because it neither moves data nor creates new stores of integrated data. Rather, it leaves data where it is, leveraging metadata repositories across multiple foundation enterprise systems, and visibly pulls information into new applications. As a result, customers may be content to trade in expensive and pesky DWs for a data extraction and presentation layer that sits on top of existing transactional systems, but only on the condition that they receive unimpaired performance. This would virtualize, or do away with, the intermediary step of requiring diverse data sources to be aligned and their terms of use to be agreed upon.

Another way to look at the EII approach borrows somewhat from materials management: EAI and ETL can be thought of as "push" technologies, while EII can be regarded as a "pull" mechanism that seeks and finds data as needed, in near real time, by creating an enterprise-wide abstraction (semantic) layer for standardized access to any corporate data source. The ability to provide appropriate BI without having to adapt a DW for specific decision support tasks is sometimes referred to by EII vendors as on-demand BI.

Companies can certainly benefit from accelerating BI analysis and reporting, whether to expedite end-of-quarter closing reports or to support regulatory compliance efforts. EII may offer opportunities for more effective data management, provisioning, and auditing within "a single version of the truth". Also, near real time information can be especially useful for certain industries characterized by data diversity. For example, an airline might need to know specific passenger information as soon as the plane pulls away from the gate, or a retailer may need real time information to plan inventory distribution, allowing hourly metrics to guide selling and supply strategies. BI analysis and reporting can combine demographic data, and summary and line-item purchasing histories from data stores and point-of-sale (POS) systems, to aid customer service applications designed to deliver a 360-degree view of customers. Other industries that can benefit include insurance and pharmaceuticals. For example, an insurance field adjuster can reference customer claim forms and other account information residing in disparate databases, and pharmaceutical and life science companies can assemble patient data, scientific information, clinical trial information, and feeds coming from various proprietary sources. Moreover, for some environments, it might make more sense to leave the data where it is, pulling it in to create a consolidated view when needed, rather than lumping it all into one massive database.

Other useful EII (real time, on-demand BI) deployments could be within operational dashboards that track various performance metrics, or in financial risk analysis, where a single transaction might trigger a serious change. The advent of Web services and Internet standards such as extensible markup language (XML) and simple object access protocol (SOAP) will certainly help with the future integration of data and with real time BI. Enterprises considering a service-oriented architecture (SOA) could embed an EII server that would publish data integration as a Web service. For more pertinent information, see Understanding SOA, Web Services, BPM, BPEL, and More.

Performance Management Solutions

The latest evolutionary step of BI introduces the concept of corporate performance management (CPM), which is often interchangeably referred to as enterprise performance management (EPM) or business performance management (BPM). CPM is an emerging portfolio of applications and methodology that has evolving BI architectures and tools at its core. Historically, various BI applications have focused on measuring sales, profit, quality, costs, and many other indicators within an enterprise, but CPM goes well beyond these by introducing the concept of "management and feedback". It embraces processes such as planning and forecasting as core tenets of a business strategy. In other words, while the DW process supports the bottom-up extraction of information from data, it does not provide a top-down enforcement of a corporate-wide strategy.

CPM adds a reactive component capable of monitoring time-critical operational processes to allow tactical and operational decision makers to be in tune with the corporate strategy. It also crosses traditional department boundaries or silos to manage the full life cycle of business decision-making, combining business strategy alignment with business planning, forecasting, and modeling capabilities. It is an umbrella term that describes the methodologies, metrics, processes, and systems used to monitor and manage the business performance of an enterprise, whereby applications that enable CPM translate strategically focused information to operational plans and send aggregated results.

These applications are also integrated into many elements of the planning and control cycle, or address BAM or CRM needs. In other words, CPM maps a structured set of data against predefined reports, alerts, dashboards, analysis tools, key performance indicators (KPIs), etc., to monitor and improve business processes based on established, upfront corporate strategic objectives. Furthermore, CPM creates a closed-loop process, starting with the development of high-level corporate goals and predefined KPIs. It then measures actual results against the KPIs, representing the comparison in a balanced scorecard. The results are reported to management through intuitive reporting tools, and are ultimately fed back into the business modeling process for corrections in the next planning cycle.
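
A small sketch of the measurement step in that closed loop: actual results are compared against KPI targets and rolled into a simple scorecard status. The KPI names, targets, and actuals are invented:

    # (target, actual, True if lower values are better)
    kpis = {
        "revenue_growth_pct":     (8.0, 6.5, False),
        "on_time_delivery_pct":   (95.0, 97.1, False),
        "days_sales_outstanding": (45.0, 52.0, True),
    }

    def status(target, actual, lower_is_better):
        on_track = actual <= target if lower_is_better else actual >= target
        return "green" if on_track else "red"

    for name, (target, actual, lower_is_better) in kpis.items():
        print(f"{name}: target={target} actual={actual} -> {status(target, actual, lower_is_better)}")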

CPM applications enable information sharing across and even beyond the borders of the enterprise, to all employees, business partners, shareholders, and most importantly, customers. While real time tools can open up BI to tactical and operational decisions, they also amplify the need for an effective information delivery channel. An enterprise portal might be the channel of choice, given its high visibility and the opportunity it gives companies to provide related context, services, and content around published BI information.

Two kinds of BI information seem to be well suited for inclusion in a portal: 1) near real time data, with related content to provide timely snapshots of a business unit or an individual business process; and 2) analysis and summaries, since this high-level BI information fits nicely into executive dashboards, while balanced scorecards can communicate company performance metrics to a broader set of employees, tying them in with budgeting and planning systems, analytical capabilities, and corporate dashboards. Dashboards are front-end presentations that sit on top of CPM systems, displaying the key metrics and KPIs on which the company wants everyone to focus. Dashboards often represent the window into an overall CPM system, improving the visibility of the results of planning, budgeting, and BI analysis to enterprise users.

Therefore, CPM leverages performance methodologies such as dashboards, balanced scorecards, and activity-based costing (ABC), a cost accounting system that uses cost drivers to allocate costs to products or other business bases in order to allocate overhead realistically. Although these approaches help determine how and what to measure, they lack a mechanism for dynamically changing values to keep abreast of business reality. Ensuring closed-loop management is thus CPM's enhancement of traditional BI applications. BI applications customarily focus on measurement, which is basically worthless without the ability to act upon the results. Consequently, a perplexing variety of existing tools and techniques can lay claim to being part of the CPM trend, ranging from BI tools and analytics to business process management applications (related to, but different from, BPM) and scorecard products.

Thus, CPM is the evolutionary combination of technology and philosophy, building on the foundation of technology and applications that many enterprises already have. The demand for these applications lies in the fact that they incrementally add value to previously installed business applications, even legacy ones. With CPM, enterprises may finally see some long-belated benefits and feel somewhat better about implementing cumbersome ERP and other enterprise systems. Indeed, many enterprises have already deployed some BI products, such as querying and reporting tools, planning and budgeting applications, analytic applications, incentive management systems, portals, dashboards, and scorecards, along with data warehousing technology, data models, and integration software. In fact, anyone taking stock of their technology inventory will likely find some CPM components already in use.



SOURCE:
http://www.technologyevaluation.com/research/articles/attaining-real-time-on-demand-information-data-contemporary-business-intelligence-tools-18032/

Product Lifecycle Management: Expediting Product Innovation

For a design team, a loosely stated requirement is often misleading. That's why it is important to capture requirements in a structured manner—in a requirements database maintained by requirements managers—before they are passed along to the product development teams. The role of the requirements manager may not exist in the current scenario, or it may have been fulfilled by the marketing team. But it is essential to analyze requirements properly before taking them up for development.

Companies can use best practice methodologies like an affinity diagram, a Pugh matrix (decision matrix), or quality function deployment (QFD) for analyzing, prioritizing, and mapping requirements to existing features or to new features that they can deliver. These methodologies also require companies to benchmark what competitors can deliver to satisfy a given requirement.
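
As a rough illustration of the decision matrix idea, the sketch below scores two hypothetical concepts against weighted criteria relative to a baseline; the criteria, weights, and scores are invented and do not come from any specific methodology text:

    # Weighted (Pugh-style) decision matrix: each concept is scored against the baseline
    # (+1 better, 0 same, -1 worse) per criterion, then weighted and ranked.
    criteria = {"meets_requirement": 0.5, "cost": 0.2, "time_to_implement": 0.3}

    concepts = {
        "extend_existing_feature": {"meets_requirement": 0, "cost": 1, "time_to_implement": 1},
        "new_module":              {"meets_requirement": 1, "cost": -1, "time_to_implement": -1},
    }

    def weighted_score(scores):
        return sum(criteria[c] * s for c, s in scores.items())

    for name, scores in sorted(concepts.items(), key=lambda kv: weighted_score(kv[1]), reverse=True):
        print(f"{name}: {weighted_score(scores):+.2f}")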

A true product lifecycle management (PLM) system emphasizes the fact that product requirements should be communicated clearly to all stakeholders of product development, including the design, testing, materials, supplier, manufacturing, production engineering, and service teams. Increasing the visibility of product requirements is the first step in implementing any PLM system.

Increasing Productivity

The design starts from a concept generated by a given market need. The design team must be given a highly efficient environment in order to be able to work with maximum productivity. The use of computer-aided design (CAD) and computer-aided engineering (CAE) tools has been common in the industry for the last decade, but the real challenge is storing the generated CAD data in a centralized, secure, and easily accessible place. It's also important for design engineers to be able to check in or check out these files on a daily basis during the design cycle. This gives designers more control over the product design, since the centralized storage of product data makes for an information-rich product development process.

Furthermore, incorporating a strict approval mechanism enables designers to do things right the first time. Precise CAD data is an important element of analysis for the CAE or computer-aided process planning (CAPP) teams, since erroneous or improper CAD data makes CAE and CAPP efforts useless. Also, ensuring that CAD data is precise results in precise engineering bills of materials (EBOMs) for the manufacturing resource planning (MRP) or enterprise resource planning (ERP) systems.

CAD data cleanliness and control is not only important for the design team, but also for the subsequent product development teams down the line. For this reason, CAD data management is considered the heart of PLM.

Healthy Collaboration

Bringing people together is the main objective of any PLM system. When product development people are closer (virtually speaking), they can collaborate more efficiently. Due to current globalization trends, as well as the trend toward leveraging competencies and resources that are geographically separated, it has become mandatory to collaborate in a virtual environment. The health of the collaboration can be parameterized with three basic questions:

* How secure is your collaboration environment?
* How efficiently can you collaborate?
* How many resources is your collaboration environment consuming in terms of network, hardware, and software?

Collaboration plays a vital role when design teams are separated geographically. It also makes a lot of sense for organizations that outsource whole product designs to third-party organizations or suppliers. Collaboration enables the host companies to give product design feedback to the design partner companies or suppliers in the early stages of the design, rather than after completion. This is critical, as issues detected early in product design are less expensive to fix than issues that are detected later.

Reducing Time-to-market

Manufacturing organizations are striving to answer the question of how to reduce time-to-market. Companies can obtain more market share and profit if they introduce a product to market sooner than their competitors do. Thus, they tend to minimize cycle time whenever possible. Product cycle time as a whole can be broken down into cycle times for design, engineering analysis, validation, buying, process planning, and piloting.

Product development involves cross-functional teamwork, as the following functions are generally involved: marketing, design, engineering, purchase, testing, production engineering, manufacturing, and quality. In a conventional product development cycle, these cross-functional teams work serially, one after the other (also termed serial engineering). This conventional method actually ties product cycle time to a certain period, as only one team can work at a time (while the other teams wait for the results). For example, the analysis, testing, and purchase teams will wait for the design to be completed before proceeding.

The idle time of other teams can be used to shrink the overall cycle time of the product. This leads to the concept of concurrent engineering, where cross-functional teams can start their work at a predefined point in the previous step of the product life cycle. For example, the engineering or purchase departments might start the analysis and buying processes when the design is 60 percent complete, and the manufacturing planning team might start once analysis and testing is 50 percent complete. Concurrent engineering can shrink product cycle time phenomenally by leveraging the maximum time resources possible from all stakeholders of the product, as the rough calculation below illustrates.
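
A back-of-the-envelope comparison of serial versus concurrent cycle time, using the overlap points from the example above; the phase durations (in weeks) are invented, and rework risk is deliberately ignored:

    # Phase durations in weeks (invented figures).
    design, analysis, planning = 10, 8, 6

    serial_total = design + analysis + planning           # teams work one after the other

    analysis_start = 0.6 * design                          # analysis/buying start at 60% design completion
    planning_start = analysis_start + 0.5 * analysis       # manufacturing planning at 50% analysis completion
    concurrent_total = max(design,
                           analysis_start + analysis,
                           planning_start + planning)

    print(f"Serial: {serial_total} weeks, concurrent: {concurrent_total} weeks")   # 24 vs. 16.0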

However, the impact of rework in concurrent engineering is heavy compared to serial engineering. For example, when there is a design change after 60 percent completion of the design, it will impact the work of the analysis and testing teams, since they have already started their work. But this impact can be easily managed in a digital workplace.

If analysis and testing is being conducted in a digital environment that is seamlessly integrated with the CAD environment, then the impact of design change is minimal: this is the power of digitization. It is strongly suggested that the product development environment be digitized as much as possible in order to attain successful concurrent engineering.

Nowadays, there are CAD tools that are tightly integrated with native CAE and CAPP tools. Concurrency can be easily achieved with the kind of digital environment that allows for designing, analyzing, simulation testing, and product planning, since these activities do not necessarily have to be conducted physically. This also reduces the cost of building physical prototypes.



SOURCE:
http://www.technologyevaluation.com/research/articles/product-lifecycle-management-expediting-product-innovation-19933/

Embracing Complexity: A Speedy Business Performance Management Solution

Company Overview

Applix provides a complete performance management software solution for finance and operations, without compromising its strong customer focus. The company has over 2,200 unique customers, and is growing at a steady rate. Aside from being top-rated in vendor satisfaction by BPM Partners Pulse Survey, Applix was ranked by The OLAP Survey as a leader in overall business benefits achieved by customers, and was ranked in the top three for customer loyalty.

Headquartered in Westborough, Massachusetts (US), Applix is a provider of business intelligence (BI) and business performance management (BPM) solutions. Incorporated in 1983, Applix first targeted software applications for the UNIX market. To capitalize on the emerging market, thirteen years later it acquired Sinper Corporation, an online analytical processing (OLAP) software developer. In 1998, Applix released its first TM1 product, followed by five subsequent releases aimed at enhancing its capabilities for performance management, Microsoft Excel integration, 64-bit platform support, and complex analysis. The Applix focus is to drive operational performance management by investing heavily in its TM1 product line. Operational BI (OBI) and performance management software provide the necessary shift from the traditional backward-looking view of BI towards a forward-looking approach. OBI allows performance management functionality to be embedded in overall business operations. This shifts performance management away from purely financial uses, and lets organizations leverage its strengths across the business and align it with their business process flows.

Product Overview

Applix TM1 gives customers the ability to work through difficult business decisions and perform what-if analyses in a user-friendly environment, making it an above-average performance management solution. TM1 incorporates dashboards, workflow, and OLAP cubes in a fully integrated Excel- and Web-enabled environment. Users are able to transfer their skill sets seamlessly, due to the familiar interfaces. The product has an integration layer that connects easily to open database connectivity (ODBC), object linking and embedding database for OLAP (ODBO), SAP, and legacy data sources to capture the appropriate data, and a powerful in-memory data management server engine that accelerates query return times, thus improving performance. Additionally, TM1 dashboards, Web sheets, and OLAP cubes aid in planning, budgeting, and forecasting activities. TM1 enables complex data modeling and rules creation. Data can also be updated in real time and reflected in Excel Web sheets, dashboards, and cubes, all of which can be posted on the Web. TM1 uses its Web portal as a gateway for users across the organization, allowing them to exchange work items by viewing the same sets of data and to manage decisions based on those data views. This allows TM1 users to collaborate on multiple tasks across the organization. Customizable dashboards and cubes help users analyze and answer defined business questions. This helps drive potential opportunities (and avoid risk) by identifying data patterns, collaborating with multiple task stakeholders, and creating business scenarios.

Product Strengths

TM1 has the ability to perform powerful what-if analyses against large data sets, faster than many competitors. Applix TM1's complex analytical queries are handled in memory through the use of a 64-bit processing platform and a caching architecture, as opposed to having calculations read from and written to disk. The development of 64-bit processing represents a significant trend in BI for accommodating large amounts of data. The advent of 64-bit processing also allows data to be updated effectively in real time. Vendors such as Information Builders and MicroStrategy also have this capability, and Hyperion is working on developing a platform compatible with Microsoft. Compared with most performance management vendors, however, Applix has taken the lead in providing a platform that gives users the ability to create complex queries quickly, and to reflect those results on a Web portal in real time. Because calculations are performed in memory rather than on disk, response times are quick.

Applix TM1's integration with Excel is above average, and permits users to use spreadsheets as the basis for creating Web sheets and dashboards, and to post their data to the Web. TM1 has integrated the use of Excel into its BI platform. Multiple users across an organization or across multiple geographic locations are able to access published data sheets, to edit the sheets, and to have the data automatically written to the server and updated on the Web. Cubes empower users to analyze data by drilling through multidimensional data views, and by identifying trends as well as data sources. With TM1 cubes, an organization can post OLAP cubes to the Web in different forms, such as graphs or as data integrated with Excel.

TM1's analytical capabilities focus on providing users with user-friendly access to cubes and reports. Rather than developing complex, robust cubes that are only used by one or two super-users, Applix TM1 lets users create compact, intuitive cubes that home in on an organization's essential business questions and that are accessible to anyone in the organization. This is done by limiting a cube's scope to between five and ten main dimensions, and by authorizing users to choose which dimensions are available to drill through. These cubes give users the functionality to edit, change, and transfer data to Excel; to post changes to the Web; and to view the updates in real time. Although cubes are usually designed with a limited number of dimensions to focus on actual business questions, it is possible to reflect up to 256 dimensions, which makes TM1 a robust analytical tool. Users can also identify which dimensions they want to view and drill through, as opposed to dimensions that are reflected in the background but not viewed online.
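As a purely conceptual sketch (this is not the TM1 API or its cube engine), the idea of a compact cube with a handful of dimensions that users drill through can be illustrated with a simple aggregation in Python; the fact records and dimension names are hypothetical.

# Conceptual illustration of a small cube: a few dimensions, one measure,
# and drill-down by aggregating over the chosen dimensions.
from collections import defaultdict

# Hypothetical fact records: (region, product, month, sales)
facts = [
    ("EMEA", "Widget", "2006-01", 120.0),
    ("EMEA", "Gadget", "2006-01", 80.0),
    ("AMER", "Widget", "2006-01", 200.0),
    ("AMER", "Widget", "2006-02", 150.0),
]

DIMENSIONS = ("region", "product", "month")

def rollup(facts, by):
    """Sum the sales measure over the requested dimensions."""
    indexes = [DIMENSIONS.index(d) for d in by]
    totals = defaultdict(float)
    for *dims, sales in facts:
        totals[tuple(dims[i] for i in indexes)] += sales
    return dict(totals)

# High-level view by region, then "drill down" into region and product.
print(rollup(facts, by=["region"]))
print(rollup(facts, by=["region", "product"]))

The point of limiting the dimension list is visible even in this toy: the fewer dimensions a view exposes, the easier it is for a casual user to read and drill into.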

TM1 Rules helps users define the cube rules, allowing for complex and repetitive calculations within each OLAP dimension. Users create cubes through wizards, giving business users primary control of their analytics (as opposed to relying on the information technology [IT] department). Cubes and the associated data are then connected via defined rules, and the cubes can be customized for maximum efficiency by using rules from multiple applications.

Features of Other Components

TM1 Planning Manager manages an organization's workflow processes to grant key decision makers administration and access privileges. Each assigned task can be edited to create collaboration between employees, departments, and projects. Users can be assigned tasks, be given access to change report data, and submit those changes to the appropriate decision maker for approval. The user tasks are then submitted as work items for approval based on a task list accessed by the appropriate decision maker. Approvals and rejections with comments are submitted and stored within the task process, and users can view the logs and resubmit any additional edits or changes.

TM1 Planning Template features a planning module that supports the ability to create budgets and identify top-down goals and bottom-up plans, with pre-built worksheets. The template structure can be modified to suit organizational needs, and it is possible to load information into the modified model.

TM1 Consolidations provides users with the ability to view financial and operational data in a centralized structure, from any number of organizational units and general ledgers (GLs). Several different views of data are possible, which gives users the ability to perform variance analyses of budget-to-actual data. Key features include full support for recurring and reversing journal entries, the ability to create customized journal entry reports, and built-in controls for the journal entry process (including automatic generation of inter-company elimination journals), as well as the ability to post journal entries over the Web in Internet Explorer.

The TM1 Financial Reporting module lets users set up reporting structures and create queries for their financial reporting requirements. Reports are built once, and automatically maintained within the structure of TM1. Reports are updated automatically, and financial data from multiple GL systems can be consolidated to represent one view of GL and general financial data in real time.

Product Challenges and Opportunities

Applix needs to develop scorecarding capabilities and the ability to set metrics within its application in order to develop a successful long-term strategy and to stay competitive within the market. Most other performance management vendors offer these features within their main product offerings, including leading BI vendors such as Cognos, Hyperion, and Business Objects. Not having scorecarding capabilities is a major disadvantage for organizations that want to track and structure their goals and performance requirements, and to measure them over time using one integrated software suite. To address this issue, Applix is integrating tightly with Microsoft's new scorecarding functionality, building on its existing Excel integration to pave the way for continued integration with Microsoft BPM tools. Applix hopes that customers will choose to integrate the two, as opposed to choosing a solution that already has scorecarding capabilities. Additionally, Microsoft is working on enhancements to its BI tools, including enhanced dashboarding capabilities. Thus, the Applix goal of aligning itself closely with Microsoft could backfire if prospects consider Microsoft for a full performance management suite that is intuitively integrated and competitively priced, which could eclipse the benefits of implementing TM1.

The ability to define key performance indicators (KPIs) is also an important component of performance management. Currently, Applix does not have built-in functionality for defining structured KPIs that are aligned to an organization's corporate strategy. Applix plans to provide this feature in a future TM1 release; however, other solutions that already have it give users the opportunity to set strategic goals, align those goals to the business unit at each level, and measure performance to drive decisions. This means that organizations that want to drive decision making based on their KPIs may have to put more processes into place to achieve the same result as organizations implementing a solution with built-in scorecarding and KPI functionality.




SOURCE:
http://www.technologyevaluation.com/research/articles/embracing-complexity-a-speedy-business-performance-management-solution-18550/

Vision Software Brings a Solid Business Process Management Solution to the Table

Introduction

The market for business process management (BPM) solutions is tough. Until recently, there were no real market leaders in the BPM space, even though there were big players such as Metastorm, Pegasystems, Staffware, and DST Technologies.

Currently there is a wide variety of vendors entering the BPM market with niche or full-suite solutions. One of these vendors is Vision Software. Even though Vision competes with vendors such as Ultimus, HandySoft, and Staffware, there are certain aspects of its BizAgi solution that leave room for improvement. These will be discussed later in the article.

Established in 1989 in Colombia, Vision Software is a privately owned BPM company, based since 2003 in Hertfordshire, UK. Vision Software has worked on delivering process technologies over the past sixteen years, resulting in its BPM solution, BizAgi. BizAgi is a dynamic solution that supports continuous improvement of core business processes in the organization. Vision Software moved from workflow solutions for messaging services on top of Microsoft Exchange and Lotus Notes during the late 90s, to adding business rules, a graphical user interface (GUI), and abstractions in 2000 and 2001, to become a fuller BPM solution. The integration with Microsoft Visio was important, as Vision Software created its own shapes for workflow design and embedded this functionality in Microsoft Visio. This was critical to becoming a full BPM solution, because missing modeling shapes was like "missing letters on a keyboard," according to Gustavo Gomez, managing director at Vision Software.

Vision initially focused on the government, financial services, and utilities markets, but narrowed the focus shortly afterwards to financial services alone. The primary reason for this focus was to establish a solid reference list and credibility before entering other markets.

BizAgi

Vision Software introduced the BPM solution BizAgi in 2002, after sixteen years of experience with different process approaches for organizations. Vision Software is focusing on expanding BizAgi's European client base, and already has solid references in Germany and Italy, with successful implementations at large banks in those countries.

Vision Software does a great job developing BizAgi in both Java 2 Platform, Enterprise Edition (J2EE) and .NET, without functionality loss on either platform. This enables end users to implement BizAgi regardless of the organization's technical architecture.

Organizations use BizAgi solutions mostly in the banking and insurance industries. In the context of these industries, BizAgi's BPM solution offers the following benefits:

* Cost reduction
* Reduced cycle time for transactions
* Visibility in processes
* Increased productivity and efficiency
* Better customer service due to reduction of errors
* Enabling process monitoring

BizAgi's solution takes the end user through various steps to optimize, automate, and manage the business processes:

1. Process design
2. Definition of process data and business rules
3. Assignment of resources
4. Verification of the process
5. Process execution
6. Analysis of process performance

BizAgi employs a model-driven architecture throughout the application. Figure 1 shows the architecture.


Figure 1. BizAgi architecture

The architecture covers four main components: the client interface; the web server, which processes requests from the client; the application server, which processes all business information and components; and the storage components, which contain the BizAgi database, the data warehouse, and other external application databases.

This logical architecture is the foundation for BizAgi's agility. Figure 2 shows the components within the different architecture layers of BizAgi.


Figure 2. BizAgi components in architecture

The business process layer uses BizAgi's Studio to define the organizational and data structures.

The business rules component uses a what-you-see-is-what-you-get (WYSIWYG) tool that enables business users to design business rules without any coding. Computational business rules are programmatic components developed outside BizAgi, but the business rules engine can integrate these coded rules for further use.
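A rough sketch of that idea (not BizAgi's actual interfaces; the class, rule names, and case fields below are hypothetical) is a rules engine that accepts both simple declarative rules and externally coded "computational" rules through the same registration mechanism.

# Conceptual sketch: declarative rules and externally developed
# "computational" rules registered with, and evaluated by, one engine.
from typing import Callable, Dict

class RulesEngine:
    def __init__(self) -> None:
        self._rules: Dict[str, Callable[[dict], bool]] = {}

    def register(self, name: str, rule: Callable[[dict], bool]) -> None:
        """Register a rule; it may be a component coded outside the designer."""
        self._rules[name] = rule

    def evaluate(self, name: str, case_data: dict) -> bool:
        return self._rules[name](case_data)

# A simple declarative-style rule expressed directly against case data.
needs_manager_approval = lambda case: case["amount"] > 10_000

# A "computational" rule: a programmatic component developed elsewhere.
def credit_score_ok(case: dict) -> bool:
    return case["credit_score"] >= 650 and not case["blacklisted"]

engine = RulesEngine()
engine.register("needs_manager_approval", needs_manager_approval)
engine.register("credit_score_ok", credit_score_ok)

case = {"amount": 15_000, "credit_score": 700, "blacklisted": False}
print(engine.evaluate("needs_manager_approval", case))  # True
print(engine.evaluate("credit_score_ok", case))         # True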

Forms are dynamically assembled, and validations of these forms are present within the form designer, through integration with the business rule engine.

The business objects component, together with the component manager, enables data processing and decision making, based on specific values of data generated by activities in the business process layer.

BizAgi integrates with Microsoft Visio for process design. The level of integration is extremely solid and BizAgi has its own BPM stencil (toolbar) to design process flows, regardless of the complexity in Visio. BizAgi also supports business processes that are created in Aris, but the integration limits itself to importing and exporting the models.

BizAgi Product Strengths

Vision Software has come a long way in developing its BPM solution. The focus was to develop a tool that embodies simplicity, and BizAgi does this extremely well. Navigation through the solution is well designed and easy to use.

BizAgi's ability to assign tasks based on roles is a good feature, because this assignment can include rules for skills, geographical information, and so on. One of the future goals for BizAgi is to give it the ability to assign tasks according to the phase within the workflow.

BizAgi is one of the few BPM solutions that manages data in a structured model. Competing solutions such as Fuego and Ultimus do not have a structured data repository; for every activity, the user has to define "data in" and "data out" variables.

However, organizations should not be discouraged by its simplicity, as BizAgi is capable of handling a range of simple to very complex demands for the modeling aspect of BPM solutions. Vision Software incorporates experience and understanding of BPM, and uses this knowledge to develop a full range of functionality to allow easy development of business models.

Another powerful feature is the Flash animation that shows the current workflow phase. This sophisticated animation is embedded in the workflow interface, showing the steps that have been taken so far and the phase a given process is in. The animation is populated with data from the data warehouse.

The scalability and performance of BizAgi in combination with its pricing model makes Vision Software an attractive competitor to other vendors such as FileNet or Lombardi.

Challenges and Vendor Recommendations

Unfortunately, BizAgi depends on process models from external applications, as it does not have a proprietary process modeling solution. This might be a problem for several potential customers, because it requires the client to buy a third-party business modeler to create the process models, which may be perceived as a burden. Vision Software is currently working on a process designer for the next release. As the integration with Microsoft Visio is very sophisticated and BizAgi already has its own toolbar, Vision Software should be able to come up with a decent process modeler by then.

Vision's current solution does allow the user to generate a new version of a business process, with the system managing the different versions at design time and runtime. Unfortunately, though, there is currently no feature to revert to an older version once a business process is changed. Vision Software is focusing on adding versioning capabilities in a later release. Versioning is a common feature among other vendors such as Metastorm, Fuego, and TIBCO, and is often a high priority for end users.

Workflow simulation is not present at the moment, and Vision Software indicates that it is not a priority for delivery in a future release either. Vision Software states that the solution is simple enough that business users will be able to see where bottlenecks may occur in the process. Process validation, however, is present in the current solution.

It will be interesting to see how Vision Software keeps up this independent platform development in the future, when developing more functionality.



SOURCE:
http://www.technologyevaluation.com/research/articles/vision-software-brings-a-solid-business-process-management-solution-to-the-table-18488/

A Unique Product Lifecycle Management Tool for Private Label Retail

Background

Software solutions provided via an exchange network over the Internet have been especially enticing to retailers and their suppliers. About five years ago, we witnessed the emergence of the Worldwide Retail Exchange (WWRE). Headquartered in Alexandria, Virginia (US), this Internet-based, business-to-business (B2B), retailer e-marketplace was founded by a consortium of seventeen international retailers. The idea was to operate an independent company that would benefit its members through a common commitment to openness, leading edge technology, retail industry process efficiencies, and transaction information confidentiality. The perceived benefits included B2B standards setting, collaborative activities, value-added services, participation in a global networked community, shared technology investments, and business process formulation and standardization.

At about the same time, the Global NetXchange (GNX), headquartered in Chicago, Illinois (US) and with a significant presence in Europe, was founded by eight of the world's largest retailers. It provided a supply chain collaboration suite to connect retailers and manufacturers via a jointly developed, Web-enabled technology platform for the global retail community. The similarities of the GNX and WWRE business models and solutions were such that a meeting of the minds was inevitable. Thus, it comes as no surprise to learn that WWRE and GNX have come to terms for a merger.

As announced in April and finalized in mid-November 2005, GNX and WWRE have combined their complementary product sets and re-branded the companies as Agent for Retail Integration and Collaboration Solutions (Agentrics LLC). The company is aptly named given its background and desire to shed any reference to being solely a B2B exchange. Agentrics' vision is to leverage common capabilities for retail trading partners as clients, and to drive global practices and standards for information sharing and collaboration between retailers and manufacturers.

Today, Agentrics has close to 50 major retailers and over 250 suppliers as primary customers, as well as over 50,000 participating registered auctioning suppliers. Moreover, twenty-three global retail industry leaders have demonstrated their commitment to the shared platform by serving as board members and as equity owners. Agentrics' "sweet spot" is clearly the retail market and its trading partners that are willing to share a platform and its inherent efficiencies. A partial list of "anchor" clients includes Sainsbury's, Safeway, Sears, CVS, Metro Group, Walgreens, Albertsons, Tesco, Carrefour, Kingfisher, Kroger, Coles Myer, SCA, and Groupe Casino.

The consolidated solution suite offers global sourcing and procurement, global supply chain collaboration, global data synchronization, and product lifecycle management (PLM) solutions to the retail industry via a subscription-based platform. Specific capabilities include

* Global sourcing
o On-line auctions
o Integrated sourcing
o Consortium buying
o Trading partner discovery
o Consulting and program management

* Global supply chain collaboration
o Promotions management
o Point of sale (POS) data exchange
o Three-tier collaborative planning, forecasting, and replenishment (CPFR)
o Supplier performance management
o Consulting and program management

* Global data synchronization
o Global data pooling of item data attributes
o Item data quality services and synchronization
o Consulting and program management

Product Lifecycle Management Tool Emerges as a Unique Opportunity

Partly due to its March 2004 acquisition of QSA Software, a UK-based retail industry supplier of new product development and introduction (NPDI) software, Agentrics is marketing a PLM solution for the retail private label niche. The ProductVine PLM solution enables retailers to secure brand integrity of their private (or "house") brand label goods throughout the PLM process, from introduction to retirement. Agentrics' initial sales and marketing strategy targets food and grocery retailers; examining opportunities in other retail subsegments is a lower priority. ProductVine's food and grocery-centric database was designed based on years of industry best practice experience and was engineered to facilitate the NPDI process. ProductVine includes project management, critical paths, workflow, and collaboration capabilities. It aims to fulfill the retail private label market's need to manage product information, such as product specifications, pack copy, compliance, regulatory, and supplier data. This market's requirements also include ideation management (i.e., strategic alignment), range and category briefing, product idea collation, test evaluations, issue management, and quality management for production and supplier audits.

Agentrics' global analysis of the retail private label market indicates that there is a varying degree of sophistication and tiering of private labels across mature, emerging, and immature markets around the globe. The bottom line is that while there is significant country by country variation in private label maturation, retail private labeling is growing and maturing across most territories, including larger Western European countries, Eastern Europe, and Latin America.

The ProductVine solution portfolio was built according to the "stage gate" approach, where individual modules are engineered to work either together or independently. Its three-tier architecture, which includes project and category management (for marketing criteria management and market team use), product data management, and quality management (for non-conformance audits), can be used in different ways. For example, Marks and Spencer uses the project and category management capabilities extensively for private label category management, while Brinker International, a company with multiple restaurant brands like Chili's and On The Border, uses quality management functionalities to perform supplier non-conformance audits and to ensure brand integrity.

ProductVine's seven-stage, "stage gate" approach to the baseline time-to-market period is typical of the retail private label NPDI staging process. It includes

* Product brief and invitation to tender
* Testing and auditing
* Final decision and negotiations
* Design phase
* Print phase
* Launch phase
* Follow-up phase

The ProductVine solution is fully hosted on a Microsoft n-tier platform. Its pricing model is based on subscribers paying an annual fee based on their overall transaction volumes, the number of products and suppliers, and the number of modules utilized. More companies are experiencing ProductVine's capabilities firsthand as Agentrics is witnessing an increased level of customer adoption.


SOURCE:
http://www.technologyevaluation.com/research/articles/a-unique-product-lifecycle-management-tool-for-private-label-retail-18314/

The Essential ERP - Its Genesis & Future

Introduction

Integrated enterprise resource planning (ERP) software solutions have become synonymous with competitive advantage, particularly throughout the 1990s. ERP systems replace "islands of information" with a single, packaged software solution that integrates all traditional enterprise management functions, such as financials, human resources, and manufacturing and logistics. Knowing the history and evolution of ERP is essential to understanding its current application and its future developments. Following is the genesis of ERP by era.

1960s - Pre-Computer Era

The focus of manufacturing systems in the 1960s was on inventory control. In those days, when a computer would occupy an entire wing of a building at a local university, most manufacturing companies could not afford to own one. However, they could afford to keep inventory on hand to satisfy customer demand. It was the age of the reorder point (ROP) system, where the assumption was that customers would continue to order what they had ordered before, and that the future would look very much like the past. In most industries this was a valid assumption, since product life cycles were measured in years.

Inventory was regarded as an asset not only on the balance sheet but also in the mind of the average manager. Therefore, production planners created schedules and managed materials by hand. In the production control office, the manual explosion of bills of materials (BOMs) often resulted in errors. Index card files were used to record material allocations, receipts, and issues. When the unallocated inventory balance on the card seemed low for a certain part, a planner would give the card to a buyer, who would then place a new purchase order. Those card files provided real help to planners, as long as each index card was updated promptly and filed in the right place.

The order entry/sales department usually created the plant schedule. As a result, persons who had little or no access to material availability information loaded forecasted sales and actual customer orders into the schedule. This lack of visibility, combined with the cumbersome inventory record-keeping process, caused frequent schedule changes and delayed customer deliveries. Often the shop would start an order only to learn that required material was not available. The resulting excessive work in progress (WIP) and raw materials tied up unnecessary capital funds and shop floor space, which ultimately led to a number of other missed opportunities.

1970s/1980s - Advent of Computers in Manufacturing

When computers finally became small and affordable enough to be deployed by the average manufacturing company, resolving materials mismanagement initially gained the highest priority. Silently, the need to order only what was really needed crept onto the horizon. No longer could a company afford to order some of everything; orders had to be based on what was being sold, offset by what was already in inventory or committed to arrive on a purchase order. As a result, Materials Requirements Planning (MRP) computer systems were developed to provide for 'having the right materials come in at the right time'. The Master Production Schedule (MPS) was built for the end items. The MPS fed into the MRP, which contained the time-phased net requirements for the planning and procurement of sub-assemblies, components, and raw materials.

MRP - The Initial Impact

The impact the computer had on material planning and enterprise management was huge. In place of manual planning and huge posting card decks, this new computer system promised to automatically plan the build and purchase requirements based on the items to be shipped, the current inventory, and the expected arrivals. The posting originally done on the manual input/output cards was replaced by transactions made directly in the computer and documented on pick lists. The amount of inventory on hand was visible to anyone with access to a computer, without having to go to the card deck and look it up.

MRP, or 'little MRP', represented a huge step forward in the planning process. For the first time, based on a schedule of what was going to be produced, supported by a list of the materials needed for that finished item, the computer could calculate the total need and compare it to what was already on hand or committed to arrive. This comparison could suggest placing a new order, cancelling orders that were already placed, or simply moving the timing of existing orders. The real significance of MRP was that, for the first time, the planner was able to answer the question 'when?'. Rather than being reactive and waiting until a shortage occurred, the planner could be proactive and time-phase orders, including releasing orders with multiple deliveries.
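The netting logic described here is simple enough to sketch; the following Python fragment uses hypothetical items and quantities and a single-level bill of materials, so it illustrates the calculation rather than any vendor's MRP engine.

# Explode the master schedule through the bill of materials, then net the
# gross requirements against on-hand inventory and scheduled receipts.

master_schedule = {"bicycle": 100}            # end items to be produced
bom = {"bicycle": {"wheel": 2, "frame": 1}}   # components per end item
on_hand = {"wheel": 60, "frame": 20}
scheduled_receipts = {"wheel": 40, "frame": 0}

gross = {}
for end_item, qty in master_schedule.items():
    for component, per_unit in bom[end_item].items():
        gross[component] = gross.get(component, 0) + qty * per_unit

for component, required in gross.items():
    net = required - on_hand.get(component, 0) - scheduled_receipts.get(component, 0)
    action = f"plan an order for {net}" if net > 0 else "no order needed"
    print(f"{component}: gross {required}, net {max(net, 0)} -> {action}")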

Nevertheless, some simplifying assumptions were needed to allow the computers of the day to make the required calculations. One was that the orders should be started at the latest possible date to provide for minimal inventory while still serving the customer's need on time. This method is referred to as 'backward scheduling'. Therefore, all orders were scheduled backwards from the desired completion date to calculate the required start date. There was no slack time in the schedule and the downside of this assumption was that if there were any hiccups in the execution of the plan, the order would most likely be late to the customer. If only one part needed for the finished part was going to be late, there was no automatic way to know the impact on the other needed parts. Slack was built into the schedule through conservative lead times. Despite this drawback, the benefits far outweighed the costs and more companies began to embrace the tools and techniques of MRP.
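Backward scheduling itself is just date arithmetic: each start date is offset from the required completion date by the lead time, with no slack. The dates and lead times in this short sketch are hypothetical.

# Backward scheduling: work back from the due date by each lead time.
from datetime import date, timedelta

due_date = date(2000, 6, 30)                      # required completion date
lead_times = {"final_assembly": 5, "frame": 10}   # lead times in days

assembly_start = due_date - timedelta(days=lead_times["final_assembly"])
frame_order_release = assembly_start - timedelta(days=lead_times["frame"])

print("Final assembly starts:", assembly_start)
print("Frame order released: ", frame_order_release)
print("Any delay pushes completion past", due_date, "since there is no slack.")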

CRP - The Next Development

As more people learned how to utilize this material planning methodology, they quickly realized that something else very important was missing. It did not suffice to have all the parts to get the job done; sufficient plant capacity was needed as well. The idea of closing the loop with a capacity plan was introduced, and 'closed loop MRP', 'big MRP', or Capacity Requirements Planning (CRP) was born.

At the same time, computers were increasing in power and decreasing in price. The computing capacity to do the extra mathematical computations was affordable and available. Now, not only could the materials be calculated, but also a capacity plan based on those material plan priorities could be created. In addition to the bills of materials needed for each of the finished parts, defined paths for the production process were necessary. Defined paths for the production process, commonly called 'routings', specified the machines or group of machines (work centers) to be used to build the parts so that capacity and load could be planned and scheduled.

Another critical assumption needed to complete the computations on the computers of the day was that infinite capacity existed at each of these work centers to satisfy the calculated demand when it was required. Infinite capacity is not an accurate reflection of reality, and this drawback in the use of MRP/CRP persists to this day. However, for the first time, reports were available in which overload conditions could be identified and proactively resolved for each machine. This allowed the preparation of plans and options to address an overload situation before the problem occurred. Typically, lead times were long enough to allow work centers to smooth out unbalanced workloads in the short term and still support the overall required completion of the work order.
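A CRP load report of the kind described can be approximated in a few lines; the routings, order quantities, and capacities below are hypothetical, and, like early CRP, the sketch only reports overloads rather than resolving them.

# Compute load per work center from routings and planned orders, then flag
# any work center whose load exceeds its stated weekly capacity.
from collections import defaultdict

routings = {"bicycle": [("welding", 0.5), ("assembly", 1.0)]}  # hours per unit
planned_orders = [("bicycle", 120)]                            # (item, quantity)
weekly_capacity = {"welding": 50.0, "assembly": 100.0}         # hours per week

load = defaultdict(float)
for item, qty in planned_orders:
    for work_center, hours_per_unit in routings[item]:
        load[work_center] += qty * hours_per_unit

for work_center, hours in load.items():
    capacity = weekly_capacity[work_center]
    status = "OVERLOAD" if hours > capacity else "ok"
    print(f"{work_center}: load {hours:.1f} h / capacity {capacity:.1f} h -> {status}")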

After the BOM explosion and the time phasing of materials and capacity had been accomplished through MRP/CRP, other problems on the shop floor became evident. While planners created a feasible schedule, with all the right material on its way or in stock, one would discover that maintenance on a critical piece of equipment had been overlooked, or that skilled production workers were unavailable. Therefore, planning all manufacturing resources, beyond materials and nominal capacity, became the first priority.

MRPII - Connecting Manufacturing and Finance

Once again the technology improved simultaneously with the realization that as every piece of inventory moved, finances moved as well. For example, if a part was received at the factory, not only should the inventory on hand go up, but there should also be a corresponding increase in the raw material inventory asset on the financial books, balanced by an increase in the liability level of the accounts payable account. As a group of parts moves to the shop floor to build the finished product, the raw material asset should go down and the work in process asset should go up. The labor and overhead charges from the shop floor personnel are also added to the work in process asset account, with an offset to the accounts payable account. When the finished part completes its route through the shop, the work in process asset account goes down and the finished goods asset account goes up. As the finished product is sold, the finished goods asset account goes down and the accounts receivable asset account goes up. Consequently, at every step of the way, as the inventory moves, financial accounting moves with it - in duplicate - with balanced debits and credits.
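The balanced postings described above can be illustrated with a toy ledger; the account names and amounts are hypothetical, and the offsets simply mirror the movements listed in the paragraph.

# Every inventory movement posts a debit and an equal credit.
from collections import defaultdict

ledger = defaultdict(float)

def post(debit_account: str, credit_account: str, amount: float) -> None:
    """Record one balanced journal entry."""
    ledger[debit_account] += amount
    ledger[credit_account] -= amount

post("raw_materials", "accounts_payable", 1_000)       # parts received
post("work_in_process", "raw_materials", 1_000)        # parts issued to the shop
post("work_in_process", "accounts_payable", 400)       # labor and overhead applied
post("finished_goods", "work_in_process", 1_400)       # order completes its route
post("accounts_receivable", "finished_goods", 1_400)   # product shipped and billed

assert abs(sum(ledger.values())) < 1e-9   # debits and credits stay in balance
for account, balance in sorted(ledger.items()):
    # Credit balances (such as accounts payable) appear as negative numbers.
    print(f"{account:20s} {balance:10.2f}")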

Available technology now had the power and was affordable enough to track this inventory movement and financial activity. As a result, the basic programs for manufacturing were integrated into one package using a common database that could be accessed by all users. These were the first Manufacturing Resource Planning (MRPII) packages, used predominantly by discrete manufacturers (See Glossary). Since MRP assumes infinite capacity and strict adherence to schedule dates, process and flow manufacturers (See Glossary) found little use for it. They instead focused their efforts during the same time on other aspects of the supply chain, particularly forecasting, purchasing, and distribution.

MRPII does not mean that MRP was done incorrectly the first time; rather, it is a significant evolution of MRP. MRPII closed the loop with the financial accounting and financial management systems. The American Production and Inventory Control Society (APICS) defines MRPII as follows:

'A method for the effective planning of all resources of a manufacturing company. Ideally, it addresses operational planning in units, financial planning in dollars, and has a simulation capability to answer "what if" questions. It is made up of a variety of functions, each linked together: business planning, sales and operations planning, production planning, and the execution support systems for capacity and material. Output from these systems is integrated with financial reports such as the business plan, purchase commitment report, shipping budget, and inventory projections in dollars. MRPII is a direct outgrowth and extension of closed loop MRP.'

For the first time, a company could have an integrated business system that provided visibility to the requirements of material and capacity driven from a desired operations plan, allowed input of detailed activities, translated all this activity to a financial statement, and suggested actions to address those items that were not in balance with the desired plan. Good information leads to good decisions, and therefore these integrated, closed-loop information systems provided a competitive advantage.

Meanwhile, other functional areas of companies had also been requesting help from data processing departments, today known as Management Information Systems (MIS), Information Systems (IS), or Information Technology (IT) departments. Systems were developed to support each major functional area. For example, Accounting and Finance had a set of programs that helped it manage the general ledger, accounts payable and receivable, capital assets, and financial reporting. These accounting programs were combined to form an integrated system for accounting, just as MRPII integrated the manufacturing programs. Sales, Engineering, Purchasing, Logistics, Project Control, Customer Service, and Human Resources followed suit, and each developed its own set of integrated computer systems. Unfortunately, these disparate systems were unable to interact and exchange information; exchanges between them were handled by interface programs, and were often time consuming and error prone.

1990s - Enterprise Resource Planning

By the time each functional area of a company had developed its integrated software program, the need for tightly integrating them became obvious. The next major shift during the late 1980s and early 1990s was that 'time to market' was becoming increasingly short. Lead times expected by the market continued to shorten and customers were no longer satisfied with the service level that was considered world class only a few years earlier.

JIT, ATP, and Other Factors

Customers were demanding to have their products delivered when, where, and how they wanted them. Companies were therefore compelled to develop and embrace the philosophies of Just in Time (JIT) and supplier partnerships as a way to remain competitive.

During the same time frame, the cost of goods sold was shifting drastically from labor to purchased materials. Consequently, planners needed to know materials allocations or finished goods' available-to-promise (ATP) values, immediately after customer order entry. On the other hand, buyers needed to know the sales plan several months in advance in order to negotiate prices for individual materials. Empowerment of employees was needed to provide the agility that was required to compete in the market.
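A simplified, cumulative available-to-promise calculation, using hypothetical quantities, shows the kind of figure planners needed immediately after order entry.

# Cumulative ATP per period: on-hand stock plus scheduled receipts, less
# customer orders already committed.

on_hand = 50
periods = ["week 1", "week 2", "week 3"]
scheduled_receipts = [0, 100, 100]
committed_orders = [30, 60, 80]

available = on_hand
for period, receipt, committed in zip(periods, scheduled_receipts, committed_orders):
    available += receipt - committed
    print(f"{period}: cumulative ATP = {max(available, 0)}")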

The need to develop a system with tightly integrated programs that would use data stored on one common database and would be used enterprise-wide (actions in one department's program driving actions elsewhere), became the highest priority for IT professionals. No longer was it tolerable to submit a request to the IT department and wait several man-months of programming time to obtain this critical information. This common-database, company-wide integrated system was named Enterprise Resource Planning (ERP).

ERP Defined

APICS defines ERP as follows:

'An accounting-oriented information system for identifying and planning the enterprise-wide resources needed to take, make, ship, and account for customer orders. An ERP system differs from the typical MRPII system in technical requirements such as graphical user interface (GUI), relational database management system (RDBMS), use of fourth-generation language (4GL), and computer-aided software engineering (CASE) tools in development, client/server architecture, and open-system portability.'

Impact of the PC

The cost of technology continued to plummet and the advent of the personal computer (PC) revolutionized once again the face of business management systems. At a fast pace, the large inflexible mainframes were replaced by new client/server technology. The power of these small PCs exceeded the power of the large mainframes that were routine only a few years earlier. It became possible to run a fully integrated MRPII system on a small PC.

The changing pace of technology had once again pushed planning and control systems forward in response to a real business need. In addition, unlike in previous evolutions, the ERP software vendors offered these critical business applications to non-manufacturing companies as well.

ERP is far more than just MRPII running on a client/server architecture. ERP encompasses all the resource planning for the enterprise, including product design, warehousing, material planning, capacity planning, and communication systems, to name but a few. These critical business issues affect not only manufacturing companies but all companies that want to achieve competitiveness by making the best use of their assets, including information. In other words, ERP systems help companies become leaner by integrating the basic transaction programs for all departments, allowing quick access to timely information. However, ERP inherited MRPII's basic drawbacks, namely the assumption of infinite capacity and the inflexibility of scheduling dates, preventing companies from taking full advantage of speedy information flow.

2000s - Future of ERP

During the last three years, the functional perimeter of ERP systems began an expansion into adjacent markets, such as supply chain management (SCM), customer relationship management (CRM), product data management (PDM), manufacturing execution systems (MES), business intelligence/data warehousing, and e-Business. The major ERP vendors have been busy developing, acquiring, or bundling new functionality so that their packages go beyond the traditional realms of finance, materials planning, and human resources.

Capacity Planning

To circumvent MRPII's capacity planning limitations, planners turned to various ways of off-line capacity planning: either manually, with the help of spreadsheet programs, or with the help of new advanced planning and scheduling (APS) systems. APS systems are designed as bolt-ons with the idea of plugging into an ERP system's database to download information and then create a feasible schedule within identified constraints. The new schedule can then be uploaded into the ERP system thereby replacing the original MRP results. These APS systems typically offer simulation ("what if") capabilities that allow the planner to analyze the results of an action before committing to that action through the ERP system. Some of these systems go one step further by offering optimization capabilities. They automatically create multiple simulations and recommend changes in the supply chain within the existing constraints.
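The core of the bolt-on idea can be caricatured in a few lines: take MRP-suggested orders, respect a finite capacity constraint at a bottleneck, and produce a revised, feasible (though not necessarily optimal) schedule that could be compared against the original plan as a what-if. All numbers below are hypothetical, and this is a toy heuristic rather than any APS vendor's algorithm.

# Toy finite-capacity loader: push orders into later weeks whenever the
# bottleneck work center's weekly capacity would be exceeded.

weekly_capacity = 40.0
orders = [("A", 1, 25.0), ("B", 1, 30.0), ("C", 2, 20.0)]  # (order, due week, hours)

load, schedule = {}, {}
for order, week, hours in sorted(orders, key=lambda o: o[1]):
    while load.get(week, 0.0) + hours > weekly_capacity:
        week += 1                                  # defer to the next week
    load[week] = load.get(week, 0.0) + hours
    schedule[order] = week

for order, week in schedule.items():
    print(f"Order {order}: scheduled in week {week}")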

Global Supply Chain Management

While most traditional ERP software enables the integration and management of critical data within enterprises, companies have increasingly recognized the need to deploy more advanced software systems that manage the global supply chain by enhancing the flow of information to and from customers, suppliers, and other business partners outside the enterprise. More recently, the availability and use of the Internet has created demand for software that operates across the Internet and intranets. This global logistics concept merged with the above-described constraint-based optimization solutions, namely advanced planning and scheduling (APS) systems, and with specialized warehouse management systems (WMS), resulting in SCM (See Advanced Planning and Scheduling: A Critical Part of Customer Fulfillment).

The major ERP players already have offerings or strategies addressing this important need (See The Essential Supply Chain and SAP APO - Will It Fill the Gap).

Customer Relationship Management

Another important area of functional expansion is in the front office/customer relationship management (CRM) arena. Customers are demanding applications and tools that allow them to link back-office ERP systems with front-office CRM systems. They are also demanding enhanced capabilities for e-Business, especially business-to-business (B2B) and business-to-customer (B2C) electronic commerce. The leading ERP vendors have begun to discern the opportunity these products present and the benefit potential for organizations implementing them. CRM has gone from a vast field of point solutions to suites of customer care applications covering sales force automation (SFA), field service, telesales, call center, marketing automation, etc.

ERP vendors have explored various routes to penetrate the CRM and e-Commerce markets, such as developing in-house products (SAP, with its telesales module and mySAP.com portal), acquiring point specialists to augment their offering (Oracle through its acquisitions of Versatility for call center, Tinoway for field service, and Concentra for its product configurator module), merging full suites (Baan with its acquisition of Aurum in 1997, and PeopleSoft with its acquisition of Vantive in 1999), and partnering with CRM and e-Commerce leaders (J.D. Edwards with Siebel and Ariba, and SAP with Recognition Systems Group for its market campaigns module).

Real-Time Performance Analysis

ERP software's scope will go beyond traditional transactional business functions by enabling organizations to deliver real-time performance analysis directly to the desktops of CFOs, CEOs, and business managers. Major ERP vendors have been shifting focus from routine users' transaction requirements to the overall organization's business imperatives, thereby helping lines of business become more knowledgeable and proactive. Instead of requiring a collection of processes, the system should appear to each user as a vast source of information. While the relational databases currently used by ERP systems are good at retrieving a small number of records quickly, they are not good at retrieving a large number of records and summarizing them on request. Therefore, major ERP vendors have been increasingly embracing online analytical processing (OLAP) tools, which provide a high-level, aggregated view of data.
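The contrast can be seen even with an in-memory SQLite table (a stand-in chosen only for illustration; the table and data are hypothetical): fetching a few specific records is a simple lookup, while the high-level aggregated view summarizes many rows on request, which is the kind of query OLAP tools are organized around.

# Transactional-style retrieval versus analytical-style aggregation.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, product TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("EMEA", "Widget", 120.0), ("EMEA", "Gadget", 80.0), ("AMER", "Widget", 200.0)],
)

# Retrieve a small number of specific records quickly.
print(conn.execute("SELECT * FROM sales WHERE region = 'AMER'").fetchall())

# Summarize a large number of records on request (aggregated view).
print(conn.execute("SELECT region, SUM(amount) FROM sales GROUP BY region").fetchall())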





SOURCE:
http://www.technologyevaluation.com/research/articles/the-essential-erp-its-genesis-future-16268/