Monday, June 23, 2008

Understanding The Analytic Spectrum

Reporting and analytic solutions have a wide footprint. A simple listing of orders due to be delivered today passes as a reporting solution, as does a report on total corporate spend across a product category. However, that is pretty much where the similarity ends. Almost everything else about these reports is significantly different: the business process each supports, the audience, the frequency, and the process for producing them. This complexity is generally hidden from users, and it frequently feeds the frustration between IT and business over the long lead-times and large budgets involved in deploying reporting solutions.

Below we present the analytic spectrum from a techno-functional point of view, to add business understanding for IT-centric teams and technology understanding for business teams.

Operational Reporting is the lowest granularity of reporting. Its objective is to support the day-to-day operations of a specific role. These reports need real-time data, and any exceptions must be addressed immediately. They are frequently part of the application that supports the business function, and are queried directly from the underlying application's OLTP (on-line transaction processing) database or its mirror.

For example, take the Purchase Order Management function. An expediter may need a listing of POs that are late for delivery. This is simply a list of POs that fit a user-specified date filter: the need date has passed and the status of the PO has not changed. An inventory analyst may need a list of all POs expected to be received today to make allocation decisions, and a financial analyst may need a list of all POs received the day before for accruals. All three reports are produced immediately by the Purchase Order Management application directly from its transaction data; no data processing is required to create them. The target audience for operational reporting is the people who manage the daily operations of supply chain functions like purchasing, receiving, and shipping.
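To make the operational pattern concrete, here is a minimal sketch in Python of the expediter's late-PO listing, filtered straight from transaction data. The records, field names, and status values are illustrative assumptions, not from any particular purchasing application.

```python
from datetime import date

# Hypothetical PO rows as they might sit in an OLTP table;
# fields and statuses are assumptions for this sketch.
purchase_orders = [
    {"po": "PO-1001", "need_date": date(2008, 6, 20), "status": "OPEN"},
    {"po": "PO-1002", "need_date": date(2008, 6, 25), "status": "OPEN"},
    {"po": "PO-1003", "need_date": date(2008, 6, 18), "status": "RECEIVED"},
]

def late_pos(pos, as_of):
    """Expediter's report: POs whose need date has passed and are still open."""
    return [p for p in pos if p["need_date"] < as_of and p["status"] == "OPEN"]

print(late_pos(purchase_orders, date(2008, 6, 23)))
```

No aggregation or staging is involved: the report is a straight filter over live transaction rows, which is exactly what makes operational reporting cheap to deliver.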

Process Support Analytics is the next level of reporting, where data from the operational reporting applications is consolidated, processed, and used to create process metrics. These metrics typically point to inefficiencies in the processes and help managers tune them for better performance. These reports typically lose the individual-transaction character present in operational reporting: while an expediter needs the list of PO line-items due on a given day (operational), a manager may need the number of items that had to be expedited from a given vendor in a month to establish whether the process is operating normally.

This type of analysis typically needs information over a longer time horizon to make comparisons and establish trend lines. The individual transaction information is consolidated and processed to produce counts, summaries, cumulative values, and so on. The reports are typically produced by moving transaction data from the application OLTP database to a process-centric database that consolidates the information. For example, you may have a purchase database where purchase transactions from all purchasing applications are brought together. To bring together data from disparate systems, the data may need standardization, cleansing, and cross-referencing. The data is not real-time, and is typically brought over after the active life of the transaction is over, for example after the POs are "closed". Such data stores are often called Operational Data Stores (ODS).
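As a sketch of this kind of consolidation (the field names and the shape of the ODS records are assumptions for illustration), the manager's expedite-count metric could be computed over closed PO lines like this:

```python
from collections import Counter

# Illustrative "closed" PO lines already moved into a process-centric
# store (ODS); the fields are hypothetical.
closed_po_lines = [
    {"vendor": "Acme", "month": "2008-05", "expedited": True},
    {"vendor": "Acme", "month": "2008-05", "expedited": True},
    {"vendor": "Acme", "month": "2008-05", "expedited": False},
    {"vendor": "Globex", "month": "2008-05", "expedited": True},
]

def expedites_by_vendor(lines, month):
    """Process metric: count of expedited line-items per vendor in a month."""
    counts = Counter()
    for line in lines:
        if line["month"] == month and line["expedited"]:
            counts[line["vendor"]] += 1
    return dict(counts)

print(expedites_by_vendor(closed_po_lines, "2008-05"))
```

Note how the individual transactions disappear into counts per vendor per month, which is the "loss of transaction character" described above.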


Decision Support Analytics, finally, not only consolidates data for a process but actually combines it across processes. Its objective is to provide inputs for improving corporate efficiency across processes through better planning and optimization. Combining data across processes typically requires companies to harmonize all master data so that transactions from different business processes can be consolidated within the same context.

For example, producing total spend for a given product category across all vendors means that the financial and purchasing systems must either share common vendor, item, currency, and item-hierarchy data, or must know the mutual references needed to produce the common context.
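A minimal sketch of what that common context enables, assuming a hypothetical vendor cross-reference table and item-to-category mapping (in practice these would come from a harmonized master data store):

```python
# Hypothetical reference (master) data: two systems use different vendor
# codes, so a cross-reference resolves them to one master vendor.
vendor_xref = {"V-17": "ACME", "SUP-009": "ACME"}   # system code -> master vendor
item_category = {"SKU-1": "Fasteners", "SKU-2": "Fasteners"}  # item -> category

invoices = [  # spend transactions from the finance system (illustrative)
    {"vendor": "V-17", "item": "SKU-1", "amount": 1200.0},
    {"vendor": "SUP-009", "item": "SKU-2", "amount": 800.0},
]

def spend_by_master_vendor(lines, category):
    """Total spend per master vendor for a product category."""
    totals = {}
    for line in lines:
        if item_category.get(line["item"]) != category:
            continue
        master = vendor_xref[line["vendor"]]  # resolve to the common vendor
        totals[master] = totals.get(master, 0.0) + line["amount"]
    return totals

print(spend_by_master_vendor(invoices, "Fasteners"))  # {'ACME': 2000.0}
```

Without the cross-reference, the two invoice lines would appear to belong to two different vendors and the category spend could not be rolled up correctly.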

Deploying the Analytic Spectrum

While it is quite simple to provide operational reporting from individual applications, the complexity of the analytic environment increases dramatically for process and decision support analytics. The most difficult part of establishing a well-functioning analytic environment is creating common reference master and organizing data. Common master data refers to entities like items, vendors, customers, locations, and time that are used by several systems. Common organizing data refers to the hierarchies for items, locations, organizations, etc. that are used to roll data up or down the hierarchies, and to the groups used to create consolidated numbers.

Creation of common master and organizing meta-data is a prerequisite for success, and requires clear leadership from both business and IT teams. Business teams need to understand the need for a common reference, and provide the rules for cleansing and harmonizing this data. IT teams need to be able to elaborate that need, and establish data staging areas where such cleansing and harmonization can happen with proper error- and exception-handling strategies. Without such common reference data and an active IT-business partnership, any enterprise-wide reporting and analytics initiative is bound to fail.
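As a rough illustration of staging-area cleansing (the master list and the matching rule here are deliberately simplified assumptions), records that cannot be harmonized are routed to an exception queue for review rather than loaded:

```python
# Hypothetical master-vendor synonyms; real harmonization rules would be
# supplied by the business teams and be far richer than this.
master_vendors = {
    "ACME CORP": "ACME",
    "ACME CORPORATION": "ACME",
    "GLOBEX INC": "GLOBEX",
}

def cleanse(records):
    """Split incoming records into harmonized rows and exceptions."""
    clean, exceptions = [], []
    for rec in records:
        raw = rec["vendor"].strip().upper()
        master = master_vendors.get(raw)
        if master is None:
            exceptions.append(rec)  # exception handling: hold for manual review
        else:
            clean.append({**rec, "vendor": master})
    return clean, exceptions

clean, errs = cleanse([
    {"vendor": "acme corp", "amount": 10},
    {"vendor": "Initech", "amount": 5},
])
```

The point is structural rather than algorithmic: the staging area owns the cleansing rules and the exception path, so bad reference data never reaches the consolidated stores silently.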

In a future article we will look at the above spectrum in the supply chain context to establish what a supply chain reporting and analytics environment would look like.
