Tuesday, July 29, 2008

Continuous Planning Architecture, Phase 2

Back in April, Kimberley Bermender posted on the finance systems architecture needed to support a continuous planning environment. The overall design goals included:
  • Lower cost of maintenance,
  • Standards-based data integration,
  • Support of accurate, near real-time views into operations,
  • A platform to support detecting, modeling, selecting and implementing change and then measuring results.
The main ingredients were:
  • Financial and operational data stores,
  • An analytic (OLAP) engine with reporting, and
  • Planning applications.
Inherent in the architecture is the movement of data between data stores and OLAP cubes, as well as fact and metadata management. Building on that foundation, “phase 2” can add two components that further those design goals and get you closer to enterprise performance management nirvana.

The first is master data (or ‘reference’ data) management, and the second is a common enterprise performance management rules or calculation engine.

Master data (or ‘reference’ data) management covers the processes for tracking and controlling data relationships (especially hierarchies) and instances across the enterprise. For example, product sales for a store in Ft. Collins, CO could roll up to a ‘Central’ region one quarter, and then to the ‘West’ region the next quarter after a re-org. It’s important to keep track of which region the store belonged to when doing quarter-over-quarter comparisons and other management reporting (not to mention statutory reconciliation and reporting). And that hierarchy could be duplicated in the store reporting application, the sales forecasting system, the G/L, the customer relationship management (CRM) system, and so on. Right now, those relationships are probably being manually managed and ‘lightly’ controlled. Our more complex financial systems require more automation and more rigorous control over master/reference data. Master data management certainly addresses at least the first two design goals of Kimberley’s architecture.
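
To make that concrete, here is a minimal sketch in Python of effective-dated roll-up assignments. The names (RollupAssignment, region_for) and the re-org date are made up for illustration; a real master data management tool would add workflow, approvals, and synchronization out to the subscribing systems.

    # A minimal sketch of effective-dated hierarchy assignments; names and
    # dates are illustrative, not from any specific MDM product.
    from dataclasses import dataclass
    from datetime import date
    from typing import List, Optional

    @dataclass
    class RollupAssignment:
        store: str                        # leaf node, e.g. "Ft. Collins"
        region: str                       # parent node it rolls up to
        valid_from: date                  # first day this roll-up applies
        valid_to: Optional[date] = None   # None = still current

    # One row per re-org: Ft. Collins moved from Central to West on 2008-04-01.
    ASSIGNMENTS: List[RollupAssignment] = [
        RollupAssignment("Ft. Collins", "Central", date(2008, 1, 1), date(2008, 3, 31)),
        RollupAssignment("Ft. Collins", "West", date(2008, 4, 1)),
    ]

    def region_for(store: str, as_of: date) -> str:
        """Resolve the region a store rolled up to on a given date."""
        for a in ASSIGNMENTS:
            if (a.store == store and a.valid_from <= as_of
                    and (a.valid_to is None or as_of <= a.valid_to)):
                return a.region
        raise LookupError(f"No roll-up for {store} on {as_of}")

    # Quarter-over-quarter reporting uses the hierarchy as it stood each quarter.
    print(region_for("Ft. Collins", date(2008, 2, 15)))  # Central (Q1)
    print(region_for("Ft. Collins", date(2008, 5, 15)))  # West    (Q2)

The point is that the G/L, CRM, and forecasting systems would all read roll-ups from this one governed source instead of each maintaining its own copy.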

For more on master data, see this DM Review landing page.

The second component is a central business rules/calculation engine. In any enterprise performance management environment, users can easily get bogged down in the definitions of data and information. For example, that Ft. Collins store could be looking at a ‘revenue’ report and not know whether it’s booked revenue, commissionable revenue, recognized revenue, and so on. And even once they find out which kind of revenue it is, there can be a question of its accuracy: how did head office calculate it, where did the data come from, and does it include intercompany sales or not?

Having one business rules engine lets the enterprise define ‘recognized revenue’ once, with control over the algorithms, the data refresh frequency, the data sources, and so on. Once the rules engine has certified a number, it can be used by all the other enterprise performance management systems: planning can use it for prior actuals, strategic financial models can use it for long-term scenarios, same-store sales dashboards can use it for ranking, and so on.
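
As a rough illustration, here is what a ‘define once, consume everywhere’ registry could look like in Python. The RulesEngine class and the recognized-revenue formula are assumptions made for the sketch, not any vendor’s actual API.

    # A minimal sketch of a central rules registry; the class, rule
    # definition, and formula are illustrative assumptions.
    from typing import Callable, Dict, List

    class RulesEngine:
        """Each measure is defined exactly once; all systems ask here."""

        def __init__(self) -> None:
            self._rules: Dict[str, dict] = {}

        def define(self, name: str, calc: Callable[..., float],
                   sources: List[str], refresh: str) -> None:
            # Finance owns the definition: algorithm, data sources,
            # and refresh frequency all live in one place.
            self._rules[name] = {"calc": calc, "sources": sources, "refresh": refresh}

        def certified(self, name: str, **inputs: float) -> float:
            """Return the number exactly as the governed rule computes it."""
            return self._rules[name]["calc"](**inputs)

    engine = RulesEngine()

    # 'Recognized revenue' is defined once, with intercompany sales excluded.
    engine.define(
        "recognized_revenue",
        calc=lambda booked, deferred, intercompany: booked - deferred - intercompany,
        sources=["G/L", "billing system"],
        refresh="nightly",
    )

    # Planning, strategic models, and dashboards all reuse the same number.
    print(engine.certified("recognized_revenue",
                           booked=1_000_000.0, deferred=150_000.0, intercompany=40_000.0))
    # -> 810000.0
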
The goal is four-fold: better transparency into financial information (how did we get that number), better accountability (finance owns and certifies the number), more efficiency (define it once, don’t reinvent the wheel), and ‘believability’ (start debating what to do about the results, not where the number came from).

Here’s a good article by Robert Blasum in DM Review on central rules (he also connects them to master data management).

Thanks to Kimberley and the team for letting me guest blog. Please feel free to visit the Business Foundation blog over at http://businessfoundation.typepad.com/
