Wednesday, April 30, 2008

Instant Continuous Planning: Just Add Water (Part 2)

In the previous Continuous Planning post, we discussed the evolution of management processes from a lagging to a leading view of the business and examined the business processes needed to support the continuous planning cycle.

Defining and evolving management processes requires the confluence of data across and between all stakeholders within the business environment, including customers, the supply chain, and organizational groups such as Finance. For the Finance department, this means enabling continuous planning and providing the ability to consolidate the resulting plan data with the organization's other data. To support this, a number of technologies must be in place within the organization:
  • Planning application
  • Analytic application (OLAP database)
  • Financial & operational data stores
  • Performance management tools such as scorecards and dashboards (not strictly necessary, but they provide a valuable feedback loop)
Best practices dictate that the following technical and process requirements support the continuous planning cycle:
  • 100% uptime of the planning application. This is particularly important for global enterprises where teams in various geographies are accessing data; no team can be locked out of the application.
  • Near real-time reporting. Incremental data updates and calculations provide near real-time reporting against plan data (see the sketch following this list).
  • Integration with operational and business data. Access to operational data provides necessary planning context, as well as instant feedback and the ability to adjust the plan. Availability of other data, such as supporting detail or plan assumptions, is integral to the reporting environment.
  • Consistent performance. Ensuring fast and consistent performance is crucial during the planning cycle.
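
To make the near real-time reporting requirement concrete, here is a minimal sketch in Python. The plan rows, actuals, and cost-center names are hypothetical in-memory stand-ins for what would be database queries in a real system; the point is that only rows changed since the last watermark are re-read, so variance against plan can be recomputed continuously without a full refresh.

    from datetime import datetime

    # Hypothetical stand-ins for the planning application's plan data and the
    # operational actuals; in a real system these would be database queries.
    plan_rows = [
        # (cost_center, account, plan_amount, last_modified)
        ("CC100", "Travel", 12000.0, datetime(2008, 4, 30, 9, 15)),
        ("CC100", "Salaries", 250000.0, datetime(2008, 4, 29, 17, 40)),
        ("CC200", "Travel", 8000.0, datetime(2008, 4, 30, 9, 20)),
    ]
    actuals = {("CC100", "Travel"): 11350.0,
               ("CC100", "Salaries"): 248900.0,
               ("CC200", "Travel"): 9100.0}

    variance_report = {}   # (cost_center, account) -> plan minus actual

    def incremental_refresh(since):
        """Recompute variance only for plan rows changed since the last refresh."""
        for cost_center, account, plan_amount, modified in plan_rows:
            if modified > since:
                actual = actuals.get((cost_center, account), 0.0)
                variance_report[(cost_center, account)] = plan_amount - actual
        return datetime.now()   # new watermark for the next refresh cycle

    watermark = incremental_refresh(datetime(2008, 4, 30, 9, 0))
    print(variance_report)
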

Architecture to Support Continuous Planning Cycle

The architectural framework required to support a continuous planning cycle includes:
  • Change Data Capture. The planning application collects data and performs value-adding multidimensional calculations upon the data, but does not aggregate it. Through an intelligent backup process, only changed data is backed up and extracted from the application.
  • Financial Data Store. At an established rate (every 2 minutes, for example), changed data is extracted from the planning application and is automatically loaded, with associated metadata and security, into the central repository. This shared data can then be quickly digested and presented to the analytic applications (a sketch of this flow follows the list).
  • Load into the Analytic Reporting Application. From the financial data store, the financial data is loaded into the reporting analytic application, where consolidations can be completed on the fly.
  • Reporting from Analytic Applications. The reporting application should be easily accessible from a wide range of tools – including performance dashboards, scorecards, report writers, spreadsheets, and any other common reporting tools used within the organization.
  • Alternative Architectural Options. Flexible reporting strategies can be a valuable extension of this architecture. For example, a planning cube can contain fewer dimensions and limited granularity, while the reporting cube can have additional dimensions and a deeper level of detail to support robust reporting needs.
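
The following is a rough sketch, in Python, of the change-data-capture and load flow described above. The function names, the stubs, and the two-minute interval are illustrative assumptions; a real implementation would call the planning application's extract interface and the data integration tool's loaders in place of the stubs.

    import time
    from datetime import datetime

    EXTRACT_INTERVAL_SECONDS = 120   # e.g. every 2 minutes, as described above

    def extract_changed_data(since):
        """Stub: pull only the plan data changed since the given watermark
        from the planning application (change data capture)."""
        return []   # a real implementation would call the planning app's extract interface

    def load_into_financial_data_store(records):
        """Stub: load changed records, with associated metadata and security,
        into the central financial data store."""
        print(f"Loaded {len(records)} changed records into the financial data store")

    def refresh_reporting_application(records):
        """Stub: push the new data into the analytic reporting application,
        where consolidations are completed on the fly."""
        print(f"Refreshed reporting application with {len(records)} records")

    def run_cycle(iterations=1):
        watermark = datetime.min
        for _ in range(iterations):
            changed = extract_changed_data(watermark)
            watermark = datetime.now()   # remember how far we have extracted
            if changed:
                load_into_financial_data_store(changed)
                refresh_reporting_application(changed)
            time.sleep(EXTRACT_INTERVAL_SECONDS)   # shorten when experimenting

    run_cycle()

The watermark is the key design choice: because each cycle moves only the data that changed since the previous extract, the load stays light enough to run every couple of minutes without disturbing the planning application.
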
Previously, supporting this architecture required a patchwork of data movement tools and custom scripts that did not allow data from the proprietary source systems to be integrated into a larger financial data warehouse. This prevented the sharing of data across the greater organization and generally required a costly combination of tools and consulting that needed ongoing maintenance and support.

But companies are now accomplishing this efficiently and cost-effectively, with a significantly lower cost of maintenance. With the availability of comprehensive, standards-based data integration solutions and management teams establishing best practices around their planning cycles, the transition from a lagging to a leading planning process, with accurate, near real-time views into operations, is a realistic goal for any organization. The return on this investment is an innovative, flexible, and dynamic planning cycle that allows your company to quickly recognize changes in the competitive landscape, model those changes, determine the best alternatives, implement them, and measure the results.

Instant Continuous Planning: Just Add Water

It's a strange feeling when personal and business lives intersect, and as I sit down to write about the value of continuous planning cycles in the business process, I am struck by recent personal experience.

It's March 25th, 10 pm, and I am sitting on the floor of my home office, a cardboard box beside me, as I stack piles of receipts, 1099s, stock sale confirmations, etc., in preparation for tomorrow's tax meeting with my accountant. I really don't know what to expect from this meeting – where am I in relation to the plans I set in January 2007? I could swear that I sat in the same place last year promising myself that I was going to improve this system.

And while some of you may relate to my experience (hopefully, as misery does love company), it gives me pause to think that, on a much larger scale, many of our corporate planning cycles don't fare much better. As a business director responsible for budget and planning cycles, when do you start your process? How many annual/monthly/weekly iterations do you go through? What is the effect of business change on your planning cycle? Are you feeling like there's a cardboard box beside you, yet?

With the demand for businesses to adapt quickly to changing markets in order to stay competitive and operate at maximum efficiency, we have seen significant streamlining of the planning process in recent years. In parallel, we've seen the tools and supporting infrastructure technologies become more sophisticated and robust in order to support operational and reporting needs. With pressure to move from a lagging to a leading view of the business, we've evolved from annual budgets, to quarterly planning, to rolling 13-month plans, and now we're recognizing the need for a continuous planning cycle that provides an accurate, real-time view of our operations.

I often hear that the concept of continuous planning in a fast-changing corporate environment is a Utopian vision rather than a realistic goal. And yet large, dynamic companies such as Symantec and Yahoo have successfully implemented continuous planning cycles into their business processes. So how do they approach this? What tools do they use? How do they measure results from their planning process?

Implementing a continuous planning cycle in your organization is not about buying a tool. It's not about agreeing to a process during a management meeting. Continuous planning requires the direction and support of the entire management team to reinvent the way the company approaches the planning process. It requires management to create a chain of accountability throughout the organization, from the C-level to individual contributors, so that every person knows their role, their goals, the company's goals, and where they stand in relation to meeting those goals. Continuous planning also requires the implementation of best practices and an effective process within the organization, as well as the tools to support that process.

A successful continuous planning process requires collaboration among executive management, the finance department, and operational teams:
1. Define strategic goals.
2. Create plans, rolling forecasts & budgets for all levels of the organization.
3. Roll out, analyze against execution & provide feedback loops.
4. Model & realign plans as needed.
5. Monitor & report, with audit information for regulatory & statutory reporting requirements.

In a continuous planning cycle, steps 3 & 4 are iterative and constantly changing, driving the need for updated information to be delivered to the desktop at a near real-time rate.
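
Purely as an illustration of how steps 3 and 4 iterate, the toy Python loop below analyzes actuals against the current plan and then realigns the plan. The figures and the realignment rule are invented for this sketch; in practice each pass involves people, review, and judgment rather than a single function call.

    def analyze_against_execution(plan, actuals):
        """Step 3: compare the plan to actuals and return the variance per line item."""
        return {item: actuals.get(item, 0.0) - target for item, target in plan.items()}

    def realign_plan(plan, variances, damping=0.5):
        """Step 4: nudge each plan target toward observed results
        (a deliberately simplistic realignment rule, for illustration only)."""
        return {item: target + damping * variances[item] for item, target in plan.items()}

    plan = {"revenue": 100.0, "expenses": 60.0}      # hypothetical targets, in $M
    actuals = {"revenue": 90.0, "expenses": 65.0}    # hypothetical results, in $M

    for cycle in range(3):   # steps 3 and 4 repeat continuously
        variances = analyze_against_execution(plan, actuals)
        plan = realign_plan(plan, variances)
        print(f"cycle {cycle + 1}: plan={plan}")
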

To Be Continued... The Secret Sauce in the Technical Details

Thursday, April 10, 2008

Insight on the Gartner BI Conference in Chicago

As a long-time attendee of the various Gartner events, my favorite has always been the more intimate setting of the BI Summit, typically held in Chicago. Boy, was I surprised at the 1200+ attendees of this year's conference and the expectation of bigger growth to fuel a move to Washington, DC in 2009. Last year's event pales in comparison. BI advocates, typically a cross-over role between Finance and IT, have made a rather dramatic transition into the IT camp for this conference. Many of the sessions were educational and entry-level in scope, oriented to the first-time architect and supporter of Business Intelligence environments. The event has clearly moved away from the business-user focus of past years.

Also notable was the absence of discussions on the relative cost savings of deploying a BI environment for better business planning and analysis. Aside from one rather cheeky session on how to negotiate a good deal with your BI vendor, I expected to hear more about cost cutting and about using BI data to enable efficiencies and identify costly inefficiencies. The market picture has been doom and gloom, with the "R" word used openly. Instead, I heard a great deal about enabling integration and the expansion of analytic applications.

I think this is great news! Clearly the message was to expand the BI footprint and make use of the technology that can drive profitability, not to focus on doing more with less. I had conversations with luminaries such as Howard Dresner and Ron Powell, but I also spoke with a number of vendors and implementers, such as IBM GBS and Palladium, who had similar messaging: exploit the existing technologies and refine processes to drive revenue, and deliver data on a more continual basis to the employees who can make the right decisions in real time. This, to me, is a message of expansion and focus rather than retrenchment and limitation. I heard a great deal about the continual need for data, an almost real-time need for continuous planning, and reporting on finer levels of granular information in a more automated fashion. The message was clear: BI is a top priority and destined to gain more mindshare among senior IT executives as businesses expand and focus their growth. To grow and expand you always need innovators who can do more with less, but also the visionaries who can see the gold among the rock.

Wednesday, April 2, 2008

The Challenge Remains the Same

In the recently published "Cost Cutting in Data Management and Integration, 2008," Gartner emphasizes the need to reduce costs while continuing to support BI initiatives. Sound familiar? The operating motto for CIOs and IT teams for the past several years: Do more with less.

With recent downturns in the US economy, the data coming from BI systems has become critical to companies' ability to maintain their competitiveness. When there is less money in play, companies have to understand all facets of their business: where they are making money and where they are leaving opportunities for improvement on the table. They need to plan continuously and to have the frameworks in place to change direction quickly in order to stay at the forefront of their markets. There is a pressing need for accurate, reliable data that they can analyze to gain the insight needed to make the best business decisions possible. And this source data comes from the same IT team tasked with reducing its costs.

Balancing the business need for insight into the organization against the IT need to reduce costs is where innovation can occur. Gartner's report provides a number of recommendations for cutting costs within data management initiatives, ranging from the common-sense "Optimize Data Integration Tools Licensing" to complex projects that require creativity within IT teams to achieve, such as "Perform Operational Database Consolidation" and "Perform Data Mart Consolidation." Gartner estimates the payback from data mart consolidation as a savings of "approximately 50 percent of the total cost allocated to supporting their disparate data marts if they consolidate those marts into an application-neutral data warehouse."
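
To put that estimate in purely illustrative terms: an organization spending, say, $2 million a year to support five disparate data marts could expect to save on the order of $1 million annually by consolidating those marts into an application-neutral data warehouse.
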

Easily said, but how do we get from here to there? The solution lies in feeding a central, application-neutral data warehouse with business-critical data from application-specific data marts. Doing this requires intelligent data movement applications to ensure that the feeding of the warehouse is seamless (it does not interfere with the data mart applications), highly dynamic (so the data is refreshed in near real time), persisted (ensuring a physical copy of calculated or aggregated data is available in the warehouse), and auditable (for data governance and regulatory compliance).
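
As a rough sketch of those four properties, the Python fragment below reads only new rows from a hypothetical data mart, persists them in a central warehouse table, and writes an audit record for each batch. The mart, warehouse, and field names are in-memory stand-ins, not real systems; a production feed would use a data integration tool against actual databases.

    from datetime import datetime

    # Hypothetical stand-ins for an application-specific data mart and the
    # application-neutral warehouse.
    sales_mart = [
        {"load_id": 1, "region": "NA", "revenue": 125000.0},
        {"load_id": 2, "region": "EMEA", "revenue": 98000.0},
    ]
    warehouse = []      # persisted: a physical copy of the mart data is kept here
    audit_log = []      # auditable: every batch is recorded for governance
    last_load_id = 0    # high-water mark so the feed only reads new rows (seamless)

    def feed_warehouse(mart_rows, source_name):
        """Copy new mart rows into the warehouse and write an audit entry."""
        global last_load_id
        new_rows = [r for r in mart_rows if r["load_id"] > last_load_id]
        warehouse.extend(new_rows)
        if new_rows:
            last_load_id = max(r["load_id"] for r in new_rows)
            audit_log.append({
                "source": source_name,
                "rows_loaded": len(new_rows),
                "loaded_at": datetime.now().isoformat(),
            })

    feed_warehouse(sales_mart, "sales_mart")   # run on a short schedule -> highly dynamic
    print(len(warehouse), audit_log)

Because the feed only reads from the mart and tracks its own high-water mark, the mart applications never notice it is there, while the warehouse accumulates a physical, auditable copy of the consolidated data.
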

There are myriad opportunities to reduce costs within the IT department. The challenge is to develop innovative solutions that meet the requirements of data governance, security and accuracy, and still provide business users with the data they need to drive the company forward.

InformationWeek explores the meaning of "innovation"

InformationWeek VP and Editor in Chief Rob Preston's recent article previews an intriguing new micro-site about the broader aspects of innovation within corporations, namely the "convergence of customer-focused strategies and global networks." Contributors include the University of Michigan's C.K. Prahalad and M.S. Krishnan, authors of the forthcoming book The New Age Of Innovation (McGraw-Hill), which I'm eager to read. We'll explore the implications of this movement for Finance Systems in upcoming posts, including something Bob Evans highlights from the book's introduction about the need for corporations to bridge the "significant gap between strategic intent and 'capacity to act'."