Wednesday, March 25, 2009

Closing Faster

This is the third in a series of posts about 2009 Finance System priorities, following discussions with Rob Kugel of Ventana Research and the executive team from Star Analytics. After reviewing Resolution #1: Focus attention on more strategic activities and less on transaction processing, and Resolution #2: Improve planning effectiveness, here's the next area Kugel recommends that you consider.

Resolution #3. Close faster

"Our research shows that most companies that take more than five business days to complete their accounting close believe they can and should reduce the time this takes."

He adds, "There are many business reasons, including being able to present performance information to people in the business as soon as possible and having more time to spend on preparing reports presenting results to shareholders and regulators."

What changes are needed? "We’ve found there are many small process improvements companies can make, but it’s equally important to have an ongoing focus on finding ways to shorten the process. Along with this, we also find that technology (too many spreadsheets) and data issues (it’s too difficult to combine data from multiple sources) can impede a faster close."

For more on this topic, check out Mark Smith's post on "Continuous Improvement in Finance" on BusinessWeek's Business Exchange, where he refers to the firm's "Fast Clean Close" research benchmark.

Stay tuned for the next resolution, laying the groundwork for effective performance management.

Monday, March 23, 2009

Information Week: Data access & integration hurdles

Better data access and integration tops the list of priorities in the new "2009 InformationWeek Analytics/Intelligent Enterprise.com Reader Priorities" report. Doug Henschen notes, "...many companies still struggle with the basics of information management. After all, you won't be worrying about reducing data latency and supporting faster decision-making if you're still stuck at the first-level challenge of accessing and integrating data." Read more

Sunday, March 22, 2009

Is Innovation Magic?

There seems to be a perception that if you introduce an innovation, or a new way to capture information that no one else has adopted yet, it must be magic. Hardly. The paradigm of "thinking outside the box" is about enabling a perspective that has been overlooked or underutilized. The simple concept of combining finance data with operational data should not be new, insightful or magical. It should be an everyday occurrence, and yet it is routinely described as unnatural or innovative.

In today's economy the only way a company can guarantee success is by careful planning. Of the three elements driving the financial aspect of the business, the Plan, the Budget and the Forecast, only one has to be in constant motion to achieve success with the other two. If you are constantly planning and adjusting your plan, your budget and forecast will follow, and you can adjust the budget and forecast in smaller increments than the full planning process requires. The key to enabling continuous planning is access to the financial data. That data is derived from the business and starts with a baseline plan. It needs to be continually fed with operational data so that it reflects the immediate health of the business. The Forecast is then adjusted, the Budget is revised to meet that Forecast, and the Plan iterates again.
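To make that loop concrete, here is a minimal sketch in Python of what one continuous planning cycle might look like. The numbers, the damping factor and the function names are hypothetical placeholders rather than a description of any particular planning tool; the point is simply that operational actuals feed the forecast continuously, and the budget is nudged toward the forecast in smaller increments than a full re-planning cycle.

    # Hypothetical sketch of a continuous planning cycle: actuals refresh the
    # forecast, and the budget is adjusted toward the forecast in small steps.

    def refresh_forecast(baseline_plan, operational_actuals):
        """Blend the baseline plan with the latest operational actuals."""
        forecast = {}
        for period, planned in baseline_plan.items():
            actual = operational_actuals.get(period)
            # Use actuals where the period has data; otherwise keep the plan.
            forecast[period] = actual if actual is not None else planned
        return forecast

    def adjust_budget(budget, forecast, damping=0.5):
        """Move the budget partway toward the forecast each cycle."""
        return {p: budget[p] + damping * (forecast[p] - budget[p]) for p in budget}

    baseline_plan = {"Q1": 100.0, "Q2": 110.0, "Q3": 120.0, "Q4": 130.0}
    budget = dict(baseline_plan)
    actuals = {"Q1": 92.0}  # operational data arriving during the year
    forecast = refresh_forecast(baseline_plan, actuals)
    budget = adjust_budget(budget, forecast)
    print(forecast["Q1"], budget["Q1"])  # 92.0 96.0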

Robert Kugel of Ventana Research sums up the path the data takes nicely in his March 13th blog post, CFOs Need Better Financial Information Management. I contend that it is not only the CFO who needs better financial data; the Office of the CFO must also deliver the aggregated financial data back down to the business to help it build profitable plans. This constant loop of data means that IT and Finance have to be in sync on the granularity of the data and its delivery, regardless of the tools in place. Today it is less about the end-user technology than about data access and data delivery. Bad data yields bad decisions. That's the obvious lesson here.

So if we tie this back to innovation and magic, the result is that no Merlin has appeared in this economy; we suffer from the simple problem of lacking access to the right data and information to make the right decisions in real time. And, more importantly, we have to bring together the education needed to see what the data is telling us, but that is left for another blog post. Here we advocate continuous planning. The next phase is continuous data education. Can we achieve success in the future by always assessing based on the events of the past? That is where innovation may help.

Thursday, March 19, 2009

Planning Your Assumptions

Probably every budget ever created was out of date before it was even completed, so the smarter finance innovators have adopted (or are at least moving towards) a continuous planning cycle, where real-time planning occurs in response to real-world events and new information, whether actual past results or better assumptions about the future.

So it's a welcome development that CFO Magazine is continuing its excellent webcast series with a CFO Master Class entitled The Evolution from Budgeting to Continuous Planning and Forecasting. In these turbulent economic times, dynamic planning seems like the only way to go.

With any budget, it's the assumptions you make that have the most impact on the accuracy and usefulness of your projections. That's why in my previous post I emphasized the critical nature of near real-time access to the data in your financial systems - not just to report against historical numbers, but to combine that data in a timely manner with operational data and see how current trends and external factors affect profitability - and so allow you to make better assumptions and, ultimately, plan more effectively and accurately.
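As a rough illustration of the kind of combination I mean, here is a small Python sketch that joins monthly financial actuals with an operational driver and lets the observed trend replace a stale assumption. The figures, sources and names (gl_revenue, ops_units, revenue per unit) are made up for the example and don't refer to any specific system.

    # Hypothetical example: combine monthly financial actuals with operational
    # volumes to refresh a revenue-per-unit planning assumption.

    gl_revenue = {"2009-01": 1200000, "2009-02": 1150000, "2009-03": 1050000}
    ops_units  = {"2009-01": 40000,   "2009-02": 39500,   "2009-03": 37000}

    # Join the two sources on month and derive the operational driver.
    revenue_per_unit = {m: gl_revenue[m] / ops_units[m] for m in sorted(gl_revenue)}

    # Let the recent observed trend replace a stale budget assumption.
    planned_revenue_per_unit = 31.0
    recent = list(revenue_per_unit.values())[-2:]
    observed = sum(recent) / len(recent)
    print(f"planned: {planned_revenue_per_unit:.2f}, observed: {observed:.2f}")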

Wednesday, March 18, 2009

Better Bottom Line Decisions

Right now, who doesn't want to improve their bottom line? To help with this, CFO Magazine and SAP are sponsoring a webcast on Improving Cost and Profitability Decisions in a Challenging Economic Environment. There are many well-defined methodologies, techniques and tools for measuring both costs and profitability, and everyone should be aware of them (and ideally applying them to their businesses).

However, one of the most critical issues, and one that is often overlooked, is the need to have access to the right data to understand the real cost of a decision and whether it's a profitable one (or not). In many companies this is viewed as an IT problem, but it's typically the Finance department that owns the critical information which allows the true measurement of both cost and profitability. In many companies that data is trapped in proprietary financial systems that aren't well integrated with the organization's operational data. Yet access to that financial information in real time (or near real time) and the ability to instantly analyze any operational decision using it can make the difference between a profitable decision and one which gets you into trouble - sometime in the future.
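As a hedged sketch of that point, consider a simple contribution-margin check for an operational decision, say accepting a rush order. The figures, the overhead rate and the expediting cost are invented for the example; the idea is that the decision only shows its true profitability once the allocated costs held in the finance system are pulled into the calculation.

    # Hypothetical illustration: an operational decision that looks profitable
    # on operational data alone can flip once finance-owned costs are included.

    def decision_margin(order_revenue, unit_cost, units,
                        overhead_rate=0.12, expedite_fee=0.0):
        """Contribution margin of an order, including allocated costs that are
        typically held in the financial system rather than operational ones."""
        direct_cost = unit_cost * units
        allocated_overhead = order_revenue * overhead_rate
        return order_revenue - direct_cost - allocated_overhead - expedite_fee

    # Without the expedited-freight charge that only shows up in the GL: +6,000.
    print(decision_margin(order_revenue=50000, unit_cost=38.0, units=1000))
    # With that finance-owned cost included, the same order loses 1,500.
    print(decision_margin(order_revenue=50000, unit_cost=38.0, units=1000,
                          expedite_fee=7500))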

True finance innovators know this, and the smart ones are investing in the technologies that allow them this kind of instantaneous access and seamless integration of their finance systems with their operational ones, to help them make better bottom line decisions.

Thursday, March 12, 2009

2009 Finance System Resolution #2: The Best Laid Plans

Here are more outtakes from my conversation with Rob Kugel of Ventana Research and the executive team from Star Analytics. The discussion focused on how the economic meltdown is impacting senior executives in corporate finance departments and how technology can help.

If you're making headway on our previous post about Resolution #1: Focus attention on more strategic activities and less on transaction processing, then this next step may come naturally.

Resolution #2. Improve planning effectiveness.

Kugel points out: "Few companies have achieved a high level of maturity in their planning processes. While some point to the need to reduce the time spent on planning and budgeting, companies need to make their planning and budgeting more effective, not just more efficient. They have to use planning to gain better insight into their performance, achieve greater forecasting accuracy and improve the alignment of strategy and budgets across and within business units. The single biggest factor hindering more effective planning is the use of desktop spreadsheets to drive the process. Dedicated planning applications make it possible to do more effective planning and, in a period of high business volatility, enable companies to revise their plans more rapidly."

Got an example of where this is working well? Let us know.

Monday, March 9, 2009

Gartner BI Summit 2009 Day 2

Since I am attending the Performance Management track, the conference has a decidedly PM feel for me. I can't judge the total audience except from the perspective of keynote attendance. The group is shy. During the "Comparing the Megavendors" session, Bill Gassman tried to solicit feedback from the audience (a packed house), but attendees were reluctant to share their experiences and opinions. More participation might have made for livelier discussion. I think it is important to sound out ideas and get others' opinions and perspectives to open up avenues to innovation.

At last count there were 730+ attendees at the Gartner BI Conference. Of those, more than 30% were from government and the public sector, and another 20+% were vendors. That seems in line with expectations for curtailed travel budgets: vendors are hoping somebody may still have a budget, while the government gets a better rate than commercial attendees and its fiscal year started before the severe downturn. As for the 18% from financial services, it isn't the banks but the insurance companies. No one seems to be buying, though. A less-than-scientific survey of the attendees turned up a great number of people who are trying to figure out whether their deployment is keeping pace with their peers' and what they can look forward to NEXT year - 2010.

I would have liked to attend the different vendor presentations, but they all ran at the same time, which limited my options. IBM (as another attendee commented) showed their traditional positioning presentation, with more rhetoric than content. From what I could see, Oracle seems to fit that mold as well. Microsoft was confused about their BI offering after the elimination of PerformancePoint, and Gartner discounted them in their vendor discussions. I was surprised that Gartner was least supportive (fewer positive ratings) of SAP. They seemed most favorable towards IBM/Cognos and resigned to Oracle as the big dog. In advance of the "Cool Vendors" report they also mentioned Adaptive Planning and Host Analytics, but left off Birst and PivotLink. SaaS, hosting and open source were mentioned, but not with the exuberance I might have expected. The audience seemed to prefer a staid, perfunctory BI and PM implementation over participating in any kind of dynamic discussion of new technologies and methods. The environment was far more toned down than even last year. Is it only the economy, or did the venue also change the chemistry of the audience and participants?

Sunday, March 8, 2009

Thoughts from the Gartner BI Summit

Only marginally into the BI Summit, I have already had the realization that the Performance Management sessions will be under-attended. Business Intelligence is the domain of IT, but Performance Management is in the hands of the business. Corporate Performance Management, the term Gartner has chosen for the financial applications that are the mainstream of corporate America (i.e., planning, budgeting, forecasting, consolidation and financial reporting), is of little interest to the mainly IT audience here in DC.

An interesting observation: applications that are tailored to the business but have prescribed functions fall within the domain of the business, while the tools used for reporting and dashboarding are in the hands of IT. Business Process Management, by that definition, should be in the hands of the business; but it is not.

So the conundrum is where the attention is focused. If there is extensive ongoing development or a relational footprint to the applications or tools, they come under the dominion of IT. If the majority of the deployment effort appears to be in the installation and initial implementation, then they are in the business's hands. That doesn't make sense to me.

Truly the answer should be a collaborative effort, with someone technical assigned to help the business craft the solution that best meets its strategic needs. Finance IT was born for exactly this reason. The role bridges the gap between the implementation of the application and its eventual use and refinement, AND the future sculpting of the application to keep it in line with the needs of the business. This is the true role of innovation, where a blend of fact and function meets the evolving needs of a dynamic business.

Somehow that's lost here at the Gartner event. But as I said, we are only half a day into it - maybe it will become more on-topic and dynamic. I should point out that I think Gartner does see the evolution towards a Performance Management-driven business culture, but they also say we are two (yikes!) years away from it.