Thursday, November 24, 2011

Are we there… Yet?

There is nothing like the piercing sound of a little voice breaking the silence only to ask how far we are from the destination. On a trip, the approach to a physical destination is marked with signs; this may not be the case when anticipating change without progress markers. More often than not, change, whether personal or corporate, does not occur in isolation. Rather, it is a cog in a highly complex environment that is constantly undergoing transformation.



As the global economy continues its daily oscillation between financials bearing red or black, organizations continue to search for the silver bullet: that one elusive ‘magic’ fix that will catapult them out of the current economic turmoil. One needn’t look far to experience the explosion of seminars, webinars, articles and the like meant to help you reach a better tomorrow. What is so amazing is that many of them come at the right price – FREE. This pricing model subversively attacks the very definition of value.



Several years ago I wrote on the concept of a market continuum, where organizations in a similar market niche are placed along a ‘success continuum’ based on their propensity to adopt change. At that time I argued that their placement on the continuum is the result of organizational culture and its related risk sensitivity. Although I believe this continues to be the case, I now believe that market forces have exposed more elements of this phenomenon. To get to these other forces, we need to begin with an understanding of organizational change.



The only thing that is constant in the world is change, and this is especially true at the corporate level. Organizations are under constant bombardment from economic and social forces; continuing along the status quo is the road to extinction. Organizations must respond to their environment by making alterations [change] in order to hold their place along the market continuum. It is the progressive organization that is able to leap out and upset the market continuum as a means of gaining competitive advantage.



As an attendee at a recent symposium on change, I listened intently for something I hadn’t heard before – me and 500 other attendees. What was spoken was no different from what any first-year MBA student would glean from a textbook. Yet after hearing it and reading it 100+ times, this time something was different. The speaker espoused an acronym that he held to be the pivotal point of long-lasting change – VIM.



Long-lasting change is built on VIM: Visualization, Intention and Methodology. Although very simple in vernacular, the essence is very deep. For successful change to occur, the change agent must be able to ‘see’ what the end result of the change will be – somewhat like arriving at the planned destination of a long trip. Without this visualization, any end result will do. In the corporate environment, this visualization must be in the hearts and minds of all. If there are naysayers on the team, their negativity will be a force that derails the entire plan.



The second major facet of lasting change is Intention. Intention is the motivation that keeps the effort moving along the road toward the visualization. It is the determination to achieve the end goals; this, however, is where the greater elements of a leader’s ideals become apparent. The last and probably most important factor in change is Methodology. Without a clear plan for orchestrating the steps to long-lasting change, all other elements of VIM are rendered useless.



The important thing to recognize is that these elements do not operate in sterile isolation. There are, I feel, two major forces at work in adopting change: the organizational stimulus to the market and the market’s response. The one thing that is rarely considered during change is the human element, those inner psychological workings that impact the very foundation of change. This element is probably not addressed because one could assume that talent and human traits are evenly distributed in the marketplace, or possibly because authors are attempting to extricate the human element and profess only the academic side of change.



In his recent article 'Creating a Master Plan', Mark Wadell professes that the road to reaching an objective begins with an examination of the present: the famous S.W.O.T. analysis. For the MBA student, S.W.O.T. analysis becomes second nature. It is easy to examine a sterile organization on paper and work out its position on the market continuum. But how do the challenges of present analysis and ultimate change become distorted when the conductor is not objectively looking in from the outside? Once inside the organization, all leader actions are veiled by an inner persona which can be riddled with neurosis and even psychosis. Couple the myopic leader with a subordinate staff who, because of the economy, are fearful of being made redundant, and the possibility of distorted groupthink will run rampant.



The question remains: how are organizational goals to be achieved when there are so many forces acting on the organization? Should there be greater internal controls to mitigate the possibility of oppression by the narcissistic leader? Conceivably, a model of greater internal controls meant to nullify the flurry of psycho-oppression may, in fact, stifle the brilliant eccentric who has the ability to catapult the organization beyond its current abilities. Alternatively, the organization could be dealing with the myopic narcissist who is taking the organization into an abyss.



Probably the best way to identify, and thereby react to, the organization’s leader comes from the culinary adage "The proof of the pudding is in the eating". The quality of an organizational leader is only discerned by the results of their actions. All leaders will indelibly change an organization, whether through tremendous innovation or sanguine financials.



Saturday, June 04, 2011

Value – The Measure of Success

The last decade has demonstrated that no individual is immune to economic decimation. The Great Recession, as it has been termed by many scholars, could be as devastating as the Great Depression of the 1930s. It is undeniable that, almost overnight, shareholder value vanished. As we look in the economic rearview mirror, many contend that this economic pulse was the direct result of a failure to manage risk. The result: a cascade of events that obliterated value as it made its way across the globe.

The cause of the Great Recession will probably be debated for years to come as economies continue to rebuild. What did we learn from all of this? The lesson learned will be in direct alignment with how the economic dynamic of the day touched our lives. If the impact was nothing more than a graze, like that of a bullet, then we will continue along unaware of the breadth and depth of the economic devastation. Alternatively, the suffering of economic hemorrhaging will indelibly alter our behavior for years to come. These findings are beginning to pepper economic and psychology journals.

One article that caused me to stop and take note was Shareholder-Value Based Auditing by Kevin Shen. Shen argues that the casualties of the Great Recession could be traced back to management’s failure to manage risk. Shen postulates that auditors may be too focused on risk and not enough on value, or rather on the origins of value. Shen contends that shareholder value can be traced back to lines of activities, and that the audit process should focus on the risk of those lines of activities.

Shen draws on the derivation of shareholder value, along with the Gordon Constant Growth model and the Capital Asset Pricing Model, to derive a model that quantifies value. The model defines value (V) as earnings/profits (E) divided by the difference between the cost of capital (K) and growth (G).

V = E/(K-G)

Shen continues to refine the model from the corporate level down to the individual activity level and contends that the audit should examine these value-contributing elements to be effective in preserving shareholder value.
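To make the arithmetic concrete, here is a minimal Python sketch of the formula. The corporate figures and the idea of summing value across hypothetical lines of activity are my own illustrative assumptions, not numbers or a decomposition taken from Shen’s article.

```python
# Minimal sketch of the value formula V = E / (K - G).
def value(earnings, cost_of_capital, growth):
    """Value V = E / (K - G); the model only holds when K > G."""
    if cost_of_capital <= growth:
        raise ValueError("cost of capital must exceed growth")
    return earnings / (cost_of_capital - growth)

# Corporate-level example (hypothetical): $2.0M earnings, 10% cost of capital, 3% growth.
print(round(value(2_000_000, 0.10, 0.03)))  # -> 28571429

# Hypothetical activity-level refinement: value each line of activity, then sum.
activities = {
    "audit":      {"E": 800_000, "K": 0.10, "G": 0.02},
    "tax":        {"E": 700_000, "K": 0.10, "G": 0.04},
    "consulting": {"E": 500_000, "K": 0.12, "G": 0.05},
}
total = sum(value(a["E"], a["K"], a["G"]) for a in activities.values())
print(round(total))  # -> 28809524
```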

It is undeniable that the souvenir of the Great Recession is an understanding of, and a desire to preserve, value. The thought that immediately came to mind was how value is identified in organizations that have a completely different dynamic, such as governments and non-profits. One could argue that they don’t have value and are simply an anomaly not to be scrutinized.

I believe Shen’s model holds merit for for-profit entities and also for not-for-profit (NFP) entities. The transition of the model, however, requires an unpacking of the terminology. Shen proposes the basis of the model as the value accruing to shareholders. But who are the shareholders of a nonprofit; who holds an investment in it? I feel that the shareholders, or better the stakeholders, of an NFP or government are the community it serves. The community accrues value by virtue of the NFP, whether by feeding the homeless, aiding the needy or providing safety for the persecuted. This value can be quantified by the easing of the burden on the community’s services.

This value is derived by extrapolating from Shen’s model. Earnings (E) cannot be managed in the usual sense, as these organizations must expend all of their grants/donations within a specific time frame. However, the earnings component could instead be the savings derived from reduced demand on community and emergency services. Possibly a second tier of earnings could be the economic value derived from individuals who have been retrained to become economic contributors in society.

To an NFP or a government body there really is no cost of capital. Funds are the result of grants, donations or taxation. However, these funds do come with a cost: grants and donations require considerable effort by the agency to attract. These costs can be hard, soft and opportunity costs in nature. Once quantified, they can be applied to the Shen model.

The last and possibly most difficult element to quantify is growth. Shareholder growth is easily quantifiable, but how does one quantify the growth of an NFP or government? Maybe growth is not the correct term, as it conjures up a heavily bureaucratic organization that achieves only volumes of red tape. Maybe growth represents the change in the social problem the NFP is meant to address. By way of an example, suppose an NFP has undertaken to serve runaway teenagers in a community. The figure for growth would represent the year-over-year percentage change in runaway teenagers in the community. As that percentage increases, the value of the organization addressing the issue increases. Likewise, as the NFP increases its service and lessens the injustice, its value decreases.
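As a rough illustration of how the adapted model might be computed, here is a sketch using the runaway-teenager example. All figures, and the mapping of E, K and G to community savings, fundraising cost and problem growth, are my own assumptions layered onto Shen’s formula.

```python
# A hedged adaptation of V = E / (K - G) to a not-for-profit. Figures are hypothetical.
def nfp_value(community_savings, fundraising_cost_rate, problem_growth_rate):
    """Value accruing to the community, by analogy with Shen's model.

    community_savings     -- E: annual savings in community/emergency services
    fundraising_cost_rate -- K: hard, soft and opportunity costs of attracting funds
    problem_growth_rate   -- G: year-over-year change in the social problem
    """
    if fundraising_cost_rate <= problem_growth_rate:
        raise ValueError("model breaks down when the problem grows faster than K")
    return community_savings / (fundraising_cost_rate - problem_growth_rate)

# Problem growing at 4% a year: the agency addressing it is 'worth' more.
print(round(nfp_value(250_000, 0.09, 0.04)))   # -> 5000000
# Problem shrinking by 2% a year: the agency's value falls, as argued above.
print(round(nfp_value(250_000, 0.09, -0.02)))  # -> 2272727
```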

It is undeniable that value has become the new measure of success. Value must be a yardstick to quantify the contribution of every entity.

Wednesday, April 06, 2011

That’s not Fair!

Never was there a more common phrase barked amongst children at the inequities of a situation. Interestingly, the phrase never dies; as adults, we continue to comment on the inequities of life. Not surprisingly, the very basis of the assessment isn’t the result of looking at those around us, but rather at those who we feel have somehow benefited more than we have. The yardstick of fairness is purely external. The most interesting thing about speaking up about inequities is that it tends to surround our ‘lesser’ position compared to others. Should we possess a greater allotment than another, we are never the first to cry out ‘That’s not fair’. Granted, there are those who have compassion for the less fortunate; however, the cry of fairness is deeper and more heartfelt when our own ‘account’ is under review. I would love to meet the executive who receives a sizeable compensation increase, far in excess of her colleagues, and expeditiously approaches the executive committee about the inequity of her outrageous compensation.

The establishing of fairness, however, takes a different tone as one moves from the school yard to the functioning adult of society. Inequities of the school yard are managed through the primal instincts of ‘fight or flight’. The disparity is either resolved through force or the oppressed slinks away. As adults, we haven’t evolved beyond ‘fight or flight’. The passionate minority will take up signs, pump up adrenaline and hit the streets. Then, often, their passion for the oppressed runs right into the face of TV cameras and police night sticks. Conversely, the majority complain to everyone they meet about the inequities in the world and how they are hard done by. Too many adults are passive complainers who thrive on perceived injustices simply to have something to talk about. No change has ever come about from profuse complaints to a self-absorbed, insular audience. Change in any forum requires commitment to the cause and using the established channels to achieve objectives by non-radical means. History has been indelibly altered by such greats as Nelson Mandela, Martin Luther King, and Abraham Lincoln, who identified disparities and focused on bringing about change.

On March 25 news hit the web that General Electric, America’s largest company, had no tax liability for 2010 and received a $3.2B rebate from the federal government. This news took on a life of its own, becoming almost viral. All of a sudden there arose a clamor of voices about how ‘unfair’ this is. Even one media giant openly condemned GE for their ‘use’ of the tax system, only later to be exposed as having taken advantage of the same tax provisions. What these pundits fail to realize is that GE and other huge organizations have done nothing wrong. They have chosen not to seek out inequities but rather to focus on self-preservation. They, like every taxpayer, have access to the same Income Tax Act. They, like every taxpayer, have access to a marketplace of tax professionals. Last time I checked, there is nothing that precludes anyone from exercising their rights under income tax law.

During my tenure in public accounting, and even to this day, I listen to complaints about how the tax system isn’t fair. My only retort: if you don’t like it, vote to have it changed. Ironically, a well-known Canadian tax lawyer, Vern Krishna, C.M., QC, LL.D, FCGA, in his article Tax Simplification is Imperative, recently wrote how tax simplification is an economic, political and moral imperative.
I believe we have come down a long and winding road, from the enactment of a statute to fund vital initiatives to a monolithic and enormously complex law. We should turn our attention away from those, like GE, who have enlisted the expertise of those who have assisted in their commendable tax position. We need to focus on two real issues at hand: those who, in retaliation or otherwise, have taken the ‘flight’ response and are thereby costing tax authorities hundreds of millions of dollars per year in efforts to bring them to justice; and the need to take up our pens to write letters, to cast our ballots and to build coalitions to enact long-term ‘fairness’!

Tuesday, February 15, 2011

Cost of Accuracy

Organizations, for the most part, tend to fall into one of two analytical camps. There is the small, non-public organization which undertakes the drudgery of keeping a set of books only to appease the laws of the land. The extent of their financial analysis is determined only by whether there is enough cash in the bank to make payroll. In the other camp are the public and government-associated organizations that are steeped in financial analysis. These organizations live and breathe on the slicing and dicing of analytics.

After spending considerable time assisting organizations in getting a handle on profitability, I feel a certain degree of financial analysis is paramount to a successful organization. For the small organization, an exercise in financial analysis will direct the decision makers to those critical areas where change can be made. Interestingly, “what isn’t reported isn’t important”. I learned this mantra many years ago when assisting a professional services firm to get a handle on its profitability. Through very simple analytics it became blazingly clear which lines of business were profitable and which were a drain.

In the other camp, I feel that financial analysis has hit the zenith of analysis paralysis. It is almost as if the analysis is undertaken for the sake of analysis. In one organization where I had exposure, the level of analysis went down to 1/1000 of a penny! One has to stop and wonder where the sense in this is. With their technology, reports were run and the data keyed and re-keyed into spreadsheets and databases to achieve such granularity. I have to wonder about the cost behind this degree of accuracy. And with two opportunities competing for resources, is one selected over the other because it was more profitable by more than 1/1000 of a penny?

One of the most intriguing aspects of this degree of analysis is the allocation of costs. I have found that so many organizations fall into the trap of ‘what was done before must continue’. It is almost like paying homage to the author of the analytical model. These firms often never question why they undertake their processes but perpetuate the same logic decade after decade. Although I don’t profess to be a cost accountant, I do feel that cost allocation should be based on a reasonable, natural dynamic rather than a theoretical model which often cannot be justified.

In reviewing the literature on overhead cost allocation, I have found that probably no subject in all of managerial accounting carries as much controversy as direct costing. The theoretical argument over overhead allocation is between direct costing and absorption costing. Advocates of direct costing argue that fixed overhead costs relate to the capacity to produce rather than to actual production, while the absorption costing school argues that each segment must bear a portion of all costs associated with its production. So, who is right?
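A small, hypothetical example may make the difference between the two schools concrete; the figures below are invented purely for illustration.

```python
# Hypothetical single-product example contrasting direct and absorption costing.
units_produced = 10_000
units_sold     = 8_000
variable_cost  = 12.00      # per unit: materials, labour, variable overhead
fixed_overhead = 40_000.00  # total fixed manufacturing overhead for the period

# Direct costing: fixed overhead is a period cost; the unit cost is variable only.
direct_unit_cost = variable_cost

# Absorption costing: each unit produced absorbs a share of the fixed overhead.
absorption_unit_cost = variable_cost + fixed_overhead / units_produced

print(direct_unit_cost)      # -> 12.0
print(absorption_unit_cost)  # -> 16.0

# The period-profit difference equals the fixed overhead deferred in the
# 2,000 unsold units under absorption costing.
deferred = (units_produced - units_sold) * (fixed_overhead / units_produced)
print(deferred)              # -> 8000.0
```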

In reality, the debate often comes down to square footage of use in the production segment. I prefer to think of the production segment as either a manufacturing area where specific products are manufactured, or an office setting where specific types of work are undertaken. By way of an example, an office segment in a professional services firm would be the area that houses the audit, tax and accounting teams. The allocation model takes on a whole new tone when teams argue about office size and the allocation of common areas. Of course this is reasonable, as each segment seeks to mitigate its cost of production and thereby justify its existence on the basis of profitability.

What happens to this model if the number of large offices is exceeded by the number of small offices? Are segments unfairly discriminated against because of the size of their office allotment and their related share of common areas? Organizations can’t, logically, squelch this argument by undertaking leasehold improvements simply to impose some type of ‘fairness’ on the allocation method through drywall and paint.

Recently I had such a debate with a very progressive management accountant on this topic. As we dialogued about the inherent flaws in the square footage model, it became clear that a natural dynamic was at work. The conclusion we came to was based on reality: what if segment costs were not allocated by square footage, but rather by the number of FTEs in the segment? As the number of FTEs grows, so will the segment’s space requirements and, hopefully, its profitability, since its growth would be driven by basic market demand. With this model, the overhead cost allocation per segment seems more objective.
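A quick sketch, with hypothetical segments and an invented overhead pool, shows how the two allocation bases can produce quite different answers.

```python
# Hypothetical comparison of allocating one overhead pool by square footage
# versus by FTE count.
overhead_pool = 300_000.00

segments = {
    # name:        (square feet, FTEs)
    "audit":       (4_000, 30),
    "tax":         (3_000, 10),
    "accounting":  (3_000, 20),
}

total_sqft = sum(sqft for sqft, _ in segments.values())
total_ftes = sum(ftes for _, ftes in segments.values())

for name, (sqft, ftes) in segments.items():
    by_sqft = overhead_pool * sqft / total_sqft
    by_fte  = overhead_pool * ftes / total_ftes
    print(f"{name:<11} by sq ft: {by_sqft:>10,.0f}   by FTE: {by_fte:>10,.0f}")

# audit       by sq ft:    120,000   by FTE:    150,000
# tax         by sq ft:     90,000   by FTE:     50,000
# accounting  by sq ft:     90,000   by FTE:    100,000
```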

Taking this dilemma a bit further, what happens if funding for a segment originates from two very diverse sources; how should costs be allocated then? The segment produces indistinguishable outputs but the revenue stream comes from two very different sources. In this situation the square footage model is blown out of the water, as is the FTE model. Since separate funding sources dictate that costs be allocated, one must find, I feel, a natural basis of allocation. Recently I was faced with this very situation. Through careful examination of the segment I found the natural differentiator; it was a unique characteristic of the users of the segment. As it turned out, some users had a characteristic that could be traced back to one funding source, and other users to the other. Once this model was applied, cost and revenue recognition became perfectly clear.
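Here is a minimal sketch of that idea. The user ‘characteristic’, the funder names and the cost figure are all hypothetical stand-ins, since the actual differentiator in my situation was specific to that segment.

```python
# Hypothetical sketch of allocating a segment's costs by a user characteristic
# that ties each user to exactly one funding source.
segment_costs = 120_000.00

# Invented user records: 'characteristic' maps each user to a funder.
users = [
    {"id": 1, "characteristic": "program_a"},
    {"id": 2, "characteristic": "program_a"},
    {"id": 3, "characteristic": "program_b"},
    {"id": 4, "characteristic": "program_a"},
    {"id": 5, "characteristic": "program_b"},
]

funder_of = {"program_a": "Grant X", "program_b": "Donor Y"}

# Count users per funding source and allocate the segment's costs in proportion.
counts = {}
for user in users:
    funder = funder_of[user["characteristic"]]
    counts[funder] = counts.get(funder, 0) + 1

for funder, n in counts.items():
    share = segment_costs * n / len(users)
    print(f"{funder}: {share:,.0f}")   # Grant X: 72,000 / Donor Y: 48,000
```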

I believe that organizations need to closely examine their activities to ensure that each segment is adequately contributing to the overall health of the organization. The onus rests with the analyst to exercise reasonableness in overhead cost allocation and in the level of granularity to be managed. More often than not, the cost of extreme granularity outweighs the value of the analysis. Moreover, recognizing the natural dynamic of the segment will lead the analyst to the true differentiator of cost. Relying on a historical model or textbook musings will not lead to valuable information, but rather to a situation where the baby slips down the drain, following the bathwater!