Peer Company Sponsored Benchmarking- What to Look Out For

In many industries, there exist companies who sponsor benchmarking and other types of “data sharing consortiums”. In the Utility Industry, within which we do a significant amount of work, there is a growing plethora of organizations who want you to exchange data openly with them rather than with consultants or other facilitator-brokered services. PSE&G Peer Panels, Southern Company, and others are among the top “draws” for this kind of information sharing among utility organizations. Other industries have similar types of players.

It is not my intention to label these programs “good” or “bad”. In some cases they serve a valuable purpose for many of the participants. But you should also know that these programs are laced with risks- risks that must be understood if you are to manage your involvement proactively.

At the core, you need to ask yourself WHY these companies would offer such programs, other than to gain valuable competitive intelligence, potential acquisition analysis, or other covert reasons. If you’re concerned about their motive, don’t join.

But first, you need to get past the “lure” of these programs, as they all are VERY good at drawing companies in with the following arguments:

- They say the reason they manage these programs is to offer a public service to the industry. Sounds all too altruistic to me. See below- there is no free lunch….

- They say these programs are free. News Alert- they aren’t! Someone always pays. If it’s the shareholder that pays, then 9 times out of 10 they are looking for competitive intelligence. If it’s the ratepayer who pays, then these companies are about to have bigger problems on their hands, as there are few ratepayers who would support giving their money away to ratepayers in other jurisdictions. If you disagree, find me a few who would. Is there even one out there?

- They will also tell you that they provide these programs to “protect you” from the BIG BAD consultants. Well, here’s another News Flash: there are TOO MANY alternatives out there for you to sell yourself out to your peer companies (future competitors). If you want to stay away from consultant sponsored initiatives, look to some of the other alternatives before you decide on the peer company initiatives, as the latter are a bit too risky in the long run.

- They market to “manager” rather than “executive” level individuals. Why? Because mid level managers are more apt to share data without worrying about competitive concerns. Career advancement, workshops in cool locations, and networking are among the biggest drivers for these managers. Executive staff have many more concerns about confidentiality and the value of protecting strategic data and insights, and are often in a much better position to judge when and how to make such tradeoff decisions.

- They will tell you that they are the only option if you want to have “lots of participants that look like you”. True, these programs are good draws. Also true that these companies look a lot like each other- same industry, same region, similar regulatory environments, similar management practices. But is this necessarily an advantage? Perhaps the biggest commonality between these companies is that their data sharing protocols may be a little “too loose”. It’s also worth pointing out that a group of 20 or so in a sector like Utilities is still a small fraction of the industry. If you total up all utility companies worldwide (and despite conventional wisdom, there IS a lot to be learned from offshore peers!), there are literally thousands. Once you take into account that many of their members are big holding companies with 4 or 5 subsidiaries, you’re left with maybe 2% of the industry. Hardly a quorum!

Once you get past the “lure” of these programs, you can then begin to filter out the good from the bad, or at least identify the ones where the risk/reward profile leaves a lot to be desired. Are there good programs out there? You bet. But it’s your job to evaluate your benchmarking partner on each of these factors. It’s also important to have a good “rules of the road” checklist (see past post “Rules of the Road”) to use for every invitation you receive relating to data sharing. Without this, you put your company, and yourself, at risk.

Benchmarking is a fact of life in best performing organizations. Withdrawing from the game of data sharing is NOT the answer. Managing the process proactively IS.

-b

Author: Bob Champagne is Managing Partner of onVector Consulting Group, a privately held international management consulting organization specializing in the design and deployment of Performance Management tools, systems, and solutions. Bob has over 25 years of Performance Management experience and has consulted with hundreds of companies across numerous industries and geographies. Bob can be contacted at bob.champagne@onvectorconsulting.com


Data Sharing “Rules of the Road”

Over the past several years, my partners and I have spent considerable time and effort applying our performance management technologies in the Energy and Utility Sectors, among others. One of the unique things about that sector is the “extremely open” nature with which companies share performance information and best practices with each other. Perhaps a little TOO OPEN? When was the last time you heard of Procter & Gamble sharing information with its closest competitors THAT openly?

Am I saying that you should stop sharing information all together? Far from it. What I am saying though, is to TAKE SPECIAL CARE when doing so. You can instill a learning culture and share peer to peer information, but don’t do it without taking the right precautions. As much of a paradox as it is, you can BET that P&G is one of the best benchmarkers and learning organizations around. They are just very deliberate and careful about how they do it.

Here are some tips that will help you “manage” the information as you go about your benchmarking and best practice acquisition process.

a)- Don’t even THINK about sharing information in an UNBLINDED fashion. This is cardinal sin #1. The ONLY reason someone would want you to do that is to be able to strip it down and glean information for competitive gain. If the information is blinded, the consultant won’t be able to target you for that lucrative project unless YOU want him to. And that overly philanthropic peer company won’t be able to use the data for competitive positioning. Only you will be in control of your data assuming it stays masked. Peer companies will often tell you that having the data unblinded is necessary in order to maximize value from the program. Hogwash. There are TOO MANY ways to foster learning with blinded data that I have discussed in previous posts for you to give in to that kind of BS. Either share information in BLINDED FASHION or NOT AT ALL.
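To make the blinding discipline concrete, here is a minimal sketch in Python of how a record might be masked before it ever leaves your company. The field names and key are hypothetical, for illustration only; the point is that the secret key stays with YOU, so only you can ever re-identify your own data.

```python
import hashlib
import hmac

# Hypothetical secret key, known only to your own company.
SECRET_KEY = b"known-only-to-our-company"

def blind_record(record: dict, key: bytes = SECRET_KEY) -> dict:
    """Replace identifying fields with an opaque code derived from a secret key."""
    identity = record["company"]
    code = hmac.new(key, identity.encode(), hashlib.sha256).hexdigest()[:8]
    # Keep only the performance metrics; drop anything that identifies you.
    blinded = {k: v for k, v in record.items()
               if k not in ("company", "region", "contact")}
    blinded["participant_code"] = code
    return blinded

raw = {"company": "Acme Utility", "region": "Northeast",
       "contact": "jane@acme.example", "cost_per_customer": 412.50}
masked = blind_record(raw)
# The facilitator sees metrics and an opaque code -- nothing that names you.
```

Because the code is derived from a key you control, your own submissions stay consistent from survey to survey, yet nobody else can reverse the mapping without the key.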

b)- Don’t begin without clear Executive approval. Make sure they know EXACTLY how the program will work, make sure you know what the PROTOCOLS for that sharing are at your company, and follow that process diligently. Trust me, your executives are the only ones with a broad enough purview to make good tradeoffs between information sharing and shareholder value. A second, related concern is decentralized sharing that is not managed centrally. True story: a member of one of these “peer company” sponsored initiatives told me that despite an “ironclad” confidentiality agreement, and the decision to mask data, the FIRST thing that occurred at their results meeting was an exchange of identity codes! No kidding. Do I believe this was endorsed behavior? Absolutely not. It was, however, a direct product of information sharing being too decentralized- to the point that the employees sharing the codes had lost touch with their corporate policies on information sharing.

c)- Demand Confidentiality / Non-Disclosure agreements. I’m not talking about these little “we won’t tell anyone if you don’t” type statements. I’m talking about agreements that will hold up legally. Be clear about when and how the information can be used. For example: our data cannot be used outside of x, y, or z departments for purposes other than a, b, and c. If the information is used for purpose d (e.g. acquisition analysis, competitive targeting, etc.), be clear about the penalty and how it will be enforced (who will enforce it, in what jurisdiction, etc.). Also, the less “blinded” the information is, the more complex the non-disclosure agreement must be. The first line of defense is in what you share and what the peer company can glean from the data. The non-disclosure is your SECOND line of defense. Once they’re in, they’re hard to stop.

d)- Avoid consultant and peer company sponsored initiatives. Then who do I use? There are many sources and technologies out there, from published studies to internet driven benchmarking tools and services ( http://www.benchmarkcommunities.com/ , for example). The key here is to avoid anyone who doesn’t have benchmark facilitation as their CORE business. The farther their core business is from the facilitation of these programs, the farther you should stay away from them. And by the way, there are consultants whose sole business is benchmarking. In terms of benchmarking, these are the “good guys”. Remember though, these companies are the exception rather than the rule as far as consultants go. So be on guard.

e)- Make sure your partner’s data management process is bulletproof. Even with an ironclad NDA in place, a poorly configured process can be as dangerous as not having an NDA at all. Look for assurances from your vendors or partners that their process is secure. Ask to see their process. Was it audited or tested for compliance? Is it ISO certified in these domains? Is the data transfer technology and platform secure?

There you have it. A nice checklist to go through each time someone invites you into a data sharing environment. Data sharing can be a very rewarding game if it is played right. But you need to be both cautious and prudent about the process. It’s kind of analogous to “let the buyer beware”, only we’re dealing with bigger companies, more shareholders, and bigger stakes!

-b



The Benchmarking Technologies You DON’T See

Not too long ago, benchmarking anything was a major undertaking. Consultants and/or host companies would take us down a long and painful road of data collection, validation, workshops, and reports, all of which would ultimately lead to voluminous reports that probably still serve as “ornaments” on your bookshelf. Clearly, a painful process that often led to frustrated participants and meaningless information.

Well, now that we have all these great new technologies to help us with benchmarking, those times are gone, right? Wrong. Despite numerous advances in both approach and technology, benchmarking still leaves a lot to be desired. Why?

Not unlike many of our business processes, benchmarking suffers from our tendency to focus on activities rather than business outcomes. We spend lots of time and money automating legacy processes, instead of truly rethinking how our process SHOULD work. The same is true with technologies that should, but don’t, enable us to do better benchmarking. There is clearly no shortage of benchmarking technologies that companies can use to acquire and report benchmarking data. But are these systems actually producing valuable information?

In my humble but vocal opinion, today’s benchmarking technologies suffer from major weaknesses in EVERY stage of the benchmarking process:

1. Survey Design and Administration– most consultants and facilitators have put ALL their emphasis on the survey process. Poor prioritization, in my view. Sure, the survey is the most obvious process to try and automate. Since most data collection started as a manual process, it would only make sense to try and streamline it. What’s wrong with that view? Nothing really, except for the fact that most data collection PROCESSES lack both the strength and sophistication that are key in a good benchmarking program. So what we end up doing, in essence, is automating a pretty crappy process.

For example, the internet now allows us to collect data online. Big Whup!!! If that’s all that you expect of your technology, then you’ve got bigger problems with your process. Data collection is the foundation of your whole benchmarking program. Many things are (or at least should be) accomplished in the data collection process. Putting data on a survey is only one of them.

Distributed data entry, error checking, aggregation, internal vetting, boundary testing (against specific definitions), external validation, and range checking are among the many other functions performed at this stage in the process. If the OUTCOME of data collection is QUALITY information, collected with the LEAST PAIN, in the FASTEST CYCLE TIME possible- then simply automating your old Excel spreadsheets has done nothing but administer your current process via the internet. It’s just another route to the same old destination. Nothing more.

Technology today allows us to do SO MUCH more. Good benchmarking applications will address ALL of the components of your data gathering process, automate many of them, and turn your process into something that is better, faster, cheaper, and less painful than your existing one. That’s the true test for any benchmarking technology.
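The point-of-entry checks mentioned above can be sketched very simply. Here is a minimal Python illustration of range and boundary checking applied BEFORE a submission ever reaches the facilitator; the field names and limits are hypothetical, invented for the example.

```python
# Hypothetical validation rules, keyed by survey field name.
RULES = {
    "headcount":         {"min": 1,   "max": 50000},
    "cost_per_customer": {"min": 0.0, "max": 5000.0},
}

def validate(submission: dict) -> list:
    """Return a list of error messages; an empty list means the data passes."""
    errors = []
    for field, rule in RULES.items():
        if field not in submission:
            errors.append(f"{field}: missing")
            continue
        value = submission[field]
        if not rule["min"] <= value <= rule["max"]:
            errors.append(f"{field}: {value} outside [{rule['min']}, {rule['max']}]")
    return errors
```

Run at the point of entry, a check like this catches misinterpreted definitions and out-of-range values immediately, instead of weeks later during the facilitator’s aggregation pass.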

2. Results Presentation– This one is very connected with data collection. If your technology addresses all of the major aspects of data gathering, then you should only be one or two quick steps away from seeing your results. So why does your facilitator need weeks or months to deliver your results?

a)- because your facilitator’s technology likely didn’t deal with data collection the way it could, or should, have. Validation didn’t occur at the point of entry, did it? Data still had to go to the facilitator for aggregation, didn’t it? Some definitions were misinterpreted, weren’t they? All of this adds layers of cycle time to your process. The fact that your data was transferred over the internet bought you nothing in terms of being able to use it any better or faster than you would have before.

b)- because your facilitator ONLY focused on the data collection process to begin with. Today’s technologies allow your data to become instantly part of a relational database that can be easily queried and manipulated. More importantly, the internet allows that querying to be done in a distributed manner. Which brings me to the main benefit in the reporting process- CUSTOMIZATION!

So here we have two more requirements for a good benchmarking technology: ON DEMAND reporting (i.e. instantly upon data submittal), and CUSTOMIZATION (i.e. you define the form and function of how you view the results). Suddenly, on demand filtering of the peer group, “what if” and scenario testing, and many other possibilities begin to emerge. Does your benchmarking software do that?
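What on-demand filtering and “what if” testing mean in practice can be sketched in a few lines. The peer records and field names below are hypothetical; the point is that once results live in a queryable store, YOU define the comparison set, not the facilitator.

```python
# Hypothetical blinded peer results, as they might sit in a queryable store.
peers = [
    {"code": "P01", "customers": 1_200_000, "region_type": "urban", "cost": 415},
    {"code": "P02", "customers":   300_000, "region_type": "rural", "cost": 520},
    {"code": "P03", "customers":   950_000, "region_type": "urban", "cost": 390},
    {"code": "P04", "customers":   180_000, "region_type": "rural", "cost": 610},
]

def peer_group(data, min_customers=0, region_type=None):
    """Filter the peer set on demand -- redefine it as often as you like."""
    group = [p for p in data if p["customers"] >= min_customers]
    if region_type:
        group = [p for p in group if p["region_type"] == region_type]
    return group

def share_beaten(data, my_cost):
    """Simple 'what if' test: what fraction of the chosen peers do we beat?"""
    return sum(1 for p in data if my_cost < p["cost"]) / len(data)

urban = peer_group(peers, min_customers=500_000, region_type="urban")
print(share_beaten(urban, my_cost=400))  # beats 1 of 2 urban peers -> 0.5
```

Rerun the same two calls with a different filter or a different hypothetical cost and you have instant scenario testing, with no waiting on anyone to “call a conference”.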

3. Best Practice Sharing– This is one of the big gaps in today’s benchmarking technologies. The tragedy of this one, however, is that it is not simply an oversight. IT’S DELIBERATE. That’s right. Benchmarking facilitators make their money by being the information broker, a service that was necessary 10 years ago. But today, they avoid these technologies because it moves them out of the loop, or at minimum changes their role. And change is not a nice word to these types of folks. Turf protection, protecting their job through retirement, etc. become the real motivators. And who pays for that? YOU DO. The technology exists to make learning and best practice sharing a CONTINUOUS part of our day to day jobs, not a once a year event that puts your benchmarking process at the mercy of when the facilitator wants to call a conference.

Don’t get me wrong, conferences are fun, especially when combined with little boondoggles like baseball games and trips to the big city. But when they are used to disguise the obstructionist role of data broker, I call foul. A good facilitator will provide the technology to facilitate on demand peer to peer sharing and surveying. If they don’t, you’re better off finding a new source for your benchmarking information.

As you can tell, this is a subject that I feel strongly about. When I see technology that is avoided because an individual or organization wants to slow down or limit change, it infuriates me. I see it too much, and it’s about time it changed. I encourage all of you to exert all the pressure you can to get your providers to use your fees, or collective effort, to FULLY employ these technologies instead of using YOUR resources to protect their little patch of turf. Sure, it will create less dependency on them for benchmarking, but if they play their cards right, it will result in a stronger relationship bond and a higher degree of business trust and integrity.

-b

PS- Here’s a tip. Want a better benchmarking technology? One that actually saves time, money, and employee frustration, while dramatically improving the quality and usefulness of information?

Visit http://www.benchmarkcommunities.com for a powerful solution that will help your benchmarking program serve you better!




“Grinding” Your Way to Success

Earlier today, I spoke with a colleague who has been wrestling with a group of internal customers who seem to be in a perpetual state of “resisting change”.

You know those types… the kind of people who defy the most logical solutions and what appear to us to be the most obvious of necessary process or organizational changes. Oh, those endless conversations about the most granular of insignificant details. Walking away and wondering if my organization will ever “get it”. To add insult to injury, those days are often accompanied by conversations with others who appear to have found that holy grail. The grass always does appear a lot “greener on the other side”.

But today, my friend was given some very valuable (perhaps career saving) advice. As he wrestled with this dilemma, one of his partners shared a good analogy. He pointed to, of all sports, golf, where it’s not uncommon for a player to do everything “right” and just not have the “breaks” fall his way. While this has happened to almost every golfer I know, what’s even more amazing is how often it happens to professional tour players. When you have some time, take a look at the tour results- wins, top finishes, earnings, and player statistics. What you’ll find is that the 2 or 3 people every year who appear to perform flawlessly week in and week out are still few and far between. Most players practice hours on end, only to win one or two events in a particular year. Even the “top guns” go many strokes between what they would call perfect shots. Few, if any, ever claim to have a perfect round.

No, golf is a game of “grinding”. Hundreds, if not thousands of shots waiting for that perfect swing. And boy does it feel good when it happens. Golf is a game of doing the “right things” over and over again, even when inspiration and motivation are lacking. Good players know that strength is gained in the “grind”, and it is the process of “grinding” through the misery that ironically produces the best shots.

Changing corporate culture is much the same way. While you will spend hours and hours doing the “right things”, most of the time, it won’t feel like you’ve gained anything. You’ll question yourself, your employees, your leadership, and your culture. Until one day, you’ll hit that perfect shot. Someone in a meeting will utter something that will let you know the culture has begun to shift. Just like the well struck golf ball after months of “grinding it out”.

To my friend and colleague, I say great advice. Hang in there and keep “grinding”. Cultures take a long time to change, but there is nothing sweeter than seeing it occur in action, which makes the long “grinding” phases well worth the wait.

-b



Finding the “Metrics that Matter”

A few weeks back, I had an opportunity to view some of the emerging performance management tools in the marketplace, and in particular, where they saw themselves headed. That experience caused me to reflect a little bit on these “new” performance management tools, and what really differentiates them as “best of breed”.

There are literally tons of related tools and systems out there, many of which have historically evolved as part of an Enterprise Reporting System. More often than not, these tools were largely driven by the financial part of the organization, and hence were built upon a financial reporting foundation. That, in and of itself is not so bad, but all too often these systems left gaping holes in the areas most important to operations and line management.

Enter the new breed of performance management systems.

The tools in this class of PM systems (PerformanceSoft, Cyndrus ADS, and PilotWorks for example) have gone well beyond the financial reporting game, and have really delivered a far more powerful and universal solution- one that supports the entire organization’s performance reporting needs. It is not my intent here to endorse one or the other (although I have my opinions), but rather to point to some distinguishing characteristics that make these tools unique, beyond simply their “universal” application.

The first and most important of these features is found in the tools’ foundation itself- the Performance Management Architecture. Most of these systems require management to start with their strategic plan, and align around it. The plan, and its corresponding objectives, form the basis for everything else that follows. Everything else “cascades” from it, and is linked directly to it. Nothing is measured or tracked unless it has a direct feed to one or more of the organization’s top objectives.

The second distinguishing characteristic is what I call the “connectedness” of the system. Some might refer to this as “drill down” capability. In essence, what this means is that the user can, at any point in the system, probe deeper into what is driving a particular strength or weakness of a performance indicator or measure. For example, they can define what comprises a particular metric, what inputs are most responsible for current performance, which initiatives are being deployed to strengthen it, and how those initiatives are progressing. Each of these “levels” can be accessed through any of the others, producing a rich tapestry of “connected” information in terms of performance drivers and inputs. This puts the executive in a great position to lead, being able to “virtually” move up, down, and across the organization on demand…enabling her to really understand and manage the most critical of performance drivers.
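This “connectedness” is easiest to see as a linked data structure. Here is a minimal sketch in Python; the objective, metric, and initiative names are hypothetical, invented for illustration, but the shape shows how each level can be reached through the others.

```python
# Hypothetical cascaded model: objective -> metrics -> initiatives.
model = {
    "objective": "Top-quartile cost performance",
    "metrics": [
        {
            "name": "cost_per_customer",
            "target": 400, "actual": 435,
            "initiatives": [
                {"name": "Field automation rollout", "pct_complete": 60},
                {"name": "Call-center consolidation", "pct_complete": 25},
            ],
        },
    ],
}

def drill_down(model):
    """Walk from objective to metrics to initiatives, flagging gaps."""
    lines = [f"Objective: {model['objective']}"]
    for m in model["metrics"]:
        status = "ON TARGET" if m["actual"] <= m["target"] else "BEHIND"
        lines.append(f"  Metric {m['name']}: {m['actual']} vs {m['target']} ({status})")
        for i in m["initiatives"]:
            lines.append(f"    Initiative: {i['name']} ({i['pct_complete']}% complete)")
    return "\n".join(lines)

print(drill_down(model))
```

In a real system the same traversal runs in both directions- from a lagging metric up to the objective it threatens, or down to the initiatives deployed to fix it.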

The third differentiating feature of these systems is the flexibility and adaptability each of them possesses. There are two aspects to this flexibility. One relates to how the data is fed into the system. In most ERP environments, the data feeds are “hard wired”, and often require programming to make any significant changes to those inputs. With these new performance management systems, however, inputs can be easily added, deleted, or manipulated directly from an administrative panel, often without a significant amount of external programming. With the advent of “web services” and other data publishing protocols, these features will become increasingly important to system administrators and users.

The other aspect of flexibility relates to the level at which the organization can deploy the technology. Not every organization has the appetite for “complete” drilldown capability, nor is it really necessary. For one organization, getting down to an individual turbine blade on a particular aircraft might be important to one of its strategic objectives, whereas another organization is comfortable just reporting at the level of regional operating budget. In each case, the system is flexible enough to be deployed at whatever level makes sense today, but adapted as internal needs and/or process changes arise.

Taken together, these features- the strategic architecture, the connectedness of these systems, and their flexibility and adaptability- help create an environment in which the organization focuses on doing the “right things right”. Every initiative, project, and metric is PUT TO THE TEST of whether it supports the overarching objectives of the business. And with that level of focus, the organization can achieve levels of alignment never before thought possible.

So as you look at your internal metrics and performance indicators, ask yourself: do you really have in place the “METRICS THAT MATTER”? Implementation of systems like these can really instill the discipline needed to realign and sustain your performance improvement initiatives, often at far lower costs than their ERP counterparts. If nothing more, take a good hard look at how these systems work. Even if you don’t purchase or implement, the evaluation process will give you a much needed perspective into where your performance management process should, and could, be headed.

-b
