I am a big fan of crowd sourcing. Ever since I read Surowiecki’s book “The Wisdom of Crowds” I’ve tried to incorporate larger, more diverse groups into my normal process for solving problems. I’d say in most cases it works pretty well.
For those of you unfamiliar with crowd sourcing (or using the wisdom of the crowd), here’s the gist: it’s the process of taking into account the collective opinion of a large group’s aggregated answers to questions involving quantity estimation, general world knowledge, and spatial reasoning, rather than relying on a single expert or person. The aggregated answer has generally been found to be as good as, and often better than, the answer given by any individual within the group.
In other words – getting more people to provide answers (independently I might add – collusion ruins the effect) to your questions provides you with a better overall answer than simply asking one person – even a potential expert.
Now this isn’t a perfect solution – it doesn’t always work. It works best when there is a specific correct answer to a question. An example used in the book was guessing the number of jelly beans in a jar. When one person guesses, you get that one answer. But if you ask 100 people to guess and plot their answers, you’ll get a distribution of answers, and the mean (or average) answer will most likely be very close to the real number – and much more accurate than the one individual guess, unless that person is very, very lucky.
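You can see the effect with a quick simulation. Here’s a minimal Python sketch – the true count, the number of guessers, and the noise model are illustrative assumptions, not numbers from the book:

```python
import random

random.seed(42)  # fixed seed so the demo is reproducible

# Illustrative assumptions: the true jelly-bean count and the noise model
# (each guess scatters independently within +/-40% of the truth).
TRUE_COUNT = 850
NUM_GUESSERS = 100

guesses = [TRUE_COUNT * random.uniform(0.6, 1.4) for _ in range(NUM_GUESSERS)]

crowd_mean = sum(guesses) / len(guesses)  # the "wisdom of the crowd"
one_person = random.choice(guesses)       # a single guesser, picked at random

print(f"True count:         {TRUE_COUNT}")
print(f"Crowd mean guess:   {crowd_mean:.0f} (off by {abs(crowd_mean - TRUE_COUNT):.0f})")
print(f"One person's guess: {one_person:.0f} (off by {abs(one_person - TRUE_COUNT):.0f})")
```

Run it a few times without the seed and the crowd’s mean stays within a few percent of the truth while individual guesses swing all over the place. Notice that the trick only works because the guesses are independent and the errors cancel out – conditions that matter for what follows.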
Performance Reviews Aren’t the Place for Crowd Sourcing
I bring this up because I’ve seen references to crowd sourcing performance reviews – either by getting a variety of co-workers to rate performance or, most recently, through the output from a company’s recognition program. The theory being that the review from the manager is, or can be, biased, and that the sum total of many reviews is less so. Or that the sum of all the person’s peer recognition events is a good proxy for performance. If a person has a lot of “kudos,” they are a good employee – even if their manager doesn’t think so.
At first I thought this was a great idea – even posted on it myself. But I’ve thought about it some more and talked with various people (hey – wisdom of crowds, ya know…) and now I’m not as big a fan.
Here’s why…
- Crowd sourcing is most effective when there is a correct answer. Performance reviews are hardly something that can be boiled down to a “correct answer.” There is a range of performance and a variety of elements that add up to the ratings people are given. There is no single answer to search for, so having more input won’t get you any closer to the “right” answer, since there really isn’t one.
- No matter how you do this, the people tasked with rating the employee have a vested interest in the outcome. The people providing input either want to brown-nose to get credit in the future or sabotage the review in order to move ahead. Hey – we live in a competitive world. Either way the data isn’t reliable or valuable. At best it will provide red herrings and confuse the issue.
- Recognition events are ONLY POSITIVE. Think about it – if you’re going to use the output from your reward and recognition system as a “performance review,” it only contains positive information. There is no “negative recognition,” so the input is already skewed. And again, since most recognition events are not anonymous, the “giver” of recognition may be doing it to curry favor. In other words, the motivations behind the recognition event are suspect from the start and don’t really add a ton of value to the process.
Don’t get me wrong. I think you should have a recognition program. I also think that, as a manager, you should seek additional input when doing performance reviews. But I don’t think you should rely on these inputs being unbiased and accurate. They need to be taken with a grain of salt.
Performance reviews – love them or hate them – will always be around. Like any asset a business employs to create value for its customers, they must be analyzed and reviewed to ensure they continue to be a positive influence in that equation and not a liability. How we do them will change. I don’t think annually makes any sense – and you can’t do it each and every time an employee finishes a TPS report.
Reviews are ongoing dialogs between the manager and the employee. They really shouldn’t be left up to a group or a skewed system. If anything, I think we need to put more emphasis on training managers to do performance reviews better and more frequently, with less input from the crowd – not more.
Or am I moving this conversation further backward? What are your thoughts about “crowd sourcing” performance reviews?