By: Eugene Ivanov

As examples of the successful use of crowdsourcing to address complex technical, business and social issues grow in number, so do the instances of failed crowdsourcing campaigns. To make crowdsourcing a widely recognized idea-generating and problem-solving tool, it’s imperative to understand why this tool can fail or underperform.

Why do crowdsourcing campaigns fail?

In my experience, crowdsourcing campaigns fail for two major reasons. The first is using a sub-optimal version of the approach, which I call the bottom-up model of crowdsourcing. I have written about this model and its shortcomings before, most recently here.

The second is a failure to understand that the most important factor in the ultimate success or failure of any crowdsourcing campaign is the ability to properly identify, define and articulate the problem the crowd will be asked to solve. I call it the “80:20 rule”: 80% of the unsuccessful crowdsourcing campaigns I’m aware of failed because of an inability to properly formulate the question presented to the crowd; only 20% failed because of a poor match between the question and the crowd.

Blaming the crowd

Unfortunately, too often when a crowdsourcing campaign fails to deliver, it is the crowd that gets most of the blame. As a result, instead of improving the efficiency of the problem-definition process, organizations begin to fiddle with their crowds. For example, a recent HBR article suggests switching to “carefully selected” crowds composed of employees or suppliers (the “experts”) rather than consumers (the “amateurs”).

This is bad advice. The idea that you can cherry-pick the best participants for your next crowdsourcing campaign has no basis. To begin with, if you go to a large external crowd, by using external innovation portals or open innovation intermediaries, selecting a perfect “sub-crowd” becomes either cost-ineffective or outright impossible. And yes, working with internal crowds of employees is a solid approach; however, the usefulness of internal crowds, as opposed to external ones, has its limits.

Even more importantly, the widespread belief that only people with relevant knowledge and expertise can solve your problem is plain wrong. The history of many successful crowdsourcing campaigns proves that great ideas can come from completely unexpected sources. Moreover, research shows that the likelihood of someone solving a problem actually increases with the distance between that person’s own field of expertise and the problem’s domain. However paradoxical it may sound, packing your problem-solving team with experts, as opposed to using a large and diversified crowd of “amateurs”, will make your problem-solving process weaker, not stronger.

How do you make crowdsourcing campaigns more effective?

The desire for smaller, carefully selected crowds is driven by a fear that large crowds will generate a lot of low-quality ideas, and that evaluating them will be a huge burden for the organization running the campaign. However, there is an effective way of generating higher-quality responses with a crowd of any size: perfect the question you’re going to ask. Providing the crowd with a list of specific, precise and, ideally, quantitative requirements that each successful submission must meet will have a dramatic positive effect on submission quality. Besides, this will substantially decrease the burden of proposal evaluation because low-quality submissions can be easily filtered out.

My recommendation to organizations that want to increase the efficiency of their crowdsourcing campaigns is two-fold. First, use as large and diverse a crowd as you can get, and then let qualified members of the crowd self-select based on their own assessment of the problem and their abilities. Second, master the art and science of formulating the question you’ll ask your crowd, by both properly defining the problem and describing a “perfect” solution to it.

Remember: the beauty of crowdsourcing is that you don’t need to go looking for solutions to your problem. You just post your problem online, and the right solution will come to find you.

About the author

Eugene Ivanov writes about crowdsourcing, open innovation and innovation in general. He blogs at Innovation Observer, edits the monthly crowdsourcing newsletter “We the People of the Crowd” and tweets @eugeneivanov101.

The image was provided by Tatiana Ivanov.