One of the more popular social activities between companies and
consumers is crowdsourcing. Crowdsourcing is relatively straightforward:
it’s the online distribution of certain tasks to crowds of experts and
enthusiasts. Companies use this activity to connect their brands more
closely to their consumers, involving them to the point where they almost
take ownership of the brand.
In recruitment, it is an effective tool for both companies and
candidates to assess fit. It can be a win-win situation or a lose-lose
one; if not done carefully, it can backfire.
Let’s start with one basic principle. Nothing is really free. There needs
to be some return on the invested time/brains that are contributed, even if
it is in the form of exposure and accolades. Some professions rely on
their brains as the very product or service they sell, yet crowdsourcing
can be abused and perceived as a cheap way to get free ideas. There are
online communities in the creative field where hard-working designers,
artists, and marketers throw free (spec) work at projects posted by
companies looking to “crowdsource” their needs out to the masses. If there
is any point at which the value of “trust” within social networks is
tested, it’s here; at the least, authenticity and sincerity.
Crowdsourcing works when there is a benefit for both the participant and the
host company. There needs to be a genuine commitment from both parties. The
host company that is running the crowdsourcing strategy must be clear,
upfront and most of all responsive to all the submissions from the
community. The community should be able to see the clear benefits of
participation, and should not come away feeling “used.” Their submissions,
even if they are not the top selection, should result in some return of
value, whether it is exposure, job leads, etc.
There are many types of crowdsourcing techniques out there, many of which
you may not have been aware of. In fact, those are often the most
effective: the ones in which you are participating without realizing you
are participating. They work because of extreme relevance. The content is
extremely relevant to you, and you are passionate about it. If it’s not
something you are passionate about, chances are the value will be too
weak for you to take the time to participate.
The crowdsourced job description.
Last July you may remember the viral postings on the Barry Judge blog
that stirred the networks. Best Buy already had a crowdsourcing
strategy in place that involved consumers directly in its brand and
products. Then a need arose for a Senior Manager of Emerging Media. The
company posted the job description, only to find that people had other
ideas about what the description should be in order to move the company
forward. This led naturally to letting people help write the ultimate
job description for the role. After all, the candidate they were after
should be able to construct the ultimate job description of who they are
and what they will do. Best Buy was also extremely clear upfront about
what participants would get back in return. Not a job…no, the legal
hounds would be holding press conferences as we speak. The winner is the
job description with the most votes, and its author gets…exposure. The
twittersphere and blogging networks cluttered the web with buzz about a
job at Best Buy. It is a great example demonstrating that relevance is a
major ingredient of a successful crowdsourcing campaign, directly driving
the strength of the social activity.
And lastly, we can’t forget the
fact that companies need crowdsourcing approaches internally to foster
a healthy collaborative culture. The difference is that the relevancy
is aligned to the overall mission and goals of the company, and the
payoff, needless to say, can be tied to employee incentives
and to a highly successful organization.
So think about what your latest contribution of intellect or skill was. Why did you participate?
This seems to be framed for a company outsourcing, rather than a public service organization.
I think that there are differences between crowdsourcing by for-profit companies and crowdsourcing by non-profits. We are increasingly seeing for-profit companies using crowdsourcing and looking for ways to reward participants to motivate them. On the other hand, with non-profits the reward tends to come from the work itself, rather than an awards program (e.g. the satisfaction of contributing to a valued outcome, people associating your name with a good contribution). That seems to be how successful open source projects crowdsource development, or things like Wikipedia.
As Dan Ariely notes in his book Predictably Irrational, you need to know which kind of incentive you are dealing with (market or social), and it can be dangerous to mix the two. In his chapter “The Cost of Social Norms: Why We Are Happy to Do Things, but Not When We Are Paid to Do Them,” he talks about how introducing market incentives can sabotage motivation that had been working on social incentives.
Which isn’t to say we can’t use market incentives. The success of the “Apps for Democracy” contest shows we can. But we want to be careful we don’t shoot ourselves in the foot as we do so.
I think that there is a lot of interest and willingness in the public to contribute to initiatives that will improve the public good. This is likely to make them natural contributors to public-sector crowdsourcing. What we need to do to motivate/reward them is to show them how their contributions can effectively make things better.
Great article Mya. I referenced your article in a post at the Army’s milBook community. That post is behind the CAC/AKO firewall, so I pasted it below for discussion here.
Those unfamiliar with the milWiki initiative can learn more about it at MilWiki receives Army’s top knowledge management honor (September 2009)
In “On milWiki & Succeeding Through Crowdsourcing” I wrote:
In a recent post on GovLoop, Mya Wallace wrote a blog article “Crowdsourcing: Win-Win or Lose-Lose” (3 March 2010) and presented examples of when, and when not, to use crowdsourcing techniques and how to successfully use those techniques. (GovLoop, with over 26,000 members, is the premier social network for the government community to connect and share information.)
Many of her points relate directly to the theory and purpose behind the proposed milWiki initiative.
1. “Nothing is really free. There needs to be some return on the invested time/brains that are contributed even if it is in the form of exposure and accolades.”
Easy to say – harder to measure and implement. Quality over quantity comes to mind. A system that rewards participants for the amount of their contributions may encourage people to make unnecessary changes or to submit revisions that are not substantially different. One model that may work in this situation is the one used by other social media sites: book reviews at Amazon.com, buyer/seller feedback at eBay, etc. Allow the other members of the community to rate each change for value or usefulness, then tie a recognition system to the overall value users provide to the community over time. (Points 1 & 3 are related. Point 1 addresses a direct individual benefit; Point 3 addresses an intrinsic benefit received by improving the Army.)
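The rating-and-recognition model described above can be sketched in a few lines. This is only an illustrative sketch, not any actual milWiki or Amazon implementation; all class and method names here are hypothetical. It shows the key design point: each contribution’s worth is its average community rating, so a contributor’s recognition score grows with quality rather than raw volume of edits.

```python
from collections import defaultdict

class ReputationTracker:
    """Hypothetical sketch of a community-rating recognition system:
    members rate each contribution's usefulness, and a contributor's
    recognition score accumulates from those ratings over time."""

    def __init__(self):
        self.ratings = defaultdict(list)  # contribution_id -> list of ratings
        self.authors = {}                 # contribution_id -> author

    def submit(self, contribution_id, author):
        self.authors[contribution_id] = author

    def rate(self, contribution_id, score):
        # e.g. a 1-5 usefulness rating from another community member
        self.ratings[contribution_id].append(score)

    def contribution_value(self, contribution_id):
        # average rating, so many trivial edits don't inflate value
        scores = self.ratings[contribution_id]
        return sum(scores) / len(scores) if scores else 0.0

    def reputation(self, author):
        # overall value a user has provided to the community over time:
        # the sum of average ratings across all of their contributions
        return sum(self.contribution_value(cid)
                   for cid, a in self.authors.items() if a == author)
```

Because unrated or low-rated revisions add little to the score, the incentive pushes toward substantial changes the community actually finds useful, which is exactly the quality-over-quantity concern raised above.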
2. “If there was any point in which the value of “trust” within social networks is tested, it’s here. At the least authenticity and sincerity.”
Trust is a critical issue in the acceptance of “crowdsourcing doctrine development.” Do we trust our Soldiers, DA civilians, and other authorized parties to revise the doctrine? Do we trust the system, and the associated checks and balances in place, to catch and respond to poor or inappropriate changes?
3. “Crowdsourcing works when there is a benefit for both the participant and the host company. There needs to be a genuine commitment from both parties.”
In the milWiki case, all of the participants are within the Army organization and have a shared vision, purpose and mission. When doctrine is improved, the entire organization benefits.
4. “Their submissions even if they are not the top selection should result in some return of value.”
An absolutely essential ingredient for the success of this program. A participant’s input must not be rejected without feedback; otherwise the person may not return. If others in the community see inputs ignored, they in turn will not participate, and the system will fail.
5. “And lastly, we can’t forget the fact that companies need crowdsourcing approaches internally to foster a healthy collaborative culture. The difference is that the relevancy is aligned to the overall mission and goals of the company…”
There is no question regarding the relevancy of the milWiki project and improving doctrine. All participants involved benefit when doctrine is brought up to date and made more useful. Fostering a “healthy collaborative culture” comes back to the single word issue addressed in Point 2 – Trust. That type of culture can not exist without trust.
Your thoughts? Are there other points to be considered on the applicability of crowdsourcing techniques within the military environment?
Thanks Mya, this is a great article. I think finding adequate incentives is critical to a successful crowdsourcing effort. Pulling this off in the public sector can be a bit difficult, especially in these times of diminishing budgets. So we have to rely on the things you and David pointed out: altruism and recognition.
Why would a private company devote its time and resources to a government crowdsourcing effort that doesn’t offer a direct payment? Maybe they just like helping people. But, if the publicity and recognition of not just the winners, but all the entries, is adequate, people in the private sector will see the effort as an opportunity to spread their name.