The Collaboration Project, an initiative by the National Academy of Public Administration (NAPA), just released a new resource: Tools for Online Idea Generation: A Comparison of Technology Platforms for Public Managers.
The document is a follow-up to their previous introduction to online group brainstorming, which I thought was a nicely done primer for anyone just getting started with online engagement.
This latest document compares ten online tools for idea generation (one of which hasn’t launched yet), including key benefits and pricing information.
They also provide some guidance on what to consider when choosing a tool. From the document (PDF):
Considering your needs
Many technologies are commercially available for online idea generation, ranging from low-cost, out-of-the-box tools to large scale, custom-built solutions. As such, many factors should be considered in deciding which technology best fits your needs. These include:
- Duration of engagement. How long will the idea generation project run? Will it be limited duration or an ongoing “market” of ideas?
- Community. Do you need to be able to identify your “power users,” those whose ideas tend to be the most influential? Do these users need to stand out among participants?
- Responsiveness. Do you need to respond to ideas and comments as the dialogue happens or provide updates on the status of particular ideas?
- Output. What kinds of data and analytics do you need the technology to provide?
- Structure of dialogue. Do the ideas need to be strictly organized or siloed (e.g., by topic), or can all ideas mingle together?
- Cost and resources. What budgetary and staff resources can you allocate to this engagement?
- Support. What degree of technical support might you need from the vendor?
- Deployment. How quickly do you need to launch the engagement?
Remember: While deciding on a technology is important, this decision is best made by aligning the technology to your core purpose for engagement. In other words, let the tools fit your needs, not vice versa.
A few notes I might add:
- Raw contributions collected via these tools are usually quite messy. The ideas are half-baked (not a bad thing at all, by the way), often duplicated, and tend to include a lot of other material (see this older post about different types of participant input). Accordingly, voting on these tools tends to happen prematurely.
- Strictly speaking, most of these tools fail to adhere to one of the core rules of brainstorming, namely to suspend judgment during the initial phase of idea generation. Remember there is no “thumbs down” in brainstorming (see this older post on brainstorming, which contains a couple of helpful definitions and explanations).
- Another common feature in this tool category is that they expose the leaderboard (the ideas that have received the most votes). This, of course, tends to significantly distort the results, as the top ideas receive the bulk of participants’ attention (see this post).
No tool is perfect; they all have their strengths and weaknesses. Choosing a tool always requires weighing the trade-offs relevant to a particular situation. The point here is to be aware of the potential limitations and challenges and, where possible, mitigate them.