Forget the 80:20 and 90:9:1 Rules. For Online Training, It’s 50:50.


When it comes to group behavior, most of us – both anecdotally and based on research – cite the Pareto Principle (aka “the 80/20 Rule”). Generally, this principle states that 20% of the people create 80% of the value for an organization. This 20% includes the standout leaders who make most organizations function more effectively – especially when it comes to volunteer-driven initiatives. That’s not to say the other 80% don’t bring value – the principle simply suggests that they don’t put in as much time or have the same impact as the 20%.

With online communities, that number changes a bit and has been adjusted to 90:9:1. Given a group of 100 people, 90% will be lurkers / readers who are not active and generally act more as consumers of information. Nine percent will comment on blogs or discussions or share content beyond a website or platform, and just a small sliver – 1% – are actual content producers. There’s been some debate about whether this notion still holds true – that we’ve reached a new level of maturation in online behavior – but generally this idea is ingrained in the thinking of people who monitor web-based interaction.

What I have not seen discussed much is the behavior of participants in online training, such as webinars. In fact, when I searched on “show up rates for webinars,” I couldn’t find much information (nowhere near the search results for “80:20” and “90:9:1”). One commentator said the average can range from 33% to 60-65%, depending on the target audience. What we are seeing at GovLoop for our online trainings is a pretty consistent “50% Rule.” Our online trainings generally draw hundreds of registrants (and our Virtual Career Fair topped 5,700 sign-ups). On average, we’ve seen a 50-55% show-up rate on the day of the live online event. In fact, more recently we’ve been seeing upwards of 60% participation, especially when you include archive views – but even live listening is on the rise.

That same principle held true earlier this week for a social learning pilot that we are running with the U.S. Office of Personnel Management: 50 of our 100 registrants attended the live online training session. It was the first of a six-week series, so it will be interesting to track people’s participation levels over time. However, I believe we’ve seen enough data across a couple dozen webinars to expect at least a 50% turnout rate.

If you host live online trainings, what percentages are you seeing?

Do they coincide with our findings?

3 Comments

Walker Hardy

Andrew – I certainly agree with your 50% turnout rates, and have seen similar for the couple of webinars that I’ve done. But I think the turnout rate isn’t the important number, and it needs a qualifier in order to be compared with the 80:20 or 90:9:1 rules. Those group behavior and online community statistics are really indicating the value added (or impact), as you stated. Just turning up isn’t adding value or necessarily gaining value as a student. The measurement that is needed is whether those 50% of participants who did show up to the webinar actually learn or gain anything. We need to run well-designed Kirkpatrick Level 1 and 2 evaluations of those webinars to measure learning/knowledge gain (I certainly admit that our organization could measure this better, too). Eventually we should also run a Level 3, behavior-change-based evaluation, but first we have to make sure the Level 1 and 2 evals consistently show good results.

I suggest a number for webinars that actually mimics the group behavior and online community statistics. For webinars the ratio might be 50:20, where the 50% is the turnout rate as you stated, but more importantly the 20% is the percentage of people who actually learn or gain from attending. The other attendees probably just have the webinar on in the background while performing other work (or getting coffee, checking Facebook, or maybe GovLoop).

Andrew Krzmarzick

Thanks for the input, Walker. We are actually running a pilot “social learning” course with OPM right now and are doing Kirkpatrick Level 1-3 evaluations both during and after the course to ascertain learning and behavior change. The course includes the weekly webinars, combined with required readings (blogs), a live group discussion in our virtual classroom (a private GovLoop group) and a 1-to-1 weekly reflection with an assigned course partner by email. I’m excited to see what we can find out about overall impact and effectiveness of online, blended learning.

Henry Brown

Although this discusses e-learning more broadly, it could have some relevance to webinars…

From The International Review of Research in Open and Distance Learning:

On-the-job e-learning: Workers’ attitudes and perceptions

Abstract:
The use of e-learning for on-the-job training has grown exponentially in the last decade due to its acceptance by people in charge of businesses. Few papers have explored virtual training from the workers’ standpoint, that is, the perception they have about the different training methodologies (face-to-face vs. virtual) and the attitudes they have towards on-the-job learning. Training, in this context, is an investment for both participating agents: businesses and workers. It seems logical that knowing the perceptions and attitudes shown by the targets of the training is at least as important as knowing the advantages for the companies.

To analyse workers’ perceptions and attitudes we conducted an online survey of 2,000 employees of the leading European savings bank, CaixaBank (http://www.caixabank.com/index_en.html), on training habits, perceptions, motivations, and disincentives of undertaking face-to-face or online instruction.

The results reveal that workers perceive e-learning as a more flexible and up-to-date training methodology. On the other hand, face-to-face training continues to be perceived as a more motivating methodology than virtual training, and one with better explanations from the course trainers. As regards the motivations given by the workers when it comes to training, there are three main groups of attitudes: those which are more affective and social, those which reveal poor adaptability or fear of the new training requirements, and, finally, those linked to the knowledge society.

These results indicate that while the benefits of the distance methodology can be clearly identified from the company’s point of view (i.e., as a flexible and efficient methodology to develop the employees’ skills and knowledge), from the employees’ standpoint the advantages of virtual training are not so clear and depend to a great extent on their attitude towards the use of virtuality.