
The future of learning: Get serious

Offering another perspective on the future of learning for associations is Jeffrey Cufaude, a former association executive director and now president and CEO of Idea Architects, where (among other areas of expertise) he facilitates and designs conferences, workshops, and other learning opportunities. Jeffrey also blogs at the Idea Architects blog, where he’s currently writing a great series of posts about developing powerful presentations.

Here are some of the many great things Jeffrey had to say about where associations are with regard to learning, and where we need to go.

I’ve heard you talk lately about issues related to diversity in association learning events. What should associations be doing to hold themselves accountable for greater diversity and inclusion in their learning programs?

I think you can start with the presenters. I think associations have an obligation to be doing due diligence about what messages they send based on the presenters that they are selecting.

I'm not saying that there should be a particular message, but if we believe and we value inclusiveness and top quality education and a whole host of other things, those then should be lenses by which we filter the choices we're making about our presenters, particularly the people who get the biggest platforms.

It would be rare for an association to bring in a political speaker, or a person who has a political take on a topic, without having thought about the consequences of spotlighting only one particular viewpoint. They may still choose to do it and say, no, we want this Democrat's or this Republican's take on this issue. But they would have done so with deliberation and understood the consequences.

I don't think the same kinds of considerations are being applied through other lenses. What does it mean if our three general session speakers are all 50-year-old white males? I'm not saying that's inherently bad. From my value system it is; from the association's standpoint it may not be. But think about what that means in relation to the overall values of the organization.

I know, having been an education director and been around for long enough, how general session speakers are often selected. Who is the biggest name that people will get excited about listening to? And then secondly, when we get down into that plenary level, who will someone sponsor or who can we get for free?

That means our only criteria are those two core values. It's not looking at the broader set of core values. I don't think that's what people who are serious about learning should be doing.

And the consequence is that we continue to elevate the same voices and the same perspectives, creating an echo chamber: those become the voices and perspectives people see, because those are the ones everyone is talking about.

What are some other things that you think associations need to be holding themselves accountable for with regard to learning?

The bulk of [conference] evaluation forms still primarily focus on satisfaction with the session. We're getting better. In my experience, maybe a quarter of those, up to a third, are getting into [questions like] “How relevant will this be for you in your workplace?” “I received ideas that I'm going to be able to use.”

But we're not even, at the basic level, asking questions that measure the effectiveness and applicability of both the content and the format. We're still [asking] “this speaker was knowledgeable; AV and handouts were good.” I feel like we haven't even made the commitment to the baby step of holding ourselves accountable, let alone built a more sophisticated assessment mechanism to find out what actually was used.

To me that suggests that we're not really serious about ensuring that we're delivering education that is actually used back in the workplace.

What would it look like if we really were serious about that?

I think you’d see that as the finish line. Right now most associations and most directors of education see the finish line as the end of the event or the end of the webinar. I totally get that. But all we've really done is get people trained for the race; the real race is back there in the workplace.

I think there has to be an initial shift of thinking: We [know we] are successful three to six months afterwards, when people can tell us what percentage of the knowledge they used, what has worked for them, and what hasn't.

If you take that as a beginning mindset, you design things very differently from the very beginning. …

Why don't people ask what percentage of the session's content is going to be relevant to you in the work that you do? Why is it so hard to get that question put onto an evaluation form? Sometimes we attribute it to the meeting planner focusing on logistics versus the director of education focusing on content. But I think it's too easy to blame the meeting planner.

If we're really serious about learning, why aren't we further along in this arena? That's the same thing I've been saying with diversity. If we were really serious about it, wouldn't things look different?

My bottom-line takeaway from that is we're not serious about learning. We're serious about delivering information, and that's not sustainable 10 to 20 years from now.



Bravo! Jeffrey -- for all you said. There are too few who are 'serious about learning' - who are looking at sustainable learning or delivery of content -- and far fewer who look at all session facilitators of learning (aka speakers) and what they deliver let alone what they represent.

Recently I received a membership card from an assn. The printed piece told me what to DO, not what the value of my membership was. What's more, the photos, clearly taken at a meeting (lanyards, w/ badges at pupik level, gave it away), featured 2 older white men. I was stunned! This is an assn. that has put out many statements about their commitment to diversity.

Is it that too many associations are not communicating well internally and externally? That the kind of feedback that would provide better information is not being asked for? What does it take to get it right?

Thanks as always for provoking brain stimulation.

Jeffrey and Lisa -- Great post! I completely agree with the need for education leaders to be more accountable (to themselves and members) when it comes to learning outcomes. And I agree that -- in general -- we're not doing a very good job of putting in place the measurements needed to tell us whether we're meeting those learning outcome goals or not.

I'd go a few steps further.

First we need to recognize the difference between meeting planning and education (see my post "I Am NOT a Meeting Planner" at http://alearning.wordpress.com/2008/10/05/i-am-not-a-meeting-planner/). As long as we confuse our roles (they're separate, even if we are educators AND meeting planners), we'll confuse the outcomes we need to be measuring.

Second, we need to acknowledge that there's value in different sorts of event evaluations: smile sheets (those on-site evals) are great for assessing the environment (particularly if we're considering returning to it for future sessions) and for capturing immediate impressions about value.

Incorporating follow-up transference evaluations is the only way to get a firm grip on whether the event provided applicable content to the learners: once they're back in the workplace, ask them what they have incorporated into their tasks or responsibilities because of what they learned. Then you'll know where the true value of learning was.

The PD section has more information on various evaluation types in past enewsletters as well.

Thanks for covering a key education issue!

I, too, appreciate the delineation between education director and meeting planner. I think the difficulty arises, particularly in smaller associations, when the education director is the meeting planner and resources are limited, including time, money and the availability of diverse speakers. That's when we must work smarter, not harder. For me, that means surveying our constituents on a regular basis, understanding the changing needs of our members and planning education programs that meet those needs. Of course, evaluating speakers, content and relevancy following a program is an important step of the quality improvement process; however, I think spending the time and energy up-front is almost more beneficial. Rather than haphazardly selecting speakers and topics I hope are on target (or, worse, are simply most convenient), I feel successful when I've identified a need and met that need through a well-structured and learner-oriented program. In other words, education directors must be equally competent in managing a program and managing that program's content or curriculum.

I appreciate the comments and hope the conversation continues.

As far as the education director/meeting planner issue, I think we could play forever with the semantics of those efforts, and I will leave that to MPI, PCMA, etc. In my first association position I managed both and found that dual role most helpful in creating learning experiences that had integrity in both the content and the operational elements.

That being said, if you're primarily tasked with the logistics, you've got to remember that how those logistics support and enhance learning is a key success criterion. And if you are tasked primarily with content design, you have to be thinking about the logistical implications of your choices and whether or not the great learning experience you envision can actually be pulled off within the parameters that have to be managed.

And associations could advance efforts quite a bit if they provided speakers far more advance info (related to both logistics and content) that would assist them in delivering a quality learning experience. I'm rarely given basic demographics about a conference audience unless I ask, let alone information about current professional issues for them, their learning preferences, language to use or avoid, etc. And while knowing the room I am presenting in is helpful, far more helpful is knowing the room dimensions, the exact room set (not just rounds, but how many of them), the placement of the screen, the lighting options ... all those variables help a presenter adapt his/her content and teaching techniques to the space in which the presentation will occur.

So while I can (and do) find that info out on my own, I am deeply appreciative when the association offers it and dream of the day when it is standard practice. Because if we were truly serious about learning, it would be.

An insightful and helpful article. Progressing from evaluation using "smile sheets" to measurement of changes in knowledge, competence, and performance is an essential step for associations committed to ongoing improvements in their educational efforts. Optimally, identifying knowledge and performance gaps in association members, then clearly articulating desired results of the education will form a firm foundation upon which educational events can be planned. The evaluation should then seek to determine if those knowledge and performance endpoints were met. There are multiple methods/tools for this evaluation that can be borrowed from the continuing medical education community--I'm happy to share those methods with anyone interested. Derek

This is a very thought provoking post. Thank you! There are many associations that are doing call for papers and voting on those submissions. For many associations, that primarily attracts industry consultants or the same old crowd.

I like the approach of defining topics that will be relevant for each track and then recruiting the best individuals or experts to facilitate or present that topic.

There are also many associations out there that need to do a better job of weeding out, in future years, presenters who don't rate in the top quartile, and of recruiting those who rank high to speak at other events.
