Matthew Richter posts daily comments in LinkedIn—well, almost daily. You can follow him and join the conversation by going to http://linkedin.com/in/matthew-richter-0738b84.
For the benefit of our readers, we decided to compile and reprint some of his provocative pieces from the past. Let us know what you think.
Accent
Too often, as we engage in programs about multiculturalism, diversity, style, and difference, we forget that in many ways it is a question of perspective and position. Language is one of the worst ways we label differences, and it is almost impossible to avoid the ingrained bias we associate with it. By highlighting the differences from “us,” we are indicating a “them,” which can imply higher or lower status dynamics. Accents and conceptual misunderstandings cause many a discriminating interaction. These differences are socially created. We all have accents but only notice them in others. And we all regard some accents as better and others as worse. This is a mistake. Accents, whether from inside or outside our borders, all carry cultural associations. These associations can lead to insidious assumptions about the person in front of us. Just the word “accent” itself implies “different,” “not like me.” Yet we all have one from someone else’s viewpoint. Linguistically, there is no proper way to speak from an accent perspective. Yet the baggage can be insurmountable. Unless you have a French accent. Then you rock!
Measurement
Should you run a training workshop if you cannot measure its efficacy?
The answer is more complex than it seems. First, what are you proposing to measure? Is it the ROI of the program? Whether learning actually occurred? What type of learning? The effectiveness of the program design or the instructor? In other words, before you answer whether you should measure, you should first ask what you might measure and how those results strategically help you. Then, can you actually gather the data you want? The next big area to explore: once you measure and analyze the results, what will you do with them? Too many efficacy studies produce lots of data, but no analysis and, subsequently, no application of those results. Today, we are being knocked over the head with experts telling us we need to measure, but without the context and direction for what that means. Measuring means we know what we are evaluating, that we can evaluate it, and that we will analyze the results and then do something with the information.
Lectures
Dumb question time: Given we all know and accept that lectures are not an effective way to train our participants, how do we explain that one can happily watch documentaries on TV and remember quite vividly what we saw? Aren’t TV documentaries simply glorified lectures with high-end visuals? What about TED talks? People love them. I don’t claim documentaries or TED talks are skill builders, but are they effective content disseminators? I pause now for your response.