Some weeks ago, I wrote about my key takeaways from the free content strategy sessions I started last year. During these sessions, I not only confirmed that some common misconceptions persist year after year, but also saw how these false truths spread across industries. As instructional designers, we need to advocate for examples that dispel these misconceptions and show clients and stakeholders the right path to creating meaningful learning experiences. Otherwise, we are doing a disservice to them and to our learners.

More content = more learning

There is a persistent misconception that more content will result in more learning. However, it is not the amount of information that helps learners acquire a new skill, gain more knowledge in a specific area or improve the way they carry out certain tasks. It is the relevance of the information from the learner’s point of view that will have an impact at the end of the training. Evaluating, cataloging and organizing the information with a clear training or communication purpose in mind is essential.

Standard structure = real learning experience

Even when you get to that stage of content selection and structuring, placing content screen after screen and including some questions at the end does not amount to an effective learning experience. This is the second misconception. Even the most relevant content needs to be audited, questioned and re-evaluated in light of the instructional objective or the message that needs to be communicated. Some questions that can help determine the relevance of the content for the learner, and therefore its relevance for the goal you seek to achieve, are:

(Organizational level) What problems is your organization experiencing? What challenges will your team face this year, in the next 3 years, in the next 5 years? How can you better prepare your team for those challenges? How can you measure the success of your training initiatives, and how will that success contribute to organizational growth?

(Individual level) Would this information be useful for the learner in the real world? How can they apply it in their own performance context?

Use of an authoring tool = instructional design

Many times, companies and organizations see instructional design as something that plays a role only at the very end of the process or, worse, conceive of it solely as the use of content authoring tools. An instructional designer needs to be involved from the very beginning of the process of creating digital learning experiences: from the audience analysis and subsequent content selection and organization to the design of the activities and assessment procedures. Instructional design is not just “good use of a tool” or an afterthought. It should be the foundation for designing more effective digital learning experiences.

More bells and whistles = more engagement

I truly believe that today’s technology and resources can enhance the teaching and learning processes, but only when they are used for a reason. It is easy to be tempted by the so-easy-to-add interactive features of today’s content authoring tools, but adding interactivity just for the sake of it does not add anything to the learning experience.

Brand identity = use of the logo

Every product you create, whether for training or communication purposes, will play a role in the interaction between you and your users. And you, here, means your company or your organization. How delightful, consistent and memorable would you like that interaction to be? How can you make your voice heard in every conversation you engage in? Your voice, your message and your identity go beyond your logo, and yet, so many times the logo is the only noticeable element. From the visual design, to the language you use, to the type of interactions you create, you can start a memorable dialogue and really connect with your audience. That is what we mean when we talk about creating experiences through brands.

Learning analytics = completion rates

Learning analytics play an important role in determining the effectiveness of a training program, but most of the time they are neglected. A comprehensive analysis of learners’ performance in a given training context can yield valuable insights. The data we extract from a learning platform is more than just numbers. It is the key to making more informed decisions that improve and manage learning environments and really contribute to the organization’s development. You can go beyond completion rates and analyze: individual learners’ patterns of behavior, preferences and achievements; learners (or groups of learners) who may need supporting interventions or reinforcement; and which skill or knowledge gaps are predominant and will demand the design of new training programs.
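To make this concrete, here is a minimal sketch in Python with pandas of what looking beyond completion rates might involve. The column names (learner_id, module, score, completed) and the 70-point intervention threshold are hypothetical, not tied to any particular learning platform.

```python
# A minimal sketch of analyzing a hypothetical LMS export beyond completion rates.
import pandas as pd

# Hypothetical export: one row per learner per module attempt.
records = pd.DataFrame({
    "learner_id": ["a01", "a01", "a02", "a02", "a03", "a03"],
    "module":     ["safety", "reporting", "safety", "reporting", "safety", "reporting"],
    "score":      [92, 88, 55, 60, 78, 49],
    "completed":  [True, True, True, False, True, True],
})

# 1. Individual patterns: average score and completion rate per learner.
per_learner = records.groupby("learner_id").agg(
    avg_score=("score", "mean"),
    completion_rate=("completed", "mean"),
)

# 2. Learners who may need a supporting intervention (illustrative threshold).
needs_support = per_learner[per_learner["avg_score"] < 70]

# 3. Predominant skill gaps: modules with the lowest average scores.
skill_gaps = records.groupby("module")["score"].mean().sort_values()

print(per_learner, needs_support, skill_gaps, sep="\n\n")
```

Even a rough analysis like this turns raw platform data into three actionable views: individual performance patterns, learners who may need reinforcement, and the knowledge gaps that could justify new training programs.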

One more thing about tools

One of the biggest issues, and one that I come across very frequently, is the preselection of tools without properly identifying the training or communication problem. The fact that a company owns a tool does not make it the most suitable option for designing the solution. Would it be more convenient to make edits once the course is finished? Yes. Should that be the main reason to select that tool? No. A big no, actually. I always ask my clients to tell me about their goals and/or problems and what they would like to accomplish. However, selecting the processes and technologies to be used is my job. As I define a course of action to help them accomplish what they have envisioned in the best possible way, I may need different tools, unconventional processes and a combination of various technologies.

These are some of the misconceptions about eLearning design that can lead to inadequate and defective learning solutions. One final question for you: do you simply want to fulfill a training requirement, or do you really want to empower your workforce with knowledge and skills for the years to come?
