Your objectives are approved and you’ve inserted them in your lesson. Awesome! You continue developing your content, building your interactions, and eventually … your course is finished! Testing begins and you’re feeling pretty good about things. Then you notice that your email box is filling up with comments from testers and a feeling of dread starts to set in. What has gone wrong?
Same story as above only this time your course is finished. It passed all of the internal evaluations and students have begun taking the course. The phone is ringing off the hook, your team is being blasted with emails, and your boss is becoming visibly upset. What’s going on?
In both situations people are complaining they can’t complete the interactions because the exercises deal with content that wasn’t addressed in the lessons. How can this have happened? Your objectives were approved and the content should’ve addressed the objectives. Your exercises were designed to measure the objectives. Why is this happening?
Sound familiar? If not, consider yourself lucky. So what is the source? Typically it’s one or more of the following issues:
- Your objectives were written and finalized early on in the process. Somewhere along the way content changed and no one evaluated the objectives again to determine if they were still applicable. This tends to happen more often when someone else, perhaps a subject-matter expert (SME), writes the content and you and your team are developing the course based on the material you’re provided.
Remember, your objectives define what you want the student to be able to achieve at the end of the learning segment. They tie directly to your content, so you need to ensure the content you’re including actually supports those objectives. If they don’t tie together, you must do one of two things: revise the content or revise the objective. Keep in mind that changing one or more objectives may affect a higher-level objective, so use caution; for that reason, revising the content is usually the safer choice. Do yourself a favor … take the time to review the content and ensure it addresses what it’s supposed to address. A little extra time upfront will save you a ton of time in the end.
- Another situation might involve exercises that are too difficult because the lesson didn’t provide enough supporting information, discovery activities, or both. Maybe the exercise deals with something that hasn’t even been covered in the material. Your development team and SMEs may simply be too close to the material to recognize there’s an issue with one or more interactions.
If more than one person is having difficulty, take a look at the content you’re covering in an exercise and decide whether you need to beef up your content, redesign the interaction, or even add a way for struggling users to review the related material. First and foremost, make sure whatever you’re testing students on was actually addressed in the content. You should also review the exercise to determine whether the instructions are clear, and verify that the interaction works as it was designed. So many little things can cause big-time problems for users, which will eventually cause even bigger issues for you.
So what’s the real message you should take away from this blog? Pay attention. Pay attention to your objectives all the way through a project. Pay attention to testers and students who voice complaints and concerns. Pay attention to the details. Not only will it make your life easier, it will help you ensure students are actually learning what you need them to learn from your course.
Sigh, unfortunately this happens FAR too often. Even if you don’t have bosses mad or students pitching a fit, the learning isn’t effective, because it’s not truly providing the core content the objective needs to support it. Usually it’s the “ten pounds of stuff in a five-pound bag” syndrome, not the other way around. SMEs, particularly very seasoned and knowledgeable ones (and who gets “picked” to be an SME, usually?), are so far removed mentally from the people the training is designed to support that they obfuscate the actual objectives with frivolous material that is, in their minds, “absolutely essential prerequisite knowledge.” Students suffer cognitive overload and shut down before they ever make it through all the objectives, then usually have to sit through repeated viewings just to earn a mediocre score on the supposed assessments.
It SUCKS, to be sure, but whose fault IS it? Not the SME’s, not the manager’s; the buck stops squarely with the ISD. Being an ISD is as much about keeping content OUT as it is about putting the most relevant and targeted content IN. This can be alleviated somewhat by making sure SMEs are actually trained on how to be SMEs, which doesn’t happen nearly enough. Depending on the temperament of the SME, you can also start things off on a better footing by working backwards: begin with the best outcomes, then work with the SME to define the objectives that meet them. Make sure the SME can clearly articulate the agreed baseline knowledge of the proposed student, and require them to justify every piece of content included. I usually emphasize bandwidth and file sizes to help diminish their desire to “pump up” the content with “gee whiz” information that’s either rarely needed at the trainee’s current level or could be provided in a job aid – available, but NOT an integral part of the course content.
Yes, watching out for content dumping is a good thing, but clearly identifying scope, desired outcomes, target audience, and limitations – and diplomatically offering “advice” to the SME up front on what their role actually is – can often prevent the unfortunate results described above.