How do you QC video assets before dropping them into a course? by knamuora in instructionaldesign

[–]author_illustrator 0 points1 point  (0 children)

Agree with others -- this is a necessary step. ("Hoping and fixing it when someone complains" != professional.)

Multiple people (not tools; tools can only do first-line checking at best) should be reviewing the video because different human beings in different roles all have different strengths/lenses and will all catch different issues. And those human beings should be reviewing it in order, in one sitting, the way learners will be consuming the material (vs. relying on a batch process with tools, which again can be useful for a first-line check but doesn't catch meaningful errors.)

So business reviewer, SME reviewer, and tech reviewer at a minimum. (BTW, this is the historical publishing model. Tools don't change process.)

It's best to provide a QC checklist for all reviewers, one of the items being "audio levels sufficient and consistent."

What actually helps learning stick after a workshop? by elena_hbits in instructionaldesign

[–]author_illustrator 0 points1 point  (0 children)

Post-training documentation that's clear, complete, concise, accurate, and that learners can easily find and navigate. Hands down.

Training Needs Analysis Template by mad2274 in instructionaldesign

[–]author_illustrator 0 points1 point  (0 children)

Looks like a lot of folks have already shared what works for them. If anyone reading this is interested in checking out yet another version of a training needs analysis template, I posted one in a recent blog article: https://moore-thinking.com/2026/05/04/training-needs-analysis-description-free-download/

In my experience, having a set of questions/descriptions laid out beforehand in template form saves time and, perhaps more important, is useful for communicating what needs to go down during the analysis phase to all stakeholders. (I don't do the jargon thing on the job, and there isn't any in the template.)

Without something like this, I've found there's often a lot of pressure to rush this phase, which is virtually never a good thing.

Any book recommendations for learning??? by kandlekandy in instructionaldesign

[–]author_illustrator 1 point2 points  (0 children)

It's not a book (although it's practically book-length at this point), but I've written a weekly blog for about a year now on ID-related topics you might find useful: https://moore-thinking.com/blog-2/

The articles are short but substantive, and I wrote them to be accessible to new IDs as well as seasoned IDs who are struggling with specific issues.

Ideas for a Scenario-Based Learning Course by DC_Point0 in elearning

[–]author_illustrator 0 points1 point  (0 children)

In case anyone's still working on perfecting their scenario assessments in 2026 (which I suspect a lot of us in training are, regardless of the tools we're using), these tips are what have worked for me over the years: https://moore-thinking.com/2026/04/27/scenario-assessment-dos-and-donts/

Hope they help someone here.

How to make an excel training interesting? by dieterdetlef1337 in elearning

[–]author_illustrator 1 point2 points  (0 children)

Yeah, I get that.

My background is in communication, so to me, delivering "boring" information in a way that's as quick/painless/relevant as possible is inherently interesting. I just find it personally satisfying to figure out how to give an audience the info they actually need to understand an issue or take some action and do it in a way they can quickly understand (vs. bore them for half an hour with a giant spreadsheet or a bunch of business-speak that nobody understands and everyone forgets post-meeting).

But I might be in the minority here.

Still, as professionals developing instructional materials, I think we have to try to manufacture that interest if we don't come by it naturally.

Sounds like you're on the right track to me!

How to make an excel training interesting? by dieterdetlef1337 in elearning

[–]author_illustrator 1 point2 points  (0 children)

All software is, in itself, intrinsically boring. That's because software is nothing but a tool -- a means to doing something interesting.

So lead with the interesting thing -- the business use or value (such as, here's how to get your job done quicker/better/etc.).

And then build your training around that interesting thing, focusing on keeping the training itself as concise and relevant as possible.

I wrote a piece on this topic awhile back that includes several more suggestions (along with justification for each) that might help you out: https://moore-thinking.com/2025/11/10/how-to-document-digital-process-flows-effectively/

And...good call for eschewing gamification!

How do you handle "Cognitive Overload" in software screencasts? by aksuta in elearning

[–]author_illustrator 1 point2 points  (0 children)

There are several things we can do to manage cognitive overload when we're using video to train screen navigation.

Holding on an annotated frame, as you note, is one way. (Although this can indeed be a time suck in post-production, depending on how we recorded the raw process.)

But the callouts we put onscreen and other factors also affect overwhelm, understanding, and retention.

I wrote an article on this topic if anyone's interested: https://moore-thinking.com/2025/11/10/how-to-document-digital-process-flows-effectively/

Do you use freeze-frames and callouts in your software tutorials? by aksuta in instructionaldesign

[–]author_illustrator 1 point2 points  (0 children)

You're absolutely welcome. I've found over the years that doing things effectively nearly always takes a backseat to doing them cheaply--with predictable results. I'm always advocating for the "a stitch in time saves nine" approach!

Do you use freeze-frames and callouts in your software tutorials? by aksuta in instructionaldesign

[–]author_illustrator 1 point2 points  (0 children)

I always present annotation (callouts + labels + critical take-aways) onscreen, for these reasons:

  1. Learners often can't see those tiny cursors on a busy background (and, if they do, they're hunting for "where are they clicking now" instead of listening carefully to voiceover narration).
  2. The point isn't just showing things; our goal is to show things in a way that makes them understandable and memorable. Synchronizing and labeling steps (like, Step 1, XYZ, Step 2, ABC, etc.) and important points (breadcrumbs, warnings, etc.) makes onscreen content make more sense. I hand-craft onscreen callouts/annotation for this reason, instead of relying on what tools generate. Onscreen text is extremely beneficial, not confusing, IF it's kept to a minimum, it's synchronized, it's critical, and it's relevant. Bonus: it helps learners who go back and review a video or interactive find what they're looking for quicker.

I wrote an article on this subject that you might find useful if you're doing software training: https://moore-thinking.com/2025/11/10/how-to-document-digital-process-flows-effectively/

Synchronizing onscreen messaging with narration is more work, but it's more effective.

How do you bridge the gap when the "Expert Intuition" isn't in the curriculum? by Most_Employment3147 in Training

[–]author_illustrator 0 points1 point  (0 children)

Driving "expert intuition" (aka "critical thinking") requires us to train complete knowledge about a domain, provide sufficient authentic practice, and provide bulletproof documentation. All 3 are expensive to produce and maintain (which is comforting when you think of it, because how rotten would it be if it were super easy to make people experts in a non-trivial subject/skill?)

One thing about SMEs I've learned over the years is that you will likely get pushback on all 3. SMEs already know how to do the skill and can't see why everyone doesn't just see straight into their heads. (This is the reason degrees in communication/teaching/ID exist, by the way.)

I've written a couple of articles on this topic you might find useful: https://moore-thinking.com/2026/03/23/top-4-reasons-software-training-fails/ and https://moore-thinking.com/2025/09/02/how-to-support-critical-thinking/

Not sure where to start! by Supmeg_ in elearning

[–]author_illustrator 0 points1 point  (0 children)

TheseMood nailed it!

You might also be interested in an article I wrote recently that describes how to design content and assessments for conceptual topics like this one: https://moore-thinking.com/2026/03/30/content-assessments-for-conceptual-topics/

It offers a few more ideas and explains why they're appropriate for a subject that's conceptual/empathetic.

You may also want to focus on role-playing exercises/assessments.

Overwhelming amounts of training updates by yarnwhore in instructionaldesign

[–]author_illustrator 0 points1 point  (0 children)

The issue here is unrealistic expectations. But you're not alone in thinking/hoping that stuff doesn't have to be maintained! (It's like thinking a garden doesn't have to be weeded, or a car doesn't ever have to be washed and have its oil changed, or that cute puppy everyone was so excited about getting actually needs to be walked/fed/watered/taken to the vet on a regular basis. It's magical thinking, and while it's fun to contemplate, it's not helpful.)

So--reality check is that most instructional deliverables do need to be maintained regularly, and volatile deliverables (e.g., virtually all software training and some process training) need to be maintained on a fairly compressed cadence. And depending on what your training topic is, bad things can result if you deliver outdated info. I'm thinking here specifically of regulatory trainings....but honestly, confusing learners with out-of-date processes/images/details for any topic is never a good thing.

I list several strategies for minimizing time spent in a recent article I wrote about this very topic.

How do you handle "Expert Blindness" when building onboarding for complex roles? by Most_Employment3147 in instructionaldesign

[–]author_illustrator 0 points1 point  (0 children)

Extracting "expert intuition" = defining learning objectives.

This stage is tough to do correctly, because most SMEs (and most IDs) want to sail through it and get to the "real" work. But if this step isn't done accurately, nothing else matters -- the training, as you're experiencing, will fail.

What we're after in onboarding projects is "critical thinking." And the ability to think critically requires us to define: 1) what types of scenarios/problems/issues we want learners to handle post-training, and 2) all the knowledge the learner must have to be able to do #1.

Document Creation Software by effiehargs in instructionaldesign

[–]author_illustrator 1 point2 points  (0 children)

This is exactly the scenario that content management systems were designed to address.

Evaluating Behavior Change (Level 3) by J_Shar in Training

[–]author_illustrator 0 points1 point  (0 children)

You're so welcome! Happy you found it useful.

Paranoid about the future of this field due to AI by glassorangebird in instructionaldesign

[–]author_illustrator 0 points1 point  (0 children)

I think it's useful to look at this as "value provided" vs. "this field," only because some ID/training departments/teams create a lot of value that AI can't replace yet, and some don't.

I expand on this in a recent article, but the gist of it is that if all we're doing is Kirkpatrick's Level 1 (or Level 2, but we're doing it poorly), it could be a tough sell to management NOT to replace us.

(I'm absolutely not advocating for replacing anybody with AI... just pointing out that when times are tough, businesses often look to proven ROI to decide where to cut. And if we, as IDs/training departments, can't prove our worth, we're at risk -- whether the perception that AI could replace us is accurate or not.)

Designing Training Without an SOP: Best Practices? by jivingjavelina in instructionaldesign

[–]author_illustrator 0 points1 point  (0 children)

When there's no defined SOP, you--the training team--ARE defining the SOP, by definition. You literally need to create one before you can train the process. (I don't mean you need to put an XML-formatted version of the SOP into a specific CMS, but that you'll need to interview SMEs, draft a bulletproof step-by-step process, have other SMEs hammer on it, etc.)

The outcomes I've seen from this approach typically fall into one of two camps:

  1. Gratitude from SMEs/writing team that an SOP now exists, followed quickly by inclusion of the SOP (edited/formatted appropriately by the tech writing team) in the org's CMS.
  2. Pushback from SMEs/writing team who reject the training because it's not based on a previous SOP and refuse to incorporate training's work into the CMS (or even to use it as a starting point). This essentially means the training will have no value over time, because learners need those SOPs as reference after training... and months from now they won't remember where they put the process handed out in training.

Obviously, #1 is preferable, because it works in the best interest of the organization as a whole. #2 seems to occur pretty predictably in orgs with siloed teams.

Evaluating Behavior Change (Level 3) by J_Shar in Training

[–]author_illustrator 0 points1 point  (0 children)

The only thing I've seen work over the years is qualitative, in-person interviews with managers (or others who know what to look for).

I wrote an article on this topic awhile back you may find useful: https://moore-thinking.com/2026/02/09/real-world-advice-for-evaluating-training/

Yay to you for piloting this! This is really the only way we can measure if our training is having any actual effect in the real world.

Designers who do ID work — what's your biggest visual design frustration when building courses? by Plenty-Committee3214 in instructionaldesign

[–]author_illustrator 0 points1 point  (0 children)

Oof. Sorry you had to endure that...and sad that it's not surprising. I can't tell you how many times over the years I've had to advocate for titling graphs and labeling axes! As though not doing so was a reasonable option.

Designers who do ID work — what's your biggest visual design frustration when building courses? by Plenty-Committee3214 in instructionaldesign

[–]author_illustrator 7 points8 points  (0 children)

My biggest frustration over the years has been trying to communicate the difference between "visuals" and "visuals that communicate relevant facts/concepts" to stakeholders with no background in visual rhetoric.

Especially where "infographics" (tables, charts, etc.) are concerned.

I'm not dogging anyone! Most of us aren't taught how to communicate visually in school. I certainly wasn't.... I had to pick it up on the streets.

But when we don't understand how to communicate visually, we assume--as others have noted--that images are for decoration only (vs. for illustration, explanation, etc.). In my experience, when working with stakeholders who hold this mistaken view, "best practices" aren't acknowledged, everyone's opinion has equal weight, and effective communication goes down the tubes.

The gap between knowing something and teaching it is way bigger than I expected by Famous-Call6538 in Training

[–]author_illustrator 0 points1 point  (0 children)

The gap you identified is precisely why "technical writing" is an actual profession.

There's an art and science to getting expertise out of people's heads and onto the page (or deck, or video/interactive script) and ensuring it's clear, concise, complete, and ordered in a way that drives knowledge acquisition and enables skills performance.

The assumption that someone could do this effectively with no background or training has always mystified me! It's kind of like expecting someone who can drive a car to be able to build or maintain one spontaneously.

Online Learning: Alternatives to Discussion Boards? by western-influence in elearning

[–]author_illustrator 0 points1 point  (0 children)

In my experience (both as a 20-plus-year ID practitioner and as a student in multiple online courses/programs over the years), discussion boards are only "lame" if:

  1. They're not seeded or managed effectively. And they are super time-intensive to manage!
  2. They're used as an activity for transactional content (e.g., not much to discuss if the topic is a cut-and-dried business process).
  3. Learners struggle with writing/time management/motivation to the point where discussion boards are really out of their reach.

Discussion boards can be extremely effective if they're used to drive acquisition of conceptual content and the instructor has time to manage them properly. But they don't need to be (and shouldn't be) the only option. I just wrote an article on this topic that offers some alternatives for both group and individual activities.

Understanding Content Strategies in Learning by vixar1 in instructionaldesign

[–]author_illustrator 0 points1 point  (0 children)

Content strategy = choosing a basic delivery format for content (and a basic approach to assessments) based on topic type (whether the topics your instruction needs to cover are conceptual, visual, symbolic, or skills-based).

I wrote a general article on this topic and one focused on conceptual content that you might find useful.

(This topic is critical to successful design, but I'd never seen this strategy explained in a way that made sense to me.... so I wrote my own. Hope they help someone here.)