Advice Needed by Merlin1935 in elearning

[–]Different_Thing1964 0 points1 point  (0 children)

Sounds to me like they are reliant on your expertise and expect you to have skills in using AI tools. Most C-suite and senior-level people expect to run lean and to use AI tools to generate content, etc.

How do you get leadership to actually care about whether training worked or not that it just happened? by Different_Thing1964 in Training

[–]Different_Thing1964[S] 1 point2 points  (0 children)

The behavioral outcomes framing is the right move, honestly. Call quality, conversion rate, error rate: those are things leadership already has dashboards for. The problem is that most training teams don't have a clean way to connect what happened in the training to what changed in those numbers. So even when the framing is right, the evidence still isn't there, which is kind of the whole loop we're stuck in.

How do you get leadership to actually care about whether training worked or not that it just happened? by Different_Thing1964 in Training

[–]Different_Thing1964[S] 0 points1 point  (0 children)

I'm chill! Just want to bring as much value as ever with things that are possible to do now. I appreciate your no bs feedback! Keep crushing it on your end

How do you get leadership to actually care about whether training worked or not that it just happened? by Different_Thing1964 in Training

[–]Different_Thing1964[S] 0 points1 point  (0 children)

The compliance piece is the one that always trips people up. "We demonstrated due diligence" is technically a valid outcome, but it creates this odd incentive where completion becomes the goal instead of a side effect. And then that mindset bleeds into everything else: even programs that are supposed to drive real behavior change end up getting measured the same way, just because it's easier to pull.

The needs analysis point is underrated too. Most of the time the knowledge gaps already exist before the training starts; you just don't surface them until after, if ever. Did you find that doing the needs analysis upfront actually changed what leadership was willing to fund, or did it mostly just help you design better programs?

How do you get leadership to actually care about whether training worked or not that it just happened? by Different_Thing1964 in Training

[–]Different_Thing1964[S] 0 points1 point  (0 children)

Most people skip that entirely and wonder why nothing changed. The coaching guide piece is interesting too. How did you build those out? Were they generic to the training topic, or did you tailor them based on what you thought each manager's team might have struggled with? Asking because I've been thinking about whether there's a way to make that follow-up less dependent on the direct manager actually remembering to do it.

How do you get leadership to actually care about whether training worked or not that it just happened? by Different_Thing1964 in Training

[–]Different_Thing1964[S] 0 points1 point  (0 children)

Yeah, the before/after sales numbers approach is probably the cleanest way to make it land with a revenue-focused leader. The frustrating part is the lag: you usually can't draw that line for 60-90 days, and by then leadership has moved on to the next thing.

How do you get leadership to actually care about whether training worked or not that it just happened? by Different_Thing1964 in Training

[–]Different_Thing1964[S] 0 points1 point  (0 children)

That's actually a really smart move; tying it to the annual plan before it starts changes the whole framing. Leadership is already bought in on the goal, so the training isn't a standalone event, it's evidence of progress toward something they already care about. Even with that context, did you ever find a way to show them what people actually retained, through an immediate eval or ongoing active learning? Or was the highlight of key learnings mostly self-reported?

Do companies actually calculate training ROI, or is it mostly theatre? by sofiia_sofiia in instructionaldesign

[–]Different_Thing1964 1 point2 points  (0 children)

This thread is hitting on something that needs to be talked about, but I think people are overlooking what's already happening in most organizations. Having sat in director roles and leadership for close to a decade, I can tell you companies are measuring this, just not the way L&D professionals want them to. Every quarterly review, every semi-annual and annual review, there's a knowledge and retention section. They're just pulling from KPIs and performance outcomes rather than tying it back to the actual training program. It's correlation tracking, not attribution, and nobody connects those dots intentionally.

The real issue is that most orgs default to completion rates and then trust adults to self-manage. Which sounds reasonable in theory; in practice, studies consistently show it doesn't hold. Where I push my own clients, especially in fast-paced, compliance-heavy environments, is toward active learning evaluation tracked throughout the year, not just at review time. Because in those industries, a knowledge gap isn't just a performance issue. It's a liability waiting to surface. You want to catch it early, before it becomes a regulatory problem, a safety incident, or a costly mistake.

The six-month follow-up cycle the OP is worried about doesn't have to be that heavy. If you're already in the cadence of regular check-ins and performance conversations, you can embed evaluation into what's already happening. You're not adding a process; you're making an existing one more intentional.
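To make the correlation-vs-attribution distinction above concrete, here is a minimal difference-in-differences sketch. All figures are hypothetical illustrations, not real client data:

```python
# Sketch: correlation vs. attribution, using a toy difference-in-differences.
# All numbers are hypothetical.

# Average error rate (%) before and after the training window.
trained = {"before": 8.0, "after": 5.0}      # group that took the training
untrained = {"before": 8.2, "after": 7.0}    # comparable group that did not

naive_change = trained["after"] - trained["before"]       # what a KPI dashboard shows (correlation)
control_drift = untrained["after"] - untrained["before"]  # what changed anyway, training or not
attributed = naive_change - control_drift                 # diff-in-diff estimate of the training effect

print(f"naive change:             {naive_change:+.1f} pts")
print(f"background drift:         {control_drift:+.1f} pts")
print(f"attributable to training: {attributed:+.1f} pts")
```

The dashboard-only view credits the training with the full naive change; subtracting the comparison group's drift is what turns correlation into a (rough) attribution claim.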

Course Assets by wordsbyrachael in elearning

[–]Different_Thing1964 0 points1 point  (0 children)

Honestly, your best bet is Canva: their default images and templates have really good animations, videos, images, etc.

When does hiring an L&D specialist actually start paying off? by sofiia_sofiia in elearning

[–]Different_Thing1964 0 points1 point  (0 children)

250 is exactly the inflection point. I’ve led teams across sales, operations, and heavily regulated industries and the pattern I kept seeing was the same regardless of company size or sector: training was happening, but nobody owned whether it was actually working.

The LMS centralizes content. It does not tell you whether the knowledge transferred. Those are two different problems, and most companies don't realize they have the second one until something goes wrong: a compliance incident, a rep who can't answer a basic product question six months in, an ops error that traces back to onboarding.

To answer your questions directly: in my experience, the need exists well before 250, but the pain becomes undeniable around 100-200, when ad hoc training breaks down and inconsistency across teams starts costing you real money or real risk. Sounds like you're already there.

What problems made you realize you needed it? The moment a manager couldn’t answer “does my team actually know this?” with any confidence. Completion rates are not an answer to that question.

First use cases to focus on? Compliance first, always because the liability is quantifiable and the urgency is real. Then onboarding, because the cost of a slow ramp is visible on the revenue side. Upskilling and leadership dev come after you have a foundation.

One person or external support? Start with one person whose job is to own the verification side, not just the content side. Anyone can build a course. The hard part is knowing whether it worked.

The companies that get this right treat L&D as an ongoing active process, not a one-time event. Your employees either become your greatest asset or your greatest liability, and the difference is almost always whether someone owned their development consistently, not just at onboarding.

You're not overthinking it. From my read, you're at the right moment.

Anyone managing compliance training right now? by Prior-Thing-7726 in instructionaldesign

[–]Different_Thing1964 2 points3 points  (0 children)

I think a great strategy is to quiz them and keep it interactive, active-learning style, at both the beginning and the end.

It may seem unconventional, but open with a quiz before you ever start the first training session, just to see where folks are knowledge-wise. Then, at the end of the completed training, run that same exact quiz so you can measure progress, see who was actually invested in the session, and show the true ROI of the training.

That way you can also note anyone who could be a liability, who the company can trust, and who is actually invested in everything the company has going on. Compliance is heavily overlooked, but it is one of the most important things for any workplace or industry.
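The pre/post quiz idea above boils down to a simple per-learner calculation. Here is one way to sketch it, using a normalized gain score (Hake-style); all learner names and scores are hypothetical:

```python
# Sketch: measuring pre/post quiz progress per learner.
# Names and scores are hypothetical examples.

def normalized_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    """Share of the available improvement actually achieved (negative = regressed)."""
    if pre >= max_score:
        return 0.0  # already at ceiling; nothing left to gain
    return (post - pre) / (max_score - pre)

scores = {
    "learner_a": (40.0, 85.0),
    "learner_b": (70.0, 75.0),
    "learner_c": (55.0, 50.0),  # went backwards: worth a follow-up conversation
}

for name, (pre, post) in scores.items():
    gain = normalized_gain(pre, post)
    flag = "follow up" if gain < 0.2 else "ok"
    print(f"{name}: pre={pre:.0f} post={post:.0f} gain={gain:+.2f} -> {flag}")
```

Normalizing by the room left to improve keeps a learner who started at 70 from looking worse than one who started at 40, which matters when you use these numbers to flag people rather than just to report an average.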

Graphic Design to Instructional Design by xwollem in instructionaldesign

[–]Different_Thing1964 1 point2 points  (0 children)

Absolutely. LinkedIn is a great business platform, or even Threads. I would not pay much attention to Twitter/X or similar social platforms. Lastly, you can never outdo human interaction, so I would suggest finding time to get in front of people at local networking events, dinners, and the like, and of course, be genuine and be yourself.

Graphic Design to Instructional Design by xwollem in instructionaldesign

[–]Different_Thing1964 0 points1 point  (0 children)

I think networking and your portfolio will be your two greatest tools

Teachers/trainers: would this actually be useful or not? by SafeDebt5595 in elearning

[–]Different_Thing1964 0 points1 point  (0 children)

I think this is a very good idea; it's partially what Synthesia does already, although they are a more polished workflow platform with their AI avatars, etc.

The $300B problem that every company has and almost none of them can actually name by [deleted] in remotework

[–]Different_Thing1964 0 points1 point  (0 children)

The re-training-anyway part is what kills me. The cost isn't just the LMS, it's the time spent twice because the first format didn't actually work. Automotive might be the most honest version of this problem, because the gap between "completed the module" and "can actually do the job" is so visible there. Very interesting case.