L&D Strategist & Consultant
I build learning systems that change how people think — and how fast they decide.
15 years across healthcare, K–12, corporate, nonprofit, and arts education. One question drives all of it: are people making better decisions because of the learning I designed?
I started in classrooms — real ones, with students who weren't interested in compliance, only in whether what I was teaching meant something to them. That question has never left me. Fifteen years later, I'm still asking it. The classrooms look different now: conference rooms, clinical training suites, leadership cohorts, virtual delivery platforms. The question is the same.
My work spans healthcare, K–12 education, corporate learning, nonprofit leadership, and the performing arts. Each environment taught me something the others couldn't. Healthcare taught me that bad learning design has consequences. K–12 taught me that psychological safety comes before performance, every time. Corporate taught me that stakeholder alignment is part of the design work, not a prerequisite to it.
I run Rooted Growth Consulting, where I work with organizations to build L&D systems that actually change behavior. I also publish Learn with Maydwell, a blog and podcast for leaders navigating hard transitions: conflict, unlearning, the shift from directive to dialogue.
Outside the work: I play and teach trumpet, spend too much energy on Philadelphia sports, and keep looking for the problem nobody has named yet.
eLearning modules, scenario assessments, leadership programs, facilitation guides, onboarding systems, LinkedIn content engines, and resource hubs.
BVA Strategic Learning Architect pursuit. Scaling the LinkedIn Authority Engine to 7–10 clients. Launching The Alignment Check Kit.
Every engagement, every program, every piece of content I create comes back to the same question: are people making better decisions because of this?
"If the people I trained aren't deciding better, the program didn't work. Full stop."
I design learning programs tied to actual business problems — not training requests. That means diagnosis before design. Partnering with SMEs and senior leaders to identify what behavior needs to change, then building the system that changes it.
My output isn't courses. It's capability. Onboarding that produces proficiency. Leadership development that builds decision quality. Clinical training that improves patient outcomes.
A fully documented operating system for consultants and practitioners who need to build a professional audience. Every client runs the same repeatable cycle: intake → NotebookLM intelligence build → monthly calendar → batch production → performance loop.
The system is designed for 7–10 simultaneous clients. A 30-minute discovery call, a 3-hour preview cap, and documented batch rules keep it sustainable. I don't do custom — I do consistent.
Organizations don't have an AI tool problem. They have a decision-making problem that AI is exposing. Most AI training teaches people to prompt better. That's not the job. The job is changing how teams think, evaluate, and decide.
My frame is Decision Velocity: faster, higher-quality, more consistent decisions. That's the outcome worth designing for. Not tool fluency. Not prompt libraries. Judgment.
Development is not a transaction. It is a transformation. When individuals feel seen, supported, and challenged, they unlock the confidence and capability to grow — both personally and professionally.
Learners must know what's expected, why it matters, and how it connects to their goals before anything else. Clarity creates buy-in. Buy-in drives growth.
Journaling, peer feedback, digital checkpoints — built in, not optional. Reflection is the bridge between learning and becoming.
Learners are agents, not recipients. Critical thinking, active listening, and purposeful application sit at the core of every program I design.
Progress and mindset matter more than flawless output. I create cultures where experimentation, curiosity, and iteration are celebrated — not punished.
Knowledge becomes leadership when people understand principles deeply enough to adapt, apply, and teach them forward. The goal is never compliance. It is transfer.
"My role is not to deliver training. It is to create conditions where learning becomes culture."
— Jonathan Maydwell
Start with the problem. Not the solution.
Every learning experience must have a clear "why" visible to the learner.
Data informs design. Outcomes validate it.
Psychological safety comes before performance expectations.
Inclusive design is non-negotiable across all environments.
Never let a program go static. Learning is a living system.
The question I use to evaluate every program I design is not "did people complete the training?" It's: are they making faster, higher-quality, more consistent decisions because of what they learned?
Decision Velocity is the metric that separates learning design from learning activity. Most organizations measure the wrong thing — seat time, completion rates, satisfaction scores. Those measure attendance, not change. They tell you people showed up. They tell you nothing about whether the work mattered.
This frame applies to everything I build — whether it's a clinical onboarding program, a leadership cohort, or an AI readiness initiative. If the people I trained aren't deciding better, the program didn't work. Full stop.
It also applies to consulting engagements. The one metric that tells me a year was worth it: are the organizations and people I worked with making faster, higher-quality, more consistent decisions because of how I designed their learning or their content? If yes — the year worked.
Increase in leadership training participation
Improvement in team execution quality
Improvement in learner retention year over year
Client discharge success rate
Most AI readiness training makes the same mistake: it teaches tools instead of thinking. Get people comfortable with ChatGPT, add a few prompt templates, call it transformation. That's not readiness — that's compliance with a new interface.
The real question isn't "can your team use AI?" It's "does your team make better decisions when AI is in the room?" Those are completely different questions, and most organizations are only asking the first one.
I work with organizations navigating what I call the Quiet Pivot — the slow, intentional shift from how work was done to how it needs to be done. Not a big-bang transformation. Small, structured steps that build judgment faster than any tool adoption program can.
The Quiet Pivot rejects two ideas: the Automation Fallacy (automating broken workflows to make them faster) and compliance-first AI training that measures course completion instead of behavior change. Both produce activity. Neither produces readiness.
AI doesn't replace judgment. It exposes gaps in it. The goal is building decision-making capacity that holds up — with or without the tool.
Sustainable AI integration happens in deliberate, small steps: not massive rollouts, not mandated adoption, but designed learning that changes how people think first.
Tool fluency fades when the interface changes. Decision quality compounds. That's the asset worth building.
Automating a broken process makes it break faster. Every AI readiness engagement starts with diagnosing the workflow before touching the tool.
Writing about the things that don't make it into training decks: conflict that signals change, empathy as a design principle, the courage it takes to unlearn something that used to work. Narrative first. Insight second. No listicles.
Short, substantive conversations on leadership and L&D — built for practitioners who need something useful in the time they actually have. Five minutes. One idea. Enough to act on it before the next meeting starts.
If you're dealing with a learning problem — or you're not sure yet if it's a learning problem — that's exactly the right place to start a conversation. I don't pitch solutions before I understand the situation.