MentorStack Team

What AI actually changes about career growth (and how to lead through it)

AI career growth · AI in talent development · HR AI strategy · AI mentorship platform · internal mobility AI · AI in L&D

I've been having roughly the same conversation with HR leaders for a year now. They've seen what AI agents can do. They've watched their employees use Claude or ChatGPT for half their day jobs. They know career development is going to be reshaped by this. But when I ask what they're actually doing about it, the answer is usually some variant of "we're piloting an AI coaching tool" or "we've added an AI module to our LMS."

Both of those are fine. Neither is the strategic move.

The strategic move requires understanding what AI is actually changing about career growth, not just bolting AI onto the existing development stack. And what it's changing isn't really what most HR teams think.

The shift most people are missing

The standard discussion of AI in career development focuses on the wrong layer. Articles about AI-powered learning paths. AI coaches. AI skill assessments. All useful. None of them is the actual shift.

Here's what I think is the actual shift: career planning is collapsing from a quarterly event into a session-level interaction.

When an employee can open Claude on a Tuesday afternoon and produce a credible 12-month growth plan in five minutes, the entire cadence of how careers get developed changes. The annual development conversation isn't dead, but it stops being the moment that matters. The moments that matter are whenever an employee has a question, a doubt, a new aspiration, a moment of clarity. That can happen any day. Increasingly, when it happens, the person they're talking to is an AI agent, not their manager.

If you accept that as the underlying shift — and I think you have to — most of the rest of the development stack needs rethinking.

The development form, for instance. Forms exist to capture intent at fixed points in time. AI agents capture intent continuously. The form is a fossil from an era when capturing intent was expensive. Most companies still spend significant L&D effort improving theirs.

The annual review's role changes too. It's still useful for compensation, for performance, for big-picture realignment. But it stops being the moment where development gets done. Development happens between reviews now, in moments your L&D team probably never sees.

And then there's the harder one. The L&D function's strategic question stops being "do we have the right curriculum, the right frameworks, the right learning paths?" It starts being something more like "is our internal mentor network deep enough that any employee with a plan can find someone relevant to learn from, on roughly the timeline AI is operating on?"

Most companies are not set up to answer yes to that. They are set up to answer yes to the curriculum question. That's the gap I want to talk about.

Where the bottleneck moves

A career plan written by an AI is a starting point, not a finish line. It can tell an employee what to learn and what to position for. It cannot tell them who in their specific company has done this before, who's free to grab coffee, who's running the kind of project worth shadowing, who's got political capital to sponsor them when a role opens up.

That's all human work. That's where careers actually get made.

Once you accept that AI is going to handle the planning side, the strategic question for HR leaders becomes how to make the human network around it as available, well-matched, and easy to access as possible. That's a different problem from the one most development teams are organised around. And it's a harder problem, because it requires investments that don't show up neatly in a learning management system.

You need mentor density first. Most companies don't have it. Their mentor pool is a handful of senior leaders who get asked to mentor everyone, get burned out, and stop saying yes. That's a structural problem AI agents will make worse, not better, because more employees will be producing more credible plans more often, all of them needing real humans to act on those plans.

You also need matching speed. Quarterly mentor matching cycles do not survive contact with on-demand AI planning. The match has to happen at roughly the speed the plan happens, which means days, not months. If your current matching process can't do that, that's the next investment.

The third piece is harder to name and most companies lose it entirely. Every mentor conversation, every shadowing arrangement, every sponsor relationship generates information about what's working. Where the network is strong. Where it's brittle. Which roles are getting filled internally and which aren't. Most of that signal evaporates because there's no infrastructure capturing it. The companies that build that capture loop will know things about their own talent that their competitors won't.
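One way to stop that signal evaporating is to log every mentoring touchpoint as a small structured event and aggregate it per function. The sketch below is illustrative only — the event shape, field names, and sample data are invented, not a real schema:

```python
from collections import defaultdict

# Hypothetical event stream: every mentor conversation, shadowing
# arrangement, or sponsorship request logged as (kind, function, outcome).
EVENTS = [
    ("mentoring",   "marketing",   "met"),
    ("mentoring",   "marketing",   "met"),
    ("shadowing",   "engineering", "met"),
    ("mentoring",   "engineering", "no_mentor_found"),
    ("sponsorship", "finance",     "no_mentor_found"),
]

def network_signal(events):
    """Per function, the fraction of requested connections that actually
    happened. Low numbers flag where the network is brittle."""
    asked, met = defaultdict(int), defaultdict(int)
    for _, function, outcome in events:
        asked[function] += 1
        if outcome == "met":
            met[function] += 1
    return {f: round(met[f] / asked[f], 2) for f in asked}

print(network_signal(EVENTS))
# e.g. marketing connects every time; finance requests go nowhere
```

Even a capture loop this crude tells you which parts of the organisation can absorb AI-generated plans and which can't.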

What this looks like for someone actually using it

Take a real example. Someone on the marketing team wants to be a director in twelve months. Tuesday afternoon, she opens Claude and asks it to build her a 12-month plan. The agent works through it: own a cross-functional campaign in Q1, build management foundations in Q2, lead a small team in Q3, position for the conversation in Q4.

In an organisation without the right infrastructure, that plan goes into a Notion doc and dies. She knows what she should do. She doesn't know who to learn from, who could sponsor her, or which projects are worth shadowing. She might bring it up at her next 1:1, but the friction between "great plan" and "first concrete action" is high. Most plans don't survive the friction.

In an organisation with the right infrastructure, what happens next is different. The agent doesn't stop at the plan. It calls into the company's mentorship platform — through MCP, the protocol Anthropic released in late 2024 that lets AI agents talk to external systems — and surfaces the relevant humans. The director who made the same jump eighteen months ago. The VP who's a credible sponsor. The senior manager running a campaign worth shadowing. It books the first conversation. Thursday at 2pm.

This is what the MentorStack MCP server does, by the way. It's the example I know best because we built it. The architectural pattern is the more important point though: AI agents handle planning, mentorship platforms handle matching and scheduling, MCP wires them together. Whether you build that yourself, buy MentorStack, or buy something else, that architecture is roughly what the next decade of well-run development programs will look like.

What I'd push you to actually do

If you're trying to figure out where to start, here's roughly what I'd push.

Audit your mentor network honestly. Not how many people are enrolled in the program. How many active mentor relationships exist, how recently they've actually met, how distributed they are across functions and levels. Most companies discover their mentor pool is concentrated in fifteen people who are exhausted. That's the density problem, and AI agents will make it visible faster than anything you've previously rolled out.
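That audit can start from a very simple export. A rough sketch, assuming you can pull mentor-relationship records as (mentor, mentee, last-met date) rows — the data and field names here are invented:

```python
from datetime import date
from collections import Counter

# Hypothetical export from your mentoring program: (mentor, mentee, last_met)
RELATIONSHIPS = [
    ("dana", "li",  date(2025, 5, 2)),
    ("dana", "sam", date(2025, 4, 18)),
    ("dana", "ana", date(2024, 9, 1)),   # stale: no meeting in months
    ("raj",  "kim", date(2025, 5, 10)),
]

def audit(relationships, today=date(2025, 5, 20), stale_days=90):
    """Count active relationships (met within stale_days) and measure
    concentration: the share of them held by the single busiest mentor."""
    active = [(m, e) for m, e, last in relationships
              if (today - last).days <= stale_days]
    load = Counter(m for m, _ in active)
    top_share = max(load.values()) / len(active) if active else 0.0
    return {"active": len(active), "top_mentor_share": round(top_share, 2)}

print(audit(RELATIONSHIPS))
```

Two numbers, and they answer the honest version of the question: not enrolment, but how many relationships are alive and how concentrated they are in a few exhausted people.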

Pick one cohort and run an actual experiment. A specific function or team. Give them AI planning tools and a frictionless way to find internal mentors, then watch what happens to their development conversations and their mobility rate over six months. Don't try to roll this out company-wide before you know it works inside one team — that's how good ideas die in HR.

Stop investing in the form. I mean this literally. If you've got resources going into improving the development form, the IDP template, the talent review document, redirect them. Those things will be largely irrelevant within two years and the investment isn't compounding. Put the money into the mentor network and the matching layer.

Get serious about matching speed too. If you can't tell me how an employee gets connected to a relevant internal mentor in under a week, you don't yet have what you'll need. This usually means a platform — manual matching at scale doesn't actually work, and adding more L&D headcount to do it manually has its own problems.

The last one is underrated. Treat AI fluency as an expectation for managers, not a nice-to-have. I don't mean prompt engineering training. I mean managers understanding what AI can and can't do for someone's career, so they can guide their reports through using these tools instead of feeling threatened by them. The managers who can't make this transition will be increasingly unable to coach their people, and it will become very visible very fast.

The thing I want you to take away

The companies that win the next decade on talent will be the ones that treat AI as something that puts more weight on human development, not less. They'll invest in the things AI can't do — the mentor network, the sponsor relationships, the internal connections that turn a plan into actual growth — at exactly the moment everyone else's planning is becoming effortless.

This is going to feel backwards. As AI gets better, the obvious instinct is to put more money into AI. I think the smarter move is the opposite. AI raises the bar on what humans need to provide, because the planning is now everywhere and the bottleneck has moved to execution. That means investing in execution capacity, which is human capacity.

If you want to see one version of what this infrastructure looks like, the MentorStack MCP server is live: mentorstack.co. I'd be glad to walk you through it. The platform isn't really the point of this article though. The point is whether you see the shift clearly enough to invest ahead of it.

Some teams will. Most won't. The gap between those two groups in eighteen months is going to be bigger than people expect.
