I made an observational post on LinkedIn recently that got me lightly ratioed. I said that hiring junior people is becoming untenable for many employers and junior talent is not learning as they should be. The responses raised some good questions:
- Who's going to train the next generation of talent?
- How should students be using AI?
- What should junior talent do to prepare for this new future?
The landscape for junior talent sucks and I think it’s going to get worse. I taught college courses for 8 years because I wanted to help students enter the workforce with skills I didn’t have when I graduated. I don’t have everything figured out, and things are moving quickly, but I have some ideas for students and junior talent on how to get ahead.
The brutal reality, observed
The apprenticeship model is breaking
Many professional industries run on some version of an apprenticeship model. You hire someone junior, pay them to learn, lose money for a couple of years, and recoup the investment once they're competent. Law firms do it with associates. Consulting firms do it with analysts. Accounting firms, agencies, engineering shops: it's the same structure.
A partner at a top law firm told me that senior partners are questioning whether they should hire associates at all anymore. The apprenticeship model is economically brutal for them: it only works when the associate sticks around to year three or four when their billing finally exceeds their cost. But retention is terrible and many associates churn before they ever become profitable.
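The break-even math can be sketched with some purely hypothetical numbers (every figure below is invented for illustration, not taken from any real firm):

```python
# Hypothetical apprenticeship economics. All numbers are invented
# for illustration; no real firm's figures are used.

def cumulative_profit(years, cost_per_year, billing_by_year):
    """Running total of (billing - cost) for each year an associate stays."""
    profits, total = [], 0
    for year in range(years):
        total += billing_by_year[year] - cost_per_year
        profits.append(total)
    return profits

# Suppose an associate costs $300k/year all-in, and their collected
# billing ramps up as they gain competence.
billing = [150_000, 250_000, 350_000, 450_000]
print(cumulative_profit(4, 300_000, billing))
# → [-150000, -200000, -150000, 0]
```

On these invented numbers, the firm is underwater through year three and only breaks even in year four, which is why an associate who churns at eighteen months is a pure loss.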
AI makes junior talent harder to justify. The work that juniors do (research, first drafts, summarization, routine analysis) is exactly the work that AI excels at, and Claude won’t leave after eighteen months for a competitor.
Pre-AI, senior attorneys spent most of their time supervising, project managing, and reviewing, not drafting. That is exactly what people who leverage agents are doing right now: a senior’s skills translate directly to today’s AI-driven workflows.
So why not train the next generation to develop those skills? The issue is that the work that a senior employee does in training a junior can be applied directly to building an automated system instead.
Great training is great for displacement
Another attorney friend started a solo practice and hired his first associate in mid-2024. He was excited to train someone and invested heavily in building training processes from day one: thorough documentation, clear procedures, and structured workflows.
As he built them, he noticed something. The same processes he designed to train his associate worked just as well to train AI agents. He'd roll out a training program, ask his associate to do the work, then out of curiosity, give Claude the instructions and workflows and ask it to do the same thing. Claude consistently came back faster and with higher quality.
When the associate left about a year later to pursue big law, he was relieved that he didn’t have to fire them. In building his training program, he inadvertently built a system of agents and procedures that did the work he'd hired the associate to do.
He will likely never hire an associate again. If he scales, he'll bring on a partner or counsel: someone with judgment who doesn't need training, not someone he has to develop from scratch.
It’s a bittersweet truth for those of us who like to teach people at work: the better your training program, the less you need the trainee. Formalizing tacit knowledge into explicit systems is also what makes that knowledge automatable.
This exacerbates the classic problem: you need a job to get experience, but you need experience to get a job. Getting that experience is going to be more challenging than ever.
Seniors with AI scale exponentially faster than juniors with AI
On a call with a group of founders, the verdict was unanimous: it's just not worth hiring junior people anymore. They take too long to train, too long to contribute, and they don't know what good looks like. Any work they ship needs extensive review and rework.
Meanwhile, a senior engineer can prompt an agent, review the output, catch mistakes, and confidently ship.
Fortunately, the skills that make seniors valuable in an AI world are learnable, but not from blindly shipping AI-generated output. They come from processes that hone judgment: project management, code review, design critique, and editing.
There’s a chicken and egg problem: how do you learn what good looks like if you can’t find a manager to teach you and the AI doesn’t know?
The talent isn’t developing itself
There’s a supply problem compounding all of this. In 2020, everyone was stuck attending Zoom University. I taught during that time, and I saw first-hand how educational standards were relaxed (I got bad reviews from students who thought they should get less work because of COVID). Then, just as schools were rebounding, students received the blessing and curse of ChatGPT, running GPT-3.5.
AI lets you skip the part where judgment develops
Using AI is great. Doing more faster is great. But not when it allows you to skip the part where judgment and context develop. It’s fun to imagine that judgment is a matter of talent, but it’s a skill, and skills are a product of time.
Historically, that meant spending time with your work. A writer could be stuck staring at the same bad intro for weeks, figuring out how to get a piece started. All that time iterating forces you to think about tradeoffs and nuances between different options. It also meant gradually putting your work in contact with reality throughout the development process.
AI collapses that time to zero.
Junior talent doesn’t know they aren’t good
Everyone is a great investor in a bull market. When output looks competent, it is very easy to assume you are competent as well. How many security holes shipped in 2025 because vibe-coders don’t know that you shouldn’t put credentials in your codebase?
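The mistake in question, sketched minimally (the key value and the `PAYMENT_API_KEY` variable name are both invented for this example):

```python
import os

# What vibe-coded output often looks like: a secret pasted straight
# into the source file, where it lands in version control and every
# clone of the repo. (The key below is a fake placeholder.)
API_KEY = "sk-live-abc123"

def load_api_key():
    """What someone with judgment ships: the secret lives outside the
    codebase and is injected via the environment or a secrets manager."""
    key = os.environ.get("PAYMENT_API_KEY")
    if key is None:
        raise RuntimeError("PAYMENT_API_KEY is not set; refusing to start")
    return key
```

The second version fails loudly when the secret is missing instead of silently shipping a credential to everyone who can read the repo.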
I was teaching when GPT-3.5 launched, and I cancelled a lecture to discuss ChatGPT with my students. I wanted them to use AI, but I wanted them to know its limitations so they could get great results. I decided to rework the final exam to offer extra credit to anyone who used ChatGPT and could tell me what it got right and what it got wrong. I was amazed at the stratification of the results.
- The 3 most curious students leaned into AI and produced impressive work.
- Most students didn’t engage with AI at all.
- The bottom quartile submitted AI slop, and none of them claimed the extra credit for explaining what their ChatGPT essays got wrong, because they had no idea they had submitted slop.
Learning is no longer a necessary condition for production
People who don’t know they’re bad don’t invest in getting better. Why would you? Everything you ship looks fine.
Before AI, if you wanted to ship good work, you not only had to learn how to ship good work, you had to learn how to learn how to ship good work: how to diagnose your own gaps, how to find resources, how to ask the right questions of the people evaluating your work.
I recently managed an employee straight out of college. She was incredibly hardworking, but she never learned to ask me for help. To her credit, she used ChatGPT to review her work, but it took her too long to realize that the person she needed to learn from was me: I was the one approving her work, and ChatGPT and I had different standards.
Learning is a meta-skill that develops as a byproduct of getting good. When AI does the work for you, that byproduct disappears too.
Whose problem is this?
The most common retort I got for my quick LinkedIn post was something along the lines of: “Employers are evil/making a mistake by not training the future.”
Unfortunately, this is not a productive line of thinking. There is little incentive for employers to take on the cost of training people in competitive, liquid talent markets. Companies that don't invest in junior talent simply hire from companies that do.
The more productive conversation is about what individuals and schools can do, because that's where the incentives align.
From the employee's side, the incentives are clear: every dollar you invest in your own development compounds. The difference in salary between a junior and a senior employee can be partially explained by training costs. Your lower salary is the market pricing in the cost of your development. If you close that gap on your own, you can earn the senior salary faster. The training budget exists; it's just embedded in your paycheck instead of an HR program.
I have thoughts for educators in this separate post. But for individuals, it pays to take the training responsibility on for themselves instead of waiting for their school to catch up to the rapidly-moving market or waiting for their employer to teach them.
What individuals can do
This ultimately falls on you. Getting displaced by AI is no longer a hypothetical problem, and learning to use AI to avoid your own displacement is the safest solution. Fortunately, this is the best time in history to learn anything if you’re curious and self-directed.
I’m not encouraging anyone to drop out of school, but don't rely on school alone. You'll change yourself a lot faster and a lot more easily than the education system will change for you.
What can you do today?
Use it
If you’re reading this and aren't using AI at all, that's the first thing to fix.
Taking a moral stand against AI is a choice, but be aware you're trading against your employability and competitiveness in the talent market.
If you haven’t used AI because you don’t see the use-cases or don’t have time to learn how to use it, make time. Block 30 minutes every day to play with ChatGPT, Claude, Gemini, Midjourney or any other AI tool. If you don’t know what to do, ask the AI to help you brainstorm.
Stay on top of the meta game
Staying informed of what is possible with AI is a recurring investment. AI capabilities and tooling are changing rapidly, with significant announcements almost daily.
Understanding what others can achieve with these tools is important for understanding where you need to stay competitive. If an early adopter is using AI to mass-produce marketing content, others will eventually follow. You want to stay on that frontier with them.
Do this while learning about problems in your field, and you'll start developing ideas for applying your newfound capabilities to the job you want to do.
Knowing what’s possible compounds. I recently met Parth Patil in the Village Global podcast studio, and he framed this metacognition as the key compounding advantage: thinking about how to apply intelligence rather than simply applying it. The full interview with Parth is worth a listen.
Play with the tools. Learn the limitations. Let one project inspire the next.
Build things and put them in front of people who can tell you they're bad
Building alone isn't enough. If you only ever evaluate your own output, you're grading your own homework. This has always been true of creative endeavors: it's nearly impossible to get good in a vacuum.
Make something you’re happy with and show it to someone with good judgment, like a mentor or a professor. Better yet, show it to potential customers or employers who'll tell you where it needs improvement. This is how you develop the internal compass that AI can't give you.
Learn how things get built, not just how to build your piece
The skills that make seniors valuable (reviewing work, managing projects, understanding how design, engineering, and marketing fit together) can take years of organizational experience to develop intuitively. You don't have years, and the apprenticeship that would have taught you is disappearing. So study it deliberately. Learn what a project management workflow looks like. Understand how code review works and why. Read about how products move from idea to launch across teams. The more you understand the full system, the better you'll know which parts AI can handle and which parts need you.
Use AI to learn, not just to ship
AI is the best tutor most people have ever had access to. It may not have your context or expertise, but it knows more than you do anywhere you're not the expert. Lean into this and help it help you learn.
The simple version of using AI as a tutor is to ask it questions: when you build something with an agent, ask it to explain what it did and why. When it writes code, ask it to walk you through the tradeoffs.
The less simple version is to build feedback loops into your process:
- I asked Claude Code to surface insights while I work. It teaches me things I wouldn’t think to ask.
- I recently started keeping a learning journal. This is tedious to maintain when I’m trying to ship, so I built a skill that summarizes what I learned during a build session: the questions I asked, the answers it gave, and how they influenced our decisions.
- I periodically ask ChatGPT and Claude to tell me where I might have blind spots as I work on things. I am still learning how to learn.
Learn to discover your knowledge gaps and choose which ones to fill
It may be a controversial opinion, but I think it’s fine to ship a full app with AI and not understand how any of it works. The important thing is to know what you don’t know: it’s dangerous if you confuse shipping with knowing.
Perfect is often the enemy of done, and getting things done often involves making tradeoffs. If you know you are shipping an app that has security vulnerabilities and you know the scope of the vulnerabilities, you’re taking a calculated risk. If you don’t know those vulnerabilities exist, however, that’s a problem.
The right overlap is being curious and lazy. To be incurious and lazy is fatal.
Show your ability to learn
Signaling capabilities is an important part of getting hired. My pre-AI advice was to show your work: ship side projects, have a portfolio, demonstrate agency. That advice is evolving now that anyone can ship a website or automate an ad campaign. The output alone doesn't prove much.
Historically, a portfolio got you to an interview, which then probed your thinking. My prediction is that junior portfolios will be discounted to zero soon. Employers will want to understand how you got to the output before they even meet with you.
Whether "show your work" evolves into "show more work" or "show how you work," the underlying signal will stay the same: prove you can learn so that they know you can ship more and better over time.
Give yourself room to explore
If you don’t know what’s possible with AI tools, or even if you think you know, it’s worth tinkering. You might get better results asking AI to propose solutions to your problems than you would asking it to apply the solution you thought was best. My current areas of exploration involve working with Claude Code to analyze my work to make sure I’m focused on the right problems to begin with.
The most creative people I know are finding ways to express themselves that were inaccessible before. You might set out with a particular career in mind, but as you learn what you are capable of with these tools, you may discover that there's something more interesting to pursue.
Don’t wait, don’t give up
Riding a bull and trying not to fall off look the same. In either case, you’d best grab the rope. Whether this is the best of times or the worst of times depends on your curiosity, creativity, and courage. You don’t need a firm to train you, a manager to review your work, or a curriculum committee to approve a course. The work we all do is changing, and while it’s affecting junior talent first, it will come for the seniors too.
Better to drive the tractor than to be the ox.
If you’re an educator, I wrote a companion piece on what schools can do about the junior talent crisis.