Erik Pavia / Notes

What Schools Can Do About AI Killing Junior Hiring

I recently wrote about how AI is killing junior hiring. The short version: AI is automating the work that justified hiring juniors, the apprenticeship model is breaking, and junior talent isn't developing the judgment needed to be useful. That piece focused on what individuals can do. This one is for educators.

If you're working in education, the problems I described should concern you. Your students are graduating into a market that is less willing to train them than ever before. Here's what you can do about it.

Embrace vocational skills

Universities get well-deserved criticism for graduating students with unemployable skills. Over the last several decades, schools that started as trade schools have gradually and unnecessarily shifted into pseudo-liberal-arts schools. For the illiterates on LinkedIn, yes, there is a place for the small, private, liberal arts college, but that education is a luxury. If you're coming from a lower socioeconomic background, like I did, you probably go to school to get a job.

Historically, a school could take 3-4 years of a student's time to teach them general knowledge, and employers would take responsibility for training graduates on the specific skills their work required. That willingness to train is evaporating, and a student who wants to graduate into a job now needs both the general knowledge and the specific skills.

Schools that want to graduate employable students would benefit from returning to vocational training. White-collar education should look more blue-collar: the further a school sits from academic abstraction, the more likely it is to have a good model for training people who graduate with the skills they need for employment. I find it incredible that a student can graduate with a degree in accounting without ever having touched a real corporate bank account.

Schools don't need to sacrifice their pedigree and status to do this. Waterloo is a shining example in tech. Through its co-op program, students complete four to six paid work terms, up to two years of real work experience, before they graduate. The work experience is a graduation requirement, not an optional internship. Shopify CEO Tobi Lütke has said that 40% of Shopify's interns come from Waterloo because the school produces "a different tier of talent."

Waterloo is one of the top schools people look at for engineering talent:

  • It ranks #1 in Canada for computer science, math, and engineering.
  • Over 8,000 employers hire from the program.
  • In my time working with top-tier startups, I've seen Waterloo go from a relatively unknown source of talent to a school with street cred. On a call with entrepreneurs swearing off hiring junior employees, there was still consensus that Waterloo was a source for "cracked" engineers.

By the time Waterloo students graduate, they know what work in the field looks like. The model isn't complicated. Make students do real work before they graduate. Employers trust the graduates because they've already proven they can work, which makes the school more attractive, and draws stronger students.

Hack your curriculum

Most professors and administrators have no idea what awaits their students in the real world. If you teach or design curricula, I recommend following tech people on X. Industry people still take it seriously, and LinkedIn influencers are generally behind. Build a pool of people who are tracking trends and ask them to share their insights periodically. Seek out your most entrepreneurial alumni, contact the hiring managers at your most sought-after employers, and ask professors to identify the tinkerers in class. Bring them together to share what they're building.

If you are an educator and you see what's happening, you are undoubtedly in for a hard time trying to get your school to change. Waiting for consensus and a perfect new program isn't an option. By the time committees have convened and surveys have been completed, any program will be outdated. The best bet is to find the most relevant existing resources and repurpose them for this new reality.

Figure out which courses already exist and repurpose them. Open them up as auditable courses, elective options, or certificate components so that taking them moves students closer to graduation.

When I first started teaching, my college's Dean and I wanted to introduce a course on tech startups. The Dean knew it would take too long to go through the correct department, so he found a course that fell under his department and that I met the requirements to teach. We finessed every ambiguity in the Business Law 4391 course description, and I ended up teaching a course I called "Legal Issues for Startups." I taught as little law as I reasonably could and focused on exposing students to the things I had learned from founders at Stanford and in Silicon Valley.

And then there's the professor problem. Too many educators spent the last three years obsessed with preventing students from using ChatGPT instead of figuring out how to teach alongside it. Some want to put their head in the sand and pretend none of this is happening. Some believe that it shouldn't happen. However you feel about it, it is happening, and people need to face it head on. Debate the ethics. Write the papers. Hold the symposiums. But none of that should come at the expense of the education students are paying for.

Schools have always been in the business of preparing people for the world they’re about to enter. That world is changing faster than most institutions can adapt, but adapting slowly is better than not adapting at all. The students paying tuition right now don’t have the luxury of waiting for a five-year strategic plan. Get a grip on what’s happening to work, give students vocational skills, expose them to real work, and get out of the way of the tools that will define their careers. The schools that do this will produce graduates that employers actually want. The rest will produce rapidly deprecating credentials.

AI is Killing Junior Hiring — Here's What to Do About It

I made an observational post on LinkedIn recently that got me lightly ratioed. I said that hiring junior people is becoming untenable for many employers and that junior talent isn't developing as it should. The responses raised some good questions:

  • Who's going to train the next generation of talent?
  • How should students be using AI?
  • What should junior talent do to prepare for this new future?

The landscape for junior talent sucks and I think it’s going to get worse. I taught college courses for 8 years because I wanted to help students enter the workforce with skills I didn’t have when I graduated. I don’t have everything figured out, and things are moving quickly, but I have some ideas for students and junior talent on how to get ahead.

The brutal reality, observed

The apprenticeship model is breaking

Many professional industries run on some version of an apprenticeship model. You hire someone junior, pay them to learn, lose money for a couple of years, and recoup the investment once they're competent. Law firms do it with associates. Consulting firms do it with analysts. Accounting firms, agencies, engineering shops: it's the same structure.

A partner at a top law firm told me that senior partners are questioning whether they should hire associates at all anymore. The apprenticeship model is economically brutal for them: it only works when the associate sticks around to year three or four when their billing finally exceeds their cost. But retention is terrible and many associates churn before they ever become profitable.

AI makes junior talent harder to justify. The work that juniors do (research, first drafts, summarization, routine analysis) is exactly the work that AI excels at, and Claude won’t leave after eighteen months for a competitor.

Pre-AI, senior attorneys spent most of their time supervising, project managing, and reviewing, not drafting. That is exactly what people leveraging AI agents are doing right now. A senior's skills translate directly to current AI-driven workflows.

So why not train the next generation to develop those skills? The issue is that the work that a senior employee does in training a junior can be applied directly to building an automated system instead.

Great training is great for displacement

Another attorney friend started a solo practice and hired his first associate in mid-2024. He was excited to train someone and invested heavily in building training processes from day one: thorough documentation, standard procedures, structured workflows.

As he built them, he noticed something. The same processes he designed to train his associate worked just as well to train AI agents. He'd roll out a training program, ask his associate to do the work, then out of curiosity, give Claude the instructions and workflows and ask it to do the same thing. Claude consistently came back faster and with higher quality.

When the associate left about a year later to pursue big law, he was relieved that he didn’t have to fire them. In building his training program, he inadvertently built a system of agents and procedures that did the work he'd hired the associate to do.

He will likely never hire an associate again. If he scales, he'll bring on a partner or counsel: someone with judgment who doesn't need training, not someone he has to develop from scratch.

It’s a bittersweet truth for those of us who like to teach people at work: the better your training program, the less you need the trainee. Formalizing tacit knowledge into explicit systems is also what makes that knowledge automatable.

This exacerbates the classic problem: you need a job to get experience, but you need experience to get a job. Getting that experience is going to be more challenging than ever.

Seniors with AI scale exponentially faster than juniors with AI

On a call with a group of founders, the consensus was unanimous. It's just not worth hiring junior people anymore. They take too long to train, too long to contribute, and they don't know what good looks like. Any work they ship needs extensive review and rework.

Meanwhile, a senior engineer can prompt an agent, review the output, catch mistakes, and confidently ship.

Fortunately, the skills that make seniors valuable in an AI world are learnable, but not from blindly shipping AI-generated output. They come from processes that hone judgment: project management, code review, design critique, and editing.

There’s a chicken and egg problem: how do you learn what good looks like if you can’t find a manager to teach you and the AI doesn’t know?

The talent isn’t developing itself

There’s a supply problem compounding all of this. In 2020, everyone was stuck attending Zoom University. I taught during that time, and I saw first-hand how educational standards were relaxed (I got bad reviews from students who thought they should get less work because of COVID). Then, just as schools were rebounding, students received the blessing and curse of ChatGPT 3.5.

AI lets you skip the part where judgment develops

Using AI is great. Doing more faster is great. But not when it allows you to skip the part where judgment and context develop. It’s fun to imagine that judgment is a matter of talent, but it’s a skill, and skills are a product of time.

Historically, that meant spending time with your work. A writer could be stuck staring at the same bad intro for weeks, figuring out how to get a piece started. All that time iterating forces you to think about tradeoffs and nuances between different options. It also meant gradually putting your work in contact with reality throughout the development process.

AI collapses that time to zero.

Junior talent doesn’t know they aren’t good

Everyone is a great investor in a bull market. When output looks competent, it is easy to assume you are competent as well. How many security risks shipped in 2025 because vibe-coders don’t know that you shouldn’t put credentials in your codebase?
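
To make the credentials point concrete: the fix is ancient and boring. Here's a minimal Python sketch of the habit, where the variable name `API_KEY` is just an arbitrary example: keep secrets in the environment, out of the codebase, and fail loudly when they're missing.

```python
import os

# Bad: a hardcoded credential ships to everyone who can read the code.
# API_KEY = "sk-live-abc123..."

def get_api_key(name: str = "API_KEY") -> str:
    """Read a secret from the environment; raise if it's missing."""
    key = os.environ.get(name)
    if not key:
        raise RuntimeError(
            f"{name} is not set. Export it in your shell or load it from a "
            ".env file kept out of version control."
        )
    return key
```

Nothing about this is novel; the point is that someone who never reviews the generated code won't know to look for the hardcoded string in the first place.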

I was teaching when 3.5 launched and I cancelled a lecture to discuss ChatGPT with students. I wanted my students to use AI, but I wanted them to know its limitations so they could get great results. I decided to rework the final exam to offer credit to anyone who used ChatGPT and could tell me what it got right and what it got wrong. I was amazed at the striation of results.

  • The 3 most curious students leaned into AI and produced impressive work.
  • Most students didn’t engage with AI at all.
  • The bottom quartile submitted AI slop, and none of them claimed the extra credit for explaining why their ChatGPT essays were bad, because they had no idea they had submitted slop.

Learning is no longer a necessary condition for production

People who don’t know they’re bad don’t invest in getting better. Why would you? Everything you ship looks fine.

Before AI, if you wanted to ship good work, you not only had to learn how to ship good work, you had to learn how to learn how to ship good work: how to diagnose your own gaps, how to find resources, how to ask the right questions of the people evaluating your work.

I recently managed an employee straight out of college. She was incredibly hard-working, but she never learned to ask me for help. To her credit, she used ChatGPT to review her work, but it took her too long to understand that the person she needed to learn from was me: I was the one approving her work, and ChatGPT and I had different standards.

Learning is a meta-skill that develops as a byproduct of getting good. When AI does the work for you, that byproduct disappears too.

Whose problem is this?

The most common retort I got for my quick LinkedIn post was something along the lines of: “Employers are evil/making a mistake by not training the future.”

Unfortunately, this is not a productive line of thinking. There is little incentive for employers to take on the cost of training people in competitive and liquid talent markets. Companies that don't invest in junior talent still hire from companies who do.

The more productive conversation is about what individuals and schools can do, because that's where the incentives align.

From the employee's side, the incentives are clear: every dollar you invest in your own development compounds. The difference in salary between a junior and a senior employee can be partially explained by training costs. Your lower salary is the market pricing in the cost of your development. If you close that gap on your own, you can earn the senior salary faster. The training budget exists, it's just embedded in your paycheck instead of an HR program.

I have thoughts for educators in a separate post. But for individuals, it pays to take the training responsibility on for themselves instead of waiting for their school to catch up to the rapidly-moving market or waiting for their employer to teach them.

What individuals can do

This ultimately falls on you. Getting displaced by AI is no longer a hypothetical problem, and learning to use AI to avoid your own displacement is the safest solution. Fortunately, this is the best time in history to learn anything if you’re curious and self-directed.

I’m not encouraging anyone to drop out of school, but don't rely on school alone. You'll change yourself a lot faster and a lot easier than the education system will change for you.

What can you do today?

Use it

If you’re reading this and aren't using AI at all, that's the first thing to fix.

Taking a moral stand against AI is a choice, but be aware you're trading against your employability and competitiveness in the talent market.

If you haven’t used AI because you don’t see the use-cases or don’t have time to learn how to use it, make time. Block 30 minutes every day to play with ChatGPT, Claude, Gemini, Midjourney or any other AI tool. If you don’t know what to do, ask the AI to help you brainstorm.

Stay on top of the meta game

Staying informed of what is possible with AI is a recurring investment. AI capabilities and tooling are changing rapidly, with significant announcements almost daily.

Understanding what others can achieve with these tools is important for understanding where you need to stay competitive. If an early adopter is using AI to mass-produce marketing content, others will eventually follow. You want to stay on that frontier with them.

Do this while learning about problems in your field, and you'll start developing ideas for applying your newfound capabilities to the job you want to do.

Knowing what’s possible compounds. I recently met Parth Patil in the Village Global podcast studio, and he framed this metacognition as the key compounding advantage: thinking about how to apply intelligence rather than simply applying it. The full interview with Parth is worth a listen.

Play with the tools. Learn the limitations. Let one project inspire the next.

Build things and put them in front of people who can tell you they're bad

Building alone isn't enough. If you only ever evaluate your own output, you're grading your own homework. This has been true of any creative endeavor: it's nearly impossible to get good in a vacuum.

Make something you’re happy with and show it to someone with good judgment, like a mentor or a professor. Better yet, show it to potential customers or employers who'll tell you where it needs improvement. This is how you develop the internal compass that AI can't give you.

Learn how things get built, not just how to build your piece

The skills that make seniors valuable (reviewing work, managing projects, understanding how design, engineering, and marketing fit together) can take years of organizational experience to develop intuitively. You don't have years, and the apprenticeship that would have taught you is disappearing. So study it deliberately. Learn what a project management workflow looks like. Understand how code review works and why. Read about how products move from idea to launch across teams. The more you understand the full system, the better you'll know which parts AI can handle and which parts need you.

Use AI to learn, not just to ship

AI is the best tutor most people have ever had access to. It may not have your context or expertise, but in domains where you are not the expert, it knows more than you do. Lean into this and help it help you learn.

The simple version of using AI as a tutor is to ask it questions: when you build something with an agent, ask it to explain what it did and why. When it writes code, ask it to walk you through the tradeoffs.

The less simple version is to build feedback loops into your process:

  • I asked Claude Code to surface insights while I work. It teaches me things I wouldn’t think to ask.
  • I recently started keeping a learning journal. This is tedious to maintain when I’m trying to ship, so I built a skill that summarizes what I learned during a build session: the questions I asked, the answers it gave, and how they influenced our decisions.
  • I periodically ask ChatGPT and Claude to tell me where I might have blind spots as I work on things. I am still learning how to learn.
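
For readers curious about the journal skill, here is a minimal sketch of what such a skill can look like, using Claude Code's SKILL.md format with YAML frontmatter; the name and wording here are illustrative, not the exact skill I run:

```markdown
---
name: learning-journal
description: Summarize what the user learned during a build session and append it to a journal
---

At the end of a build session, append a date-stamped entry to LEARNING_JOURNAL.md containing:

1. The questions the user asked during the session.
2. The answers and explanations the agent gave.
3. How those answers influenced the decisions made.

Keep each entry short enough to skim later.
```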

Learn to discover your knowledge gaps and choose which ones to fill

It may be a controversial opinion, but I think it’s fine to ship a full app with AI and not understand how any of it works. The important thing is to know what you don’t know: it’s dangerous if you confuse shipping with knowing.

Perfect is often the enemy of done, and getting things done often involves making tradeoffs. If you know you are shipping an app that has security vulnerabilities and you know the scope of the vulnerabilities, you’re taking a calculated risk. If you don’t know those vulnerabilities exist, however, that’s a problem.

The right overlap is curious and lazy. To be uninterested and lazy is fatal.

Show your ability to learn

Signaling capabilities is an important part of getting hired. My pre-AI advice was to show your work: ship side projects, have a portfolio, demonstrate agency. That advice is evolving now that anyone can ship a website or automate an ad campaign. The output alone doesn't prove much.

Historically, a portfolio got you to an interview, which then probed your thinking. My prediction is that junior portfolios will be discounted to zero soon. Employers will want to understand how you got to the output before they even meet with you.

Whether "show your work" evolves into "show more work" or "show how you work," the underlying signal will stay the same: prove you can learn so that they know you can ship more and better over time.

Give yourself room to explore

If you don’t know what’s possible with AI tools, or even if you think you know, it’s worth tinkering. You might get better results asking AI to propose solutions to your problems than you would asking it to apply the solution you thought was best. My current areas of exploration involve working with Claude Code to analyze my work to make sure I’m focused on the right problems to begin with.

The most creative people I know are finding ways to express themselves that were inaccessible before. You might set out with a particular career in mind, but as you learn what you are capable of with these tools, you may discover that there's something more interesting to pursue.

Don’t wait, don’t give up

Riding a bull and trying not to fall off look the same. In either case, you’d best grab the rope. Whether this is the best of times or the worst of times depends on your curiosity, creativity, and courage. You don’t need a firm to train you, a manager to review your work, or a curriculum committee to approve a course. The work we all do is changing, and while it’s affecting junior talent first, it will come for the seniors too.

Better to drive the tractor than to be the ox.

If you’re an educator, I wrote a companion piece on what schools can do about the junior talent crisis.

Writing is thought creation

Writing is thinking. Committing thought to writing is critical for creating thought.

The life expectancy of an unrecorded thought is low.
If you have a thought and you don't write it down, there is a high likelihood that you will forget it, and it will be as if the thought never existed.

A thought can be anything when it’s in your head, tethered to a cloud of neighboring neurons without form. Writing it down forces the probabilities to become discrete.

Putting a thought in writing is an act of creation. You are making it real.

The only way to have a friend is to be one

I got a fortune cookie that said, “The only way to have a friend is to be one.” The quote is attributed to Ralph Waldo Emerson, but I won’t pretend to have found it in the depths of literary activity. I was eating Panda Express.

In the past, I’ve found myself wondering why I am the one initiating every conversation or hangout with some of my friends. They seem happy to hear from me, but they’re not the ones checking in on me. I wonder if they’re not interested in being friends. I also feel immensely guilty when others reach out to me and I’m unresponsive. I worry I’ve damaged the relationship and am reluctant to reconnect.

The fortune cookie revealed two important things.

First, people might be open to friendship, but may not make the effort for a myriad of reasons. They might be busy. They might not see you as a friend yet. Or they might not have internalized this fortune cookie/Emerson wisdom about the importance of being proactive in maintaining relationships.

Second, we all have meaningful power to create friendships simply by doing friend things. If there’s someone you want to be friends with, you can message them, call them, invite them to events, make jokes with them, or ride tandem bicycles until you’ve created a relationship.

Rather than take offense or be hurt by the fact that some of my friends don’t reach out to me, I appreciate that I have the ability to unilaterally build and sustain a friendship with people I care about.

Perception

A bed of mud. A nursery for the lotus.

The flower blooms whether one sees it or not.

Glittering fish shelter in shadows and shallow roots.

The great white heron hunts with wings unfurled and unseen.

Maxims for Supporting Entrepreneurs

These are a few lessons I’ve learned over a decade of supporting early-stage entrepreneurs in Silicon Valley and El Paso. No entrepreneurial journey is the same, but these principles are broadly applicable.

  • Only entrepreneurs can build companies.
  • Only entrepreneurs can build themselves into entrepreneurs.
  • There are unknowable and uncontrollable factors that can dictate an entrepreneur’s success or failure. Even the best founders can fail if market conditions aren’t favorable.
  • In most cases, you will not have the applicable skills or experience to help entrepreneurs with any particular problem. The best way to consistently support entrepreneurs is to 1) help them maintain morale and 2) find people who can help.
  • There is no right way to build a company, but there are countless wrong ways.
  • Bad advice is worse than no advice.
  • Your advice should be regarded as one data point.
  • The winner is often whoever stays alive the longest.
  • Every successful tech company must break a few important conventions, but it must not break all conventions. The challenge lies in picking the right conventions to break.
  • Most companies should do most things conventionally so they can focus on breaking the right conventions.
  • There are no shortcuts. Whatever impedes the entrepreneur’s path should become the path.
  • A company needs people who can sell and people who can build.
  • It is significantly easier to become a salesperson than it is to become a builder. AI is shifting this balance, but for now it still stands.
  • A salesperson who cannot hire builders will not succeed.
  • 99.9% of great ideas come from experience or networks with experience. Students and young entrepreneurs typically have little of either.
  • Everyone wants to connect the dots, but few people want to go through the trouble of collecting dots.
  • Expertise must be lived, and it goes stale fast.
  • Real entrepreneurs don’t need permission to start.
  • Money is an accelerant, not permission.
  • If you think you know how to find the next great entrepreneur, quit your job and make billions as an investor.
  • If you tell people they are something, eventually they start to believe you.
  • You cannot alter risk preferences of sophisticated investors.
  • Entrepreneurs should only take investments from sophisticated investors.