The more I consider using LLM shortcuts for work or research, the more I realize I wish I'd just had a mentor or better resources, and that I need to take the time to learn the wisdom/skill itself.
This is true not just for coding but for everything AI is trying to replace. Art takes inspiration from struggle too, for example. Doing something wrong may be just what you need to figure out how to do it right, or maybe you like what went wrong and incorporate it somehow. It's how we grow.
I'm not saying that there aren't specific things you learn by using tools in certain ways that change when new tools and new layers of abstraction crop up.
It happens all the time. We definitely should think about how to make sure that students still get the right computational literacy.
This articulates a lot of my problem with AI across the board - in the arts, in science, in programming, and elsewhere. We learn by doing the whole task.
You don't need to know the answer; sometimes you don't even know the question to ask. But you know both where to start looking and, more importantly, *how* to ask the questions. That's what the grunt work teaches you.
This is exactly what I'm seeing at work all the time.
The guys on my team who are using Copilot to do what we would consider the boring coding work end up not having any idea how their code works or why, because for them the boring work is the hard work of learning the language.
I think this applies to every situation where an LLM can replace an entry-level position. In my job I learned more in the first 6 months than I had ever learned in school. So where are those workers going to gain that experience? How does a Drafter I become a Drafter IV?
The answer cannot be more school; that would be paying to pretend to work. A company trains workers on the job while they are working. If we are not doing that anymore, you eventually end up with nobody who knows how to do anything, and they certainly won't know how to check the work of an AI agent.
I find even if I've got to do some boring admin, I cope best not by rushing, but by leaving myself loads of time, including time to get depressed at having to do it, then totally immersing myself in it so I get attuned to its nature.
Another downside of this is people being accused of using AI even when they haven't, and also people who grow up with this learning to write, as humans, like AI.
The AI discourse keeps reminding me of UA Fanthorpe's poem, Atlas. Lifesaving Poems: UA Fanthorpe’s ‘Atlas’ – Lifesaving Poems share.google/g90bCyzJUiBzggIOt
And that’s why students using AI to write their essays is such a bad thing. We train our brains to think, to analyse, to synthesise, to see patterns by the creative process called coursework. We also develop self-discipline - also vital to a well-functioning adult life.
That's my fear as well; I, having grown up scientifically with all the grunt work, feel somewhat safe in using AI to bypass it now; but I worry about the students, and also about myself if I de-skill a little too much. It's like giving a calculator to a kindergartner…
…and them never learning what multiplication means…and I also know for myself that using calculators all the time has made me worse at calculating in my mind…but at least I retained "the concept" of it.
Loved this. I’m in literature and the explosion of AI produced content is causing a crisis. You’ve pinpointed the reasoning authors use to justify its use, and the reason that editors are pulling their hair out over a surge of AI-generated writing that sometimes is almost fine but feels “off”
PWC recently announced they would not be using AI instead of audit juniors, because juniors need to learn subjective judgement to perform as seniors. Once you’ve crunched lots of numbers, anomalies stare you in the face.
Working in dozens of programming languages and tools has led me to be extremely cautious with minor formatting errors. Even when they aren't directly causing compilation or logic errors, spacing inconsistencies pile up to become unreadable, unmanageable code over time.
It's an education problem more than an execution problem: the tool is OK to use AFTER you have learned to distinguish the important elements and have a good base of knowledge to judge the output. This holds for my discipline as well.
Also, capitalism and the market doesn't value this perspective.
Serendipitous brilliance and moments of deeply drawn skills don't fit on a Gantt chart. They're not predictable in a sprint. They don't have an hourly rate. MBAs stare at the napkin on which Picasso has doodled, vexed with valuation.
When I see a young and budding graphic artist still stretching their images, or blowing a 400px image up to 1200px, or trying to save a 10MB image to a deliverable disk, or staring at the monumental task of converting 1000 PNGs into JPEGs without knowing scripting - they need the grunt work
AI is manipulative shit being peddled to us by the absolute worst people in the world - do you think that Altman, Musk, Zuck, Thiel, Huang, Karp, Google actually want to do something good for humanity or anyone but themselves? They don't. We need to destroy these people and their companies
AI gives the average person a similar ability to have work done for them or to be surrounded by sycophants as the billionaires, with the same shitty outcomes and psychosis.
If it has this effect on regular people, what must it be like to have that, but also with people's livelihoods at your disposal?
French scientist Louis Pasteur noted in an 1854 lecture that in observation, "chance favors only the prepared mind," suggesting that unexpected breakthroughs require a knowledgeable observer.
And this happens already. The number of times some brilliant guy wrote some code and then left and nobody knows how it works is such a common theme in programming.
Exactly what's happening in the software dev world. Seniors continue to be useful, as they smell bad code, understand complex problems, and can appreciate what is useful to the business. But you only get senior devs by employing junior devs.
These tools aren’t going away though, especially in software dev, so I think we have to figure out some way to build knowledge while using them. The SDLC is going to reconfigure and I don’t think anyone has properly figured out how yet.
Same deal with translation: companies flocked to AI slop that requires very heavy MTPE (machine translation post-editing), which is typically done by experienced translators. However, what the AI translates poorly is typically what beginner translators would translate to "get their feet wet".
Forgot to say, senior devs can be quite productive with AI as it can produce a lot of boilerplate/gruntwork code. BUT you still need the senior to fix the AI code when it fucks up, which it does all the time. The senior can only do this because they were once a junior, doing the gruntwork.
My workplace is about to fire 1/3 of our staff due to AI. I wish I could shout this from the rooftops. Efficiency is good, but it is not an end in itself, and it's the enemy of growth.
This summarizes it so well. Starters need to get the reps in. By skipping the fundamental building process, one can't identify problems as efficiently. They're giving their skill development to the computers instead of learning it firsthand.
Maybe. Nobody actually knows what the impact will be. The hypothesis about building intuition is a fair one, but who can say that the bandwidth opened up by not doing the grunt work makes space for other kinds of important intuition?
Nobody is saying that ... The grunt work is the point. That's the whole point. You can't skip the point. Making space for other kinds of important intuition is precisely what happens WHEN YOU DO THE WORK 🫠😅
I've wanted to build a very specific database for 4+ years. I've dodged learning SQL for the same reasons I've dodged learning a lot of coding languages for decades. I have "kept the bandwidth open" to learn other things.
Agree. My grad students understood it that way, too, that the “grunt” work of deciphering penmanship, struggling to summarize a dense article, or writing up a bibliog is essential, not to be farmed out, skipped, etc. Very pleased to hear them reason through it.
My boss told me to try out AI code and while I was skeptical, I decided to see what it was about.
At one point I had to write a recursive method that was melting my brain, so I got the AI to write it. Got it right after a few attempts. So I plugged it in, cleaned it up and moved on. (1/2)
But ever since I've been left with the niggling regret... if I had spent just an hour trying to write it myself, solve the difficult problem myself, I'd have a much deeper understanding of recursive code in the future.
(We were all laid off and replaced with AI months later, btw)
This is the exact reason I volunteer to do things I haven’t done before. I love to learn how to do new things, especially when I can learn them from people who have tricks they’ve built up over the years that a traditional class won’t teach. I’m not going to let AI steal that joy from me.
I think there's going to be a huge problem in a huge number of industries in 5-10 years when there just aren't "grunts" to promote who have gained expertise. I do believe that LLMs can substitute for a junior (job) when a senior is reviewing it, but that junior gains so much from these tasks.
I’ve seen similar already in the last 10 to 20 years with the move to offshore a lot of the routine work. Fast forward 10 years and management appeared to be baffled why there weren’t enough people with 5 to 10 years quality experience. AI is the next step in this.
I work in an industry where "normal" mechanization did this and it's a nightmare because now when we try to add people back for training/workload reduction they're seen as burdens and overtime thieves by the jaded hotshot lifers.
This applies to so many fields of work/study/research, including the non-programming IT work that I've done/managed for 40+ years. There are things that I know that I wouldn't have known if I hadn't done the frickin work.
This may be true, but the current model for AI is to aggregate the advice from those same cops and doctors. So not only are you getting the same results, you're doing it through a system that erases culpability for them.
In the L.A. Quartet novels, James Ellroy's characters call this "shitwork": sorting through endless files to look for clues. But they're all fascists using the cases to fill their own pockets with money. All the time they're being violent isn't "shitwork"; that's perks.
Point: if someone calls DOING THE ACTUAL JOB "shitwork" or "grunt work", they're not actually interested in doing said job and should find something else to do instead of ruining everything for the rest of us.
School made me take for granted what learning was. I don't think I had a conscious appreciation for it until my mid 20s playing video games. Struggling with Gradius until I beat Soldier Blade, then coming back to Gradius and inching along... the fruits of learning are not always apparent.
Reminds me of a guy promoting AI for a business by saying how it was great for programming because even his son had been able to use it to create some game thing. I’m like, great but he doesn’t know a thing about how it works so when it breaks, he’s going to be helpless to fix it.
I feel this so much in teaching first-gen, "non-traditional" college students. People talk about the "nuanced" use of AI with discernment & seem totally uninterested in how discernment actually develops. They've got theirs, so who cares what it's doing to people who didn't get the time to learn?
As authors, you lived this: every rewrite that felt pointless until suddenly it wasn't. What's one 'grunt work' moment that ended up shaping your writing the most?
Funny… this tracks exactly with my brief studies in cognitive psychology. Intuition is what gives us the sense of connection between things that were previously not seen as “connected.” It is that realization of similarities that is how innovation happens.
The whole series is fascinating. It was this approach that guided me as I designed one of the first digital motion picture post-production facilities at an educational institution back in 1996.
Agreed. It's the difference between navigating via GPS and growing up somewhere, and knowing it so well you can tell where you are because of the sound of the ground underneath your feet, or the smell of laundry.
The latter lets you connect things that GPS won't do, can't do.
Also, unless you’ve done the grunt work to develop the intuition, you won’t necessarily know when AI is feeding you BS. AI is helpful to me as a transactional attorney only on subjects that I know well because I know when it’s off.
This has been one of the things I keep raising about LLMs and other so-called expert systems. Once everyone is reliant on them, where are the experts to train the next model? If novices can skip the trial and error part, how are they meant to learn and get experience?
Yep there’s a reason you do awful gamma matrix algebra before they let you use Feynman diagrams. Because process matters and you wake up every morning grateful to Feynman.
I’ve been saying this for a while. When you replace entry level workers with AI, you are undermining your long term success by not training your future leaders. This is true in every profession. Judgment comes from experience.
A coworker asks the LLM to write a prompt to help him write an email. So he’s not even thinking through the prompt anymore! I can see the consequences of this daily when he has no clue what’s going on because he doesn’t actually read or write anything on his own. Work slop at its finest.
YES. This! It's why they make kids in architecture school learn how to draw and make models by hand before they get to use the cool software. Watching my kid go thru all-nighters of wailing as the model collapsed or she'd f'ed up a measurement - a crucial part of learning the basic skill set.
And at one point they were doing this off of decades of hard-won recognition. I'm pretty sure by now they're largely just men who have paid other people to do all of their learning for them.
I feel this is nearer to T Nagel than at first sight - it has so much to do with how we 'consider/experience/make-space-for/live' consciousness and the 'outer' landscapes of mind in general...
In other words, perhaps as an experienced driver you may find some utility for Autopilot, but first you need to do the time and work to become an experienced driver.
Indeed. Each experience we get by doing does not always reveal its usefulness immediately. It can be years/decades later that some nugget of learning/understanding from seemingly innocuous experiences becomes important to our growth, judgment, or ability to solve a completely different problem.
This is true of many realms of endeavor- coding, writing, cooking, public speaking, law, yoga. All practice changes and enhances our abilities and understanding. Turning the bulk of the practice and learning over to a computer impoverishes our minds.
It also points to a schism we're already seeing in today's job market: so little is available for inexperienced workers…mainly youth. Now we're automating more of those jobs. How will anyone become the kind of diligent, thoughtful employee we'll still need when there is no entry-level work any more?
True for teachers & lawyers too. The human brain stores a lot of information while learning on the job. I know a lawyer who took a job "training" AI a few years ago, for the eventual purpose of replacing new lawyers. It was a flexible gig to pay the bills, and they hated practicing law; they weren't good at law.
A truly likable person, but their passions are not law & lawyering. So I wouldn't trust AI for legal advice or for processing arguments, because I think the people hired to train AI are not the cream of the crop in their field.