Jesus, take the wheel.
And GitHub Copilot, take the IDE.
GitHub says 92% of US devs are using AI coding tools like Copilot.
What. Seriously?!
When did you last hear of 92% of any demographic using a single thing?
Unless, of course... you say 100% of all people who ever existed have consumed dihydrogen monoxide.
(There's exactly one way this line gets dark. Don't go there.)
Join me on a quick journey as I talk about:
When the machines took over the world in 2024
After a quick Google search, it seems like most devs are using AI assistance in writing code. I'd be lying if I said I haven't used AI to write code at all. Of course I have. I don't live under a rock.
I've seen devs get oddly comfortable with the idea of sharing their code-related data with third-party cloud services, which are often not SOC 2 (or similarly) certified, and which, at best, make vague, unprovable claims about privacy.
Things like GitHub Copilot (and Copilot Chat), Bito.ai, and several other AI code extensions on the VS Code marketplace have more than 30 million installs. Crazy! 🤯
And then there's me. I've not made AI assistance a part of my regular code workflow. A couple of times I've taken help from GPT to get some boilerplate written, sure. But those times are the exception. GitHub Copilot, or any other kind of code review, code generation, PR creation, or commit-assistance tool, isn't a part of my IDE or CLI flow.
Maybe it'll change with time. We'll see.
"But… why?"
What I'm ACTUALLY worried about
The answer is simple.
1. I fear my programming skills will get rusty
I am concerned that the way I write and read code will suffer if I get too used to AI assistance.
- I'm concerned I'll begin to overlook flaws in code that I could otherwise catch.
- That I'll start to take AI-generated code for granted.
- And that looking up APIs, built-in methods, or other documentation will start to feel like a chore.
I fear… that I'll start slipping.
2. I'm not comfortable enough sharing all of my code with a third-party service
Companies can be damn smart about inferring things from the data you provide. Sometimes they'll know things about you that even your family doesn't.
The idea that sensitive business logic may get leaked to a third-party service, where it may eventually be used to make inferences I'm not comfortable with, or just... straight-up leak? I mean, software gets hacked all the time.
I think I'm being very reasonable in not wanting to expose something as sensitive as code, in an unrestricted manner, to a third-party company. Even if that company is Microsoft, because even they f*ck up.
From the experienced devs' point of view
This isn't a take that's unique to me, either!
1. More experienced devs tend not to want to lean on "crutches" to write their code.
I've even had the pleasure of working with senior devs who didn't want to use colored themes in their IDEs because they thought it'd hurt their ability to scan, read, or debug code! (That was a bit much for me too.)
After all, "programming skills" are a lot more than just writing code.
2. Older devs have seen all kinds of software get hacked, data leaked, etc.
I mean, when haveibeenpwned.com has been emailing you about your credentials, emails, and other data being leaked, every year, for over 10 years... MANY TIMES from billion-dollar corporations...
When you hear "When you're not paying for the product, you are the product" for the bazillionth time, backed up by yet another company selling that data to some third party...
Yeah... it gets tiring.
And it gets real easy to just disconnect as many wires as you can and go back to the stone age.
"Older devs"? Am I… Am I getting old?
Nah, I'm still 22 and this is 2016 or something… right? Right??
Btw, the answer to the question in the title is right up there. Congrats! The post is over! On to the next one…
Buuuuut… if you want to continue reading…
Let's take a step back for a moment...
I think my fears may be exaggerated.
Let's set the whole data privacy angle aside for now, because that's a whole other topic on its own that I feel rather passionately about.
I personally don't have enough data to empirically say that using AI assistance will bring about the doom I fear… that it'll downgrade me from what I am today to an SDE1.
But I've seen patterns.
- I've seen sub-par AI-generated code go through code reviews and end up on the `main` branch.
- I've seen library functions used without properly understanding what they do, or what alternatives exist, just because an LLM generated that call.
- I've even seen code generated to solve a problem for which a utility function already existed in the codebase, but wasn't used, because finding out that the utility existed was more work than asking GPT to generate a new one. (A sketch of what I mean follows below.)
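To make that last point concrete, here's a small hypothetical sketch. The file and function names are made up for illustration, not taken from any real codebase:

```ts
// utils/slugify.ts: the helper that already exists and is battle-tested.
export function slugify(title: string): string {
  return title
    .trim()
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-") // collapse any non-alphanumeric run into a dash
    .replace(/^-+|-+$/g, "");    // strip leading/trailing dashes
}

// feature.ts: a freshly generated lookalike, pasted in because nobody knew
// slugify() existed. It mostly works, but behaves differently (keeps
// punctuation, doesn't trim), so the app now produces two slugs for one title.
function makeSlug(title: string): string {
  return title.toLowerCase().replace(/\s+/g, "-");
}
```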
~~Diamonds are~~ Bad code is forever
"Wait a damn minute… I've seen this movie before!"
- LLMs are a pretty new thing… but 💩 code has been eternal!
- Every. Single. Dev. Ever. Has used library functions without fully understanding them or looking at alternatives. You and I are both guilty of that. (What? You thought `Array.prototype.sort` was the best way to sort anything? It's just sufficient in most cases! See the snippet after this list.)
- A piece of logic gets reinvented (re-copy-pasted) all the damn time! It's just that it used to come from StackOverflow, and now it comes from ChatGPT.
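About that sort jab: by default, `Array.prototype.sort` compares elements as strings, which is exactly the kind of detail that gets easy to forget when code just appears for you. A quick illustration:

```ts
const scores = [25, 100, 3];

// The default sort compares elements as strings, so numbers end up in
// lexicographic order ("100" < "25" < "3").
console.log([...scores].sort());                // [ 100, 25, 3 ]

// A numeric comparator gives the order you almost certainly wanted.
console.log([...scores].sort((a, b) => a - b)); // [ 3, 25, 100 ]

// Also worth remembering: sort() mutates the array in place, hence the copies above.
```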
🤷 So, what's the big fuss about?
"Will using ChatGPT make me a bad programmer?"
I think, no.
The takeaway is that you just need to care about what you build.
Take pride in what you build.
Where the heck does LLM/AI fit in?
LLMs are not inherently evil.
In fact, they can be pretty damn useful if used responsibly:
- Quality Code: An LLM might handle edge cases that a less diligent developer wouldn't consider.
- Comprehensive Tests: LLMs might write tests that are more comprehensive than what some devs would produce.
- Comprehensive Types: It might even write types more "completely" than an average dev might write on their own, or might even have the skill to write. (A small sketch follows this list.)
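As a hypothetical illustration of those last points (not output from any particular tool), here's the kind of edge-case handling, explicit typing, and tests I mean, in a small TypeScript sketch:

```ts
type Stats = { count: number; mean: number | null };

// Summarize a list of numbers, handling the edge cases a rushed human
// (or an unchecked LLM) might skip: empty input and non-finite values.
function summarize(values: readonly number[]): Stats {
  const clean = values.filter((v) => Number.isFinite(v));  // drop NaN / Infinity
  if (clean.length === 0) return { count: 0, mean: null }; // no meaningful mean
  const mean = clean.reduce((sum, v) => sum + v, 0) / clean.length;
  return { count: clean.length, mean };
}

// Tests an LLM might also draft; you still own verifying they're correct.
console.assert(summarize([]).mean === null);
console.assert(summarize([2, 4]).mean === 3);
console.assert(summarize([1, NaN, 5]).count === 2);
```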
However, the responsibility lies with the developer to ensure that the code output is guarded and well-monitored. Someone who doesn't care would have done a shoddy job at any point in history. The presence of LLMs doesn't change that.
The art of actually giving a f*ck
There are a lot of devs out there who don't care.
But you're no such dev. You DO care.
Otherwise you wouldn't be here on dev.to, learning from people's experiences.
I recently wrote about what new devs should care about to grow in their career. It's a LOT MORE than just code.
Maybe I'll introduce some AI into my VS Code.
I think it's a matter of when, not if.
What's more important is… as long as I care about making sure my programming output is readable, performant, high quality, and easily reviewable, I think I'll be fine, and so will you.
P.S.
If you want an example of something I care deeply about, and that has both great code 💪 and… less-than-excellent code 🤣, take a look at our open-source repo!
It's something that lets you spot how long it takes you to deliver your code, how many times PRs get stuck in a review loop, and just generally how well your team ships code.
✨ Open-source DORA metrics platform for engineering teams ✨
Open-source engineering management that unlocks developer potential
Introduction
Middleware is an open-source tool designed to help engineering leaders measure and analyze the effectiveness of their teams using the DORA metrics. The DORA metrics are a set of four key values that provide insights into software delivery performance and operational efficiency.
They are:
- Deployment Frequency: The frequency of code deployments to production or an operational environment.
- Lead Time for Changes: The time it takes for a commit to make it into production.
- Mean Time to Restore: The time it takes to restore service after an incident or failure.
- Change Failure Rate: The percentage of deployments that result in failures or require remediation.
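To make those definitions a little more tangible, here's a rough, back-of-the-envelope sketch of how the four metrics could be computed from deployment and incident records. The data shapes and field names are my own assumptions for illustration, not Middleware's actual data model or code:

```ts
// Illustrative shapes only; real systems pull these from CI/CD, VCS, and incident tools.
interface Deployment {
  deployedAt: Date;
  commitAuthoredAt: Date; // when the underlying change was first committed
  failed: boolean;        // did this deployment cause a failure or need remediation?
}

interface Incident {
  startedAt: Date;
  resolvedAt: Date;
}

const HOUR_MS = 60 * 60 * 1000;

function doraMetrics(deployments: Deployment[], incidents: Incident[], windowDays: number) {
  const n = Math.max(deployments.length, 1);

  // Deployment Frequency: deployments per day over the reporting window.
  const deploymentFrequency = deployments.length / windowDays;

  // Lead Time for Changes: average commit-to-production time, in hours.
  const leadTimeHours =
    deployments.reduce((sum, d) => sum + (d.deployedAt.getTime() - d.commitAuthoredAt.getTime()), 0) /
    n / HOUR_MS;

  // Change Failure Rate: share of deployments that resulted in failures.
  const changeFailureRate = deployments.filter((d) => d.failed).length / n;

  // Mean Time to Restore: average time from incident start to resolution, in hours.
  const meanTimeToRestoreHours =
    incidents.reduce((sum, i) => sum + (i.resolvedAt.getTime() - i.startedAt.getTime()), 0) /
    Math.max(incidents.length, 1) / HOUR_MS;

  return { deploymentFrequency, leadTimeHours, changeFailureRate, meanTimeToRestoreHours };
}
```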