by mhdi_kr99 on 8/24/23, 1:13 PM with 65 comments
by jebarker on 8/24/23, 1:29 PM
by gorjusborg on 8/24/23, 2:21 PM
Those types feel that the end goal of 'working code' is all that matters, where 'working' means only that the functional requirements of the feature are met.
That mindset is an extreme. A developer's job is to solve a problem for someone, not to write code for its own sake, but habitually copy/pasting code without understanding it is a recipe for the eventual collapse of a codebase.
I see Copilot and other LLM code generation as methamphetamine for that sort of programmer. They will feel great, but the situation around them will degrade, those surrounding them will be unhappy, and eventually it will be clear the behavior was unsustainable.
From a different angle, I would feel uneasy giving others access to my proprietary data (code), and also accepting LLM-generated code into my company's products. The legal rules don't seem fully sorted.
by mattpallissard on 8/24/23, 1:39 PM
I guess I'll just add programming to that list and not think about it too much.
by Toutouxc on 8/24/23, 5:20 PM
So far GPT-4 has proven to be much more helpful in giving me pointers or straight up answers (that I DO check!) to some of my questions than any other source, tool or entity in my life. I've explored and lightly discussed topics I never knew I was interested in.
by baz00 on 8/24/23, 1:42 PM
by koryk on 8/24/23, 1:44 PM
by eigenhombre on 8/24/23, 2:01 PM
When I was learning Go I had Copilot on most of the time. I was amazed at how much progress I could make despite being new to the language. And, when switching to Emacs, I'd notice myself reflexively waiting for Copilot to supply autocompletions that weren't there.
Yet, strangely enough, I can actually write Go code now (with or without Copilot). All the troubleshooting, refactoring, redesigning, bug fixing, etc. that I still had to go through saw to that. (Presumably having programmed already for many years didn't hurt.) Repeated and quick exposure to the relevant idioms supplied by my tooling seems to have helped at least as much as it set me back.
On the lisp (Clojure, Elisp, ...) side of things, I have found Copilot (and ChatGPT) much less helpful, presumably because of the relative paucity of training data. Simple things such as matching parentheses seem to elude these tools, and hallucinations are worse and more frequent. I'm still feeling out how this difference w.r.t. VSCode/Copilot and other languages affects my experience and the quality of my output, but it definitely feels different (and, frankly, a little clumsier).
Over time I've learned how helpful different kind of tools are for making software. Learning one's editor well. Writing good (and fast) tests. Using linters and code formatters. Using type systems when available. The new AI tools certainly involve trade-offs, but as I gain experience with them I am becoming convinced that putting them aside will make programming a different, and probably much slower, experience than what most programmers come to experience. (Slow can be good, but probably not what most of us want.)
by aldarisbm on 8/24/23, 2:17 PM
You can use an LLM while learning yourself.
I agree with the take on Copilot being a hindrance to using your brain. While learning a new language it was really hard to tell whether whatever was being put in front of me (as autocomplete) was truly best practice or something tied together very loosely.
The extra time it takes you to google and learn about coding strategies, patterns, and design is what is actually valuable. Being able to speak to these things in conversation when defending the code you wrote on a PR makes a difference.
by jmarchello on 8/24/23, 3:20 PM
I recently decided to make my editor much more minimal and even went so far as to disable autocomplete entirely. So far I haven’t noticed a reduction in my productivity and actually find it much easier to think about what I’m writing since I’m not having to correct autocomplete and Copilot all the time. All the extra things popping up on the screen were really distracting and I find that for me, a minimal approach gives me more space to think deeply.
by dgroshev on 8/24/23, 2:00 PM
Problem is, LLMs are really good at uninsightful, repetitive, boilerplate-y bits of code that shouldn't be there. The moment I get to anything remotely complex or subtle, it either doesn't grasp what I'm doing, needs more coaxing than it's worth, or worst of all introduces subtle bugs. It just incentivises me to write mountains of pointless code instead of stopping and thinking if it even needs to be written in the first place.
What we get from code-writing LLMs is exactly the same kind of "improved productivity" we got a decade ago with IDEs autogenerating Java boilerplate. It might help in the short term, but long term it just masks the pain that drives tools and skills toward getting better, more concise, less repetitive, more insightful.
On the other hand, I'm really looking forward to tools that will be able to figure out how much "insight per line of code" our code has, some sort of "Kolmogorov complexity" but with LLMs. Anything too predictable, anything LLMs can easily fill in is a good code smell signal.
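The "insight per line" idea above can be roughly prototyped today without an LLM: compression ratio is a crude stand-in for predictability, since repetitive boilerplate compresses far better than dense, information-rich code. A minimal sketch (the `compressibility` function and the sample snippets are hypothetical illustrations, not an established metric):

```python
import zlib

def compressibility(source: str) -> float:
    """Crude proxy for predictability: compressed size over raw size.

    Lower values mean the code is more repetitive and boilerplate-heavy;
    values near (or above) 1.0 mean the text is dense and hard to predict.
    """
    raw = source.encode("utf-8")
    if not raw:
        return 1.0
    return len(zlib.compress(raw)) / len(raw)

# Fifty near-identical assignment lines: classic autogenerated boilerplate.
boilerplate = "\n".join(f"self.field_{i} = field_{i}" for i in range(50))

# A short, dense one-liner with little internal repetition.
dense = "x = [k for k, v in d.items() if v > threshold]"

# The repetitive block compresses far better than the dense snippet.
assert compressibility(boilerplate) < compressibility(dense)
```

An LLM-based version would replace zlib with per-token log-probabilities, which is closer to the Kolmogorov-complexity intuition the comment gestures at, but the ranking idea is the same: code a model can trivially fill in is a candidate for removal or abstraction.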
by adamsb6 on 8/24/23, 1:35 PM
For whatever reason the Google results just aren’t as useful as they used to be. Highly specific queries return very generalized results, and it takes a lot of effort to find something relevant.
With ChatGPT I can ask it how to do just about anything in any language and it nearly always gives me exactly what I need.
by userbinator on 8/24/23, 1:47 PM
by DigitalNoumena on 8/24/23, 1:57 PM
Sure, pure learning is enjoyable but also ephemeral. Building/doing = learning, yes, but there's no need to do everything.
by bitwize on 8/24/23, 1:50 PM
Despite the widespread availability of flat-packed IKEA-style furniture, there is still the desire for the sheer joy of craftsmanship... and there is still demand for the end product.
by taylodl on 8/24/23, 1:47 PM
I know a lot of people who learn by looking at code and then playing with it to understand it. Generative AI is great for those people - they can quickly get code they can start with.
Different strokes for different folks.
by kushie on 8/24/23, 1:28 PM
by Traubenfuchs on 8/24/23, 1:34 PM
Sounds the same as the people who rejected IDEs, or even syntax highlighting. Yes, those people existed, and some might still rawdog XML and Java in vi today.