Good thing that not all of the use cases that are finding footholds today require skills it does not yet have. Why are "significant reasoning skills" needed to help you write a song, or help rewrite your resume, or find an answer to a niche piece of knowledge, or fuck... this is exhausting.
I'd have thought that people on HN would have far more vision than this. For every single "well it can't do that yet" observation there are dozens of use cases people are finding TODAY. It's already widely useful and this is year one.
> Why are "significant reasoning skills" needed to help you write a song, or help rewrite your resume, or find an answer to a niche piece of knowledge, or fuck... this is exhausting.
The discussion started with someone claiming that ChatGPT will soon be trained on docs, code, and Jira tickets and completely replace engineers, which IMO would require significant reasoning skill.
I agree that for the tasks you described, ChatGPT may find its niche.
That was actually me. And it already can write Jiras, scaffold code, and help write docs. Doesn't need any significant reasoning, just a well trained model and contextual data.
However, I didn't say it was going to 100% replace anyone. The horizon looks like GPT will become a significant assistant tool for numerous tasks. I think it will need humans to review and approve output and direct it for the foreseeable future.
> And it already can write Jiras, scaffold code, and help write docs.
Yes, and my hypothesis is that's where it will stop, because core, meaningful engineering work on a complex product/system requires far greater reasoning abilities.
That's like saying junior developers will never take your job because they just don't know enough context or have enough experience.
Why do you think there is any reason for progress to stop? When has progress ever stopped? Even looking at the simplistic GitHub Copilot from ~3 years ago, I could see the writing on the wall for junior devs. Once these models have consumed your entire codebase, it's quite apparent the gestalt has changed in a way I think you are underestimating.
Complex products/systems are exactly where you'll need an AI code writer. It knows everything in your massive codebase. It will be able to suggest efficiencies you couldn't possibly be aware of unless you'd read every line of code yourself.
Remember when AlphaGo made that one move that showed it was far, far superior to any human at improvising and seeing many moves ahead? It shocked the whole community; even the DeepMind engineers were stunned. That's going to be you one day. I can see absolutely no reason for this not to happen, barring some as-yet-undiscovered logical limit.
> Why do you think there is any reason for progress to stop?
I gave you my reasons: a junior developer has a proven ability to learn reasoning and context, and ChatGPT does not have a proven ability to learn to reason.