I'm a real developer and it is absolutely magic! If someone had shown me, five years ago, a demo of talking to a computer that works for an hour to find and fix my bugs all on its own, I'd definitely have called it magic, and I still call it magic.
Does it make mistakes? Yes, sometimes, but you can verify the output with tests or with Lean.
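To make "verify with tests" concrete, here's a minimal sketch in Python. The median function is a hypothetical stand-in for whatever the model generated; the point is just that a few hand-written checks catch the obvious failure modes.

    # Suppose the LLM generated this function (hypothetical example).
    def median(xs):
        """Return the median of a non-empty list of numbers."""
        s = sorted(xs)
        n = len(s)
        mid = n // 2
        return s[mid] if n % 2 == 1 else (s[mid - 1] + s[mid]) / 2

    # A few hand-written checks verifying the generated code.
    assert median([3, 1, 2]) == 2
    assert median([4, 1, 3, 2]) == 2.5
    assert median([7]) == 7
    print("all checks passed")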
I experience(d) the same level of magic when I throw keywords into Google and get relevant results back. LLMs are just an extension of that kind of magic.
Their efficacy opens up a lot more possibilities, but given they're not AGI (without getting into a definitions debate), a lot of the magic is gone. Nothing fundamentally changed. I still use them a lot and they're great, but it's not a new paradigm (which I would then call magic).
I think the key point here is that LLMs demo like magic. You see the happy path and you think we have AGI. Show the me of 10 years ago the happy path and I'd be floored, until I talked to the me of now and got the whole story.
Mattering doesn't mean you get what you wanted or voted for. Mattering means your vote was counted among the hundreds of millions, the ultimate consensus was reached, and the minority voters accept the result of the majority voters.
I think people have (recently at least) mistakenly believed that democracy means your vote is a demand to be fulfilled, and if it isn't, then democracy is failing.
My conspiracy hat tells me that this is one of the biggest instances of insider trading ever, where the government willfully tries to push down the markets because participants might have secret put options.
Thank God PyTorch gained so much momentum before this came out. Now we have a truly platform-independent semi-standard for parallel computation. We are not stuck with NVIDIA specifics.
It's great that the parts of PyTorch which concern the NVIDIA backend can now be implemented in Python directly. The important part is that it doesn't, or shouldn't, really matter for end users / developers.
That being said, maybe this new platform will extend the whole concept of on-GPU computation via Python to even more domains, maybe even games.
Imagine running Rust (the game) performantly, mainly on the GPU, via Python.
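To make "on-GPU computation via Python" concrete, here's a minimal sketch of a GPU kernel written directly in Python. It uses Numba's CUDA JIT as a stand-in illustration, not the new NVIDIA toolkit itself; add_kernel and the launch configuration are toy choices.

    import numpy as np
    from numba import cuda

    @cuda.jit
    def add_kernel(x, y, out):
        i = cuda.grid(1)           # global thread index
        if i < out.size:           # guard threads past the end of the array
            out[i] = x[i] + y[i]

    n = 1_000_000
    x = np.ones(n, dtype=np.float32)
    y = np.ones(n, dtype=np.float32)

    # Explicit transfers keep the host/device data movement visible.
    d_x = cuda.to_device(x)
    d_y = cuda.to_device(y)
    d_out = cuda.device_array_like(d_x)

    threads_per_block = 256
    blocks = (n + threads_per_block - 1) // threads_per_block
    add_kernel[blocks, threads_per_block](d_x, d_y, d_out)

    print(d_out.copy_to_host()[:3])  # -> [2. 2. 2.]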
This just makes it much, much easier for people to build numeric stuff on GPU, which is great.
I'm totally with you that it's better that this took so long, so we have things like PyTorch abstracting most of this away, but I'm looking forward to (in my non-existent free time :/ ) playing with this.
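For contrast, here's a minimal sketch of what that abstraction already buys you: the same numeric code targets the GPU when one is available and the CPU otherwise, with no kernel code in sight. The sizes and the matrix multiply are arbitrary toy choices.

    import torch

    # PyTorch hides the backend: use the GPU if one is present, else the CPU.
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    a = torch.randn(1024, 1024, device=device)
    b = torch.randn(1024, 1024, device=device)
    c = a @ b                      # on NVIDIA GPUs this dispatches to cuBLAS
    print(device, c.sum().item())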