That's another way of stating the same thing, actually.
One can adopt a static definition of "general intelligence" from a point in history and use it consistently. In this case, GPT3+ is a leap in humanity's quest for AGI.
One can also adopt a dynamic definition of "general intelligence" as you described. In this case the equivalent statement is that, in hindsight, GPT3+ shows that language ability is not "AGI" but "merely" transformer models fed with lots of data. (And then humanity's goal would be to discover that nothing is "AGI" at all, since we'd have figured it all out!)
The fact that we see things differently in hindsight is already strong evidence that things have progressed significantly. It proves that we learned something we didn't know or expect before. I know this "feels" like any other day you've experienced, but let's look at the big picture more rationally here.