From a doomer/EA perspective, publicly saying that GPT-5 is AGI or something close to it would likely inspire and accelerate labs around the world to catch up. Thus it was more "altruistic" / better aligned with humanity's interests to stay mum and leave the situation cloudy.
Having an existing example showing that something difficult is possible causes everyone else to replicate it much faster, like how a bunch of people started running sub-4-minute miles after the first guy did it.
If you know the outcome is favourable, then you go all in. Right now the other competitors are just trying to match GPT-4; if they knew AGI was achievable, they would throw everything they have at it so as not to be left out.
Show me the researchers and companies where the people in charge and the people doing the work don’t think it’s possible to get better than GPT-4 and are slow-rolling things.
I suppose there may be some that are slow-rolling because of their views on the existential risk of AGI. But that's not contingent on what OpenAI says or does.