I think you're massively underestimating the development cost relative to the number of people who would actually purchase a higher-VRAM card at a higher price.
You'd need to sell hundreds of thousands of units for it to really make much of a difference.
Well, IIUC the argument is more like "shipping more than 12 GB of VRAM and raising the price lets consumers run bigger LLMs on consumer hardware, and that drives premium-ness / market share / revenue, without subsidizing the price."
I don't know where this idea is coming from, although it's all over these threads.
For context, I write a local LLM inference engine and have no idea why this would shift anyone's purchase intent. The models big enough to need more than 12 GB of VRAM are also slow enough on consumer GPUs that they'd be absurd to run, like less than 2 tokens/s. And that's with 64 GB of unified memory on an M2 Max and a 24 GB 3090 Ti.
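To put a rough number on that claim (the bandwidth figures below are ballpark assumptions, not benchmarks): decoding a dense LLM reads every weight once per generated token, so throughput is roughly memory bandwidth divided by model size, and once the model spills past VRAM you're gated by the PCIe link instead of the GPU:

    # Back-of-envelope decode throughput: every generated token
    # streams all model weights through the compute units once.
    def tokens_per_sec(model_gb: float, bandwidth_gbs: float) -> float:
        """Upper bound on decode speed: bandwidth / bytes read per token."""
        return bandwidth_gbs / model_gb

    # Illustrative assumptions: a ~30 GB quantized model, ~1000 GB/s
    # of VRAM bandwidth (3090-class card), ~25 GB/s over PCIe 4.0 x16.
    print(f"fits in VRAM: ~{tokens_per_sec(30, 1000):.0f} tok/s")  # ~33
    print(f"PCIe-bound:   ~{tokens_per_sec(30, 25):.1f} tok/s")    # ~0.8

The point being: as soon as part of the model lives in system RAM, the bus sets the ceiling, and for models in this size class that ceiling is under a token per second.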