Absolutely. I can't tell you how many times I've been staring down a project that I basically know how to do, but that has a bunch of moving parts I need to account for. I'll just stare at the screen, drink more coffee, read HN, do basically anything besides actually work on it, because it's too big and unactionable. Some of this is actually useful brain-organization time, but some is just time wasting.
Eventually I'll get the bright idea to make a notes doc listing all the things that are bothering me. Just writing them down gets them out of my nebulous headspace and into a permanent record, which inevitably makes it all less scary - and before you know it, I'm halfway to a runbook or a technical design, and it turns out the project will actually only take a day or two once all the prep work is done.
> In 6 months you'll be able to get memory chips and GPUs for nothing.
I highly doubt that. Memory chip production takes years to scale up, which is partly why the memory market (both RAM and solid-state storage) is so susceptible to "pig cycles": high prices incentivize new players to enter the market (less likely than decades ago, given how much capital is needed and how complex the technology has become) and established players to scale up production, and then prices collapse due to oversupply.
For GPUs, the situation is even worse. The crypto mining craze at least ran on consumer GPUs, so there really was an influx of cheap second-hand gear once that market collapsed due to ASICs - but this time? These chips don't even have the hardware for video output any more, so even if GPU OEMs got a ton of leftover AI chips, they couldn't build general-purpose GPUs out of them.
Additionally, this assumes that the large web of AI actors collapses within the next 6 months, which is even more unlikely - there's just too much actual cash floating around in the market.
You can't; it doesn't have any video output port, per the product brief [1].
Of course, if one were so inclined, I'd take a wild guess and say you could try something like Steam Remote Play, but I wouldn't bet on that actually working. And even if you got it working - per an analysis by the German tech publisher Heise, the bloody thing has less shader compute capacity than the iGPU in AMD's Ryzen CPUs [2]. €30,000 - and it'll probably struggle to run GTA 5.
> all the memory chips and energy. And GPUs of course.
Yes and no and yes.
AI is pushing electricity prices up, but actual US electricity consumption was pretty flat for 20 years and is barely increasing even now. If the bubble keeps going strong, we might see AI reach 10% of total demand several years from now.
RAM and GPUs are a whole different story - there, AI demand really is dominating the market.
My entire (non-American) education was exam based. The exams were tightly supervised, no books etc. Everything had to be memorised. Cheating was impossible.
Funny thing is, memorising something is a big help to understanding it.
In that system, AI is a very useful tool. AFAIK, this is how they still do it in many Asian countries.
It worked pretty well. Produced a lot of educated people.
Yes, same. We also had oral exams for particular subjects, where you essentially had a discussion with a teacher or a panel of teachers on a given topic. All of that will eventually come back. I don't see how it doesn't come back as a normal thing in schools.