Trust me, laziness is a real thing; even when everything in the world points towards doing the work being the best thing for you, you just can't be bothered.
Of course it's a real thing. He never said it wasn't. What he said was that it doesn't explain anything, and that's true.
Laziness is a symptom, not a cause. It's like diagnosing someone with a "cough". It may be correct, but it's pointless, because it's obvious and tells you absolutely nothing about what to do.
At what point is a personality merely a symptom? If everything about us is a result of our ailments, and all of our ailments have treatments, how does a person know which parts of his personality are him and which aren't? What am "I" then?
I would probably say that we can classify our personality as the difference between the set of things anyone with our abilities, in the same situation, would do, and the set of things we actually do.
That being said, our ailments could probably be identified as the things that affect all aspects of our life equally and constantly over time (i.e. something that doesn't fluctuate across the years is probably an untreated ailment).
You're the little sane bit sitting on top herding the various cats of brain lobes, psychological disorders and just plain basic insanity into acting as a useful whole every day.
A sufficiently self-managing nutter is indistinguishable from a genius.
In one of the articles I read on this (sorry no link) I'm pretty sure it mentioned their new philosophy, for hardware and software, was rapid iteration.
Firefox can't do anything to prevent a plugin from crashing. That, and I'd wager most Flash movies are very poorly coded, resulting in exceptions that could've been prevented. You can tell by browsing the web with a debug build of Flash.
Why is it so important? Is it for gaming, maybe? Because I am pretty sure it's impossible to notice that while working, because there is no flicker between frames anyway. And gaming on a 4K screen (at PC viewing distance) is pretty crazy.
Yes. Quake players swear by 120Hz+ screens. We're all geeks and nerds who still use CRT monitors because those new hip 'flat' things have too much input lag and too low a refresh rate. In a fast-paced game like Quake, these things make the difference between winning a duel and losing one.
Also in the fighting game community (King of Fighters, Street Fighter), input lag is a killer. If your monitor has lag, you won't be able to chain your combos as well as the pros.
For working and document editing and browsing, 50Hz with slow input time is fine, but for 'serious' gaming it's not, I guess.
It's also one of the arguments I hear why PC gamers consider themselves 'more awesome'. "Why would I play on an xbox, on one of those TVs with the horrible refresh rate, and all the post-processing that adds milliseconds of delay"
I see. But I suppose this kind of monitor isn't made for gamers anyway. Ultra-high resolution is good for sharp text/CAD/etc. rendering. A game is just too fast-paced, and textures aren't that high-resolution anyway.
Compare that to the original game data being around 30MB.
Interestingly, fan texture projects almost always have super high fidelity that normal games don't. Most of the Epsilon textures are in the 4kx4k neighborhood.
If I ever get fu money I'm going to start double-blind testing everything - screen refresh rates; mouse rates; coffee preparation methods; speaker cables[1]; bitrates for media; compression for media; everything I can.
You won't be able to buy the time to run those adverts, unfortunately.
Kalle Lasn[1] of Adbusters fame has been trying to do it for years, and the networks just say "We won't run adverts contrary to our big sponsors". Yes, even PBS and the CBC say this.
Can't say for gaming, but 120Hz is great for watching movies/anything at 24fps. With a 60Hz screen you gotta do something to make that 2:5 frame-to-refresh ratio work (either holding frames unevenly, as in 3:2 pulldown, or attempting to interpolate/interlace). At 120Hz, you get a nice clean integer ratio (each frame shown exactly 5 times), so you can watch 24fps 'as it is'.
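To make the cadence point concrete, here's a little sketch (my own illustration, not from the thread) of how 24 source frames map onto 60Hz vs 120Hz refreshes:

```python
def repeat_pattern(fps, hz, frames):
    """For each source frame, how many display refreshes it occupies."""
    pattern = []
    shown = 0
    for i in range(1, frames + 1):
        target = (i * hz) // fps  # total refreshes that should have elapsed
        pattern.append(target - shown)
        shown = target
    return pattern

# 24fps on 60Hz: frames alternate 2 and 3 refreshes (pulldown judder)
print(repeat_pattern(24, 60, 8))   # [2, 3, 2, 3, 2, 3, 2, 3]
# 24fps on 120Hz: every frame held exactly 5 refreshes (even cadence)
print(repeat_pattern(24, 120, 8))  # [5, 5, 5, 5, 5, 5, 5, 5]
```

The uneven 2/3 alternation is what makes panning shots stutter on 60Hz panels; the constant 5 is why 120Hz looks smooth without interpolation.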
Anecdotally... the way I perceive framerate is related to the resolution. Take a horizontal panning shot at 720p/60Hz: a vertical line might move 2 or 3 pixels per frame. At 2560×1600/60Hz the same line jumps 4 or 6 pixels per frame and starts to appear discontinuous in vision. High contrast makes it more obvious. Motion blur removes this in exchange for input lag, but a high frame rate is a better solution.
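The arithmetic behind that is straightforward (my numbers below, just for illustration): for a pan that crosses the full screen width in a fixed time, the per-frame jump scales with horizontal resolution and shrinks with refresh rate.

```python
def pixels_per_frame(width_px, pan_seconds, hz):
    """Distance a full-width pan moves per displayed frame, in pixels."""
    return width_px / (pan_seconds * hz)

# A 2-second full-width pan (hypothetical example):
print(pixels_per_frame(1280, 2, 60))   # 720p @ 60Hz: ~10.7 px per frame
print(pixels_per_frame(2560, 2, 60))   # 2560-wide @ 60Hz: ~21.3 px per frame
print(pixels_per_frame(2560, 2, 120))  # doubling the Hz halves the jump again
```

So at double the resolution, the same pan produces jumps twice as large, and doubling the refresh rate buys that back, which matches the poster's 2-3 px vs 4-6 px observation.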
I've had numerous discussions with people about 60Hz vs 120Hz, including one where a friend cited some study (sorry, no link) which claimed that, perceptually, people couldn't tell the difference between these two refresh rates. I was incredulous. I'd been playing Quake at a high level for a number of years, and anything lower than a v-sync'd 120Hz setup was painful; on the flip side, playing 120fps@120Hz (CRT) felt so fluid, like water (hard to describe; you have to play on both setups to feel it).
I haven't had the same experience on any flat panel display I've used till now but I'm on the lookout for a good 120Hz LCD gaming monitor in the hope I get the same experience again.
If you ever have a chance to play with a monitor at 120Hz, especially with LightBoost, it's so obvious that you wouldn't feel the need for a double-blind test. If you're not playing games, it's most noticeable when scrolling down a page in your browser or moving the mouse.
It's very easy to spot the difference for anyone, even the uninitiated.
Just launch your favorite first-person shooter and rotate the view very quickly. Of course the game needs to be running at framerates above 60 FPS with v-sync off (or above 120 FPS with v-sync on). The difference between 120 Hz and 60 Hz will be immediately visible in the "smoothness" of the rotation.
The next generation of graphics card will probably be able to take the dubious honor of being able to run Crysis (2007) at 1920x1080 / 60fps on a single GPU...
You still have that option, I can't imagine any 4k displays would fail to support running at a 2x multiplier given that it is designed specifically for that.
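A quick sanity check on the 2x claim (my arithmetic, not from the thread): a 4K UHD panel is exactly twice a 1080p panel in each dimension, so 1080p content maps onto clean 2×2 pixel blocks with no fractional scaling.

```python
# 4K UHD vs 1080p: integer 2x scaling in both dimensions
UHD = (3840, 2160)
FHD = (1920, 1080)

scale = tuple(u / f for u, f in zip(UHD, FHD))
print(scale)  # (2.0, 2.0) -- each 1080p pixel becomes a 2x2 block
```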
I notice the difference whenever my desktop has quietly switched back to 60Hz from 120Hz, even in non-gaming applications. I'm not going to claim that it is important. But the wet-glass smoothness of 120Hz makes me happier :)
Thanks for the note! Although I don't think that makes it a bad example: there's no requirement that the double construction always be used in order to count. Its being perceived as correct when used is enough.