
Trust me, laziness is a real thing; even when everything in the world points towards doing the work being the best thing for you, you just can't be bothered.


Of course it's a real thing. He never said it wasn't. What he said was that it doesn't explain anything, and that's true.

Laziness is a symptom, not a cause. It's like diagnosing someone with a "cough". It may be correct, but it's pointless, because it's obvious and tells you absolutely nothing about what to do.


At what point is a personality merely a symptom? If everything about us is a result of our ailments, and all of our ailments have treatments, how does a person know which parts of his personality are him and which aren't? What am "I" then?


I would probably say that we can define our personality as the difference between what anyone with our abilities, in the same situation, would do and what we actually do.

That being said, our ailments could probably be identified by their affecting all aspects of our life equally and constantly over time (i.e. something that doesn't fluctuate across the years is probably an untreated ailment).


You're the little sane bit sitting on top herding the various cats of brain lobes, psychological disorders and just plain basic insanity into acting as a useful whole every day.

A sufficiently self-managing nutter is indistinguishable from a genius.


That's an excellent question to which I am fairly convinced there is no answer, but it's a good one to think about in any case.


>even when everything in the world points towards doing the work being the best thing for you, you just can't be bothered.

This is also a perfect description of depression.


This is just the mod SDK; it isn't the Source engine code.


In one of the articles I read on this (sorry, no link), I'm pretty sure it mentioned that their new philosophy, for both hardware and software, was rapid iteration.


They're implementing it at the moment; release-channel Firefox doesn't have the capability yet.


Looking nice, hope they continue to improve performance and stability too.

I'm having a huge number of issues with the Flash plugin at the moment; not sure whether that's their fault or Adobe's, though.


Firefox can't do anything to prevent a plugin from crashing. That, and I'd wager most Flash movies are very poorly coded, resulting in exceptions that could've been prevented. You can tell by browsing the web with a debug build of Flash.


Isn't it possible to just check the source code of Firefox add-ons?

It would be good if somebody could confirm whether that is the case.


Can anyone confirm or deny whether the article points to any specific code? It's down.


Didn't they have to lure him away from the UK to be in a position to abduct him in the first place, though?

So really it depends where you are and what you're willing to do.


Honestly I think my next monitor will have to be a 120Hz 4k screen.

I'm not sure I could take the step back down to 60Hz after getting used to it; for me it's a much bigger deal than resolution.


Why is it so important? Is it for gaming, maybe? Because I am pretty sure it's impossible to notice that while working, because there is no flicker between frames anyway. And gaming on a 4K screen (at PC viewing distance) is pretty crazy.


Yes. Quake players swear by 120Hz+ screens. We're all geeks and nerds who still use CRT monitors because those new hip 'flat' things have too much input lag and too low a refresh rate. In a fast-paced game like Quake, these things make the difference between winning a duel and losing one.

Also, in the fighting game community (King of Fighters, Street Fighter) input lag is a killer. If your monitor has lag, you won't be able to chain your combos as well as the pros.

For working, document editing and browsing, 50Hz with a slow input response is fine, but for 'serious' gaming it's not, I guess.

It's also one of the arguments I hear for why PC gamers consider themselves 'more awesome': "Why would I play on an Xbox, on one of those TVs with the horrible refresh rate and all the post-processing that adds milliseconds of delay?"


I see. But I suppose this kind of monitor isn't made for gamers anyway. Ultra-high resolution is good for sharp text/CAD/etc. rendering. A game is just too fast-paced, and textures aren't that high-resolution anyway.


If you are talking about Quake, fans have made plenty of high-resolution textures; there is a 2.5GB high-resolution texture pack (HRTP) at ModDB: http://www.moddb.com/mods/quake-epsilon-build

Compare that to the original game data being around 30MB.

Interestingly, fan texture projects almost always have super high fidelity that normal games don't. Most of the Epsilon textures are in the 4kx4k neighborhood.


Gamers may swear by them, but that doesn't mean there is an actual difference. Have there been any double-blind tests between 60Hz and 120Hz?


If I ever get fu money I'm going to start double-blind testing everything: screen refresh rates; mouse rates; coffee preparation methods; speaker cables[1]; bitrates for media; compression for media; everything I can.


A site focused on conducting controlled double-blind tests would be amazing.


Blind Busters


Nice idea. I'm planning on using my fu money to run TV adverts countering the nonsense claims made by other 'hydro-nano-bollocks'-pushing adverts.


You won't be able to buy the time to run those adverts, unfortunately.

Kalle Lasn[1] of Adbusters fame has been trying to do it for years, and the networks just say "We won't run adverts contrary to our big sponsors". Yes, even PBS and the CBC say this.

http://en.wikipedia.org/wiki/Kalle_Lasn


Can't say for gaming, but 120Hz is great for watching movies/anything at 24fps. With a 60Hz screen you have to do something to make that 2:5 ratio work (either holding frames for uneven lengths or attempting to interpolate/interlace). At 120Hz you get a nice clean integer ratio, so you can watch 24fps 'as it is'.
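
To make the arithmetic concrete, here's a quick Python sketch (my own illustration, not from any of the posters): it counts how many refresh cycles each 24fps frame is held for on a 60Hz screen versus a 120Hz one. On 60Hz the cadence comes out uneven (the judder you have to hide by holding or interpolating frames); on 120Hz every frame gets exactly 5 refreshes.

    import math

    def refreshes_per_frame(fps, refresh_hz, n_frames=4):
        # How many refresh cycles each source frame occupies, using simple
        # "show the frame until its presentation time has passed" timing.
        counts = []
        for i in range(n_frames):
            start = math.floor(i * refresh_hz / fps)
            end = math.floor((i + 1) * refresh_hz / fps)
            counts.append(end - start)
        return counts

    print(refreshes_per_frame(24, 60))   # [2, 3, 2, 3] -> uneven cadence (judder)
    print(refreshes_per_frame(24, 120))  # [5, 5, 5, 5] -> clean integer ratio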


Anecdotally... the way I perceive framerate is related to the resolution. Take a horizontal panning shot at 720p/60Hz: a vertical line might move 2 or 3 pixels per frame. At 2560x1600/60Hz the same line jumps 4 or 6 pixels per frame and starts to appear discontinuous in vision. High contrast makes it more obvious. Motion blur removes this in exchange for input lag, but a high frame rate is a better solution.
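
A rough back-of-the-envelope version of that in Python (the pan speed is my own assumption, picked to roughly match the numbers above): the same pan covers about twice as many pixels per frame on a screen twice as wide, and doubling the refresh rate halves the jump again.

    def px_jump_per_frame(width_px, refresh_hz, pan_widths_per_sec=0.14):
        # Pixels the image shifts between consecutive refreshes during a
        # horizontal pan covering 14% of the screen width per second.
        return width_px * pan_widths_per_sec / refresh_hz

    print(round(px_jump_per_frame(1280, 60)))   # ~3 px  (720p @ 60Hz)
    print(round(px_jump_per_frame(2560, 60)))   # ~6 px  (2560-wide @ 60Hz)
    print(round(px_jump_per_frame(2560, 120)))  # ~3 px  (120Hz halves the jump)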


I've had numerous discussions with people about 60Hz vs 120Hz, including one where a friend cited some study (sorry, no link) which claimed that, perceptually, people couldn't tell the difference between these two refresh rates. I was incredulous. I'd been playing Quake at a high level for a number of years and anything lower than a v-synced 120Hz setup was painful; on the flip side, playing 120fps@120Hz (CRT) felt so fluid, like water (hard to describe; you have to play on the two setups to feel it).

I haven't had the same experience on any flat-panel display I've used till now, but I'm on the lookout for a good 120Hz LCD gaming monitor in the hope that I get the same experience again.


I recently took a resolution downgrade to get a VG248QE for the 120Hz with LightBoost. See here: http://www.blurbusters.com/zero-motion-blur/lightboost/

If you ever have a chance to play with a monitor at 120Hz, especially with LightBoost, it's so obvious that you wouldn't feel the need for a double-blind test. If you're not playing games, it's most noticeable when scrolling down a page in your browser or moving the mouse.


It's very easy to spot the difference for anyone, even the uninitiated.

Just launch your favorite first-person shooter and rotate the view very quickly. Of course, the game needs to be running at framerates above 60 FPS with v-sync off (or at framerates above 120 FPS with v-sync on). The difference between 120Hz and 60Hz will be immediately visible in the "smoothness" of the rotation.


> And gaming on a 4K screen (at PC viewing distance) is pretty crazy.

Crazy Awesome.


Well yes, but I'd rather play at 1920x1080 and not have to sell a kidney for a graphics card that can drive the 4K display.


The next generation of graphics cards will probably be able to take the dubious honor of being able to run Crysis (2007) at 1920x1080 / 60fps on a single GPU...[1]

[1] http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review...


You still have that option; I can't imagine any 4K display would fail to support running at a 2x multiplier, given that the resolution is designed specifically for that.


I notice the difference whenever my desktop has quietly switched back to 60Hz from 120Hz, even in non-gaming applications. I'm not going to claim that it's important, but the wet-glass smoothness of 120Hz makes me happier :)


Are you using a CRT monitor?

I can't even tell the difference between 120Hz and 60Hz on an LCD monitor, simply because the display doesn't flicker.

If you're using your monitor for gaming, well, that's a different story.


Is there a technology aside from VGA that can drive 4k @ 120Hz?


I'd like to point out that, as far as I'm aware, modern French speakers rarely use 'ne ... pas' and favour simply using 'pas' to indicate a negative.

That's not to say it tells us anything about double negatives, but French probably shouldn't be used as an example of them.


Thanks for the note! Although I don't think that makes it a bad example: there's no requirement that a double construction always be used in order to count; its being perceived as correct when used is enough.


I think the weirdest thing is that id Software have all their games ported to Linux (excluding Rage) but haven't released them yet.

