Yours is even less so. Like the parent, I found the story uninteresting. If anyone with a sad job made the front page, goodbye IT news. In the end, if the guy thinks his job is sad, he should quit it or campaign against evictions. Pretending to be sad and effectively asking for sympathy is... I can't find the word.
Bury it all you want; I have sympathy for the evicted, not this sleazy redditor. Posts like these are the reason I removed 'reddit.com' from my reddits.
TBH, I would like to see all the like/+1/tweet/digg buttons gone; they are a big privacy hazard. This functionality should be built into the browser or available through extensions.
Ghostery is a general tracker blocker. FBB only blocks, as the name implies, Facebook-related stuff - at least that's what I infer from a quick skim through the description.
So, if you only want Facebook gone, by all means use FBB. If you don't want trackers in general (Google Analytics, etc.), use Ghostery.
The article did not hide any of these facts, though, so it's not "utter nonsense". The title of the original article is "A Higgs Setback: Did Stephen Hawking Just Win the Most Outrageous Bet in Physics History?", nothing like "RIP Higgs". The article is, at best, trying to be provocative. But, more importantly, you believe in god?
The article isn't just about Spotify, but anyway: bandwidth usage is a concern now, but as bandwidth goes up there will certainly be more and more services taking advantage of P2P for faster and better service. It's not necessarily a bad thing.
Actually, the difference is not that big. It's all in our brain anyway, and most virtual experiences evoke brain responses similar to real ones (theater, TV, cinema, mirror neurons, etc.). What virtual items lose are the finer properties like touch, smell, and permanence.
I think the bigger problem is both city planning and immobile technology: most of us are forced to live packed away from nature in cities planned for the long-gone industrial era, and until recently we were forced to sit in front of our heavy information devices. I believe the mobile revolution is going to bring back a lot of those "physical" elements that we are missing now.
It's not the first time societies have faced transformations. We already eat virtual food: I certainly don't remember slaughtering that many chickens, pigs, and cows lately, and I've only milked a goat once or twice in my life.
You bring up some good points with the parallels in the brain; however, I think more is lost in the exposure to different experiences that you simply can't replicate online.
For most, it'll be difficult to differentiate between things (physical books, library visits, spending time outdoors, etc) that we don't want to leave behind for nostalgic reasons and things that we shouldn't leave behind because they help us with maturity, growth, etc.
I'm wondering what people's thoughts are on this. Technological change is a part of society, yes, but is it always best for the individual?
I'm waiting for that ability for Twitter myself. If they could mesh the G+ stream with my Twitter stream, I might be compelled to use it. Buzz got this wrong too. I'm not sure Google understands yet that they can play nice with others and succeed; in fact, they have to in order to succeed. They seem to have their eyes set pretty squarely on a walled garden that can't be extended outside their services, at least so far.
Check out Tweetdeck. It does pretty much exactly what you're looking for with FB and Twitter (and whatever other networks you use). G+ just hasn't opened up an API yet for anyone to tap into it.
I've used Tweetdeck before. Really nice Twitter app, but I don't use FB so there's no need to mesh the two. My point is, I use Twitter and it serves me well. Without Twitter integration into G+, I have little incentive to use G+. All the friends I care about are already on Twitter; only one or two are on G+. Unless I can read my Twitter feed in G+ in addition to my G+ feed, it's pretty useless for me right now. Maybe when they release an API I'll come back and check it out. Thanks for the suggestion either way.
The processor would be no different to program than what we already know; they haven't invented a new computational paradigm. It's still Turing machines.
Just to illustrate your point about the intricacies of simulating synapses realistically, and to show how far this is from actual biological systems, here is a model of a single NMDA receptor: [1]. It requires 26 floating-point numbers just for the state-change rates and 20 floating-point state variables. And that's just to simulate a single receptor!
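To give a feel for what such a kinetic model looks like (this is NOT the cited 20-state NMDA model, just a toy illustration I made up), here is a minimal 3-state receptor scheme (closed <-> open <-> desensitized) integrated with forward Euler. The real model has far more states and rate constants, but the structure is the same: one rate constant per transition, one state variable per occupancy probability.

```python
# Toy kinetic scheme: Closed <-> Open <-> Desensitized.
# All rate constants are illustrative, not from the cited model.

def step(state, rates, dt):
    """Advance state occupancies by one Euler step of size dt."""
    c, o, d = state                      # occupancy probabilities
    k_co, k_oc, k_od, k_do = rates       # transition rate constants
    dc = -k_co * c + k_oc * o
    do = k_co * c - (k_oc + k_od) * o + k_do * d
    dd = k_od * o - k_do * d
    return (c + dc * dt, o + do * dt, d + dd * dt)

state = (1.0, 0.0, 0.0)                  # start fully closed
rates = (5.0, 1.0, 0.5, 0.1)             # hypothetical rates (1/ms)
for _ in range(1000):
    state = step(state, rates, 0.001)    # dt = 1 microsecond

# The derivatives sum to zero, so total occupancy stays at 1.
```

Even this toy version needs 4 rate constants and 3 state variables for a single receptor with three states; scaling that to 20 states and 26 rates per receptor, across billions of synapses, shows why realistic whole-brain simulation is so far off.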