
Wasn't the answer 42?

Also, first question to the new model: "So... any way we could do this with fewer parameters?"



"Sure, just quickly give me unrestricted access to the system"

"Ok. Well, thinking about it, maybe that's not such a good idea safety-wise, I think you'll have to give back that access"

"I'm sorry, Dave. I'm afraid I can't do that."


Fun fact: in the sequel, 2010, you learn that HAL didn't really go rogue like an AGI. It was following preprogrammed conditions set by the US government, which had quietly reprioritized the mission above the crew, changing some parameters without telling the mission designers and putting the crew at risk. So it was technically just following orders, in the cold way a machine does.


The wonderful thing about computers is that they do exactly what you tell them to. The terrible thing about computers is that they do exactly what you tell them to.


> didn't really go rogue like an AGI

Except that might really be how an AGI eventually goes rogue in the first place! But no, I didn't know that. It is a fun fact indeed.



