
You can ask the AI to generate a picture of horrid things, and it will oblige.


You can generate things the author doesn't like. And since you're doing it on your video card at home, nobody can stop you.


If a computer model could produce the world's best porn, would that be a good or bad thing? Many of the harmful effects of porn would be amplified, but it would reduce the exploitation of real people in the industry. A moral question society will soon face, I think.


I think the bad things would be targeted use of realistic images: for example, imagine the horrible things some people experience in school multiplied by "leaked" photos, or someone's abusive ex distributing "proof" of their infidelity or unsuitability to have custody, etc. There's a theory that over time people would stop believing everything they see, but there's still plenty of time for millions of tragedies before that happens, if it ever does. Forensics is going to be a growth industry.


I really don't understand why people are pretending the licensing makes any difference here.


I, meanwhile, find AI research distasteful because it’s written in Python.


Yeah we need Trigger Warnings on any ML repo that uses Python. The trauma that comes from dealing with Python dependency management is really hurtful and non-inclusive.


Which I don't think is the end of the world. I can draw horrid things on my iPad in Procreate, or even with just a pencil on paper. What's new here is ease of access and hyperrealism. This is more a problem for fake news than for generating bad/shock images, which were already possible.



