Hacker News | procedural_love's comments

That's a frequency illusion, also called the Baader-Meinhof Phenomenon, not confirmation bias.


My health. The past several years have been a journey of finding a diet and lifestyle that minimizes the pain and discomfort.

The other half of that is, when genetics and circumstance have such a crippling impact on one's ability to live a "normal life", how do you cope with it emotionally and spiritually? That's the more challenging half of what I struggle with.


This is a line of reasoning that some physicists are exploring: https://arxiv.org/abs/1703.00058

> "Can the theory that reality is a simulation be tested? We investigate this question based on the assumption that if the system performing the simulation is finite (i.e. has limited resources), then to achieve low computational complexity, such a system would, as in a video game, render content (reality) only at the moment that information becomes available for observation by a player and not at the moment of detection by a machine (that would be part of the simulation and whose detection would also be part of the internal computation performed by the Virtual Reality server before rendering content to the player). Guided by this principle we describe conceptual wave/particle duality experiments aimed at testing the simulation theory."

Here's the kickstarter they (successfully) ran to fund some of the experiments: https://www.kickstarter.com/projects/simulation/do-we-live-i...


How do you associate a token holder with an actual person or organization?

Is there a method for doing this built into the protocol, or would that be a responsibility for the implementer?

I agree that transparency could be a great benefit of this technology, but if a "wealthy token holder" can create several puppet accounts with their own tokens, throwing a vote can be made to look "organic". Does DIRT do anything to prevent this?

(Thanks btw, it's great to see you active in the comments.)


I don't quite understand his point. Is it that awareness is an important product of a liberal arts education that shouldn't be undervalued?

"This is Water" speaks to that idea, but it seems disingenuous to claim that a liberal arts education is a particularly effective way of raising awareness. It often seems to take the automatic thoughts that he projects on the audience and replace them with other automatic thoughts, rather than providing tools for people to break that cycle in themselves and other people.

But, if I'm misunderstanding his point, please make me aware. David Foster Wallace is someone that other people I respect show respect to (Jason Wilkes in Burn Math Class), but this speech left me puzzled.


The irony of this comment is that part of his point is that you are at risk for missing his point in the absence of liberal arts education :)

His point is that the liberal arts method of teaching critical thinking is designed to force the student to comprehend and empathically understand the lived experience of other individuals and their concomitant perspective in a given situation. If one is incapable of abstracting their perception from the context, they are forever doomed to only understand the world around them as-is.

The unenlightened fish sees only the world as he or she experiences it, but those equipped with the right cognitive skills are able to recognize the milieu or ether that constitutes the social world. The ability to recognize the "water" empowers the observer to be aware of the current and the current's effect on other fish.

This is his point about controlling one's internal voice in the grocery store line. The unenlightened fish will simply rage against the water, angry that it is muddied by those ahead swimming incorrectly. The enlightened fish understands that each fish is engaged in its own struggle against the current. That struggle is muddying the waters, and recognizing that fact allows the enlightened fish to understand why those ahead are stirring up the sediment. They are simply trying, as the enlightened fish is, to fight the current. Understanding this allows the enlightened fish to hold them blameless. Holding them blameless (or at least understanding them as fish) grants the power to control how one feels about the muddied waters and the responsible fish.

Hope that helps!


Thanks. I'm still not convinced that a liberal arts education is an effective way of fostering this type of thinking as compared to practicing a variety of meditation styles.


This article itself may be paid advertising, perhaps by the Finland marketing board. They even mention VisitFinland.com in the final paragraph.

"Science says quiet is good for you."

"Finland is a place that is quiet."

"You should go to Finland."

It's not even subtle but I didn't see it mentioned in these comments at all.


If it was a paid ad, it was successful, because now I want to go to Finland.


> We assume, too, that face swapping is the end game, but it’s clearly just the beginning.

Isn't the end game an endless stream of personalized content for everyone? Wherein the entire corpus of human-created media becomes a training set for our fantasies.

It is interesting how entertainment is again pushing the boundary of technology. Soon enough this push to make face editing tools for porn more accessible to everyone will allow anyone to:

1) Replace their ex-husband's face in their old family videos with their new husband's face.

2) Create a viral video of Donald Trump murdering someone.

3) Be the star of their favourite movie, porn or otherwise. (What's the effect this would have on people's memories, when they actively see themselves doing everything James Bond does, for instance? Shooting people, being generally powerful, and "getting the girl"?)


Speaking of the effect this would have on people's memories, there's also the potential to use these tools to gaslight [1] someone.

An abuser could make images where a person was at an event they were never at, or with a person they never met.

> "You've totally met Steve before, here's a photo of you with him, how do you not remember?"

An abuser could tear down someone's reality even more effectively than ever before. If they were having an affair with someone they just met, they could claim to be old school friends catching up; just insert the person into an old photo.

Obviously, it's not all bad. There is the potential for this to be used for good as well, but I'm a pessimist.

[1] https://en.wikipedia.org/wiki/Gaslighting


I mean, there's presumably a very short window for that before photo evidence becomes unconvincing to people. The first time a Senator gets "exposed" for some misdeed but proves the evidence is fake, "there's a photo of this" loses its punch. The surrealism of seeing a fake recreation of oneself might have some impact, but we handled ultra-realistic paintings alright.

It does touch on an interesting point, though: we've had roughly 100 years in which photo and audio recreations of events constitute "hard evidence" beyond our ability to fully falsify. It appears that within the next ~20 years we'll lose that reliability - footage of a politician making a dirty deal or a businessman engaging in conspiracy will become deniable not just as a misleading edit, but as outright fabrication.

What do we do at that point? Do smartphone videos get automatically hashed and uploaded to a blockchain somewhere, so that we can prove when the video came into being? Do we return to an 1850s sense of news, where claims effectively cease to be falsifiable except via personal experience? Are we ready for any of this?
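The hashing idea above can be sketched simply. Assuming nothing beyond a SHA-256 fingerprint and a capture timestamp (the function names here are illustrative, not any real protocol), a capture device could produce a compact record that is later anchored to a public ledger:

```python
import hashlib

def fingerprint_video(video_bytes: bytes, captured_at: float) -> dict:
    """Produce a record that could be anchored to a public ledger:
    the hash commits to the content, the timestamp to when it existed."""
    digest = hashlib.sha256(video_bytes).hexdigest()
    return {"sha256": digest, "captured_at": captured_at}

def matches(record: dict, video_bytes: bytes) -> bool:
    """Anyone holding the original bytes can re-derive the hash and
    check it against the anchored record."""
    return hashlib.sha256(video_bytes).hexdigest() == record["sha256"]
```

Note the limits: publishing the hash proves the footage existed by the anchoring time, without revealing it, but it does not prove the footage is authentic, only that it predates any later fabrication.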


In today's political climate, even with incontrovertible evidence, all you need to do is shout out at the top of your lungs: "FAKE NEWS", and it doesn't matter any more.


Giving in to those who shout so is exactly their goal.

We, as technologists, should come up with solutions that allow such shouts to become meaningless.


If you make shouts meaningless, you make screams for help meaningless.


I meant shouts of deliberate misrepresentation.


As national political news to a cynical, disinterested audience, fine.

But that's thinking too small: what happens when every recording of corporate misdeeds, every photo of a cheating spouse, even a child's baby pictures, lose their solidity as evidence? What happens to anonymous online conversations when "pics or it didn't happen" becomes "it didn't happen"?


The other side of that, detection of frauds, also benefits from current work in the ML/AI/etc.

Adversarial algos sound quite interesting. It might mean that we'd have to look at the evidence of fraud in a whole new way though, because the thing that gives away the fake might not be obvious.


> I mean, there's presumably a very short window for that before photo evidence becomes unconvincing to people.

Conversely, the people who want to believe false things (and under the right circumstances that's most of us) will be easier to convince. What is truth anyway?


That's not really true. Maybe fifty years, since cameras reached a quality where touchups would be noticed, compared to a photorealistic painting.


My guess is that signatures will be forged via NN before pictures.


I agree, but I wonder how much this matters? Are signatures still counted as a hard form of identity verification all that often?


Don't shoot the messenger, but here are a couple more techno-morality scenarios:

* A browser extension that detects if the social media profile you're viewing face-matches any revenge-porn that's out there, and serves it to you

* A phone app that undresses people, or [woman < AGE], or whoever, in real-time via AI-guided compositing. Will this be considered as offensive as putting a mirror on your shoe?

* Digital VR girl/boy-friends ala the movie "Her", except with the face, body, and voice of anyone you choose

Suddenly all these things seem very close at hand.


I think it'd probably be safer putting a _lower_ limit on the age for your undressing app.


Yeah, because the people who are going to be virtually undressing minors are going to be stopped by DRM. I mean, seriously, the world is about to get a lot weirder. These tools aren't even hard to use currently, with little to no programming experience. Sure, it's tough to get good results, but that's going to change quickly. How we adjust as a planet and society is going to be a real growth experience ... or utter chaos. Maybe both?


I think we’re going to come to consider these things normal and mundane.

Anyone can see you fake naked at any time. Meh who cares.

Anyone can put you in any random video. Meh who cares.

Anyone can ... meh

We used to think it scandalous/offensive for someone to take photos of us. Now it's just part of being outside. We don't even think about the fact that everyone walks around with a camera.


If it gets good enough that it becomes difficult to identify real security cam footage vs edited footage (which could be easier than you think considering the low quality/resolution of security cameras), there could be serious legal consequences.

What if a suspect's lawyer plays the same CCTV video and shows the arresting officer committing the crime and says, "See. Anyone could have made this evidence." You'd then have to prove chain of custody and it can get incredibly hairy .. but only if you're rich and famous enough to hire those lawyers and make that argument.


We're going to need to digitally sign everything at the time of production to prove they're not forgeries, then.
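A minimal sketch of sign-at-capture, using only Python's standard library. Real camera-signing schemes would use public-key signatures (so verifiers never need the secret), but an HMAC over the media bytes shows the shape of the idea; the device key and function names are purely illustrative:

```python
import hmac
import hashlib

# Illustrative only: real devices would hold a private signing key
# in secure hardware, not a shared secret.
DEVICE_KEY = b"secret-key-burned-into-camera-hardware"

def sign_at_capture(media_bytes: bytes) -> bytes:
    """Tag produced by the device at the moment of capture."""
    return hmac.new(DEVICE_KEY, media_bytes, hashlib.sha256).digest()

def verify(media_bytes: bytes, tag: bytes) -> bool:
    """Any modification to the bytes invalidates the tag."""
    expected = hmac.new(DEVICE_KEY, media_bytes, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)
```

The catch, of course, is key management: the scheme only proves the footage came from a trusted device, and is worthless once the key leaks.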


We already have the capability to do special effects and superposition in real time. My phone can do it with simple shapes in snapchat already, including with proper perspective and depth scaling.

Imagine what will be possible in even 5 years with good hardware. Deep fakes in real time, digitally signed.


I'd imagine that the ability to fake and tell apart fakes will scale with computational resources, and so we will also have progressively stronger signatures that cost more computational power to generate.

This is basically already the premise of PoW -- it's hard to fake out the network, and the chain of hashes shows you exactly how much computational work was put into demonstrating veracity.

This doesn't remove the ability to fake things, but it imposes a price. If you really want to show something is real, dump a bunch of computation into computing hashes.
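That pricing mechanism can be shown with a toy hashcash-style proof of work (the difficulty parameter here is arbitrary): finding a valid nonce costs work in proportion to the difficulty, while checking one is a single hash.

```python
import hashlib

def prove(data: bytes, difficulty: int) -> int:
    """Search for a nonce whose hash with the data starts with
    `difficulty` zero hex digits. Expected cost grows ~16x per digit."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        h = hashlib.sha256(data + str(nonce).encode()).hexdigest()
        if h.startswith(prefix):
            return nonce
        nonce += 1

def check(data: bytes, nonce: int, difficulty: int) -> bool:
    """Verification is one hash, regardless of how hard the search was."""
    h = hashlib.sha256(data + str(nonce).encode()).hexdigest()
    return h.startswith("0" * difficulty)
```

This is exactly the asymmetry the comment describes: anyone can cheaply confirm that computation was spent, but spending it cannot be faked.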


Exactly. I think we're going to stop using video as evidence of anything.


Only if you are in the first world, and even then only for some subgroups of people.

I know of and have read about people committing suicide or being killed because their honor was harmed.

This is not meh for many billions of people.


> This is not meh for many billions of people.

I agree. It’s not meh for many people.

I think that the more common it becomes, the more social mores are going to change to accommodate it. Once upon a time, dressing like people do today was scandalous; now it's not.

You know, like wearing a straw hat too late in the year leading to riots. https://en.m.wikipedia.org/wiki/Straw_Hat_Riot


Well, there was an app named "NameTag" back in 2014 that promised to find pictures online of a potential match [1], and some Russian dude was matching pictures of strangers on the Moscow metro with publicly available pictures (I cannot find the article anymore, unfortunately).

[1] http://www.ibtimes.com/nametag-facial-recognition-app-checks...


You should contact the Black Mirror writers and give them your ideas.


Things are going to get very weird in porn, when you don't have to convince a human to actually do it. I have to assume that early adopters will also be people with predilections which are unserved, or illegal. If people worry about their kids seeing disturbing porn now, imagine when it's AI-generated, photorealistic rape, snuff, child porn. Illegal or not, if it's purely virtual, law enforcement is going to focus on the subset of crimes which involve actual human victims.


> If people worry about their kids seeing disturbing porn now, imagine when it’s AI generated, photorealistic rape, snuff, child porn.

There was a time when it was quite easy to find (without even trying for that specific content) photorealistic rape, snuff, bestiality, and child porn on the public web, without any AI involved.

> Illegal or not, if it’s purely virtual law enforcement is going to focus on the subset of crimes which involve actual human victims.

Actual prosecutions for virtual (generally not photorealistic) child porn in various jurisdictions demonstrate that this is not a hard and fast rule.


Animations or fiction of obscene content are not illegal in the US and Japan. They are illegal in the UK (a man was sentenced over Simpsons porn) and many other countries.

Now with added realism, these lines could become blurry and we could see some of these issues brought up again.


> Animations or fiction of obscene content are not illegal in the US

Citation please. There is nowhere near enough precedent to draw such a conclusion in the US. The defendants in these cases often end up pleading guilty.

US v Hanley, US v Red Rose Stories, etc.


Hmm .. seems things have changed quite a bit since I last read up on this. It seems to vary by state:

https://en.wikipedia.org/wiki/Legal_status_of_drawn_pornogra...


But the legal reasoning right now is that the children are harmed in the making of it, and that the victims, dead or not, suffered through the making and suffer through its continued distribution. I'm sure there'll be some landmark cases soon enough.


I think this progression will do no more than force an existing moral question into the open. What is the moral quality of a thought?

Personally I believe that even unspoken thoughts can have a strong moral dimension for the individual, though of course I see no legal dimension.

One aspect of this will be: does our indulgence of our own negative fantasies weaken our capability to act rightly when presented with a real-world moral choice, and does that make us culpable... or more culpable if we make a wrong choice?


> One aspect of this will be does our indulgence of our own negative fantasies weaken our capability to act rightly when presented with a real world moral choice and does that make us culpable...or more culpable if we make a wrong choice.

It'll be really interesting to see more data come out about this. This concept is pretty much at the core of the video game violence debate which is still somewhat ongoing.


It depends - if it's executive decision-making many hours after playing an immersive game, people figure out what's fantasy and what is not.

I suspect that if it’s 2 seconds after pulling off a VR headset after being in a photorealistic world which had no forced errors then people would be very confused.


> imagine when it’s AI generated, photorealistic rape, snuff, child porn. Illegal or not, if it’s purely virtual law enforcement is going to focus on the subset of crimes which involve actual human victims.

In the US, all of that is already illegal. If you put yourself in a position where what you possess is indistinguishable from the real thing, the courts err on the side of the potential victim.

Law enforcement's priorities are not going to change; they don't distinguish between what's virtual or not. If it looks like CP, you can't point to a producer with valid 2257 documentation, and it isn't obviously a cartoon, then you're cooked.


This will change.

The 2257 law follows the legal reasoning that as a porn producer, you have the burden of proving your innocence. You have to show the proof that the person in your image or video is a real, live _adult_ person, and if you cannot, it is assumed that the person is a real, live _child_ person.

A landmark case will come along where a jury will decide that this new technology introduces reasonable doubt into this thinking. When this happens, the _government_ will then have the burden of proving that the person in the video or image is a real, live _child_ person.


How does "possession" work when everything's in the cloud and instantly accessible to anyone?


If it's in your Dropbox, it's presumed to be yours.

If you upload it to reddit, congratulations, you're no longer a consumer. You're a distributor. New charges apply.

If you just browse, you're sort of safe, but you better hope your browser isn't caching anything to disk. Forensic reconstruction still constitutes possession. But nobody just browses.


You're not "sort of safe" if you browse. The US has ISP reporting laws, as does Australia, South Africa, France and others:

http://chartsbin.com/view/q4y

It's a weird situation because in the UK, they simply block the content (no freedom of speech). In the US, we have freedom of speech, so ISPs can't block anything. But they do have to report if you visit a site that contains illegal material or transmit it, in plain text, through their services.

Child pornography is a strict liability crime, too, so intent doesn't matter. Say you download something from /r/gonewild and the girl is 16, but she looks 20 to you. Too bad. You're in violation of the law and can be put on a sex offender list.

Many people probably have illegal content without even realizing it. That's another reason why encrypting all your devices is so important.


It's going to get very legally interesting when someone puts some child porn into the Ethereum blockchain.


>Illegal or not, if it’s purely virtual law enforcement is going to focus on the subset of crimes which involve actual human victims.

I'm not convinced. https://en.wikipedia.org/wiki/United_States_v._Handley


Will they really? There are already several thought crimes.


If we really had the Holodeck from Star Trek TNG, one of the first five uses would most likely be porn and/or a brothel.


> Isn't the end game an endless stream of personalized content for everyone?

We can keep going. Why would that be desirable? It hits the right chemical buttons in the brain. Drag it out far enough and we're really aiming at being blissed out brains in jars being fed shots of endorphins at the right intervals.


I think I'm partial to a variant of wireheading that sort of linearly shifts our perception of pain and pleasure. Getting an arm hacked off is like a bad headache, normal undesirable things are like a minor ache, normal day-to-day is like a fun night out, and orgasm is like ... I don't know, heroin I guess.


I don't know if the human brain could handle that. If you have too many fun nights out, they start to wear. If you take too many drugs, they start to lose the magic; they just become normal.

You need the highs with the lows.


I suppose if we got to that point, you could replace that portion of the brain with cybernetics, save the state before having taken any drugs then every so often, flash it back to the beginning?


If you haven't read Peter F Hamilton's Commonwealth saga, you might appreciate it. Memory editing plus effective immortality is an interesting concept.


Best expressed by the philosophy of Butters:

https://vignette.wikia.nocookie.net/southpark/images/b/b3/A_...



> ex-husband's face in their old family videos with their new husband's face

Calm down Charlie Brooker.


Shut up Nathan Barley.


The technology for all of these exists, it's just a matter of motivation.

see: https://www.youtube.com/watch?v=ttGUiwfTYvg


People have been able to put celebrity faces on porn photos for decades.

I don't see how this is significantly different.

We could create a viral picture of Trump killing someone now.


Number 3 is really exciting to me. Think about every movie making you the star of the action. That'd be insane!


Play any blockbuster FPS from the last decades to get a taste of this "future".


I really think it'd be more fun to watch it in some ways. Like in FPS I'm typically "anonymous guy" or "random name given by the creators". If it was me watching myself in some random movie I think it'd be pretty awesome.


I made a game that uses SDFs for a game jam. Here's a video:

https://www.youtube.com/watch?v=EEjee1CRko0

The game designs that SDFs allow haven't been explored fully, so I'm picking away at it game jam by game jam. ;)
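For anyone unfamiliar with the term: a signed distance function returns how far a point is from a shape's surface, negative inside and positive outside, and that one property is what makes blending and collision in SDF-based games cheap. A minimal 2D illustration (a circle, plus the well-known polynomial smooth-union used to merge shapes; the smoothing constant is arbitrary):

```python
import math

def sd_circle(px, py, cx, cy, r):
    """Signed distance from point (px, py) to a circle of radius r
    centered at (cx, cy): negative inside, zero on the boundary,
    positive outside."""
    return math.hypot(px - cx, py - cy) - r

def smooth_union(d1, d2, k=0.5):
    """Polynomial smooth-min: blends two distance fields so shapes
    merge with a rounded seam of width roughly k."""
    h = max(0.0, min(1.0, 0.5 + 0.5 * (d2 - d1) / k))
    return d2 + (d1 - d2) * h - k * h * (1.0 - h)
```

Because the whole scene is just a function from points to distances, tricks like this blending (or infinite repetition, or morphing between shapes) fall out almost for free, which is presumably the design space the parent is picking away at.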


This is awesome! I wanted to do this for a procedural generation algorithm during a game jam but didn't have the time to figure it out.


Hey Zach, I'm in a similar boat to you, although less prolific. I read your post as saying "I'm trying to figure out what to do next", and this is some food for thought in that regard. Personally, moving forward I intend to do what you call a "project deep dive", but from a slightly different angle.

There was another highly upvoted post on HN last week titled "Things I’ve Learned from Reading IndieHackers" [0]. The article itself had a lot of interesting advice, distilled from interviews with people who had taken side projects and turned them into profitable businesses.

In the comments, soneca had some insightful additions [1]. And we are increasingly in an attention-driven economy [2]. But while attention is being widely harvested through exploiting addictive tendencies [3], that does not motivate me. Reading Hooked [4] was interesting, but also terrifying.

For me, bringing this together means my course of action is as follows: Build an audience around content which enriches the lives of people. Way easier said than done. And some may write it off as hopelessly naive.

But while Facebook and friends scour the dopamine landscape for user attention, building an audience around long-term content that challenges and enriches the lives of people seems wonderfully contrarian to me. And it's in that contrarian tenet that I find solace. Take the slow and the long road, and you can build a sustainable source of value.

In the grand scheme of things, spending a few years of my life creating short interactive games which explore philosophy and society is the path that makes the most sense to me. Like I said, food for thought in your journey forward.

[0]: https://news.ycombinator.com/item?id=14803468

[1]: https://news.ycombinator.com/item?id=14803468#unv_14804395

[2]: https://medium.com/the-mission/the-enemy-in-our-feeds-e86511...

[3]: http://www.paulgraham.com/addiction.html

[4]: https://www.amazon.ca/Hooked-How-Build-Habit-Forming-Product...

