Hacker News | nasmorn's comments

Also 220 is in the same ballpark as going to two movies as a family with snacks. Three would already be a stretch.

I really liked GitHub and I would also pay more for it, but that does not seem to be a priority. On Safari the whole PR review is barely usable any longer because of bad performance, without gaining any new features that I can discover. Obviously a lot of man-hours went into ruining the product, but I can’t understand why.

On the upside, I guess they use git internally as well, so maybe they could just find a usable commit and revert all the crap they changed.

Very fittingly, in German one can describe messing something up as vergurken, which basically means to cucumber something. I assume the good people at the University of Tsatziki are very qualified to teach about that.

Highly entertaining.

In Austria, at least, you cannot even charge your car with properly priced electricity unless you have your own outlet. At some chargers it is more expensive to charge through the night (because of blocking fees that kick in after 4-5 hours), even though that is basically always when we have the lowest prices.

I think you are off by about 3 orders of magnitude, as my Austrian flat needs about 7 MWh a year for heating and 3 MWh of electricity. I could generate 24 kWh per year on an indoor bicycle.
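
For a quick back-of-the-envelope check of that gap, here is a minimal Python sketch; the consumption numbers are the ones from the comment, and the 24 kWh/year bicycle figure is my own rough estimate rather than a measurement:

    import math

    # Back-of-the-envelope check: total annual consumption of the flat
    # versus what an indoor bicycle could plausibly generate in a year.
    heating_kwh = 7_000      # ~7 MWh/year of heating
    electricity_kwh = 3_000  # ~3 MWh/year of electricity
    bicycle_kwh = 24         # ~24 kWh/year pedalled on an indoor bicycle (rough estimate)

    ratio = (heating_kwh + electricity_kwh) / bicycle_kwh
    print(f"factor {ratio:.0f}x, roughly {math.log10(ratio):.1f} orders of magnitude")
    # -> factor 417x, roughly 2.6 orders of magnitude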

The industry has been trying to use generative tools for assets since forever: textures, terrain, foliage, and eventually even parametric human models. I don’t see transformers as much different.

Oh, I do. Because with a tree-generation library the trees do look random (they are, because the algorithm has a random component).

"AI" assets all look the same.


This observation seems innately vulnerable to sampling error and confirmation bias. "Good" generative AI almost by definition is difficult to notice.

Then I've never seen any of that :) Graphics or text.

Or you don't know you did

The stochastic nature, minimal amount of care required to get something going, and the inefficiency, just to name a few.

I should probably do that too. I once wrote an email that, to me, was just filled with impersonal information. The recipient was somebody I did not know personally. I later learned I made that person cry, which I obviously did not intend. I did not swear or call anyone names; I basically described what I believed they did, what was wrong with that, and what they should do instead.

I hear you. LLMs have been a major improvement in my life in that regard.

If someone cries about an email you sent, the problem isn’t with you.

It is the large landmass covering about half of the northern hemisphere.

I get the joke about OP's is/in typo, but even then you answered "what" instead of OP's "where".

It’s miscommunication all the way down

AI doesn’t program better than me yet. It can do some things better than me, and I use it for those, but it has no taste and is way too willing to write a ton of code. What is great about it compared to an actual junior is that if I find out it did something stupid, it will redo the work super fast and without getting sad.

Too willing to write a ton of code - this is absolutely one of the things that drives me nuts. I ask it to write me a stub implementation and it goes and makes up all the details of how it works, 99% of which is totally wrong. I tell it to rename a file and add a single header line, and it does that - but throws away everything after line 400. Just unreliable and headache-inducing.

Amazon raised like 10 million. People complained about the lack of dividends, but that was at least money earned.
