
Previous discussions:

https://news.ycombinator.com/item?id=39884821 195 points | 2 years ago | 66 comments

https://news.ycombinator.com/item?id=23902124 196 points | 5 years ago | 100 comments


Same results for me. Absolutely awful: my vision consistently began failing, becoming noticeably blurry about 8 to 9 hours after taking the night lenses out, and I couldn't drive at night because of headlight and streetlight halos, even after "topping off" with those uncomfortable lenses during the day. As an enthusiastic night sky observer, I found trying to use those lenses depressing.

I gave up after extended tries with three different lenses (I think it was six to nine months total), with my highly experienced doctor consulting with different manufacturers and researchers from around the country. Turns out my pupils naturally open up too wide, made worse by corneas that apparently are not thick enough to retain the reshaping all day. These issues, incidentally, make me ineligible for the popular cut-n-burn style of eye surgery.

On the bright side, it was indeed completely reversible and I've suffered no effects of any kind after about two days of non-use. That was a bit over a decade ago.


Not completely unheard of, but I get your point :). The animations for Babylon 5's pilot (and, I believe, the opening credits) were rendered in 1993 on sixteen souped-up A2000s, each with 32 MB of RAM.

https://www.generationamiga.com/2020/08/30/how-24-commodore-...


That's pretty cool. I know 32 megs was technically possible with the right boards, I just didn't know any normal person who had one. I had an A3000 with 5 megs (4 megs fast, 1 meg chip) and I thought it was badass for the time.


I had a CyberstormPPC with a 604e/200MHz and a 68060/50MHz, and 128MB of RAM onboard. There was also a DKB3128 with another 128MB of RAM.

“Big” Amigas weren’t common, but they definitely existed.

Oh, and I was a college student at the time.


The CyberstormPPC was $1000 when it came out in 1997. That's more than a Pentium 200, a good motherboard, case, sound card, graphics card, and 3Dfx accelerator combined. 128 MB of RAM was $400-800, and not even top-end systems shipped with that much.


Yes, but I didn’t want a Pentium 200. I wanted a fast Amiga.


That must've been an awesome machine. You were a god among Amigans!


I had moved on to Linux / x86 by that point.


> The AI race seems more defined by spending more money than the other guy, regardless of results.

Reminds me of the much-vaunted, then widely maligned and derided, burn rate metric of the dot-com bubble.


> You can always get a plugin if something is missing.

To my great consternation, I have not found this to be true in the cloud version:

https://jira.atlassian.com/browse/JRACLOUD-72631

Special thanks to Matt Lachman for keeping up the good fight every (business) day.


Huh - that seems like a very basic feature to be missing from the cloud version. We use bog-standard self-hosted JIRA, and markdown editing is basic working functionality. People also add Mermaid diagrams/charts to issues, as well as custom diagram plugins, Excel sheets, and a whole gamut of documents.


Words mean things. Please don't call it "unlimited" if you limit it.


> (After 8 months they told me to pick up all my gear, they found nothing, but thanks for traumatising my kids)


Don't forget yourself, the breadwinner of the household!


From TFA, they're using LIDAR results to identify features that suggest indigenous farming practices. Researchers are doing the same in tropical rainforests around the world, where there's far more vegetation, and finding similar evidence of intensive agriculture.

I highly doubt weeds, extensive though they might be, would wipe clean the evidence they've found in the landscape.

> I find it very hard to believe that we can find evidence of intensive cultivation after 3,600 years in such a wet area.

Perhaps I'm missing something. I'm no expert, and have merely skimmed through, but the earliest date I could find in the PDF linked from the fine article was 400 BCE [0], so around 2400 years. That's still a lot, but definitely not 3600 years.

[0] "While there is evidence of maize in the Upper Peninsula as early as 400 BCE (7), intensive cultivation, like we clearly see at Sixty Islands is typically not undertaken until roughly 1000 CE."

https://www.science.org/doi/suppl/10.1126/science.ads1643/su...


There are no native water buffalo in Alaska. Are you perhaps thinking of American bison?


> There are no native water buffalo in Alaska.

Yeah, that just shows you how vicious those mosquitoes are!


> Everyone loves pie!

Oh gosh, no! Count me among those who greatly dislike pie charts in almost every context.

"Almost never use a pie chart for data"

https://theconversation.com/heres-why-you-should-almost-neve...

https://news.ycombinator.com/item?id=38912534


In the early 1990s I worked at a mid-sized software company making software to help big businesses do big-business stuff. One day, another programmer pops into my cube and says:

"Hey, how do you draw a 3d pie chart?"

"What?" I asked. "Why?"

"Well, Excel can do them. And somebody saw one. Now they want our software to draw them."

"Are you serious?"

"Yes, I need it like now. I'm supposed to demo it later today."

So I get out a sheet of paper, draw some triangles, and work out the projection math. He walks away with the paper. Half an hour later, he calls me over.

"Hey, it's a 3d pie chart!"

And there it was: On a screen I was all too familiar with, where the 2d pie chart used to be, was a squat 3d pie chart, looking like it was a fat inch thick. Of course, there was too much margin above and below, because of the flatter aspect ratio, but hey, it was 3d and it was good enough for a demo.

I think that was the first day I realized that programming can be used for evil.
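For anyone curious about the "sheet of paper" math: the cheap trick is to keep the ordinary 2d wedge, squash its y coordinates into an ellipse, and extrude it downward a few pixels. Here's a minimal Python sketch of that idea; the names, the squash factor, and the pixel sizes are my own assumptions, not the code from the story:

  import math

  def pie3d_slice(cx, cy, r, start_deg, end_deg, squash=0.5, thickness=10, steps=32):
      """Return (top_outline, bottom_outline) point lists for one slice.

      The top face is just the 2d wedge with its y coordinates scaled by
      `squash`, which turns the circle into an ellipse; the bottom face is
      the same outline shifted down by `thickness` pixels.
      """
      top = [(cx, cy)]  # wedge starts at the pie's center
      for i in range(steps + 1):
          a = math.radians(start_deg + (end_deg - start_deg) * i / steps)
          x = cx + r * math.cos(a)
          y = cy - r * math.sin(a) * squash  # the squash does the "3d" tilt
          top.append((x, y))
      bottom = [(x, y + thickness) for (x, y) in top]
      return top, bottom

  # Example: a 40% slice starting at 90 degrees.
  top, bottom = pie3d_slice(160, 120, 100, 90, 90 + 0.4 * 360)

Paint the bottom outline first, then the side quads between matching top/bottom points, then the top face, and you get exactly the squat extruded look described above, with the extra margin coming from the flattened aspect ratio.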


3d pie charts are literally the devil.

It's 3d pie charts, dot charts, pie charts, and any chart that has the origin set to greater than 0, in descending order of evilness.

We need a ## anniversary edition of How to Lie with Statistics.


If your uptime (or your percentage score in a class) is best understood/consumed on a chart with an origin of zero, you’re having a very bad time.


If you’re comparing your uptime before and after a fix, you’re lying to everyone by not showing the origin.

Also, why are people looking at uptime unless there was an outage? At which point you do need to show zero anyway.


What you should really track and chart is your downtime.
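To make this subthread concrete, here's a tiny worked example (all numbers assumed) of why an uptime chart with a non-zero origin misleads: the apparent ratio between the bars depends entirely on where you clip the axis, whereas downtime has a natural zero origin.

  uptime_before, uptime_after = 99.50, 99.90  # percent, assumed figures

  # On an uptime axis clipped at `clip`, the visual height ratio of the
  # "after" bar to the "before" bar is arbitrary:
  for clip in (99.0, 99.4, 99.45):
      visual_ratio = (uptime_after - clip) / (uptime_before - clip)
      print(f"axis starting at {clip}: 'after' bar looks {visual_ratio:.1f}x taller")
  # axis starting at 99.0:  1.8x
  # axis starting at 99.4:  5.0x
  # axis starting at 99.45: 9.0x

  # Downtime, charted from zero, tells one consistent story:
  print((100 - uptime_before) / (100 - uptime_after))  # 5.0x less downtime

The downtime chart's 5x is a real quantity; the uptime chart's ratio is an artifact of wherever you chose to start the axis.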

