Hacker News | karmakaze's comments

I thought it might be AI but the composition is so much worse.

I like how odd it looks on the whole--it captures the moment we're in.


I've lost extended attributes of files on a number of systems because they aren't always included by default in file operations. I don't trust everything that could possibly write/update a file to preserve them--a common 'safe' pattern is to rename the original as .bak and write the new contents (without the EAs) to the original name. Nor do I trust myself to use the right options when archiving in a hurry.

Yes concerning. The possibility of losing them between filesystems is worrying too.

One rule of thumb is that `mv` will keep the attributes by default (given a similar filesystem), while everything else needs tweaking/extra args. There's a section on it over here:

https://wiki.archlinux.org/title/Extended_attributes#Preserv...
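
For what it's worth, here's a minimal sketch of the same idea in Python (Linux-only, since os.listxattr/os.getxattr/os.setxattr aren't available everywhere; the helper name is made up) showing what the rename-and-rewrite pattern has to do explicitly for the EAs to survive:

    import os

    def rewrite_preserving_xattrs(path: str, new_contents: bytes) -> None:
        """Rewrite a file via a temp file, carrying its extended attributes over."""
        tmp = path + ".tmp"

        # Capture the original file's extended attributes before touching it.
        xattrs = {name: os.getxattr(path, name) for name in os.listxattr(path)}

        # Write the new contents to a temporary file first.
        with open(tmp, "wb") as f:
            f.write(new_contents)
            f.flush()
            os.fsync(f.fileno())

        # Re-apply the attributes, then atomically replace the original.
        # Skipping this loop is exactly how the EAs get silently dropped.
        # (user.* attributes work unprivileged; security.* etc. may not.)
        for name, value in xattrs.items():
            os.setxattr(tmp, name, value)
        os.replace(tmp, path)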


But what's the market share of foldables vs non-foldables? That's the real question. The iPhone Air has 100% of the super-thin (excluding camera bump) phone segment, but they still cut production soon after launch.

Estimates are 1.5% to 1.6%, and I heard one even forecast "nearly 5% by 2028".


> “We need a User Service”

This is an XY problem statement. We need Y to do X (the following):

> “We need these operations on user data: create, read, update, delete, authenticate, authorize”
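
A minimal sketch of the difference (hypothetical Python names, nothing from the thread): the requirement is the set of operations; whether they all live behind one "User Service" is a separate decision.

    from typing import Protocol

    class UserOperations(Protocol):
        # The stated need: operations on user data. How these get grouped
        # into services/components is a follow-on design decision.
        def create(self, user: dict) -> str: ...
        def read(self, user_id: str) -> dict: ...
        def update(self, user_id: str, fields: dict) -> None: ...
        def delete(self, user_id: str) -> None: ...
        def authenticate(self, credentials: dict) -> str: ...
        def authorize(self, token: str, action: str) -> bool: ...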


> This is an XY problem statement

no, it isn't. we need both, they're different aspects of the same thing. hyperfocusing on the former and disregarding the latter is just as bad as doing the inverse, and is exactly the problem i was describing.


Leaving him there was an even bigger mistake that Apple allowed and never corrected--Alan Dye had to correct that himself. Any dis post-departure only points the blame back at Apple's management.

Or PReV. Amazing project!

I remember trying to get NeXTSTEP 3.3 running on x86 hardware. It was so fussy about which hardware it supported that I had to take apart 3 computers from the office as well as a personal (not mine) Everex cube PC. (That's just what ended up being used; I'd taken apart way more machines, trying and failing, with so many cuts on my hands.) Then there were so many precise moments where you had to hit keys, eject the floppy, or do other hardware shenanigans that it felt like playing Dragon's Lair.

Was finally able to get it to boot to 2-bit grayscale on a DECpc with a LocalBus video card and some kind of SCSI drives. [After a few days, I had to return the parts to the users' PCs]

The NeXTSTEP desktop was nice. Interface Builder, though, blew my mind.


I don't know if it's only because he left, but there are stories about how bad he was in his position for such a long time. If that's true, Apple must be blamed for keeping him there--until he voluntarily left. WTF?

TBH I don’t know much about the Mac ecosystem except that I’ve been using a couple of MacBook Pros for work for the past 5 years. My humble experience says hardware is easy but software is hard. It might be counterintuitive, but that’s honestly what I’ve felt.

With hardware there are only so many sanely quantifiable ways someone might use, abuse, or hack up your product. And you don’t have to care about some or all of them. Someone might desolder an Apple silicon chip successfully and do something neat with it, but they’re unlikely to use it to power an MRI machine.

But software - even inside the business that makes an application - people will still find entirely surprising, realistically unpredictable ways to use it. Let alone the customers/users/tinkerers.

At a former place I worked, we had one customer who was smart enough to be technically correct about how our software worked, and they used it in the most insane manner any of us had seen, one no one had ever contemplated and that wasn’t even sane to test manually or with automation. (I’m being a bit vague because it’d be very identifiable, broadly and specifically.) Eventually we had to say “yes, you can use it this way, but you’d end up paying far more than you should and the experience is going to be awful.” (Even sales agreed on the former!)


Yup, if we're in a simulation pretty much all bets are off. The Mandela Effect could merely be update patches. It could even patch in a proof that our world is not a simulation.

I've only used a few integrated productivity apps and generally found them less useful to me than less integrated ones. The clearest comparison is Notion, which has so many ways to store information that I'm always misplacing knowledge. The places where I can effectively store and retrieve knowledge are Google Drive/Docs or GitHub projects/issues/PRs.

I also tend not to like the individual parts of an integrated productivity tool, but everyone is expected to use most everything. Basically the bundled MS Teams problem. Integrated productivity isn't selling you the best of each; they're selling you the best integration of maybe-not-your-favorite tools. Just my hot take, YMMV.


Yes. I don't think Linux would have succeeded if written in a language other than C. Today is a different story.

Yes, it matters to me as an end user whether my web browser is more or less likely to have vulnerabilities in it. Choice of programming language has an impact on that. It doesn't have to be Rust; I'd use a browser written in Pony.

If I were making something that had to be low-level and not have security bugs, my statement would be:

> I’m not smart enough to build a big multi-threaded project that doesn't have vulnerabilities in a manually memory-managed language. I want help from the language & compiler.

The size and longevity of the team matters a lot too. The larger it gets the more problematic it is to keep the bugs out.

