Hacker News | jitl's comments

People want their programs to go as fast as possible. WASM can be faster than writing JS, but it still trails actual native code by a wide margin, especially if you want to do I/O.

Look into libeatmydata with LD_PRELOAD. It intercepts fsync and other durability syscalls and turns them into no-ops, which is fabulous for CI. Materialize.com uses it in their CI; that's where I learned about it.
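A minimal sketch of how it's used, assuming libeatmydata is installed (package name varies by distro) and with `./build-and-test.sh` as a hypothetical stand-in for your CI step:

```shell
# Run a CI step with fsync()/fdatasync()/sync() turned into no-ops,
# so every disk write skips the durability wait.

# Using the eatmydata wrapper script, which sets LD_PRELOAD for you:
eatmydata ./build-and-test.sh

# Equivalent manual form, preloading the shim for a single command
# (and anything that command spawns):
LD_PRELOAD=libeatmydata.so ./build-and-test.sh
```

If the shared object isn't on the default library path, ld.so just warns and runs the command normally, so it degrades safely, but your CI won't get the speedup.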

This system allows playing unmodified production x86 executables on arm64. It doesn’t have anything to do with the developers.

That's great, but it raises the question: why not just compile the games for ARM?

Because this works for the enormous back catalog of games that already exist, many of which I bet the companies no longer have the code or a working build system for. And for new games, it doesn't require the developers to do anything, because many (most?) of them wouldn't bother.

They may provide an option for developers to distribute a native ARM build (which some are already building for Quest titles that can be brought over to Steam Frame), but one of Steam's main advantages is its massive x86 games catalog, so they certainly don't want to require that.


So Valve won't need to convince developers to do anything expensive and old games will also work. Most games on Steam Deck aren't tested by the original developer at all.

Windows on ARM games are extremely rare. Linux native means dealing with Linux desktop APIs and poor support in commercial engines.


You need to convince all developers that all 117,881 Steam games need to be recompiled for ARM. Hopefully they have a working build environment, have appropriate libraries built for ARM, still have the source code, and are able to do the testing to see whether the same code works correctly on ARM.

Think back to the x86 32->64 bit transition, but much worse, since ARM is more niche and there are more arch differences.

You need all your 85 3rd party middlewares and dependencies (and transitive dependencies) to support the new architecture. The last 10% of which is going to be especially painful. And your platform native APIs. And your compilers. And you want to keep the codebase still working for the mainstream architecture so you add lots of new configuration combos / alternative code paths everywhere, and multiply your testing burden. And you will get mystery bugs which are hard to attribute to any single change since getting the game to run at all already required a zillion different changes around the codebase. And probably other stuff I didn't think of.

So that's for one game. Now convince everyone who has published a game on Steam to take on such a project, nearly all of whom have ages ago moved on and probably don't have the original programmers on staff anymore. Of course it should also be profitable for the developer and publisher in each case (and more profitable & interesting than whatever else they could be doing with their time).


It's a chicken-and-egg problem: a lack of ARM PCs due to software support, and a lack of software support due to negligible market share.

The same argument can be applied to Linux: why not just compile the software for Linux? It's not that most companies couldn't do it; it's just not worth the hassle for 1-3% of the user base. The situation with Linux also demonstrates that it's not enough to have just the OS plus a few dozen games/programs whose ports a hardware company sponsored; not even support for 10 or 30% of software is enough. You need support for 50-80% of software before people will consider moving. A single missing program is enough reason for people to reject moving to a new platform.

The only way to achieve that is for a large company to take the risk and invest in both: build modern hardware and also build an emulation layer to avoid a complete lack of software. The emulator makes the platform barely usable as a daily driver for some users. With more users, it makes sense for developers to port their software, resulting in a positive feedback loop. But you need to reach a minimum threshold for that to happen.

Compiling for ARM isn't the biggest issue by itself. You also need all the vendors of third-party libraries you use to port them first, which in turn might depend on binary blobs from someone else again. Historically, backwards compatibility has been a lot more relevant on Windows, but that's also a big weakness when migrating to a new architecture: a lot more third-party binary blobs for which the developers of the final software don't have the source code, possibly somewhere down the dependency tree rather than at the top, and a lot more users running ancient versions of software. It's also more likely that there are developers sitting on old versions of Visual Studio, compared to macOS.

Compare this with how the Apple silicon migration happened:

* Releasing a single MacBook model with the new CPU is a much bigger fraction of Mac hardware market share than releasing a single Windows laptop with an ARM CPU.

* Apple had already trained both developers and users to update more frequently. Want to publish your software in the Apple App Store? It needs to be compiled with at least Xcode version X, targeting SDK version Y. Plenty of other changes forced most developers to rebuild their apps and users to update, so that their apps would keep working without workarounds or wouldn't stand out (Gatekeeper and code signing, notarization, various UI style and guideline changes).

* Xcode, unlike Visual Studio, is available for free, so there is less friction migrating to new Xcode versions.

* More frequent incremental macOS updates compared to major Windows versions.

* At the time of the initial launch, a large fraction of macOS software worked with the help of Rosetta, and a significant fraction received native ports over the next 1-2 years. It was quickly clear that all future MacBooks would be ARM.

* There are developers making macOS-exclusive software whose selling point is that it's macOS native, using native macOS UI frameworks and following macOS conventions. Such developers are a lot more likely to quickly recompile their software for the latest version of macOS and Mac computers, or make whatever changes are necessary to fit in. There is almost no Windows software whose main selling point is that it is Windows native.

* Apple users had little choice. There was maybe one generation of new Intel-based Apple computers sold in parallel with ARM-based ones, and there are no other manufacturers making Apple computers with x86 CPUs.


...because there are thousands upon thousands of games that will never be compiled for ARM?

Just look at all the "native macOS" games from the 2010s that are completely unplayable on modern Macs. Then look at all the Windows games from the 1990s that are still playable today. That's why.


You’d run FEX with WINE/Proton; no Windows needed. If you did use a VM, I’d think it would be a Linux VM. But a Linux VM on macOS could already use Apple’s Rosetta 2 for x86_64-to-arm64 translation.

Speaking of which, maybe you could just run the games with Apple’s WINE-based “game porting toolkit” directly with Rosetta 2. Worth a Google.

EDIT: indeed, you can already play x86 windows games on Mac using software written by Apple: https://gist.github.com/Frityet/448a945690bd7c8cff5fef49daae...


There are like 100x more JS developers than C# developers. JS can also run code very quickly, whereas with an AOT language you need to AOT compile it first. For tool calls, eval-as-a-service, and running in the browser, JS is far ahead of C#.

So, everyone who can hack some JS is now a developer? The '100x' claim is obviously exaggerated. C# is certainly one of the most used programming languages there is.

You can also run C# code very quickly, and you have the option (but not the need) to AOT compile it. I would say the only real edge JS has is the ability to run natively in the browser. It was built for that purpose and, in my opinion, that is where it should have stayed.


It starts fast and does a better job than Node.js for their product.

For a 98%-read use case like a blog, SQLite is ideal.

run computer programs probably

Also, hot air rises, and the vent is on the bottom.

Why are they still even in the cases? I would assume that shucking them would ever so slightly improve thermals.

Perhaps the aluminium case is designed in as a heat sink? But I'm not sure that's as relevant in an actively cooled DC setting as it is for a passive unit buried under cables at a home workspace!

We were a little team doing the API dev, the infra, the hardware, and countless other things. The product is improving, and it was impossible to even imagine removing the cases, as we spent so much time wiring everything, going to the DC, and setting everything up. Yes, the team actually does all of this, even going to the DC to install the Macs in the racks.

But I agree that at a bigger scale, this is a good solution (cf. GitHub).


Agreed, removing the case is a lot of effort. GitHub does a lot of fancy things, but you might also want to consider how much they charge by the minute.

Our way is quite efficient, and we're able to quickly adapt to new HW generations.


Had a similar problem with a T43 HDD.

Fans can move a lot more air than convection.

idk how we can blame some JavaScript and html inside Firefox causing a Wayland crash as Discord’s fault. They’re like 9000 layers of abstraction away from whatever SIGSEGV caused the crash

> idk how we can blame some JavaScript and html inside Firefox causing a Wayland crash as Discord’s fault

I don't see anyone talking about a Wayland crash, it's about Discord crashing.


Whoops, I thought I was replying inside this thread tree: https://news.ycombinator.com/item?id=46059256

> I tried Wayland earlier today on my home lab with Plasma and FreeBSD. It seemed pretty great for a bit and ran my monitor at 120Hz.

> Until it hard crashed my machine after I opened discord in firefox. Konqueror crashed on opening.

I lost track of the indentation on my phone


> inside Firefox

I assume others are talking about the standalone client. Which, to be fair, I assume is also an Electron app but that's Chromium, not Firefox.

