What type of compression would change the relative scale of elements within an image? None that I'm aware of, and these platforms can't really make up new video codecs on the spot since hardware-accelerated decoding is so essential for performance.
Excessive smoothing can be explained by compression, sure, but that's not the issue being raised there.
Neural compression wouldn't be like HEVC, operating on frames and pixels. Rather, these techniques can encode entire features and optical flow, which would explain the larger discrepancies: larger fingers, slightly misplaced items, etc.
Neural compression techniques reshape the image itself.
If you've ever input an image into `gpt-image-1` and asked it to output it again, you'll notice that it's 95% similar, but entire features might move around or average out toward the model's concept of what those items are.
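To make the "features, not pixels" point concrete, here's a minimal toy sketch of latent-space compression with an untrained convolutional autoencoder. This is not any platform's real codec; it just shows the shape of the idea: the stored representation is a small learned feature grid, and pixels are re-synthesized from it, which is why small details can drift.

```python
# Toy sketch of latent-space ("neural") compression, NOT any real platform's codec.
# The image is stored as a compact feature grid, and the decoder re-synthesizes
# pixels from that concept-level representation, so details can drift on decode.
import torch
import torch.nn as nn

class TinyNeuralCodec(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: 256x256 RGB frame -> compact latent grid (the "bitstream").
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 8, 4, stride=2, padding=1),   # 8 x 32 x 32 latent
        )
        # Decoder: latent grid -> re-synthesized pixels.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(8, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        latent = self.encoder(x)      # what you'd quantize and store
        return self.decoder(latent)   # what the viewer actually sees

codec = TinyNeuralCodec()
frame = torch.rand(1, 3, 256, 256)    # stand-in for a video frame
reconstruction = codec(frame)
print(frame.numel(), "pixels ->", codec.encoder(frame).numel(), "latent values")
```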
Maybe such a thing could exist in the future, but I don't think the idea that YouTube is already serving a secret neural video codec to clients is very plausible. There would be much clearer signs - dramatically higher CPU usage, and tools like yt-dlp running into bizarre undocumented streams that nothing is able to play.
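You can also just look at what's actually offered to clients. A quick sketch using yt-dlp's Python API (the video URL below is a placeholder) that lists the advertised video codecs; today this comes back as the familiar avc1/vp9/av01 ladder, not anything undocumented:

```python
# Sanity check of what codecs YouTube advertises to clients,
# via yt-dlp's Python API. The URL is a placeholder.
import yt_dlp

url = "https://www.youtube.com/watch?v=EXAMPLE_ID"  # placeholder video ID

with yt_dlp.YoutubeDL({"quiet": True}) as ydl:
    info = ydl.extract_info(url, download=False)

codecs = sorted({f.get("vcodec") for f in info["formats"]
                 if f.get("vcodec") not in (None, "none")})
print(codecs)  # expect avc1 / vp9 / av01 variants, nothing exotic
```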
If they were using this compression for storage at the cache layer, it could let them keep more videos closer to where they serve them, but they'd decode back to WebM or whatever before sending them to the client.
I don't think that's actually what's up, but I don't think it's completely ruled out either.
That doesn't sound worth it. Storage is cheap and encoding video is expensive; caching videos in a more compact form but then having to re-encode them into a different codec every single time they're requested would be ungodly expensive.
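A rough back-of-envelope comparison makes the asymmetry obvious. Every number here is an assumption for illustration, not a measured figure:

```python
# Back-of-envelope comparison (all numbers are rough assumptions, not measured):
# keeping a ready-to-serve encode vs. re-encoding a compact cached copy per view.
storage_cost_per_gb_month = 0.02   # assumed object-storage pricing, $/GB/month
video_size_gb = 0.5                # assumed ~10 min of 1080p VP9
views_per_month = 10_000

# Option A: just keep the ready-to-serve encode around.
storage_cost = video_size_gb * storage_cost_per_gb_month

# Option B: keep only a compact copy and re-encode on every request.
cpu_cost_per_hour = 0.05           # assumed commodity vCPU pricing
encode_hours_per_view = 0.25       # assumed ~15 CPU-minutes per 10-min 1080p encode
transcode_cost = views_per_month * encode_hours_per_view * cpu_cost_per_hour

print(f"store ready-to-serve copy: ${storage_cost:.2f}/month")
print(f"re-encode on every view:  ${transcode_cost:.2f}/month")
```

With these made-up but plausible numbers, re-encoding per view is thousands of times more expensive than simply storing the extra copy.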
The law of entropy appears to hold for TikToks and Shorts: the content becomes so generic that it all merges into one. It would make sense to take advantage of that.
A new client-facing encoding scheme couldn't use the hardware decoders in people's devices, which would slow down everyone's experience, chew through battery life, etc. They won't serve it that way - there's no hardware support in the field for it.
It looks like they're compressing the data before it gets further processed with the traditional suite of video codecs. They're relying on the traditional codecs to serve, but running some internal first pass to further compress the data they have to store.
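If something like that existed, the shape would be roughly this: any neural/latent step lives only on the storage side, and the client still gets an ordinary codec. A sketch under that assumption; `decode_from_internal_format` is a made-up placeholder, and only the ffmpeg invocation producing a standard VP9/WebM stream is real:

```python
# Hypothetical "compress internally, serve conventionally" pipeline shape.
# decode_from_internal_format() is a made-up placeholder for whatever compact
# representation sits at the storage/cache layer; the client-facing output is
# still produced by a standard codec (VP9 via ffmpeg here).
import subprocess

def reencode_for_client(raw_frames_path: str, output_path: str) -> None:
    """Turn decoded raw frames back into an ordinary WebM/VP9 stream for clients."""
    subprocess.run(
        [
            "ffmpeg", "-y",
            "-i", raw_frames_path,   # e.g. a .y4m of decoded frames
            "-c:v", "libvpx-vp9",    # standard, hardware-decodable on the client
            "-b:v", "2M",
            output_path,
        ],
        check=True,
    )

# raw = decode_from_internal_format("cache/video123.latent")  # hypothetical step
# reencode_for_client(raw, "serve/video123.webm")
```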
The resources required for putting AI <something> inline in the input (upload) or output (download) chain would likely dwarf the resources needed for the non-AI approaches.
For real though, Meta flushing all that money down the drain is keeping VR as a whole hanging on by a thread. I'm not sure whether it survives if they pull back to focus on Google Glass 2 or whatever.
Omarchy is the passion project of a really wealthy person and is backed by his profitable business. What does ‘sponsoring Omarchy’ mean? Like.. where does that money go?
I think it amounts to providing free premium CDN service, the stuff you'd usually have to pay for. They didn't say anything about cash money changing hands.
That’s really reasonable then (I guess apart from any disagreements with the author’s views). Omarchy isn’t just a post-installation script; they have the entire thing bundled as an ISO, so I can see why an in-kind sponsorship of a CDN makes sense. Although it’s still unclear to me how Omarchy specifically fits into ‘the future of the open web’ vs Ladybird.
Even if Framework were to dismiss or overlook the controversy surrounding Omarchy's creator, which is ultimately their call, surely there are better ways to allocate OSS funding than sponsoring a multi-millionaire executive's pet project. He can afford to bankroll it himself.
But it's anathema to the cosmopolitan multiculturalism we practice and appreciate in most of the anglosphere and parts of western Europe, of which much of the tech world / HN posters are a part.
I'm a European immigrant to Canada, in a suburb of Vancouver which is plurality Chinese with Europeans at about 30%, and it's totally cool and normal.
But I'm also typing this from vacation in Japan where they famously don't welcome immigration much. But people don't seem as upset by Asian nativism compared to European. And I don't have a diplomatic way of explaining the difference - it's the "bigotry of low expectations."
This naturally ends up being controversial, especially in tech, when some of our brightest minds are from other cultures. It doesn't help his case that he's not even from the places he complains about, so he's another outsider complaining about outsiders, which always looks bad.
dhh has always been "controversial". Initially it was mostly about strongly held opinions on software, engineering and such, presented in a very vocal way that got a lot of attention at the time. Then at one point Basecamp had some drama about employees calling customers names, which spiraled into a debate about racism and company culture and ultimately led to Basecamp banning "discussions about society and politics" or similar. More recently he started sharing opinions about London having too many foreigners and immigrant communities having gangs of groomers or something, and a bunch of Ruby community members have written publicly about what they think of him.
The air around dhh has always been dramatic for various reasons, so I'm not sure that particular theme is new. What I think is new is that people are re-evaluating whether they want a prominent community leader to hold views that could be seen as "against" members of the very community they supposedly lead.
They've raised about $45M in venture capital to date. I don't think they are profitable yet, but they at least have other people's money to throw around for now.
It looks a little bit like a tempest in a teapot to me, but I'm impressed with their community guidelines. That thread got an exception to allow for more discussion, and it even permits "Critiques of Framework as a company" and "Calls for boycotts or product criticism".
I can see why they would do this. There's a vocal minority of completely unhinged Linux people. I've been running different Linux distros since 2002, and that minority has irritated me the whole time.
They have the form-over-function aspect too, in that they decided to keep the external design language consistent across the board no matter what, which meant they couldn't improve the passive heat dissipation enough to keep up with newer network standards and had to resort to putting fans in their WiFi APs to keep them from overheating.
And they make the whole claim of 'minimalism means easy to use for power users', which really means 'we'll keep messing with how the meshing in your house works so that you're unable to pin preferred routes between nodes - because without seeing your house we know better'.
Which units are those? I have a pair of u7 pros in my house and they’ve never made a peep, though admittedly they don’t get pushed very hard at all; the TV and two main computers are wired, so it’s really just iot junk and phones on the wifi.
The website you're using right now is hosted from a single location without any kind of CDN in front, so unless by coincidence you happen to live next door then you seem to be managing. CDNs do help, but just not bundling 40MB of JavaScript or doing 50 round trips to load a page can go a long way.
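If you want a rough sense of how much of a page's weight is self-inflicted, a quick sketch with requests + BeautifulSoup (the URL is a placeholder) that just sums the HTML plus the scripts it references:

```python
# Rough sketch: how much of a page's weight is its own JavaScript bundles.
# The URL is a placeholder; this only counts <script src=...> tags.
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

url = "https://example.com/"  # placeholder
page = requests.get(url, timeout=10)
soup = BeautifulSoup(page.text, "html.parser")

scripts = soup.find_all("script", src=True)
total = len(page.content)
for tag in scripts:
    total += len(requests.get(urljoin(url, tag["src"]), timeout=10).content)

print(f"{total / 1_000_000:.1f} MB across {1 + len(scripts)} requests "
      "(HTML + script payloads only)")
```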
The linked article is precisely about how in 2024 they started rewriting their proxy layer away from nginx (a C app). While "They haven't had an incident that bad since they switched from C to Rust" might be true, it has also been almost 9 years since Cloudbleed, of which 8 were spent in the C world.