H266/VVC has a five year head-start over AV2, so probably that first unless hardware vendors decide to skip it entirely. The final AV2 spec is due this year, so any day now, but it'll take a while to make its way into hardware.

H266 is getting fully skipped (except possibly by Apple). The licensing is even worse than H265, the gains are smaller, and Google+Netflix have basically guaranteed that they won't use it (in favor of AV1 and AV2 when ready).

Did anybody, including the rightsholders, come out ahead on H265? From the outside it looked like the mutually assured destruction situation with the infamous mobile patents, where everyone ends up paying lawyers to demand money from each other for mostly paper gains.

MBAs got to make deals and lawyers got to file lawsuits. Everyone else got to give them money. God bless the bureaucracy.

Why, the patent office did. There are many ideas that cannot be reinvented for the next few decades, and thanks to submarine patents it is simply not safe to innovate without your own small regiment of lawyers.

This is a big victory for the patent system.


The patent office getting $100k or whatever doesn't sound like a win for them either.

I'm not sure what you mean by the "patent system" having a victory here, but whatever it is, it's not the stated goal of promoting innovation being achieved.


For smart TVs Netflix is obviously a very important partner.

VVC is pretty much a dead end at this point. Hardly anyone is using it; its benefits over AV1 are extremely minimal and no one wants the royalty headache. Basically everyone learned their lesson with HEVC.

It is being used in China and India for streaming. Brazil chose it, together with LCEVC, for their TV 3.0 standard. The broadcasting industry is also preparing for VVC. So it is not popular in web and internet usage, but it is certainly not dead.

I am eagerly awaiting AV2 test results.


Right, I know some studios have used it for high quality/resolution/framerate exports too, so it is definitely not dead. But still a dead end, from everything I've seen. No one seems to want to bother with it unless it is already part of their entire pipeline. Every project I've seen that shipped it to consumers or editors ran into issues of some sort, ended up using something else entirely, and left its VVC support abandoned or deprecated. It's a shame because VVC really is pretty awesome, but the only people using it seem to be those that adopted it early assuming broader support that never materialized.

If it has a five year head start and we've seen almost zero hardware shipping, that is a pretty bad sign.

IIRC AV1 decoding hardware started shipping within a year of the bitstream being finalized. (Encoding took quite a bit longer but that is pretty reasonable)


https://en.wikipedia.org/wiki/Versatile_Video_Coding#Hardwar...

Yeah, that's... sparse uptake. A few smart TV SoCs have it, but aside from Intel it seems that none of the major computer or mobile vendors are bothering. AV2 next it is then!


When even H.265 is being dropped by the likes of Dell, adoption of H.266 will be even worse, making it basically DOA for anything promising. It's plagued by the same problems H.265 is.

Dell is significant in the streaming and media world?

Dell and HP are significant in the "devices" world, and they just dropped support for HEVC hardware encoding/decoding [1] to save a few cents per device. You can still pay for the Microsoft add-in that enables it. It's not just streaming; your Teams background blur was handled like that.

Eventually people and companies will associate HEVC with "that thing that costs extra to work", and software developers will start targeting AV1/2 so their software's performance doesn't depend on whether the laptop manufacturer or user paid for the HEVC license.

[1] https://arstechnica.com/gadgets/2025/11/hp-and-dell-disable-...


Along the same lines, Synology dropped it on their NAS too (for their video and media apps, etc. Even for thumbnails: they now ask the sending device to generate one locally and upload it; the NAS won't do it anymore for HEVC).

Dell is dropping it to save 4 cents per device, so users will have to pay $1 to Microsoft instead. Go figure.

Also you can just use Linux. Dell / HP have no control over the actual GPU for that; I think they just disabled it at the Windows level. Linux has no gatekeepers there and you can use your GPU as you want.

But this just indicates that HEVC etc. is a dead end anyway.


Sounds like they need something akin to audio volume normalization but for video. You can go bright, but only in moderation, otherwise your whole video gets dimmed down until the average is reasonable.

I was about to write that. The algorithm needs to be chosen though; what is mostly used for audio gain normalization? A rolling average?
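
For illustration, a minimal sketch of what that could look like (Python/NumPy; the target level and window size here are invented for the example, not from any standard):

    import numpy as np

    TARGET_AVG = 0.25  # target mean luminance on a 0..1 scale (assumed)
    WINDOW = 120       # rolling window in frames, ~2s at 60fps (assumed)

    def limit_brightness(frames):
        # Dim frames whose rolling-average luminance exceeds the target:
        # short flashes pass through, but sustained brightness gets
        # pulled down, like a loudness normalizer but for luminance.
        history = []
        for frame in frames:  # each frame: float array in [0, 1]
            history.append(frame.mean())
            if len(history) > WINDOW:
                history.pop(0)
            rolling_avg = sum(history) / len(history)
            # gain caps at 1.0: only ever dim, never boost
            gain = min(1.0, TARGET_AVG / max(rolling_avg, 1e-6))
            yield np.clip(frame * gain, 0.0, 1.0)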

Actually I don’t even agree with that. I don’t want to be flash banged.

Every type of DRAM is ultimately made at the same fabs, so if one type is suddenly in high demand then the supply of everything else is going to suffer.

Wait, really? For CPUs each generation needs basically a whole new fab, I thought… are they more able to incrementally upgrade RAM fabs somehow?

Not quickly, but if somebody puts enough money on the table, the fabs change too. It's all about cost and return. Micron just axed their Crucial brand (end-customer RAM and SSDs) because they will only sell to data centers from now on.

Crazy times.


The old equipment is mothballed because China is the only buyer and nobody wants to do anything that the Trump admin will at some point decide is tariff-worthy. So it all sits.

I wonder if Apple will budge. The margins on their RAM upgrades were so ludicrous before that they're probably still RAM-profitable even without raising their prices, but do they want to give up those fat margins?

I know contract prices are not set in stone. But if there’s one company that probably has their contract prices set for some time in the future, that company is Apple, so I don’t think they will be giving up their margins anytime soon.

> I wonder if Apple will budge.

Perhaps I don't understand something so clarification would be helpful:

I was under the impression that Apple's RAM was on-die, and so baked in during chip manufacturing and not a 'stand alone' SKU that is grafted onto the die. So Apple does not go out to purchase third-party product, but rather self-makes it (via ASML) when the rest of the chip is made (CPU, GPU, I/O controller, etc).

Is this not the case?


Apple's RAM is on-package, not on-die. The memory is still a separate die which they buy from the same suppliers as everyone else.

https://upload.wikimedia.org/wikipedia/commons/d/df/Mac_Mini...

That whole square is the M1 package, Apple's custom die is under the heatspreader on the left, and the two blocks on the right are LPDDR packages stacked on top of the main package.

https://wccftech.com/apple-m2-ultra-soc-delidded-package-siz...

Scaled up, the M2 Ultra is the same deal just with two compute dies and 8 separate memory packages.


Sadly everything in the general direction of RAM or SSD chips is getting more expensive because a lot of production capacity is being redistributed to serve AI chips and everything around them.

Even lower-end GPUs are getting more expensive, even though they are not really useful for AI. But they still contain *some* chips and RAM, which are in high demand.

So yes, Apple will likely also have to pay higher prices when they renew their contracts.


RAM upgrades are such a minor, insignificant part of Apple's income - and play no part in plans for future expansion/stock growth.

They don't care. They'll pass the cost on to the consumers and not give it a second thought.


I'd like to believe that their pricing for RAM upgrades is like that so the base model can hit a low enough price. I don't believe they have the same margin on the base model as on the base model + memory upgrade.

I read online that Apple supposedly uses three different RAM suppliers. I wonder if Apple has the ability to just make their own RAM?

Apple doesn't own any foundries, so no. It's not trivial to spin up a DRAM foundry either. I do wonder if we'll see TSMC enter the market though. Maybe under pressure from Apple or Nvidia...

There are no large-scale pure-play DRAM fabs that I’m aware of, so Apple is (more or less) buying from the same 3 companies as everyone else.

Apple doesn't own semiconductor fabs. They're not capable of making their own RAM.

On one hand they are losing profit, on the other hand they are gaining market share. They will probably wait a short while to assess how much profit they are willing to sacrifice for market share.

I am fully expecting a 20%+ price bump on new mac hardware next year.

Not me. It’s wildly unusual for Apple to raise their prices on basically anything… in fact I'm not sure if it's ever happened. *

It’s been pointed out by others that price is part of Apple's marketing strategy. You can see that in the trash can Mac Pro, which logically should have gotten cheaper over the ridiculous six years it was on sale with near-unchanged specs. But the marketing message was, "we're selling a $3000 computer."

Those fat margins leave them with a nice buffer. Competing products will get more expensive; Apple's will sit still and look even better by comparison.

We are fortunate that Apple picked last year to make 16 GB the new floor, though! And I don't think we're going to see base SSDs get any more generous for a very, very long time.

* okay, I do remember that MacBook Airs could be had for $999 for a few years; that disappeared for a while, then came back


It’s 4D chess my dude, they were just training people to accept those super high ram prices. They saw this coming I tell you!

PG got suspended for directing his followers to Mastodon: https://x.com/alexisohanian/status/1604604968677392386

They did walk that policy back due to the backlash, but they really wanted to do it.


Paying users are also explicitly given priority in the reply section, which naturally hands a megaphone to the type of user that is more willing to give money to Elon Musk and wear the "I gave money to Elon Musk" badge.

To be fair, Samsung's divisions having guns pointed at each other is nothing new. This is the same conglomerate that makes their own chip division fight for placement in their own phones, constantly flip-flopping between using Samsung or Qualcomm chips at the high end, Samsung or Mediatek chips at the low end, or even a combination of first-party and third-party chips in different variants of ostensibly the same device.

To be honest, this actually sounds kinda healthy.

It's a forcing function that ensures the middle layers of a vertically integrated stack remain market-competitive and don't stagnate because they are the default/only option.

Sears would like to have a word about how healthy intra-company competition is.

Sears had a horizontal market where all of it did basically the same thing. Samsung is a huge conglomerate of several completely different verticals with lots of redundant components.

It makes absolutely no sense to apply the lessons from one into the other.


> Sears had a horizontal market where all of it did basically the same thing. Samsung is a huge conglomerate of several completely different verticals with lots of redundant components.

Sears was hardly horizontal. It was also Allstate insurance and Discover credit cards, among other things.


Ok. And if it had divided along the borders of insurance and payment services, the reorganization wouldn't have been complete bullshit and might even have been somewhat successful.

I think what the GP was referring to was the "new" owner of Sears, who reorganized the company into dozens of independent business units in the early 2010s (IT, HR, apparel, electronics, etc). Not departments, either; full-on internal businesses intended as a microcosm of the free market.

Each of these units was then given access to an internal "market" and directed to compete with the others for funding.

The idea was likely to try to improve efficiency... but what ended up happening is that siloing increased, BUs started infighting over a dwindling set of resources (beyond the normal politics you'd expect at an organization that size; actively trying to fuck each other over), and cohesion decreased.

It's often pointed to as one of the reasons for their decline, and worked out so badly that it's commonly believed their owner (who also owns the company holding their debt and stands to immensely profit if they go bankrupt) desired this outcome... to the point that he got sued a few years ago by investors over the conflict of interest and, let's say "creative" organizational decisions.


This happened at a place where I worked years ago, but not as 'on purpose.' We were a large company where most pieces depended on other pieces, and everything was fine - until a new CEO came in who started holding each BU's numbers under a microscope. This led to departments trying to bill other departments as enterprise customers, who then retaliated, which led to internal departments threatening to go to competitors who charged less for the same service. Kinda stupid how that all works - on paper it would have made a few departments look better if they used a bottom-of-the-barrel competitor, but in reality the company as a whole would have bled millions of dollars... all because one rather large BU wanted to goose its numbers.

Why is that a bad thing? If an internal department that’s not core to their business is less efficient than an external company - use the external company.

Anecdote: Even before Amazon officially killed Chime, everyone at least on the AWS side was moving to officially supported Slack.


I guess it depends on circumstances, but it boils down to each department only costing the others some marginal amount in practice.

Imagine a hosting company and a DNS company, both with plenty of customers and capacity. The hosting company says: I'll host your site if you provide DNS for ours. A drop in the bucket for each.

One year the DNS company decides it needs to show more revenue, so it begins charging the hosting company $1000/yr, and guess what, the hosting company does the same. Instead, they each get mad and find $500/yr competitors. What was accomplished here?

Further, it just looks bad in many cases. Imagine if Amazon.com decided AWS was too expensive, and decided to move their stuff off to say, Azure only. That wouldn't be a great look for AWS and in turn hurts...Amazon.

I do get your point, but there are a lot of... intangibles about being in a company together.


There is more politics than you think within Amazon Retail about moving compute over to AWS. I’m not sure how much of Amazon Retail runs on AWS instead of its own infrastructure (CDO).

I know one project from Amazon got killed because their AWS bill was too high. Yeah AWS charges Amazon Retail for compute when they run on AWS hardware.

https://www.lastweekinaws.com/blog/the-aws-service-i-hate-th...


As a rule, organizations are created to avoid the transaction costs of those supporting tasks. If you externalize every single supporting task into a market, you will be slowed to a drag, won't be able to use most competitive advantages, and will pay way more than by doing them in house.

But removing market competition is a breeding ground for inefficiency. So there's a balance there, and huge conglomerates tying their divisions together serves only to make the competitive ones die from being forced to use the services of the inefficient ones.


My four years at AWS kind of indoctrinated me. As they said, every time you decide to buy vs build, you have to ask yourself: "does it make the beer taste better?"

Don’t spend energy on undifferentiated heavy lifting. If you are Dropbox it makes sense to move away from S3 for instance.


To put a finer point on it, it wasn't just competition or rewarding the successful; the CEO straight up set them at odds with each other and told them directly to battle it out.

basically "coffee is for closers... and if you don't sell you're fired" as a large scale corporate policy.


Yes, this is what I was referring to. I should have provided more context, thanks for doing so.

That was a bullshit separation of a single horizontal cut of the market (all of those segments did consumer retail sales) without overlap.

The part about no overlaps already made it impossible for them to compete. The only "competition" they had was in the sense of a TV game show, where contestants do worthless tasks judged by some arbitrary rules.

That has absolutely no similarity to how Samsung is organized.


Nokia too

The opposite, nepotism, is very unhealthy, so I think you're correct.

Not sure that the opposite of transfer pricing is nepotism. As far as I know it’s far more common for someone who owns a lake house to assign four weeks a year to each grandkid than to make them bid real money on it and put that in a maintenance fund or something. Though it’s an interesting idea, it’s not very family friendly.


I genuinely can't tell if this is sarcasm? Or do you live somewhere where this is taught?

Yeah, makes absolute sense.

A bit like Toyota putting a GM engine in their car, because the Toyota engine division is too self-centered, focusing too much on efficiency.


You mean Toyota putting a BMW engine in the Supra. Your statement is contradictory, as Toyota has TRD, which focuses on track performance. They just couldn't keep up with the straight-six's performance and reliability compared to their own 2JZ.

> Toyota putting a BMW engine in the Supra.

Or Toyota using a Subaru engine (Scion FRS, Toyota GT86)


Buying a Supra is stupid. Either buy a proper BMW with the B58/ZF 8-speed and get a proper interior, or stop being poor and buy an LC500.

Better yet, get a C8 Corvette and gap all of the above for far better value. You can get 20% off MSRP on factory orders with C8 Corvettes if you know where to look.


Isn't this how South Korean chaebols work?

They operate with tension. They're supposed to have unified strategic direction from the top, but individual subsidiaries are also expected to be profit centers that compete in the market.


I worked with some supply chain consultants who mentioned "internal suppliers are often worse suppliers than external".

Their point was that service levels are often not as stringently tracked, SLAs become internal money shuffling, and the company as a whole pays the price in lower output/profit. The internal partner being the default allows a degree of complacency, and if you shopped around for a comparable level of service to what's being provided, you could often find it for a better price.


I think this is a good time to reference a comic showing software companies' various "Org Charts", especially the one for Microsoft.

https://goomics.net/62


> two versions of the same phone with different processors

That's hilarious, which phone is this?


Basically every Galaxy phone comes in two versions: one with Exynos and one with Snapdragon. It's regional though. The US always gets the Snapdragon phones while Europe and most of Asia get the Exynos version.

My understanding is that the Exynos is inferior in a lot of ways, but also cheaper.


In the past, using Snapdragon chips for the U.S. made sense due to Qualcomm having much better support for the CDMA frequencies needed by Verizon. That's probably no longer relevant since the 5G transition though.

Not one phone, they did this all over the place. Their flagship line did this starting with the Galaxy S7 all the way up to Galaxy S24. Only the most recent Galaxy S25 is Qualcomm Snapdragon only, supposedly because their own Exynos couldn't hit volume production fast enough.

"Galaxy S II" and its aesthetics was already a mere branding shared across at least four different phones with different SoCs, before counting in sub-variants that share same SoCs. This isn't unique to Samsung, nor is it a new phenomenon, just how consumer products are made and sold.

1: https://en.wikipedia.org/wiki/Samsung_Galaxy_S_II


The S23 too was Snapdragon-only, allegedly to let the Exynos team catch their breath and come up with something competitive for the following generation. Which they partly did, as the Exynos S24 is almost on par with its Snapdragon brother: a bit worse on photo and gaming performance, a bit better in web browsing, from the benchmarks I remember.

The S23 was also Snapdragon-only as far as I know[1]. The S24 had the dual chips again, while as you say S25 is Qualcomm only once more.

[1]: https://www.androidauthority.com/samsung-exynos-versus-snapd...


This was the case as recently as the S24: phones can come with Exynos or Snapdragon, with the Exynos usually featuring worse performance and battery life.

I might be out of date, but last I knew, it was "most of them."

International models tended to use Samsung's Exynos processors, while the ones for the North American market used Snapdragons or whatever.


Several high end Galaxy S's AFAIK.

That’s really good business. Everyone is pushing to be the best rather than accepting mediocrity.

> The SOTA cheats use custom hardware that uses DMA to spy on the game state.

And the SOTA anti-cheats now use IOMMU shenanigans to keep DMA devices from seeing the game state. The arms race continues.


They also require TPM, which I think facilitates remote attestation for secure boot.

Valve recently said outright that they have no VR titles in development.

https://www.roadtovr.com/valve-no-first-party-vr-game-in-dev...


They also said there was nothing coming for the Steam Deck in terms of better hardware about a week before they launched the OLED.

Did they? AFAICT what they actually said was not to expect a faster Steam Deck any time soon, which was true, because the OLED version had basically the same performance as the original and in the two years since they still haven't released anything faster.

https://www.theverge.com/2023/9/21/23884863/valve-steam-deck...


The OLED has the same hardware as the LCD, with only very minor differences.

Maybe they finished it...
