H266/VVC has a five-year head start over AV2, so probably that first, unless hardware vendors decide to skip it entirely. The final AV2 spec is due this year, so any day now, but it'll take a while to make its way into hardware.
H266 is getting fully skipped (except possibly by Apple). The licensing is even worse than H265, the gains are smaller, and Google+Netflix have basically guaranteed that they won't use it (in favor of AV1 and AV2 when ready).
Did anybody, including the rightsholders, come out ahead on H265? From the outside it looked like the mutually assured destruction situation with the infamous mobile patents, where they all end up paying lawyers to demand money from each other for mostly paper gains.
Why, the patent office did. There are many ideas that cannot be reinvented for the next few decades, and thanks to submarine patents it is simply not safe to innovate without your own small regiment of lawyers.
VVC is pretty much a dead end at this point. Hardly anyone is using it; its benefits over AV1 are extremely minimal and no one wants the royalty headache. Basically everyone learned their lesson with HEVC.
It is being used in China and India for streaming. Brazil chose it, with LCEVC, for their TV 3.0. The broadcasting industry is also preparing for VVC. So it is not popular in web and internet usage, but it is certainly not dead.
Right, I know some studios have used it for high quality/resolution/framerate exports too, so it is definitely not dead. But still a dead end, from everything I've seen. No one seems to want to bother with it unless it is already within their entire pipeline. Every project I've seen that worked with it and shipped to consumers or editors ran into issues of some sort, ended up using something else entirely, and basically abandoned or deprecated its VVC support. It's a shame, because VVC really is pretty awesome, but the only people using it seem to be those that adopted it early assuming broader support that never materialized.
If it has a five-year head start and we've seen almost zero hardware shipping, that is a pretty bad sign.
IIRC AV1 decoding hardware started shipping within a year of the bitstream being finalized. (Encoding took quite a bit longer but that is pretty reasonable)
Yeah, that's... sparse uptake. A few smart TV SoCs have it, but aside from Intel it seems that none of the major computer or mobile vendors are bothering. AV2 next it is then!
When even H.265 is being dropped by the likes of Dell, adoption of H.266 will be even worse, making it basically DOA for anything promising. It's plagued by the same problems H.265 is.
Dell and HP are significant in the "devices" world, and they just dropped support for HEVC hardware encoding/decoding [1] to save a few cents per device. You can still pay for the Microsoft add-in that does this. It's not just streaming; your Teams background blur was handled like that too.
Eventually people and companies will associate HEVC with "that thing that costs extra to work", and software developers will start targeting AV1/2 so their software's performance doesn't depend on whether the laptop manufacturer or user paid for the HEVC license.
Along the same lines, Synology dropped it on their NASes too (for their video and media apps, etc.). Even for thumbnails: they now ask the sending device to generate one locally and upload it; the NAS won't do it anymore for HEVC.
Also, you can just use Linux; Dell/HP have no control over the actual GPU there. I think they just disabled it at the Windows level. Linux has no gatekeepers for that, and you can use your GPU as you want.
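For example, a quick way to check from Linux. A minimal sketch in Python, assuming a VA-API-capable GPU with vainfo (libva-utils) and ffmpeg installed; the render node path and the input filename are illustrative placeholders:

```python
import subprocess

# Typical first render node; adjust if your GPU exposes a different one.
RENDER_NODE = "/dev/dri/renderD128"

# List the VA-API profiles the driver exposes and look for HEVC decode.
out = subprocess.run(
    ["vainfo", "--display", "drm", "--device", RENDER_NODE],
    capture_output=True, text=True,
).stdout
print("HEVC decode exposed:", "VAProfileHEVCMain" in out)

# Smoke test: hardware-decode an HEVC file on the GPU, discarding output.
subprocess.run([
    "ffmpeg", "-hwaccel", "vaapi", "-hwaccel_device", RENDER_NODE,
    "-i", "input_hevc.mp4", "-f", "null", "-",
])
```

If vainfo lists the HEVC profiles, the silicon supports it regardless of what the Windows image enables.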
But this just indicates that HEVC etc. is a dead end anyway.
Sounds like they need something akin to audio volume normalization but for video. You can go bright, but only in moderation, otherwise your whole video gets dimmed down until the average is reasonable.
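A toy sketch of that idea in Python; the target level, window size, and names are all made up for illustration, not from any standard:

```python
import numpy as np

TARGET_AVG = 0.25   # target mean luminance on a 0..1 scale (illustrative)
WINDOW = 120        # frames of history, ~2 s at 60 fps (illustrative)

history: list[float] = []

def normalize_frame(frame: np.ndarray) -> np.ndarray:
    """frame: float32 luminance values in [0, 1]."""
    # Track a rolling average of recent frame brightness.
    history.append(float(frame.mean()))
    if len(history) > WINDOW:
        history.pop(0)
    rolling_avg = sum(history) / len(history)
    # Only ever dim (gain <= 1): short highlights survive, but a
    # persistently bright video gets pulled down toward the target.
    gain = min(1.0, TARGET_AVG / max(rolling_avg, 1e-6))
    return np.clip(frame * gain, 0.0, 1.0)
```

Same shape as a loudness limiter: peaks are allowed, but the running average gets capped.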
Every type of DRAM is ultimately made at the same fabs, so if one type is suddenly in high demand then the supply of everything else is going to suffer.
Not quickly, but if somebody puts enough money on the table, the fabs change too.
All about cost and return.
Micron just axed their Crucial brand (end-customer RAM and SSDs) because they will only sell to data centers from now on.
The old equipment is mothballed because China is the only buyer and nobody wants to do anything that the Trump admin will at some point decide is tariff-worthy. So it all sits.
I wonder if Apple will budge. The margins on their RAM upgrades were so ludicrous before that they're probably still RAM-profitable even without raising their prices, but do they want to give up those fat margins?
I know contract prices are not set in stone. But if there’s one company that probably has their contract prices set for some time in the future, that company is Apple, so I don’t think they will be giving up their margins anytime soon.
Perhaps I don't understand something, so clarification would be helpful:
I was under the impression that Apple's RAM was on-die, and so baked in during chip manufacturing rather than a standalone SKU grafted onto the die. So Apple does not go out and purchase a third-party product, but rather makes it itself (via TSMC) when the rest of the chip (CPU, GPU, I/O controllers, etc.) is made.
That whole square is the M1 package: Apple's custom die is under the heatspreader on the left, and the two blocks on the right are LPDDR packages stacked on top of the main package. So the RAM is separate, third-party DRAM silicon on the package, not part of Apple's die.
Sadly, everything in the general direction of RAM or SSD chips is getting more expensive, because a lot of production capacity is being redistributed to serve AI chips and everything around them.
Even lower-end GPUs are getting more expensive, even though they are not really useful for AI.
But they still contain some chips and RAM that are in high demand.
So yes, Apple will likely also have to pay higher prices when they renew their contracts.
I'd like to believe that their pricing for RAM upgrades is like that so the base model can hit a low enough price. I don't believe they have the same margin on the base model as on the base model + memory upgrade.
Apple doesn't own any foundries, so no. It's not trivial to spin up a DRAM foundry either. I do wonder if we'll see TSMC enter the market though. Maybe under pressure from Apple or nvidia...
On one hand they are losing profit; on the other hand they are gaining market share. They will probably wait a short while to assess how much profit they are willing to sacrifice for market share.
Not me. It’s wildly unusual for Apple to raise their prices on basically anything… in fact I'm not sure if it's ever happened. *
It’s been pointed out by others that price is part of Apple's marketing strategy. You can see that in the trash can Mac Pro, which logically should have gotten cheaper over the ridiculous six years it was on sale with near-unchanged specs. But the marketing message was, "we're selling a $3000 computer."
Those fat margins leave them with a nice buffer. Competing products will get more expensive; Apple's will sit still and look even better by comparison.
We are fortunate that Apple picked last year to make 16GB the new floor, though! And I don't think we're going to see base SSDs get any more generous for a very, very long time.
* okay, I do remember that MacBook Airs could be had for $999 for a few years; that disappeared for a while, then came back
Paying users are also explicitly given priority in the reply section, which naturally hands a megaphone to the type of user that is more willing to give money to Elon Musk and wear the "I gave money to Elon Musk" badge.
To be fair, Samsung's divisions having guns pointed at each other is nothing new. This is the same conglomerate that makes their own chip division fight for placement in their own phones, constantly flip-flopping between using Samsung or Qualcomm chips at the high end, Samsung or Mediatek chips at the low end, or even a combination of first-party and third-party chips in different variants of ostensibly the same device.
It's a forcing function that ensures the middle layers of a vertically integrated stack remain market-competitive and don't stagnate because they are the default/only option.
Sears had a horizontal market where all of it did basically the same thing. Samsung is a huge conglomerate of several completely different verticals with lots of redundant components.
It makes absolutely no sense to apply the lessons from one to the other.
Sears was hardly horizontal. It was also Allstate insurance and Discover credit cards, among other things.
OK. And if it had divided along the borders of insurance and payment services, the reorganization wouldn't have been complete bullshit and might even have been somewhat successful.
I think what the GP was referring to was the "new" owner of Sears, who reorganized the company into dozens of independent business units in the early 2010s (IT, HR, apparel, electronics, etc). Not departments, either; full-on internal businesses intended as a microcosm of the free market.
Each of these units was then given access to an internal "market" and directed to compete with the others for funding.
The idea was likely to try and improve efficiency... but what ended up happening was that siloing increased, BUs started infighting over a dwindling set of resources (beyond the normal politics you'd expect at an organization that size; actively trying to fuck each other over), and cohesion decreased.
It's often pointed to as one of the reasons for their decline, and worked out so badly that it's commonly believed their owner (who also owns the company holding their debt and stands to immensely profit if they go bankrupt) desired this outcome... to the point that he got sued a few years ago by investors over the conflict of interest and, let's say "creative" organizational decisions.
This happened at a place where I worked years ago, but not as 'on purpose.' We were a large company where most pieces depended on other pieces, and everything was fine - until a new CEO came in who started holding each BU's numbers under a microscope. This led to each department trying to bill other departments as if they were enterprise customers, who then retaliated, which led to internal departments threatening to go to competitors who charged less for the same service. Kind of stupid how that all works - on paper it would have made a few departments look better if they used a bottom-of-the-barrel competitor, but in reality the company as a whole would have bled millions of dollars...all because one rather large BU wanted to goose its numbers.
Why is that a bad thing? If an internal department that’s not core to their business is less efficient than an external company - use the external company.
Anecdote: Even before Amazon officially killed Chime, everyone at least on the AWS side was moving to officially supported Slack.
I guess it depends on circumstances, but it boils down to this: in practice, each department only costs the others some marginal amount.
Imagine a hosting company and a DNS company, both with plenty of customers and capacity. The hosting company says: I'll host your site if you provide DNS for ours. A drop in the bucket for each.
One year the DNS company decides it needs to show more revenue, so it begins charging the hosting company $1000/yr, and guess what, the hosting company does the same back. Instead of paying, they each get mad and find $500/yr competitors. What was accomplished? The combined company now sends $1000/yr to outsiders for services it used to provide itself at near-zero marginal cost.
Further, it just looks bad in many cases. Imagine if Amazon.com decided AWS was too expensive and moved their stuff off to, say, Azure. That wouldn't be a great look for AWS, and in turn it hurts...Amazon.
I do get your point, but there are a lot of... intangibles about being in a company together.
There is more politics than you think within Amazon Retail about moving compute over to AWS. I’m not sure how much of Amazon Retail runs on AWS instead of its own infrastructure (CDO).
I know one project at Amazon that got killed because their AWS bill was too high. Yeah, AWS charges Amazon Retail for compute when they run on AWS hardware.
As a rule, organizations are created to avoid the transaction costs of those supporting tasks. If you externalize every single supporting task into a market, you will be slowed to a crawl, won't be able to use most of your competitive advantages, and will pay way more than doing them in house.
But removing market competition is a breeding ground for inefficiency. So there's a balance there, and huge conglomerates tying their divisions together serves only to make the competitive ones die from being forced to use the services of the inefficient ones.
My four years at AWS kind of indoctrinated me. As they said, every time you decide to buy vs. build, you have to ask yourself: "does it make the beer taste better?"
Don't spend energy on undifferentiated heavy lifting. If you are Dropbox, it makes sense to move away from S3, for instance.
To put a finer point on it: it wasn't just competition or rewarding the successful; the CEO straight-up set them at odds with each other and told them directly to battle it out.
Basically "coffee is for closers... and if you don't sell, you're fired" as large-scale corporate policy.
That was a bullshit separation of a single horizontal cut of the market (all of those segments did consumer retail sales) without overlap.
The part about no overlaps already made it impossible for them to compete. The only "competition" they had was in the sense of a TV game show, where candidates do worthless tasks judged by some arbitrary rules.
That has absolutely no similarity to how Samsung is organized.
Not sure that the opposite of transfer pricing is nepotism. As far as I know, it's far more common for someone who owns a lake house to assign four weeks a year to each grandkid than to make them bid real money on it and put that into a maintenance fund or something. Though it's an interesting idea, it's not very family-friendly.
You mean Toyota putting a BMW engine in the Supra? Your statement is contradictory, as Toyota has TRD, which focuses on track performance. They just couldn't keep up with the straight-six's performance and reliability compared to their own 2JZ.
Buying a Supra is stupid. Either buy a proper BMW with the B58/ZF 8-speed and get a proper interior, or stop being poor and buy an LC500.
Better yet, get a C8 Corvette and gap all of the above for far better value. You can get 20% off MSRP on factory orders of C8 Corvettes if you know where to look.
They operate with tension. They're supposed to have unified strategic direction from the top, but individual subsidiaries are also expected to be profit centers that compete in the market.
I worked with some supply chain consultants who mentioned "internal suppliers are often worse suppliers than external".
Their point was that service levels are often not as stringently tracked and SLAs become internal money shuffling, but the company as a whole pays the price in lower output/profit. The internal partner being the default allows a certain amount of complacency, and if you shop around for a comparable level of service to what's being provided, you can often find it for a better price.
Basically every Galaxy phone comes in two versions, one with Exynos and one with Snapdragon. It's regional, though: the US always gets the Snapdragon phones, while Europe and most of Asia get the Exynos version.
My understanding is that the Exynos is inferior in a lot of ways, but also cheaper.
In the past, using Snapdragon chips for the U.S. made sense because Qualcomm had much better support for the CDMA networks Verizon required. That's probably no longer relevant since the 5G transition, though.
Not just one phone; they did this all over the place. Their flagship line did this starting with the Galaxy S7 all the way up to the Galaxy S24. Only the most recent Galaxy S25 is Qualcomm Snapdragon only, supposedly because their own Exynos couldn't hit volume production fast enough.
"Galaxy S II" and its aesthetics was already a mere branding shared across at least four different phones with different SoCs, before counting in sub-variants that share same SoCs. This isn't unique to Samsung, nor is it a new phenomenon, just how consumer products are made and sold.
The S23 was Snapdragon-only too, allegedly to let the Exynos team catch their breath and come up with something competitive for the following generation. Which they partly did, as the Exynos S24 is almost on par with its Snapdragon sibling: a bit worse in photo and gaming performance, a bit better in web browsing, from the benchmarks I remember.
Did they? AFAICT what they actually said was not to expect a faster Steam Deck any time soon, which was true: the OLED version had basically the same performance as the original, and in the two years since, they still haven't released anything faster.