While the mental image of eating roadkill is also unappetizing to me, I have to admit my reaction here is irrational.
Eating roadkill isn't much different from eating wild game you hunted, except that with roadkill it was someone else and their car that killed it accidentally, rather than you and a gun intentionally.
If you didn't see it die, you don't know what it died of. Shooting something healthy and then dressing it while fresh is different from finding windfall after some unknown amount of time.
This is just one of literally thousands of resources answering this exact question. There are other resources to help evaluate other potential consumption risks. There's no need to pretend that the only animals people can eat are the ones they witnessed being killed; people do otherwise, and have for millennia.
> So if the disk isn't alive, the file on it isn't alive, the inference software is not alive - then what are you saying is alive and thinking?
“So if the severed head isn’t alive, the disembodied heart isn’t alive, the jar of blood we drained out isn’t alive - then what are you saying is alive and thinking?”
- Some silicon alien life forms somewhere debating whether the human life form they just disassembled could ever be alive and thinking
Spotting a "ha, he used an argument that I can compare to a dead human" moment does not make your argument strong - there are many differences between a file on a computer and a murdered human who will never come back and think again.
> In 2025, after a £12m investment, YASA opened the UK's first axial-flux super factory, in Oxfordshire.
It's a little sad to me that fundamental innovations in electromechanical engineering like this get just a few million in investment, yet if this had been yet another derivative software startup with "AI" in the pitch, they'd probably have 10x or more investment thrown at them.
Seems to me everyone wants to invest instead in something that can be "web scale" with low marginal cost, that is, natural monopolies. There is not enough anti-trust enforcement.
The thing is, you can simultaneously be completely correct about the market being insane, while also entirely wrong in expecting it to behave in a sane way.
Cue the famous quote: “The market can remain irrational longer than you can remain solvent.”
I have a vague theory that the more wealth inequality increases in a system, along with "money printing" (lending, hypothecation, etc., where the wealthy are permitted privileged leverage and risk), the more detached markets become from reality in general. In such a case, an increasing majority of the money circulating has no need to be grounded in anything close to the common basic needs and values that most normal people have to live with.
Instead, what matters most to such wealth is tapping into the source of inflation so as to be on the winning side of it. This becomes a game of its own, where an investment's connection to reality or fundamental value is mostly irrelevant compared to how it leverages or monopolizes the state-created and privately created instruments of "money printing" (sketchy lending, rehypothecation, etc.) and other such "games" that only the wealthy are allowed in on.
> Cue the famous quote: “The market can remain irrational longer than you can remain solvent.”
It's not necessarily about things being (ir)rational, but about 'psychology' and the multi-player system that is The Market™. Because it's all very well and good to buy and sell individual products (securities) on their merits, but one also has to take into account what other people's ideas on them are as well (as you are buying/selling from them).
This factor has been known about for almost a century:
> A Keynesian beauty contest is a beauty contest in which judges are rewarded for selecting the most popular faces among all judges, rather than those they may personally find the most attractive. This idea is often applied in financial markets, whereby investors could profit more by buying whichever stocks they think other investors will buy, rather than the stocks that have fundamentally the best value, because when other people buy a stock, they bid up the price, allowing an earlier investor to cash out with a profit, regardless of whether the price increases are supported by its fundamentals and theoretical arguments.
Index funds won't necessarily save you. 7.5% of the S&P 500 is NVidia, 7% is Microsoft, etc. Almost 40% of the S&P 500 is in the top 10 stocks, and of the top 10, only #9 Berkshire Hathaway is not big into AI.
Index funds aren't supposed to save you from a market setback. In a correction or crash, you will lose money. They merely save you from the total ruin that can come with leverage, or from thinking you can outplay the stocks or options market as an amateur.
I'm glad to see OP's comment voted to the top, b/c it models good thinking. He knows what he doesn't know, and so he sticks to index funds.
Also -- I don't know anybody who still buys S&P 500 funds, now that there are broader funds available. None of the funds for Canadians that GP listed is limited to the S&P 500, so it's unclear why you would respond as if that's the index he's talking about.
- he's overweighted on Canada. Being Canadian himself, that's a double risk. If Canada does poorly, the chance of his livelihood being affected is high. Investments should be anti-correlated with livelihood risks.
- despite being 30% in Canada, VEQT has 2.5% in NVDA. By itself that's fine, but once you add similar amounts for MSFT, GOOG, META, AAPL, AVGO, etc., it becomes a significant portion of the index.
The point of an index fund is to be diversified. If one sector crashes but other sectors do well you're still fine. The OP will lose significant money if either Canada or AI crashes, even if the rest of the world is doing well.
Vanguard/Blackrock could set the allocation to whatever they wanted, but it's a conscious choice. Absolute returns are not necessarily the only consideration (if they are, perhaps buy a NASDAQ fund).
Depends on the index. The usual ones are indeed market cap weighted, and so adopt the overvaluation of bubble stocks, but there are indexes which are weighted otherwise, in an attempt to avoid that. One example is the RAFI fundamental family of indexes:
They are pretty cagey about the exact formula, but they do say that
> Security weights are determined by using fundamental measures of company size (adjusted sales, cash flow, dividends + buybacks, and book value) rather than price (market cap).
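Roughly, that means weighting each company by the average of its shares of those fundamental measures rather than by its share of total market cap. A toy sketch of that general idea (this is not RAFI's published formula, which they don't disclose, and the tickers and numbers below are invented):

```rust
// Toy sketch of fundamental weighting: weight each company by the average of
// its share of several accounting measures, rather than by market cap.
// All names and numbers are invented for illustration.
fn main() {
    // (ticker, [sales, cash flow, dividends + buybacks, book value]) in $bn
    let companies = [
        ("AAA", [400.0_f64, 110.0, 95.0, 70.0]),
        ("BBB", [250.0, 90.0, 60.0, 200.0]),
        ("CCC", [600.0, 40.0, 10.0, 300.0]),
    ];

    // Total of each measure across all companies
    let mut totals = [0.0_f64; 4];
    for (_, m) in &companies {
        for i in 0..4 {
            totals[i] += m[i];
        }
    }

    // Fundamental weight = average of the company's share of each measure
    for (ticker, m) in &companies {
        let weight: f64 = (0..4).map(|i| m[i] / totals[i]).sum::<f64>() / 4.0;
        println!("{ticker}: {:.1}%", weight * 100.0);
    }
}
```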
The top ten holdings in their US index are (rank - company - weight):
1 Apple 4.1
2 Microsoft 3.4
3 Alphabet 3.3
4 Berkshire Hathaway 2.3
5 Amazon 2.2
6 Meta Platforms 2.2
7 JPMorgan Chase 2.1
8 Exxon Mobil 2.0
9 Bank Of America 1.4
10 Chevron 1.3
Whereas those of their benchmark, the Solactive GBS United States Large & Mid Cap Index, whatever that is, are:
1 Nvidia 7.1
2 Microsoft 7.0
3 Apple 5.7
4 Amazon 4.0
5 Alphabet 3.7
6 Meta Platforms 3.1
7 Broadcom 2.4
8 Tesla 1.7
9 JPMorgan Chase 1.5
10 Eli Lilly 1.3
Glad to see a fellow fundamental indexer on HN! As a US-based investor, I personally invest in the RAFI US broad market fundamental index (FNDB ETF), which has kept up with the Vanguard US total market over the past 10 years, except in the bubbly years of 2020/2021 and 2024/2025, even with a higher expense ratio.
In my case, after observing the Covid-19 craziness in the market, I decided to dig further into value strategies and discovered this gem from Research Affiliates in the Journal of Portfolio Management circa 2012, which completely convinced me of the concept of fundamental indexation as a superior alternative to a market-cap weighted total market index.
2012 was a long time ago. I'm more inclined to Value myself, but has it held up?
I threw together a quick comparison with that tool (handy, thanks) of Vanguard Growth vs Vanguard Value and it's not too pretty. Sure, Value is less volatile, but...
I mean, I guess we'll see what happens when the music stops again, but it resembles the same issue as being "right" about a market drop: you can be right, but the timing is such that it nevertheless would have been more lucrative to stay invested the whole time anyway.
My comment was originally much larger, but I trimmed it because it was muddying my original point.
Yes, you can choose an index fund that's not cap-weighted S&P 500. However, any index fund that didn't have a substantial portion of its investments in NVDA and friends did very poorly over the last few years.
So either way, you're screwed.
- If your index has a lot of NVDA et al, you're exposed to lots of risk.
- If it doesn't, your investment values are currently a lot lower than they otherwise would have been.
So ideally you would be in cap-weighted S&P now and for the last few years, and switch just before the seemingly inevitable crash.
But that's no longer "put it in an index fund and forget about it".
You're certainly right that indexes like these don't benefit from the bubble!
But it's not the case that they "did very poorly". Forgive the UK sources, but compare HMWO (an MSCI World ETF) [1] and PSRW (a RAFI All World 3000 ETF) [2]. These are world indexes, but that's 70% US or something. For the last five years:
The concentration isn't the 'fault' of indexing per se. There are two styles of indexes used by funds/etfs.
1. Most indexes are market capitalization weighted indexes... which can lead to the high concentrations we currently see.
2. There are also equal weighted indexes. These are less popular for a multitude of reasons, not the least of which is the expense associated with keeping the fund equal weighted (the fund has to periodically - e.g. quarterly - buy/sell stocks to bring everything back to 'equal').
I am currently in the process of moving a portion of my allocation into an equal weight sp500 fund precisely because I want to lower my exposure to the largest ten stocks in the sp500.
Another way to accomplish that would be to buy a market capitalization weighted index consisting of mid size or small cap stocks - thus avoiding the concentration of the top 10. But that changes the overall portfolio in other ways (small cap factor). I decided to use equal weighting for a portion of my large cap holdings because I feel it is a more precise way to address the very specific problem I am addressing without adding other variables.
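To make the cap-weighted vs. equal-weighted difference concrete, here's a toy sketch with invented tickers and market caps (not real index data):

```rust
// Toy sketch: four made-up companies weighted two ways.
fn main() {
    // (ticker, market cap in $bn) - purely illustrative numbers
    let caps = [("AAA", 3000.0_f64), ("BBB", 2500.0), ("CCC", 400.0), ("DDD", 100.0)];
    let total: f64 = caps.iter().map(|(_, c)| c).sum();
    let n = caps.len() as f64;

    for (ticker, cap) in &caps {
        println!(
            "{ticker}: cap-weighted {:.1}%, equal-weighted {:.1}%",
            cap / total * 100.0, // big names dominate
            100.0 / n            // every name gets the same slice
        );
    }
}
```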
So first off, picking individual winning stocks is hard because new information that determines pricing comes in randomly, so good luck getting an information edge on your counterparty:
Those winning stocks also change over time: what used to be a winning choice can become a losing choice, so it's not like you can really set and forget things.
So index funds, buying all companies (especially if you go for more total market, like US Russell 3000), allow you to sidestep all of these risks. You are basically buying companies that service the entire economy, so as long as the economy is doing reasonably well the earnings of the companies will do reasonably well.
So yes, the S&P 500 is highly concentrated, but that is not the only index. Diversification is generally not a bad idea:
> I have a vague theory that as the amount of wealth inequality increases in a system along with excess money printing (lending, hypothecation, etc where the wealthy are permitted privileged leverage and risk), the more detached markets become from reality in general.
If you want to make it less vague, you can read Keynes.
It's inequality that's the important one; money printing doesn't impact this directly (except insofar as it impacts inequality). In simple language, people don't want to spend all their money on consumption (the "demand is infinite" you see in econ101 is an approximation), and so when only two dozen people have all the money, there aren't many things you can sell and still turn a profit. But those people still want to invest all the money they aren't using; there is just nothing to invest in.
At the turn of the 19th to the 20th century, explaining this was a huge open problem in economics.
I had no idea Keynes had similar ideas, so I definitely should read his work (and economics literature in general).
I probably should generalize my thoughts, though, to say that an "expectation of economic growth" (instead of just "money printing") seems to me necessary to yield "opaque market insanity", as opposed to "transparent evil sanity".
As a thought experiment, consider a (practically impossible) scenario where there is universally no expectation for long-term economic growth/contraction — regardless of whether it’s “real” or just monetary. Then by definition, a long term market simply cannot exist at all. No amount of wealth inequality can cause market insanity if there is no (long-term) market at all.
Wealth inequality in such a situation can still yield hoarding, domination, conquest, control, scams, manipulations, etc. But I wouldn’t call that “market insanity” so much as “evil sanity”.
In practice, the real impact of wealth inequality on the common people would likely be the same either way. However, without long term economic growth/inflation, the “sane evil” of the greedy wealth can no longer hide behind the veil of “market insanity”.
Humanity has always had markets, and investment has been a thing for thousands of years, while economic growth wasn't something people expected until somewhere around the Middle Ages.
You probably won't find a lot to support that idea in the literature.
The Romans had private ownership of land, mines and early industrial concerns (e.g., steelmaking) 2,000 years ago. They had a legal system to protect the interests of investors, and also professionals specialized in providing various financial services.
And capitalism with unlimited ability to print money has only existed for 53 years. And capitalism with actual unlimited money printing has only existed for 17 years. These aren't ancient systems or part of fundamental human nature; they're modern experiments.
The restriction on how much money banks can print is only some 200 years old. A while before that, banks invented unlimited money printing, and a while before that, banks were completely reinvented as regulated entities because money was coming and going without control...
I don't know how the ancient civilizations handled non-metallic money; I do know that in the Middle Ages it was a famous kingdom killer, because most kings couldn't refrain from creating infinite money.
Calling these things modern experiments even though nothing fundamental has changed since the Roman age seems pretty foolish.
If anything, the experiments you're talking about are just the logical consequences of doing the same thing over and over.
After all, the experiments never seemed to modify the problematic element, all they did was increase the quantity according to the logic of accumulation.
In fact, isn't it remarkable that the last 2000 years have produced the exact same pattern over and over again?
The logic is always the same. Money from period A can be carried over to period B. This means there is too little money during period A and too much during period B.
Since period A is perpetually today, and period B is perpetually tomorrow, one could get the idea to at least fix period A, which isn't as stupid as the Austrian economists would like to tell you. But fixing today through quantity means there is even more money carried over to tomorrow. The problem is being fixed with more of itself. It certainly isn't being fixed by having a competing system for trade.
Abandoning gold, fractional reserve banking, QE, etc all exist due to the fundamental mistake of making it possible to carry something that is time and location bound away from the time and location it is bound to.
Reintroducing a gold standard doesn't change this logic. It just makes it slightly more visible.
When you look at Arrow-Debreu models, you see the assumption is that utility-maximizing economic agents will spend their entire budget on either present utility (consumer goods) or future utility (investment goods). The concept of carrying money from one period to another doesn't exist and is inherently incompatible with equilibrium, and yet you don't see economists warning us about the carrying over of past balances into the future, with the exception of Keynes and Wolfgang Stützel. Not even Marx thought that this was problematic. Even the Austrian economists know the problem, as they argue that the single individual with the lowest time preference should own the entire planet and that the real problem is the national central bank (which happens to be quite small in contrast to world domination).
The problem, and its half-baked attempted solutions, is at least as old as Christianity. Possibly all the way back to Mesopotamia.
Money printing does directly impact inequality, via the Cantillon effect; in most cases, the printed money is put into the system in a way that disproportionately increases prices of assets that are held disproportionately by the wealthy.
True investment is when you put capital into a project or endeavor that is expected to earn rewards beyond its future sale price. You open a restaurant, and sell meals for more than the cost to make them. If your only hope is that 3 years from now you can sell the restaurant for more than you bought it for, it's no investment. Even if gold will be worth more, it won't make more of itself.
A post-truth environment adds to the ickiness of the feeling: on top of the bubbles, we've got RFK Jr. deciding the fate of biotechnology companies. Having a tech bubble at the same time science is being vandalized at NIH and in universities looks pretty damn dark.
Not just RFK Jr. The rest of the government requiring a 15% kickback from Nvidia and AMD to approve GPU sales to China, and the CEO of Intel being told to resign.
I feel like I'm going to be able to tell my adult kids "Yeah, when I was younger the Republicans were the party of free trade and government non-intervention in private industry..."
Conservatives have never been that party. They've always been the party of making the rich richer and the powerful more powerful by whatever means seem to work today. In the past, free trade seemed to do that. Now arbitrary trade restrictions seem to do that. Or at least they feel that they do.
Yeah, it's important to decompose those two sources (among others) of "money printing". The obvious one people think about most is when our federal government does it. But a more concerning one is enforced banking reserve ratios: if the banking system holds a trillion dollars in reserves and is allowed to operate at a 10% reserve ratio, it can support up to $10T in deposits, most of it created out of thin air, because banks are allowed to issue debt up to that amount.
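The arithmetic behind that number is the textbook deposit-multiplier model (a simplification; as other replies note, reserve ratios haven't been the binding constraint in practice for a long time):

$$
\text{max deposits} = \frac{\text{reserves}}{r} = \frac{\$1\,\text{T}}{0.10} = \$10\,\text{T},
\qquad
\text{newly created deposit money} \approx \$10\,\text{T} - \$1\,\text{T} = \$9\,\text{T}
$$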
As far as I'm aware, in 2020 the reserve requirement in the US was set to 0%, and it has not been changed since then.
Yes, but with the “revolving door” between private financial institutions and government financial policy/regulation, there’s little real distinction anymore between the two.
Those private banks can print that money out of thin air because government allows them to. And the government officials (many of them former financial executives) allow them to because they "have to" to prevent "disastrous" private banking/financial collapse.
But if you or I wanted to play the same games to print our own money the way they do? No, that would be wrong and dangerous and illegal!
So it’s pretty clear that both government and private financial institutions are tightly coupled partners in a mostly corrupt, intentionally obfuscated shell game that primarily serves to keep money and power steadily flowing into the hands of the already wealthy and powerful.
Just look at who is actually held accountable for financial crimes. Some individual trader that finds and exploits some glitch that allows them to profit from the wealthy? Straight to jail. High ranking institutional powers (government and private) that implement often illegal schemes that continuously siphon wealth from common people into their hands? Slap on the wrist at most.
Reserve ratios have not been the major capital constraint for a very long time. Loan-to-deposit ratios have been. And those have stayed in normal bounds since the great financial crisis. Both bank regulators and, importantly, bank investors keep on top of this, because they are the ones with the most to lose if it gets out of whack.
The reserve requirement had to be loosened because banks became too conservative, largely because their investors were skittish about LDR.
You think that there might be a "quality to the quantity" w.r.t. deposit levels after years in an ultra-low interest rate environment? IIRC, deposits in SVB, FRB, and Signature combined exceeded those of WaMu, with the difference being that WaMu was one of the largest banks in the country, while the average person had not heard of Silicon Valley Bank et al. until the day "Silicon Valley Bank Fails" lit up headlines.
At least in the SVB and FRB cases, high LDR contributed to the bank runs that ended them. But note that in this context FRB at 0.96 was seen as bad and SVB at 1.6 was disastrous (0.7 is good).
That's the real brake on money creation by banks, not the reserve requirement.
This is a misconception AFAIK: yes, there is no longer a literal percentage reserve requirement, but banks are still required to be "adequately capitalized"; the metric is just more complicated now.
Doesn't our federal government set reserve ratios? They may not be creating money, but by setting the ratio (and other limits), they at least have a strong influence on creation.
Another source in the last half a decade or so is digital coin "mints."
A quick look at the coin market puts the total market value at ~$3 trillion. Yet that money was effectively created from nothing. It's basically money printing.
With credit cards that accept digital coins as funding sources, it's also started to affect the actual markets significantly.
From the charts shown, markets have also gotten quite a bit frothier, with larger swings and spikes/drops in margin, since 2020 when coin valuations really took off.
Personal view, it probably also contributes since it's less "real" from a certain perspective. Just digital numbers to wager, that don't really mean the same as mortgaging your house. "Eh, just wager like 10 or 20 digi-coins on margin." Except that's like $1-2 million these days.
Notably, very little of the US economy is plausibly basic needs (a roof over one's head, basic nutrition, actual basic medical care, etc.). The vast, vast majority is essentially luxury goods and services, but Americans have been conditioned to think what the rest of the world considers luxury is actually basics.
If Americans actually cut back to actual basics (a small fixer-upper house in a less desirable area), shared an older used car instead of buying several new ones (or a big truck!), made home-cooked stews and beans and rice instead of eating out all the time or buying prepackaged food, stopped buying the latest fancy phones, took care of their health instead of resorting to gastric bypasses, dialysis, etc.
Hell, even if the average American stopped taking expensive vacations!
The world economy would likely collapse overnight, no joke. And it would likely be uglier than the Great Depression domestically.
> a fixer upper small house in a less desirable area
A lot of an area's desirability has to do with crime rate. Bulgaria has a homicide rate of 1.088 per 100,000, and the US 5.763. So what would be considered a very safe, friendly neighborhood in the US would be average or worse in Bulgaria. In this sense, "luxury" is flipped - what Bulgarians would consider basic would be "luxuriously safe" in the US.
Inner cities and specific (relatively uncommon!) rural areas (often in the Deep South) are what are dangerous in the US, and paradoxically even inner cities are often expensive to live in. Here is a map of homicide rate on a county by county basis [https://commons.m.wikimedia.org/wiki/File:Map_of_US_county_h...].
People often move to LCOL areas anyway to escape the crime and high costs of the cities when there are economic issues in the US.
"Cost of living" correlates fairly well with the available jobs and incomes in a region. You generally can't move to a LCOL region while also having a high paying job. Which is why the possibility of full remote work was so exciting for so many people.
> I have a vague theory that as the amount of wealth inequality increases in a system along with excess money printing (lending, hypothecation, etc where the wealthy are permitted privileged leverage and risk), the more detached markets become from reality in general.
Except that the Gilded Age, which had some of the highest levels of wealth concentration and inequality, occurred during the period of the Gold Standard, when money could not be 'printed excessively'. And this was true not just in the US but in most of the major countries in the world.
Further, while wealth inequality has risen in the US under the non-gold fiat system (to levels similar to the Gilded Age), other countries do not have as much wealth inequality even though they are also non-gold fiat.
You are right - it's about the inequality of holding the money, not the rate at which it's printed. Money printing is relevant to the extent it mostly flows towards the already rich. If money was printed and distributed to everyone evenly it would have the opposite effect.
That can easily be explained away: the wealth concentration was a symptom of vertically integrated, hard-network-based businesses (railroads, logistics, shipping, extraction). The Gold Standard may have braked the acceleration and centralization of wealth inequality to some degree, but the trusts and business structuring were the cause, more so than any inherent tendency of gold-as-basis versus full fiat.
That explains why we're seeing what we're seeing now. It's all about network monetization.
If the very most you need to live on is 10 million, you can gamble the rest. Buy apartments and jack up the rents like crazy; the worst thing that can happen to you is that people may move out. Buy stocks on margin, win some, lose some. The real economy is your play toy.
Not sure why you're being downvoted. Word "gamble" too inciting? Maybe if you'd used "much more risky investments" instead, but I'm not here to quibble about your language but to agree with you and extend what you're saying. I actually think the $10 million number is also relative to age because someone who's 30 and "merely" a millionaire can and will invest up the risk ladder as if they're a 50-year-old risking their above-$10 million capital. And the population of people who are millionaires vs decamillionaires is of course a healthy multiple so there's a lot more risk appetite than the relatively small number of decamillionaires would suggest.
As an aside I feel like there's this terrible trend where folks focus so much effort and energy worrying about whether billionaires should exist, whether they should be taxed more aggressively, etc. that we've lost the plot on just how much loot even a net worth of $10+ million is. And at the risk of me writing a too-long comment (bad habit), think of the risk appetite someone has when their decamillionaire parents pass away, and they're given, sometimes overnight, millions of extra dollars. Sure, maybe they'll buy a house, but oftentimes those funds go straight into the market. With boomers starting to leave this mortal coil and their trillions of dollars being passed down you can start to understand why the market seems disconnected from historical fundamentals.
Intelligence and consciousness are two different things though, and some would argue they may even be almost completely orthogonal. (A great science fiction book called Blindsight by Peter Watts explores this concept in some detail BTW, it’s a great read.)
> That's roughly how it works in C, and I know that it's also UB there if you do it wrong, but one thing is different: It doesn't really ever occupy my mind as a problem. In Rust it does.
UB doesn’t occupy the author’s mind when writing C, when it really should. This kind of lazy attitude to memory safety is precisely why so much C code is notoriously riddled with memory bugs and security vulnerabilities.
There is an important difference for this case though. In C it's fine to have pointers into uninitialized memory as long as you don't read them until after initializing. You can write through those pointers the same way you always do. In Rust it's UB as soon as you "produce" an invalid value, which includes references to uninitialized memory. Everything uses references in Rust, but when dealing with uninitialized memory you have to scrupulously avoid them, and instead write through raw pointers. This means you can't reuse any code that writes through &mut. Also, the rules change over time. At one point I had unsafe code that had a Vec of uninitialized elements, which was ok because I never produced a reference to any element until after I had written them (through raw pointers). But they later changed the Vec docs to say that's UB, I guess because they want to reserve the right to use references even if you never call a method that returns a reference.
This stopped being much of a problem when MaybeUninit was stabilized. Now you can stick to using &MaybeUninit<T> / &mut MaybeUninit<T> instead of needing to juggle *const T / *mut T and carefully track converting that to &T / &mut T only when it's known to be initialized, and you can't accidentally use a MaybeUninit<T> where you meant to use a T because the types are different.
It's not as painless as it could be though, because many of the MaybeUninit<T> -> T conversion fns are unstable. E.g. the code in TFA needs `&mut [MaybeUninit<T>] -> &mut [T]`, but `[MaybeUninit<T>]::assume_init_mut()` is unstable. But reimplementing them is just a matter of copying the libstd impl, which in turn is usually just a straightforward reinterpret-cast one-liner.
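For example, a local stand-in for that unstable slice conversion is just the reinterpret cast (a sketch of the kind of copy described above; the actual libstd source is what you'd crib from):

```rust
use std::mem::MaybeUninit;

/// Local stand-in for the unstable `[MaybeUninit<T>]::assume_init_mut()`.
///
/// SAFETY: the caller must guarantee that every element of `slice` has
/// already been initialized.
unsafe fn slice_assume_init_mut<T>(slice: &mut [MaybeUninit<T>]) -> &mut [T] {
    // MaybeUninit<T> is #[repr(transparent)] over T, so this is the same
    // reinterpret cast the libstd implementation performs.
    unsafe { &mut *(slice as *mut [MaybeUninit<T>] as *mut [T]) }
}
```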
I don’t get the difference. In both C and Rust you can have pointers to uninitialized memory. In both languages, you can’t use them except in very specific circumstances (which are AFAIK identical).
There are two actual differences in this regard: C pointers are more ergonomic than Rust pointers. And Rust has an additional feature called references, which enable a lot more aggressive compiler optimizations, but which have the restriction that you can’t have a reference to uninitialized memory.
I agree with you. My point is that the additional feature (references) creates a new potential for UB that doesn’t exist in C, and that justifies the “doesn't really ever occupy my mind as a problem” statement being criticized upthread. You can’t compare C to Rust-without-references because no one writes Rust that way. It’s not like C++-without-exceptions which is a legitimate subset that people use.
It's an open question whether creating a reference to an uninitialized value is instant UB, or only UB if that reference is misused (e.g. if copy_to_slice reads an uninitialized byte). The specific discussion is whether the language requires "recursive validity for references", which would mean constructing a reference to an invalid value is "language UB" (your program is not well specified and the compiler is allowed to "miscompile" it) rather than "library UB" (your program is well-specified, but functions you call might not expect an uninitialized buffer and trigger language UB). See the discussion here: https://github.com/rust-lang/unsafe-code-guidelines/issues/3...
Currently, the team is leaning in the direction of not requiring recursive validity for references. This would mean your code is not language UB as long as you can assume `set_len` and `copy_to_slice` never read from `data`. However, it's still considered library UB, as this assumption is not documented or specified anywhere and is not guaranteed -- changes to safe code in your program or in the standard library can turn this into language UB, so by doing something like this you're writing fragile code that gives up a lot of Rust's safety by design.
That's right. Line 3 is undefined behaviour because you are creating mutable references to the uninit spare capacity of the vec. copy_to_slice only works for writing to initialized slices. The proper way for your example to mess with the uninitialized memory of a vec would be to use only raw pointers, or to call the newly added Vec::spare_capacity_mut function on the vec, which returns a slice of MaybeUninit.
Yes, this is the case that I ran into as well. You have to zero memory before reading and/or maintain some crazy combination of tracking what's uninitialized capacity versus initialized len. I think the Rust stdlib Write impl for &mut Vec got butchered over this concern.
It’s strictly more complicated and slower than the obvious thing to do and only exists to satisfy the abstract machine.
No. The correct way to write that code is to use .spare_capacity_mut() to get a &mut [MaybeUninit<T>], then write your Ts into that using .write_copy_of_slice(), then .set_len(). And that will not be any slower (though obviously more complicated) than the original incorrect code.
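A minimal sketch of that shape (write_copy_of_slice may still be unstable on your toolchain, so this version writes element by element through MaybeUninit::write, which is stable; for the plain append case, extend_from_slice already does all of this for you):

```rust
use std::mem::MaybeUninit;

/// Append `src` to `dst` without zero-initializing dst's spare capacity first.
/// Purely illustrative - Vec::extend_from_slice is the real answer here.
fn append(dst: &mut Vec<u8>, src: &[u8]) {
    dst.reserve(src.len());
    let spare: &mut [MaybeUninit<u8>] = dst.spare_capacity_mut();

    // Write through MaybeUninit so no &mut u8 to uninitialized memory is
    // ever created.
    for (slot, &byte) in spare.iter_mut().zip(src) {
        slot.write(byte);
    }

    // SAFETY: the first src.len() elements of the spare capacity are now
    // initialized, so extending len by src.len() only exposes initialized bytes.
    unsafe { dst.set_len(dst.len() + src.len()) };
}
```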
As I wrote in https://news.ycombinator.com/item?id=44048391 , you have to get used to copying the libstd impl when working with MaybeUninit. For my code I put a "TODO(rustup)" comment on such copies, to remind myself to revisit them every time I update the Rust version in toolchain.toml
Valgrind doesn't tell you about UB, just whether the code did something incorrect with memory, and that depends on what the optimizer did if you wrote UB code. You'll need Miri to tell you if this kind of code is triggering UB; it works by interpreting the compiler's mid-level IR (MIR) and checking whether Rust's rules about safety are followed.
But that's precisely NOT the problem that exists in OP's code. It's a problem Valgrind will detect if and only if the optimizer does something weird to exploit the UB in the code, which may or may not happen, AND which doesn't even necessarily happen on that line of code, which will leave you scratching your head.
UB is weird and valgrind is not a tool for detecting UB. For that you want Miri or UBSAN. Valgrind’s equivalent is ASAN and MSAN which catch UB issues incidentally in some rare cases and not necessarily where the UB actually happened.
I suspect that the main reason it doesn't really occupy the author's mind is that even though it's possible to misuse read(), it's really not that hard to actually use it safely.
It sounds like the more difficult problem here has to do with explaining to the compiler that read() is not being used unsafely.
The reason this particular UB doesn't need mindspace for C programmers is because it's not even meaningful to do anything with the parts of the buffer beyond the written length.
Most other UBs relate to data that you think you can do something with.
This function now works with both initialized and uninitialized data in practice. It is also transparent over whether the output buffer is `u8` (a byte buffer to write out into a `File`) or `u16` (a buffer for then using the UTF-16). I've never had to think about whether this doesn't work (in this particular context; let's ignore any alignment concerns for writes into `out` in this example) and I don't recall running into any issues writing such code in a long long time.
If I write the equivalent code in Rust I may write
The problem is now obvious to me, but at least my intention is clear: "Come here! Give me your uninitialized arrays! I don't care!". But this is not the end of the problem, because writing this code is theoretically unsafe. If you have a `[u8]` slice for `out`, you have to convert it to `[MaybeUninit<u8>]`, but then the function could theoretically write uninitialized data, and that's UB, isn't it? So now I have to think about this problem and write this instead:
...and that will also be unsafe, because now I have to convert my actual `[MaybeUninit<u8>]` buffer (for file writes) to `[u8]` for calls to this API.
Long story short, this is a problem that occupies my mind when writing in Rust, but not in C. That doesn't mean that C's many unsafeties don't worry me, it just means that this _particular_ problem type described above doesn't come up as an issue in C code that I write.
That's a fair workaround for my specific example. But I believe it's possible to contrive a different example where such a solution would not be possible. Put differently, I only tried to convey the overall idea of what I think is a shortcoming in Rust at the moment.
Edit: Also, I believe your code would fail my second section, as the `convert` function would have difficulty accepting a `[u8]` slice. Converting `[u8]` to `[MaybeUninit<u8>]` is not safe per se.
Yeah, you’d need to do something like accept an enum that is either &mut [u8] or &mut [MaybeUninit<u8>], and make a couple of impl From<>’s so callers can .into() whatever they want to pass…
But I don’t think this is really a shortcoming, so much as a simple consequence of strong typing. If you want take “whatever” as a parameter, you have to spell out the types that satisfy it, whether it’s via a trait, or an enum with specific variants, etc. You don’t get to just cast things to void and hope for the best, and still call the result safe.
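A rough sketch of that enum approach (OutBuf and write_prefix are made-up names, just to show the shape):

```rust
use std::mem::MaybeUninit;

// Made-up type for illustration: one parameter type that callers can build
// from either an initialized or an uninitialized byte slice.
enum OutBuf<'a> {
    Init(&'a mut [u8]),
    Uninit(&'a mut [MaybeUninit<u8>]),
}

impl<'a> From<&'a mut [u8]> for OutBuf<'a> {
    fn from(s: &'a mut [u8]) -> Self {
        OutBuf::Init(s)
    }
}

impl<'a> From<&'a mut [MaybeUninit<u8>]> for OutBuf<'a> {
    fn from(s: &'a mut [MaybeUninit<u8>]) -> Self {
        OutBuf::Uninit(s)
    }
}

impl<'a> OutBuf<'a> {
    /// Copy `src` into the start of the buffer (panics if `src` is longer).
    /// In the Uninit case the writes go through MaybeUninit, so no reference
    /// to uninitialized memory is ever produced.
    fn write_prefix(&mut self, src: &[u8]) {
        match self {
            OutBuf::Init(out) => out[..src.len()].copy_from_slice(src),
            OutBuf::Uninit(out) => {
                for (slot, &b) in out[..src.len()].iter_mut().zip(src) {
                    slot.write(b);
                }
            }
        }
    }
}

// Callers pass either kind of slice via .into(), e.g. (hypothetical `convert`):
// convert(OutBuf::from(&mut initialized_buf[..]));
// convert(OutBuf::from(vec.spare_capacity_mut()));
```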
I really wish Apple would make a MacBook Air variant with display quality on par with the iPad Pro or MacBook Pro (ProMotion/120hz and XDR/HDR, at least). The screen quality is the only reason I currently use the Pro despite its chunkier weight, since the local compute/memory of the Air is already plenty for me (and most users).
The iPad Pro proves that weight and battery life is no excuse here for the lack of state-of-the-art display tech in the MacBook Air. And as for cost — the base 14” MacBook Pro M4 (at $1600) isn’t significantly more expensive than the 15” MacBook Air M4 configured with same CPU/RAM/SSD (at $1400).
It's really quite a shame that the iPad Pro hardware is in many ways a better MacBook Air than the MacBook Air, crippled primarily by iOS rather than by hardware.
I know Apple wants to differentiate ProMotion as a Pro feature, but even non-tech people I know are wondering why Android phones run smoother than iPhones. Stuff that would be completely unheard of purely because of how noticeable 60hz vs 120hz is.
Actual reputational damage is going on because of these poor decisions; I'm not surprised iPhones are struggling to obtain new market share. They just look like old and slow phones to most normal people now. "Look how nice and smooth it looks" is such an easy selling point compared to trying to pretend people care about whatever Apple Intelligence is.
> but even non-tech people I know are wondering why Android phones run smoother than iPhones. Stuff that would be completely unheard of purely because of how noticeable 60hz vs 120hz is.
Are they? I'm a tech person and I can barely notice it at all. And I don't think I have a single non-tech friend who is even aware of the concept of video refresh rate.
Whenever there's something that doesn't feel smooth about an interface, it's because the app/CPU isn't keeping up.
I've honestly never understood why anyone cares about more than 60Hz for screens, for general interfaces/scrolling.
(Unless it's about video game response time, but that's not about "running smoother".)
Yes, human visual perception exists along a spectrum of temporal, spatial, and chromatic resolution that varies from person to person — I’ve even met some people who can’t perceive the difference between 30hz and 120hz, while to me and most people I know, the difference between 60hz and 120hz is enormous.
So you could make the same argument against high DPI displays, superior peak screen brightness, enormously better contrast ratio, color gamut, etc. Also speaker quality, keyboard quality, trackpad quality, etc.
Where does this argument end? Do you propose we regress to 60hz 1080p displays with brightness, contrast, and viewing angles that are abysmal by modern standards? Or is the claim that the MacBook Air’s current screen is the perfect “sweet spot” beyond which >99% of people can’t tell the difference?
I think the market data alone disproves this pretty conclusively. Clearly a significant enough percentage of the population cares enough about image quality to vote with their wallets, so much so that enormous hardware industries continue to invest billions towards making incremental progress in advancing the technology here.
To be fair, I think there’s strong data to support that modern “retina”-grade DPI is good enough for >99% of people. And you can argue that XDR/HDR is not applicable/useful for coding or other tasks outside of photo/video viewing/editing (though for the latter it is enormously noticeable and not even remotely approaching human visual limits yet). But there’s plenty of people who find refresh rate differences extremely noticeable (usually up to at least 120hz), and I think almost anyone can easily notice moderate differences in contrast ratio and max brightness in a brightly lit room.
It’s not imagined though, I use my partner’s phone sometimes and every time I used it I thought it was broken because the UI jitter was so jarring at 60Hz. Actually I’m still not convinced her phone isn’t broken. Also her flashlight resets to the lowest brightness EVERY time it’s cycled.
If the UI jitter on their phone was "so jarring", it's not because it's 60 Hz. It's because the phone's CPU isn't keeping up.
Like, nobody watches a video filmed at 60 fps and then watches their favorite TV show or a motion picture at 24 fps and says "the jitter was so jarring". And that's at less than half the rate we're even talking about! Similarly, even if you can tell the difference between 60 and 120 Hz, it's not jarring. It's not jittery. It's pretty subtle, honestly. You can notice it if you're paying attention, but you'd never in a million years call it "jarring".
I think a lot of people might be confusing 60 Hz with jittery UX that has nothing to do with the display refresh rate. Just because the display operates at a higher refresh rate doesn't mean the CPU is actually refreshing the interface at that rate. And with certain apps or with whatever happening in the background, it isn't.
> Like, nobody watches a video filmed at 60 fps and then watches their favorite TV show or a motion picture at 24 fps and says "the jitter was so jarring". And that's at less than half the rate we're even talking about!
Those have motion blur.
> Similarly, even if you can tell the difference between 60 and 120 Hz
I don't know why you're phrasing this so oddly doubtful? Being able to tell the difference between 60hz and 120hz is hardly uncommon. It's quite a large difference, and this is quite well studied.
> If the UI jitter on their phone was "so jarring", it's not because it's 60 Hz. It's because the phone's CPU isn't keeping up.
No, it's not. This isn't about dropped frames or micro-stutters caused by the CPU. It's about _motion clarity_.
You can follow the objects moving around on the screen much better, and the perceived motion is much smoother because there is literally twice the information hitting your eyes.
You can make a simple experiment — just change your current monitor to 30hz and move the mouse around.
Does it _feel_ different? Is the motion less smooth?
It's not because your computer is suddenly struggling to hit half of the frames it was hitting before; it's because you have less _motion information_ hitting your eyes (and the increased input lag; but that's a separate conversation).
60->120fps is less noticeable than 30->60fps; but for many, many people it is absolutely very clearly noticeable.
> Like, nobody watches a video filmed at 60 fps and then watches their favorite TV show or a motion picture at 24 fps and says "the jitter was so jarring".
People absolutely complain about jitter in 24fps content on high-end displays with fast response times; it is especially noticeable in slow panning shots.
Google "oled 24fps stutter" to see people complaining about this.
It's literally why motion smoothing exists on TVs.
If you switch from 60hz to 30hz you absolutely notice. I wouldn’t think it’s wrong to say it is jarring.
30hz is still perfectly usable, but you constantly feel as if something is off. Like maybe you have a process running in the background eating all your CPU.
I imagine going from 120hz to 60hz is the same thing. It should be theoretically indistinguishable, but it’s noticeable.
That's bs. You will immediately notice the difference when going from let's say 120 hz down to 60 hz on a fast gaming pc even if you're just dragging windows around. Everything feels jarring to say the least compared to higher refresh rates and it has absolutely nothing to do with the CPU. It's because of the refresh rate.
It's same thing going from 120 hz to 60 hz on a phone while scrolling and swiping.
It's quite interesting though that there are people out there who won't notice the huge difference. But hey, at least they don't have to pay a premium for the increased performance of the screen.
It’s deeply flawed logic at best (or an intentional red herring at worst) to cite the existence of pseudoscience discussed elsewhere, as an argument against real science being discussed here.
There is a well-understood science to both auditory and visual perception, even more concretely so for the visual side. The scientific literature on human perception in both categories is actively used in the engineering of almost every modern (audible/visual) device you use every day (both in hardware design, and software such as the design of lossy compression algorithms). We have very precise scientific understanding of the limits (and individual variation) of human visual and (to a slightly lesser extent) auditory perception and preferences.
That’s why I specifically emphasized “perception and preferences”. Believe it or not, the science covers both - both what people can perceive, and what people care about and value.
It continues to amaze me years later how many people happily enjoyed watching 4:3 content stretched to 16:9, before 4:3 mostly disappeared from broadcasts.
If you try using a 60hz screen after a 120hz one, it will feel very sluggish and choppy. As long as you don't get used to 120hz, you'll be fine with 60hz.
I've never really felt this way, and have used all kinds of screens of various resolutions, sizes, technologies, etc. For 99% of the typical use cases (chats, email, doom scrolling, etc.) there just is not a big enough perceptible difference for most buyers.
Screen refresh rate arguments are starting to have hints of audiophile discussions.
I flatly will not buy any monitor, laptop, phone, tablet, or TV with a refresh rate below 120hz. I had 120hz 1080p over DVI-dual link in 2010. I can accept graphically demanding games going down to ~50 fps, but for UI interactivity and navigation, I'll take 120hz+ only.
I also (hopefully) don't have to interact with any UI while the movie is playing, but if I did, I'd want that UI running at 120hz. Maybe TV streamers will start advertising 120hz output soon. Maybe I should just replace my streamer with a spare PC that can output 120.
> Maybe TV streamers will start advertising 120hz output soon.
120 Hz won't make a difference on a TV box, imo, as the abysmal state of their UI is a far greater problem. A high refresh rate is nothing when a transition takes seconds and when scrolling is jittery even by 60 Hz standards. :(
That's actually not a bad recommendation if you want to keep your sanity as a tech enthusiast :). Otherwise you start noticing how much stuff still hasn't been upgraded to support high DPI and high refresh rates, and you can't go back.
The good news is that the human brain is amazing and will probably revert to reasonable perception if you use your non-retina 60Hz screen for long enough :)
Yeah, I think when they say non-tech people they mean a subset of people who know a bit about refresh rates (avid PC gamers, for instance), but I'd still say the vast majority of people cannot tell 60 from 120. That, or it's not something they think about.
Certainly if they had both side by side they may be able to notice a difference, but in everyday use it makes no real difference to the vast majority of people. Anecdotally, even though I use Android myself, everyone around me still thinks iPhones look the smoothest (albeit most of them have never even touched a quality phone running Android).
It's one of those things where once you have used it, you will notice it. Given most iOS users aren't swapping between pro and non pro models, it's not something you think about.
Just tried ProMotion vs. 60Hz on an MBP; there's no/very little difference I can see. Sure, maybe it's just me, but for me all the claims here are way exaggerated/psychological, almost like audiophiles being able to "hear" stuff that doesn't exist in a blind test.
It's baffling to me that some people claim to not see the difference. It's literally night and day to me. It's like someone looking at a low DPI screen and a high DPI screen and not being able to tell the difference.
Same. The suggestion that it’s like audiophiles totally missed the mark, because lots of audiophile claims do not stand up to double blind tests. I can guarantee that 60 vs 120 hz blind tests would be insanely easy to pass if there was window movement or scrolling or basically anything but static frames.
Are you sure the underlying application and the OS are even rendering 120Hz all the time? The panel being able to was enough to convince some people they're seeing "smooth scrolling" when it was actually 60Hz saving battery. That's the analogy to audiophiles.
As one of the upthread comments mentioned, this is something that probably varies with sensitivity between people.
But I am quite confident I'd be able to tell 60/120hz with a 100% accuracy within 5s of being able to interact with the device.
Probably under a second on an iPhone, ~2s on a Mac with a built-in display, and slightly longer on iPads and bigger displays. Add ~2 extra seconds if I'm using a mouse instead of a trackpad.
I'm generally ok with 60Hz (the difference isn't that significant to me). But I can definitely see the difference in a head-to-head comparison with fast moving content. The easiest way to see it for me was to move the cursor around quickly. With 60Hz there are much more visible "jumps" between positions. With 120Hz it animates much more smoothly.
In this case it really is just you. I can tell a high-refresh-rate display from across the room. I can tell if someone’s iPhone is a Pro even if the person is sitting five meters away from me on a moving bus.
On the other hand, my MacBook has a 120 Hz display and both my iPad Mini and iPhone Mini are 60 Hz, and even though the difference is night and day, I don’t really MIND using them. It’s just not that cool.
>Yeah I think when they say non-tech people they mean a subset of people who know a bit about refresh rates (example being avid PC gamers for instance)
no, he didn't say that. he said they comment on the difference between apple and android (their perception). you have to take that as a given.
that "it's because refresh rate" is his hypothesis, so yes argue that, but not by changing his evidence.
I switch between refresh rates ranging from 60hz and 240hz every day and while I certainly notice the difference, unless I’m running games I adjust and forget about it in seconds. While VRR 120hz+ on all Apple device screens would be nice it’s not exactly a dealbreaker… it’s not like rendering my IDE with 2x+ extra frames changes much of anything.
I run Windows daily at work on a 60Hz display. I recently got my son a gaming PC complete with a 144Hz monitor. I was genuinely confused why Windows itself “felt” so much better. Just dragging windows around seemed like magic. It’s not that the UI is lagging on my machine, it’s more the smoothness of things when they move around. It makes everything seem faster, despite us timing various things and finding no actual performance differences.
I'm seriously surprised you can't tell; it feels significantly smoother for me to see a high refresh rate display. 60Hz just looks sluggish/slow and wrong to me now. I had a side-by-side of the same monitor (was at a LAN) and was watching my friend play and couldn't understand why his game looked so laggy until I realised he had high refresh rate off. Turned on 144Hz and it was so much better.
That may have very little to do with refresh rate itself, and far more to do with the image processing and latency introduced by the monitor in different video modes.
On smartphones you interact with the UI in a more direct way, which probably makes the input latency even more obvious.
For me 120Hz is noticeable immediately when scrolling, though I also don’t find it important enough to warrant a higher price aside from gaming.
What I find more important is a high pixel density, though on phones that's less of an issue than with PC screens - I have yet to find one comparable to the ones in current iMacs.
It just feels more "fluid" and real, and then you get used to it and 60Hz feels jittery. I have an iPad Pro, and it's honestly made me consider going with an iPhone Pro (I still have just the non-pro model), although not quite yet. However, I notice a huge difference between scrolling on my phone and scrolling on my iPad.
It's the same thing as retina vs. the previous resolutions we had put up with. Yes, you don't need them for text, but once you get used to it for text you don't want to go back.
I actually call BS on the "not-being-able-to-tell".
I will give you that most people outside of this website's audience will not be able to _tell_ that it's because of the refresh rate.
But I am quite confident that if you take a 120hz iPhone out of its user's hands and turn on Low Power Mode, most users will immediately be able to tell that something _feels_ off.
> I actually call BS on the "not-being-able-to-tell".
I actually call BS on your BS.
I don't believe that people are standing with two phones in their hand - an Android and an iPhone - and comparing them the way that people here are suggesting. I don't think I have ever seen anyone do that IRL, and I don't believe anyone actually does it.
People go to the Apple Store to get their iPhone or to some other store to get their Android phone, because they are interested in either platform, and absolutely not thinking about hopping from one to the other based on some imperceptible screen-refresh 'smoothness'.
i used an android phone for a year with a 90 fps display. When I switched back to an iphone, it felt slow to me. i couldn't tell what the problem was, the brand new phone just felt sluggish. a year later when using my partners iphone pro, i realised that the sluggishness must be because of the refresh rate.
i think once you get used to 90 or 120 fps, then 60fps will just feel choppy. no need to compare them side by side.
Have never heard anyone in my life that isn’t an engineer comment on Pro Motion. Not even in an accidental sort of “hmmm why does my phone just feel faster” kind of way.
This is a feature that really only matters to the Hacker News crowd, and Apple is very aware of that. They invest their BOM into things the majority of people care about. And they do have the Pro Motion screens for the few that do.
Even I - an engineer - regularly move between my Pro Motion enabled iPhone and my regular 60Hz iPad, and while I notice it a little, I really just don't see why this is the one hill people choose to die on.
You have to understand that your own perceptual experience is not identical to that of all other humans. Without recognizing that, we will inevitably end up talking past each other endlessly and writing each other off as { hallucinating, lying, exaggerating, etc } for one of us claiming to perceive something important that the other does not.
It would be no different than arguing about whether we need all three primary colors (red, green, blue) with someone who is colorblind (and unaware of this). Or like arguing whether speakers benefit from being able to reproduce a certain frequency, with someone who is partially or fully deaf at that frequency. And I truly mean no disrespect to anyone with different perception abilities in these or any other domains.
Recognizing that large differences exist here is essential to make sense of the reality - that something that seems completely unimportant or barely noticeable to you, could actually be a hugely obvious and important difference to many others (whether it’s a certain screen refresh rate, the presence of a primary color you cannot perceive but others can, an audio frequency you cannot hear but others can, or otherwise).
This is why I led with this part, unrelated to my own perception:
> I have never heard anyone in my life who isn't an engineer comment on ProMotion. Not even in an accidental sort of "hmmm, why does my phone just feel faster" kind of way.
I would also argue the crowd that insists everyone needs ProMotion is doing exactly what you accuse me of -- assuming their needs and perception must also be everyone else's. When clearly the market has said otherwise, given Apple's success for many, many years with 60Hz screens.
> I would also argue the crowd that insists everyone needs ProMotion is doing exactly what you accuse me of -- assuming their needs and perception must also be everyone else's.
I am not seeing this alleged crowd of people insisting that everyone needs 120Hz/ProMotion. This seems to be a red herring.
I am seeing a crowd of people (including myself) saying that we experience 120Hz/ProMotion as a huge improvement over 60Hz, so much so that we will never buy a product without it again (so long as we have the choice).
I furthermore claim that while not everyone is a member of this crowd (obviously), it represents a sufficiently large share of the device-buying population to justify steering billions of dollars of hardware and software industry toward supporting it, which evidently has happened and increasingly continues to happen.
If this crowd were an insignificant minority, as you seem to imply, then 120Hz displays would be a fad fading away in all but the most niche markets (e.g. pro gaming). Yet we're seeing precisely the opposite: 120Hz displays are growing in popularity, expanding broadly into mainstream consumer devices everywhere, from laptops to tablets to phones.
> When clearly the market has said otherwise, given Apple's success for many, many years with 60Hz screens.
Arguing that the market doesn't want/need it now because Apple succeeded without it in the past is completely absurd — just as nonsensical as arguing that computers don't ever need more memory because they sold just fine with less in the past.
Well, I guess if you don't see it, it doesn't exist.
Apple sells ProMotion displays. If it matters to you, you can buy them. They aren't refusing to serve this market; they just don't prioritize it in their lower-cost products.
120Hz on Snapdragon/MediaTek Android phones works great with little impact on battery life. Pixels are hobbled by the poor power efficiency of their Tensor chips.
Genuine question: why would you do that, lol?
Phones easily get full-day battery life nowadays, and flagships get two-day battery life if your usage is anything but insane.
> I'm not surprised iPhones are struggling to obtain new market share
Apple has >80% of the total operating profit in the smartphone market. The new entry-level phone went up in price by $200. Why do you think they do/should care about market share?
Their stock price is currently suffering because of how poorly iPhones are doing in places with real competition (i.e. China); put side by side with Chinese phones, the iPhone 16 just looks like a cheap knock-off. Even 90Hz would fix most of these problems (and the panel is more than capable of it).
They're fixing it on the iPhone 17 for the above reasons, but it shows how badly their market research teams are doing that they even remotely thought it was acceptable on the 15, let alone the 16.
> People who have been left behind by Apple's push towards phablets
It's my impression that Apple really tried to serve this market - the last model was probably the iPhone 13 mini. I assume there just isn't enough demand for smaller phones to justify the effort to develop them.
I was honestly hoping that we'd get a small phone as the iPhone SE 4, but it seems that's not to be - at least not if the 16e is the closest we'll get to an SE in the near term.
Yup, I bought a 13 mini and was happy that Apple was one of the companies that supported this form factor. That being said, the 13 mini sales numbers speak for themselves, and I understand why this kind of phone isn't released every year. I'm holding out hope that Apple recognizes that most users of the 13 mini aren't serial upgraders and will continue refreshing the segment every 5 years or so.
> I'm holding out hope that Apple recognizes that most users of the 13 mini aren't serial upgraders and will continue refreshing the segment every 5 years or so
I loved my iPhone 13 mini for the three years it was my daily driver. But yeah, the mini line is probably dead.
Yeah, I'm holding out hope that they've decided to just refresh the small form factor on a slower cadence. I also have a 13 mini; we'll see how long I can hold out.
I was curious about the SE4 since I had an SE2, and Verizon let me trade in the SE2 for the SE3 for free. Based on the rumors of what the SE4 was going to be, we did get an SE4; it was just rebranded as the 16e. The rumor was they were going to get rid of the button and go with the more recent iPhone style and such. I wonder if they will rebrand the Apple Watch SE as an Apple Watch 10e or something along those lines.
Unfortunately, the 12 and 13 mini were badly timed, launching while stores were closed for COVID. Actually holding one of them and using it is really what sells the smaller size, IMO.
I have my 12 mini still but it’s showing its age. Probably have to suck it up and get a big phone next upgrade.
Where do they go? Apart from random Chinese vendors like Unihertz, which sell low-spec devices where you're lucky to get one version update, the smallest Android phones I've seen are Samsung Galaxy phones, which are about the same size as an iPhone 16. Asus and Sony used to make small phones, but they've stopped in the last couple of years in favor of making phablets.
Interesting, although in my head I'd class that the same way as the folding screens: iPhones that don't have the dimensions you want, one way or another.
I have an Android phone. I could afford an iPhone, I don't care about folding screens, and my laptop is a MacBook Pro. I have an Android phone (1) because a substantial fraction of what I do on my phone is browsing the web, and Android lets me run Firefox which has markedly better ad-blocking, (2) because the phone I had before this one was an Android phone (mostly for reason 1, as it happens, but that's not particularly important here) and switching is inconvenient, and (3) because one of the reasons why I could easily afford an iPhone if I wanted one is that I have always preferred not to spend money for which I don't get substantial benefit, and it doesn't look to me as if iPhones are so much better than Android phones as to be worth paying a premium for.
(This may be partly because I'm not in the US; my impression is that "people who can afford iPhones buy iPhones, so if you don't you're impoverished or weird" is much more a thing in the US than in Europe.)
(I also thought "struggling to obtain new market share" was a weird take, and ditto "just look like old and slow phones". I am not disagreeing with that part of what you posted.)
* People who think a phone is a boring generic device, and it doesn't make sense to prefer any particular brand or pay more than $X.
* People who are used to Android and have better things to do than migrating to another ecosystem.
In the past, the lack of proper dual-SIM iPhones was a common enough reason to prefer Android. But it's less of an issue today, as eSIMs have become mainstream.
In a highly competitive environment, everyone wants to show off their blue, upper-middle-class bubble.
I think it's sad that something kids can't control becomes such a social-anxiety-inducing thing, forcing parents into buying something they might not be able to afford.
Luckily, where I'm from, we don't use the SMS app to communicate.
I’ve considered trying an ultralight PC laptop with a superior screen. But the sad state of reality is that:
(1) Windows these days feels like a constant battle against forcibly installed adware / malware.
(2) Linux would be great, but getting basic laptop essentials like reliable sleep/wake and power management to work even remotely well in Linux continues to be a painful losing battle.
(3) Apple’s M-series chips’ performance and efficiency are still generations ahead of anyone else’s in the context of portable, battery-powered, fanless work; nobody else has yet come close to matching Apple here, though there is hope Qualcomm will deliver more competition soon (if the silicon’s raw potential is not squandered by Microsoft).
Just because Apple’s competition has been complacent and lagging for many years doesn’t make feedback to Apple about what professional laptop users would like irrelevant.
> (2) Linux would be great, but getting basic laptop essentials like reliable sleep/wake and power management to work even remotely well in Linux continues to be a painful losing battle.
This comment shows up in every single thread about Linux laptops, but my ThinkPad X1 Nano Gen 1 with an Intel i5, running Arch Linux with KDE Plasma, had this issue solved out of the box when I purchased it in 2021. The only thing that didn’t work was the 5G modem, but I believe that has been implemented now. Surely, four years later, we can agree that the complaint is outdated, right?
You don't buy a PC and try to run macOS on it, do you? Then why do people keep buying random laptops and then complaining when Linux doesn't run on it? You buy a laptop from a vendor who designs them to run Linux out of the box.
Also, Apple's power management isn't flawless either. It used to be fantastic, but I've never, ever seen another laptop that has to charge for 15 minutes before you can even boot it from a flat battery. This seems to happen if I leave my laptop powered off for more than a few days. Like, turned completely off, not sleeping with the lid shut.
> Then why do people keep buying random laptops and then complaining when Linux doesn't run on it? You buy a laptop from a vendor who designs them to run Linux out of the box.
Because:
(1) Laptop models designed to run Linux out of the box are scarce, with very few options to choose from.
(2) Of the few that do exist, I’ve never seen any even remotely close to being competitive with Apple’s laptops (in terms of hardware quality, and good performance with excellent power efficiency / fanless / thermals / battery life).
Part of that is due to the so-far-unmatched superiority of Apple’s M-series chips. But the rest, I assume, comes from less R&D investment generally in the Linux laptop space, due to it being such a small niche, unfortunately.
Because some people would pay the same price (or even more) as a MacBook Pro to have a great screen in a thinner, lighter laptop that shouldn't cost Apple that much more to make.
Like how the MacBook Air was originally a premium-priced product instead of an entry-level product in Apple's lineup.
How about because it's ridiculous that a $2200 laptop cannot correctly show photos taken by the company's own $600 phone? People mentioned being stuck at 60Hz, but it's also one of the few remaining non-XDR displays that Apple offers.
I wish for that machine too, and the price delta between the Macs is why I expect it will never happen. And unfortunately, I'd rather spend the extra bucks than go back to 60Hz.
Apple seems quite content with making 120Hz a feature of "Pro" models across the line (iPads, iPhones, Macs).
As others have said, they do this on purpose. It's the same with memory. I'd probably switch from a Pro to an Air if I could get 64GB of RAM (for LLM work), but they'd rather charge me $4800 instead of ~$3200 (guessing the price, given the top-end 32GB Air is $2800).
It's frustrating because I'd prefer a lighter device. In fact, even the Air isn't that light compared to its competition.
I'd happily pay +$500 ($5300) for a MacBook Air Pro if it had effectively the same specs as the MacBook Pro but were 1.5 lbs lighter.
I have absolutely no problem paying a premium for an upgraded display. The problem is that Apple does not offer that option for the MacBook Air.
The MacBook Pro has an amazing screen, which is why I bought the MBP. But the MBP trades increased weight (which I don’t want) for more performance (that I simply don’t need). And we know this trade-off is not needed to host a better display, as evidenced by the existence of the iPad Pro.
Don’t get me wrong, the MacBook Pro is a fantastic product and I don’t regret buying it. It just feels like a huge missed opportunity on Apple’s part that their only ultra-lightweight laptop is so far behind in display tech vs. their other non-laptop products (like the iPad Pro, which is lighter still, just crippled by iOS limitations).
I would gladly pay even more than the price of my MacBook Pro for a MacBook Air with a screen on par with the iPad Pro or MacBook Pro. Or even for an iPad Pro that runs macOS!
A Pro will still be a good 2.5x the speed of the Air due to memory bandwidth. It would be rather silly to spring for that amount of memory for that purpose; anything bigger than, say, a 14B-parameter model will be painful.
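To make the bandwidth argument concrete, here is a rough back-of-envelope sketch. Local LLM decoding is largely memory-bandwidth bound, so tokens/sec scales roughly with bandwidth divided by model size in memory. The bandwidth figures and the helper below are illustrative assumptions, not measured or official numbers.

```python
# Rough rule of thumb: decode tokens/sec ~= memory bandwidth / model size in memory.
# The bandwidth figures below are placeholder assumptions for an Air-class vs.
# a Pro/Max-class chip, not Apple's specs.

def est_tokens_per_sec(bandwidth_gb_s: float, params_billion: float,
                       bytes_per_param: float = 0.5) -> float:
    """Estimate decode speed for a ~4-bit quantized model (0.5 bytes/param)."""
    model_gb = params_billion * bytes_per_param
    return bandwidth_gb_s / model_gb

for chip, bw in [("Air-class, ~120 GB/s", 120), ("Pro/Max-class, ~300 GB/s", 300)]:
    for params in (14, 70):
        print(f"{chip}: {params}B model -> ~{est_tokens_per_sec(bw, params):.0f} tok/s")
```

Under those assumptions the ratio between the two chips is about 2.5x regardless of model size, and a large model on the lower-bandwidth chip lands in the single-digit tokens-per-second range, which is roughly what "painful" means here.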
It's actually quite crazy that we need to get those bulky Pro models just to get basics like better screens and more memory. The performance between the Air and the Pro is pretty much the same anyway, except for long-running tasks, where the Pro benefits from active cooling.
Wonder if we are going to see some changes here with the upcoming M5 models.
I don't think they really need to change much. They could keep the base model with the same specs but let me configure it with a better display. I can currently spec up a Mac mini without any problem.
The second option is to bring back the MacBook brand for entry level devices and use the Air brand for "Pro" devices that don't require active cooling.
Hell, I would be happy if Apple at least enabled the virtualization instructions that are already available in the Mx chips inside the iPads, and allowed something like UTM in the App Store with Hypervisor support. It would be a good differentiator between the cheaper iPads running Ax chips and the more expensive iPads running Mx.
Considering the powerful hardware, the form factor, and the good keyboard (I have a used Apple Magic Keyboard paired with our iPad Air M2), it would be great if I could virtualize an actual Linux distro to get some work done on the iPad. But no, you are restricted to a crippled version of UTM that can't even run JIT and is really slow because of that.
You’re right that it’s not inherently unique to EVs, but it started with EVs, and now this dangerously fragile design (having a single monolithic computer console handle display and control of everything from critical drive modes and gauge display to non-critical things like music and fart-noise jokes) is infecting ICE cars too (e.g. BMW’s new touchscreen AC controls and unified touchscreen dashboards rolling out to all new cars, Audi doing something similar now, etc. — all following Tesla, but with crappier software).
I’ve owned and driven EVs from several brands. Prior to this, I could pretty much always expect the following from my car:
1. The drivetrain always operates normally and safely (aside from some actual mechanical failure) with no computer glitches.
2. I can always see my speed and gear-selector state on a dashboard somewhere, even when (not if) the infotainment screen crashes and reboots. I’ve had 2010-2020ish-era Lexus, Audi, and other cars have infotainment glitches, crashes, and reboots, but the speedometer, drivetrain, and AC all had physical controls running on isolated systems, so they always continued to work through a reboot or glitch of the infotainment.
3. The AC is always operating (aside from some actual mechanical failure) with no computer glitches or lag in my ability to control it. I consider this a critical safety system, given that many people drive in climates with weather that can be dangerously hot or cold.
In pretty much every EV I’ve owned, none of these have been true except maybe #1, and it’s pretty sad that the only thing that hasn’t happened to me is my entire car’s wheels locking up on the highway (and even that is reported to have happened for several EV brands; Tesla, Audi, and Porsche at least come to mind from stories I’ve read).
It’s insane to me that it’s even possible for the car’s computers rebooting to entail the AC shutting down, not being able to see your speed, etc. If this EVER happens, the entire vehicle line should legally require a recall until it’s guaranteed it won’t happen again. We have ways of guaranteeing, to extremely high probability, that computer systems don’t fail like this — car companies just don’t do it because it’s expensive and more complex than throwing all the same crappy software into one single system, rather than designing multiple isolated, fault-tolerant systems.
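As a toy illustration of the isolation being argued for (purely hypothetical, in no way any carmaker's actual architecture): the critical cluster display runs in its own process fed directly from sensor data, and a watchdog only ever restarts the non-critical infotainment process, so a crash or hang there can never take the speedometer down with it.

```python
# Toy sketch of process isolation: a crash/hang in "infotainment" never
# affects the critical speed display; a watchdog restarts only infotainment.
# All names and timings here are made up for illustration.
import multiprocessing as mp
import time

def critical_display(speed_q) -> None:
    # Critical loop: render speed/gear from an isolated data feed (simulated).
    while True:
        speed = speed_q.get()
        print(f"[cluster] speed: {speed} km/h")

def infotainment(heartbeat) -> None:
    # Non-critical loop: maps, music, etc. May hang or crash at any time.
    while True:
        heartbeat.value = time.monotonic()
        time.sleep(0.5)

if __name__ == "__main__":
    speed_q = mp.Queue()
    heartbeat = mp.Value("d", time.monotonic())
    mp.Process(target=critical_display, args=(speed_q,), daemon=True).start()
    info = mp.Process(target=infotainment, args=(heartbeat,), daemon=True)
    info.start()
    while True:
        speed_q.put(100)  # stand-in for a real, isolated sensor feed
        if time.monotonic() - heartbeat.value > 2.0:  # watchdog timeout
            info.terminate()
            info = mp.Process(target=infotainment, args=(heartbeat,), daemon=True)
            info.start()
        time.sleep(1.0)
```

In a real vehicle this separation would be separate ECUs or hypervisor partitions rather than OS processes, but the principle is the same: the critical path never depends on the non-critical one.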
A less horrible but still shockingly bad regression is how almost all modern cars' AC is controlled through an often laggy computer system (not to mention the almost universally despised move of AC controls to touchscreens instead of physical controls). Maybe not so laggy on a Tesla, but in my experience both BMW and Audi have AC-control touchscreens which usually respond, but occasionally have random 1-10 second lags before anything happens, presumably due to garbage-collector pauses or something. This is also a mild safety issue, since the lack of predictable behavior from common controls makes it very distracting to do something as common and simple as adjusting the temperature, which should be as easy as pressing a physical button or turning a knob.
Do they think Copilot/ChatGPT are usernames of contributor accounts which can be banned from a project/repo?
It seems the underlying issue here is bad code submitted ultimately by human contributors. Consistently thorough code review and testing will always be necessary.
How do code review and testing address the licensing concerns identified by the project? If you read their policy (which is two paragraphs, one of which is quoted directly in the article), it's pretty clear that the quality of the code is beside the point.
I’m not claiming that quality or licensing concerns are irrelevant, but that:
1. It will not be possible to reliably detect whether code was LLM-assisted or not.
2. Humans are not always 100% truthful.
3. All the concerns cited here also apply to human-written code anyway.
So attempting to treat LLM coding assist tools as a special case here is going to be a losing battle. To solve these issues, we’re gonna have to come up with code review processes and tools that apply to ALL code up for review.
Those same quibbles applied to the policy before the addition of the LLM section: how does the NetBSD project detect if I copy & paste a bunch of code from my day job into a patch submission (and then lie about it)? Obviously, they can't. I, personally, don't feel like it's a failure of the policy if it relies on your contributors acting in good faith, because:
a) many people are acting in good faith, and their behavior will change as a result of this policy;
b) if someone wants to be a jerk and use an LLM after they were told not to, and is at some later time found out, it makes it easier for the org to act quickly and in a fair and consistent manner;
c) [more speculative as to the motives of the NetBSD project] normative statements by well-regarded institutions are useful in setting an example for other organizations to follow, so there is some political utility regardless of the practical efficacy of these rules.
I think this is a misinterpretation of what has been debunked. Low-fat diets are definitely healthier for more people because they almost always correlate with lower calorie intake. It’s not the mere fact of something being a fat that makes it bad (this is what recent studies revisiting fats are saying); it’s other qualities of fats that are problematic. For example, consider equal calorie intakes from a protein source (like a steak) and from a fat. Which do you think is going to fill you up more? Which do you think is healthier?
Fat is calorically dense. If you're trying to lose weight, you need to be careful about eating calorically dense foods. And if you want to eat more protein to maintain muscle while losing weight, there is a zero-sum trade-off between fat and protein.
Interesting archaic theory, as my wife and I just lost 40 lbs each by eating a ton of fat and protein, haha. If you want to lose weight, stay away from sugar and other carbs.
High-fat, high-protein (with low carbs) can definitely be successful, as long as you’re mindful of your approach and manage net calorie intake. But for lots of people it is easier to just eat what they already do and make minor tweaks to reduce the amount of fat, as a way of cutting calories while still being satisfied by what they’re eating. Different ways for different people.
Weight gain/loss depends on calories, so you can eat anything to either gain or lose weight as long as it's in the proper quantity (although if you want to gain muscle, then you also need enough protein).
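For a rough sense of the arithmetic, here is a back-of-envelope sketch using the common ~7700 kcal per kilogram of body fat rule of thumb. It is only an approximation and ignores water weight and metabolic adaptation.

```python
# Back-of-envelope energy balance, using the rough ~7700 kcal/kg-of-fat
# rule of thumb. Real-world results vary (water weight, adaptation, etc.).
KCAL_PER_KG_FAT = 7700  # approximate

def weekly_weight_change_kg(intake_kcal_per_day: float,
                            expenditure_kcal_per_day: float) -> float:
    daily_balance = intake_kcal_per_day - expenditure_kcal_per_day
    return (daily_balance * 7) / KCAL_PER_KG_FAT

# Example: eating 500 kcal/day under maintenance -> roughly -0.45 kg/week.
print(f"{weekly_weight_change_kg(2000, 2500):+.2f} kg/week")
```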