threatripper's comments | Hacker News


That would only confuse potential buyers. You have to design everyday products for non-technical people.


Not only that, it doesn’t stop unscrupulous manufacturers from just printing whatever they want.


How could a max speed rating possibly be worse than a blank plug end?


This reminds me of the empty box in a corner of the workshop with the handwritten note "Don't remove!".


Won't these super old kernels basically turn into forks after a while, maintained and even extended for special purposes?


From a developer's perspective, yes.

From a user's perspective, they just keep working.


I get like 3 hours on my MBP when I'm actually using it. MacBooks only have better battery life when they're mostly idle, not when you fully load them.


Can confirm: when developing software (a big project at $JOB), getting 3h out of an M3 MBP is a good day. IDE, build, test, and CrowdStrike are all quite power hungry.


I wonder how much of that is CrowdStrike. At $LASTJOB my Mac was constantly chugging due to some mandated security software. Battery life on that computer was always horrible compared to a personal MB w/o it.


Exactly. Antivirus software is evil in this sense - it cripples battery life significantly.

Wherever possible, I send “pkill -STOP” to all those processes to stall them and save battery…
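
For the curious, here's a minimal sketch of the same trick in Python (assuming a POSIX system; the process names below are hypothetical placeholders, not real agent names). SIGSTOP freezes the processes, SIGCONT resumes them:

    import os
    import signal
    import subprocess

    # Hypothetical names of power-hungry agents; adjust for your machine.
    NOISY = ["noisy_edr_agent", "av_scanner"]

    def send(sig):
        for name in NOISY:
            # pgrep -x lists PIDs whose process name matches exactly.
            out = subprocess.run(["pgrep", "-x", name],
                                 capture_output=True, text=True).stdout
            for pid in out.split():
                os.kill(int(pid), sig)

    send(signal.SIGSTOP)  # stall them, like `pkill -STOP`
    # ...work on battery for a while...
    send(signal.SIGCONT)  # let them run again, like `pkill -CONT`

Real EDR agents often resist being stopped (or get restarted by a watchdog), so treat this as illustrative.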


The firewall on that computer killed the battery (with repeated crashing). It also refused to work with a USB Ethernet adapter, so I could only use wifi. It was clearly a product meant to check a security box, written by a company that knew nothing about Macs and bought by enterprise Windows admins. It was incredibly frustrating. (The next version of macOS moved firewalls out of the kernel and into system extensions. I like to think it was my repeated crash logs that made the difference.)

I half wonder if that’s part of the issue with Windows PCs and their battery life. The OS requires so much extra monitoring just to protect itself that it ends up hurting performance and battery life significantly. It wouldn’t surprise me if this alone accounted for much of the performance edge Macs have over Windows laptops.


> CrowdStrike

It is incredible that CrowdStrike is still operating as a business.

It is also hard to understand why companies continue to deploy shoddy, malware-like "security" software that decreases reliability while increasing the attack surface.

Basically you need another laptop just to run the "security" software.


Allegedly, CrowdStrike is S-tier EDR. Can’t blame security folks for wanting it. The performance and battery tax is very real, though.


Ever since CrowdStrike fucked up and took out $10 billion worth of Windows PCs with a bad patch, most of the security folks I know have come around to the view that it is an overall liability. Something lighter-touch carries less risk, even if it isn't quite as effective.


There are a few different reasons:

- it's pushed by gov (it gives full access to machines, a huge backdoor)

- it's not actually the worst of its kind, sadly

- their threat database is good (i.e. it will catch stuff)

- it lets you look at everything on the machine (not the only one that does, but it's definitely useful)

- it's big, so you can't be faulted for "we had it and we got pwned" - yep, sad as well

If operating systems weren't as poop as they are today, this would not be necessary - but here we are. And I bet major OS manufacturers will not really fix their OSes without turning them into fully walled gardens (terrible for devs... but you'll probably just run a Linux VM for dev on top). Bad intentions lead to bad software.


I concur.

The only portable M-series device I heavily used on the go was my iPad Pro.

That thing could survive for over a week if unused or only lightly used. But as soon as you opened Lightroom to process photos, the battery would melt away in an hour or two.


If we assume Intel succeeds with 18A for their x86 processors, would they even have the money to finance the node after that? And the node after that, which gets exponentially more expensive?

In the past, x86 raked in enough money to burn a lot of it on new fab tech, but non-x86 has grown immensely and now floods TSMC with money. The problem for Intel is that their fab tech was fitted to their processor architecture and vice versa. That made sense in the past, but in the future it might not. For the processor business it may be better to use TSMC for production. For the fab it may be necessary to manufacture for many customers and charge a premium for being based in a country that needs domestic capacity. So a split-up may be inevitable, and fabbing a competitive ARM chip surely helps attract more customers - customers who may pay a premium for political and security reasons.


Apple, Nvidia, and the US govt can provide the required funds if they have confidence in Intel's ability to deliver. These companies would benefit from breaking TSMC's current monopoly.


Samsung is already in a much better position for this. They have external customers and experience serving them, unlike Intel, whose track record doesn't inspire confidence at all.


Intel has something Samsung doesn't: it's a US company operating mostly on US soil, so the US government has a vested interest in keeping this strategic asset going for as long as possible.


Tech hardware is a cutthroat business; tech companies are gonna order from Intel only if it has something others don't from a business point of view: better performance, lower cost, faster delivery.

The US government can wish and encourage all it wants; as long as Samsung, TSMC, or anyone else produces better chips for less, the money will flow there.


Governments can keep companies going for as long as they want. Usually that makes them less competitive over time, though, and it is all done at the cost of the taxpayer and adjacent industries.

Korea's chaebol model is a way to spin this while avoiding the loss of competitiveness: companies are forced to compete internationally while the domestic market stays locked into the chaebol offering.

For example, the US gov could force (or subsidize) all datacenters in the US to use Intel chips made in Intel foundries located in the US. But on the international market Intel would need to compete with its rivals.

This is all theoretically possible but very hard to pull off politically. And it is not necessarily good for the country long term, and it's certainly a tax on the country's citizens and adjacent companies in the short term.


If a government finds a sector or company to be of strategic importance, it will not let it die. The rest is free-market absolutism that never comes to pass. I believe the US today, more than ever, considers Intel to be of strategic importance.

> the money will flow there

Which money? The CHIPS Act [0] isn't only for the ones who produce "better chips for less".

[0] https://en.wikipedia.org/wiki/CHIPS_and_Science_Act


The fact that US taxpayers will subsidize Intel does not mean that Nvidia, Google, AMD, etc. are gonna order their chips there.


A little subsidy will not do it. We're talking about at least 100, 200, 400, 800 billion dollars over the next process generations. If it's government money, then maybe 2x-10x that to get the work done.


> Apple, Nvidia and US govt can provide the required funds if they have confidence in its ability to deliver.

Given Apple's history with Intel's ability to deliver, I'm guessing the confidence there isn't high.


Are you referring to 5G radio modems or another chip?


Probably Intel’s fumble back in 2006, when Apple asked them for better performance per watt in laptop CPUs and offered them the iPhone CPU business.


A more recent motivation might be Apple's switch to in-house ARM silicon for macOS, for similar reasons.


Well, they’re already funding so much custom ARM design that tweaking and scaling it for their laptops is only a small incremental step.


Probably the Intel CPUs in MacBooks before Apple made the push for the M1 - circa the Intel quad-core era when their laptop chips had major heat issues... ~2012 IIRC?


I’m not defending Intel here, but those Intel MacBooks never had appropriate thermal design or headroom for the processor’s operating specs.


I think the theory is that they had an appropriate thermal design for cpus which were supposed to ship but never did.


I wouldn't count on either to save Intel as it currently is (i.e. with the fab business still attached to the CPU/GPU business). While it's true that having Intel fabs as a second source would be nice to alleviate the dependency on TSMC, both are also competing with Intel on the CPU/GPU side.

My guess is they're gonna let Intel rot a little further while doing their best to pressure Intel into splitting off its fab biz (as AMD did back then), and then invest just in the fab.


> Apple, Nvidia and US govt can provide the required funds

That the first thought about funding is to go to big corporations and the government instead of going to investors says a lot about how the economy works nowadays.

I love that the Orange guy has opened the door to the nationalization of big tech. I hope the next president is bolder in this regard. If all these companies depend on monopolies to exist, they should be state owned/controlled.


Yep, that's exactly what they did with TSMC. Foundries don't just build massive production lines and hope someone will use them, even TSMC.


Yeah, everyone is focused on TSMC as the company with the secret sauce, but really it’s Apple. Whichever foundry Apple goes with gets the majority of leading-edge transistor volume.


Amazon and Google probably as well?


In the past, AMD needed to survive for antitrust reasons. Now x86 is losing relevance as alternatives become established. Nobody needs to keep Intel alive.


AMD also received many Hail Marys as a result of Intel’s anticompetitive behavior: directly via the payouts Intel and its partners had to make, and indirectly via companies being more willing to work with AMD for their GPU expertise and to offer better (out of desperation) licensing/purchase agreements.

Intel can’t rely on the same. They haven’t been directly harmed by a larger company, they rely too much on a single technology that’s slowly fading from the spotlight, and they can’t compete against AMD on price.

Maybe if they ended up in a small and lean desperation position they could pivot and survive, but their current business model is ultimately a losing one.


AMD could not afford their own foundries anymore, and the same is likely to happen to Intel. The CPU business may be sold off to some other company, so x86 and Intel will "survive" for sure, but they will rely on other fabs for production and milk the legacy cow instead of holding the overall performance crown.


Did you completely ignore the last paragraph?

As I said, AMD survived by going into a lean pivot out of desperation. Intel has that opportunity as well, but the deck is stacked against them due to their size and over-reliance on specific IPs.


Which alternatives? Other than Apple, where can I get a non-x86 desktop?


1. Desktop market share keeps shrinking.

2. https://system76.com/desktops/thelio-astra-a1.1-n1/configure

3. Nvidia N1x is not yet for sale, but benchmarks are promising.


1) Shrinking compared to what? The moment you want to do any serious work or gaming, you need a desktop (or a laptop, but a real PC in any case).

2) Ok, so there is an expensive workstation available. It is a step forward, I guess.

3) Call me when it is available and I can buy it in any normal computer shop.

Look, I hate the x86 architecture with a passion, having grown up with MS-DOS and the horrors of real mode. But the truth is that if I want to buy a computer right now, today, it is either an x86 PC or an Apple, and I have zero interest in Apple's closed ecosystem, so a PC it is.


1) Shrinking compared to mobile, GPU/AI, and non-x86 servers. Manufacturing only x86 CPUs will not pay for the next fab anymore.

3) PM me your number.

The fact is that x86 CPUs are getting less and less important in the overall picture. The volume isn't shrinking in absolute terms, only in relative terms, while fabs get exponentially more expensive. Even if Intel makes it through this generation, they can't finance the next one making only their own CPUs. They missed the big growth markets. They lost Apple as a customer. They have no GPU and no mobile chip that matters. They invested billions and billions into technology that went nowhere. They already have to ask TSMC to manufacture their competitive CPUs. The only way this changes is if Apple and/or Nvidia massively start buying future capacity at Intel. But why would they?


Does the Nvidia DGX Spark qualify as a desktop?


Technically yes, but I don't see the average person getting one. Much like the Raptor Talos, it is a very niche product.


Nobody really knows if 18A is a failure or if it was turned into one by deliberate mismanagement. It feels like when Microsoft took over Nokia.


Must be a force of habit.


Your model keeps the weights in slow memory and has to touch all of them to produce 1 token for you. By batching, the big system makes 64 tokens for 64 users in one go. And it uses dozens of GPUs in parallel to make 1024 tokens in the time your system makes 1. So even though the big system costs more, it is much more efficient when used by many users in parallel. Also, by running the parts of the neural net through many fast GPUs in series, it produces output much faster for each user than your local system can. You can't beat that.
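
To make the arithmetic concrete, here is a back-of-the-envelope sketch in Python. Every number is an illustrative assumption (a hypothetical 8B-parameter model at fp16, ~100 GB/s of local memory bandwidth, ~3 TB/s of HBM bandwidth on a datacenter GPU), and it ignores compute, KV-cache traffic, and multi-GPU pipelining. The point is just that each decode step streams the full weights once, no matter how many sequences are in the batch:

    # Illustrative model of memory-bandwidth-bound decoding.
    WEIGHTS_BYTES = 8e9 * 2  # hypothetical 8B-param model at fp16 = 16 GB

    def tokens_per_second(mem_bandwidth_bytes, batch_size):
        # One decode step streams all weights once and yields
        # one token per sequence in the batch.
        steps_per_second = mem_bandwidth_bytes / WEIGHTS_BYTES
        return steps_per_second * batch_size

    print(tokens_per_second(100e9, 1))   # local, 1 user:   ~6 tok/s
    print(tokens_per_second(3e12, 1))    # GPU, 1 user:     ~190 tok/s
    print(tokens_per_second(3e12, 64))   # GPU, 64 users:   ~12,000 tok/s

Streaming the weights is the fixed cost per step; batching spreads that cost over many users, which is why the per-token cost collapses at scale.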

