If I recall correctly, they were also licensed to produce some clones.
I remember when the Am386-40MHz came out in the early 90s. Everyone was freaking out about how we were now breaking the sound barrier. There was a company, Twinhead(?), that came out with these 386-40MHz motherboards with buses so overclocked that most video cards would fry. Only the mono Hercules cards could survive. We thought our servers were the shizzle.
Intel did indeed later license AMD to produce some clones, but not out of the goodness of its heart: these were cross-licensing deals, with AMD producing clones of some Intel chips and Intel producing clones of some AMD chips, which could be used as peripherals for the Intel CPUs.
Then there was the big licensing deal for the Intel 8088 and its successors, which IBM forced upon Intel in order to have a second source for the critical components of the IBM PC.
Weren't legal protections for semiconductor masks rather lax in the 70s, at least in the United States? You might need certain patent licenses for the manufacturing process, but the chip itself was largely unprotected.
> "In the summer of 1973, during their last day working at Xerox, Ashawna Hailey, Kim Hailey, and Jay Kumar took detailed photos of an Intel 8080 pre-production sample"
> "Xerox being more of a theoretical company than a practical one let us spend a whole year taking apart all of the different microprocessors on the market at that time and reverse engineering them back to schematic. And the final thing that I did as a project was to, we had gotten a pre-production sample of the Intel 8080 and this was just as Kim and I were leaving the company. On the last day I took the part in and shot ten rolls of color film on the Leica that was attached to the lights microscope and then they gave us the exit interview and we went on our way. And so that summer we got a big piece of cardboard from the, a refrigerator came in and made this mosaic of the 8080. It was about 300 or 400 pictures altogether and we pieced it together, traced out all the logic and the transistors and everything and then decided to go to, go up North to Silicon Valley and see if there was anybody up there that wanted to know about that kind of technology. And I went to AMI and they said oh, we're interested, you come on as a consultant, but nobody seemed to be able to take the project seriously. And then I went over to a little company called Advanced Micro Devices and they wanted to, they thought they'd like to get into it because they had just developed an N-channel process and this was '73. And I asked them if they wanted to get into the microprocessor business because I had schematics and logic diagrams to the Intel 8080 and they said yes."
From today's perspective, shopping a design lifted directly from Intel CPU die shots around to valley semiconductor companies sounds quite remarkable, but it was a very different time then.
That wasn't the first time they had similar products out-speeding Intel. I have the CPU from the first PC I owned tacked to the front of my current main PC with a Ryzen. That was clocked at 20MHz IIRC (I'm at my parents' home ATM so can't confirm), whereas the Intel units topped out at 12MHz (unless overclocked, of course).
I'm guessing that was a 286. I think Intel parts topped out at 12.5 MHz but AMD and Harris eventually reached 20 or even 25 MHz. I still have my original PC with a 12.5 MHz one.
The difference with the 386, I think, is that AFAIK the second-sourced 8086 and 286 CPUs from non-Intel manufacturers still made use of licensed Intel designs. The 386 (and later) had to be reverse engineered again and AMD designed their own implementation. That also meant AMD was a bit late to the game (the Am386 came out in 1991 while the 80386 had already been released in 1985) but, on the other hand, they were able to achieve better performance.
> The 386 (and later) had to be reverse engineered … That also meant AMD was a bit late to the game
There were also legal matters that delayed the release of their chips. Intel tried to claim trademark rights over the 80386 name¹ and so forth, to try to stymie the competition.
> they were able to achieve better performance.
A lot of that came from clocking them faster. I had an SX running at 40MHz. IIRC they were lower power for the same clock than Intel parts and able to run at 3.3V, which made them popular in laptops of the time. That, and they were cheaper! Intel came out with a 3.3V model that had better support for cache to compete with this.
--------
[1] This failed, which is part of why the i386 branding (and later i486 and number-free names like Pentium) started - though only in part: starting to market directly to consumers rather than just OEMs was a significant factor in that too.
In my experience, most digital forensics is outsourced. The tools are too expensive, expertise is hard to find, and few people can handle the material for extended periods without mental health concerns.
One thing that I have never seen, at least in the gov space (of several nations), for employees or contractors is mental health support for those working in digital forensics on cases that are beyond disturbing. I have asked for such support for teams for nearly 30 years. I hope this survey will shed some light on this and help improve it.
Not the OP, but I have heard something similar at a sec conf before. The gist being that if a laptop has stickers like this, the chances of the owner being an engineer are significantly higher, so pentest teams / malicious actors can better focus their efforts on those individuals and have a higher chance of gaining access to internal systems than if they targeted random folks in public.
It also doesn't help that, arguably, the kind of stickers a laptop displays tends to hint at who's a sysadmin or not, etc.
You're missing something, but that's sorta the point. The idea of what a full-stack developer or back-end engineer or hacker (or whatever term we want to bandy about) looks like is largely based on stereotyping and a bit of myth. You can't tell what someone does for a living just by looking at them all of the time, but you can some of the time, so it's easy to play on that by dressing the part, because we humans are easily tricked into trusting our own assumptions by default. If you cosplay as a network engineer, it's pretty likely that's what most people will think you do.
Say you're red teaming and you're on-site, looking to gain access to a business's server closet. Some initial setup about why you're there comes into play, but once there, it's up to you to look like you belong, so that some unwitting person with access to the server closet will lead you to it and then leave you to do your thing, on the pleasant notion that you'll have the "problem" fixed by the end of the day. This is an ultra-simple scenario used as an example, but looking the part sometimes means having some stickers on your laptop that tell people you're really into a specific language or toolchain, or that you've been in the SOC trenches long enough to know what a lot of those inside jokes mean. Details often sell the lie.
Well, the _corporate_ stickers are a major giveaway, of course; if you have 15 AWS-related stickers it is highly likely that you work at Amazon, say, and it may not necessarily be wise to make it clear that your laptop is an Amazon corporate laptop, in public.
Beyond that, you could _maybe_ use it to identify a person's interests for social engineering purposes, but that feels a lot more tenuous.
I feel like only in the US is credit monitoring something sold as an optional service.
I got a confirmation mail from System76, because apparently they feel the need to validate that my credit card can't be used without my approval, but my bank does this by default…
Yes. US residents' ability to obtain credit (cards, cars, houses) is based on three shadowy for-profit organizations who each keep a secret score on each resident.
One's employment history is not a factor in the score at all (contrast this with Europe).
Furthermore, privacy in the USA is so bad that the leaking of one's personal details, which criminals can use to fraudulently obtain credit and ruin said score (and possibly one's finances), is a major concern. Hence, "credit monitoring" exists in order to catch this kind of criminal activity in the act and, I don't know, let you become completely exasperated with the amount of ass pain that dealing with it then causes.
>I can only speak to my experience, certified devices by the largest firms will mostly not interoperate (fails around authN).
>Apple: Keeps Thread credentials locked to HomeKit's border routers.
>Google: Shares some credentials, but only within Google Account environment.
>Amazon: TBD, but their Matter implementation is mostly cloud-tied.
>Samsung: Hybrid approach; still best when used inside SmartThings, but their 1.4 update seems to add support for joining existing Thread networks. Still have to test it.
>So, even though Thread theoretically allows full interoperability, no vendor wants to be reduced to a dumb router in someone else’s ecosystem.
>there is no easy way to bridge Apple Thread to Home Assistant or Google Thread, even though it is theoretically supposed to be possible from a protocol standpoint.
>If you have such solutions, let me know, because I would take full advantage of it, and will regale your contributions in multiple home automation threads.
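For what it's worth, the Thread credentials themselves are just the OpenThread "active operational dataset" (network key, PAN ID, channel, and so on). On border routers that expose the standard OpenThread CLI, such as Home Assistant's OpenThread Border Router add-on, you can dump that dataset from one router and load it into another so the second one joins the existing mesh instead of forming its own. Apple's and Google's routers don't expose that interface, which is exactly the lock-in described above. A rough sketch, assuming ssh access to two such routers (the hostnames and the little helper are placeholders, not a real tool):

    # Rough sketch: copy a Thread network's credentials (the OpenThread
    # "active operational dataset") from one border router to another so
    # the second router joins the existing mesh instead of forming its own.
    # Assumes both routers run OpenThread and expose the standard `ot-ctl`
    # CLI; the hostnames below are made up.
    import subprocess

    def ot(host_cmd: list[str], *args: str) -> str:
        """Run an OpenThread CLI command on a remote router and return its output."""
        out = subprocess.run(host_cmd + ["ot-ctl", *args],
                             capture_output=True, text=True, check=True)
        return out.stdout.strip()

    # 1. Dump the active dataset from the router that already owns the mesh.
    #    `dataset active -x` prints the credentials as one hex TLV blob
    #    (network key, PAN ID, channel, network name, ...).
    source = ["ssh", "root@border-router-a"]      # placeholder host
    tlvs = ot(source, "dataset", "active", "-x").splitlines()[0]

    # 2. Load the same dataset into the second router and start Thread on it.
    target = ["ssh", "root@border-router-b"]      # placeholder host
    ot(target, "dataset", "set", "active", tlvs)  # import the credentials
    ot(target, "ifconfig", "up")                  # bring the Thread interface up
    ot(target, "thread", "start")                 # join the existing mesh

As far as I understand, this only merges the meshes at the Thread layer; Matter fabrics and commissioning are still per-ecosystem, so it doesn't by itself fix the authN failures mentioned at the top.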