It's rather ironic that a discussion on what Christians believe would be based entirely on quotes from before the birth of Christ.
I think, though, that he did say something relevant to the question at hand: "The kingdom of God is inside you" (Luke 17:21) (ἡ βασιλεία τοῦ θεοῦ ἐντὸς ὑμῶν ἐστιν).
Sorting has a simple "local optimum implies global optimum" property: if a list is sorted, then any (not necessarily contiguous) sublist is sorted, and conversely. No such property applies to TSP.
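To make that concrete, here's a quick Python check of the sublist property (my own illustration, not from the article):

    from itertools import combinations

    def is_sorted(xs):
        return all(a <= b for a, b in zip(xs, xs[1:]))

    xs = [1, 2, 3, 5, 8]
    # Every (not necessarily contiguous) sublist of a sorted list is sorted.
    assert all(is_sorted(sub) for r in range(len(xs) + 1)
               for sub in combinations(xs, r))
    # The converse is immediate: the list is one of its own sublists.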
I think that the best way to think about why some combinatorial problems are hard and others easy isn't to ask what makes a problem hard, but rather what makes a problem easy. Combinatorial problems seem to be hard by default. It takes some special simplifying property to make them easy.
The idea here is not to understand why some combinatorial problems are hard and others easy, but rather just to be able to distinguish between them. So, instead of understanding why the measure functions structure the solution spaces the way they do, we treat them as black boxes and examine the structure they induce on the solution space.
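One way to probe that structure without opening the black box is greedy local search: run it and see whether local optima are global. A rough Python sketch (my own illustration; the inversion-count measure is an assumption I'm making for the sorting case):

    import random

    def hill_climb(state, neighbors, cost):
        # Greedy local search over a black-box cost function.
        while True:
            better = [n for n in neighbors(state) if cost(n) < cost(state)]
            if not better:
                return state  # local optimum
            state = random.choice(better)

    # Sorting as local search: neighbors are adjacent swaps,
    # the measure is the total number of inversions (out-of-order pairs).
    def adjacent_swaps(xs):
        return [xs[:i] + (xs[i + 1], xs[i]) + xs[i + 2:]
                for i in range(len(xs) - 1)]

    def inversions(xs):
        return sum(xs[i] > xs[j]
                   for i in range(len(xs)) for j in range(i + 1, len(xs)))

    start = tuple(random.sample(range(10), 10))
    print(hill_climb(start, adjacent_swaps, inversions))
    # Always ends fully sorted: any unsorted list has an adjacent inverted
    # pair, so every local optimum is global. Run the same probe on TSP
    # tours with 2-opt moves and it routinely gets stuck at tours that are
    # locally but not globally optimal.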
That still seems like a major issue, because I and probably quite a few other people rely on the security page to decide when to look into upgrading.
I hope there is some rationale for why the security fixes in later releases were not serious enough to warrant an advisory on the security page, rather than it just being an oversight.
Your attitude comes off as dismissive and condescending while gaslighting a perfectly valid criticism.
Why not recognize that the commenter is providing feedback, even if not directly, and is delivering it in a forum context which we (HN visitors) actually read?
The notion that every bit of feedback like this should be replaced with a FOSS contribution is profoundly entitled on your part.
TTL is just an implementation variant of DTL, which had some advantages for low supply voltages.
When making DTL circuits with bipolar transistors, especially at low supply voltages, a simple way to make the logic levels at the output equal to the logic levels at the input was to add a diode at the output of the AND gate made with diodes, shifting the logic levels down by the voltage drop across that diode.
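As a toy model of that level arithmetic (idealized 0.7 V junction drops; my own illustration, not a circuit simulation):

    V_DROP = 0.7  # assumed silicon junction drop

    def diode_and(inputs, v_pullup=5.0):
        # Diode-resistor AND gate: the output follows the lowest input,
        # shifted up by one diode drop (capped by the pull-up supply).
        return min(min(inputs) + V_DROP, v_pullup)

    def level_shift(v):
        # Series diode at the output shifts the level back down by one drop.
        return v - V_DROP

    lo, hi = 0.2, 3.6  # example input logic levels
    print(diode_and([lo, hi]))               # ~0.9: the low level got shifted up
    print(level_shift(diode_and([lo, hi])))  # ~0.2: restored to the input level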
When the diodes used for the gates were ordinary bipolar junction diodes, not Schottky diodes, they stored a large electric charge when on and could be turned off quickly only if there was a path to remove that stored charge. The bipolar transistor of the inverter likewise stored a large charge in its base, which had to be removed quickly to turn it off.
The current needed to remove the stored charge would have had to flow in the reverse direction through the level-shifting diode, which is not possible, so DTL gates with level-shifting diodes were slow.
In bipolar IC technology, all diodes are made from transistor junctions, to avoid separate process steps for making diodes. The junctions suitable for fast diodes are the emitter-base junctions. So all the diodes of a DTL gate were made inside an IC as multiple emitters of a single transistor, with a short across the base-collector junction to disable the transistor effect and make it behave as a bunch of diodes.
At this point in the history of integrated DTL circuits, someone made an observation that seems trivial in hindsight: removing the short across the base-collector junction lets that junction serve as the level-shifting diode, saving a component.
Moreover, this not only saved a diode: the bipolar transistor effect passes current through the reverse-biased level-shifting junction, allowing the fast turn-off of the inverting transistor.
So in those early times, when DTL circuits could be made only with bipolar diodes and with bipolar transistors, TTL was the best variant.
Later, the bipolar diodes were replaced with Schottky diodes, which store negligible charge when on, and diode clamps were added across the inverting transistor so that it no longer reached deep saturation and no longer stored a big charge. At that point TTL was no longer the optimal implementation, and some of the so-called Schottky TTL families were actually DTL circuits, not TTL; they retained the TTL name as a marketing term, since it had become almost synonymous with bipolar logic integrated circuits.
TTL could never be used in discrete circuits, because it is based on bipolar transistors with multiple emitters and/or multiple bases, which have never been available as discrete parts.
diodes are cheaper, smaller, and easier to solder than transistors; ttl took off with integrated circuits
ttl was the default choice for building discrete logic circuits until about 01980, after which point it was obsolete because cmos (74hcxxx, not cd4xxx) was better in every way except esd
if you're running an educational computer lab on a tight budget, esd is still the dominant consideration, because students will burn out all your cmos chips with static after only a few dozen uses, while ttl chips will survive most of their mistakes
but outside a circuit lab or possibly repair of 40-year-old devices there's no reason to use ttl
It depends how "discrete" you mean. If you're using gate ICs, TTL has most of the advantages. If, for some reason, you're limiting yourself to discrete transistors, TTL is going to be hard to implement. These transistor-based projects are mostly done for the bragging rights.
Does anyone know of a good resource explaining how the Compose key is supposed to work in general, beyond the simple cases like those discussed in this article?
I have an interest in some less spoken languages like Ancient Greek and Sanskrit, and though there are specific keyboard layouts that mostly work, there are still some less common combinations of diacritics used in writing Ancient Greek, for example, that don't seem to be covered.
Is there some way to use the Compose key for entering general Unicode sequences, for example, that would work across different applications?
I'm a Linux user but I'd be interested in seeing a solid exposition on this topic even if it was for a different OS.
BUT they may already have their own systems for diacritics, as these do.
I note that the Sanskrit one here uses dead keys, which I personally hate.
Second note: dead keys are responsible for the very common issue of Eastern European keyboard users trying to type English on layouts with an acute accent key, and using the accent for an apostrophe. If you know the difference the result looks terrible, especially in any and all proportionally-spaced fonts.
In other words, the core (key, haha) problem here is the lack of uptake of the Compose key by IBM when designing the PC, meaning it is not generally known, and people design fancy new layouts because they don't know this option exists.
Yeah, I was hoping to find some general information on how the Compose key is (or was) supposed to work. It's easy to find info on how to assign the Compose key to a particular key but I haven't found anything on how to actually use it to input more or less complex character codes.
Part of the problem may also be the difference between precomposed and non-precomposed glyphs in Unicode, which I don't really understand, but it seems that if a keyboard layout is designed to use precomposed glyphs, it may not allow you to further compose those with other code points.
For example there is a character that is a lowercase alpha with both an acute accent and a macron on top. With the right font it displays correctly (it probably wouldn't if I tried to copy-paste it here) but I don't know how to enter it on the keyboard. I suspect that's because there's no precomposed Unicode codepoint for it and my keyboard layout only seems to work with those.
The correct encoding for it is "GREEK SMALL LETTER ALPHA WITH MACRON" (the Unicode name, U+1FB1) followed by "COMBINING ACUTE ACCENT" (U+0301).
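You can check that with Python's standard unicodedata module (illustrative):

    import unicodedata

    base = unicodedata.lookup("GREEK SMALL LETTER ALPHA WITH MACRON")  # U+1FB1
    acute = unicodedata.lookup("COMBINING ACUTE ACCENT")               # U+0301
    s = base + acute
    print(s)  # renders as alpha with macron and acute, font permitting
    # No precomposed codepoint exists for the full combination, so NFC
    # normalization leaves the combining acute as a separate codepoint:
    print(unicodedata.normalize("NFC", s) == s)  # True
    print([hex(ord(c)) for c in unicodedata.normalize("NFD", s)])
    # ['0x3b1', '0x304', '0x301']: alpha + combining macron + combining acute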
Yeah, that might work with a default keyboard layout, but it doesn't work with a polytonic Greek layout. How would you type a lowercase alpha with an acute accent and a macron that way?
I don't speak or write Greek, so I don't really have any insight at all into this. (I can just about, very poorly, read the alphabet, that's all.)
So I can only guess:
Let's assume we have a Greek layout, and that it has keys from alpha to omega, a RightAlt or something, and keys that bear some useful resemblance to acute, grave, circumflex, rough, smooth, etc.
Compose, type an alpha, type an apostrophe?
As an example, I sometimes type in Czech.
č is comp, c, <
š is comp, s, <
Others...
î is comp, i, ^
ï is comp, i, "
ç is comp, c, comma
All that's needed is a vague visual resemblance to the desired diacritic.
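And if a sequence you need isn't in the default table, X11 lets you define your own in ~/.XCompose. Something along these lines ought to cover the alpha-with-macron-and-acute case; the sequence choice is my own invention and I haven't tested it, and it only helps in applications that honor the X compose table:

    # keep the default compose table, then add custom sequences
    include "%L"

    # hypothetical sequence: Compose, macron, apostrophe, alpha
    <Multi_key> <macron> <apostrophe> <Greek_alpha> : "ᾱ́"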
With the high level of self-awareness that you've developed, as evidenced by this post, I would think you would be largely immune to falling into the kind of traps you're worried about. Maybe you should ask yourself why you don't trust yourself enough to avoid such traps in the future?
I think there is also a risk in becoming too skeptical or even cynical. I agree that there is much in most religions that is not useful and can sometimes be very harmful, yet at the same time I think many religions contain elements of truth that are worth being open to.
My recommendation would be to approach all ideas with openness while maintaining a healthy amount of skepticism. Try to find the nuggets of gold without becoming burdened with tons of sand.
Not to diminish the achievements of these people, but I suspect being in the right place at the right time has a lot to do with it. I suspect that for each of the names you cited there are a thousand others who were just as technically skilled but whom you've never heard of.
Wozniak, as I understand it, is best known for creating elegant and highly optimized designs for the Apple I and II circuitry. That may be a rare skill, but I don't think it's nearly as rare as the vast gulf in fame between him and the typical EE would lead one to believe. He became famous because he used those skills in the exact time and place where they mattered most, creating one of the first usable computers that was affordable for the middle class.
Torvalds didn't create Linux ex nihilo. There was at the time an extensive literature on the design of Unix/POSIX systems on which he could lean, as well as the example of Minix, which, as I recall, is largely what inspired him to create his own clone of the Unix kernel. The reasons it became as successful as it did are numerous. Part of it has to do with that work being done at the time the Internet was enlarging the number of people who could access and contribute to open source projects. It also coincided with the GNU project being at a stage where it had already developed many of the user-space tools for a fully open source Unix-like system but was having trouble getting a kernel off the ground. Note also that a key ingredient of his success, beyond his technical competence, was his ability to shepherd a worldwide group of open source collaborators, keeping them all moving in the same general direction.
I don't know as much about Jeff Dean's history but I do know from experience that there's a lot more to being successful in a corporate context than just technical competence, or even, I suspect, genius. It's rare that one person can create an entire system on their own (though it does happen that one person's work can establish a framework for future contributions) and moving beyond the work of one individual requires a whole additional set of skills.