This "afterthought ... poorly bolted ... worse than its predecessor" is a type of HN comment that's too common. Not being a CLOS or Lisp user, I have no idea why you say this and can learn nothing from it.
HN's really starting to put me off with these blanket-damning posts: they're uninformative and usually come from people with little experience.
Indeed, threads like this always remind me that HN is not what it used to be. If you want to learn about something (including whether it's worth your time), this isn't the place. Read books; read and write code; form your own opinions.
Replying to self because HN doesn't show me reply buttons to this comment's children.
This account is 52 days old, but the user behind this has been reading and posting on HN for a bit longer (let's say since PG decided to put HN online).
For the other comment, my point is that "genuinely wise" is much rarer nowadays, and instead we're drowning in "genuinely clueless". This is why you shouldn't judge something's worth by reading HN comments nowadays. Instead, spend a limited amount of time actually studying the subject, then decide whether it's worth more of your time.
Perhaps... but clueless comments are usually evident from their very lack of solid info. So why even post them?
I get further pissed off when people argue back about something from my small island of expertise (SQL and a few other things) that they clearly know less about than I do - I can't learn from them and they won't learn from me. I don't mind n00bs, we all were once, but willingly remaining a n00b by rejecting information - that I can't comprehend.
I miss posts from the likes of BeeOnRope. The really good people get driven away.
Your account is 52 days old, which I will admit is a poor way to estimate how long someone's been around. Look at any old thread around Arc's release and you'll see the comment quality was even lower than here.
Yes and no; books are fab, but words from the genuinely wise can save you a lot of otherwise wasted time, so I always appreciate knowledgeable advice as well.
It wasn't elaborated on because it wasn't part of the core point of the comment, which was someone claiming falsely that to write Common Lisp you needed to use CLOS, which you absolutely don't.
You wouldn't go into why JavaScript is bad to point out that you don't need to write JavaScript to make a web page. You would just note that JavaScript was a late addition to the web and wasn't the first language usable on it.
Plenty of people were writing CL before CLOS existed.
I don't know what their own thoughts on it are, but I personally think it's poorly bolted on because the rest of the spec doesn't actually use it. There's a massive proliferation of functions acting on data structures, and none of them are generic even when they do the same thing. Poorly bolted on indeed.
CL's functionality is an amalgamation of the prior Lisps (thus "Common"), which didn't have CLOS. But it was CLOS-ifying a variety of things by the time of the 1994 standard. You just don't see more of it because the immediate goal was a large degree of compatibility with those prior Lisps. If a second standard had developed, a larger portion of the system would probably have been brought under CLOS.
This is a valid criticism. Many of the (possibly) non-generic functions in Common Lisp could be made generic. That they weren't was a bow to existing implementations of Lisp that wouldn't have supported it. The stakeholders engaging in the standardization process didn't want extensions that would be excessively (at the time) costly for them to implement, especially in a way that wouldn't have a large runtime performance impact.
The CL error system is built with CLOS. If you want to customize error handling you'll basically be writing CLOS.
But I agree about general data structures. Some of the sequence functions are generic but not enough of them. I presume this happened because circa 1990 generic dispatch was too slow to handle high speed data traversal. That's no longer true, and many libraries exist to "generify" more of Common Lisp.
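As a rough analogy in Python (not CL), the "generify" idea is type-directed dispatch under one name instead of a family of type-specific functions. This is only a sketch of the concept; `lookup` is a hypothetical name, not any library's API:

```python
# CLOS-style generic functions let one name dispatch on argument type,
# instead of a proliferation of per-structure functions that all do the
# same conceptual thing. functools.singledispatch dispatches on the type
# of the first argument, loosely like a one-argument generic function.
from functools import singledispatch

@singledispatch
def lookup(container, key):
    raise TypeError(f"no method for {type(container).__name__}")

@lookup.register
def _(container: dict, key):
    return container[key]   # hash-table-style access

@lookup.register
def _(container: list, key):
    return container[key]   # vector-style access

print(lookup({"a": 1}, "a"))     # 1
print(lookup([10, 20, 30], 1))   # 20
```

In CL terms this is what a generic `elt`/`gethash`-unifying accessor could look like; the libraries mentioned above do essentially this with CLOS generic functions.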
For the storage heaters I had, the 'bricks' seemed to be mainly iron oxide, IIRC. Also, I'm struggling to believe water has that multiple of capacity over brick - do you have a ref?
Brick has a heat capacity of 840 J/kg°C. Water has a heat capacity of 4182 J/kg°C. Almost but not quite five times as much heat capacity. Water has a staggering heat capacity.
Feolite:
specific heat = 920 J·kg⁻¹·°C⁻¹
density = 3,900 kg·m⁻³
thermal conductivity = 2.1 W·m⁻¹·°C⁻¹
maximum operating temperature = 1,000 °C

Water:
specific heat = 4,184 J·kg⁻¹·°C⁻¹
density = 1,000 kg·m⁻³
thermal conductivity = 0.591 W·m⁻¹·°C⁻¹
maximum operating temperature < 100 °C unpressurised
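Putting those figures together back-of-the-envelope (the temperature swings below are illustrative assumptions, not from the comment):

```python
# Per-kg vs per-volume heat storage, using the figures quoted above.
feolite_cp, feolite_rho = 920.0, 3900.0   # J/(kg·°C), kg/m^3
water_cp, water_rho = 4184.0, 1000.0      # J/(kg·°C), kg/m^3

# Per kilogram per degree, water holds about 4.5x as much heat...
per_kg_ratio = water_cp / feolite_cp

# ...but Feolite is ~4x denser and can swing hundreds of degrees,
# while an unpressurised water tank manages a few tens of degrees.
feolite_swing = 700.0   # assumed: 300 °C -> 1,000 °C core
water_swing = 60.0      # assumed: 30 °C -> 90 °C tank

feolite_mj_per_m3 = feolite_cp * feolite_rho * feolite_swing / 1e6
water_mj_per_m3 = water_cp * water_rho * water_swing / 1e6

print(round(per_kg_ratio, 2))       # 4.55
print(round(feolite_mj_per_m3))     # 2512 MJ per m^3 of core
print(round(water_mj_per_m3))       # 251 MJ per m^3 of tank
```

So the per-kg numbers favour water, but the usable temperature range is why storage heaters use dense ceramic cores rather than water tanks.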
I wonder why you say this. Having actually had them, they're insulated and you can open/close vents as you choose. And weather is usually pretty predictable; when it's winter it's cold.
Oh look, the article even says this: "Today's storage heaters have such good insulation that the only heat coming into the room is from air being blown through the hot core by an ultra-quiet low-speed fan the length of the heater, with the warm air coming out of the bottom of the heater at floor level - even when the core is fully charged the outside of the heater is usually barely warm to the touch - and never hotter than a water-heated radiator"
Some rare nice words from me about C# - the not-null facility is great. The propagation is very basic and should be better, and I've fought the compiler too many times over it, but I really like the compile-time guarantee it gives.
I know Dijkstra's paper; it's short, good, and should be read, but this article is wrong to say "always". It feels like a newbie programmer came across a good thing and lost all sense of proportion; use the right tool for the right job, as ever.
IIRC (bit rusty now) Python makes you write range(x, y + 1) to get an inclusive [x, y] loop, which pisses me off - it's why I wrote my own trivial end-inclusive range for loops.
Otherwise you'd get surprises like range(10) giving 11 values, violating the half-open convention. I admit I usually have to pause to remember whether it's inclusive of the upper bound or not.
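A sketch of the half-open convention and of the kind of trivial helper the parent describes (`irange` is a hypothetical name):

```python
# Python's range(a, b) covers the half-open interval [a, b),
# so range(10) yields exactly 10 values: 0 through 9.
assert list(range(10)) == [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
assert len(range(10)) == 10

def irange(start, stop, step=1):
    """End-inclusive range: covers [start, stop] instead of [start, stop)."""
    return range(start, stop + step, step)

assert list(irange(1, 5)) == [1, 2, 3, 4, 5]
print(list(irange(1, 5)))  # [1, 2, 3, 4, 5]
```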
> sorting polygons correctly is inherently O(N^2), [...] because polygon overlap is not a transitive property (A in front of B and B in front of C does NOT imply A in front of C, due to cyclic overlap.)
Well ok but I don't get this:
> This means you can't use O(N lg N) sorting, which in turn means sorting 1000 polygons requires a million comparisons -- infeasible for hardware at the time
ISTM you CAN'T sort it, full stop, because of the cycles - what kind of sort (never mind O(N^2), any sort) can order a cycle? None can.
Your intuition is correct. However, in the context of an O(N^2) comparison sort you can implement a tie-breaker check that at least ensures order stability between frames.
I don't have synesthesia to any notable degree, but what you're describing kind of makes sense as being it. Not tastes for me, but sounds can elicit such responses - I can well imagine a sound having "graphite steel" about it. Or other associations that would baffle another person.
Frankly, "notes of blue and a hint of graphite steel" sounds too remote from normal experiences to be consciously made up - I mean, who's going to relate to that anyway?