Hacker News | cossatot's comments

There were no alphabets in the Americas before European contact. The Maya had written mathematics and hieroglyphic writing, and some Quechua-speaking peoples kept records with knotted strings (quipu) that had some mathematical representation (I don't know if it allowed arithmetic or was just record keeping).

Sequoyah developed the Cherokee syllabary (where symbols represent syllables instead of individual vowels and consonants) in the 1800s after seeing white men reading and figuring out what they were doing (he spoke little English and could not read it). It is often described as the first writing system independently created by an indigenous person in the Americas after European contact.

The Skeena characters shown here are obviously derived from European characters, as was the Cherokee syllabary. I think most written forms of native languages in the Americas are similar.

The Cree have a script whose characters look nothing like European ones but which was nonetheless developed for the Cree by a missionary in the 1800s. The Inuit have adapted it for their language.

I don't know much about indigenous languages in the rest of the world.


The Maya script was not an alphabet because the word alphabet refers to a specific subcategory of scripts.

The Maya script is a logosyllabic script. Such a script combines symbols for whole words with symbols that represent syllables phonetically.

A modern example of a logosyllabic script is Japanese (kanji + kana).


They are so frequently intertwined


Rocks could be potential sources. Crystals that large are by no means rare; feldspars are the most common minerals on Earth and perhaps on most rocky planets. (Quartz is well known, of course, but I think it would be rare without the magmatic fractionation driven by plate tectonics, which may be unique to Earth within the solar system.)

Volcanic glass (eg obsidian) is also shiny and by no means rare in the solar system.

Many asteroids are also metallic, and perhaps metal crystals or fracture planes could produce reflectors of the right size.

But maybe it’s just aliens.


Don't quote me on this but I think a lot of the change has to do with fire and flood suppression. Certainly on the Konza prairie and similar areas, small trees (post oaks and eastern red cedar) grew in natural fire breaks like bluffs, and individual trees would live for several hundred years. Floodplains would have large cottonwoods which can withstand seasonal inundation but wouldn't necessarily be thick forest otherwise. And the prodigious lightning storms and (throughout the Holocene) burning by native tribes for hunting kept trees off of the uplands.


Do not quote me either, but you are correct. Prairies depend on fire, as do most native forests. Many trees depend on fire for their offspring to succeed (jack pine, red/white pine, bur/northern/pin oak), and fire kills off invasive species (prairies). Prescribed burns are critical to maintaining these ecosystems and are an underutilized tool. They require 'perfect' conditions (temperature, humidity, timing, human resources), so they are rarely done correctly, if done at all. Source: wildland firefighter in another life.


Because it's for your kid's bed. At 3 AM the previous night, they peed the bed, so you got the other one out and put it on, throwing this one in the laundry room. Then, today you washed it but the one on the bed already is still in good shape.

Or, you have sheets of a few different colors, each paired to a comforter with a different weight that is changed seasonally, or biweekly, depending on the preferences of you and your bedmate.


This is evident in his description of programming in his later years:

Time and time again I would send my friend Dave Walker an email declaring that Javascript (or something else) was utterly broken, incapable of executing the simplest program without errors. Dave would ask to see the source code and I would present it to him with detailed notes proving that my code was perfect and Javascript was broken. He’d call me, we’d discuss it, and eventually he’d say something like, “Where did you terminate the loop beginning at line 563?” There would be a long silence, followed by the tiniest “Oh” from me. I’d thank him for his help and hang up. A week later, I’d be fuming again about another fundamental flaw in Javascript.

Many of us are stubborn and will work hard and long, without much positive external feedback, under the assumption that our vision is correct and the audience, if one even exists, is wrong. Much fundamental progress has been made this way: Faraday, Einstein, Jobs, etc. But of course many times one simply is wrong and refusing to see it means throwing years away, and whatever else with it (money, relationships, etc.). It's a hard balance, especially for the monomaniacal without much interest in balance. Finding out how to make solid (public, peer-reviewed, evidence-based, whatever) incremental progress towards the paradigm shift seems to be the way if one can manage.


That quote about JavaScript is... huh. I do not understand how you can even begin coming to the conclusion of "JavaScript [is] utterly broken, incapable of executing the simplest programs without errors" when obviously, JavaScript (which I do not like, by the way) is productively used on a large scale (even back then), and constantly under scrutiny from programmers, computer scientists, language designers... it's just baffling.

It reminds me of when I was around 10 years old or so, maybe slightly older, and playing around with Turbo C (or maybe Turbo C++) on DOS. I must have gotten something very basic about pointers (which were new to me at the time) wrong, probably having declared a char* pointer but not actually allocated any memory, leaving it entirely uninitialized, and my string manipulation failed in weird and interesting ways (since this was on DOS without memory protection, you wouldn't get a program crash like a segmentation fault very easily, instead you'd often see "more interesting" corruption).

Hilariously, at the time I concluded that the string functions of Turbo C(++) must be broken and moved away "string.h" so I wouldn't use it. But even then I shortly after realized how insane I was: Borland could never sell Turbo C(++) if the functions behind the string.h API were actually broken, and it became clear that my code must be buggy instead. And remember, I was 10 years old or so, otherwise I don't think I would have come to that weird conclusion in the first place.

Nowadays, I do live in this very tiny niche where I actually encounter not only compiler bugs, but actual hardware/CPU bugs, but even then I need a lot of experiments and evidence for myself that that's what I'm actually hitting...


>I do not understand how you can even begin coming to the conclusion of ...

Obviously he's not serious; he's playing the part of the out-of-touch old man.


Ah, okay. Maybe it’s more obvious in context, or maybe my hyperbole detector is broken.


I can imagine a grumpy old man frustrated by a different paradigm shouting at his computer.

We all become that eventually; hopefully we can all be as poetic and humble (and honest) about it.


Sure, but “JavaScript [is] utterly broken, incapable of executing the simplest programs without errors” is a bit much. I find it hard to believe that even when I’m completely out of touch, I’d say that about a language that people are obviously productive in (as much as I hate JS myself).

But apparently I didn’t get the hyperbole.


Sometimes when I play a point-and-click adventure and have been stuck on a puzzle for hours, I tend to think: I've tried everything... surely some kind of bug must be the reason I am not proceeding.

Only to then realize (after reading the walkthrough) that there was indeed a way.

I think it's human nature to search for blame not only in yourself but everywhere else... anyhow, since the author is reflective, we should be forgiving as well.


Just as a small note, I did not get that either.


It is a rather common mindset among beginner programmers though, particularly younger ones.


> I do not understand how you can even begin coming to the conclusion

Tell us again when you're 74.

I'm still nearly 2 decades from it, but I am a profoundly different human to the one I was 20 years ago, or 20 years before that.


Other languages have problems too, but before some basic libraries (jQuery/Underscore) and language enhancements (TypeScript/CoffeeScript), JavaScript was arguably quite simplistic, and parts of the language were straight-up anachronistic.

If you've ever been unfortunate enough to have to wrangle a VBScript routine, it was like that, though less bad. If not, go find some assembly code and teach it to yourself, then imagine that instead of side effects in registers there were random effects on your code/visual state.

And like assembly code, you could then imagine that the same code might behave wildly differently on different machines and in different browsers.

So a bit of "old man"-isms, but I also imagine his view of JavaScript was tainted by the early days. It's better in some ways now, worse in others; I don't mean to say it's the worst or the best, just to offer perspective on where it came from.


I’m well aware of all of those things (I program modern assembly for a living, and witnessed the evolution of JS), but the quote was “JavaScript [is] utterly broken, incapable of executing the simplest programs without errors”, which is a bit more extreme than what you’re describing.


It’s a quality I’ve run into with a couple of people: young or old, once they’ve ossified into thinking they are Better and Smarter than everyone else, they stop being curious and simply start mandating their wild “truths”.

I’m sure we’ve all done it at one time or another, but repeated as habit without learning seems to speak of a certain kind of personality.


"Am I so out of touch? No, it's the audience who's wrong!"


Similar to the flood analysis others have mentioned, this can be used to build databases of buildings and the number of stories in each, which is important for understanding how each building will respond to various catastrophes (earthquakes, strong winds, etc.), in addition to various non-catastrophe administrative tasks. The other post, about finding the depth of oil in oil tanks, is actually super interesting to me, because the amount of oil in a tank is a huge determinant of how it will respond to seismic ground motions. I had no idea the top sinks with the oil level, and I'm skeptical that it does on all tanks, but it's cool nonetheless.


They pretty much all do, by design: a floating roof prevents vapours from building up at the top of the tank, which is a fire/explosion hazard.

It works even better with high resolution synthetic aperture radar as you can measure the tank height displacement directly: https://www.iceye.com/blog/daily-analysis-and-forecast-of-gl...


Having read Eric Berger's Reentry about SpaceX, and having a few friends who work at Tesla, my impression is that those organizations are not too dissimilar. They are also populated largely by millennial and Gen Z people, because older workers can't/won't deal with the hours and other working conditions.

Furthermore I think most blue collar American workers, and many white collar workers, are used to the concept of sudden and arbitrary termination.


That you are comparing Rickover's nuke program to Tesla and SpaceX kind of illustrates the cultural gap. Anyone at SpaceX ever get jailed for whatever reason his/her boss dreamed up off hand? Any analog to Skipjack at Tesla?

Think about that, today, Tesla and SpaceX are "tough" environments to people.

It's kind of a sign that a lot of people today have no idea how things worked back then. We will definitely have trouble bringing those environments back.


So why is the quality and reliability of Tesla products so bad compared to competitors? From an outside perspective it seems like Tesla engineers are generally lazy and incompetent, at least relative to an organization like Naval Reactors which maintains much higher standards.


They are not being tasked with achieving Toyota-like reliability; that is not what made Tesla successful. Falcons are pretty reliable. That said, Musk went crazy a few years ago, and people are only now starting to realize it.


This seems to have happened on 26 Feb 2023 as well: https://www.washingtonpost.com/technology/2023/02/26/instage...


The date must be cursed? Is there anything weird about the numbers? Y2K, 2038, and 0226?


Weirdly close date too.


The cuts aren't performance based. They're based on the ease of dismissal.

The voluntary resignations are what they are--I can't fault anyone for taking a good deal, and from what I have heard (married to someone in the government, with many friends as well) no one is being pressured inappropriately.

However, the other cuts are predominantly of people who are 'probationary', which means that they are new to their positions, either by being recently hired or, in some cases, promoted. These people are, on the whole, actually harder workers than those who have been in their jobs for a long time, because they are still competing in order to move forward. The non-probationary employees have stronger civil service protections, which makes them harder to fire. This is the major discriminant used to decide who stays and who goes.

