
The graphical difference has semantic significance in some domains: http://en.wikipedia.org/wiki/Mathematical_Alphanumeric_Symbo...


I guess that makes sense.

Personally I find it annoying how intractable mathematical notation seems today. Things I easily understand in code are a mystery to me in math notation. But I guess there will never be an overhaul with a more intuitive typography...


The book Structure and Interpretation of Classical Mechanics redefines some of the trickier parts of the standard mathematical notation, and does all of the actual computation in Scheme. They extended the standard Scheme interpreter/compiler to support algebraic manipulation of Scheme programs, which lets them do all of the higher-order computations in Scheme as well (things like transforming between coordinate systems, finding the derivative of a function, computing the Lagrange equations from partial derivatives, etc). Usually the proofs/derivations are shown in the modified standard notation, and then the resulting implementation is shown in Scheme.

I haven't finished the book (turns out I know less calculus than I thought), but the result is pretty effective. You're much less likely to get confused about which things are numbers and which are functions, and which of those functions operate on numbers and which ones operate on other functions, once you see the Scheme implementation of something.
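That point about operators on functions clicks once you see one in code. SICM's D operator takes a function and returns its derivative, which is itself a function. This is not the book's Scheme system, just a minimal Python sketch of the same idea using dual numbers (all names here are my own):

```python
class Dual:
    """Dual number a + b*eps with eps**2 = 0: carries a value and its derivative."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (fg)' = f'g + fg'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def D(f):
    """Derivative operator: takes a function, returns its derivative function."""
    return lambda x: f(Dual(x, 1.0)).der

square = lambda x: x * x
print(D(square)(3.0))  # 6.0
```

Here it's unambiguous that D operates on functions while `square` operates on numbers, which is exactly the distinction the book's notation is trying to keep straight. (Since D returns an ordinary function, D(D(f)) even works for second derivatives.)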


This looks like a pretty cool book, thanks for the pointer. For anyone else who's interested, MIT press has it online for free here:

http://mitpress.mit.edu/sites/default/files/titles/content/s...


In some cases, you might be reading poor-quality mathematical writing.

According to my generalization of some advice from Knuth:[1] in a good math text, definitions of terms are introduced as the text goes along, and the text is explicit about what means what. Furthermore, one of the factors that determines the quality of mathematical writing is

- Did you use words (instead of symbols) whenever you could have, especially for logical connectives?

and

> Try to state things twice, in complementary ways, especially when giving a definition. This reinforces the reader’s understanding. [...] All variables must be defined, at least informally, when they are first introduced.

The same point is repeated later:

> Be careful to define symbols before you use them (or at least to define them very near where you use them).

There are some cases where "the general mathematical community is expected to know what you mean," like when publishing papers in a specialized field, but if you're writing a book, these rules hold quite well. Books certainly should explain their notation, especially since the consensus on certain notations can be expected to shift over the decades ...

[1] http://jmlr.csail.mit.edu/reviewing-papers/knuth_mathematica...


Keep in mind it also goes the other way around: something can be mathematically clear to someone and a total mystery in code form. Each notation has its strengths and weaknesses.


For some concepts that can be expressed in both code and math, I prefer the code notation because I can run it, and also make small tweaks and see what happens. For example, I got a better understanding of Löb's theorem [1] by translating the proof into Haskell [2].

[1] http://en.wikipedia.org/wiki/L%C3%B6b's_theorem#Modal_Proof_...

[2] http://lesswrong.com/lw/l0d/a_proof_of_l%C3%B6bs_theorem_in_...
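Discussions of Löb's theorem in Haskell often revolve around the `loeb` fixed-point combinator (`loeb x = fmap (\f -> f (loeb x)) x`), whose list case behaves like a spreadsheet where every cell can refer to the whole result. As a rough, eagerly-memoized Python analogue of that list case (the names and structure here are my own, not the linked proof's):

```python
def loeb(fs):
    """Spreadsheet-style fixed point: each function in fs receives a lazy
    view of the whole result list and returns its own cell's value."""
    cache = {}

    class View:
        def __getitem__(self, i):
            if i not in cache:           # memoize; assumes no cyclic references
                cache[i] = fs[i](self)
            return cache[i]

    view = View()
    return [view[i] for i in range(len(fs))]

# Cells: a constant, a cell reading cell 0, and a cell summing cells 0 and 1.
cells = [lambda r: 1,
         lambda r: r[0] + 1,
         lambda r: r[0] + r[1]]
print(loeb(cells))  # [1, 2, 3]
```

This is exactly the "run it and tweak it" workflow the parent describes: change one cell's formula, rerun, and watch how the fixed point shifts.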


If it can be coded, I prefer having both, or implementing the code myself; it helps in understanding the algorithm behind it. But math is much larger than what can be coded, or what is useful in code, so for the rest the only thing left is playing with toy examples ("coding" by hand when working with really weird stuff).

I'd love to see more of APL (and a "larger" set of APL functions, actually) in use. The idea of a notation we could run directly is/was awesome.


Probably true, and I guess if you're a mathematician, you quickly get used to the symbols. And I'm not arguing against having those symbols in the first place; it's just that some of them have a 19th-century feel to them and don't seem intuitive.

The art of typography and signage really only matured in the 20th century, and I'm certain some of the symbols would look very different if they were designed today. Anything that helps with teaching math and making it appear friendlier is a plus, imho.


I'm not sure which symbols you're hinting at. At first I thought you meant Fraktur-style letters, but that can't be the case, since you cite "teaching" as a plus of redesigning them, and Fraktur symbols are traditionally used in relatively high-level algebra (for some reason certain symbols are used more in certain realms; for me, Fraktur started appearing in fairly advanced material about ideals). Once you get used to them, it's like a second language, and that's it. I remember reading that Feynman used his own symbols for sin, cos, and other basic functions (turning them into one-stroke symbols), but he had to give them up once he had to communicate with other people.

Math symbols are more or less a universal language. Once you know how the symbol appeared, or get used to "reading it right" they are totally natural. I don't see ∂ as a "weird d," I read this as "partial." It wasn't natural at first, but I got used to it, just like I got used to English.


It's like three-letter names in assembly. It's good when you're doing it, but step away from it for a while and you can't remember what the signs mean anymore.


Indeed, this is technically a misuse of Unicode.


It's unclear whether you're talking about the page or the Unicode block.

For the page, that's fairly obvious when you look at the pseudoalphabet converters.


If you're referring to those characters, no, it's not. It's not just a different style of the same character; it has semantic meaning.



