1994? How about 2016, and I'm still living in a nightmare - my company uses tcl ("legacy" scripts). No matter how many times I've tried to approach the language with an open mind, there is no other language that grinds me down as much. If you think passing a variable name in a string to a function and then using upvar to access it is an acceptable way of returning values, then have I got good news for you - we have languages with a sane way of doing it today (heck, we've had them for longer than tcl has been around)! I die every time I have to look up which order arguments come in because of how inconsistent it is, and oh, is it append or lappend or some other crazy keyword? People at the office have used this for decades and they still need a cheat sheet. Why... There's perl, python, bash, anything. God, please, make it stop.
[Warning: Personal anecdote... somewhat off topic.]
Around 2003 I was getting very heavily into what we used to call "Dynamic HTML", which of course required programming in JavaScript. At the time JavaScript was looked down upon by most professional programmers as a toy. A toy that had buggy implementations in various browsers, that lacked the strict typing of C++ or Java, that didn't have classical inheritance, that had no good IDE, and, worst of all, that you could create some terribly confusing bugs with.
At the time I'd moved to Fort Worth which let me attend the monthly Pragmatic Programmers meetings/dinners at Spring Creek Barbeque in Addison, TX. This was a meeting with a lot of polyglot programmers and "old timers", and there was a prevailing attitude that programming languages were just tools, and a good programmer learned to use many tools well.
Anyway, there I was one night ranting about JavaScript to someone I only remember as Chris. I probably chewed his ear off for a long time about how terrible JavaScript was. After what I'm sure was many minutes, I paused to let Chris tell me how much he agreed that JavaScript was terrible. Instead, he simply said: "You need to learn about prototypical inheritance."
This caught me off guard. I don't remember if it shut me up, but I hope it did.
Fortunately I actually took Chris' advice. While I'd understood that JavaScript had prototypical inheritance, I'd never really taken the time to understand how it really worked. I'd also never really grokked how scopes worked. As I properly learned these features, I started to gain a new respect for the power of the language, and it quickly became my favorite.
So, now when I have a strong negative reaction to a new language or tool, I ask myself if I'm reacting because of a fault in the language or if I just don't like that feeling of being less productive because I haven't learned the language yet. Usually it's the latter, which means I need to grit my teeth and bend my brain and learn.
(P.S. Chris, if you're reading this: Thank you! I owe you a million lines of code in gratitude!)
> At the time JavaScript was looked down upon by most professional programmers as a toy. A toy that had buggy implementations in various browsers, that lacked the strict typing of C++ or Java, that didn't have classical inheritance, that had no good IDE, and, worst of all, that you could create some terribly confusing bugs with.
I could have sworn it was a _different_ Chris. I remember someone with black hair. He may not have frequented the meeting as much, because I don't remember seeing him again after that.
I've written some surprisingly complex software in Tcl. It's easily preferable to Perl and Bash to me by an order of magnitude. Python I generally prefer unless I'm working with threads and need parallel execution across cores, in which case Tcl is actually better (unless multiprocessing can handle your use case). By the way, Tcl has full coroutines too: https://www.tcl.tk/man/tcl/TclCmd/coroutine.htm
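For anyone who hasn't seen them, here's a minimal generator sketch of what those coroutines look like (the proc and coroutine names are made up):

proc ints {} {
    # a tiny generator: yield successive integers forever
    set i 0
    while 1 {
        yield $i
        incr i
    }
}
coroutine nextInt ints   ;# runs ints until its first yield (0)
puts [nextInt]           ;# 1
puts [nextInt]           ;# 2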
What I've found is that most languages have language smells that you just have to get used to. The same is true of every language you listed there. Using 'upvar' and the like quickly becomes second nature and barely poses a problem in Tcl code.
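For example, a proc that "returns" a result through a variable in the caller's scope is only a couple of lines (a sketch; the names are made up):

proc double {varName} {
    upvar 1 $varName v       ;# link local v to the caller's variable
    set v [expr {$v * 2}]
}

set x 21
double x    ;# x is now 42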
Honestly I don't know what this comment means. What category are you talking about, and what else is in it? Those were two languages explicitly mentioned by the comment I was replying to. As I said, there are even cases where I find Tcl to be a better language than something very popular today like Python. Tcl has had features for years or decades that Python has only very recently acquired (full coroutine support, an event loop), and it's still much better for multithreaded programming.
I had essentially the same relationship with Tcl until I went through the Brent Welch book in great detail. From that point on, the only real cost in using it became not remembering the ... soup of what I ( or someone else ) had done when returning to old code. That's true of anything.
I presume that "look up which order..." refers to lsearch vs "string first". Yup. Annoying; one is "needle haystack", the other is the other way 'round. That being said, I am skeptical that this grinds anyone down. It'd have to be one grain of sand on a piece of sandpaper.
My experience with Tcl is mainly that if I have to extend or modify a script, some care must be taken to design the thing for that upfront.
But after a couple decades watching comp.lang.tcl, I think the big leap is that you either embrace the event driven model or it's not nearly as much fun.
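A minimal sketch of that event-driven style, using timer callbacks and vwait (the proc name and timings are made up):

proc tick {n} {
    puts "tick $n"
    if {$n > 1} {
        after 1000 [list tick [expr {$n - 1}]]   ;# reschedule in 1 s
    } else {
        set ::done 1                             ;# release the vwait below
    }
}
after 1000 [list tick 3]
vwait ::done    ;# enter the event loop until ::done is set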
> My experience with Tcl is mainly that if I have to extend or modify a script, some care must be taken to design the thing for that upfront.
That matches my experience with assembly language.
Even in C, I can just begin coding and a good design emerges from prototype code. Months later I understand everything, and have a good time maintaining it.
It may be the case that you've got Good Style, for whatever reason (inherent gift, good training, school of experience), but my take (personal experience/view only) on the two languages is that C style largely falls out of data structures[0], and Tcl style out of action directives (commands). I think of Tcl as a long lever where a small amount of code lifts a lot, either by composability[1] or by leaning on heavy-lifting extensions, but very command-oriented nonetheless. By comparison, I feel my C more often is centered around structs.
[0] Show me your code and conceal your data structures, and I shall continue to be mystified. Show me your data structures, and I won't usually need your code; it'll be obvious. -Eric S. Raymond, The Cathedral and the Bazaar
The exposure of features like "upvar" is the programming language designer saying, "This shit is implemented in exactly one way, which is interpreted, and it will stay that way forever. Thus, here are the run-time keys to the interpreter environment, so you can implement for yourself that which I won't. Good luck!"
I have used Lisp, Scheme, Prolog (and multiple dialects of each) for extended periods in large projects. Each was a pleasure to work with. They all have their own advantages. I am not sure if there is value in arguing over x feature of one language vs. y of another. As a matter of fact, it was fun to write Prolog interpreters in Lisp, and Lisp interpreters in Prolog, for example. (Many courses / textbooks (used to) present these as side projects to do and the task is simpler than it sounds.) One could say Tcl is really an extension of Lisp or Prolog in its core concepts of syntax, data and program equality, style of interpretation, etc.
I agree wholeheartedly with the last paragraph of Mr. Ousterhout's reply here - and I must say, a smart, classy and almost Tcl-ish way of taking a jab at its detractors:
I came across Tcl/Tk while doing Motif in a large C-based project. I simply couldn't believe how powerful, simple and succinct Tk was compared to Motif (or anything else since). And it ran on Windows, Unix, Linux and Mac OS, to boot. It was too late for that project to switch, but all my other projects have used Tk if they ever needed a UI.
Similarly with Tcl. I still think the best introduction is Mr. Ousterhout's original Tcl/Tk book. It stands as one of the best language books on my shelf. Combined with Tk, one can put together a working application prototype in no time.
Of course, this was eons ago. Nowadays, Tcl offers one of the best environments, with a complete set of packages ranging from web servers to image and sound processing. Plus, you can distribute your program and all of its media and support files easily as well.
I believe Tcl has been mischaracterized and has suffered in popularity as a result. But for insiders, it remains one of those secret, indispensable ninja tools used over and over again for competitive advantage.
A bit off topic: did you really use Prolog for something other than research? That is, did you write software that was used in production? I learned it many years ago but always had a nagging feeling that it is better to use a "normal" language with AI libraries if you need the program to actually do something... Did I miss anything?
Most of my Prolog experience was in academics - for my own research as well as for maintaining some existing packages. But I did get to use it outside as well. There are many cases where a controlled language is needed or comes in handy and solves a lot of issues. This is where Prolog shines - plus you get to use a lot of the previous stuff you developed as well.
When it comes to an "AI" language vs. a typical language with an "AI" library, I will just repeat a phrase that is often used in the field: what you think of as an AI problem today will no longer be considered AI tomorrow, since it will then be well understood and implemented in many places. People used to say that in response to the (seeming lack of) progress AI had made to date. A good example of that is Siri or Amazon Echo. Not many people consider it an AI problem anymore, or so it seems.
Dunno if I'm dense, but reading that second post, I can't figure out if he's saying Lord's name was attached to RMS's post, vice versa, or some third party attached both of their names.
I'm guessing you mean it's the second possibility, but I can't figure out how you get to there from reading it.
Two things about it make me think that RMS actually wrote it (probably after consultation with Lord). The first is this sentence in that post by Lord: "Who cares if a post occurs under my name that I didn't post so long as I more or less agree with it -- this isn't about ego."
The second is writing style. The Tcl War post is classic RMS. Compare it to this reminiscence from Lord and it's hard to imagine they were written by the same person.
When I wrote my comment above I skimmed through Thomas Lord's essay, found that the parts I remembered being there were all in their place, pasted the link and pressed "submit".
In particular, I remembered the paragraph from which you quote as written in an ironic manner, suggesting with a wink at the reader that Lord actually wrote the post. Now that I've reread the essay, I don't see that at all. I think in that paragraph he probably says what he means. That, and you are right about the style.
In short, I retract my claim. Sorry to whoever I misled! My confidence was completely misplaced.
I should be wary of the account anyhow. For calibration: it's just false to say egcs was due to "friends of Cygnus" and "hostile"; it was friends of GCC with the intention it should turn out as it did, happily.
I can't find anything archived to check but, as I recall, Lord later posted a bizarre sort of semi-retraction in rms' name, and rms fired him soon afterwards, possibly as a result. Perhaps that's partly a source of rms-didn't-write-it ideas? I don't remember the suggestion at the time.
Having spent the past few weeks debugging a Gtk application, Gtk is such a step backwards from Tk. It's so difficult to predictably lay out a Gtk application in code, things which Tk solved with the beautifully elegant pack and later grid models.
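As a reminder of how little the pack model asks of you, a minimal sketch (the widget names are made up):

package require Tk
# two widgets, laid out with a single geometry-manager call each
pack [label .msg -text "Hello, Tk"] -side top -fill x
pack [button .quit -text "Quit" -command exit] -side bottom -pady 4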
The Tcl language OTOH .. awful once you tried to do anything mildly complicated.
Depends on what you mean by "complicated". My habit is to lay things out as event crossed with state, and for that it's pretty even. IMO, FSMs are awfully good at reducing/decomposing complexity. An FSM in Tcl may look like:
array set FSM {
    init {
        ...
        set state two
    }
    two {
        ...
    }
}
...
set state init
...
eval $FSM($state)
I have put together too many things in this manner that would have been severely daunting any other way to agree with you.
I like to use both Tcl and Lisp. Tcl and Tk I use for smaller tools, and Lisp for larger application programming. While the Tcl syntax indeed is a bit rough, it's also very simple - I can strongly recommend reading Ousterhout's Tcl book for insights. It is also remarkable that Tcl is one of the few languages that, like Lisp, allow the user to define arbitrary syntax elements (e.g. you could rebuild if, for, ...) in the language itself. This is a very powerful feature which allows you to create new abstractions the original language designer did not build in.
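For instance, a user-defined looping construct that behaves just like a built-in takes only a few lines (a sketch; "repeat" is not a standard command):

# a user-defined control structure: run a script count times
proc repeat {count body} {
    for {set i 0} {$i < $count} {incr i} {
        uplevel 1 $body   ;# evaluate the body in the caller's scope
    }
}

repeat 3 { puts "hello" }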
Beyond that, Tcl comes with a rather complete runtime, and using tclkit, you can very easily create standalone and small (<2M) installation-free packages out of your program. This is very valuable, especially if you have to distribute to Windows. This is where Tcl easily pulls ahead of a lot of languages.
And often I use Tk for the user interface of my Lisp programs via Ltk, which brings both together.
Heh. In a recent overhaul of my site, I considered taking these pages down. But they do still get a reasonable amount of traffic, and I hate broken links.
Thanks to amelius for the "Tcl the Misunderstood" link. I agree with that, for the most part. I haven't written Tcl in many years, but I don't think it deserves the reputation that it has.
To my mind, there's a tremendous irony in the 1994 flame war. A user of Tcl since the 1990s, I'd read, and been amused by, the hyperbole contained in those long-ago comments a number of times. Not that flaming doesn't still happen; such emotion isn't hard to find. Though the subjects in dispute vary, the form of battle remains the same.
Goes without saying that there's a nub of truth in those ancient rants. Tcl can be kind of hard to catch on to, but then again, the Gnu "replacement", the Guile implementation of Scheme, is also hardly a mainstream favorite. Plenty of hate is expressed toward Lisp-derived fully parenthesized languages, Scheme included.
The not-so-dirty secret in the "language war" is that Tcl is itself very much a Lisp-like language, and over the years has increasingly adopted many features that make it even closer to languages like Scheme.
Having become reasonably fluent in both Tcl and Scheme I see similarities that outweigh syntactic differences. Not identical of course, but translating programs across them is fairly straightforward compared to C-like code, e.g., Javascript.
Maybe Lisp derivatives are an acquired taste, or maybe there's some inborn twisted-brain thing that entices some and repels others. At any rate, I'm convinced Tcl and Scheme are very capable languages, admittedly different in approach vs. more commonly used systems. Looking back it is ever so obvious that the old war was fought between two sects of the same tribe, and isn't that often the source of the fiercest rivalries.
Around the time leading up to the TCL war, Lua was peacefully and quietly born in a manger at the Pontifical Catholic University of Rio de Janeiro, Brazil:
"In 1993, the only real contender was Tcl, which had been explicitly designed to be embedded into applications. However, Tcl had unfamiliar syntax, did not offer good support for data description, and ran only on Unix platforms. We did not consider LISP or Scheme because of their unfriendly syntax. Python was still in its infancy. In the free, do-it-yourself atmosphere that then reigned in Tecgraf, it was quite natural that we should try to develop our own scripting language ... Because many potential users of the language were not professional programmers, the language should avoid cryptic syntax and semantics. The implementation of the new language should be highly portable, because Tecgraf's clients had a very diverse collection of computer platforms. Finally, since we expected that other Tecgraf products would also need to embed a scripting language, the new language should follow the example of SOL and be provided as a library with a C API."
The Guile vaporware was the first time I really lost respect for Stallman and the FSF. It's incredibly damaging to say "don't use TCL use Guile instead" and then never actually produce a useful Guile product. A similar pattern happened in the same timeframe with the Hurd kernel.
I have an enormous amount of respect for the FSF. But it has been most effective when it has working code, like gcc or emacs. The political pronouncements not backed up by working code were actively harmful.
I guess it took awhile, but Guile is quite a capable programming language and VM platform these days. I have hacked on package managers, web applications, game engines, static site generators, and more with Guile. It's a really wonderful project that is only getting better.
Many FSF projects that are not clones of other projects tend to get bogged down in getting it right in the MIT sense.
That's why a plucky student from Finland managed to steal their thunder by getting a monolithic kernel out there while the FSF was trying to get that microkernel working for the Nth time.
IIRC, GnuCash and LilyPond use Guile. The project itself now has some good maintainers, and has been making strides since its 2.0 release. Some parts of the (small) Scheme community have taken notice, and new projects are popping up; like GNU Artanis[0].
For a long time -- until Andy Wingo took over development -- Guile was kind of a joke of a Scheme. It was horrendously slow and buggy, and it was updated very infrequently. Since Andy took over, it's quickly become one of the better choices for Scheme. Guile 2.0 is solid and reasonably fast. The 2.2 release is shaping up to be even better and even faster. Andy has done a great job with it, and he keeps his blog[0] updated with a lot of interesting posts about Guile internals and development.
The Law is the basis of my concerns for Rust. I love the language, but I feel that using it consists of a considerable amount of yak-shaving; I have a habit of loving languages that violate The Law, Scheme and Haskell being among them.
Yep, should be the top comment. You can save yourself from having to know anything more about this incident by reading that and going back to whatever more interesting thing you were doing before.
"I didn't design Tcl for building huge programs with 10's or 100's of thousands of lines of Tcl, and I've been pretty surprised that people have used it for huge programs."
Lots of Tcl criticism stems from the fact that people started using Tcl in ways it wasn't intended for.
I have trouble finding things that require more than 10 kloc of Tcl, but Tcl has been extended since Ousterhout sorta faded on it.
It has had features like namespaces and package management since at least 8.x. Language comparison is always challenging. I've had less trouble with Tcl than with just about any other language. But it rather requires thinking about the problem in certain ways. The main thing it requires is adherence to an event-driven model.
Posting on a newsgroup for X that X sucks is the very definition of trolling. Since long before it came to mean "anyone who disagrees with me for any reason".
Dived into the tcl wiki after writing a program with tkinter (the tk python bindings); it's amazing to see a community speak about other paradigms, reading how they approached FP idioms for instance. I felt this way with perl too. These are two old languages that are mostly ignored nowadays (except that perl6 has shipped), but they have very interesting ways of doing things.
It's a tiny Tcl subset, implemented in only a few kloc. It's used by Fossil as a HTML templating engine but it's easily adaptable for other things. It's small enough to be understood and does an excellent job of demonstrating the core principles of how string-processing languages think.
@david-given probably knows this, but that TH1 is Tcl-ish is no surprise, given Fossil's founder/principal author is drh (Richard Hipp).
drh is a former Tcl Core Team member[0], and probably most famous as the author of SQLite, originally a Tcl extension that "escaped into the wild"[1]. SQLite also has a comprehensive test suite written in Tcl[2].
I was (pleasantly) surprised to find out Fossil used a Tcl-like language for templating.
Tiny Tcl implementations are fun in general and perhaps easier to produce than tiny Scheme implementations — as long as you are willing to concede that everything really is a string. :-) (I.e., no caching binary representations; you have to parse from scratch each time.) In a similar vein as TH1 but with a little more functionality there are Picol (https://tcl.wiki/Picol) and LIL (http://runtimeterror.com/tech/lil/).
Disclosure: I maintain an expanded fork of Picol. The original version of it written by antirez was only ~550 LOC but with suchenwi's additions and mine it is now around 3100. There is a link to it on the wiki page. The change that I am most fond of is making it an stb-style (https://github.com/nothings/stb/) header library.
You don't have to treat everything as a string. I've written a Tcl interpreter in VHDL, and it avoids the traditional pre-8.0 string interpretation. It parses everything into an AST and interprets that instead. This avoids the overhead of string-processing mostly static code inside another interpreter that isn't well suited to such tasks.
Oh, I agree with you on that. My point was that Tcl makes it especially easy to "cheat" and avoid building an AST. You may do this when you only care about writing an interpreter in few LOC, like the original version of Picol, quickly.
A Tcl interpreter in VHDL sounds interesting. Do you have a public repository for it?
TH1 (literally "Test Harness #1") was originally conceived as a minimalist reimplementation of TCL sufficient to run the TCL-based SQLite test suite on SymbianOS. That use case never materialized. But later, when I needed a small and lightweight scripting language for Fossil, TH1 was drafted for that alternative purpose.
It so happens that the 14th annual European Tcl/Tk users conference just took place [1]. Quick perusal of the event site shows the breadth of uses Tcl is still being put to (enthusiastically, by programmers who choose to use it):
* nuclear power plant operations
* high-performance visualization and analysis of microscopic images
* design analysis of high-resolution camera light sensors
* embedded programming of Cortex-M3 ARM v7 microcontrollers
* satellite control systems
* multi-platform/compiler automated C++ library compilation
* customization of the Fossil VCS
* scripting and visual programming on Android
* communicating sequential process programming
* Linux D-Bus scripting
Slag Tcl all you want, but it's still being used every day by brilliant programmers to ship real products, in environments where reliability is crucial. How does Guile stack up to that? Or any other unfashionable programming language for that matter?
Maybe more effort should be put into understanding what these programmers see in Tcl that the critics can't.
Coming in from the harsh fronts of YouTube, Reddit and 4Chan, this is an awfully tame flame war.
In any case, a lot of good points are raised by both sides. As an Emacs user, I appreciate Stallman's points, but he does make a fatal mistake in that he doesn't include popular extensions like itcl in his evaluation. That's like evaluating lisp without considering popular macros, or the potential of macros. While that doesn't alleviate the problems Stallman has with tcl's lack of data structures, those are not insurmountable with a couple of well-written C extensions, assuming that tcl's C interop is any good at all.
Also, tcl is a pretty terrible lisp, as although they have a lot in common, the differences are substantial enough for fans of one to have some degree of distaste for the other.
The most entertaining part is that Stallman criticises tcl for lacking common data structures, being slow, and having bizarre syntax that appeals to hackers alone. ESR (and, I believe, Stallman, although I am unsure) goes on to note the common use of dynamic scope, the confusing quoting mechanism, and the horrors of upvar and uplevel.
Gee. DOES THAT. SOUND. FAMILIAR?
If it does, it's because most of those problems also appeared in MacLisp, the direct predecessor to Common Lisp. Many of them also appear in other lisps:
Linked lists, an unusual data structure, are one of the few well supported by Lisp. Lisp used to be quite slow, and many implementations still are. Dynamic scope is still the default in elisp, and macros just barely hold the system together when you're writing lots of code, a circumstance in which Stallman claims elisp is actually better than tcl. As for confusing quoting, `(there ,(is 'quite) ,@(a lot) of that in lisp). And upvar and uplevel are in some ways less dangerous than macros.
Actually, lisp used to have its own version of upvar and uplevel, coupled with a mechanism called the fexpr that let you get rid of a lot of the quotes. These were not implemented in most later lisps, due to the fact that they are nearly impossible to optimize well.
There was an interesting article by Ousterhout (Tcl creator) about scripting languages versus system programming languages some years ago. I think I read it in one of the print computer magazines that existed at the time - Software Development Magazine, Unix Review, Dr. Dobb's Journal, C User's Journal, or some such. I wonder whether he was inspired to write that article after this post (mentioned elsewhere in this thread):
Just took a quick look at it. Need to read more, which I will, but pretty sure it is that one. I doubt he would write more than one article on that topic, also it is slightly long (about 24 PgDn keystrokes on my PC :), which I remember from before.
Excellent. I thought of googling for relevant keywords to check if the article was online (after posting my previous comment here), but you saved me the trouble. Thanks!
The guile history also mentions the TCL war from their point of view: https://wingolog.org/archives/2009/01/07/a-brief-history-of-...
although it misses the important technical details.
Stallman was right with his criticism, but could of course convince nobody to add the missing features. Some of them came with 8.5 eventually but support for lexicals (upvar) is still horrible.
There was a semi-rebuttal … I think in a post from Brent Welch. Tcl does have things called "arrays", and you use them much as you'd expect. Tom Lord confirmed later that what he and RMS meant by this was that it didn't have "true arrays" that are laid out contiguously in memory, to be much more efficient. That's a valid criticism: Tcl's arrays are basically dual-use hash tables, much the same as JavaScript's arrays.
The other two things are totally valid. Tcl has no references, so you can't build your own linked list. And all values are (at least semantically) strings, so numeric operations are slow.
But of course, a lot can be done to optimize those things in the implementation, and over time at least some of that optimization was done.
> all values are (at least semantically) strings, so numeric operations are slow.
All values are _still_ semantically strings (it's a core tenet of Tcl: "Everything is a string (EIAS)"), and in 1994 they were practically strings, too (that was the implementation). Later, with Tcl 8.0, Tcl became byte-compiled and adopted "Tcl_Obj" to represent values[0]. The Tcl_Obj (nothing to do w/ object-oriented programming) is what's called a "dual-ported" object. It contains a string representation of its value (to preserve EIAS), and a "native" value from its last computed use. For example, if you:
% set a 9.0
% expr { $a * 0.01 }
The native value of the Tcl_Obj that $a refers to will be a float. Repeated uses of $a in a float context will continue to use this already-computed native value; no need to re-parse the string representation.
Well, right. Tcl was an early example of a trend where key data structures were built into languages and fairly well encapsulated as "one size fits all" abstractions. Computers had become fast enough that for many purposes you didn't have to build a custom list or hash table.
That all seems normal to us now, but it was uncommon back then.
I love how RMS always begins his war on a language with "it does not have a LISP/Scheme syntax".
University coders and their obsession with purity of syntax (the map) over actually dealing with the true problem of how the hardware actually looks (the territory, practicality) seems to me like an Aristotelian argument for the superiority of style over «getting things done».
Well, as spotted in the answers, expect is a nice non-university-bred innovation (is it?) that is still a de facto standard for handling interactive (stateful) sessions, with a simple syntax that helps get things done with maintainable code.
It has been ported to Perl, Python... and I think this is probably a point in favor of what Tcl is good at: getting things done.
Practicality beats purity.
And as of today it is still the most portable way to write an embeddable graphical user interface that has a good FFI.
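For illustration, the kind of Expect script meant above might look roughly like this (host, prompt and credentials are all made up):

# drive an interactive login session; spawn/expect/send are Expect commands
spawn ssh demo@example.com
expect "password:"
send "not-a-real-password\r"
expect "$ "
send "uptime\r"
expect "$ "
send "exit\r"
expect eof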
Lisp is close to syntaxless. What "lisp syntax" means is a system principled to the bone, including its syntax. The fact that everything in lisp was in recursively defined lists meant you could reuse recursive thinking everywhere. An `infinite` tower of reuse. You get genericity, pattern matching, and combinatorial generation for very low cost. It's not practical for people crunching data or opening files .. but whenever you need to attack something a little more complex, the 'practicality' argument shifts sides.
Well, if he meant "it doesn't have S-expressions", he should have said that. That's not an unreasonable expectation for a guy who is otherwise a pedant in almost all other areas of his life.
Fair point, and to be honest, it's a common problem in many places. People enjoying a property often fail to communicate it clearly with those who never did. Math, and I believe Lisp has many times more inherited math genes than other languages, is especially guilty of this. It's often a lethal cultural shock.
S-Expressions are fun. When I first learned a Lisp, Scheme in my case, I found them weird but could see the appeal.
When I learned Haskell afterwards, it was the opposite: I thought I'd miss S-Expressions, but I learned to like its more baroque syntax.
S-Expressions have the chief virtue of making even the most crude macro look no different from the most over-engineered builtin to the language from the user's point of view.
And they clearly separate the syntactical space in the language, so that new features (whether as macro or built-in) can always find a space. Compare to the cracks of C they had to fit C++ into.
Haskell might be even worse. The chief obstacle to adding OCaml or-patterns to Haskell (https://stackoverflow.com/questions/24700762/or-patterns-in-...) seems to be finding a syntax that's both pleasant and fits between the strands of the fine web, in ASCII space, of already valid Haskell programs.
...And I never grew to like Haskell. The syntax makes all of your programs look very elegant, but there are no clear delimiters. The syntax for how functions are delimited isn't clearly explained by most sources. And I don't want to type my code and just hope that the magical compiler fairies deduce my meaning. That kind of programming attracts bugs like a man covered in honey.
I don't know what that means, but if the objection is to layout ("the off-side rule"), that's just an alternative to the braces and semi-colons that you'd typically use if it was machine-generated, for instance.
The objection isn't so much to the layout, as it is to the fact that I can't find any documentation as to what the layout is and how the compiler interprets it.
Interesting. I've never bothered to look up the formal specs, but it was never a problem that led to an ambiguous interpretation of my programs. (I sometimes write programs that the compiler doesn't like, but seldom ones that the compiler interprets differently from me.)
I usually give types for my top-level functions. It's good documentation, and also prevents most of these kinds of issues. (For a similar example, human mistakes with operator precedence usually lead to compile-time type errors in Haskell, seldom to runtime problems like in C or C++.)
It's not a huge problem, but lack of understanding of how the compiler delineates blocks of code makes the code hard to reason about. I like to know what the compiler will do with my code, and what is and isn't valid through something other than trial and error. Scheme's perfect regularity makes that easy. Haskell... doesn't.
I am aware that the information exists somewhere, and I think I got a link a bit back, so I'll have to go track that down. But it is absolutely baffling that people don't discuss it more. Would you be able to write Python if nobody told you about the indentation rules? Of course not.
> And I don't want to type my code and just hope that the magical compiler fairies deduce my meaning. That kind of programming attracts bugs like a man covered in honey.
I wrote plenty of Haskell commercially (and for fun). Not once did we run into that problem.
> It's not practical for people crunching data or opening files .. but whenever you need to attack something a little more complex, the 'practicality' argument shifts sides.
"not practical for people crunching data or opening files" in what way? It's been extensively used for crunching numerical and symbolic data, and I've no idea how opening files is an issue.
Oh, I meant where he's living intellectually. But I think my perception of what classifies an academic is pretty skewed: I managed to code Haskell for money for the first five years of my career (now still sneaking it in at Google in my 20% time), and don't consider myself an academic at all.
Perhaps I also have a European bias here. After the war, the Americans got to play with their computers early on, and their academics ended up with Lisp. The European computer scientists had to wait, and thus had nothing better to do but think. They ended up with types and ML (the family that Haskell sprang from).
These days, it seems that the European tradition has firmly won in the most academic parts of computer science. Lisps are still around, and with Clojure have seen more mainstream success than ever before. But in academia the only Lispers left standing seem to be the Schemers, and they have warmed up to static types, too. (They did some great work on contracts.)
Here he begins by saying that TCL, in effect, does have Lisp syntax (which "hackers" like) but that the regular non-hacker users will balk at it.
This part of his proclamation anticipates and softens the ironic impact of subsequently promoting a Scheme, further mitigated by in the same breath promising the availability of a parallel notation that will provide the blub syntax for non-hackers.
(Of course, he was right about a dual syntax approach being viable: today we have virtual machine environments with multiple languages. As a Lisp hacker, Stallman knew about the existing history of such practice, like CGOL for Common Lisp and whatnot.)
It did later come out that Tom Lord wrote it, but not as a prank; he was working with RMS to build Guile, and this was the opening salvo in trying to build the case for it. RMS did post it under his own name.
Well, fake money is well done when it looks as good as the original :)
Just to say that I did not invent the lisp/scheme obsession; this is directly from his web page[1]:
The most powerful programming language is Lisp. If you don't know Lisp (or its variant, Scheme), you don't know what it means for a programming language to be powerful and elegant. Once you learn Lisp, you will see what is lacking in most other languages.
Lisp is still pretty damn practical. I can crank out working code far faster in Scheme than anything else. Because the mental model is so small, you focus more on your design than on getting it to fit your language.
And if you like TCL's command syntax, srfi-49[0], srfi-110[1], and srfi-119[2] all push scheme towards that to varying degrees.
Well RMS didn't go with a unix clone because it was what he wanted, but because he figured it was more likely to get to a working state. At his heart he is always an AI Lab guy, lamenting the demise of the lisp machine.
Tcl and Python (among many others) were also designed by "University coders". When it comes to language design, academic origins tend to have little bearing on how "practical" a language is.
In programming, that which you're calling the map is in fact the territory. The analog of a map is design documents: these indicate how the territory is going to be soon, or how it is.
In the physical world, we change the territory (e.g. alter the landscape, build). That's what is analogous to coding.
It is helpful for the territory and all of its manipulations to be clean and well-organized.
This is not just an "obsession" of "university coders".