Fortran.io – a Fortran Web Framework (fortran.io)
422 points by lelf on Jan 22, 2020 | 246 comments


It's definitely not a dead language – a new Fortran front-end has recently been accepted into the LLVM project, joining Clang. That will make C, C++, Objective-C, and Fortran the only languages with in-tree front-ends in LLVM (excluding the various IRs).

It had been scheduled for merge into the LLVM monorepo 2 days ago, but has been delayed pending some additional architecture review.


Lots of number crunching code relies on Fortran libraries even today.

Install numpy; it will pull in Fortran stuff.


Numpy actually doesn't need fortran. It will use fortran LAPACK libraries for optimization if they're present, but it doesn't depend on fortran at all.

Scipy is a different story.

The need for a fortran compiler was a big part of the original rationale for the divide between numpy and scipy when they replaced Numeric. Numpy was meant to be the lightweight/core version of things, and scipy was the full-fledged environment. (I think numpy was even called scipy-min, or scipy-core, or something along those lines for a bit.) A key differentiator for what went into numpy vs scipy was whether or not it needed a fortran compiler. That's still true today -- numpy explicitly doesn't depend on anything fortran-related, and any fortran it can use is optional.

(I'm not authoritative on any of this; I'm just someone who's been doing scientific computing in python (and fortran) since the Numeric days.)

On a different note, modern fortran (i.e. F90 and above) is actually a really nice scientific software language. It's _still_ really hard to beat, and modern fortran is pretty maintainable and easy to read in addition to being performant. I've dealt with highly optimized fortran codebases and highly optimized C codebases for the same scientific tasks. The fortran ones were usually a lot easier to read, more maintainable, and still faster. Fortran is not really a general purpose language, but if you're doing lots of number crunching on large arrays, it's damned nice.
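
To make that concrete, here is a minimal illustrative sketch (my own toy example, not from any real codebase) of the kind of whole-array code modern fortran makes easy:

    program array_demo
      implicit none
      real :: a(1000,1000), b(1000,1000), c(1000,1000)
      call random_number(a)            ! fill with uniform randoms
      call random_number(b)
      c = matmul(a, b)                 ! intrinsic matrix multiply
      where (c > 0.5) c = 0.5          ! elementwise masked assignment
      c(:, 1) = 0.0                    ! slice assignment: zero the first column
      print *, sum(c), maxval(c)       ! whole-array reductions
    end program array_demo

No loops, no index bookkeeping, and the compiler is free to vectorize or parallelize each line.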


Yep! Seconding what you said: A key aspect is that it’s very easy to write fast matrix math in Fortran. Vectorized operations are downright pythonic!

I was amazed how much more complex things were when moving to C++ (as opposed to C or C written in cpp files which you’ll find in academia and some books).

Of course, that ease is a hazard when all the modern (i.e. fun) sci-eng software development is in C++.

Do not grow complacent in school, ye numerical methods researchers! Write your stuff in C++. The job market awaits.


> Do not grow complacent in school, ye numerical methods researchers! Write your stuff in C++. The job market awaits.

Could you elaborate a bit on this? I think I know what you mean, but I'd like to hear more if you don't mind.


Sure - I am speaking broadly of the job market for computational programmers outside of academia and national labs, and I should say that I am in the USA. If you did a masters or PhD in CFD or in finite elements for structures (and I'd wager the same holds for E&M/Maxwell, computational physics, and numerical simulation generally, fuzzy as the boundaries are), then it's quite possible you came from a school where your professor(s) wrote old-style C++ (or maybe straight-up procedural C), or used only the most basic features of Fortran 90 or, worse, F77. The main thing is that while you did something noteworthy in your field to get your degree in engineering simulation or computational physics, you used a language, or language variant, that is no longer common in actively developed commercial projects. So outside of academia and national labs, which are hard to get into if you are coming from, say, a state school with fewer connections (and did not intern somewhere because of research obligations at school), commercial codebases, as far as I have seen, hire "from without" largely on the basis of extensive C++ computational/numerical experience.

Not such a big deal if you are a pure-researcher in, say, CFD, and not such a big deal in a national lab, where the big codebase may be FORTRAN anyway (caps to denote old school code bases). But it really limits your mobility not to be "C++ first". Lastly, if you land a job at a big numerical analysis company, you may be looking after legacy products forever if you are "just a Fortran person".

Well, this is obviously anecdotal. Sorry to state such bald opinions. But it's what I've seen, and it really would have been helpful to know as I was going through school. I wrote a lot of Fortran and Python. It's left me playing catch-up, and catching up then would have been comparatively easy next to my time budget now.


I'll second all of that. It's well worth actually learning C++ (and not C masquerading as C++) if you want to go into the broader scientific computing job market. I certainly wish I had focused more on "real" C++. I have maintained some very large C++ codebases, but I still don't really know C++, just "C with classes". Thankfully, C++ is not my day-job anymore.

I'd also give a quiet nod to trying to get familiar with modern dev-ops tooling, e.g. docker, kubernetes, and interacting with at least some forms of cloud storage. Get used to the idea that a bucket is not actually a directory (it's a hierarchy of prefixes) and learn how to use that. Think of writing a dockerfile like writing a makefile: it's worth learning the basics, and it's not hard for simple things. HPC work has traditionally been a lot of fairly custom clusters (Condor, anyone?) or problems that are best solved with a lot of RAM on one machine. However, things are moving towards commodity solutions (i.e. "the cloud") or at the very least common software layers. You don't need to understand the ins and outs of managing a kubernetes cluster, but it's helpful to have used kubectl to spin up and debug a pod or two at some point.

However, I will say that there seem to be more Python-based jobs than C++ or Fortran jobs in the industry-based scientific computing world these days. Perhaps less so for fluid dynamics or finite element methods or other "simulation on a grid" problems, but there are a lot of python codebases out there in industry right now. I think it's very important to be comfortable working in higher level languages as well, and python is kinda becoming the lingua franca of a lot of sub-fields. For better or worse (I'm in the "better" camp), a lot of users are going to want a python library for what you're writing. It's best if you're used to "thinking in python" and can design a pythonic api.


Thanks, and it's nice to hear that about Python! I don't know why I am surprised, as the more forward thinking folks at (even such a place as) my work are having a Python api built for the stuff they look after presently (their bit does structural analysis). That should make design generation/variation/optimization a lot more fun.

I guess my fear was that all the Python jobs (that I'll find) are going to be "machine learning" -- but really that would just mean data munging to feed TensorFlow (or a similar lib) and post-processing. Total snooze fest, unless "post-processing" means something more like design generation/variation/optimization - and we are back to the api.


I guess "Formula Translator" lives up to its name then!


Thanks this makes sense. Somebody on Reddit told me the other day that numpy used Fortran and I couldn't figure out where they'd come up with that.


I'd like to write some Fortran, but there are a few libraries I'd need first, like database access, and then I'd be set.


I wonder if we can use a Python wrapper for that?


At that point I'd just use Python.


I don't work on it directly, but the code base where I work has lots of Fortran. I'm pretty sure they've been trying to migrate away from it since the early 90s, yet it's still there. Fortran has an amazing amount of staying power.


There are some optimizations Fortran compilers can make which C compilers cannot, though we can use the "restrict" keyword to let C make the same assumption Fortran makes about aliasing pointers (i.e. that there are none).

However, in order to support all existing C programs, I believe they still can't reach the same level of optimization that Fortran does. It'd be nice to see this gap closed, in fact.
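
To illustrate (a minimal sketch of my own, not from the thread): in Fortran, dummy arguments are assumed not to alias, so a routine like the one below can be vectorized without runtime overlap checks, while the C version only gets the same freedom once the programmer promises it with restrict.

    ! x and y may be assumed not to overlap, so the compiler can
    ! vectorize freely. The C equivalent needs restrict to say the same:
    !   void axpy(int n, float a, const float *restrict x, float *restrict y);
    subroutine axpy(n, a, x, y)
      integer, intent(in) :: n
      real, intent(in)    :: a, x(n)
      real, intent(inout) :: y(n)
      y = y + a*x
    end subroutine axpy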

C++ can actually do better with some numeric code, since commonly used libraries such as Eigen perform expression optimization at compile time.


I sometimes think R is just a frontend for Fortran. For the packages I install, I always seem to be waiting for Fortran code to compile.


Latin was dead long before it stopped being used in written works. It died when it stopped being spoken and stopped changing.

I don't know if Fortran would be considered dead by the "no longer adapting" definition, but that it's still in use isn't proof it's alive.


Fortran receives updates to the language spec with new features, and modern fortran is pretty different both syntactically and semantically from the 70s fortran you are probably thinking of. It is also extremely fast, often faster than C due to less pointer aliasing which lets compilers optimize loops more. (or so I'm told)


Fortran's most recent spec update was in 2018...

https://en.wikipedia.org/wiki/Fortran#Fortran_2018


I work in computational fluid dynamics. Fortran is very much alive and being updated. The bigger issue, as I see it, is the adoption of newer features by current programmers. I know many people who program in modern Fortran (the 2003 and later standards with object-oriented features), but I also know some people who still program in Fortran 77 (either literally or in spirit). The second group tends to work on older "legacy" projects that may be decades old at this point. The maintainers usually decided that the cost of updating to a newer standard wasn't worth it ("if it ain't broke, don't fix it").

That said, I only write in modern Fortran when I use Fortran. The newer features make it much easier to write complex data structures, in particular.
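
For a flavor of those newer features, here is a minimal hypothetical example (names made up) of a Fortran 2003 derived type with a type-bound procedure:

    module particle_mod
      implicit none
      type :: particle
        real :: pos(3), vel(3)
      contains
        procedure :: advance              ! type-bound procedure (Fortran 2003)
      end type particle
    contains
      subroutine advance(self, dt)
        class(particle), intent(inout) :: self
        real, intent(in) :: dt
        self%pos = self%pos + dt*self%vel ! whole-array update
      end subroutine advance
    end module particle_mod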


But Latin changed quite a bit even after it stopped being spoken as a first language. There are courses in reading Medieval Latin because it isn't that close to Classical Latin.

In regard to Fortran, things are changing too. There's even a Fortran 2018 standard, which is fairly close to Fortran 2008, but would be almost unrecognizable to people who learned Fortran 77 or earlier.


Latin never died or stopped changing or stopped being spoken; it is still spoken by about a billion people today — they just stopped calling it “Latin” at some point.


I'm ignorant. Who speaks Latin today under a different name?


Speakers of French, Spanish, Italian, Portuguese, Romanian, Catalan, and various other lesser-known languages and dialects.

All of these are directly descended from Latin in exactly the same way that today's English is descended from 1900s English, just over a longer period.


I think that's pretty disingenuous. The fact that my cousins and myself are around doesn't mean that my grandparents aren't dead.

Latin is certainly a dead language, even though it has living descendants.


Is 1900-AD English "a dead language" because nobody who spoke it natively is still alive? What about 1800, 1700, or 1000? What year did English die and if it did indeed die, what are we speaking now?

The reasons for considering English circa 1000 and English circa 2020 to be "different stages of the same language" but Latin and French to be "different languages" are an arbitrary cultural distinction not rooted in science. The English of Beowulf is just as different from the modern variety as French is from Latin. And at every point from the introduction of Latin into France thousands of years ago until the present day, everybody could communicate perfectly easily with their own grandparents, and thought they were speaking the same language.


The natural conclusion to this argument is that everyone on earth is speaking the same language, because they all descended from the Mitochondrial Eve’s utterances 150000 years ago.


No, the natural conclusion is that whether two varieties are "the same language" or "different forms of the same language" is a cultural question, not a scientific one that can be answered objectively.

(Tangent -- certainly not all languages are descended from a hypothetical Proto-World, since we have examples of brand new languages forming: any creole languages, or Nicaraguan Sign Language. Whether most of the major language families are descended from a Proto-World is very much an open question, and probably one that will never be answered, since even if say, English and Navajo are genetically related, they diverged so long before we had written records that no trace remains.)

In any case, there's a meaningful difference between a language gradually changing over hundreds of years such that the newer varieties are quite different from the old ones, and a language truly dying, because it has exactly one remaining native speaker, and that person dies.


> No, the natural conclusion is that whether two varieties are "the same language" or "different forms of the same language" is a cultural question, not a scientific one that can be answered objectively.

It's a question that linguists (who are scientists) strive to answer in ways that are useful to studying, reasoning about, and explaining language. It's not much different than the species problem in biology. Everyone knows that speciation happens gradually, but scientists still propose ways of defining and explaining speciation to aid in scientific inquiry. The labels and dividing lines themselves are not empirically observed, of course, but that doesn't mean they're unscientific or outside the purview of scientific inquiry.


Do you have any examples of serious papers written by serious linguists debating whether X and Y are the same language or not?

I would be very surprised if such a thing exists.


As far as I know, there generally aren’t papers that contain debates between multiple linguists. Scientific papers usually aren’t written as debate transcripts. But there are certainly many papers about dead languages, and about the lines between languages. You can look at pretty much any of the citations on Wikipedia articles about dead languages like Middle English, or creole languages.


> Scientific papers usually aren’t written as debate transcripts.

Sure they are, although a debate wouldn't be contained in one paper written collaboratively by debating authors, as you seem to imagine -- it would play out over a series of papers, each of which cites previous papers and claims they're wrong.

For example, Timm 1989[1] argues that modern Breton is a VSO language, disputing the claims of Varin 1979[2], who claims it is SVO, which in turn disputes the traditional understanding that it is indeed VSO.

Or the famous paper of Haspelmath 2011[3] citing many other authors' proposed approaches to word segmentation and arguing that they're all wrong (i.e., that "word" is not a meaningfully defined concept in linguistics).

Where are the papers that you claim exist about the lines between languages? If this is really something mainstream linguists care about, you should be able to give examples in non-fringe journals.

I just checked the citations on the Wikipedia article for Middle English like you suggested, and found zero papers about whether Middle English should be considered "the same language" as modern English. Can you tell me which ones specifically you mean?

[1]: Timm, L. (1989). "Word Order in 20th Century Breton". Natural Language & Linguistic Theory, Vol. 7, No. 3, Special Celtic Issue, pp. 361-378 (https://sci-hub.tw/10.2307/4047723)

[2]: Varin, A. (1979). "VSO and SVO Order in Breton". Archivum Linguisticum 10, pp. 83-101

[3]: Haspelmath, M. (2011). The indeterminacy of word segmentation and the nature of morphology and syntax. Folia Linguistica, 45(1). (https://sci-hub.tw/https://doi.org/10.1515/flin.2011.002)


It's very much used in science for naming things, for studying old literature, and also in the Catholic Church. We also had Latin courses in high school. Also, many words in many European languages, including English, have Latin or Greek roots.


I studied (classical, not ecclesiastical) Latin for about 8 years, and I don't think it's accurate to claim that the Romance languages are in any meaningful sense "Latin".

The weaker (and more reasonable) claim is that learning Latin improves your ability to recognize words and (very basic) structures in its descendants.


Roman graffiti from Herculaneum and Pompeii in particular shed some light on early Vulgar Latin, suggesting that the regional homogeneity of the vulgar register was already breaking down by the time of the Plinian eruption. Given the large volume of uncovered graffiti, it is fairly easy to discern several trends, notably the loss of written diphthongs (æ->e, oe->e; comparably au->o) and the loss of final unaccented consonants and medial vowels.

http://ancientgraffiti.org/about/ is an excellent resource specifically for Herculaneum and Pompeii, but it also links to broader collections to which the project has contributed.

There are interesting aspects of graffiti throughout the Roman Empire. Children (or exceptionally short adults) practised writing on walls; some taller people's graffiti showed not just literacy but familiarity with Virgil and even decent command of Greek and other second languages. Conversely, numerous graffiti are supporting evidence for partial Latin literacy among speakers of other languages, even among celtic-language informal epigraphers in the west and northwest in the first decades CE. It seems likely that these influences "accented" day-to-day Latin, perhaps comparably to https://en.wikipedia.org/wiki/Singlish .

A couple of centuries later and one could see a clear split between the classical register and early Vulgar Latin. Appendix Probi is interesting, and some examples are listed at https://en.wikipedia.org/wiki/Vulgar_Latin#Evidence_of_chang... which also draws some comparisons with Romance languages. (cf. https://en.wikipedia.org/wiki/Appendix_Probi )


Latin and the Romance languages are indeed very different, but their similarities are much stronger than just vocabulary. The Romance languages have lost noun cases and gained prepositions, fixed word order, and definite articles, but they retain noun genders (though without neuter), remnants of the case system in pronouns ("je", "me", "moi"), many of the same verb tenses, and the subjunctive/indicative moods, to give a few examples.

Anyway, my point is that it's an error to say that Latin "stopped changing". Other languages have changed similar amounts: modern Americans can't understand Beowulf, modern Greeks can't understand Homer, and modern Chinese people can't understand Confucius, but nobody would claim that English or Greek or Chinese died and stopped changing. The fact that people call Latin, but not English, a "dead language" is purely due to the fact that the different stages of English all happen to be called "English".

In an alternate world where Latin was exactly the same as it was in the past, and Italian is exactly the same as it is now, but Latin had never spread outside of Italy, I suspect that we would today call Latin "Old Italian" (or perhaps we would call Italian "Modern Latin"), and nobody would be having this discussion, despite the scientific/linguistic facts being identical to what they are in our reality.


> The fact that people call Latin, but not English, a "dead language" is purely due to the fact that the different stages of English all happen to be called "English".

I don't think this is the case: the language that Beowulf is written in is normally referred to as Old English. Chaucer is Middle English. Shakespeare is Early Modern English. William Makepeace Thackeray is Victorian English.

IMO, it's reasonable to assert that each of these are "dead" in some meaningful way: even Early Modern and Victorian English, despite their intelligibility, are simply not spoken by any group of current-day English speakers.


The difference between Beowulf English and modern English is not at all the same as the difference between, say, Latin and Italian.


You're right, Italian is much closer to Latin than English is to "Old English".


Do you care to elaborate?


ugh you still speak Latin 77? the new Latin 2018 spec is way more readable


Latin was in use by the Church well into the 20th century, until the Second Vatican Council in 1962.


That didn't end Latin being used by the Church.


Correct. It is still the language used for official documents by the Holy See (though it is not the official language of the Vatican).

That said, Latin was certainly considered a dead language before Vatican II.


Fortran is doing fine in scientific computing.


In physics it's being replaced by C++ even where it requires rewriting lots of code that's already implemented and tested.


Speaking from my experience at CERN, someone needs material for their thesis.


Quid?!


I don’t think that works (which kind of proves the grandparent’s point). The appropriate interjection would probably be one of the following:

Ecce?! Hui?! Vah?!


Interestingly in my high school, we had more people who studied Latin than German.


Looks vulnerable to SQL injection attacks. (https://fortran.io/static/model.html)


I held out hope that maybe there was some sort of special treatment for that parameter. But then I looked at the source and... nope, it's just slapped on the end of the command.


I couldn't remember, but checked - I am replacing ' with _ in the actual code https://github.com/mapmeld/fortran-machine/blob/master/marsu...


Generating an SQL string from any user input is not accepted best practice, even if character replacement were a sufficient way to prevent injection (I suspect it also disallows a parameter string containing a legitimate, escaped ' character).

Generally, it is recommended that you create a prepared statement entirely from static SQL string(s) (no user input) and then bind parameters into it, such that there is no possibility for any user input to be parsed as SQL:

https://www.sqlite.org/c3ref/stmt.html


If they haven't got that right, I would be sceptical about what other insecurities are present.


Yes, this brand-new Fortran web framework may not be suitable for production use yet.


It doesn't say that anywhere on the site or their github page.


This just in: most open source is provided as-is, and it's up to you to assess its quality and production readiness.


Oh, this old chestnut.

This is quite frankly nonsense. If you have released the software to the wider world, it is your responsibility to make sure it is decent. Just because you have released it for free doesn't suddenly mean you can avoid criticism.

SQL injection is such a basic thing to check there is really no excuse.



Getting a standard Nginx error "502 Bad Gateway nginx/1.10.3 (Ubuntu)". Are they slashdotted[1]?

[1]: https://en.wikipedia.org/wiki/Slashdot_effect


I opened it and instantly thought — "Ohhhh, you got me!"


Legitimately, I wish it was intentional. In fact, I think the maintainer should make it intentional.


That seems like a good demonstration of what the framework can do


Yeah, the site owner said in an IRC channel that he's been slashdotted.



The static link works for me[1]

The default https://fortran.io/ not so much.

1. https://fortran.io/static/index.html


Back online! Sorry just saw this


Wow, it is a great time to join Fortran hype: https://www.manning.com/books/modern-fortran !


Fortran actually isn't a static language; it's evolved over time. The last significant revision, according to Wikipedia, is just over a year old. The beauty and horror of it probably depend on which era of it you're looking at.


In the late 90s I remember a saying roughly "I don't know what language scientists will be using in 20 years, but it will probably be named Fortran"


As someone professionally involved with computational physics / geophysics, I can vouch that Fortran is still heavily used in these domains.

EDIT: for a modern, relatively new and highly-used scientific codebase that uses Fortran90 exclusively, see PFLOTRAN: https://bitbucket.org/pflotran/pflotran/src/master/


The quote is from Tony Hoare, winner of the 1980 Turing Award, in 1982.

https://arstechnica.com/science/2014/05/scientific-computing...


I also like the saying "you can write FORTRAN in any language".


After Fortran got its array syntax 29 years ago, there aren't many languages that you can use to write array-manipulating code like in Fortran. Maybe just Julia and NumPy?


All array languages (apl and its derivatives).

Raku (perl6).


R, Matlab/Octave, Mathematica


What are the highlights of Fortran's array manipulation capabilities?


Well, things like... you have two 2-dimensional arrays of the same size, A and B.

You want to make a new array, where each cell in A is added to the same cell in B. So you are going to do a loop over 2 variables, right?

In FORTRAN the code is...

  C = A + B

In Julia the code is

  C = A .+ B


That's what I do in C++ with Eigen or boost::ublas or any other library. Operator overloading is not a feature limited to Fortran.

Fortran gets its performance from its prohibition of aliasing, whereas C++ permits aliasing per the standard. One could enable the same performance boost with a non-standard compiler flag, but that breaks a lot of codebases, including the Linux kernel AFAIK.

Aliasing is a pox upon the world. A lot of work has gone into stuff like UBSan for C++; I think aliasing should be added to the things it checks for. Dunno how much overhead that would be though.


Taking rows and columns is still nice in Eigen. For example, taking the second column of a matrix is m.col(1) (Eigen, 0-based indexing) and m(:,2) (Fortran, 1-based indexing unless specified otherwise).

But more generic array slicing with strides seems to get nasty in Eigen. For example if

    v2 = v(1:20:2)
is

    Map<RowVectorXf,0,InnerStride<2> > v2(v.data(), v.size()/2);
in Eigen, I don't want to read the docs for how to do the same for 2- or 3-dimensional arrays.

https://eigen.tuxfamily.org/dox/group__TutorialReshapeSlicin...


In Julia, it is C = A + B too. (You don't need the elementwise operator ('.') if the operation is a well defined expression in linear algebra.)


Interesting. Does it result in a compile error if you try arrays of different sizes? (If it's just a run time error, I don't see how it's different to implementing this with operator overloading in any language that supports it).


Sure, if you try with non-conformable arrays, gfortran for example will tell you something like this:

  Different shape for array assignment at (1) on dimension 1
Note that a scalar is conformable to any array, so that A = 2 for example would work for any shape of A, and would fill A with 2's.


That is witty on so many levels I'm simply in awe


IIRC, most people who use Fortran use it because there are extremely specialized physics libraries (or astronomy, or chemistry, and so on) that were probably written three to four decades ago by an extremely intelligent PhD student without a proper programming background.

Which suggests that the biggest source of horror would be that most code you'll come across was written by a non-programmer intelligent enough to translate an extremely complicated and niche problem domain into very clever and completely un-idiomatic code, and who was likely unable to read their own library mere months after graduating, since most PhDs soon lose full command of their own thesis.


"Idiomatic" depends on language and domain and era and purpose. Consider the dgemm method in BLAS (double general matrix multiply, powers all sorts of modern code like numpy). It's such a workhorse, used by everything under the sun. But it does one thing. It multiplies two matrices of doubles. You're going to want to optimize the hell out of it and ensure it's correct, then never revisit it. It's been around (in the exact same form as far as I can tell) nearly as long as I've been alive. The right, idiomatic code for that is going to look different from the right, idiomatic code for something at the business logic level that changes every 6 months. It does it a disservice to shit on it like you are.


> You're going to want to optimize the hell out of it and ensure it's correct, then never revisit it.

Balderdash. Claptrap. Codswallop. Poppycock. What if you want to revisit it to make it run on parallel processors? Optimize it for a new CUDA architecture or caching scheme? A new instruction set that handles sparse matrices better? Code lives forever, at every level.


For this kind of linear algebra, you're going to have a different routine for a sparse matrix (dcoomm) or a symmetric one (dsymm) or a triangular one (dtrmm) or symmetric banded matrix vector product (dsbmv) or whatever. Hopefully those are separate functions from your dense one (dgemm), which is what I'm talking about. You'll also have a different function for CUDA (which exists in cuBLAS) than for multi-core (in ScaLAPACK? PLASMA?) than for single-CPU (in BLAS), which is what I'm talking about. General matrix multiply in this sense isn't "all the different matrix multiplies." It's a function with a very specific purpose, and yeah you don't really need to revisit it every decade.

Either way, it's not that much horror: http://www.netlib.org/lapack/lapack-3.1.1/html/dgemm.f.html


For sure, there are a few cases like the one you mention (I personally work with this kind of code on a daily basis), but that's not the reason most people use Fortran. There are many libraries with a level of robustness and performance that you won't find anywhere else, there are highly optimized compilers, and it is the standard language in many fields.

It's also quite safe for non-programmers. Those non-programmers write complex algorithms, but the code itself is not "clever and completely un-idiomatic". Most often, it is quite naive, just a series of formulas one after the other. You will find very long subroutines, bad names, some obvious inefficiencies, and maybe some spaghetti, but nothing to be afraid of.


My understanding from supercomputing center folks is that, if you just want to run these half a dozen equations for a bajillion nodes in parallel as fast as possible, it is still the best language for the job. Not speaking from personal experience, though.


I like it when some of my colleagues use Fortran, because their code in other languages is even more horrible.

BTW, the 'non-programmer' you seem to not think much of probably used an RPN calculator. Food for thought.


> the 'non-programmer' you seem to not think much of

If what I wrote led to that as your main take-away, then I apologize for expressing myself badly, because nothing could be further from the truth! I studied physics myself at one point, and base this on conversations with friends who (unlike me) didn't drop out. Specifically, those who were asked by their professor to update old libraries during their PhD thesis.


Could be worse... Matlab?


Coming from a C++/Java background, I loved the fact that in 2 hours I was able to learn enough Fortran 77 to take ownership of a very large app.


I think one of the reasons it's still frequently used in some parts of academia is that a grad student can figure it out in a few days of self-study and move on to doing their actual research from there. Whereas if you want them to work with a large C++ program, getting them familiar enough with the language to be useful can take half a semester.


I’m a professional FORTRAN developer. I would say even FORTRAN 90 has a very rich and capable feature set. The problem is not the language, it’s the other developers :). A lot of codes have been developed by people used to FORTRAN 77, so even if they’re using new features the codes are poorly designed.


Should have been called Fortran on Freeways.


Fortran On Fighter-Jets


Super-sonic fighter jets probably were designed using a lot of Fortran.


For sure, but their avionics software is written in Ada or C++.


And that's relevant, because... ?


How is it not? Someone said something about fighter jets being written in Fortran. Someone corrected them saying they aren't, but they are modeled using Fortran. I tried to add to the conversation by saying what they were written in assuming that would also be of interest to the person who made the original comment. I'm sorry this has perturbed you.


There is likely a lot more railway traffic control software written in Simula ( "The basic simulation kernel, the TTS module based on the SIMON/TTS system, is today implemented in Simula and running on a UNIX platform." https://pdfs.semanticscholar.org/ed7e/d90fbc57b7ee0473d41edb... [2008] -- last listed author works for and the work is funded by Banverket, the Swedish National Rail Administration ) than in Ruby, and yet it's still rubyonrails.org.


Ah. Thanks for the explanation.


Lol, no biggie. Your bio seems pretty awesome btw! What technologies have you used to do all that?


Fortran on Funicular :)


That goes well with COBOL on Cogs!

Funicular railway:

https://www.youtube.com/watch?v=GO9J7NsM0Ck

Cog railway:

https://www.youtube.com/watch?v=VYjj0rwriEA


You likely would recognize the melody of "Funiculì, Funiculà", an audio sample of which can be found at https://en.wikipedia.org/wiki/Funicul%C3%AC,_Funicul%C3%A0

Amusingly, a short while ago I HN-commented on the eruption of Vesuvius that destroyed Pompeii and nearby settlements. Vesuvius also destroyed the funicular that the song advertised, in a later eruption.


I honestly am not too sure why there is so much derision regarding Fortran (ageism?). A lot of people don't seem to realize the importance this language still has in scientific computing.


Seems that many CS folks don't think much of Fortran because their programming languages survey course professor made fun of Fortran 77.

Fortran has moved on, even if your professor hasn't.


The opposite, in fact. My professor talked about the amazing things he did with Fortran. Then I had to learn it in order to find the bugs. I'd never use it by choice.

I know of 4 major languages created in the 1950's: Lisp, Algol, Fortran, and Cobol. Lisp and Algol were brilliantly designed, years ahead of their time, and have inspired nearly every programming language since then. Fortran and Cobol not only look like evolutionary dead ends in hindsight, but they weren't pleasant to use even when they were popular.

There's a reason we still see Lisp posts on HN almost every day, and very rarely Fortran posts.


You seem to be exhibiting the problem I describe -- Fortran 2018 is not the same as Fortran 77.


Fortran is absolutely still an important language...for scientific computing. Not my first choice for a web framework.

Although, I don't guess it's any worse than wanting to use your client-side scripting language for your server, or your database query language.


The more I look at Fortran, the more I appreciate its beauty.

I think some of its infamy is unjustified.


My contribution: an example of something done right in the first FORTRAN and then done wrong many times afterwards:

"The March of Progress

* 1956: Fortran I

        PRINT 1, X
      1 FORMAT (F10.2)

* 1980: C

      printf("%10.2f", x);

* 1988: C++

      cout << setw(10) << setprecision(2) << showpoint << x;

* 1996: Java

      java.text.NumberFormat formatter = java.text.NumberFormat.getNumberInstance();
      formatter.setMinimumFractionDigits(2);
      formatter.setMaximumFractionDigits(2);
      String s = formatter.format(x);
      for (int i = s.length(); i < 10; i++) System.out.print(' ');
      System.out.print(s);

* 2004: Java

      System.out.printf("%10.2f", x);

* 2008: Scala and Groovy

      printf("%10.2f", x)
"

The reference and the details (2012):

https://news.ycombinator.com/item?id=3964475


In my work life, I have a problem where I have to interface with a stringly typed API. For instance, characters 20-25 in a 400-byte string are a floating point number. No, it's not a good API, but I don't make the decisions. So you need a six-character float. "12" is wrong because it's two characters, and "12<four spaces>" is also invalid. "1.2e01" is wrong compared to "12.000" because the latter carries (many) more significant digits; likewise ".00001" is wrong compared to "1.2e-5".

I don't know how to solve this problem. printf doesn't cut it. It simply can't specify an exact number of characters.


You could try:

  %06.*g
for width 6 down to 1.

I believe this will always find an answer, and the one with the largest number of significant digits.

For example:

    #include <stdio.h>

    void format(double val, int width, char* buf) {
        /* try the largest precision first and shrink until the
           rendered string is exactly `width` characters wide */
        for (int w = width; w > 0; w--) {
            int count = sprintf(buf, "%06.*g", w, val);
            if (count == width) return;
        }
    }
"Working" example (unsafe, exits on failure, etc):

https://godbolt.org/z/WiaPkT

Note that you can't represent negative values equal or below -1e-100 since those take 7 characters.


Also see COBOL on Cogs, http://www.coboloncogs.org/INDEX.HTM


And Intercal on Interstates: http://intercaloninterstates.org/


The latest version of Visual COBOL[1] supports .NET Core. And I believe you can extend C# classes in COBOL. So you might just be able to write an entire ASP.NET Core application in COBOL.

Whether you'd want to is another question entirely. Might make for a fun weekend project. :)

[1] https://www.microfocus.com/en-us/products/visual-cobol/overv...


There is also a hilarious SQL on Rails video but I'm on my phone and don't have the link right now.


https://www.youtube.com/watch?v=0_PK1eDQyVg

The attention to detail in this video is amazing (e.g., MS-DOS window on Mac), and very funny.


Pascal web framework?


You joke, but a JavaScript backend was added to Freepascal last year.

https://wiki.freepascal.org/pas2js


This thread brings up a lot of interesting stuff (Fortran soon in LLVM, as mentioned earlier, too).



Delphi has had one since early 2000, with all the goodies, including MVC and services support.


Oxygen


Google has this thing called “Testing on the Toilet” where they post factoids about Google’s codebase and best practices. I read one that said FORTRAN is a highly secure language (or that there haven’t been any security vulnerabilities found in the language in a while).

Devil’s advocate for not writing FORTRAN off completely at first sight.


Or is it just because no one uses it?

Apple used to run ad campaigns about how Macs were immune to viruses and therefore highly secure. Turns out enough people just weren't using Mac, and viruses started popping up once it went mainstream.


Have there been Mac viruses that actually spread user-to-user in the wild? The articles I've seen talk about malware, which is different. Also the occasional proof-of-concept.


Elk Cloner


There are entire scientific fields where Fortran is extremely popular. But mostly not ones that involve easy insecurities.


Apple once advertised "Macs don't get PC viruses", which was true, but misleading. Microsoft often claimed that the reason there weren't Mac viruses was that it wasn't as attractive a target. They kept this trope going for a very long time. They said it for so long, that one would think hackers would want to crack MacOS just as a way to stand out from the crowd. The higher prevalence of PCs was never enough to explain the orders of magnitude difference in exploits. Macs may not have been super secure, but they were way more secure than Windows, and the number of viruses for Macs never "popped" in any way comparable.


Very much could be! Then again, I doubt Google would want to intentionally spin technical content for its own engineers; that might have implications for its own codebase, culture, and other deliverables.


Note that Fortran on the Flusher was last year's April 1st edition of Testing on the Toilet.


Why do they even do it in the first place?


Are you asking why Google has Testing on the Toilet?

Captive audience.


Let’s all get good at Fortran in 3 months and get those cushy six-figure-salary jobs those Fortran devs make building web apps. Sounds easy enough.


brb starting my venture-backed FORTRAN coding school


You joke, but a lot of places are still looking for Fortran devs.


Link me the “Who’s hiring” threads with Fortran devs please, I’m slightly bored of React.


I'm not jumping on the Fortran bandwagon until it runs natively in the browser! No transpiling to Javascript or any other such monkey business either. I want a browser with true Fortran support built right in.


For a while the site returned 502 Bad Gateway. I literally thought that was the whole point and upvoted for appreciative sarcasm... until the link started actually working.


If the website is already dead, is that a bad sign?


It's a feature.


I thought maybe 502 Bad Gateway was the web framework.


In German we call that a Luftkurort. Google it.


It would be accurate in this case!


Perfect security.


I think it's highly appropriate for a Fortran framework to give installation instructions for Ubuntu 14.04 :)


Sure, translating formulas is cool and all, but where's the framework that lets me use PL/1 on OS/2 to build Web 3.0 apps?


Is this the new Zawinski law? No programming language is complete until someone builds a web framework with it.


Here’s an IDE for FORTRAN [1] if anyone is interested to write some programs in it.

> A modern Fortran development environment for Microsoft Windows, Apple macOS, and GNU/Linux systems.

It was quite interesting to read the list of available features [2].

[1] https://simplyfortran.com/

[2] https://simplyfortran.com/features/


Thanks for a shout-out to Simply Fortran! I'm the primary developer behind it (had to finally create an account to reply...), and we're always trying to improve the development environment.


ODBC? :) surely someone else needs to hit a SQL db without having a script hit the DB and pipe the data to your Fortran app?


It seems the home page is down (sub-pages are working). Coincidentally, I visited this site several months ago and it wasn't working back then either.


Did it _ever_ work?


Ah, Fortran, how I miss you. Unfortunately it seems like the last commit to this project is from three years ago.


I think you meant to say:

1. Zero open issues :o

2. Code is stable - no recent fixes :P

3. Not even the mighty Spectre and Meltdown demanded a fix

PS: I am sure this is meant as a parody to make fun of other frameworks.


I did get a patch some time ago which was really well-written. Almost all pull requests to serious Fortran projects (mostly OpenBLAS) get merged in. I wrote about it here: https://medium.com/@mapmeld/fortran-culture-on-github-a257dd...


Interesting! Fun game too.


That's how python's Flask framework started too

https://lucumr.pocoo.org/2010/4/3/april-1st-post-mortem/


Finally! I can't wait to be part of the first GenX college coding projects as a service (GXCCPAAS) unicorn!


My favorite web framework is still Bash on Balls, but I like to see that Fortran has entered the game.


Wow, this sounds intense and interesting. I will stick to C++ myself, but it is great to see power languages up and ready for new generations of hardware, software, and people.


One nice thing about Fortran's evolution is that when you lag behind the bleeding edge, you make fewer mistakes. If you like C++ that's great; you might notice that not everyone else agrees that it's the future.


C/C++ will most likely be used for many, many decades ahead.


Yes? So will Cobol. So will Latin! I suspect discussing design choices might be more interesting than a popularity comparison.


Funny that you should mention it. To me it looks like web development is mostly driven by fashion rather than practical choices. Also, I suspect that unless something radically good comes out (I do not consider Rust, Go, and the like to be it), C and C++ will stay very active, including in new development. I actually do hope that something better happens, but so far there's no real candidate.



I saw the reference to Ubuntu 14 and thought "what the?" Then I noticed the code hasn't been updated in 3 years. One question answered, but the bigger one remains: why do we (meaning: me, I) gauge a project by its recentness? I've written plenty of LOC that are structurally sound that I've not needed to touch up or improve upon... And as a FORTRAN project, you can be pretty sure updates can be glacial but still very viable.

Love the idea.


I gauge projects based on recentness: recent for me suggests it's likely buggy, unoptimized and with an unstable interface. Is that what you meant?


I wouldn't care if this weren't a .io site. But now I am intrigued by how cool this looks.


And _none_ of the code is in "unsafe" blocks so the author won't be bullied.


New York server serving static files? Gotta put that static site on cdn!


Now I can write dank front ends to my awesome monte-carlo simulations!


Irrelevant, can't do serverless and no support for MongoDB.


Just about time. Cobol has had one since when, a decade?



The coolest named language ever. Forth is cool too :)


I just hoped it was a website for formula translations


This is neat. I'd like to see some CUDA code sprinkled in eventually...very interesting possibilities


The physicists have gone too far.


I'd like to have a COBOL one. COBOL on Cogs is too "micro" for me.


Just because? Why not spend all these coding months to help humanity somehow


They wanted to do something they liked in their free time, I guess?


I refuse to jump on every shiny new framework and language.


i think only half of this would be shiny ;)


It’s Stainless Fortran (TM)


if you craft a new car out of old parts, is the car new or old? and what's the difference from an old car with old parts?


The framework may be new, but the language? 1957 was a long time ago


You're obviously in the pocket of big Fortran.


You win the internet for today. Brilliant.


That was the joke, I think :)


I store all of my private keys on punched tape..


These Boomers with their new languages.


I actually started softly clapping after reading the title without even realizing it. I too wish to be able to make a framework from a language like Fortran; one day I might be good enough or motivated enough.


Heads up to OP: the website is giving 502 Bad Gateway.


Cue a cacophony of “we’re migrating to FORTRAN.”


It's about time we had another fad.


Website is down. Ironically.


502... Bad Gateway


> .. Fortran 90 ...

Peasants. FORTRAN-77 is the real Fortran.


Fortran 77 has a lot of luxuries compared to FORTRAN IV, I have to say.


Paradise! We had to wake up a half an hour before we went to bed to use FRTRN 0.1, with punch cards made from rolled-up, discarded newspapers!


> We had to wake up a half an hour before we went to bed

I love it


In case you're not aware, it's a riff on an old Monty Python skit (RIP Terry Jones). An absolute classic, that one.

https://www.youtube.com/watch?v=ue7wM0QC5LE



I can write Fortran in any language!


I have a Fortran codebase which takes about a week to finish a simulation on 10 cores. Rewriting this code in Python and running would take about 50 years to finish the same simulation in serial mode, and at best 5 years if run in parallel on 10 cores.


... which instantly qualifies you as a Real Programmer :)


But why?


Every day we drift further from god.



Thanks for posting this! I was about to say "it could be worse: it could be COBOL". Especially since every couple of months someone posts an enthusiastic article about COBOL and people get intrigued by it and I feel it's my God-mandated duty to warn them about its mind-numbing unholiness.

COBOL-ON-COGS is hilarious though.


Every day we drift further from God.

To my mindset, this brings us closer.

  GOD = ASSEMBLER .AND. FORTRAN .GT. NODE.JS


lol. Perfect articulation of something I couldn't verbalize.


God is real (unless declared integer).


Since integer means indivisible, he must be both.


Thank you.


In the projective geometry of our world (R3 X god) moving away from the god is just coming to him from the other side.


I'm not picking on your metaphor, but I don't understand it either. Mind elaborating?


Drifting further from god is a metaphor for moving closer to hell. He is implying that developing for the web using FORTRAN is moving us closer to existing in hell.


It isn't meant literally. It is another way of saying that using FORTRAN for this kind of work is unholy.

And of course "unholy" is itself a metaphor for something which should not be allowed.


It's a meme, not a metaphor. Though you can argue that a meme is at its core a metaphor, in this case you're better off trying to understand this specific meme, it's quite funny.



> Fortran never left us... it runs our mainframes today.

What? That's where you lost me.


IIRC, FORTRAN is used heavily in the LINPACK/LAPACK libraries, which are in turn used in higher-level libraries such as numpy. So it’s likely true that it still sees use in control software running trains, banks, air traffic control, etc.


It is still heavily used for numerical code in various places, especially aerospace (to avoid re-verifying code with new languages, or just to ensure new work can be accurately compared with old work).

There's even a CUDA compiler.

As I understand it, lack of pointers makes it possible for fortran compilers to parallelize code quite well.


Fortran does have pointers, though. [1] Maybe they are just not common for most math operations on matrices?

[1] http://www.personal.psu.edu/jhm/f90/lectures/42.html


Multidimensional arrays are first-class in Fortran, so there's no need to use pointers there. There are some rules around pointers and subarrays that I expect make any possible aliasing undefined behaviour.


It has things called pointers, but they are significantly different from pointers in C. They can't expose memory addresses, so there's no pointer arithmetic or other hard-to-optimize operations.

And after writing a significant amount of fortran 90 code I didn't even know the feature existed...
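
For the curious, a minimal sketch of the rules (my illustration): a Fortran pointer may only be associated with something explicitly declared TARGET (or another pointer), so everything without that attribute keeps the no-aliasing guarantee.

    real, target  :: a(100)   ! only TARGET (or POINTER) variables can be aliased
    real, pointer :: p(:)
    p => a(1:100:2)           ! associate p with a strided view; no address arithmetic
    p = 0.0                   ! zeroes every other element of a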


Dude, my alma mater is still TEACHING 3 or 4 semesters worth of fortran for numerical work.


I'd like to hear some thoughts as to why one would choose FORTRAN over R, JMP, Excel, Python... other than for fun or familiarity?


Very different use-cases. All the other languages you've mentioned are great for statistical analysis and plotting but are quite slow - for R and Python by virtue of being interpreted rather than compiled; I don't know about the others. Fortran is often used in academia for intensive number crunching and parallelized code running on computing clusters with 10s or 100s of cores. I work with some people who run hydrodynamics sims using a code written in FORTRAN. Their workflow is to pull data from these sims into Python scripts for subsequent analysis. It's just a different tool for a different job. People talk about C++ and FORTRAN being competitors, and there the difference might have something to do with familiarity.


> with 10s or 100s of cores

You could probably even add another zero (almost two) to this


As a computational scientist who has used C, Fortran, IDL, MATLAB, Mathematica, Pascal, Python, and R for about two decades, I can tell you that the unique advantage of (modern) Fortran (2008/2018) over any other language mentioned here is its great flexibility and high performance for numerical computing and basically any mathematical problem. It has a coding and syntax flexibility comparable to MATLAB and Python, but the runtime speed of C, and in many cases beyond C. It is the only high-performance programming language with a native built-in mechanism for one-sided parallel programming on shared and distributed architectures, which can scale your code from your laptop to the largest supercomputers in the world (Coarray Fortran parallelism). Check out the OpenCoarrays software on GitHub and Intel's amazing Coarray-enabled Fortran compiler. Also, Fortran and C are the only two languages for which the MPI standard is written. OpenMP parallelism also has its roots in the Fortran language. Nvidia is also working on a new F18 compiler for Fortran which enables GPU parallel computing via the same Coarray Fortran syntax and rules. One parallel programming API for all architectures and platforms; no other programming language has this capability that I am aware of.
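
To give a flavor of the coarray model mentioned above, here is a minimal sketch (mine, not the parent's; Fortran 2018 for the co_sum collective). Every image runs the same program, and the runtime handles the communication:

    program coarray_sum
      implicit none
      integer :: x[*]          ! a coarray: one copy of x on every image
      x = this_image()         ! each image stores its own 1-based index
      call co_sum(x)           ! Fortran 2018 collective: sum x across images
      if (this_image() == 1) print *, 'sum over', num_images(), 'images =', x
    end program coarray_sum

Built with, e.g., OpenCoarrays' caf/cafrun wrappers or Intel's compiler, the same source runs unchanged on one laptop core or many cluster nodes.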

Typically, simulations that I run in MATLAB, Python or R, take 100 to 500 times longer than the equivalents that I write in Fortran. I am sure you can achieve a similar performance with C as well. But the development cost in C is much higher for numerical computing than in Fortran, given Fortran's native array-based syntax, parallelism features, optimization hints to the compiler, and the new high-level string and memory manipulation tools that it has.

That said, every language has its usage and place in the world of programming. In my case, all of the post-processing analyses of my Fortran/C simulation results are done in either Python or MATLAB, and sometimes in R. They complement each other rather than being rivals to each other.


Add Julia to the list. But if you're thinking of using Fortran, then Excel is likely the wrong tool for the job. Fortran is usually used for heavy duty number crunching and supercomputing simulations. Or in this case, web sites???


No one would choose it. This is a terrible idea. No one asked for this. No one wanted this. No one needed this, yet here it is.

Like some many-angled Lovecraftian horror, it spites reality, sanity and common sense merely by existing.


There are people who want it. Those who know the value of each language.


Speed, and MP / cluster computing, where there are Fortran extensions.


I would argue that fun and familiarity are 95% of the reasons why anyone chooses any framework.

I don't know Fortran, so I won't use this. But as a dedicated ruby enthusiast, I won't be throwing stones either ;)


Why?


Because there's nothing harder to stop than an idea whose time has come to pass.



