This was probably the single worst design decision of JavaScript. It is the only language I know of that has two "absent value" types. If null was a billion dollar mistake, then undefined adds an order of magnitude to the problem. There is absolutely no reason for undefined to exist.
We've generally adopted strict mode. I think it is now time to come up with and adopt a new no-undefined mode. And yes, it should break the standard library. It is worth it.
> It is the only language I know of that has two "absent value" types.
It's not very common, but not unique either. Perl has `undef`, which works roughly the same way and was later generalized into the "undefinedness" concept in Raku [1]. Ruby doesn't have an `undef` value, but it could have had one in an alternative universe [2]. Even more languages have multiple absent values that don't necessarily map onto the null/undef distinction (e.g. Objective-C).
---
I think the separate `undef` value was mainly regarded as a solution to the apparent problem of detecting absence in general, for example the absence of an index or argument. Consider the following Python program:
def foo(obj=None): ...
It is clear that the optional `obj` argument cannot alone distinguish `foo(obj=None)` from `foo()`. A common idiom is to have a private object in place of `None`:
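A minimal sketch of that idiom (the `_NOT_GIVEN` name follows the comment; the function body is illustrative):

```python
# Sketch of the sentinel idiom: a private module-level object that no
# caller would legitimately pass, so its identity marks "not given".
_NOT_GIVEN = object()

def foo(obj=_NOT_GIVEN):
    if obj is _NOT_GIVEN:
        return "no argument given"
    return f"got {obj!r}"

print(foo())          # no argument given
print(foo(obj=None))  # got None
```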
It is still possible to somehow obtain a reference to `_NOT_GIVEN` and therefore use `foo(obj=_NOT_GIVEN)` which is indistinguishable from `foo()`, but why would you do that? `None` is sometimes a valid argument to the optional argument, but `_NOT_GIVEN` is clearly designated to be invalid for that. Now rename `_NOT_GIVEN` and make it a language construct---voila, you've got `undef`.
`Undef` might have been a working solution a decade ago, when we were still struggling with dynamically typed languages in general and systematic approaches were less common. Lua, for example, uses `nil` for both purposes: `t[key] = nil` is a valid way to remove the given key from the table `t` (with the caveat that it doesn't shift any subsequent keys if the key was an integer), and an excess argument is filled with `nil` [3]. This is painful from time to time; say, a table of optional integers is not straightforward. `Undef` might have been a good compromise under this observation... if we didn't have algebraic/sum data types like we do today.
[3] Lua even tried hard to remove any visible distinction between the actual `nil` and real absence of value! But it's still not perfect, and the discrepancy is much easier to detect from the C API.
I mean, it's common when the argument itself has a domain of all possible Python objects and therefore `None` should be usable as a literal value. The requirement itself is not very common, but virtually all code with that requirement uses this idiom, to my knowledge.
This was a nice read, well-written and informative!
I could add that in my mental model, the reason that an object property with an undefined value is different from an actually undefined property, is that since JS objects are basically key/value maps, there is a difference between a key existing with an undefined value and the key itself not existing at all.
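That distinction is directly observable with the `in` operator, as a small sketch shows:

```javascript
// An object with an explicitly undefined property vs. one without the key:
const withKey = { name: undefined };
const withoutKey = {};

console.log(withKey.name === withoutKey.name); // true: both reads yield undefined
console.log('name' in withKey);                // true: the key exists
console.log('name' in withoutKey);             // false: the key does not exist
console.log(Object.keys(withKey));             // ['name']
```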
I never use null in Javascript, because I don't want attributes of type null, I just want attributes to either be present or not. This usually simplifies all designs and there is no need to think about null separately; it's treated in code as an invalid value for an attribute.
The remaining caveat is that when serializing an object, attributes of type undefined need to be skipped, and some libraries get that wrong. So sometimes it is necessary to delete them from objects before serialization.
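For the built-in serializer this works out of the box; a sketch of the behavior, plus the manual delete for libraries that copy keys naively:

```javascript
// JSON.stringify drops object properties whose value is undefined,
// so the "present but undefined" state does not survive serialization.
const obj = { a: 1, b: undefined };
console.log(JSON.stringify(obj)); // '{"a":1}'

// A library that copies keys instead would need an explicit delete:
const copy = { ...obj };
delete copy.b;
console.log('b' in copy); // false
```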
So in my view, things would be better if null didn't exist at all and if assigning an undefined value to an object attribute would actually delete the attribute from the object. Of course the language can't be changed now, but that is the philosophy I try to use in code design.
I do realize there is the use case of passing an update object e.g. with HTTP PUT/PATCH and wanting to make a distinction between "don't update an attribute" and "remove an attribute". In that case, null kind of makes sense as a designator to remove an attribute.
> void doesn’t introduce any new semantics to the language.
I'm sure the author is most likely aware of this, and the context is TypeScript, but I think it can still be misunderstood, and if you weren't around when IIFEs were very common you might not be aware that:
`void` is a JavaScript operator and it does have semantic meaning: it evaluates an expression and returns undefined.
My answer from decades of experience is "No, why would you?". But like the precious gadget that turned out to be better used as a door stopper, `void null` is sometimes used as a safeguard instead of `undefined`, because in JS you can re-define `undefined`. >>Sigh<<
As mentioned in the article, you can declare a binding named `undefined` and assign a value to it. This shadows the base `undefined` binding (since it's NOT a keyword).
`void 0` ensures that you get the `undefined` value regardless of any shadowing that may be happening with the `undefined` binding.
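A small sketch of the shadowing problem and why `void 0` sidesteps it (the function is contrived, of course):

```javascript
// `undefined` is a plain binding, not a keyword, so it can be shadowed:
function demo() {
  var undefined = 42;     // legal (if ill-advised) in sloppy mode
  console.log(undefined); // 42: the shadowed binding
  console.log(void 0);    // undefined: `void` always yields the real value
  return (void 0) === undefined;
}
console.log(demo()); // false, because the local `undefined` is 42
```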
Some functions accept a callback and behave differently depending on whether or not the callback returns a value. If your callback is a single expression arrow function, then you might find it cleaner to just void the result instead of wrapping the body of the function in a code block (which prettier will format using multiple lines). Super contrived example:
import { produce } from 'immer'
let i = 1
const a = produce({ x: 0 }, draftState => i++)
const b = produce({ x: 0 }, draftState => void i++)
const c = produce({ x: 0 }, draftState => {
i++
})
console.log({ a, b, c }) // Logs: { a: 1, b: { x: 0 }, c: { x: 0 } }
It's good for firing off an async function in sync contexts, for example a fetch request with side effects being called by React's useEffect hook.
That's the only context where I found any value for it, though, and its usefulness is only in describing, directly in the function call, that we really don't care whether it finishes or not (since handling that case will happen somewhere else).
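A minimal sketch of that pattern, without React (the function names are illustrative):

```javascript
// `void` discards the promise from an async call made in a synchronous
// context, signalling "we deliberately don't await or handle this here".
async function reportPageView(page) {
  // stand-in for a real fetch() with side effects
  return `reported ${page}`;
}

function onMount() {
  void reportPageView('/home'); // promise deliberately discarded
}

onMount(); // runs without awaiting anything
```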
> JavaScript [...] doubled it by having yet another null-ish value.
And did so 24 years later!
In defense of JS they supposedly designed and implemented the language in 10 days.
I find "implicit nulls" less defensible in Golang (38 years after C) and Java (23 years after C), as they had plenty of time to learn from others' mistakes.
The billion-dollar mistake is not having nulls; you need some concept of null.
The mistake was allowing null to be a valid value for all pointers/references at the type level.
You cannot make this mistake in an untyped language like JavaScript*
The solution to this is not eliminating nulls but rather separating nullable pointers/references from non-nullable ones
* Ok, ironically JavaScript has an actual historical implementation bug where typeof null === "object" (basically, Netscape used tagged pointers, the object tag was 0, and null was represented as a null pointer, so typeof would read the tag and return "object").
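The tag bug is observable to this day:

```javascript
// The historical tagged-pointer bug, still part of the language:
console.log(typeof null);            // "object": null's tag reads as object
console.log(typeof undefined);       // "undefined"
console.log(null instanceof Object); // false: it isn't actually an object
```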
> The billion-dollar mistake is not having nulls; you need some concept of null.
> The mistake was allowing null to be a valid value for all pointers/references at the type level.
Hence I call it "implicit nulls" in my post. I know `null` as a concept is important, even if you have it as `Unit`, `Void` or `Nothing`.
You are totally right, though, that in languages that do not force you to specify types it is basically needed, as you cannot hint a type to begin with.
The TypeScript portion is a little bit wrong. Return-position `void` in TypeScript means the value is unobserved by the caller, not that it's undefined. It's subtly different.
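A small sketch of that distinction (names are illustrative):

```typescript
// A `() => void` callback type means "the caller ignores the result",
// not "the function must return undefined".
type Callback = () => void;

// Assignable even though it returns a number; the value is simply unobserved:
const cb: Callback = () => 42;

// At runtime the value still comes back; TypeScript just won't let you rely
// on it through the `void`-typed signature.
console.log(cb());
```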
Would it really kill us to include an interpreter for literally any other language for the browser? Same for HTML. It's not just that Google owns everything browser-related in 2024, it's that the concept of an application-as-a-platform for apps doesn't need to be locked down to any one set of components like it is now.
WebAssembly as a lower level abstraction is going to make web dev very interesting once the tech finally pays off, but it's been almost 10 years since it was announced and there's still no viable alternative to javascript to come out of it.
It would have been nice if a cleaner, stricter and more "designed" alternative with full access to browser features had emerged in the meantime. I'd even accept Dart if it had managed to get broad browser support.
Instead we're in a waiting game where we're all forced to use Javascript for any real work until the horizon is crossed and we can finally use a dozen of wasm-targeted languages.
AssemblyScript can't access the DOM without interfacing with JavaScript, which is a major piece that needs to be solved for wasm generally.
I'm arguing we should have invented a new native language to replace javascript, which could have provided a stop-gap solution until wasm targeted languages are viable.
> which is a major piece that needs to be solved with wasm generally.
I use WASM everyday and I don't agree that this is anywhere near the top of the things that need to be fixed. The DOM is inherently slow by design, accessing it through a JS shim or "directly" from WASM really wouldn't make any difference in performance.
> Most languages aren’t infinitely backwards-compatible so unless you want to be running Python 2.7 forever or whatever…
This is an important point that I think gets overlooked quite a bit in these discussions. Web tech is essentially append only — you can add things but it's incredibly difficult for a browser vendor to make a breaking change because no vendor wants to be perceived as the one that "breaks the Web". Even technologies that were never technically standardized like Web SQL end up sticking around for far too long. I'm not sure if many programming language communities would like to be constrained by this and the slow TC39 language proposal process.
This is mostly conceptual rather than technical, but I view pragmas like `use strict` as an append operation. If browsers dropped sloppy mode completely and made strict mode the default for all scripts, then that would be a true delete operation. I wouldn't be surprised if browsers still support sloppy mode cruft like the `with` statement in 2034.
> I wouldn't be surprised if browsers still support sloppy mode cruft like the `with` statement in 2034
And may they keep doing it; I don't care, except for the weight of the little-used lines of code that are carried forward indefinitely. And it's not only 'append': appending we can do without breaking backwards compatibility. It's the 'update' operations that, curiously, many view as infeasible, despite `'use strict'` already existing.
The problem for me is that you can't ever irrevocably remove a feature from the language, which means you'll end up having to read and interact with code written in the original version if you spend enough time in the language ecosystem. I've seen developers who continue to litter their codebases with `var` in 2024, despite `const` and `let` replacing almost every valid use case for `var`.
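The difference in scoping is easy to demonstrate with a small sketch:

```javascript
// `var` is function-scoped and hoisted; `let`/`const` are block-scoped.
function count() {
  for (var i = 0; i < 3; i++) { /* ... */ }
  return i; // still visible: `var` leaks out of the loop block
}
console.log(count()); // 3

function countLet() {
  for (let j = 0; j < 3; j++) { /* ... */ }
  // return j; // would be a ReferenceError: j is scoped to the loop
  return 'j not visible here';
}
```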
Server-side developers yearn for their favorite language in the browser, but are used to living in a world where language deprecations mean "this feature will disappear from all modern versions of the language eventually", not "you can opt out of this feature by placing a string somewhere in your code". To me that's a hollow definition of 'update'. Golang, for example, completely changed the semantics of for loops this year [1]. Moving forward, if you use a modern version of Go, you accept the way for loops work now; there's no magical 'use old for loops' string. It's awesome that they were able to unilaterally make a breaking change to the language and force users to accept it moving forward. We can't do that in JS; that's the point I'm making.
But with WebAssembly, you need to DIY a lot and/or send a lot of runtime/stdlib code to the browser. It definitely has its place, but something like Dart with an excellent stdlib would be amazing (and I know that was actually the intent when Dart was being developed, and I did oppose it at the time). Browsers could then ship different versions, and you could enable them, like "use strict" in JS.
On the other hand, I understand why not: You ship Dart and Python folk will be WTF, you ship Python and that would annoy the Java/.NET folk, and so on...
I guess we could all try to write things in WebAssembly-hosted Forth... however, that sounds like a giant nightmare once I consider any project of scale, and how it would be more like trying to mod a modern JIT program at runtime with reflection.
I think the parent post wanted more easily composed, modular sources of a whole, but better than JavaScript / ECMAScript. Intended to also have something similar to modern dev consoles attached with ease.
My suggestion would be to add more pragmas akin to `'use strict'` to the language, and it's a complete enigma to me why TC39 seemingly does not go in this general direction. Pragmas would allow us to break backward compatibility, that green-eyed monster from the swamps that keeps us from just doing the right thing because it would break code. I'd be fine if new syntax were introduced for that, like `%use sane`. JavaScript is too precious to be discarded. Nobody is an expert in an entirely new language; I say let's rather build on the experience of gazillions of well-informed JavaScript users and rip out the bad parts. With pragmas we can do that.
I think the main issue is the DOM API is designed around JS. Running another scripting language with different semantics in the browser is trivial, having it interact with the page and do anything useful is not
The only mainstream language I'm aware of that supports high/arbitrary-precision decimal numbers as a language feature is C#. Java, Python and Ruby have them as libraries included with the standard distribution, but to my knowledge they don't have any capabilities you couldn't implement yourself in a library.
The thing is, though, that `{ a: 1, d: undefined }` and `{ a: 1 }` make a difference that shouldn't be there at all, and in this case the name is not very fitting, as it does define a property of the object. This can be extremely annoying, and the only way out is to always have tests around `null`, `undefined` and missing keys, and to have coding standards. I for one never use `undefined` explicitly; whenever it does occur, it's likely a mistake. Even functions without a return value I make return `null` explicitly ("see, I have nothing to say here") instead of relying on the implicit `undefined`.
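A sketch of the three-way check such tests end up encoding (the `state` helper is illustrative):

```javascript
// The three states a "value" can be in, which tests often must tell apart:
const obj = { a: 1, d: undefined, n: null };

const state = (o, key) =>
  !(key in o) ? 'missing'
  : o[key] === undefined ? 'present but undefined'
  : o[key] === null ? 'null'
  : 'value';

console.log(state(obj, 'a')); // 'value'
console.log(state(obj, 'd')); // 'present but undefined'
console.log(state(obj, 'n')); // 'null'
console.log(state(obj, 'x')); // 'missing'
```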
For me, the flexibility that comes from loose typing is one of the biggest strengths of JavaScript. For example, loose equality (double equals) against null is true for both null and undefined, but false for 0.
I consider loose equality to be a bug not a feature. It's nice that `undefined != null` is true but to me it's a "trick" and not something that I want to see used all that often. It's really not that much extra to do `x !== undefined && x !== null` and keep the code intuitive.
As long as you are comparing with a literal, == with nulls is very intuitive.
I agree that if it did not exist it would not be worth adding, and that almost every other use is bad, but != null and == null are very easy-to-learn idioms.
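A sketch of exactly what the idiom matches:

```javascript
// `x == null` matches exactly null and undefined, and nothing else:
console.log(null == undefined); // true
console.log(0 == null);         // false
console.log('' == null);        // false
console.log(false == null);     // false
console.log(NaN == null);       // false
```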
I like how Groovy defines truth.
Anything that is not one of these is true:
* 0
* null
* false
* empty String/collection
It may seem like too many rules, but it's very intuitive.
For example, it's common in Java to do:
if (str != null && str.length() > 0)
Or:
if (str != null && !str.isEmpty())
There's even an Apache Commons helper function for that:
if (StringUtils.isNotEmpty(str))
While in Groovy that's just:
if (str)
It's just handy. It also reminds me of Common Lisp, where `false` is defined to be the empty list itself! It just makes sense and in Common Lisp code the same idiom can thus be used, which makes it very easy to read and write.
> An empty string being false encourages the poor practice of using empty strings to mean "there is no value here".
That is not bad practice. As I said before, you almost always want to process only non-empty Strings, but sometimes an empty String comes in from user input or whatever, and you either have to check for it everywhere, like in Java, or in Groovy that is just the default... you can still do 'if (str != null)' to be more explicit, but in my experience that is almost never what you want. Bad practice is making the more usual case look more unusual.
Also, it's nothing like JS, because in JS empty arrays and objects are truthy! The commonality with Common Lisp is exactly that "not being empty" is what the concept of true represents.
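A quick sketch of that difference:

```javascript
// Unlike Groovy or Common Lisp, emptiness is not falsiness in JS:
console.log(Boolean([])); // true: empty array is truthy
console.log(Boolean({})); // true: empty object is truthy
console.log(Boolean('')); // false: empty string is the exception
```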
This is so arbitrary. State what you want, make it explicit.
if ( list_of_names ) { ... }
doesn't make any sense at all. For one thing, `list_of_names` should only be allowed to be `null` in exceptional cases; disallowing it removes a whole class of errors.
Second, imagine you have two handy operators / functions `not` and `is_empty`, then the above becomes either
if ( not is_empty list_of_names ) { ... }
or
if ( is_empty list_of_names ) { ... }
as the case may be. In which universe is that too long for a load-bearing program that people depend on?
Loose typing and truthy/falsy values are a disaster waiting to happen, not least because they obfuscate intention, and also because every language that has them has slightly different rules. If I had a say, I'd want an opt-in for JS that makes it illegal (a runtime error; a compilation error for constants) to have anything but a Boolean in the conditional part of an `if` statement.
Null is the sole case in JS that should always be checked with ==. Null or undefined should never be checked with === unless you're explicitly encoding a semantic difference in your types.