> It is the only language I know of that has two "absent value" types.
It's not very common, but not unique either. Perl has `undef`, which works roughly the same way and was later generalized into the "undefinedness" concept in Raku [1]. Ruby doesn't have an `undef` value, but it could have had one in an alternate universe [2]. Still more languages have multiple absent values that don't necessarily map onto the null-undef distinction (e.g. Objective-C).
---
I think the separate `undef` value was mainly regarded as a solution to the apparent problem of detecting absence in general, for example the absence of an index or an argument. Consider the following Python function:
    def foo(obj=None): ...
By itself, the optional `obj` argument cannot distinguish `foo(obj=None)` from `foo()`. A common idiom is to use a private object in place of `None`.
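A minimal sketch of that idiom; the sentinel is just a fresh `object()`, and the branch bodies are illustrative:

    _NOT_GIVEN = object()  # private sentinel; no caller should ever see it

    def foo(obj=_NOT_GIVEN):
        if obj is _NOT_GIVEN:
            ...  # genuinely called as foo()
        else:
            ...  # obj was given explicitly, possibly as None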
It is still possible to somehow obtain a reference to `_NOT_GIVEN` and call `foo(obj=_NOT_GIVEN)`, which is indistinguishable from `foo()`, but why would you do that? `None` is sometimes a valid value for the optional argument, while `_NOT_GIVEN` is clearly designated as invalid there. Now rename `_NOT_GIVEN` and make it a language construct, and voila: you've got `undef`.
`Undef` might have been a workable solution a decade ago, when we were still struggling with dynamically typed languages in general and systematic approaches were less common. Lua, for example, uses `nil` for both purposes: `t[key] = nil` is the valid way to remove a given key from the table `t` (with the caveat that it doesn't shift any subsequent keys if the key was an integer), and an excess argument is filled with `nil` [3]. This is painful from time to time; say, a table of optional integers is not straightforward. `Undef` might have been a good compromise under this observation... if we didn't have algebraic/sum data types like we do today.
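For contrast, here is a minimal sketch of the sum-type route in Python; the `Some`/`Nothing` names are mine, not from any of the languages above. Absence becomes a structurally distinct value rather than a second ambient null:

    from dataclasses import dataclass
    from typing import Generic, TypeVar

    T = TypeVar("T")

    @dataclass(frozen=True)
    class Some(Generic[T]):
        value: T

    @dataclass(frozen=True)
    class Nothing:
        pass

    def foo(obj: "Some[object] | Nothing" = Nothing()):
        # Requires Python 3.10+ for structural pattern matching.
        match obj:
            case Nothing():
                return "not given"
            case Some(v):
                return f"given: {v!r}"  # v may legitimately be None

    assert foo() == "not given"
    assert foo(Some(None)) == "given: None"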
---
I mean, it's common when the argument's domain is all possible Python objects, so `None` itself must remain usable as a literal value. That requirement is not very common, but virtually all code with that requirement uses this idiom, to my knowledge.
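One concrete shape of that requirement, as a sketch (`expensive_compute` is a hypothetical helper): a cache where `None` is a legitimate cached value, so the miss marker must be something no caller can pass in.

    _MISS = object()  # module-private miss marker

    def cached_get(cache, key):
        value = cache.get(key, _MISS)
        if value is _MISS:  # key truly absent; a cached None would not land here
            value = expensive_compute(key)  # hypothetical helper
            cache[key] = value
        return value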
[1] https://docs.raku.org/language/typesystem#Undefinedness
[2] https://stackoverflow.com/questions/6975266/what-is-the-unde...
[3] Lua has even tried hard to remove any visible distinction between an actual `nil` and the real absence of a value! It's still not perfect, though, and the discrepancy is much easier to detect from the C API.