You. You are the single point of failure; if you are compromised, then all your accounts can be accessed by whoever compromised you.
If you're looking for a point outside yourself, then memorising all your passwords would be an option.
But beyond that, I don't think your criticism is warranted. There's always a single point of failure - sure - but we can still consider gradations of how centralised that point is, and how likely it is to fail.
With a hosted password manager, you're at the mercy of their server code; specifically, at least for 1Password, I think they have a 'dead man's switch' which lets you get at the encrypted content without the master password. This is more likely to fail than a password manager which stores all its content locally and really encrypts it (e.g. KeePass). In that case, human error outside of yourself can't compromise you. But technical error can, which is why there are further steps that can meaningfully increase your level of security, like running your password manager on a separate, air-gapped computer, or sandboxing everything you run à la Qubes.
Are any of these especially likely to compromise you, as a user? No, but reducing centralisation and dependency still improves your chances, and is definitely worth considering if you are e.g. running a drug smuggling ring.
This is not that great of an article, IMO. A prominent developer of one of the roguelikes mentioned says:
> you can immediately see that this article was stitched together by throwing some wikipedia articles together if it references the Berlin interpretation
> it's also hilarious if the guild of disgruntled adventurers is referenced as fun addition :)
> I'm not sure if I should feel insulted [by the article's description of my roguelike]
I mostly agree with these. I also feel a bit slighted by one of the descriptions. And I'm not sure quite what to make of the fact that they don't mention the two most prominent recent roguelikes: Caves of Qud and Cogmind.
I wrote on reddit[1] about why I prefer this over c++:
> It's not a technical problem, but a social problem. Yes, I would definitely prefer the c++ RAII (and refcounts would be nice too). If you say 'my project is in c++', that sends a certain message to prospective contributors, about what your priorities and ideals are. It can attract certain kinds of contributors and discourage others. Then you have the problem of how to define your subset of c++. It's easy to say 'no exceptions, no RTTI, no STL'. But there are subtler things. As you mention, templates are occasionally useful. But sometimes they're completely superfluous. Do you allow virtual functions? Multiple inheritance? The answer is almost invariably 'maybe'; you have to exercise taste. I can do that by myself, for my own project. But if I want to be able to accept contributions from others, I need a clearer set of contribution guidelines than 'wherever my whimsy takes me', and for such a purpose 'whatever the c compiler accepts' is the best I can do.
> Also, tcc is about 10x faster than gcc and clang, which makes development a joy.
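For context, the RAII point is the one that bites hardest in plain C: without destructors, the usual substitute is the goto-cleanup idiom. A minimal, generic sketch of that pattern (my own illustration, not code from the project under discussion):

    #include <stdio.h>
    #include <stdlib.h>

    /* Read the first line of a file into a freshly allocated buffer.
       All error paths funnel through a single cleanup label. */
    int copy_first_line(const char *path, char **out)
    {
        int ret = -1;
        FILE *f = NULL;
        char *buf = NULL;

        f = fopen(path, "r");
        if (!f)
            goto cleanup;

        buf = malloc(256);
        if (!buf)
            goto cleanup;

        if (!fgets(buf, 256, f))
            goto cleanup;

        *out = buf;
        buf = NULL;   /* ownership transferred to the caller */
        ret = 0;

    cleanup:
        free(buf);    /* free(NULL) is a no-op */
        if (f)
            fclose(f);
        return ret;
    }

It works, and it's easy for contributors to follow, but it's exactly the boilerplate that RAII makes disappear.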
The compiler speed pain is real, and excessive use of templates to do compile time metaprogramming can be intractable, but really it's not the 2000s any more (+) and we should stop pretending that C++ is just C with classes; exceptions, RTTI and STL do actually work now and are portable.
(+) offer not valid on microcontrollers or proprietary C++ compilers
Because the C++ compiler accepts an absolutely insane variety of code styles. I say this as someone who mainly writes C++. If I'm writing my own project, or a project where I only have to work with a small number of people, all of whom I know have extensive C++ experience, or a project that is so large that I can afford to consistently rewrite parts, educate contributors and maintain style guides, I choose C++. But for anything that doesn't tick those boxes, I'd rather use C (or something else). People mixing completely different styles in C++ is a nightmare.
> People mixing completely different styles in C++ is a nightmare.
This is really not my experience. Any decent-sized program will have parts written in a more functional style, others in a more OOP one, others in regular-typed Stepanov bliss, others as template or constexpr metaprograms... and this causes zero issues in practice once people get past their assumptions about what clean code should look like.
I think most large-scale projects, or those with special requirements (things like embedded systems), define which subset of C++ they are OK with. Here are some examples:
Large commercial projects. The problem is mid-sized projects, especially FOSS ones, where there isn't the time needed to curate every pull request and bikeshed. There is also something to be said for having fewer abstractions; moderately WET code can be easier to follow and fix than needlessly DRY code.
Why not use __attribute__((cleanup))? It's widely used in real code (e.g. systemd has been using it for years) and there's effort going on to get the mechanism standardized in the next C standard.
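For readers who haven't used it, here's a minimal sketch of how the mechanism behaves (the free_charp helper name is my own; systemd wraps the same pattern in macros like _cleanup_free_):

    #include <stdio.h>
    #include <stdlib.h>

    /* The handler receives a pointer to the annotated variable. */
    static void free_charp(char **p)
    {
        free(*p);
    }

    int main(void)
    {
        __attribute__((cleanup(free_charp))) char *buf = malloc(64);
        if (!buf)
            return 1;

        snprintf(buf, 64, "hello");
        puts(buf);

        return 0;   /* free_charp(&buf) runs automatically here */
    }

The compiler invokes the handler on every path out of the enclosing scope, early returns included, which is what makes it a workable RAII substitute in C. The catch is that it's a GCC/Clang extension rather than standard C.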
Where's the effort? I happen to follow WG14 proposals closely, but I haven't seen this particular one discussed prominently or incorporated into the latest working draft. I don't even remember a single paper with this feature submitted to WG14 for consideration.
Does MSVC support C99 yet? Much less C11 or a future C2x?
Their priority has always been C++ and for some time their stance on C was basically "I guess we're willing to compile the subset of C that is valid C++."
Do people even use MSVC to compile C (not C++) code? Last I heard it was years behind on the C standards. In any case few in the Linux community care too much about closed source compilers. Lack of MSVC support for cleanups would be amongst the least of your problems if you were trying to use it to compile systemd or libvirt.
Hyperrogue[1] takes place in hyperbolic space. Definitely a good way to gain an intuition for it. It offers several different projections, so you can try them out. It's also open source[2].
Another game to check out in this vein is hypernom, developed by Henry Segerman, Vi Hart, and some other people whose names I don't remember: http://hypernom.com/
In a less principled way, there's also 'Antichamber' and 'Tea for God'.
Both use Escher-like spaces that are locally Euclidean but don't connect up in globally consistent ways. E.g. if you go 360 degrees around a column, you might arrive at a different place in the level from where you started.
In Antichamber it's an interesting gimmick. In 'Tea For God' the mechanic is actually useful, because it's a way to fold a big level into the small boundaries of the VR space you defined on the Oculus Quest in your living room.