I did a little bit of Ada programming, and one of the distinct features it has is you can't do anything clever without being extremely explicit about it. This language has the most aggressive type-correctness checking I have ever seen. When you finally get your code to compile, it's likely that it will work reliably.
Ada combines its type system with managed memory pools to make manual memory management safe, which is very cool: you get memory control comparable to C/C++ but with safety closer to an ML-family language.
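As a small illustration of how explicit Ada makes you be about memory: even just freeing heap memory requires instantiating a generic whose name, Ada.Unchecked_Deallocation, is deliberately designed to jump out during code review. A minimal sketch (the procedure and type names are mine):

```ada
with Ada.Unchecked_Deallocation;

procedure Free_Demo is
   type Int_Ptr is access Integer;

   --  You cannot just "free" an access value; you must explicitly
   --  instantiate this generic, and its loud name flags the unsafety.
   procedure Free is new Ada.Unchecked_Deallocation (Integer, Int_Ptr);

   P : Int_Ptr := new Integer'(42);
begin
   Free (P);  --  deallocates and sets P to null
end Free_Demo;
```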
People usually think of Haskell when strict type systems come up. Ada is a "different kind" of strict, though.
It doesn't do the fancy (and sometimes confusing) inference that Haskell uses to verify that type conversions and transformations follow the intended paths; instead, it requires you to be very explicit about what you're doing.
A few examples:
(1) Reading something from stdin is a minor challenge, because Strings are fixed-length and don't interoperate cleanly with the separate arbitrary-length string types.
(2) If you want to do dynamic memory management yourself, you have to declare your packages as such, with the intent that this shows up during reviews and leads to the code being scrutinized more thoroughly.
(3) You can create special subtypes of primitives, like integers restricted to a well-specified range, and put those into APIs instead of ordinary machine/compiler-dependent integers.
(4) Conditions (in ifs and such) don't short-circuit by default; you have to use special keywords if you want short-circuit evaluation.
And the list goes on.
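Point (4) is worth a quick sketch: plain "and"/"or" evaluate both operands, and you opt into short-circuiting with the keywords "and then" / "or else". Something like this (names are mine):

```ada
with Ada.Text_IO; use Ada.Text_IO;

procedure Short_Circuit_Demo is
   type Int_Ptr is access Integer;
   P : Int_Ptr := null;
begin
   --  With plain "and", both operands are evaluated, so P.all would
   --  be dereferenced even when P is null and raise Constraint_Error.
   --  "and then" stops after the first operand is False.
   if P /= null and then P.all > 0 then
      Put_Line ("positive");
   else
      Put_Line ("null or non-positive");
   end if;
end Short_Circuit_Demo;
```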
All in all, Ada is quite enjoyable imho. You can write code without agonizing over it, because the compiler ensures you won't be missing (many) corner cases.
On the other hand, all of this explicitness means more work, which can feel tedious to some - but at least it's not the mind-boggling "what is this *?!" strictness that has you pulling your hair out in confusion and desperation, which usually accompanies Haskell.
You can declare an integer range like 2..15 if you know those are your only possible input values; nothing else will be accepted - no need to parse it yourself or check upper/lower bounds or signedness, the compiler (and, later, the runtime) does it for you.
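A minimal sketch of such a ranged type (type and function names are mine): an explicit conversion into the ranged type performs the bounds check, raising Constraint_Error for out-of-range values.

```ada
with Ada.Text_IO; use Ada.Text_IO;

procedure Range_Demo is
   --  Only 2 .. 15 are legal values; anything else raises
   --  Constraint_Error at runtime (or is rejected at compile time
   --  when the compiler can prove the violation).
   type Small is range 2 .. 15;

   function To_Small (N : Integer) return Small is
   begin
      return Small (N);  --  the conversion itself checks the range
   end To_Small;

   V : Small;
begin
   V := To_Small (10);
   Put_Line ("10 accepted");
   V := To_Small (99);  --  out of range
   Put_Line ("never reached");
exception
   when Constraint_Error =>
      Put_Line ("99 rejected at runtime");
end Range_Demo;
```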