The only mainstream language I'm aware of that supports high-precision decimal numbers as a language feature is C#, whose built-in `decimal` type is a 128-bit decimal floating-point number. Java, Python, and Ruby ship decimal types with their standard libraries (BigDecimal, the `decimal` module, and BigDecimal respectively), but to my knowledge those don't have any capabilities you couldn't implement yourself in a library.
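To illustrate the library approach, here's what Python's standard-library `decimal` module looks like in practice (this is real stdlib code, not anything hypothetical):

```python
from decimal import Decimal, getcontext

# Binary floats accumulate representation error:
print(0.1 + 0.2)                         # 0.30000000000000004

# The decimal module (a library type, not a language feature)
# represents the same values exactly:
print(Decimal("0.1") + Decimal("0.2"))   # 0.3

# Precision is configurable per context, e.g. 50 significant digits:
getcontext().prec = 50
print(Decimal(1) / Decimal(7))
# 0.14285714285714285714285714285714285714285714285714
```

The catch is exactly what makes a language feature appealing: you have to remember to construct `Decimal` values from strings and use the module everywhere, because there are no decimal literals and the `0.1` you type is still a binary float.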
Regardless, there is an active proposal to add a 128-bit decimal type to JS: https://github.com/tc39/proposal-decimal