The author's previous article about "...(not) using #!/usr/bin/env whatever" doesn't sit well with me.
"The only reason to start your script with '#!/usr/bin/env <whatever>' is if you expect your script to run on a system where Bash or whatever else isn't where you expect (or when it has to run on systems that have '<whatever>' in different places, which is probably most common for third party packages)."
His very first point is that you should only use it when you don't know where to expect the bash binary, whereas I feel that, while hard-coding the path is probably safe on most *nix OSes, it limits future flexibility by requiring that the binary stay in that exact place, however unlikely it is that it would need to move or that someone would want to move it.
I've encountered systems that only have bash at /bin/bash, and others that only have it at /usr/bin/bash, and it's a hell of a pain to have to fix every script when moving between distros (I think it must've been an old Fedora and Ubuntu?).
Nowadays, most distros are moving towards having /bin be a symlink to /usr/bin, so it matters less and less, but I see no reason not to just use /usr/bin/env, which is supposed to be in the same place on every distro.
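For what it's worth, it's easy to check whether a given distro has done the usr-merge (the output shown is just what a merged system typically looks like):

```
# On a usr-merged distro /bin is a symlink into /usr, so /bin/bash and
# /usr/bin/bash end up being the same file either way:
$ readlink /bin
usr/bin
```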
I use `#!/usr/bin/env bash` because the /bin/bash on macOS is usually stupidly ancient, and I install a recent version of bash to ~/.local/bin/bash
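Roughly what that setup looks like (a sketch; the version strings and the ~/.local/bin path are just illustrative):

```
# System bash on macOS is stuck on 3.2, the last GPLv2 release:
$ /bin/bash --version | head -1
GNU bash, version 3.2.57(1)-release (arm64-apple-darwin23)

# With a newer bash installed under ~/.local/bin and that directory early on
# PATH, `#!/usr/bin/env bash` resolves to the new one:
$ export PATH="$HOME/.local/bin:$PATH"
$ /usr/bin/env bash --version | head -1
GNU bash, version 5.2.26(1)-release (aarch64-apple-darwin23)

# A hard-coded `#!/bin/bash` shebang would always get the ancient system copy.
```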
Is it bad? Well, it's less secure. But if you're worried about `/usr/bin/env` calling a malicious program, then you'd need to spell out the full path for every executable in the script, and there are a hell of a lot of other things to worry about too.
It's the same for `#!/usr/bin/env python3` in Python scripts. The system-installed python3 might be ancient, or you might need to be running inside a venv. So `/usr/bin/env python3` works correctly while `/usr/bin/python3` doesn't.
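A quick way to see that difference (a sketch; the venv path and output are made up for illustration):

```
# Create and activate a venv; activation prepends its bin/ to PATH.
$ python3 -m venv ~/venvs/demo
$ source ~/venvs/demo/bin/activate

# `env python3` follows PATH, so it finds the venv's interpreter...
$ /usr/bin/env python3 -c 'import sys; print(sys.prefix)'
/home/me/venvs/demo

# ...while a hard-coded /usr/bin/python3 bypasses the venv entirely.
$ /usr/bin/python3 -c 'import sys; print(sys.prefix)'
/usr
```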
Never thought of it this way; isn’t it always safe to assume env is in PATH?
Maybe `#! env <shell>` could be considered a DSL for hashbangs. My reasoning is that `/usr/bin/env` is the thing that seems to be hard-coded to a system path, in most cases.
This character was code 0x01 in the original IBM PC code page (https://en.wikipedia.org/wiki/Code_page_437), and hence in DOS. It was displayed single-width and monochrome just like any other 8-bit character, never causing any rendering issues, unlike emojis today. It was added to Unicode for round-trip compatibility with that code page.
Emoticons are not the same as emojis. For one, they allow for more expression or personal style by having different variants, e.g. :-) vs :) or for absolute maniacs: (:
They are also not limited to what some consortium and a couple of megacorporations think you should be able to express.
I think there's a difference. The code point will always mean "eggplant"; it just happens that the concept can be interpreted in different ways according to context, just like the word itself. But ":-)" can only ever mean "colon minus rparens" before further interpretation.
Actually, according to Unicode, "-" doesn't mean minus; U+002D is hyphen-minus.
And as for the eggplant, your semantics-as-specified are useless when 99.9% of the usage has a different intended meaning due to the inherent lack of expressiveness in a corporate-approved emoji language.
What it's called is syntax; what it means is always context-dependent. That's why we invented formal notation, so that we can have context-free interpretation (the notation is bundled with its semantics, so you don't need to apply some context to it).
- That's crazy. I would say no way.
- I hope you're in a position to move on. Like someone else said, they basically just fired you.
- You could try to fight it. Probably expensive and no guaranteed payday.
- Companies should be able to ask whatever they want.
- Would you want to work for someone like this anyways? Hopefully no one would.
Dual video with Optimus manager works quite well. You can't truly shut off the Nvidia card, from what I understand, but switching between Intel/hybrid/Nvidia still yields decent battery life and performance optimizations.
Although it still requires the Nvidia drivers and blacklisting the nouveau driver. And personally, I hate relying on Nvidia drivers for anything Linux.
"The only reason to start your script with '#!/usr/bin/env <whatever>' is if you expect your script to run on a system where Bash or whatever else isn't where you expect (or when it has to run on systems that have '<whatever>' in different places, which is probably most common for third party packages)."
His very first point is how you should only use it don't know where to expect bash binary, when I feel like, while it's probably safe in most nix os', assuming it limits future enhancements by requiring that binary be in place. However unlikely it would need to or someone would want to move it.