It's easy to "trivially know" when it's really expensive to disprove the statement, so you just keep believing the original one.
I am not in game development, but I was recently looking into some deep learning networks. A lot of compute, a bazillion layers of abstraction (erm, Python), but at the end of the day 98% of the time was really spent computing the matmuls. I was lucky that it was easy to check this and estimate the abstraction cost (not in great detail), and disprove my original thought.
Had I not checked, I'd still "trivially know" that far more perf loss happened because of abstractions.
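If you want to do a similar sanity check, a rough pass with the built-in profiler is usually enough to see where the time actually goes; a minimal sketch, assuming your entry point is a script called train.py (the name is just a placeholder):

  $ python -m cProfile -s cumtime train.py
  # sorts by cumulative time; if the BLAS/matmul calls dominate the top of the
  # report, the abstraction layers above them are not where your time is going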
Thanks a lot for your support! We ourselves are not yet sure in which direction Wasp will evolve, but we find it very exciting to work on, especially when we see other people being excited about it too :)
Docker is definitely an interesting tool for that, but my biggest problem is that I'd have to teach them Docker, which is a totally new layer of abstraction they haven't seen before.
How do you approach this? How technical are the people you prepare Docker images for?
You don't need to teach Docker. All you need to do is provide a Docker image, based on any Linux distribution, with everything pre-installed: Julia, R, Python, numpy, pandas, TensorFlow, and maybe VSCode. Then one can just type "$ docker run --rm -it -v $PWD:/cwd -w /cwd my-image ipython". For convenience, it's better to create a command line wrapper or shell script that saves people from typing all of that, such as "$ ./run-my-image ipython".

I don't prepare images for anyone, but I'd guess that if I knew nothing about Docker and was given an image with everything ready and pre-configured, plus a shell script encapsulating all the Docker command line switches, I would find it more convenient than installing everything myself or fighting some dependency conflict or dependency hell. So Docker can be used as a portable development environment. VSCode (Visual Studio Code) also supports remote development inside Docker containers, with extensions installed per container.

I am a mechanical engineer by training, but I found Docker pretty convenient for getting Julia, Octave, R, Python and a Jupyter Notebook server without installing anything or fighting with my Linux distribution's package manager when attempting to install a different version of R, Julia or Python. This approach makes it easier to get bleeding-edge development tools without breaking anything that is already installed. I even created a command line wrapper tool for using Docker this way that simplifies all those cases: $ mytool bash jupyter-image; $ mytool daemon jupyter-notebook ...
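To give an idea of what such a wrapper can look like (run-my-image and my-image are made-up names), something this small already hides every Docker switch from the user:

  #!/bin/sh
  # run-my-image: run a command inside the pre-built image,
  # with the current directory mounted as the working directory.
  exec docker run --rm -it -v "$PWD":/cwd -w /cwd my-image "$@"

After a chmod +x, "$ ./run-my-image ipython" is all anyone has to remember.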
Does it pin versions of 2nd degree dependencies too? Like pip freeze would do?
Also, when you remove a package, does it know to clear packages that were its deps and are not needed anymore?
For the second, pip-compile computes the new requirements.txt (which is effectively the lockfile) from scratch, and pip-sync (not shown in that Makefile fragment) removes packages that are no longer listed there.
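For anyone who hasn't used pip-tools, the workflow is roughly this (requirements.in is the hand-edited file listing only your direct dependencies):

  $ pip-compile requirements.in   # writes requirements.txt with every transitive dependency pinned
  $ pip-sync requirements.txt     # installs, upgrades and removes packages until the env matches it exactly

And yes, second-degree dependencies end up pinned in requirements.txt too, annotated with which package pulled them in.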
Pipenv is almost exactly pip-tools + venv plus a custom UI over the two, with slightly easier / more automatic venv management.
From the times I've looked at it (almost a year ago and older): pip-tools is / was the core of Pipenv... but pip-tools has been a fair distance ahead in terms of bug fixes and usable output for diagnosing conflicts. It seemed like Pipenv forked from pip-tools a couple of years prior and didn't keep up.
Given that, I've been a happy user of v(irtual)env and pip-tools. Pipenv has remained on the "maybe some day, when it's better" side of things, since I think it does have potential.
I think this is not a problem specific to Python packages, but a general problem of how we compile C/C++ software. There is no concept of packages, and compiling one thing often requires installing a -dev package of some other library.
The issue is that the lack of packaging in the C/C++ world spreads to all the other communities that depend on it.
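A concrete example of what that looks like from the Python side: building a package with a C extension from source usually means hunting down the right -dev headers first. The package names below are the Debian/Ubuntu ones, adjust for your distro:

  $ sudo apt-get install libxml2-dev libxslt1-dev   # C headers lxml is compiled against
  $ pip install --no-binary :all: lxml              # force a source build instead of a prebuilt wheel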
Because underscore functions aren't truly private, I once saw an upgrade from 2.7.8 to 2.7.13 fail: a commonly used package was importing one from a core Python module.
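For illustration (a stand-in, not the actual module from that incident): nothing stops a package from doing the following, and nothing in a point release promises the name will still be there afterwards.

  $ python -c "from socket import _GLOBAL_DEFAULT_TIMEOUT; print(_GLOBAL_DEFAULT_TIMEOUT)"
  # works today, but any 2.7.x or 3.x.y release is free to rename or drop an underscore name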
It's just about backward compatibility. If you don't run the exact version of Python that's running in production, how do you know you're not using some method that doesn't exist in production yet (because an older version runs there)?
Also, libraries with a binary component often have to be compiled against a specific version of Python.