autograd's comments

It's easy to "trivially know" something when it's really expensive to disprove, so you just keep believing your original claim.

I am not in game development, but I was recently looking into some deep learning networks. A lot of compute, a bazillion layers of abstraction (erm, Python), but at the end of the day, 98% of the time was actually spent computing the matmuls. I was lucky that it was easy to check this and estimate the abstraction cost (not in great detail), and disprove my original assumption.

Had I not checked, I'd still "trivially know" that far more performance was lost to abstractions.
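The kind of spot check described above can be sketched like this (a minimal sketch, assuming NumPy is installed; the matrix sizes and iteration counts are arbitrary):

```python
import time
import numpy as np

a = np.random.rand(500, 500)
b = np.random.rand(500, 500)

def wrapped(x, y):
    # one extra layer of Python indirection around the kernel
    return np.matmul(x, y)

# time the raw matmul kernel
start = time.perf_counter()
for _ in range(20):
    np.matmul(a, b)
raw = time.perf_counter() - start

# time the same work behind a Python-level wrapper
start = time.perf_counter()
for _ in range(20):
    wrapped(a, b)
with_wrapper = time.perf_counter() - start

print(f"raw matmul: {raw:.4f}s, with Python wrapper: {with_wrapper:.4f}s")
```

For large matrices the two timings come out nearly identical, which is the point: the matmul dominates and the Python abstraction layer barely registers.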


It is unclear to me whether it's archived because they've done everything they wanted, or because they simply stopped all efforts.


The latter. The project was not close to completion.


It's pretty refreshing that people have the courage to start ambitious projects like this one :).

I hope projects like this get some traction and prove that we can move to a higher level of abstraction than where we are at the moment with frontend and frontend-backend code.


Thanks a lot for your support! We ourselves are not yet sure in which direction Wasp will evolve, but we find it very exciting to work on, especially when we see other people excited about it too :)


Thank you :)


Docker is definitely an interesting tool for that, but my biggest problem is that I'd have to teach them Docker, which is a totally new layer of abstraction they haven't seen before.

How do you approach this? How technical are the people you prepare Docker images for?


You don't need to teach Docker. All you need is to provide a Docker image with everything pre-installed, such as Julia, R, Python, NumPy, pandas, TensorFlow and maybe VS Code, on top of any Linux distribution. Then one can just type "$ docker run --rm -it -v $PWD:/cwd -w /cwd my-image ipython". For convenience, it is better to create a command-line wrapper or shell script that saves one from typing all that, such as $ ./run-my-image ipython.

I don't prepare images for anyone, but I guess that if I knew nothing about Docker and was given an image with everything ready and pre-configured, plus a shell script encapsulating all the Docker command-line switches, I would find it more convenient than installing everything myself or fighting some dependency conflict or dependency hell. So Docker can be used as a portable development environment. VS Code (Visual Studio Code) also supports remote development inside Docker containers, with extensions installed per container.

I am a mechanical engineer by training, but I found Docker pretty convenient for getting Julia, Octave, R, Python and a Jupyter Notebook server without installing anything or fighting with the package manager of my Linux distribution when attempting to install a different version of R, Julia or Python. This approach makes it easier to get bleeding-edge development tools without breaking anything that is already installed. I even created a command-line wrapper tool for using Docker this way that simplifies all those cases: $ mytool bash jupyter-image; $ mytool daemon jupyter-notebook ...
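The wrapper script mentioned above could be sketched roughly like this ("my-image" and "run-my-image" are hypothetical names, not from any real project; adapt to the image you actually distribute):

```shell
#!/bin/sh
# run-my-image: thin wrapper around the docker invocation described above,
# so users never have to learn the docker command-line switches.
set -eu

IMAGE="my-image"

# Default to an interactive shell when no command is given.
if [ "$#" -eq 0 ]; then
    set -- bash
fi

# Mount the current directory into the container and run the requested
# command (ipython, jupyter, bash, ...) with /cwd as the working directory.
exec docker run --rm -it \
    -v "$PWD":/cwd \
    -w /cwd \
    "$IMAGE" "$@"
```

With this in place, "./run-my-image ipython" behaves like a locally installed ipython, except everything actually runs inside the container.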


Does it pin versions of second-degree (transitive) dependencies too, like pip freeze would? Also, when you remove a package, does it know to clear packages that were its dependencies and are no longer needed?


pip-tools can do both of those, yes.

For the second, pip-compile computes the new requirements.txt (which is effectively the lockfile) from scratch, and pip-sync (not shown in that Makefile fragment) removes packages that are no longer listed there.
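That workflow can be sketched as follows (assuming pip-tools is installed in the active virtualenv; "requests" is just an example dependency):

```shell
# requirements.in lists only the direct dependencies
echo "requests" > requirements.in

# pip-compile resolves the full tree (transitive dependencies included)
# and pins every version into requirements.txt, the effective lockfile
pip-compile requirements.in

# pip-sync makes the environment match requirements.txt exactly:
# it installs what is listed and removes anything that no longer is
pip-sync requirements.txt
```

Removing a line from requirements.in and re-running pip-compile then pip-sync is what clears out the now-orphaned transitive dependencies.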


Thanks a lot for the reply. I'll include it in the article update some time soon.


Yes, and yes


Thanks for mentioning it! How would you compare it to the other tools from the original post?


Pipenv is almost exactly pip-tools + venv plus a custom UI over the two, with slightly easier / more automatic venv management.

From the times I've looked at it (almost a year ago and earlier): pip-tools is / was the core of Pipenv... but it has been a fair distance ahead in terms of bug fixes and usable output for diagnosing conflicts. It seemed like Pipenv forked from pip-tools a couple of years prior and didn't keep up.

Given that, I've been a happy user of v(irtual)env and pip-tools. Pipenv has remained on the "maybe some day, when it's better" side of things, since I think it does have potential.


I think this is not a problem specific to Python packages, but a general problem of how we compile C/C++ software. There is no concept of packages, and compiling one thing often requires installing a -dev package of some other library.

The issue is that the lack of packaging in the C/C++ world spreads to all the other communities that depend on it.


Hey xapata, thanks for pointing this out.

Any chance you could give me some reference so I can fix it in the original article?


https://docs.conda.io/projects/conda/en/latest/user-guide/ta...

Basically, use Conda to manage environments and Pip to install packages. If you're using Conda to install anything, do that first.
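A sketch of that ordering (assuming conda is installed; the environment name and package choices are just examples, and "some-pure-python-package" is a placeholder):

```shell
# create and activate an environment with a pinned Python version
conda create -n myproject python=3.8
conda activate myproject

# conda-managed packages go in first...
conda install numpy

# ...then pip fills in whatever conda doesn't provide
pip install some-pure-python-package
```

Installing with conda after pip can overwrite pip-installed files and leave the environment inconsistent, which is why the conda-first ordering matters.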


I think that works when you use Python CLI tools, but not when you're working on 5 different projects, each running a different Python version.


Outside of Python 2/3 differences, are Python interpreters not backwards compatible?

In other words, a program written for 3.3 obviously won't work in 2.7, but will a program written for 3.3 fail to run in 3.8?

If it runs fine, why the need for multiple interpreters? I'd think you'd get by just fine with the latest 2.x and 3.x installed.


Because underscore-prefixed functions aren't truly private, I once saw an upgrade from 2.7.8 to 2.7.13 fail: a commonly used package was importing one from a core Python module.
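The failure mode can be sketched like this ("corelib" and "_helper" are hypothetical names standing in for the core module and its private function; the two module objects simulate the before/after of a patch upgrade):

```python
import types

# "version 2.7.8" of the core module: _helper happens to exist
corelib_old = types.ModuleType("corelib")
corelib_old._helper = lambda: "ok"

# "version 2.7.13": maintainers renamed the private function,
# which they are free to do in any release without notice
corelib_new = types.ModuleType("corelib")
corelib_new._helper_v2 = lambda: "ok"

def third_party_code(corelib):
    # a commonly-used package reaching into a private name
    return corelib._helper()

print(third_party_code(corelib_old))      # works today
try:
    third_party_code(corelib_new)
except AttributeError as e:
    print("broken after upgrade:", e)
```

Underscore names carry no runtime protection, only a convention, so nothing stops a package from depending on one until a minor upgrade removes it.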


It's just about version compatibility. If you don't run the exact version of Python that's running in production, how do you know you're not using some method that doesn't exist in production yet (because an older version runs there)?

Also, libraries with a binary component often have to be compiled against a specific version of Python.


Sometimes! A very simple example is code that uses "async" as a variable name. "async" arrived with the async/await syntax in 3.5 and became a fully reserved keyword in 3.7, which was an enormous pain in the ass.

