by jtanderson on 8/12/19, 4:33 AM with 75 comments
by SmirkingRevenge on 8/12/19, 12:48 PM
- vanilla virtualenv: I don't even bother with the wrapper most of the time.
- vanilla setup.py/setup.cfg: It's just really not that bad. Forget about pyproject.toml or whatever the next big thing is.
- pip-tools: ditching pipenv and using this for pinning app requirements, has made my life so much simpler.
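The pip-tools workflow mentioned above can be sketched like this (package names are just examples, not from the comment):

```shell
# Hand-maintain only your direct, unpinned requirements:
cat > requirements.in <<'EOF'
requests
flask
EOF

# pip-compile (from pip-tools) resolves and pins the full dependency
# tree into requirements.txt; pip-sync then makes the env match it exactly:
#   pip-compile requirements.in
#   pip-sync
cat requirements.in
```

Because requirements.in stays small and unpinned, upgrading is just re-running pip-compile.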
by 1337shadow on 8/12/19, 1:41 PM
Using --user here, no per-project virtualenv. I used to have a virtualenv in ~/.env, where I kept everything up to date and had many -e installs from ~/src. For production deployments: --user in containers.
This means all my projects have to work with the same versions of dependencies.
This means all my dependencies have to work with the same versions of dependencies.
All my dependencies have to be up to date: all versions must be latest, or next to-be-released.
When a dependency doesn't support new versions of other dependencies, I deploy a fork and open a PR.
As such, I'm not spending time trying to make obsolete versions work; rather, I'm spending time making new versions work together.
My perception is that this strategy offers better ROI than all others which I have tried, not only for me, but for the whole ecosystem. That's also how I made my Python 2 to 3 transition, and have been using 100% Python 3 for a while (once you've ported one big software, porting others and their smaller dependencies is easy).
I'm extremely happy with this strategy; my life has just been better since I started working this way, and I highly recommend it.
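A sketch of the setup described above (paths and package names are examples, assuming source checkouts live under ~/src):

```shell
# Editable (-e) installs from local checkouts into the per-user
# site-packages; one shared set of versions, no per-project virtualenv:
#   pip install --user -e ~/src/mylib
#   pip install --user -e ~/src/myapp
# Everything lands under the user site directory:
python3 -c "import site; print(site.getusersitepackages())"
```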
For my own (countless) packages: I use setupmeta, which simplifies auto-updates (I think it succeeds at what pbr tried to achieve).
by micimize on 8/12/19, 4:42 PM
With poetry+pyproject.toml, I have one file that is flat, simple, and readable. @sdispater has done incredible work on this project, and I hope they get more resources for development.
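For reference, that single flat file looks roughly like this (project name, dependencies, and versions are made up; this reflects the 2019-era Poetry layout, not necessarily the current one):

```toml
[tool.poetry]
name = "myapp"
version = "0.1.0"
description = "Example project"
authors = ["Jane Doe <jane@example.com>"]

[tool.poetry.dependencies]
python = "^3.7"
requests = "^2.22"

[tool.poetry.dev-dependencies]
pytest = "^5.0"

[build-system]
requires = ["poetry>=0.12"]
build-backend = "poetry.masonry.api"
```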
by korijn on 8/12/19, 7:23 AM
Anyway, has Poetry caught up to the dependency resolution provided by the pipenv lock command yet? Last time I tried it (~6 months ago), it couldn't produce working environments for my requirements.
BTW, for anyone who wants to educate themselves, read the issue description here and follow the links: https://github.com/pypa/pip/issues/988
by angrygoat on 8/12/19, 8:18 AM
One solution at the moment is to run 'pip freeze' and put that in as your requirements file, but that very much feels like an 'and now I have a different problem' solution.
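Concretely, the freeze workaround looks like this, along with the new problem it creates:

```shell
# Pin the entire current environment verbatim:
python3 -m pip freeze > requirements.txt
head -n 3 requirements.txt
# New problem: direct and transitive dependencies are now indistinguishable,
# and anything stale that happens to be installed gets pinned too.
```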
by 0xbadcafebee on 8/12/19, 3:04 PM
The biggest problem with Python packages isn't a dependency manager, it's that Python developers don't reuse and extend existing projects. They just keep churning out new similar projects with different features. All of the most commonly used Python packages are basically one-offs that use a bunch of other one-offs. This creates a cycle of reinventing the wheel and friction in ramping up on existing technology, due to all the extra integration work and split development, and the later lift-and-shift into completely different dependencies that implement the same thing. PyPI is awash with duplicate abandoned one-offs, to the point that just finding a useful module to do what you want can take a long time.
by andyljones on 8/12/19, 7:43 AM
For people working outside of scientific Python: conda is a package and env manager maintained by a private company that's become the go-to because it's really good at handling binary dependencies.
by HunOL on 8/12/19, 7:57 AM
by macawfish on 8/12/19, 3:23 PM
I can't be too mad at pipenv, but all in all poetry is a better experience.
by dissent on 8/12/19, 11:36 AM
by ilovecaching on 8/12/19, 3:08 PM
Python makes huge sacrifices in readability and efficiency for a hypothetical win in writability. It's also fundamentally at odds with the multicore world we're living in and will continue to live in, an issue that many people have tried to fix and failed at.
I can't count the number of times I've had to spend my entire night reading through an older Python service without type annotations, desperately trying to understand the types of inputs and outputs. It's just a mess, and programmers' bias toward believing they write readable code (when in reality it's only readable to them) makes it worse, encouraging names that provide little context for the problem being solved.
Python is awful, and it's an uphill battle to fix it. Asyncio can solve some performance issues, but it negates the benefit of Python's simplicity by destroying the sequential mental model that programmers can easily grok. It also requires a complete rewrite of libraries and a split between non-asyncio land and asyncio land.
Type annotations can make Python more readable and thus more maintainable, but gradual typing still leaves lots of unanswered questions when you're interfacing with untyped code.
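A tiny made-up example of the difference annotations make (run through python3 here; a checker like mypy could verify the annotated version statically):

```shell
python3 - <<'EOF'
from typing import List

# Unannotated: is prices a list? of floats? what comes back?
def total_untyped(prices, tax=0.0):
    return sum(prices) * (1 + tax)

# Annotated: inputs and output are self-documenting and checkable.
def total(prices: List[float], tax: float = 0.0) -> float:
    return sum(prices) * (1 + tax)

print(total([1.0, 2.0], tax=0.1))
EOF
```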
Python is really not good unless you are using it to prototype or build something on your own. It's led to a world of slow, buggy software that is impossible to rewrite. Its downsides are easy to measure, but its benefits are difficult to quantify.
by bbmario on 8/12/19, 4:18 PM
by zys5945 on 8/12/19, 7:35 AM
by sametmax on 8/12/19, 6:49 AM
by julienkervizic on 8/13/19, 7:22 PM
I see, though, that it only supports pure Python for building packages; does that mean it can't build packages that depend on compiled libraries?
Is there also a plan to add some of the functionality of bundling tools such as webpack into this build phase? Like automated CSS optimization, image compression... Could be handy for some django/flask projects.
by j88439h84 on 8/12/19, 6:21 AM
I hope the rest of the ecosystem can catch up quickly.
Tox, pip, and pex need full support for PEP 517/518.
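For context: PEP 518 specifies the pyproject.toml [build-system] table, and PEP 517 defines the build-backend API behind it. "Full support" means a tool reads this table instead of assuming setup.py exists (values below are just one common example, using the setuptools backend):

```toml
[build-system]
requires = ["setuptools>=40.8", "wheel"]
build-backend = "setuptools.build_meta"
```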
by oweiler on 8/12/19, 7:20 AM
by markandrewj on 8/12/19, 4:57 PM
by pbreit on 8/12/19, 8:18 AM
by Areading314 on 8/12/19, 5:31 AM
by heyoni on 8/12/19, 5:52 AM
by fouc on 8/14/19, 9:36 AM
by ausjke on 8/12/19, 11:54 PM