
I speculate that the group of users who are both (a) willing to install something beyond pip for package management, and (b) not willing to install a proper dependency resolver (uv, etc.) is small.

If you are willing to use a third-party packaging tool (a big leap), I think it's a small step to use one that fixes all of pip's limitations rather than just one of them.

Perhaps it's not clear from my description above, but I'm afraid the flaw is in the Python packaging ecosystem itself rather than in pip. I'm not very familiar with uv, but from what I can tell from the documentation, it needs to execute the same steps as pip to resolve metadata, as this is required by various PEPs. (You can have a look at the diagram in the linked blog post https://medium.com/data-science-collective/pipask-know-what-...).

But I also get your point - advanced users who care about security may not be using pip. Implementing the functionality as a plugin for uv or poetry is actually the next step I'm considering, if people find the concept of pipask useful. What do you think?


I simply wouldn't use this as-is, but I would like it as a uv plugin; Poetry seems like a dead end in 2025 to me.

Why don’t we update pypi to require publishing dependency metadata along with packages, so that the deps can be resolved without running code?

Isn't this what pyproject.toml solves? Genuine question, as I am blissfully unaware of the intricacies of dependency resolution.

Short version, although you already got an answer:

If everyone had to use it, and everyone were only allowed to use "static" dependencies determined ahead of time, yes. But:

* legacy projects that don't use pyproject.toml are still supported

* it's possible to publish an "sdist" source package that's built on the user's machine (for example, because it includes C code that's highly machine specific and needs to be custom built for some reason; or because the user wants to build it locally in order to link against large, locally available libraries instead of using a massive wheel that copies them)

* When something is built locally, it's permissible to determine the dependencies during that build process (and in some rare cases, that may be another reason why an sdist gets used - the user's environment needs to be inspected in order to figure out what dependencies to fetch)

* Even if it did work, `pyproject.toml` is really more like "source code" for the metadata (about dependencies and other things). The real metadata is a file called `PKG-INFO` when it appears in an sdist, or `METADATA` in a wheel. The format is based on email headers (yes, really).
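To make the last point concrete: because the real metadata format is based on email headers, it can be read with the standard library's `email` module. A minimal sketch (the METADATA contents below are invented for illustration):

```python
from email.parser import Parser

# A toy METADATA file, as found inside a wheel's .dist-info directory
# (contents invented for illustration).
raw = """\
Metadata-Version: 2.1
Name: example-package
Version: 1.0.0
Requires-Dist: requests>=2.0
Requires-Dist: tomli; python_version < "3.11"
"""

msg = Parser().parsestr(raw)
print(msg["Name"])                   # the package name
print(msg.get_all("Requires-Dist"))  # all declared dependency specifiers
```

Repeatable keys like `Requires-Dist` are why `get_all` is needed rather than plain item access, which only returns the first match.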


Have a look at the diagram in the accompanying blog post https://medium.com/data-science-collective/pipask-know-what-... ; it explains how the process works.

In short, you can get metadata from pyproject.toml, but (a) it can still involve executing code due to PEP 517 hooks, and (b) a malicious package would use the legacy setup.py to get their code executed.


That's a super helpful diagram. Saved it for later in case I have to explain to someone else. Thank you. I can see why something like pipask would be helpful. I saw in another comment that you are looking to make a uv plugin. I'll be on the lookout for that getting released!

Pip is "a proper dependency resolver". It just perhaps doesn't have the best heuristics for performance, but they're working on that.

What pip isn't is a workflow tool for developers, or "project manager" (terminology uv uses in its marketing; "package manager" seems to be not well enough defined to argue about). Pip doesn't install Python, create or manage virtual environments (or real ones for the separately installed Python versions), upload your packages to PyPI, view a lockfile as a way to keep track of an environment (although they have just added experimental support for creating PEP 751 lockfiles and are planning support for installing from them), do one-off runs of Python applications (by installing them in an ephemeral environment, possibly installing PEP 723 inline-specified dependencies), define or manage "workspaces", have its own `[tool.pip]` section in pyproject.toml, or possibly other things I forgot.

But it absolutely does determine the dependencies of the packages you're currently asking to install, transitively, attempt to figure out a compatible set of versions, and figure out which actual build artifacts to use (i.e., "resolve dependencies"). Its logic for doing so has even been extracted and made available as the `resolvelib` package (https://pypi.org/project/resolvelib/).
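To make "resolve dependencies" concrete: given each package's declared requirements per version, a resolver searches for one version of every package such that all constraints are simultaneously satisfied, backtracking on conflicts. A toy sketch over an invented index (real resolvers like pip's, built on `resolvelib`, are far more sophisticated):

```python
# A made-up package index: package -> version -> {dependency: allowed versions}.
INDEX = {
    "app":  {1: {"lib": {1, 2}, "util": {2}}},
    "lib":  {1: {"util": {1}}, 2: {"util": {2}}},
    "util": {1: {}, 2: {}},
}

def resolve(needed, pinned=None):
    """Pick one version per package satisfying all constraints, backtracking."""
    pinned = dict(pinned or {})
    needed = dict(needed)
    if not needed:
        return pinned
    name, allowed = next(iter(needed.items()))
    rest = {k: v for k, v in needed.items() if k != name}
    if name in pinned:  # already chosen earlier: must remain compatible
        return resolve(rest, pinned) if pinned[name] in allowed else None
    for version in sorted(INDEX[name], reverse=True):  # prefer newest
        if version not in allowed:
            continue
        trial, conflict = dict(rest), False
        for dep, dep_allowed in INDEX[name][version].items():
            merged = dep_allowed & trial.get(dep, dep_allowed)
            if not merged or (dep in pinned and pinned[dep] not in dep_allowed):
                conflict = True  # this choice can't work; try an older version
                break
            trial[dep] = merged
        if conflict:
            continue
        result = resolve(trial, {**pinned, name: version})
        if result is not None:
            return result
    return None  # every candidate version failed: caller must backtrack

print(resolve({"app": {1}}))  # one mutually compatible version per package
```

Note how asking for `app` forces `util` to version 2, which in turn rules out `lib` 1; the search has to discover that rather than greedily taking the first match for each package.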

My own project, PAPER, is scoped to fix pip's problems and also do basic environment management (so as to install applications or do temporary runs). The point is to satisfy the needs of Python users, while not explicitly catering to developers. (I'll probably separately offer some useful developer scripts that leverage the functionality.)

I also, incidentally, intend to allow for this sort of prompt during the install procedure. (Although the planned default is to refuse sdists entirely.)


Do you have a link where I can learn more about PAPER?

Ah, I misread your OP - the only part I really planned to support is "this package requires building from source, and may run arbitrary code now (not just after installation) to do so; are you sure you want to proceed?". Although the other stuff certainly seems worthwhile - I think it would be easier to plug into my design than into pip. Especially since I'm explicitly creating an internal API first and wrapping a separate CLI package around that.

The project is still in quite early development stages, and not yet really usable for anything - I've been struggling to make myself sit down and implement even simple things, just personal issues. But the repository is at https://github.com/zahlman/paper and I am planning a Show HN when it seems appropriate. Hopefully the existing code at least gives an idea of the top-level design. I also described it a bit in https://news.ycombinator.com/item?id=43825508 .

I've also written a few posts on my blog (https://zahlman.github.io) about design problems with pip, and going forward I'll be continuing that, along with giving a basic overview of the PAPER design and explaining how it addresses the problems I've identified.


Isn't pip third-party? That makes your point stronger.

Yes and no.

Pip is nominally developed by separate people and isn't part of the standard library. However, it does ship with Python by default (Debian-based Linux distributions go out of their way to remove it), in the form of a wheel vendored within the standard library folders. The standard library module `ensurepip` installs that wheel by bootstrapping pip's own code from within it. This is also used indirectly by default when you create a new venv with the standard library `venv` module.

(The reason uv can create environments quickly is that it skips that part, while otherwise following nominally the same logic. You can get the same effect by passing `--without-pip` to the `python -m venv` invocation, and it's actually faster (on my machine at least) than using uv. However, you then need to understand how to use pip cross-environment (it wasn't designed for that from the start, but modern pip offers support that's only a little bit buggy). I discuss this on my blog in https://zahlman.github.io/posts/2025/01/07/python-packaging-... .)
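The trick described above looks roughly like this (Unix paths assumed; the `--python` option needs a reasonably recent pip):

```shell
# Create a venv without bootstrapping pip into it -- much faster, and
# essentially the step uv skips when it creates environments.
python3 -m venv --without-pip .venv

# The venv has a python but no pip of its own; installs are then done
# "cross-environment" with an external pip via its --python option
# (available since pip 22.3), e.g.:
#   pip install --python .venv/bin/python requests
.venv/bin/python -c 'import sys; print(sys.prefix)'
```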
