
Conversation

@mvandenburgh
Member

I think we should continue to pin abstract dependencies in certain scenarios. In the two cases in this PR, we are explicitly pinning the dependencies to specific versions for application-level reasons; including the pinned version in pyproject.toml makes this clearer.

Overall, I consider the relationship between pyproject.toml and uv.lock to be analogous to that between setup.py and requirements.txt; i.e., pyproject.toml should include abstract dependencies constrained as liberally as possible (ideally, not pinned at all), while uv.lock should include a concrete list of dependencies that is guaranteed to work and be reproducible.
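A rough sketch of that split, with purely illustrative package names and constraints (not the actual contents of this repo's files):

```toml
# pyproject.toml: abstract dependencies, constrained as loosely as possible
[project]
dependencies = [
    "django",                # no constraint needed
    "django-allauth>=0.60",  # lower bound only, illustrative
]
# uv.lock then records the exact resolved version of every package in the
# environment, which is what makes installs reproducible.
```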

In DANDI's case, the application code dictates that we support exactly one version of dandischema, and explicitly does not support versions of djangorestframework newer than a certain version. IMO this is the type of abstract information that should be present in pyproject.toml.

A more practical reason to make this change is that PRs like #2588 are hard to verify in review without this type of pinning.

I think we should continue to pin abstract dependencies where it makes sense. In these cases, we are explicitly pinning two dependencies to a specific version for application-level reasons.

We're not pinning `allauth` to an older version, so I don't see this comment as being helpful anymore with the introduction of `uv.lock`.
@mvandenburgh changed the title from "Include pinned versions in pyproject.toml" to "Include pinned versions in pyproject.toml when necessary" on Oct 10, 2025
Collaborator

@brianhelba brianhelba left a comment


Your rationale makes sense and I agree that this is a good idea.

Do you want to merge this first, then rebase #2588 on top?

@yarikoptic
Member

I think, overall, I am very much in line with you, @mvandenburgh! Moreover, I thought that dandi-archive already pinned dandi-schema very specifically, but you showed me that I was wrong. Hence:

> In DANDI's case, the application code dictates that we support exactly one version of dandischema,

I do not think the code dictates that at all. Moreover, we have upgrade paths so that dandi-cli using an older dandi-schema can upload, and we would migrate records to the newer schema on the fly, IIRC. Since it is a pydantic model, there indeed could be some compatibility concerns. Hence, if we properly used semver, we should have had something like `~= 0.11`, thus allowing any compatible version within the 0.x series past 0.11. But since we do not, then to still allow for bug fixes etc., I think stating `~= 0.11.1` would be sensible.

> and explicitly does not support versions of djangorestframework newer than a certain version.

Most likely there is also a minimal version we could state compatibility with? It might be worth specifying that too for completeness, and it could then also be `~= 3.14.0` (although there were no bugfix releases between it and 3.15!).
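For concreteness, a sketch of what those compatible-release specifiers would look like in pyproject.toml (the bounds are just the ones suggested above, not a recommendation):

```toml
[project]
dependencies = [
    # PEP 440 compatible-release operator:
    # ~= 0.11.1 permits 0.11.1, 0.11.2, ... but not 0.12.0
    "dandischema ~= 0.11.1",
    # ~= 3.14.0 permits 3.14.x bugfix releases but not 3.15.0
    "djangorestframework ~= 3.14.0",
]
```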

But since it was not pinned before, how have we been functioning, given that there were releases all the way up to 3.16.1? Let's continue on that issue.

@yarikoptic
Member

> Your rationale makes sense and I agree that this is a good idea.
>
> Do you want to merge this first, then rebase #2588 on top?

which version of djangorestframework are we using in deployment ATM?

@mvandenburgh self-assigned this on Oct 14, 2025
@mvandenburgh
Member Author

> In DANDI's case, the application code dictates that we support exactly one version of dandischema,

> I do not think the code dictates that at all. Moreover, we have upgrade paths so that dandi-cli using an older dandi-schema can upload, and we would migrate records to the newer schema on the fly, IIRC. Since it is a pydantic model, there indeed could be some compatibility concerns. Hence, if we properly used semver, we should have had something like `~= 0.11`, thus allowing any compatible version within the 0.x series past 0.11. But since we do not, then to still allow for bug fixes etc., I think stating `~= 0.11.1` would be sensible.

I was talking more in the context of dependency pinning: at any point in time, there is exactly one version of dandischema that we want installed with dandi-archive, since we are very intentional about coordinating the dandischema version between the Django app, dandi-cli, etc. As opposed to other dependencies, which we do want to pin for reproducibility, but where we don't care about which specific version we install in the same way we do for dandischema.

> which version of djangorestframework are we using in deployment ATM?

We're using 3.14.0 (see https://github.com/dandi/dandi-archive/blob/master/uv.lock#L1102-L1104)
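(For context on the uv.lock side: each resolved package gets a `[[package]]` table there; abridged, the entry looks roughly like this.)

```toml
[[package]]
name = "djangorestframework"
version = "3.14.0"
source = { registry = "https://pypi.org/simple" }
```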

@mvandenburgh added the `dependencies` label ("Update one or more dependencies version") on Oct 14, 2025
@mvandenburgh merged commit e62b397 into master on Oct 14, 2025
8 of 9 checks passed
@mvandenburgh deleted the pyproject-toml-pinning branch on October 14, 2025 16:16
@yarikoptic
Member

IMHO setup.* and pyproject.toml should declare compatible versions (ranges), not deployment-specific scenarios. That allows testing different deployments (e.g. current and prospective) without changes to those files, by merely installing the corresponding version (like we already do, I believe, in downstream testing for dandi-cli and dandischema). If you know that djangorestframework is not compatible with 3.15.0, then we should indeed limit it via `<3.15.0`, or otherwise it might need something like `~= 3.15.0` if the issue was fixed after that release. But again, we should not specify a specific version.

As you pointed out, you do use uv.lock, and that, along with requirements.txt etc., is the configuration for deployments through which we could/should sync. FWIW, we should actually strive to step away from such a tight dependency on the dandischema library version.

"dandi", # minimal version is also provided in API /info
# Pin dandischema to exact version to make explicit which schema version is being used
"dandischema", # schema version 0.6.10
"dandischema==0.11.1", # schema version 0.6.10
Member


Note: I have concerns with such a strict specification, which were not resolved. Let's hope it does not immediately affect downstream testing anywhere, etc.

Member Author


uv.lock is the source of truth for dependency installs, and dandischema is already set to 0.11.1 in there. So this shouldn't break anything in downstream projects.

I'm not super familiar with the release process or the way breaking changes and/or new features happen for dandischema, but generally we are very intentional about which version we're using. Take django-allauth, for example (just a random dependency; this applies to most of our other dependencies as well): that is a third-party dependency that we try to keep up to date for bug fixes, security fixes, etc., but we don't necessarily care about the changes in new releases that aren't applicable to dandi-archive. dandischema development, by contrast, is typically driven solely by dandi-archive's requirements (and vice versa), so we usually upgrade it intentionally to pull in some new feature or schema change that dandi-archive needs.

And when doing dependency upgrades (like in #2588), my thinking is that dandischema should be treated differently and not get lumped in with third-party dependencies. I suppose another way to look at it is that, while dandischema is a separate Python library, it is essentially "our code", as opposed to a third-party package like django-allauth.

Member


I agree with @mvandenburgh about pinning dandischema. This is not the general case of wanting to absorb new features, enhancements, and security fixes from upstream, as with a lot of third-party dependencies; here we really do want to declare that the software works with this exact version of dandischema, and we're accepting the penalty of manual work to upgrade that dependency when that becomes necessary. (This is just another way of stating what Mike wrote above, that dandischema is "our code"; it's maintained as a separate package, rather than inlined directly into the dandi-archive codebase, only because other components in the ecosystem need to use it.)

Put one more way: indeed, we want pyproject.toml to include acceptable ranges; for dandischema, the acceptable range is precisely one version.

Member

@yarikoptic, Oct 24, 2025


> here we really do want to declare that the software works with this exact version of dandischema

That's exactly my point: it actually works with many other versions of dandischema. As you both describe above, it is just that our deployment ATM requires a specific one. But, for instance, development does not have that strict requirement (e.g., we should be testing against another, development, version). Hence my comment that stating our strict deployment requirement (fine to have in uv.lock) as a compatibility ("what works") requirement (what we state in pyproject.toml or setup.*) might have side effects, e.g. resolvers complaining about or refusing to use any other version while developing/testing, etc.

@mvandenburgh
Member Author

> If you know that djangorestframework is not compatible with 3.15.0, then we should indeed limit it via `<3.15.0`, or otherwise it might need something like `~= 3.15.0` if the issue was fixed after that release. But again, we should not specify a specific version.

Agreed. djangorestframework is incompatible not just at 3.15.0, but at all versions > 3.15.0 as well, due to some breaking changes (see #2589).

@dandibot
Member

🚀 PR was released in v0.16.3 🚀

@dandibot added the `released` label ("This issue/pull request has been released.") on Oct 23, 2025