---
layout: single
title: "Why We Choose What We Choose"
excerpt: "In selecting one way to package Python, pyOpenSci chooses between a lot of code tools. These choices often come after months of exploration and debate. Find out what motivates us to make the decisions that we do."
author: "Jeremiah Paige"
permalink: /blog/pyopensci-why-we-choose.html
categories:
  - blog-post
  - community
classes: wide
toc: true
comments: true
last_modified: 2025-10-02
---

A primary focus at pyOpenSci, one of our petals of support, is selecting packaging tools that work well for our users and work well together. We use our curated selection of tooling in our packaging guide, in our online tutorials, and in our trainings. We don’t require the use of any of our selected tools to submit a package to our peer review program, but we will suggest them if package authors ask for help cleaning up or adopting new workflows.

The members of pyOpenSci spend a great deal of time selecting these tools: debating tradeoffs, test-driving them in new situations, watching new community trends, and listening to feedback from events we run. But until now we haven't typically shared in the open how or why we came to the decisions we did. This post outlines, in loose terms, the rubric we use when selecting a project that we want to recommend. We focus on a beginner-friendly happy path for packaging workflows: in other words, what works well for new projects that have few, if any, unconventional requirements for sharing their code.

Our rubric draws on five categories, listed in approximate order of importance:

## Tools That Are Free and Open

We love open software! It's kind of in our name. We are always looking to nurture and support open source software, even beyond packaging projects. It should come as no surprise, then, that we only choose tools for our packaging guide that are themselves open source.

We don’t just appreciate open source, though; we also look for projects that are open contribution: projects that do most of their maintenance, stewardship, and design in public. This means that there is a public bug tracker, that new issues to that bug tracker are accepted from anyone, and that fixes for those bugs are accepted from non-maintainers. It may also mean that new feature ideas are accepted from the community, or even given a period of public comment.

Our commitment to open software goes beyond projects that choose to host their code and bug trackers publicly. We also value Free Software, both as in beer and as in freedom. Permissive open source software empowers its users to take control of their tools and to fix, extend, secure, and adapt code for the purposes that best fit their own needs. Choosing projects that do not require a financial exchange in order to be used ensures that we can recommend our choices to anyone, no matter their situation or location.

## Tools That Are Inclusive

Inclusivity is very important to us; it is a critical component of the tooling projects we select. Programming, including the packaging of software, is a skill that should be available to everyone.

The tools we advocate for should not limit their use through actions such as: adopting a restrictive license; providing poor, missing, obfuscated, or expert-only documentation; charging for use of the tool or any process required for its successful use; or failing to support mainstream operating systems. The projects behind the tools should welcome constructive bug reports from users of all levels, and ideally also welcome contributions from all their users.

There are some signals we look for to tell whether a project is inclusive. We want all of our recommended tools to have a code of conduct. The project should also have a contributing guide that is easy to find. Labeling issues or running sprints aimed at fostering commits from new contributors is also a great indicator.

Projects that manage to attract and retain a large collection of contributors (not only to code, but to documentation if it is separate, to engagement with the bug tracker, to external write-ups and tutorials, and so forth) will be viewed much more positively.

## Tools That Implement Open Standards

It is very important to us that the tools and processes we stand behind support the full set of community standards.

For Python, this typically means conforming to [PEPs](https://peps.python.org), but it may also involve other community standards such as [SPECs](https://scientific-python.org/specs/).

Supporting community standards demonstrates that a project respects the community it works within and is serious about interoperability with other tools and processes. When done right, these tools empower their users to move their workflow, or any of its inputs or outputs, to another standards-compliant tool or process with little to no friction. It also makes the tools easier to teach, since many of the concepts, and in some cases entire parts of a project's data, are tool-agnostic and can feel "familiar" even to those who have never used the tool before. More documentation is also likely to develop through forums, blogs, workshops, and other online platforms, because the material applies to more than one tool.
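
As a concrete example of that kind of interoperability, much of a package's metadata can now live in a standardized `[project]` table in `pyproject.toml` (defined by PEP 621), which any standards-compliant build tool can read. The package name and details below are purely illustrative:

```toml
# pyproject.toml -- standardized project metadata (PEP 621).
# Because this table is defined by a community standard rather than by
# any single tool, switching to another standards-compliant build
# backend does not require rewriting it.
# All values here are hypothetical, for illustration only.
[project]
name = "example-science-pkg"
version = "0.1.0"
description = "A hypothetical package illustrating standard metadata."
readme = "README.md"
requires-python = ">=3.10"
dependencies = ["numpy"]
```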

It can be a lot of work for tool maintainers to keep up to date with changes in standardization, especially in a community as large and eclectic as Python's. While we understand that it can take time to implement new features imposed from outside a project, we also know that a selectively implemented standard is often worse than no standard.

## Tools That Are Well Supported

We would like to recommend only projects that we can confidently say are healthy, correct, and here to stay. A well-maintained project is a somewhat subjective metric that is hard to pin down, but whenever possible we apply the [same standard](https://www.pyopensci.org/software-peer-review/how-to/author-guide.html#does-your-package-meet-packaging-requirements) used in our peer review of scientific software.

Authors and maintainers should respond to open issues and continue to make fixes to the project. We do not have any expectation or metric for response time, open bug count, time to close pull requests, security artifacts, or any other level of effort; only that the project is alive and healthy to a degree appropriate for its function. We also strongly prefer projects that have a team of core maintainers rather than an individual maintainer.

## Tools That Reduce User Choices

Python packaging suffers, perhaps infamously, from [Too Many Options](https://www.pyopensci.org/blog/python-packaging-friends-dont-let-friends-package-alone.html#just-say-no-to-tmo). We would like to make as many choices as we can on behalf of the learner. Better yet are choices that eliminate further choices later in the process; this helps stop runaway analysis paralysis.

This means that these tools should implement sane defaults for every configurable value. As with Python itself, they should make the simple easy and the difficult possible.
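
A sketch of what sane defaults look like in practice: a minimal `pyproject.toml` can declare little more than a build backend and let the tool infer the rest (package discovery, included files) from conventions. The backend named below is only one example of a standards-compliant choice, and the project details are hypothetical:

```toml
# pyproject.toml -- a minimal configuration that leans on defaults.
# The backend shown (hatchling) is one example; any PEP 517-compliant
# backend could fill this role. Everything not specified here falls
# back to the tool's sane defaults.
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project]
name = "example-science-pkg"  # hypothetical package name
version = "0.1.0"
```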

It also means that we will generally select one tool when two or more could do the same job. So long as a tool fits our other criteria, it doesn't have to "win" at every single task it is capable of when compared against a plethora of other tools.