Modernise build and install tooling
This revamps the build and install tooling to satisfy the PEP 517 standard.

The direction of travel in the Python world is to use `setup.cfg` to fully separate the build and runtime dependencies and requirements of a project. This defines a standard interface that a wider range of package managers than `pip` can use (i.e. without requiring explicit support from package developers), and it makes requirements declarative, which avoids having to execute a `setup.py` file just to figure out how to build a project (the old "chicken and egg" problem in package building - previously solved in a hacky way with `setup_requires`).
PEP 517 defines a standard interface for frontend tools to build projects, and recommends that builds normally happen in an isolated, temporary environment before the result is moved to its final installation location. This has many benefits: it decouples much of the setup procedure from the build tool used, avoids needing build dependencies installed in the user's environment, and guarantees that the build tooling doesn't rely on unlisted implicit dependencies (unless they're system packages) - if a build works on the developer's system, it should work on other systems (as long as system dependencies are met, of course).
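For concreteness, the PEP 517/518 machinery is driven by a `pyproject.toml` along these lines (a sketch - the requirement names here are illustrative, not necessarily the project's actual ones):

```toml
# Read by frontends such as pip: they create an isolated build
# environment, install these requirements into it, then invoke the
# named backend's PEP 517 hooks to build the project.
[build-system]
requires = [
    "setuptools",
    "wheel",
    "cython",   # needed at build time for the extensions (illustrative)
    "numpy",    # ditto, for the extension headers (illustrative)
]
build-backend = "setuptools.build_meta"
```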
It's optional whether runtime and extra (e.g. development tool) dependencies are defined in `setup.py` or `setup.cfg`. The latter has the advantage of, like `pyproject.toml`, being declarative, avoiding the temptation to write Python code that dynamically sets requirements depending on external factors. It also makes it easier for our CI to figure out what to install for building, testing and documenting, without having to duplicate the requirements in `requirements*.txt` files. However, for us it's not easy to move everything to `setup.cfg`, because we also build Cython extensions with various flags. In principle Cython can work with `setup.cfg` too by installing a build dependency such as `extension-helpers`, but when I tried I ran into issues, possibly stemming from our use of a `/src` subdirectory. There may be an upstream bug to report. For now, this MR keeps part of the old `setup.py` to handle the Cython bits, and moves everything that can be moved to `setup.cfg`.
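As a sketch of that split (the names and values here are illustrative, not the project's actual metadata), the declarative side of `setup.cfg` can look like this, with the residual `setup.py` retaining only the Cython `ext_modules` handling:

```ini
; setup.cfg - declarative metadata and requirements (illustrative)
[metadata]
name = finesse

[options]
package_dir =
    = src
packages = find:
install_requires =
    numpy

[options.packages.find]
where = src

[options.extras_require]
test =
    pytest
```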
Originally we had pinned the versions in this file and left the dependencies in `setup.py` unpinned, as described in #58 (closed). Revisiting the reasons for this, I couldn't find any real justification for the approach, and it appears to be an uncommon one. It also made it difficult to produce a wheel with pinned dependencies, because `pip` normally reads the dependencies and versions from `setup.py`, where we were defining our unpinned dependencies. I looked but couldn't find a way to tell `pip` to use a different list of dependencies to those defined in `setup.cfg`, which makes me think the way we were doing it is "wrong". Instead I pinned the requirements in `setup.cfg` for the sake of the wheels, which has the side effect of pinning them for developers installing with `pip install -e .` too. Anyone wishing to test updated versions of the dependencies now has to install them manually, and we should periodically bump the pins to later versions. I left the development dependencies unpinned, so we should always be using the latest versions of those. This is good because we can easily catch issues with new versions of packages such as `sphinx` as they arise.
- Dependencies are now defined in `setup.cfg` (runtime and extras).
- The `requirements-build.txt` files, which duplicated the requirements in `setup.py`, have been removed. The CI jobs that previously used these to install test/doc/build dependencies now get them by parsing `setup.cfg` with a small script (copied from GWpy).
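For illustration, a helper along those lines might look roughly as follows. This is a hypothetical sketch, not the actual script copied from GWpy; the function name and layout are my own:

```python
"""Sketch of a helper that extracts requirements from setup.cfg so CI
jobs can install them without duplicating them in requirements files.
Hypothetical example, not the actual GWpy-derived script."""
import configparser


def read_requirements(path, extras=()):
    """Return install_requires plus any requested extras groups."""
    parser = configparser.ConfigParser()
    parser.read(path)
    # Multi-line option values come back newline-separated; drop blanks.
    requirements = [
        line.strip()
        for line in parser.get("options", "install_requires").splitlines()
        if line.strip()
    ]
    for extra in extras:
        requirements += [
            line.strip()
            for line in parser.get("options.extras_require", extra).splitlines()
            if line.strip()
        ]
    return requirements
```

A CI job could then feed the output of such a script to `pip install` instead of maintaining a parallel `requirements-build.txt`.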
- The "dev" `extras` have been split into separate "test", "docs", "lint" and "inplacebuild" groups (the first two for the benefit of the CI). There doesn't seem to be a way to tell pip to "install all extras", though, so to install everything we just have to ask for each group explicitly: `pip install -e .[test,docs,lint,inplacebuild]`. The "inplacebuild" group allows `make` to work now that we're using PEP 517 compatible builds.
- The conda environment files (e.g. `environment-win.yml`) now run `pip install -e .[test,docs,lint,inplacebuild]` after installing the conda dependencies, which means Finesse is fully set up and ready to run after a single `conda env create -f environment.yml -n finesse3` (no need to run `pip install -e .[test,docs,lint,inplacebuild]` or anything else afterwards).
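To make the intended flow concrete, such an environment file can be sketched like this (the package list is illustrative, not the actual file contents):

```yaml
# Illustrative conda environment: conda provides the compiler and binary
# dependencies, then pip installs Finesse itself with all extras.
name: finesse3
channels:
  - conda-forge
dependencies:
  - python
  - c-compiler   # compiler metapackage from conda-forge
  - suitesparse
  - pip
  - pip:
      - -e .[test,docs,lint,inplacebuild]
```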
- The conda environments now grab their packages from `conda-forge`, since the `c-compiler` metapackage is only available there. Plus, Duncan put to rest our fears about switching to `conda-forge`.
- Matched the `setup.cfg` NumPy versions such that we solve (I think) the binary incompatibility warnings we sometimes get when running Finesse.
- Got rid of the duplicated dependency pins in `environment.yml`, again for binary incompatibility and single source of truth reasons.
- Left the build dependencies in `pyproject.toml` unpinned so we always use the latest version and catch bugs that arise.
- Removed the need for the `xindy` dependency when building the Sphinx PDF docs now that we use Sphinx 4 (it was only a requirement on 3.5 and earlier).
- Like the Windows conda environment YAML file, the Linux one now installs the C compiler provided by the local conda installation, which is recommended practice. It also means the CI job that tests the conda installation no longer needs to install a compiler itself.
- The `Makefile` now detects the number of CPUs to build with.
- Updated docs. As discussed in the telecon on 2021-06-03, removed all information about compiling Finesse from source from the user-facing side and merged it into the developer docs. The developer docs now describe how to install Finesse either in a conda (development) environment or just using pip.
Can we get rid of the separate Windows conda environment file? Using `conda-forge` we can in principle install `suitesparse` and a C compiler with the same set of packages as on Linux/OSX.
As a test of the new `make` behaviour, I timed builds with 1-12 threads. It seems building Finesse is entirely CPU bound right now, so the new behaviour in the `Makefile` should keep using as many threads as there are CPUs.
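One common way to implement that behaviour in a `Makefile` is sketched below; the actual Makefile may well differ, and the fallback chain and target name here are assumptions:

```make
# Detect the CPU count: nproc on Linux, sysctl on macOS, falling back
# to 1 if neither tool is available.
NPROC ?= $(shell nproc 2>/dev/null || sysctl -n hw.ncpu 2>/dev/null || echo 1)

inplace:
	python setup.py build_ext --inplace --parallel $(NPROC)
```

The `--parallel` (`-j`) option is setuptools' own flag for building extension modules with multiple processes, so the thread count only needs to be computed in one place.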