Template Configuration¶
I will explain how the continuous integration is configured and, consequently, how you can configure those same services for your repository.
Python Project Layout¶
The file structure for this Python project follows the src, tests, docs, and devtools folders layout.
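The layout above can be sketched as the following tree (a sketch; your project name replaces sampleproject):

```
.
├── src/
│   └── sampleproject/
├── tests/
├── docs/
└── devtools/
```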
The src layout¶
I discovered that storing the project’s source code underneath a src directory layer, instead of directly in the project’s root folder, is one of the most controversial topics in the organization of Python projects. Here, I adopted the src-based layout discussed by ionel in his blog post.
After more than one year of using src, I have nothing to complain about; on the contrary, it has saved me from many issues related to import statements. Should you import from the repository or from the installed version? The src layout guarantees stable imports. More on ionel’s blog (see also the src-nosrc example).
The src/sampleproject folder hosts the actual source of the project. In the current version of this template, I don’t discuss how to organize the source code of a project. I look forward to doing that in future versions.
Testing¶
Here, tests are encapsulated in a separate tests folder. With this encapsulation, outside the main library folder, it is easier to ensure that tests do not import from relative paths and can only access the library code after library installation (regardless of the installation mode). Also, having tests in a separate folder simplifies the configuration files for excluding tests from deployment (MANIFEST.in), code quality (.codacy.yaml), and coverage (.coveragerc).
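The import behavior the separate tests folder enforces can be illustrated with a small check (a sketch; "sampleproject" is the package name used in this template, and is_installed is a hypothetical helper):

```python
import importlib.util

def is_installed(name):
    """Return True if `name` resolves to an installed (or stdlib) package."""
    return importlib.util.find_spec(name) is not None

# With the src layout, the project package only resolves after installation,
# so tests always exercise the installed code rather than the working tree.
print(is_installed("sampleproject"))
```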
Documentation¶
All documentation-related files are stored in a docs folder. Files in docs are compiled with Sphinx to generate the HTML documentation web pages. You can follow the file and folder structure in this repository as an example of how to assemble the documentation. You will see files that contain text and others that import text from relative files; the latter strategy avoids repeating information.
devtools¶
The devtools folder hosts the files related to development. In this case, I used the idea explained by the Chodera Lab in their structuring your project guidelines.
Continuous Integration¶
Here, I give an overview of the implementation strategies for the different continuous integration and quality report platforms. In previous versions of this skeleton template, I used Travis-CI and AppVeyor-CI to run the testing workflows. In the current version, I have migrated all these operations to GitHub Actions.
Does this mean you should not use Travis or AppVeyor? Of course not. Simply, the needs of my projects and the time I have available to dedicate to CI configuration do not require segregating strategies across multiple servers and services, and I can perfectly accommodate my needs in GitHub Actions.
Are GitHub Actions better than Travis or AppVeyor? I am not qualified to answer that question.
When using this repository, keep in mind that you don’t need to use all the services adopted here. You can drop or add any service at will by removing the files or lines related to them, or by adding new ones following the patterns presented here.
The following list summarizes the platforms adopted:
- Building and testing: GitHub Actions
- Quality control: Codacy and Code Climate
- Test coverage: Codecov
- Documentation: Read the Docs
I acknowledge the existence of many other platforms for the same purposes as those listed. I chose these because they fit the size and scope of my projects and are also the most used within my field of work.
Configuring GitHub Actions¶
You may wish to read about GitHub Actions. Here, I have one Action workflow per environment defined in tox. Each of these workflows runs on each of the supported Python versions and operating systems. These workflows cover unit tests, documentation build, lint, integrity checks, and, finally, version bump and package deployment on PyPI.
The CI workflows trigger every time a new pull request is created and at each commit to that PR. However, the lint tests do not trigger when the PR is merged to the main branch. On the other hand, the version bump workflow triggers only when the PR is accepted.
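The trigger rules described above translate, in GitHub Actions syntax, into fragments along these lines (file names and branch names are illustrative; the workflows in this repository are the reference):

```yaml
# top of a testing/lint workflow: runs on PR creation and on each PR commit
on:
  pull_request:
    branches: [main]

# a deployment workflow instead triggers on the merge commit itself:
# on:
#   push:
#     branches: [main]
```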
In this repository you can find two demonstration PRs: one where tests pass and another where tests fail.
Version release¶
Every time a Pull Request is merged to the main branch, the deployment workflow triggers. This action bumps the new version number according to the merge commit message, creates a new GitHub tag for that commit, and publishes the new software version on PyPI.
As discussed in another section, here I follow the rules of Semantic Versioning 2.
If the Pull Request merge commit starts with [MAJOR], a major version increase takes place (pay attention to the rules of SV2!); if a PR merge commit message starts with [FEATURE], it triggers a minor update. Finally, if the commit message has no special tag, a patch update is triggered. Whether to trigger a major, minor, or patch update is mainly the concern of the main repository maintainer.
This whole workflow can be deactivated if the commit to the main branch starts with [SKIP].
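The tag convention above can be summarized as a small mapping function (a sketch of the convention, not the template’s actual workflow code):

```python
def bump_part(merge_message):
    """Map a merge-commit message to a bump2version part, following the
    commit-tag convention described above."""
    if merge_message.startswith("[SKIP]"):
        return None            # deployment workflow is skipped entirely
    if merge_message.startswith("[MAJOR]"):
        return "major"         # breaking changes (mind the SemVer 2 rules!)
    if merge_message.startswith("[FEATURE]"):
        return "minor"         # backwards-compatible feature
    return "patch"             # no special tag: patch release
```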
In conclusion, every commit to main without the [SKIP] tag will be followed by a version upgrade, a new tag, a new commit to main, and a consequent release to PyPI. You have a visual representation of the commit workflow in the Network plot.
How are version numbers managed?
There are two main version string handlers out there: bump2version and versioneer. I chose bump2version for this repository template. Why? I have no argument against versioneer or others; I simply found bump2version to be so simple, effective, and configurable that I could only adopt it. Congratulations to both projects nonetheless.
Code coverage¶
Codecov is used very frequently to report test coverage rates. Activate your repository under https://about.codecov.io/, and follow their instructions.
Coverage reports are sent to Codecov servers when the test.yml workflow runs. This happens for each PR and each merge commit to main.
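A typical upload step in a workflow looks something like the following (the action version and report path are assumptions; the test.yml in this repository is the reference):

```yaml
# fragment of a workflow job (illustrative)
- name: Upload coverage to Codecov
  uses: codecov/codecov-action@v3
  with:
    files: ./coverage.xml
```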
The .coveragerc file, mirrored below, configures the Coverage reports.
[paths]
source =
    src
    */site-packages

[run]
branch = true
source =
    sampleproject
parallel = true

[report]
show_missing = true
precision = 2
omit = *migrations*
exclude_lines =
    if __name__ == .__main__.:
The .coveragerc can be expanded to further restrict the coverage analysis, for example by adding these lines to the exclude_lines option:
[report]
exclude_lines =
    if self.debug:
    pragma: no cover
    raise NotImplementedError
    if __name__ == .__main__.:
Remember that if you don’t want to use these services, you can simply remove the respective files from your project.
Code Quality¶
Here, we have both Codacy and Code Climate as code quality inspectors. There are also others out there; feel free to suggest different ones in the Discussion tab.
Codacy¶
There is not much to configure for Codacy as far as this template is concerned. The only setup provided is to exclude the analysis of test scripts; this configuration is provided by the .codacy.yaml file at the root directory of the repository. If you wish Codacy to perform quality analysis on your test scripts, just remove the file or comment out the line. Here we mirror the .codacy.yaml file:
---
exclude_paths:
- 'tests/**'
Code Climate¶
There is not much to configure for Code Climate either. The only setup provided here is to exclude the analysis of test scripts and other dev files that Code Climate checks by default; the .codeclimate.yml file at the root directory of the repository configures this behavior (look at the bottom lines). If you wish Code Climate to perform quality analysis on your test scripts, just remove the file or comment out the lines.
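The exclusion mentioned above uses Code Climate’s exclude_patterns key; a minimal fragment could look like this (the exact paths here are illustrative, not a verbatim copy of this repository’s file):

```yaml
# .codeclimate.yml (fragment)
exclude_patterns:
  - "tests/"
  - "devtools/"
```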
Code Climate provides a technical debt percentage that can be retrieved nicely with Badges.
Read the Docs¶
Activate your project at the Read the Docs platform (RtD); their web interface is easy enough to follow without further explanations. If your documentation builds under the tox workflow, it should also build at Read the Docs.
Docs Requirements¶
Requirements to build the documentation page are listed in devtools/docs_requirements.txt:
sphinx>=2.2
sphinx-py3doc-enhanced-theme
sphinx-argparse
CommonMark
mock
Here we use Sphinx as the documentation builder and sphinx-py3doc-enhanced-theme as the theme, though you can use many different theme flavors; see Sphinx Themes.
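In docs/conf.py, enabling the theme follows the pattern documented by the theme itself (a sketch; this repository’s conf.py is the reference for the exact settings):

```python
# docs/conf.py (fragment, following the theme's documented usage)
import sphinx_py3doc_enhanced_theme

html_theme = "sphinx_py3doc_enhanced_theme"
html_theme_path = [sphinx_py3doc_enhanced_theme.get_html_theme_path()]
```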
Build version¶
By default, RtD has two main documentation versions (also called builds): the latest and the stable. The latest points to the master branch, while the stable points to the latest GitHub tag. However, as we have discussed in The rationale behind the project section, here we keep only the latest version (that of the master branch) and other versions for the different releases of interest.
Google Analytics¶
Read the Docs allows straightforward implementation of Google Analytics tracking in the project documentation; just follow their instructions.
Local Build¶
The [testenv:docs] environment in the tox.ini file simulates the RtD execution. If that test passes, RtD should pass as well.
To build a local version of the documentation, go to the main repository folder and run:
tox -e docs
The documentation is at dist/docs/index.html. The tox run also reports on inconsistencies and errors. If there are inconsistencies or errors in the documentation build, the PR won’t pass the CI tests.
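A [testenv:docs] environment along these lines would produce that output layout (a sketch under assumed paths; the actual tox.ini in this repository is the reference):

```ini
[testenv:docs]
usedevelop = true
deps = -r{toxinidir}/devtools/docs_requirements.txt
commands =
    sphinx-build {posargs:-E} -b html docs dist/docs
```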
Badges¶
Badges point to the current status of the different Continuous Integration tools in the master branch. You can also configure badges to report on other branches, though. Are tests passing? Is the package building properly? Is the documentation building properly? What is the quality of the code? Red lights mean something is wrong and you should attend to it shortly!
Each platform provides its own badges, and you can modify and emulate the badge strategy in the README.rst file. You can further tune the badge style by creating your own personalized badge with Shields.io.
You may have noticed already that there is no badge for PyPI. That is because “python-project-skeleton” is deployed at test.pypi.org. You will also find at Shields.io how to add the PyPI badge.
Configuration Files¶
MANIFEST.in¶
The MANIFEST.in file configures which files in the repository/folder are grabbed or ignored when assembling the distributed package. You can see that I package the src, the docs, other *.rst files, the LICENSE, and nothing more. All configuration files, tests, and other Python-related or IDE-related files are excluded.
There is a debate on whether tests should be deployed along with the library source. Should they? Tox and the CI integration guarantee that tests run on installed versions of the software. So, I am the kind of person who considers there is no need to deploy tests alongside the source. Users aren’t going to run tests; developers will.
Let me know if you think differently.
It is actually easy to work with the MANIFEST.in file. Feel free to add or remove files as your project needs.
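As an illustration of the rules described above, a MANIFEST.in along these lines packages the source and docs while pruning everything else (a sketch, not a verbatim copy of this repository’s file):

```
graft src
graft docs
include *.rst
include LICENSE
prune tests
prune devtools
```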
tox.ini¶
The tox configuration file might be the trickiest one to learn and operate until you are familiar with tox’s workflows. Read all about tox in their documentation pages. The tox.ini file contains several comments explaining the implementations I adopted.
bumpversion.cfg¶
The .bumpversion.cfg configuration is heavily related to the GitHub Actions workflows for automated packaging and deployment and to the CONTRIBUTING.rst instructions. Especially, the use of commit and tag messages, and the trick with CHANGELOG.rst.
I have also used bumpversion in other projects to update the version on some badges.
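For example, bump2version can rewrite a version string embedded in a badge through a per-file section like the following (a fragment; the version number and search/replace strings are illustrative):

```ini
[bumpversion]
current_version = 0.1.0

[bumpversion:file:README.rst]
search = v{current_version}
replace = v{new_version}
```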