Merged
22 commits
21dff06
Start to add onto collecting metrics & add files for reporting metrics
ArBridgeman Jul 29, 2025
007eca3
Move and try to reformat/reduce metrics.rst to project_report.rst
ArBridgeman Jul 29, 2025
0cb855f
Remove empty nox.rst and nothing-to-see-here.png as no longer used
ArBridgeman Jul 29, 2025
dacca29
Move sonar configuration to sonar file
ArBridgeman Jul 29, 2025
605d56e
Fix links
ArBridgeman Jul 29, 2025
756b0a7
Merge branch 'main' into documentation/510_consolidate_metrics_and_sonar
ArBridgeman Jul 29, 2025
a9ed17c
update text to be more direct with PTB as actor per review
ArBridgeman Jul 30, 2025
e94fc8d
reduce words & use proper nouns to make intent clearer per review
ArBridgeman Jul 30, 2025
f0a23a6
be more explicit that GitHub repository, not just project, per review
ArBridgeman Jul 30, 2025
ff93440
be more explicit that GitHub repository, not just project, per review
ArBridgeman Jul 30, 2025
99d8482
Switch one .lint.json to .lint.txt
ArBridgeman Jul 30, 2025
c53fc4b
Fix underline issue from review comments
ArBridgeman Jul 30, 2025
8810c34
Switch to paragraph form for instructions so clearer who is actor & w…
ArBridgeman Jul 30, 2025
97d1177
Reduce duplicated text. Most details should be on individual pages & …
ArBridgeman Jul 30, 2025
c7d8597
Switch from direct to generated metrics
ArBridgeman Jul 30, 2025
3b845e2
Switch text to be a bit shorter & still accurate
ArBridgeman Jul 30, 2025
b303973
Remove length text and use passive to avoid other issues
ArBridgeman Jul 30, 2025
17cea71
Switch to passive text
ArBridgeman Jul 30, 2025
d1cf11e
Switch we with fact statement with are
ArBridgeman Jul 30, 2025
f31c76b
Alter text to be clearer what we do per review
ArBridgeman Jul 30, 2025
521fc44
Fix typo
ArBridgeman Jul 30, 2025
0fbc465
Apply changes per review
ArBridgeman Jul 30, 2025
Binary file removed doc/_static/metrics-artifact.png
Binary file not shown.
Binary file removed doc/_static/metrics-workflow-summary.png
Binary file not shown.
Binary file removed doc/_static/nothing-to-see-here.png
Binary file not shown.
4 changes: 4 additions & 0 deletions doc/changes/unreleased.md
@@ -1 +1,5 @@
# Unreleased

## Documentation

* #510: Consolidated information of metrics & updated to include more about Sonar
3 changes: 2 additions & 1 deletion doc/conf.py
@@ -88,5 +88,6 @@
linkcheck_allowed_redirects = {
# All HTTP redirections from the source URI to
# the canonical URI will be treated as "working".
r"https://github\.com/.*": r"https://github\.com/login*"
r"https://github\.com/.*": r"https://github\.com/login*",
r"https://sonarcloud\.io.*": r"https://www.sonarsource\.com/products/sonarcloud/.*",
}
1 change: 0 additions & 1 deletion doc/developer_guide/modules/modules.rst
@@ -5,5 +5,4 @@ Modules
:maxdepth: 2

sphinx/sphinx
nox
nox_tasks
10 changes: 0 additions & 10 deletions doc/developer_guide/modules/nox.rst

This file was deleted.

9 changes: 1 addition & 8 deletions doc/index.rst
@@ -37,17 +37,11 @@ Documentation of the Exasol-Toolbox

Document outlining the architectural and design principles and decisions in this project.

.. grid-item-card:: :octicon:`graph` Metrics
:link: metrics
:link-type: ref

Details on metrics collection and support related to and supported by the Python toolbox.

.. grid-item-card:: :octicon:`question` FAQ
:link: faq_toolbox
:link-type: ref

Frequently asked questsions.
Frequently asked questions.


.. toctree::
@@ -58,6 +52,5 @@ Documentation of the Exasol-Toolbox
developer_guide/developer_guide
tools
github_actions/github_actions
metrics
faq
changes/changelog
123 changes: 0 additions & 123 deletions doc/metrics.rst

This file was deleted.

37 changes: 0 additions & 37 deletions doc/user_guide/features/collecting_metrics.rst

This file was deleted.

2 changes: 1 addition & 1 deletion doc/user_guide/features/index.rst
@@ -6,7 +6,7 @@ Features
.. toctree::
:maxdepth: 2

collecting_metrics
metrics/collecting_metrics
creating_a_release

Uniform Project Layout
90 changes: 90 additions & 0 deletions doc/user_guide/features/metrics/collecting_metrics.rst
@@ -0,0 +1,90 @@
Collecting metrics
==================

.. toctree::
:maxdepth: 2

project_report
sonar

.. _direct_metrics:

Direct metrics
++++++++++++++

The PTB allows you to collect various metrics on the quality of your project
regarding Coverage, Security, and Static Code Analysis.

For each metric, there is a dedicated nox session that generates one or more
files and is based on an external Python tool (an invocation sketch follows the
table below).

+------------------------------------+-----------------------------+--------------+
| Nox session | Generated files | Based on |
+====================================+=============================+==============+
| ``lint:code`` | ``lint.txt``, ``lint.json`` | ``pylint`` |
+------------------------------------+-----------------------------+--------------+
| ``lint:security`` | ``.security.json`` | ``bandit`` |
+------------------------------------+-----------------------------+--------------+
| ``test:unit -- --coverage`` | ``.coverage`` | ``coverage`` |
+------------------------------------+-----------------------------+--------------+
| ``test:integration -- --coverage`` | ``.coverage`` | ``coverage`` |
+------------------------------------+-----------------------------+--------------+
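
These sessions are normally run via the ``nox`` command line. As a minimal,
illustrative sketch (assuming ``nox`` is installed and the commands are run from
the project root), they could also be scripted from Python:

.. code-block:: python

    import shlex
    import subprocess

    # Illustrative only: the session names come from the table above; running
    # ``nox -s <session>`` directly from a shell is the usual approach.
    for session in ("lint:code", "lint:security", "test:unit -- --coverage"):
        subprocess.run(["nox", "-s", *shlex.split(session)], check=True)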

These metrics are computed for each point in your build matrix, e.g. for each
Python version defined in the file ``noxconfig.py``:

.. code-block:: python

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Config:
        python_versions = ["3.9", "3.10", "3.11", "3.12", "3.13"]

The GitHub workflows of your project can:

* Use a build matrix, e.g. using different Python versions as shown above
* Define multiple test sessions, e.g. for distinguishing fast vs. slow or expensive tests.


Reporting metrics
+++++++++++++++++

Currently, the PTB offers two methods by which the :ref:`direct_metrics`
are summarized into reports:

#. the nox session ``project:report``
This summarization tool generates either a JSON or a Markdown summary that
provides an overall coverage percentage, a maintainability grade, and a security
grade. The generated JSON follows a code-agnostic format that can be used for
code quality analysis across Exasol. The Markdown summary can be displayed in
the GitHub Actions summary of a given CI run. For more information, see :ref:`project_report`.

#. SonarQube analysis
This summarization tool feeds into a feature-rich UI provided by
`Sonar <https://docs.sonarsource.com/sonarqube-server/latest/>`__. The total coverage
is calculated as a percentage and visually displayed per line on the altered code.
Security & linting issues are displayed in a list with links to the code and textual
descriptions of why they should be fixed and how to fix them. Additionally,
this information can be paired per project with a bot, which reports the code
quality analysis results within a PR. For further details, see :ref:`sonarqube_analysis`.

Both of these reporting options require that the files generated by the :ref:`direct_metrics`
exist and are in the expected formats. As metrics are collected for multiple Python
versions, the metrics associated with the Python version listed first in the
``python_versions`` attribute of the ``Config`` class are passed to the reporting tools.
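
As a small sketch (assuming the ``Config`` class shown earlier lives in your
project's ``noxconfig.py``), the version whose metrics are reported is simply the
first list entry:

.. code-block:: python

    from noxconfig import Config  # assumption: your project's noxconfig.py

    # Illustrative only: the reporting tools consume the metrics produced for
    # the first entry of ``python_versions``.
    reporting_version = Config.python_versions[0]  # "3.9" in the example above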

To perform this validation, two nox sessions are defined. Because the nox session
``project:report`` and the SonarQube analysis depend directly on these sessions,
all of them are executed in succession in the CI's ``report.yml``. A simplified
sketch of the presence check performed by ``artifacts:validate`` follows the table.

+--------------------------+----------------------------------------------------------+
| Nox session | Actions |
+==========================+==========================================================+
| ``artifacts:copy`` | * Combines coverage artifacts from various test sources |
| | (unit, integration ...) |
| | * Copies downloaded artifacts to their parent directory |
+--------------------------+----------------------------------------------------------+
| ``artifacts:validate``   | * Verifies that the ``.lint.txt``, ``.lint.json``,      |
| | ``.security.json``, and ``.coverage`` are present |
| | * Checks that each file contains the expected attributes |
| | for that file type |
+--------------------------+----------------------------------------------------------+
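
As a rough illustration of the presence check performed by ``artifacts:validate``
(a simplified sketch, not the actual implementation), the expected files must be
present in the project root:

.. code-block:: python

    from pathlib import Path

    # Simplified sketch: the real session also checks the files' contents.
    EXPECTED_FILES = (".lint.txt", ".lint.json", ".security.json", ".coverage")

    missing = [name for name in EXPECTED_FILES if not Path(name).exists()]
    if missing:
        raise SystemExit(f"Missing metric files: {', '.join(missing)}")
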
53 changes: 53 additions & 0 deletions doc/user_guide/features/metrics/project_report.rst
Original file line number Diff line number Diff line change
@@ -0,0 +1,53 @@
.. _project_report:

``project:report``
==================
The nox session ``project:report`` provides an overall coverage percentage,
maintainability grade, and security grade based on the :ref:`direct_metrics` collected.
The definitions used for evaluating the quality of the Python code are given in the
`metrics.py`_ file, and the required fields are specified by the code-agnostic
:ref:`metrics_schema` for Exasol. This nox session can return its analysis in two forms:

* JSON
This directly meets the requirements of the :ref:`metrics_schema`. In our CI runs,
a JSON file is created & uploaded as an artifact, which can later be downloaded by
the crawler project (a small inspection sketch follows this list).
* Markdown
This is displayed in the GitHub Actions summary of a given CI run. Displaying
this content per CI run gives the developer immediate feedback on how the code
quality has changed between code modifications.
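
As a small sketch for inspecting the JSON form locally (the file name below is an
assumption; use the JSON artifact downloaded from your CI run):

.. code-block:: python

    import json
    from pathlib import Path

    # Hypothetical file name -- use the artifact produced by the CI run.
    report = json.loads(Path("metrics.json").read_text())
    print(sorted(report))  # top-level fields defined by the metrics schema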


.. _metrics_schema:

Metrics schema
++++++++++++++
The metrics schema has been established to provide a unified approach for metrics and
reporting across our projects. More details on its motivation, development, & usage
can be found in the following resources:

* `Exasol Schemas`_
* `Metrics Schema`_
* `Metrics Schema Project`_

For our open-source projects, there is a scheduled job that regularly collects metrics
from projects. This data is then aggregated and added to a central data store. For more
details, please refer to the crawler project documentation.

Development
-----------

If the metrics schema needs to be updated, the `Metrics Schema Project`_ provides a
convenient way (via a Pydantic model) to modify the schema and generate an updated
version.
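
As a rough sketch of that workflow (the model below is a hypothetical stand-in,
not the project's actual model), a Pydantic model can emit a JSON schema directly:

.. code-block:: python

    from pydantic import BaseModel


    class ProjectMetrics(BaseModel):
        """Hypothetical stand-in for the metrics-schema model."""

        coverage: float
        maintainability: str
        security: str


    # Pydantic v2: generate the JSON schema that could then be published.
    print(ProjectMetrics.model_json_schema())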

.. note::

The updated version first needs to be integrated into the `Exasol Schemas Project`_.


.. _Exasol Schemas: https://schemas.exasol.com
.. _Exasol Schemas Project: https://github.com/exasol/schemas
.. _Metrics Schema: https://schemas.exasol.com/project-metrics-0.2.0.html
.. _metrics.py: https://github.com/exasol/python-toolbox/blob/main/exasol/toolbox/metrics.py
.. _Metrics Schema Project: https://github.com/exasol/python-toolbox/tree/main/metrics-schema