Please follow the link to the
+ '${def_branch}' branch documentation.
+
+
+ EOF
git add index.html
- name: Commit changes to the GitHub Pages branch
run: |
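The truncated hunk above appears to add a workflow step that writes a landing page pointing at the default branch's documentation and stages it for the GitHub Pages commit. A minimal standalone sketch, with the page wording and variable name taken from the fragment and everything else an assumption:

```shell
# Sketch of the landing-page step; the markup and branch value are
# assumptions, not the exact content of the workflow.
def_branch=stable
cat > index.html <<EOF
<p>Please follow the link to the
'${def_branch}' branch documentation.</p>
EOF
```

In the workflow, a `git add index.html` in the same step would then stage the page before the commit step runs.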
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index b827eb9e0..5db7734b2 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -23,7 +23,7 @@ into three categories:
### Your first issue
-1. Read the project's [README.md](https://github.com/NVIDIA-Merlin/systems/blob/main/README.md)
+1. Read the project's [README.md](https://github.com/NVIDIA-Merlin/systems/blob/stable/README.md)
to learn how to set up the development environment.
2. Find an issue to work on. The best way is to look for the [good first issue](https://github.com/NVIDIA-Merlin/systems/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22)
or [help wanted](https://github.com/NVIDIA-Merlin/systems/issues?q=is%3Aissue+is%3Aopen+label%3A%22help+wanted%22) labels.
diff --git a/README.md b/README.md
index 9c0c23a71..0baeb92f4 100644
--- a/README.md
+++ b/README.md
@@ -3,7 +3,7 @@

[](https://pypi.python.org/pypi/merlin-systems/)

-[](https://nvidia-merlin.github.io/systems/main/README.html)
+[](https://nvidia-merlin.github.io/systems/stable/README.html)
Merlin Systems provides tools for combining recommendation models with other elements of production recommender systems like feature stores, nearest neighbor search, and exploration strategies into end-to-end recommendation pipelines that can be served with [Triton Inference Server](https://github.com/triton-inference-server/server).
diff --git a/docs/source/conf.py b/docs/source/conf.py
index 452b68c9a..b84e45a9c 100644
--- a/docs/source/conf.py
+++ b/docs/source/conf.py
@@ -11,6 +11,7 @@
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
import os
+import re
import subprocess
import sys
@@ -96,12 +97,13 @@
if os.path.exists(gitdir):
tag_refs = subprocess.check_output(["git", "tag", "-l", "v*"]).decode("utf-8").split()
+ tag_refs = [tag for tag in tag_refs if re.match(r"^v[0-9]+\.[0-9]+\.[0-9]+$", tag)]
tag_refs = natsorted(tag_refs)[-6:]
smv_tag_whitelist = r"^(" + r"|".join(tag_refs) + r")$"
else:
smv_tag_whitelist = r"^v.*$"
-smv_branch_whitelist = r"^main$"
+smv_branch_whitelist = r"^(main|stable)$"
smv_refs_override_suffix = r"-docs"
@@ -110,11 +112,11 @@
"cudf": ("https://docs.rapids.ai/api/cudf/stable/", None),
"distributed": ("https://distributed.dask.org/en/latest/", None),
"torch": ("https://pytorch.org/docs/stable/", None),
- "merlin-core": ("https://nvidia-merlin.github.io/core/main/", None),
+ "merlin-core": ("https://nvidia-merlin.github.io/core/stable/", None),
}
html_sidebars = {"**": ["versions.html"]}
-html_baseurl = "https://nvidia-merlin.github.io/systems/main"
+html_baseurl = "https://nvidia-merlin.github.io/systems/stable/"
autodoc_inherit_docstrings = False
autodoc_default_options = {
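The tag-filtering logic that the conf.py hunk adds can be exercised on its own. The tag list below is hypothetical, and the dots in the pattern are escaped for strictness (an unescaped `.` would also match any character in those positions):

```python
import re

# Hypothetical tag list: the filter keeps only plain "vX.Y.Z" release
# tags, so pre-releases and suffixed refs stay out of the docs switcher.
tag_refs = ["v0.9.0", "v1.0.0", "v1.0.0rc1", "v23.02.00-docs", "v1.2.3"]
tag_refs = [tag for tag in tag_refs if re.match(r"^v[0-9]+\.[0-9]+\.[0-9]+$", tag)]
print(tag_refs)  # ['v0.9.0', 'v1.0.0', 'v1.2.3']

# The whitelist regex consumed by sphinx-multiversion is then an
# anchored alternation over the surviving tags.
smv_tag_whitelist = r"^(" + r"|".join(tag_refs) + r")$"
print(smv_tag_whitelist)  # ^(v0.9.0|v1.0.0|v1.2.3)$
```

The `smv_branch_whitelist` change works the same way: `^(main|stable)$` admits exactly those two branch names into the version switcher.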
diff --git a/docs/source/index.rst b/docs/source/index.rst
index 889c8f949..c1026def9 100644
--- a/docs/source/index.rst
+++ b/docs/source/index.rst
@@ -13,7 +13,7 @@ Merlin Systems GitHub Repository
About Merlin
Merlin is the overarching project that brings together the Merlin projects.
- See the `documentation `_
+ See the `documentation `_
or the `repository `_ on GitHub.
Developer website for Merlin
diff --git a/examples/Serving-An-Implicit-Model-With-Merlin-Systems.ipynb b/examples/Serving-An-Implicit-Model-With-Merlin-Systems.ipynb
index b0346a91d..f2be2dace 100644
--- a/examples/Serving-An-Implicit-Model-With-Merlin-Systems.ipynb
+++ b/examples/Serving-An-Implicit-Model-With-Merlin-Systems.ipynb
@@ -80,7 +80,7 @@
"source": [
"In this tutorial our objective is to demonstrate how to serve an `Implicit` model. In order for us to be able to do so, we begin by downloading data and training a model. We breeze through these activities below.\n",
"\n",
- "If you would like to learn more about training an `Implicit` model using the Merlin Models library, please consult this [tutorial](https://github.com/NVIDIA-Merlin/models/blob/main/examples/07-Train-traditional-ML-models-using-the-Merlin-Models-API.ipynb)."
+ "If you would like to learn more about training an `Implicit` model using the Merlin Models library, please consult this [tutorial](https://github.com/NVIDIA-Merlin/models/blob/stable/examples/07-Train-traditional-ML-models-using-the-Merlin-Models-API.ipynb)."
]
},
{
diff --git a/examples/Serving-An-XGboost-Model-With-Merlin-Systems.ipynb b/examples/Serving-An-XGboost-Model-With-Merlin-Systems.ipynb
index b1eea93a2..88cc2be8c 100644
--- a/examples/Serving-An-XGboost-Model-With-Merlin-Systems.ipynb
+++ b/examples/Serving-An-XGboost-Model-With-Merlin-Systems.ipynb
@@ -80,7 +80,7 @@
"source": [
"In this tutorial our objective is to demonstrate how to serve an `XGBoost` model. In order for us to be able to do so, we begin by downloading data and training a model. We breeze through these activities below.\n",
"\n",
- "If you would like to learn more about training an `XGBoost` model using the Merlin Models library, please consult this [tutorial](https://github.com/NVIDIA-Merlin/models/blob/main/examples/07-Train-an-xgboost-model-using-the-Merlin-Models-API.ipynb)."
+ "If you would like to learn more about training an `XGBoost` model using the Merlin Models library, please consult this [tutorial](https://github.com/NVIDIA-Merlin/models/blob/stable/examples/07-Train-an-xgboost-model-using-the-Merlin-Models-API.ipynb)."
]
},
{
diff --git a/examples/Serving-Ranking-Models-With-Merlin-Systems.ipynb b/examples/Serving-Ranking-Models-With-Merlin-Systems.ipynb
index 47c445180..1376c4f03 100644
--- a/examples/Serving-Ranking-Models-With-Merlin-Systems.ipynb
+++ b/examples/Serving-Ranking-Models-With-Merlin-Systems.ipynb
@@ -35,7 +35,7 @@
"\n",
"# Serving Ranking Models With Merlin Systems\n",
"\n",
- "This notebook is created using the latest stable [merlin-tensorflow](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow/tags) container. This Jupyter notebook example demonstrates how to deploy a ranking model to Triton Inference Server (TIS) and generate prediction results for a given query. As a prerequisite, the ranking model must be trained and saved with Merlin Models. Please read the [README](https://github.com/NVIDIA-Merlin/systems/blob/main/examples/README.md) for the instructions.\n",
+ "This notebook is created using the latest stable [merlin-tensorflow](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow/tags) container. This Jupyter notebook example demonstrates how to deploy a ranking model to Triton Inference Server (TIS) and generate prediction results for a given query. As a prerequisite, the ranking model must be trained and saved with Merlin Models. Please read the [README](https://github.com/NVIDIA-Merlin/systems/blob/stable/examples/README.md) for the instructions.\n",
"\n",
"## Overview\n",
"\n",
@@ -57,7 +57,7 @@
"\n",
"### Dataset\n",
"\n",
- "We use the synthetic train and test datasets generated by mimicking the real [Ali-CCP: Alibaba Click and Conversion Prediction](https://tianchi.aliyun.com/dataset/dataDetail?dataId=408#1) dataset to build our recommender system ranking models. To see how the data is transformed with NVTabular and how a DLRM model is trained with Merlin Models check out the [04-Exporting-ranking-models.ipynb](https://github.com/NVIDIA-Merlin/models/blob/main/examples/04-Exporting-ranking-models.ipynb) example notebook which is a prerequisite for this notebook.\n",
+ "We use the synthetic train and test datasets generated by mimicking the real [Ali-CCP: Alibaba Click and Conversion Prediction](https://tianchi.aliyun.com/dataset/dataDetail?dataId=408#1) dataset to build our recommender system ranking models. To see how the data is transformed with NVTabular and how a DLRM model is trained with Merlin Models check out the [04-Exporting-ranking-models.ipynb](https://github.com/NVIDIA-Merlin/models/blob/stable/examples/04-Exporting-ranking-models.ipynb) example notebook which is a prerequisite for this notebook.\n",
"\n",
"It is important to note that the steps taken in this notebook are generalized and can be applied to any set of workflows and models. \n",
"\n",
@@ -96,7 +96,7 @@
"source": [
"## Load an NVTabular Workflow\n",
"\n",
- "First, we load the `nvtabular.Workflow` that we created with this [example](https://github.com/NVIDIA-Merlin/models/blob/main/examples/04-Exporting-ranking-models.ipynb). "
+ "First, we load the `nvtabular.Workflow` that we created with this [example](https://github.com/NVIDIA-Merlin/models/blob/stable/examples/04-Exporting-ranking-models.ipynb). "
]
},
{
@@ -156,7 +156,7 @@
"source": [
"## Load the Tensorflow Model\n",
"\n",
- "After loading the workflow, we load the model. This model was trained with the output of the workflow from the [Exporting Ranking Models](https://github.com/NVIDIA-Merlin/models/blob/main/examples/04-Exporting-ranking-models.ipynb) example from Merlin Models."
+ "After loading the workflow, we load the model. This model was trained with the output of the workflow from the [Exporting Ranking Models](https://github.com/NVIDIA-Merlin/models/blob/stable/examples/04-Exporting-ranking-models.ipynb) example from Merlin Models."
]
},
{
diff --git a/requirements/docs.txt b/requirements/docs.txt
index a320f14d5..9445871d9 100644
--- a/requirements/docs.txt
+++ b/requirements/docs.txt
@@ -15,3 +15,8 @@ myst-nb==0.13.2
linkify-it-py==1.0.3
sphinx-external-toc==0.2.4
attrs==21.4.0
+
+# keep support for numpy builtin type aliases for previous tags
+# numpy builtin aliases like np.str were removed in 1.24
+# This can be unpinned when we no longer build docs for versions of Merlin prior to 23.05
+numpy<1.24
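The pin above exists because NumPy 1.24 removed the long-deprecated builtin aliases (`np.str`, `np.float`, and friends) that older tagged doc builds still reference. A version-agnostic sketch of the replacement spelling:

```python
import numpy as np

# np.str was merely an alias for the builtin str and was removed in
# NumPy 1.24; use str (or np.str_ for the NumPy scalar type) instead.
arr = np.array(["a", "b"], dtype=str)  # not dtype=np.str
print(arr.dtype.kind)  # 'U' (unicode string)
```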