This Python server launches Jupyter notebooks (`*.ipynb`) and collects results from them.
- `notebooks` (Default value: `/home/jupyter-notebook/`) - path to the directory with notebooks. `j-sp` searches for files with the `ipynb` extension recursively in the specified folder.
- `results` (Default value: `/home/jupyter-notebook/results`) - path to the directory for run results. `j-sp` resolves result files with the `jsonl` extension against the specified folder.
- `results-images` (Default value: `/home/jovyan/j-sp/results/images`) - path to the directory for images prepared during a notebook run. `j-sp` provides the `/image?path=<full path to image>` endpoint for getting stored images.
- `logs` (Default value: `/home/jupyter-notebook/logs`) - path to the directory for run logs. `j-sp` puts run logs into the specified folder.
- `out-of-use-engine-time` (Default value: 3600) - out-of-use time interval in seconds. `j-sp` unregisters the engine related to a notebook when the user hasn't run the notebook for longer than this time.
- `restart-kernel-on-error` (Default value: False) - if True, `j-sp` restarts the kernel whenever an executed notebook raises an `Exception`; otherwise it restarts only in special cases (`DeadKernelError`, etc.).
- `cleanup-horizon-days` (Default value: 14) - `j-sp` recursively removes files older than this number of days from the `results`, `results-images`, and `logs` directories.
  - zero value - all files in the directories are removed before each notebook execution.
  - negative value - disables the cleanup functionality.
- `virtual-environment-dir` (Default value: `/home/json-stream/.venv`) - `j-sp` creates a Python virtual environment in this folder, or reuses the virtual environment if the folder already exists. Please note: the `j-sp` docker image creates `/opt/conda/bin/python` and `/opt/conda/bin/pip` links to mimic the environment of the `jupyter/datascience-notebook` docker image.
- `python-kernel-name` (Default value: `.venv`) - `j-sp` installs ipykernel under this name using the virtual environment specified in `virtual-environment-dir`.
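The `cleanup-horizon-days` behaviour described above can be sketched as follows. This is a minimal illustration of the documented rules, not j-sp's actual implementation; the function name and signature are assumptions:

```python
import time
from pathlib import Path


def cleanup(directory: str, horizon_days: int) -> None:
    """Recursively remove files older than `horizon_days` days.

    Sketch of the documented behaviour: a negative horizon disables
    cleanup entirely, and a zero horizon removes every file in the
    directory before each notebook execution.
    """
    if horizon_days < 0:  # negative value - cleanup disabled
        return
    cutoff = time.time() - horizon_days * 86400
    for path in Path(directory).rglob("*"):
        if path.is_file() and path.stat().st_mtime < cutoff:
            path.unlink()
```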
- `/home/json-stream/` - home folder for j-sp. It can contain installed Python libraries, `pip.conf`, and other files useful for running notebooks.
- `/home/jupyter-notebook/` - folder shared between this tool and any source of notebooks. In general, j-sp should be run alongside Jupyter notebook/lab/hub. A user can develop and debug a notebook in Jupyter and then run it via j-sp.
Since version 5.2.7-TH2-5142-9348403860, th2-rpt-viewer can interact with j-sp via the http://<cluster>:<port>/th2-<schema>/json-stream-provider/ URL.
j-sp uses pod resources to run notebooks. Please calculate the required resources according to the tasks you are solving.
```yaml
apiVersion: th2.exactpro.com/v2
kind: Th2Box
metadata:
  name: json-stream-provider
spec:
  imageName: ghcr.io/th2-net/th2-json-stream-provider-py
  imageVersion: 0.0.2
  type: th2-rpt-data-provider
  customConfig:
    notebooks: /home/jupyter-notebook/
    results: /home/jupyter-notebook/j-sp/results/
    results-images: /home/jovyan/j-sp/results/images
    logs: /home/jupyter-notebook/j-sp/logs/
    out-of-use-engine-time: 3600
    restart-kernel-on-error: false
    cleanup-horizon-days: 14
    virtual-environment-dir: /home/json-stream/.venv
    python-kernel-name: .venv
  loggingConfig: |
    [loggers]
    keys=root,jsp,aiohttp_access
    [handlers]
    keys=consoleHandler
    [formatters]
    keys=formatter
    [logger_root]
    level=INFO
    handlers=consoleHandler
    propagate=0
    [logger_jsp]
    level=INFO
    qualname=j-sp
    handlers=consoleHandler
    propagate=0
    [logger_aiohttp_access]
    level=WARN
    qualname=aiohttp.access
    handlers=consoleHandler
    propagate=0
    [handler_consoleHandler]
    class=StreamHandler
    formatter=formatter
    args=(sys.stdout,)
    [formatter_formatter]
    format=%(asctime)s - %(name)s - %(levelname)s - %(message)s
  extendedSettings:
    mounting:
      - path: /home/jupyter-notebook/
        pvcName: jupyter-notebook
      - path: /home/json-stream/
        pvcName: json-stream-provider
    resources:
      limits:
        memory: 1000Mi
        cpu: 1000m
      requests:
        memory: 100Mi
        cpu: 100m
    service:
      enabled: true
      ingress:
        urlPaths:
          - '/json-stream-provider/'
      clusterIP:
        - name: backend
          containerPort: 8080
          port: 8080
```

- Cell tagged `parameters` (required). You can read details about this cell below.
- Cell with dependencies (optional) - the server doesn't include third-party packages by default.
You can install/uninstall the packages required for your code in one of the cells. All installed packages are shared between runs of any notebook.
Installation example:
```python
import sys
!{sys.executable} -m pip install <package_name>==<package_version>
```
A notebook must have its first cell tagged `parameters`. This cell is used to configure a parametrized run.
- this cell should contain parameters only.
- parameters can have typing and a value, but the value must be a constant of a primitive type such as boolean, number, or string.
- `output_path` - path to a JSONL file. The server considers the content of this file as the run results. `j-sp` generates and passes a file path in the folder configured by the `results` setting for this parameter.
- `customization_path` - path to a JSON file. The server considers the content of this file as the run customization. `j-sp` generates and passes a file path in the folder configured by the `results` setting for this parameter.
- `output_images_path` - path to an image folder. The server provides the `/image?path=<full path to image>` endpoint for getting stored images. `j-sp` passes a folder path within the value configured by the `results-images` setting for this parameter.
Parameter names can have special suffixes that help a viewer apply different controls for them:

- `_file` - string parameter containing a path to a file on the server node. `j-sp` provides paths to files located in the folders or sub-folders of the dirs configured by the `notebooks` and `results` settings.
- `_timestamp` - string parameter containing a datetime in ISO format, like `2024-07-01T05:06:59.664Z`.
- `_pycode` - string parameter containing code. A viewer can apply highlighting for this parameter.
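Putting this together, a `parameters` cell might look like the following. All values here are hypothetical examples; `output_path`, `customization_path`, and `output_images_path` are overwritten by j-sp at run time:

```python
# Cell tagged `parameters` (hypothetical example values)
session_file: str = "/home/jupyter-notebook/data/session.jsonl"  # `_file` suffix: file path control
start_timestamp: str = "2024-07-01T05:06:59.664Z"                # `_timestamp` suffix: datetime control
filter_pycode: str = "lambda m: m['flag']"                       # `_pycode` suffix: code highlighting
output_path: str = ""             # replaced by j-sp with a path under `results`
customization_path: str = ""      # replaced by j-sp with a path under `results`
output_images_path: str = ""      # replaced by j-sp with the `results-images` folder
```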
Each JSON object in the JSONL file can contain special fields. A viewer can have special logic for handling these fields.

- `#display-timestamp` - contains Unix time in nanoseconds. Example: `{ "#display-timestamp": 1737048910123000000, "field": "value" }`
- `#display-name` - short name of a JSON node. A viewer can show this value instead of, or together with, the full node content. Example: `{ "#display-name": "Root node", "field": "value", "sub-node": { "#display-name": "Sub node", "sub-field": "sub-value" } }`
- `#display-table` - table view of a JSON node. This value can cover the whole content of the node or only part of the data. Example:
```json
{
  "#display-table": [
    ["int", "float"],
    [19700, 19.700000000000003],
    [39400, 39.400000000000006]
  ],
  "results": 2,
  "array": [
    {
      "name": "str_100_test",
      "int": 19700,
      "float": 19.700000000000003
    },
    {
      "name": "str_200_test",
      "int": 39400,
      "float": 39.400000000000006
    }
  ]
}
```

Full JSONL output lines combining these fields can look like:

```jsonl
{"#display-timestamp": 1737388741801564430, "#display-name": "2 - 2025-01-20 19:59:01.801564", "#display-table": [["int", "float"], ["45200", "45.2"], ["90400", "90.4"]], "results": 2, "array": [{"#display-name": "str_100_test:2024-07-01T05:06:59.664Z:False", "name": "str_100_test", "int": 45200, "float": 45.2, "flag": false, "custom_time": "2024-07-01T05:06:59.664Z"}, {"#display-name": "str_200_test:2024-07-01T05:06:59.665Z:True", "name": "str_200_test", "int": 90400, "float": 90.4, "flag": true, "custom_time": "2024-07-01T05:06:59.665Z"}]}
{"#display-timestamp": 1737388741801596018, "#display-name": "3 - 2025-01-20 19:59:01.801596", "#display-table": [["int", "float"], ["23800", "23.8"], ["47600", "47.6"]], "results": 2, "array": [{"#display-name": "str_100_test:2024-07-01T05:06:59.664Z:False", "name": "str_100_test", "int": 23800, "float": 23.8, "flag": false, "custom_time": "2024-07-01T05:06:59.664Z"}, {"#display-name": "str_200_test:2024-07-01T05:06:59.665Z:True", "name": "str_200_test", "int": 47600, "float": 47.6, "flag": true, "custom_time": "2024-07-01T05:06:59.665Z"}]}
```

The customization JSON should contain a list of patterns. Each pattern should have the following fields:

- `pattern` - text which should be highlighted
- `color` - color for highlighting
```json
[
  {
    "pattern": "str_100_test",
    "color": "#3CD91F"
  },
  {
    "pattern": "str_200_test",
    "color": "#9356D5"
  }
]
```

You can put files required by your Jupyter notebooks into the `local-run/with-jupyter-notebook/user_data` folder. Please note that this folder is read-only for containers.
Alternatively, you can mount your own folder by changing the value of the `USER_DATA_DIR` environment variable in the `local-run/with-jupyter-notebook/.env` file,
or change the `local-run/with-jupyter-notebook/compose.yml` file. Please note that you should mount the same directory by the same path to both the `jupyter_notebook` and `json_stream_provider` services.
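Inside a notebook, producing the result and customization files described above can be sketched like this. The row data is invented for illustration, and the two paths are hypothetical stand-ins; in a real run `output_path` and `customization_path` are injected by j-sp through the `parameters` cell:

```python
import json

# Hypothetical stand-ins for the values j-sp injects via the parameters cell
output_path = "results.jsonl"
customization_path = "customization.json"

results = [
    {"#display-name": "str_100_test", "int": 19700, "float": 19.7},
    {"#display-name": "str_200_test", "int": 39400, "float": 39.4},
]
# JSONL result file: one JSON object per line
with open(output_path, "w") as f:
    for row in results:
        f.write(json.dumps(row) + "\n")

# Customization file: list of {pattern, color} objects for viewer highlighting
patterns = [
    {"pattern": "str_100_test", "color": "#3CD91F"},
    {"pattern": "str_200_test", "color": "#9356D5"},
]
with open(customization_path, "w") as f:
    json.dump(patterns, f)
```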
jupyter-notebook and json-stream-provider run as a user from the default Linux `users` group.
This means that:

- folders inside the `user_data` folder should have `rwx` permission for the `users` group.
- files in the `user_data` folder should have `rw` permission for the `users` group.

You may need sudo permission for the following commands:

```shell
cd local-run/with-jupyter-notebook
chgrp -R users user_data/
chmod -R g=u user_data/
```
- start:

```shell
cd local-run/with-docker
docker compose up --build
```

- clean and rebuild:

```shell
cd local-run/with-docker
docker compose rm --force --volumes --stop
docker compose down --volumes
docker compose build
```

- start:

```shell
cd local-run/with-jupyter-notebook
docker-compose up --build
```

- build:

```shell
cd local-run/with-jupyter-notebook
docker-compose build
```

- clean and rebuild:

```shell
cd local-run/with-jupyter-notebook
docker-compose rm --force --volumes --stop
docker-compose down --volumes
docker-compose build
```
- http://localhost:8080 - th2-rpt-viewer
- http://localhost:8082 - jupyter-notebook.
You can authorize via the token printed in the `jupyter_notebook` logs:

- if you use docker:

```shell
cd local-run/with-jupyter-notebook
docker compose logs jupyter_notebook | grep '/lab?token=' | tail -1 | cut -d '=' -f 2
```

- if you use podman:

```shell
cd local-run/with-docker
docker-compose logs jupyter_notebook | grep '/lab?token=' | tail -1 | cut -d '=' -f 2
```
- updated: python-3.12.9
- changed local run with jupyter-notebook:
  - used jupyterhub/singleuser:5.2.1
  - avoided `json-stream` user creation in Dockerfile
- updated: aiohttp~=3.13.2
- implemented GH-29: Use Python virtual environment instead of PIP_TARGET
- changed local run with jupyter-notebook:
  - application urls were changed.
- Migration from earlier versions:
  - Execute the clean command, because `jupyter-notebook` prepares a `.venv` which should be shared with `json-stream-provider`
- fixed GH-21: json-stream-provider doesn't restart the Python kernel automatically
- added `restart-kernel-on-error` option to custom settings
- added `cleanup-horizon-days` option to custom settings
- updated:
  - respond with a short error in case of notebook run failure
- added `results-images` option to custom settings
- fixed: `#` character in string property issue
- update local run with jupyter-notebook:
  - updated th2-rpt-viewer:
    - added custom.json file for local run.
    - updated version to `5.2.12`
- updated th2-rpt-viewer:
- j-sp generates cookies with the `engine_user_id` field to identify the user for creating a unique Python engine.
- Custom engine holds a separate papermill notebook client for each `engine_user_id` and file combination.
- update local run with jupyter-notebook:
  - updated th2-rpt-viewer:
    - added pycode parameter type
    - added ability to save/load presets for notebooks
    - compare mode was changed to allow launching notebooks
    - added ability to move to the nearest chunk in compare mode
    - added ability to turn off a parameter in a notebook
- updated th2-rpt-viewer:
- Added papermill custom engine to reuse for notebook execution. A separate engine is registered for each notebook and unregistered after 1 hour of out-of-use time by default.
- update local run with jupyter-notebook:
  - updated th2-rpt-viewer:
    - `JSON Reader` page pulls execution status every 50 ms instead of every 1 sec
    - `JSON Reader` page now uses virtuoso for rendering lists
    - `JSON Reader` page now has search; its values can be loaded from a `json` file containing an array of objects with `pattern` and `color` fields for searching content. Execution of a notebook can create such a file, and it will be loaded into the UI if it is created at the path of the `customization_path` parameter.
    - Added ability to create multiple `JSON Reader` pages.
    - `JSON Reader` page now has compare mode.
- updated th2-rpt-viewer:
- added `umask 0007` to the `~/.bashrc` file to provide rw file access for the `users` group
- added `/file` request for loading the content of a single jsonl file
- removed the ability to get any file from the machine via `/file` REST APIs
- added sorting on the `/files/notebooks` and `/files/results` requests
- added `/files/all` request to list all files in the `/notebooks` and `/results/` directories
- added `convert_parameter` function for parsing a parameter depending on its type
- update local run with jupyter-notebook:
  - updated th2-rpt-viewer:
    - added option to change the default view type of a result group
    - added display of the #display-table field in the Table view type
    - added option to view the last N results of a Notebook
    - added validation of a Notebook's parameters
    - added timestamp and file path parameter types
    - fixed clearing of a Notebook's parameters on run
    - increased width of parameters' inputs
  - updated compose:
    - changed user data access from `ro` to `rw`
- updated th2-rpt-viewer:
- added `${HOME}/python/lib` into the `PYTHONPATH` environment variable
- update local run with jupyter-notebook:
  - updated jupyter-notebook Dockerfile:
    - used `jupyter/datascience-notebook:python-3.9`
    - defined the `PYTHONPATH`, `PIP_TARGET` environment variables
  - updated compose:
    - added the `python_lib` volume
- added saving of current tasks
  - a task contains a status (success, failed, in progress) and an id, using which the task can be stopped
- added the `/stop` end-point for stopping a requested task
- updated the `/result` end-point: it now requests a task by id and returns a file, the reason for a failed run, or informs that the task is 'in progress', depending on the task status
- Added the `json-stream` user to the `users` group
- Added docker compose for local run