Integrate Workbench with Your Own JupyterLab Server

If you are using JupyterLab on a self-hosted server, you can still take advantage of the ML Workbench by installing the ML Workbench JupyterLab extension and the tgml Python package.

Prerequisite

  • Your JupyterLab version is 3.0 or later.
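To confirm the requirement is met before continuing, you can check the installed version from a terminal. A minimal sketch, assuming jupyter is on your PATH (the sort -V comparison is one common way to compare version strings):

```shell
# Print the installed JupyterLab version
jupyter lab --version

# Compare it against the 3.0 minimum; `sort -V` orders version strings
# numerically, so if "3.0" sorts first the requirement is satisfied
installed="$(jupyter lab --version)"
if [ "$(printf '%s\n' "3.0" "$installed" | sort -V | head -n1)" = "3.0" ]; then
  echo "JupyterLab $installed meets the 3.0+ requirement"
else
  echo "JupyterLab $installed is too old; upgrade before proceeding"
fi
```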

Procedure

  1. Navigate to the JupyterLab web interface and open a terminal.

  2. From the terminal, run pip install tigergraph_mlworkbench. This installs the ML Workbench JupyterLab extension.

  3. From the terminal, run the following command to install the Python kernel for the ML Workbench:

    $ conda env create -f https://raw.githubusercontent.com/TigerGraph-DevLabs/mlworkbench-docs/main/conda_envs/tigergraph-torch-cpu.yml && \
        conda activate tigergraph-torch-cpu && \
        python -m ipykernel install --user --name tigergraph-torch-cpu --display-name "TigerGraph Pytorch (cpu)"

    If you are using a GPU for training, replace every instance of "cpu" in the command above with "gpu":

    $ conda env create -f https://raw.githubusercontent.com/TigerGraph-DevLabs/mlworkbench-docs/main/conda_envs/tigergraph-torch-gpu.yml && \
        conda activate tigergraph-torch-gpu && \
        python -m ipykernel install --user --name tigergraph-torch-gpu --display-name "TigerGraph Pytorch (gpu)"

  4. Once installation finishes, refresh your browser. You should see a small TigerGraph logo in the left navigation bar and a new Python kernel called TigerGraph Pytorch on the Launcher page.
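You can also confirm the installation from the terminal. A minimal sketch, assuming jupyter is on your PATH; note that the extension name shown by JupyterLab may differ from the pip package name, so the grep patterns below are assumptions:

```shell
# The extension should appear among the installed JupyterLab extensions
jupyter labextension list 2>&1 | grep -i tigergraph

# The new kernel should be registered under the name passed to --name
# (tigergraph-torch-gpu if you installed the GPU environment)
jupyter kernelspec list | grep tigergraph-torch-cpu
```

If either command prints nothing, re-run the corresponding installation step before refreshing the browser.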

Next steps

With the ML Workbench JupyterLab extension and the TigerGraph Pytorch kernel installed, the next step is to deploy GDPS on your TigerGraph instance so the Workbench can communicate with your TigerGraph database.