Integrate Workbench with Google Vertex AI

This guide walks you through integrating the ML Workbench with a notebook on Google Vertex AI.

1. Prerequisite

  • A running Google Vertex AI instance

2. Procedure

  1. From the Workbench page in Vertex AI, find your notebook instance and click OPEN JUPYTERLAB to navigate to the JupyterLab web interface. Click File > New > Terminal to open a terminal.

  2. From the terminal, run pip install tigergraph_mlworkbench to install the ML Workbench JupyterLab extension.

  3. From the terminal, run the following command to install the tigergraph-torch Python kernel. Choose the appropriate command depending on whether you are using a CPU or GPU for training:

    • For CPU:

    $ conda env create -f https://raw.githubusercontent.com/TigerGraph-DevLabs/mlworkbench-docs/main/conda_envs/tigergraph-torch-cpu.yml

    • For GPU:

    $ conda env create -f https://raw.githubusercontent.com/TigerGraph-DevLabs/mlworkbench-docs/main/conda_envs/tigergraph-torch-gpu.yml
  4. Once installation finishes, refresh your browser. You should see a small TigerGraph logo in the far-left navigation bar and a new Python kernel called TigerGraph Pytorch in the Launcher.
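
The conda env create commands in step 3 read environment definition files hosted in the mlworkbench-docs repository. For orientation, a conda environment file of this kind looks roughly like the sketch below; the environment name follows the URLs above, but the channels and dependencies shown here are illustrative assumptions rather than the exact contents of those files:

```yaml
# Illustrative sketch of a conda environment file such as tigergraph-torch-cpu.yml.
# The authoritative contents are the files at the URLs in step 3.
name: tigergraph-torch
channels:
  - pytorch
  - defaults
dependencies:
  - python            # kernel's Python interpreter
  - pytorch           # CPU build; the GPU variant would additionally pull in CUDA packages
  - ipykernel         # required for the env to appear as a Jupyter kernel (assumed)
```

Running conda env create -f with such a file builds an isolated environment containing everything the TigerGraph Pytorch kernel needs, without touching the notebook instance's default Python environment.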

3. Next steps

With the ML Workbench JupyterLab extension and the tigergraph-torch kernel installed, the next step is to deploy GDPS on your TigerGraph instance so the Workbench can communicate with your TigerGraph database.