🌐 Run the Giskard Server#

Complementing the Giskard Python library, the Giskard server is the app that you can install locally, on a cloud instance, or on Hugging Face Spaces. The Giskard server offers features such as:

  • Debugging your tests to diagnose issues

  • Comparing models to decide which one to promote

  • Gathering all of your team’s internal tests in one place to work more efficiently

  • Sharing your results and collecting business feedback from your team

  • Creating more domain-specific tests based on model debugging sessions

To run the Giskard server and use all the features mentioned above, you need to complete the following two steps:

  • Start the server: the server contains all the UI components to test, debug and compare your ML models

  • Start the worker: the worker enables the server to execute your model directly within your Python environment, so the model runs with all the libraries it depends on.

1. Start the server#

To run the Giskard server you need:

  • A Linux or macOS machine, or WSL2 on Windows

  • The Giskard Python library installed (see here)

You can install and run the server either locally or on an external server (e.g. a cloud instance).

Hint

Docker requirements: To install Giskard locally, you need a running Docker daemon. After installing Docker, you can run it in the background by simply opening the Docker app (macOS or Windows).

  • For an easy installation of Docker you can execute:

curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh
  • If you don’t have sudo rights to run Docker, please see the Docker setup page
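
Before launching the server, you can verify the Docker prerequisite from Python. The helper below is an illustrative sketch (not part of Giskard): it checks that the docker CLI is on the PATH and that the daemon responds.

```python
import shutil
import subprocess

def docker_ready() -> bool:
    """Return True if the docker CLI is installed and the daemon responds.

    Illustrative helper, not part of Giskard itself.
    """
    if shutil.which("docker") is None:
        # The docker CLI is not installed (or not on PATH)
        return False
    # `docker info` contacts the daemon; it exits non-zero if the
    # daemon is not running
    result = subprocess.run(
        ["docker", "info"],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

if __name__ == "__main__":
    print("Docker ready" if docker_ready() else "Docker missing or not running")
```

If this returns False, start the Docker app (macOS or Windows) or the docker service (Linux) before running the Giskard server.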

To start the Giskard server, install the Giskard Python library (see here) and execute the following command in your terminal:

giskard server start

You’ll then be able to open Giskard at http://localhost:19000/.

Warning

  • Make sure Docker is running before starting the Giskard server

  • If the giskard command is not found, first install the Giskard Python library (see the doc section)

  • To see the available commands of the Giskard server, you can execute:

    giskard server --help
    

Installing Giskard in the cloud is preferable if you want to use the collaborative features of Giskard: collecting feedback on your model from your team, sharing your Quality Assurance results, saving and providing all your custom tests to your team, etc.

Giskard uses TCP port 19000, so make sure that port is open on the cloud instance where Giskard is installed. For step-by-step installation instructions in the cloud, please go to the AWS, GCP, and Azure installation pages.

Hosting Giskard on Hugging Face Spaces provides the same collaboration features as the cloud installation option. This option is especially useful for new users of Giskard or users already working in the Hugging Face ecosystem.

Warning

In Hugging Face Spaces, Giskard operates within a Docker container activated by a Space. You can opt for:

  • A public Space: open to everyone (ideal for showcasing your models and their performance).

  • A private Space: restricted to your organization or yourself (facilitates internal collaboration and ensures security for your data and models).

For private Hugging Face Spaces, you’ll need an extra token (YOUR_SPACE_TOKEN) to connect the Giskard Client and ML worker.

If you’re new to Giskard, we recommend trying this method. For comprehensive details, explore the guide on Installation in Hugging Face Spaces, or visit our Hugging Face organization page if you’re already familiar with Hugging Face Spaces.

2. Start the ML worker#

Giskard executes your model using a worker that runs the model directly in your Python environment, with all the dependencies required by your model. You can execute the ML worker:

  • From your local notebook within the kernel that contains all the dependencies of your model

  • From Google Colab within the kernel that contains all the dependencies of your model

  • Or from your terminal within the Python environment that contains all the dependencies of your model

To start the ML worker from your notebook, you need to start Giskard in daemon mode, providing the API token found in the Settings tab of the Giskard server (accessible via http://localhost:19000/).

  • If the Giskard server is installed locally, run in a cell in your notebook:

    !giskard worker start -d -k YOUR_TOKEN
    
  • If the Giskard server is installed on an external server (for instance on an AWS EC2 instance), or on a public Space on Hugging Face Spaces, run the following in your notebook:

    !giskard worker start -d -k YOUR_TOKEN -u http://ec2-13-50-XXXX.compute.amazonaws.com:19000/
    
  • If the Giskard server is hosted on a private Space on Hugging Face Spaces, run the following in your notebook:

    !giskard worker start -d -k YOUR_TOKEN -u https://huggingface.co/spaces/<user-id>/<space-id> -t YOUR_SPACE_TOKEN
    
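The three variants above differ only in the flags passed to giskard worker start (-d, -k, -u, -t). As an illustrative sketch (this helper is not a Giskard API), the command line can be assembled programmatically depending on where the server runs:

```python
import shlex
from typing import Optional

def worker_command(api_token: str,
                   server_url: Optional[str] = None,
                   space_token: Optional[str] = None,
                   daemon: bool = True) -> str:
    """Assemble a `giskard worker start` command line (illustrative helper).

    server_url: omit when the server runs locally; set it for an external
    server or a public Hugging Face Space.
    space_token: extra token required only for a private Hugging Face Space.
    """
    parts = ["giskard", "worker", "start"]
    if daemon:
        parts.append("-d")          # run the worker as a daemon
    parts += ["-k", api_token]      # API token from the server's Settings tab
    if server_url:
        parts += ["-u", server_url]
    if space_token:
        parts += ["-t", space_token]
    return shlex.join(parts)

# Local server:
print(worker_command("YOUR_TOKEN"))
# → giskard worker start -d -k YOUR_TOKEN
```

Prefix the resulting command with ! when running it in a notebook cell, as shown above.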

Hint

To see the available commands of the worker, you can execute:

!giskard worker --help

You’re all set to try Giskard in action. Upload your first model, dataset or test suite by following the upload an object page.

To start the ML worker from your Colab notebook, you need to start Giskard in daemon mode, providing the API token found in the Settings tab of the Giskard server (accessible via http://localhost:19000/).

  • If the Giskard server is installed locally:

    Run in your local terminal (not the terminal in Colab):

    giskard server expose --token <ngrok_API_token>
    

    Read the following instructions in order to get the ngrok_API_token. Then run these lines of code in a cell of your Colab notebook:

    %env GSK_API_KEY=YOUR_API_KEY
    !giskard worker start -d -k YOUR_TOKEN -u https://e840-93-23-184-184.ngrok-free.app
    
  • If the Giskard server is installed on an external server (for instance on an AWS EC2 instance), or on a public Space on Hugging Face Spaces:

    Run in a cell in Colab:

    !giskard worker start -d -k YOUR_TOKEN -u http://ec2-13-50-XXXX.compute.amazonaws.com:19000/
    
  • If the Giskard server is hosted on a private Space on Hugging Face Spaces:

    Run in a cell in Colab:

    !giskard worker start -d -k YOUR_TOKEN -u https://huggingface.co/spaces/<user-id>/<space-id> -t YOUR_SPACE_TOKEN
    

Hint

To see the available commands of the worker, you can execute:

!giskard worker --help

You’re all set to try Giskard in action. Upload your first model, dataset or test suite by following the upload an object page.

To start the ML worker from your terminal, run it in the Python environment that contains all the dependencies of your model.

  • If the Giskard server is installed locally:

    Run this command within the Python environment that contains all the dependencies of your model:

    giskard worker start -u http://localhost:19000/
    

    You will then be asked to provide your API token. The API access token can be found in the Settings tab of the Giskard server (accessible via http://localhost:19000/).

  • If the Giskard server is installed on an external server (for instance on an AWS EC2 instance), or on a public Space on Hugging Face Spaces:

    Run this command within the Python environment that contains all the dependencies of your model:

    giskard worker start -u http://ec2-13-50-XXXX.compute.amazonaws.com:19000/
    
  • If the Giskard server is hosted on a private Space on Hugging Face Spaces:

    Run this command within the Python environment that contains all the dependencies of your model:

    giskard worker start -d -k YOUR_TOKEN -u https://huggingface.co/spaces/<user-id>/<space-id> -t YOUR_SPACE_TOKEN
    

Hint

To see the available commands of the worker, you can execute:

giskard worker --help

You’re all set to try Giskard in action. Upload your first model, dataset, test suite, or slicing & transformation functions by following the upload an object page.