Install AI Server

A step-by-step guide for deploying a DataForge AI Server with Docker Compose.

This chapter covers the installation of a DataForge AI Server using Docker Compose and provides a reference docker-compose.yaml file for deployment.

Deploying the AI Server using Docker Compose

1. Prerequisites

To set up a new AI Server, the following prerequisites must be met:

  • Network access to a DataForge Server (gRPC and gRPC-Web endpoints)
  • Network access to Zabbix server and database
  • A Linux host with Docker and Docker Compose installed
  • An AI Server token from an AI Server configured on a Zabbix server

You can check whether Docker and Docker Compose are installed by running the following commands:

docker version
docker compose version
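
You can also verify that the required endpoints are reachable from the host, for example with netcat. The hostnames below are placeholders for your own environment; the ports match the defaults used in the reference compose file in the next section.

# Check reachability of the DataForge and Zabbix endpoints (placeholder hostnames)
nc -zv dataforge.example.com 8090         # DataForge gRPC
nc -zv dataforge.example.com 8091         # DataForge gRPC-Web
nc -zv zabbix-db.example.com 5432         # Zabbix database (PostgreSQL)
nc -zv zabbix-server.example.com 10051    # Zabbix trapper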

2. Docker Compose

The following docker-compose.yaml file defines how the AI Server will start.

You need to modify the docker-compose.yaml file to include the following information:

  • Connection details to your DataForge server
  • An AI Server token from your DataForge server (see Zabbix server setup)
  • Connection details and credentials to your Zabbix database
  • Connection details to the trapper port of your Zabbix server

services:
    df-ai-server:
        image: images.intellitrend.de/dataforge/dataforge-core/df-ai-server:7.10.12
        restart: always
        environment:
            GOMEMLIMIT: "600MiB" # Soft memory limit for the Go runtime
            DF_AI_SERVER_LOG_LEVEL: "info"
            DF_AI_SERVER_METRICS_IDENTITY: "df_ai_server"
            # DataForge Server connection
            DF_AI_SERVER_AISERVER_DATAFORGE_SERVER_GRPC_ADDRESS: "dataforge.intellitrend.de"
            DF_AI_SERVER_AISERVER_DATAFORGE_SERVER_GRPC_PORT: "8090"
            DF_AI_SERVER_AISERVER_DATAFORGE_SERVER_GRPC_TLS_ENABLE: "false"
            DF_AI_SERVER_AISERVER_DATAFORGE_SERVER_GRPC_WEB_ADDRESS: "dataforge.intellitrend.de" # Usually the same as the GRPC_ADDRESS, only with a different port
            DF_AI_SERVER_AISERVER_DATAFORGE_SERVER_GRPC_WEB_PORT: "8091"
            DF_AI_SERVER_AISERVER_DATAFORGE_SERVER_GRPC_WEB_TLS_ENABLE: "false"
            # Zabbix Database connection
            DF_AI_SERVER_AISERVER_ZABBIX_DATABASE_ADDRESS: "[REPLACE_ME]"
            DF_AI_SERVER_AISERVER_ZABBIX_DATABASE_NAME: "zabbix"
            DF_AI_SERVER_AISERVER_ZABBIX_DATABASE_PASSWORD: "[REPLACE_ME]"
            DF_AI_SERVER_AISERVER_ZABBIX_DATABASE_PORT: "5432"
            DF_AI_SERVER_AISERVER_ZABBIX_DATABASE_USER: "[REPLACE_ME]"
            DF_AI_SERVER_AISERVER_ZABBIX_DATABASE_DRIVER: "postgres"
            # Zabbix Server connection
            DF_AI_SERVER_AISERVER_ZABBIX_SERVER_ADDRESS: "[REPLACE_ME]"
            DF_AI_SERVER_AISERVER_ZABBIX_SERVER_TRAPPER_PORT: "10051"
            # AI Server configuration
            DF_AI_SERVER_AISERVER_STATE_FILE: "/usr/local/intellitrend/df/var/state.json" # Stores the time of the latest metric-data read from the Zabbix database
            DF_AI_SERVER_AISERVER_AI_SERVER_TOKEN: "[REPLACE_ME]" # AI Server token on the Zabbix server
            DF_AI_SERVER_AISERVER_CONFIG_POLL_INTERVAL: "15" # Interval in seconds in which the AI Server checks for new deployments
            DF_AI_SERVER_AISERVER_MODEL_PRIMING_DURATION: "60" # Warm-up time of the model in seconds
        volumes:
            - ./df-ai-runner/data/var/:/usr/local/intellitrend/df/var/:rw
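
Before the first start, it is a good idea to make sure the host directory behind the volume mount exists, since the AI Server persists its state file (state.json) there. Depending on the user the container runs as, you may also need to adjust the directory's permissions.

# Create the host directory that backs /usr/local/intellitrend/df/var/
mkdir -p ./df-ai-runner/data/var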

Running the AI Server

To start the AI Server, run the following commands. They pull the required Docker image from our registry and start the container in the background.

# Pull image
docker compose pull

# Start in background
docker compose up -d

# Check logs
docker compose logs -f df-ai-server

The AI Server should now connect to the DataForge Server, poll for AI deployments, and begin evaluating models.
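
If you later change values in docker-compose.yaml (for example the AI Server token or the database credentials), re-run docker compose up -d to recreate the container with the new configuration, and use docker compose ps to confirm the service is running.

# Confirm the service is up
docker compose ps

# Recreate the container after editing docker-compose.yaml
docker compose up -d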