Self-hosted AI Starter Kit - AI Model Serving Tool

Overview

An open-source Docker Compose template that sets up a local AI and low-code development environment. Curated by n8n, it integrates n8n, Ollama, Qdrant, and PostgreSQL for secure, self-hosted AI workflows.

Key Features

  • Docker Compose template to orchestrate local AI and supporting services
  • Integrated self-hosted n8n low-code automation platform
  • Ollama integration for running local large language models
  • Qdrant vector database for embeddings and similarity search
  • PostgreSQL for structured data and workflow persistence
  • Curated by n8n with ready-to-use service wiring
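The service wiring above can be sketched as a minimal docker-compose.yml. This is an illustrative assumption, not the kit's actual file: the image names and ports shown are the upstream defaults for each project, and the environment variable names follow n8n's documented PostgreSQL settings.

```yaml
# Illustrative sketch only -- the starter kit's real compose file differs.
services:
  postgres:
    image: postgres:16
    environment:
      POSTGRES_USER: ${POSTGRES_USER}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
      POSTGRES_DB: ${POSTGRES_DB}
    volumes:
      - postgres_data:/var/lib/postgresql/data

  n8n:
    image: n8nio/n8n
    ports:
      - "5678:5678"            # n8n UI, default port
    environment:
      DB_TYPE: postgresdb      # persist workflows in PostgreSQL
      DB_POSTGRESDB_HOST: postgres
    depends_on:
      - postgres

  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"          # Ollama HTTP API, default port

  qdrant:
    image: qdrant/qdrant
    ports:
      - "6333:6333"            # Qdrant REST API, default port

volumes:
  postgres_data:
```

Because all four services share one Compose network, n8n workflows can reach the others by service name (e.g. `http://ollama:11434`) without exposing anything beyond the mapped ports.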

Ideal Use Cases

  • Local AI prototyping and testing with self-hosted components
  • Private, on-premises AI workflows that avoid external cloud services
  • Low-code automation connecting local LLMs to databases and services
  • Evaluating and comparing local LLMs without external API calls

Getting Started

  • Clone the GitHub repository to your machine
  • Install Docker and Docker Compose if not already present
  • Copy or review the provided docker-compose.yml and environment files
  • Adjust environment variables for database and model paths as needed
  • Start services with docker compose up -d (or legacy docker-compose up -d) and monitor logs
  • Access n8n UI and other service endpoints locally
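The steps above condense to a short command sequence. The repository URL and the .env.example filename are assumptions based on common n8n conventions; check the project's GitHub page for the exact names.

```shell
# Clone the starter kit (URL assumed; verify against n8n's GitHub org)
git clone https://github.com/n8n-io/self-hosted-ai-starter-kit.git
cd self-hosted-ai-starter-kit

# Copy the example environment file, then edit database credentials
# and model paths to taste (filename assumed)
cp .env.example .env

# Start all services in the background and follow their logs
docker compose up -d
docker compose logs -f

# The n8n UI is typically served at http://localhost:5678
```

Requires Docker with the Compose plugin installed; on first start, image pulls and model downloads can take several minutes, so watch the logs before opening the UI.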

Pricing

None; the kit is free and open source, distributed as a GitHub template with no paid tiers disclosed.

Key Information

  • Category: Model Serving
  • Type: AI Model Serving Tool