
Pull Request Title: n8n with Ollama

Description

This pull request adds a complete Docker-based setup that runs n8n behind a Tailscale sidecar for secure network access, with support for Ollama (local LLM inference) integration.

The update introduces:

  • A fully configured `docker-compose.yml` that securely proxies n8n through Tailscale.
  • A structured `.env` file for environment variables and Tailscale authentication.
  • A standard `config/serve.json` enabling HTTPS proxying via Tailscale Serve.
  • An updated `README.md` documenting setup, usage, and troubleshooting steps for n8n and Ollama.
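
For reference, a Tailscale Serve config that proxies HTTPS traffic to n8n's default port (5678) typically looks like the sketch below. This follows the documented `serve.json` shape for the `tailscale/tailscale` container (the `${TS_CERT_DOMAIN}` placeholder is expanded by Tailscale at runtime); the exact file in this PR may differ.

```json
{
  "TCP": {
    "443": { "HTTPS": true }
  },
  "Web": {
    "${TS_CERT_DOMAIN}:443": {
      "Handlers": {
        "/": { "Proxy": "http://127.0.0.1:5678" }
      }
    }
  }
}
```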

This configuration allows users to:

  • Self-host n8n securely on their Tailnet (no public exposure).
  • Connect n8n workflows with a local Ollama instance for AI-assisted automation.
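
The sidecar pattern described above can be sketched as the following `docker-compose.yml` fragment. This is an illustrative minimal version, not the PR's exact file: it assumes the standard `tailscale/tailscale` image environment variables (`TS_AUTHKEY`, `TS_SERVE_CONFIG`, `TS_STATE_DIR`) and shares the Tailscale network namespace with n8n via `network_mode`.

```yaml
services:
  tailscale:
    image: tailscale/tailscale:latest
    hostname: n8n                      # becomes the Tailnet hostname
    environment:
      - TS_AUTHKEY=${TS_AUTHKEY}       # auth key supplied via .env
      - TS_SERVE_CONFIG=/config/serve.json
      - TS_STATE_DIR=/var/lib/tailscale
    volumes:
      - ./config:/config
      - tailscale-state:/var/lib/tailscale
    devices:
      - /dev/net/tun:/dev/net/tun
    cap_add:
      - NET_ADMIN

  n8n:
    image: docker.n8n.io/n8nio/n8n
    network_mode: service:tailscale    # n8n is reachable only through the sidecar
    depends_on:
      - tailscale
    volumes:
      - n8n-data:/home/node/.n8n

volumes:
  tailscale-state:
  n8n-data:
```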

Related Issues


Type of Change

  • New feature
  • Documentation update
  • Bug fix
  • Refactoring

How Has This Been Tested?

The configuration was tested on Docker Desktop (v4.33+) and Ubuntu Server (22.04) with Tailscale v1.66.

  1. Compose Validation
    • Ran `docker compose config` to validate the YAML and environment variable resolution.
  2. Deployment
    • Executed `docker compose up -d` and verified that both the `n8n` and `tailscale` services start correctly.
  3. Connectivity
    • Confirmed that `https://<TAILSCALE-HOSTNAME>` loads the n8n web interface from within the Tailnet.
  4. Ollama Workflow
    • Verified that the n8n node successfully communicates with the local Ollama API endpoint (`http://ollama:11434`).
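
The Ollama check in step 4 boils down to a POST against Ollama's `/api/generate` endpoint, which is what an n8n HTTP Request node issues under the hood. The sketch below builds that request body in Python; `OLLAMA_URL` and the model name `llama3` are illustrative assumptions (the URL uses the Compose service name `ollama` and Ollama's default port 11434), not values taken from this PR.

```python
import json

# Assumed endpoint: Compose service name "ollama", Ollama's default port.
OLLAMA_URL = "http://ollama:11434/api/generate"

def build_generate_request(model: str, prompt: str, stream: bool = False) -> str:
    """Return the JSON body Ollama's /api/generate endpoint expects.

    stream=False asks Ollama for a single JSON response instead of a
    stream of partial chunks, which is easier to consume in a workflow.
    """
    return json.dumps({"model": model, "prompt": prompt, "stream": stream})

# Example body for a quick smoke test (model name is a placeholder):
body = build_generate_request("llama3", "Summarize this workflow run.")
```

Sending `body` to `OLLAMA_URL` with any HTTP client (or `curl -d "$body" http://ollama:11434/api/generate` from inside the Compose network) reproduces the connectivity check described above.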

Checklist

  • I have performed a self-review of my code
  • I have added configuration and verification tests for the setup
  • I have updated necessary documentation (e.g., frontpage `README.md`)
  • Any dependent changes (Ollama integration) have been merged and tested

Screenshots (if applicable)

asciicast


