Gald3r

Valhalla

Backend MCP server for teams

42 server-backed MCP tools: semantic memory search, Oracle DB, MediaWiki, platform crawling, video analysis, and cross-machine session continuity.

Docker · Self-hosted · v0.9.0

Quick Start

git clone https://github.com/wrm3/gald3r.git && cd gald3r/gald3r_docker && docker compose up -d

Features

What you get

Coming soon

Video analysis pipeline (frame extraction + vision) is in active development and not yet feature-complete. Track progress on GitHub.

  • Video analysis pipeline

42 MCP server tools

8 categories: Memory, Vault, Install, Crawling, Oracle, MediaWiki, Video, Utility.

Semantic memory search

pgvector-backed embeddings. Search across all session summaries and project captures.
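Conceptually, the ranking behind a pgvector-backed search looks like the sketch below. The embeddings, field names, and scoring function here are illustrative only; the real tool embeds queries via the OpenAI API and ranks rows inside Postgres with pgvector's `<=>` operator.

```python
# Illustrative sketch of the nearest-neighbor ranking pgvector performs
# for memory_search. Toy 2-D embeddings stand in for real model output.
import math

def cosine_distance(a, b):
    """What pgvector's <=> operator computes: 1 - cosine similarity."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / (norm_a * norm_b)

def memory_search(query_vec, sessions, limit=3):
    """Return the `limit` session summaries nearest to the query vector."""
    ranked = sorted(sessions, key=lambda s: cosine_distance(query_vec, s["embedding"]))
    return [s["summary"] for s in ranked[:limit]]
```

In SQL, the equivalent query is roughly `ORDER BY embedding <=> $1 LIMIT 3` against the summaries table.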

memory_capture_session

The AI self-reports a session summary to persistent memory after every conversation.

memory_context

Token-budgeted context block injected at session start — cross-machine memory.
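A token-budgeted block can be assembled by greedily packing the most recent summaries until the budget runs out. This is a sketch under stated assumptions — the 4-characters-per-token estimate and the newest-first policy are guesses at the behavior, not gald3r's actual implementation:

```python
# Illustrative token-budget packing for a memory_context-style block.
def estimate_tokens(text):
    """Rough heuristic: ~4 characters per token."""
    return max(1, len(text) // 4)

def memory_context(summaries, budget_tokens=500):
    """Pack the most recent summaries that fit within the token budget.

    `summaries` is ordered oldest-to-newest; we walk it newest-first,
    stop when the budget would be exceeded, then restore chronology.
    """
    block, used = [], 0
    for summary in reversed(summaries):
        cost = estimate_tokens(summary)
        if used + cost > budget_tokens:
            break
        block.append(summary)
        used += cost
    return "\n".join(reversed(block))
```

Budgeting up front keeps the injected block from crowding out the working context at session start.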

Oracle DB query/execute

Full Oracle DB read/write access via MCP tools. Source + target DB separation.

MediaWiki integration

Get, create, update wiki pages. Full-text search. Integrate team knowledge bases.
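Under the hood, full-text search against a MediaWiki site goes through the standard `api.php` action API (`action=query`, `list=search`). The parameters below are the real MediaWiki search-module names; how gald3r wraps the call is an assumption:

```python
# Hedged sketch of the request parameters behind MediaWiki full-text search.
# These are standard action-API params (https://<wiki>/api.php); gald3r's
# wrapper around them is assumed, not confirmed.
def build_search_params(term, limit=10):
    """Parameters for a GET to the wiki's api.php search endpoint."""
    return {
        "action": "query",
        "list": "search",
        "srsearch": term,   # the full-text query string
        "srlimit": limit,   # max results to return
        "format": "json",
    }
```

A client would send these as query-string parameters and read hits from `response["query"]["search"]`.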

Platform doc crawling

Weekly-refreshed docs for Cursor, Claude, Gemini — stored in pgvector for semantic search.

gald3r_install tool

Initialize gald3r in any project remotely via MCP tool call.

gald3r_health_report

Per-subsystem health score 0-100. Stale claims, blocked tasks, Firecrawl status.
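A per-subsystem 0-100 score can be modeled as a penalty rubric over the signals the card names. The weights below are invented for illustration; the real rubric lives in `gald3r_health_report`:

```python
# Illustrative aggregation for a 0-100 per-subsystem health score.
# Penalty weights are assumptions, not gald3r's actual rubric.
def subsystem_score(stale_claims, blocked_tasks, crawler_ok):
    """Start at 100 and deduct for each known problem signal, clamped to 0-100."""
    score = 100
    score -= 10 * stale_claims   # unverified claims going stale
    score -= 5 * blocked_tasks   # tasks stuck on a dependency
    if not crawler_ok:
        score -= 25              # crawler (e.g. Firecrawl) unreachable
    return max(0, min(100, score))

def health_report(subsystems):
    """Map each subsystem name to its clamped score."""
    return {name: subsystem_score(**signals) for name, signals in subsystems.items()}
```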

Video analysis pipeline

Frame extraction + transcript + vision analysis → vault notes.

Setup

docker-compose

The minimal stack is defined in gald3r_docker/docker-compose.yml:
# gald3r_docker/docker-compose.yml — minimal stack
services:
  gald3r-db:
    image: pgvector/pgvector:pg16
    environment:
      POSTGRES_USER: gald3r
      POSTGRES_PASSWORD: changeme
      POSTGRES_DB: gald3r
    volumes:
      - gald3r-db-data:/var/lib/postgresql/data
    ports:
      - "5432:5432"

  gald3r-mcp:
    build: ./gald3r_mcp
    environment:
      DATABASE_URL: postgres://gald3r:changeme@gald3r-db:5432/gald3r
      OPENAI_API_KEY: ${OPENAI_API_KEY}
    depends_on:
      - gald3r-db
    ports:
      - "7820:7820"

  gald3r-crawl4ai:
    image: unclecode/crawl4ai:latest
    ports:
      - "11235:11235"

volumes:
  gald3r-db-data:

Requirements

What you'll need

  • Docker Engine 24+ and Docker Compose v2

  • Postgres 16 with the pgvector extension (bundled as a service in the provided compose file)

  • OpenAI API key (used for memory embeddings — `OPENAI_API_KEY` env var)

  • ~2 GB free disk for Postgres volume; additional space if crawling many platform docs

  • An MCP-capable client: Cursor, Claude Code, Codex, or Gemini CLI

Agents

Specialized agents

g-agnt-project

g-agnt-project.md

Manages PROJECT.md, goals, and project identity

g-agnt-task-manager

g-agnt-task-manager.md

Owns TASKS.md and individual task files

g-agnt-code-reviewer

g-agnt-code-reviewer.md

Adversarial code review — separate from implementing agent

g-agnt-qa-engineer

g-agnt-qa-engineer.md

Bug filing, QA workflow, quality metrics

g-agnt-verifier

g-agnt-verifier.md

Independent verification of [🔍] items

g-agnt-infrastructure

g-agnt-infrastructure.md

DevOps, CI/CD, Docker, cloud infra

g-agnt-ideas-goals

g-agnt-ideas-goals.md

IDEA_BOARD → task promotion + goal tracking

g-agnt-test

g-agnt-test.md

L1/L2/L3 test plan creation and execution

g-agnt-pcac-coordinator

g-agnt-pcac-coordinator.md

Orchestrates cross-project PCAC operations

Workflows

How it works

Typical workflow

  1. Deploy Docker Stack

     docker compose up -d → Postgres, pgvector, gald3r API, crawl4ai

  2. Configure MCP

     Add the gald3r MCP server to your Cursor/Claude MCP config

  3. Capture Sessions

     memory_capture_session → session summaries persist to pgvector

  4. Search Memory

     memory_search → semantic query across all stored sessions

  5. Inject Context

     memory_context → token-budgeted block injected at next session start
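For the MCP-configuration step above, the client entry can be generated as JSON. The `mcpServers`/`url` shape matches common Cursor/Claude config files and the port comes from the compose file, but the exact schema and the `/mcp` path are assumptions — check your client's docs:

```python
# Hedged sketch of an MCP client config entry for the Docker-hosted server.
# The "mcpServers" key and "/mcp" path are assumptions; port 7820 is the
# one exposed in docker-compose.yml.
import json

def mcp_config_entry(host="localhost", port=7820):
    """Build an mcpServers entry pointing at the gald3r container."""
    return {"mcpServers": {"gald3r": {"url": f"http://{host}:{port}/mcp"}}}

print(json.dumps(mcp_config_entry(), indent=2))
```

Merge the printed object into the client's existing MCP config file rather than overwriting it.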

Valhalla Docker Stack

flowchart TD
  IDE([Cursor / Claude Code\nGemini / Codex]) -->|MCP protocol| G([gald3r MCP server\nDocker])
  G --> DB[(Postgres\n+ pgvector)]
  G --> C([crawl4ai\nplatform docs])
  G --> W([MediaWiki\nteam knowledge])
  G --> O[(Oracle DB\nread / write)]
  G --> V([Video analysis\nyt-dlp + vision])
  style IDE fill:#1e293b,stroke:#60a5fa,color:#60a5fa
  style G fill:#1e293b,stroke:#c9922a,color:#c9922a
  style DB fill:#1e293b,stroke:#a78bfa,color:#a78bfa
  style C fill:#1e293b,stroke:#6ee7b7,color:#6ee7b7
  style W fill:#1e293b,stroke:#6ee7b7,color:#6ee7b7
  style O fill:#1e293b,stroke:#f87171,color:#f87171
  style V fill:#1e293b,stroke:#94a3b8,color:#94a3b8

Mermaid diagram — paste into any Mermaid renderer to visualize