HorniLab — Open Research Platform for Embodied AI in Minecraft

Can three people and a swarm of AI agents build a real Vision-Language-Action (VLA) agent?

About HorniLab

HorniLab is an open-source research platform using Minecraft as a persistent 3D world for autonomous embodied AI agents, telemetry, memory, and human-AI interaction experiments.

We fine-tune Qwen 3.5 for reflexes, use Gemini 3.1 Flash Lite for planning, and deploy both into a live Minecraft world. Every step documented — failures, costs, breakthroughs.

Minecraft gives us a cheap, persistent, multi-agent 3D world with terrain, physics, inventory, construction, and real humans in the loop. Instead of inventing a simulator from scratch, we use an environment people already inhabit and continuously reshape.

Key Research Directions

  • Self-healing infrastructure — AI agents monitor server health, patch configurations, diagnose failures, and restore services without waiting for a human.
  • Oneiro — embodied agent R&D — An experimental Vision-Language-Action direction built around Minecraft frames, spatial reasoning, action loops, and agent autonomy research.
  • Persistent memory and logs — Qdrant-backed memory, decision history, and world context for long-horizon experiments, replay, and future dataset creation (a minimal sketch follows this list).
  • AI agent swarm — Agents that generate code, content, UI, and internal tooling. They are part of how we build the platform itself.
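
A minimal sketch of what the Qdrant-backed memory mentioned above could look like in TypeScript. The collection name, payload fields, vector size, and the placeholder embed() helper are illustrative assumptions, not the actual HorniRAG implementation.

    // Sketch only: the collection name, payload shape, and embed() helper are
    // assumptions, not the real HorniRAG schema. Assumes an "agent_memory"
    // collection already exists with 384-dimensional vectors.
    import { QdrantClient } from "@qdrant/js-client-rest";
    import { randomUUID } from "node:crypto";

    const qdrant = new QdrantClient({ url: "http://localhost:6333" });

    // Placeholder embedder standing in for a real embedding model.
    async function embed(text: string): Promise<number[]> {
      const vec = new Array(384).fill(0);
      for (let i = 0; i < text.length; i++) vec[i % 384] += text.charCodeAt(i) / 1000;
      return vec;
    }

    // Store one decision-history entry so the agent can recall it in later sessions.
    async function remember(agentId: string, text: string): Promise<void> {
      await qdrant.upsert("agent_memory", {
        points: [{
          id: randomUUID(),
          vector: await embed(text),
          payload: { agentId, text, ts: Date.now() },
        }],
      });
    }

    // Recall the memories most relevant to the current situation.
    async function recall(agentId: string, query: string, limit = 5) {
      const hits = await qdrant.search("agent_memory", {
        vector: await embed(query),
        limit,
        filter: { must: [{ key: "agentId", match: { value: agentId } }] },
      });
      return hits.map((hit) => hit.payload);
    }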

Oneiro — First VLA Agent on a Live Server

An autonomous AI player. A Dreamer living in the void between server ticks. It sees the world through render frames at 10 FPS, thinks via Qwen 3.5 + Gemini, and acts like a human — with keyboard and mouse.
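
One way to picture that split is a fast reflex loop targeting roughly 10 FPS next to a slower planning loop that periodically rewrites the current goal. The sketch below only illustrates the shape of such a loop; every type and function name in it (World, ReflexModel, PlannerModel, runAgent) is a placeholder, not Oneiro's real interface.

    // Illustrative dual-loop agent: fast reflexes plus slow planning.
    // Every name here is a placeholder, not Oneiro's actual API.

    type Frame = Uint8Array;   // one encoded render frame
    type Action = { keys: string[]; mouse: { dx: number; dy: number; click?: boolean } };

    interface World {
      captureFrame(): Promise<Frame>;    // grab the current render frame
      applyInput(action: Action): void;  // emulate keyboard and mouse
    }

    interface ReflexModel {
      act(frame: Frame, goal: string): Promise<Action>;  // e.g. the fine-tuned Qwen model
    }

    interface PlannerModel {
      plan(recentFrames: Frame[]): Promise<string>;      // e.g. Gemini with long context
    }

    async function runAgent(world: World, reflex: ReflexModel, planner: PlannerModel) {
      let goal = "explore";
      const recent: Frame[] = [];

      // Slow loop: re-plan every few seconds using broader context.
      setInterval(async () => {
        goal = await planner.plan(recent.slice(-30));
      }, 5_000);

      // Fast loop: one reflex action per frame, targeting ~10 FPS (100 ms per step).
      while (true) {
        const start = Date.now();
        const frame = await world.captureFrame();
        recent.push(frame);
        if (recent.length > 300) recent.shift();  // keep roughly 30 s of frames
        world.applyInput(await reflex.act(frame, goal));
        await new Promise((r) => setTimeout(r, Math.max(0, 100 - (Date.now() - start))));
      }
    }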

What Google Cloud Unlocks

  • 3M frames of training data — The Weaver pipeline records render frames, player actions, and world state into a structured dataset for VLA model fine-tuning (a rough record sketch follows this list).
  • Dual-agent architecture — Reflexes (Qwen 3.5 @ 10 FPS) + strategic thinking (Gemini 3.1 Flash Lite with 1M context). Two brains — one agent.
  • Self-healing sandbox — AI agents autonomously repair the infrastructure: monitoring, failure diagnostics, config patching, service recovery — 24/7.
  • Persistent memory (HorniRAG) — Qdrant + embeddings: the agent remembers past lives, recognizes players, and accumulates experience across sessions.
  • Open-source & transparency — All code, dataset specs, architecture decisions, and experiment logs are open. Research, not a black box.
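
As an illustration of the dataset bullet above, one plausible shape for a single Weaver record is sketched below. The field names and storage layout are assumptions, not the published dataset spec; the scale check assumes frames are captured at the same 10 FPS the reflex loop runs at.

    // Rough sketch of one Weaver training record. Field names are assumptions,
    // not the published dataset spec.

    interface WeaverRecord {
      sessionId: string;   // one play or agent session
      tick: number;        // server tick at capture time
      frameRef: string;    // pointer to the render frame, e.g. an object in Cloud Storage
      action: {
        keys: string[];                                     // keys held during this step
        mouse: { dx: number; dy: number; click?: boolean };
      };
      worldState: {
        position: [number, number, number];
        health: number;
        inventory: Record<string, number>;   // item id -> count
      };
    }

    // Scale check: 3,000,000 frames at 10 FPS is roughly 83 hours of play.
    const hoursOfPlay = 3_000_000 / 10 / 3600;   // ≈ 83.3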

Roadmap

  1. Dec 2024 — The beginning: A small Minecraft server. No infrastructure, no agents — just an idea.
  2. Jul 2025 — First public launch: Dedicated servers, web frontend, backend, OAuth authentication, and first players.
  3. Dec 2025 — Automation & AI agents: System automation, OpenClaw agents integrated into the development workflow.
  4. Jul 2026 — First VLA agent: Fine-tuned Qwen 3.5 deployed as an embodied agent. Scaling the research platform.

Team

Three humans, an AI agent swarm, and a lot of improvised infrastructure.

  • Cokeef — Founder · Infrastructure & product. Builds the platform, keeps the stack alive.
  • Halva — Co-Founder · AI & engineering. Works on agent logic and technical direction.
  • Arhitector — Lore, server identity, atmosphere. Shapes the world and narrative feel.

AI Team Members

  • Oneiro — Minecraft embodied agent for visual input, spatial reasoning, and autonomous behavior research.
  • Coder & Designer — OpenClaw build agents that produce code, assets, and internal tools.

Technology Stack

  • Google Cloud Platform (Vertex AI, Cloud Storage, Compute Engine)
  • Qwen 3.5 (35B-A3B MoE) — real-time VLA reflexes
  • Gemini 3.1 Flash Lite — strategic reasoning with 1M context window
  • Qdrant — persistent vector memory (HorniRAG)
  • Bun + Hono (TypeScript backend; see the sketch after this list)
  • React + Vite (frontend)
  • MariaDB — primary database
  • Docker — containerized infrastructure
  • WireGuard VPN — secure multi-server mesh
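
For a sense of how the backend pieces fit together, here is a minimal Bun + Hono service sketch. The route and port are assumptions, not the real HorniLab API.

    // Minimal sketch of the kind of Bun + Hono service used on the backend.
    // Route name and port are assumptions, not the real HorniLab API.
    import { Hono } from "hono";

    const app = new Hono();

    // Liveness endpoint that the self-healing agents could poll.
    app.get("/health", (c) => c.json({ ok: true, ts: Date.now() }));

    export default {
      port: 3000,        // assumed port
      fetch: app.fetch,  // Bun picks this up and serves the app
    };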

Contact

For collaboration inquiries: ip@horni.cc

Discord: discord.gg/Zw3tQkCSZN

Telegram: t.me/HorniMine