Local LLM  ·  Enterprise AI  ·  On-Premise

Your AI.
Your Infrastructure.
Your Rules.

Moordock anchors large language models inside your own infrastructure — no cloud dependency, no data leakage, full enterprise-grade control. Built for D365, ERP, and mission-critical systems.

Request Early Access
View on GitHub →
$ moordock init --model llama3 --target d365
✓ Pulling model weights to local registry...
✓ Configuring D365 MCP server connector...
✓ Setting up on-premise inference endpoint...
✓ Zero data leaves your network.
$ moordock status
# Model: Llama 3 8B · Endpoint: localhost:11434 · D365: connected · Cloud: none
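The endpoint in the status line above is Ollama's default local REST API on port 11434. As a minimal sketch, a local prompt can be sent with nothing but the Python standard library (the helper names here are illustrative, not part of Moordock):

```python
import json
import urllib.request

# Ollama's default local endpoint (matches the `moordock status` output above).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    """Assemble the JSON body Ollama's /api/generate expects.
    stream=False asks for one complete JSON object instead of chunks."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_local(model: str, prompt: str) -> str:
    """POST the prompt to the local endpoint -- nothing leaves the network."""
    data = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires a running Ollama instance):
# print(ask_local("llama3", "Summarize open purchase orders over $10k."))
```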

Features
01 · local inference

Runs entirely on-premise

Models run on your hardware. Your prompts, data, and outputs never touch an external server.

02 · erp native

Built for D365 & AX

Native MCP server integration for Dynamics 365 and AX. AI-augmented workflows inside systems you already run.

03 · model agnostic

Any model, any size

Llama, Mistral, Qwen, DeepSeek — swap models without changing your integration layer.

04 · enterprise ready

HIPAA, GDPR, SOC 2 ready

Air-gapped deployments available. Designed for compliance-sensitive industries.
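The "model agnostic" point above can be made concrete: with a local runtime like Ollama, the model is a single configuration value, so swapping Llama for Mistral changes one string rather than the integration layer. A hypothetical sketch (the class and field names are illustrative, not Moordock's API):

```python
class LocalLLM:
    """Thin wrapper: the model name is configuration, not integration code."""

    def __init__(self, model: str, endpoint: str = "http://localhost:11434"):
        self.model = model
        self.endpoint = endpoint

    def request_body(self, prompt: str) -> dict:
        # The request shape is identical for every model the runtime serves.
        return {"model": self.model, "prompt": prompt, "stream": False}

# Swapping models is a one-line change; the calling code is untouched.
erp_assistant = LocalLLM("llama3")
erp_assistant = LocalLLM("mistral")   # drop-in replacement
```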


Use cases
01

AI-augmented ERP workflows

Embed LLM reasoning into D365 business logic — purchase orders, inventory, general-ledger coding — without exposing financials to the cloud.

02

Private document intelligence

Query contracts, SOPs, and technical manuals with natural language. All processing stays within your four walls.

03

Legacy system modernization

Retrofit AI into AX 2009/2012 environments. No cloud migration required — AI meets your system where it lives.

04

Agentic automation pipelines

Multi-step AI agents that orchestrate across your on-premise stack — fully auditable, fully contained, fully yours.
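"Fully auditable" in practice means every step's input and output is recorded on-site before the next step runs. A minimal sketch of that pattern, assuming toy step functions in place of real on-premise LLM calls (none of these names are Moordock's API):

```python
from datetime import datetime, timezone
from typing import Callable

def run_pipeline(steps: list[tuple[str, Callable[[str], str]]], payload: str):
    """Run named steps in order, appending one audit entry per step."""
    audit_trail = []
    for name, step in steps:
        result = step(payload)
        audit_trail.append({
            "step": name,
            "input": payload,
            "output": result,
            "at": datetime.now(timezone.utc).isoformat(),
        })
        payload = result  # each step feeds the next
    return payload, audit_trail

# Toy steps standing in for local model calls:
steps = [
    ("extract", lambda s: s.upper()),
    ("classify", lambda s: s + " [OK]"),
]
final, trail = run_pipeline(steps, "po-1042")
# final == "PO-1042 [OK]"; trail holds one timestamped entry per step,
# stored wherever your compliance policy requires -- inside your network.
```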


Tech stack
Runtime
Ollama / llama.cpp
Connector
C# / .NET 8
ERP
D365 / AX 2009–2012
Protocol
MCP (Model Context Protocol)
Languages
C#, X++, Python
Models
Llama 3, Mistral, Qwen

Ready to dock your AI on-premise?

Early access now open for enterprise D365 and AX environments.

Request Early Access