OBTO
Open Source Democracy

The Glass Box AI Platform.
Stop renting black boxes — start owning your intelligence.

Build with Claude, GPT, Codex, or any model. Deploy to our cloud — or your own Kubernetes cluster. OBTO is the open, self-hostable platform for Agentic AI. Full observability, zero vendor lock-in, and radical price transparency.

No seat taxes. Pay only for the compute and tokens your agents use.

Any LLM — Claude, GPT, Codex, Qwen
Self-Hostable on your Kubernetes
MCP-Native from day one
Full-Stack — not just React frontends

Not locked to one model. Not locked to one cloud. Not locked to one vendor.

The Glass Box Architecture

1. Observe

Trace every prompt, logic branch, and token cost in real time. No hidden operations.

2. Connect via MCP

Standardize how your models talk to your data using the open Model Context Protocol (MCP).
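Under the hood, MCP is JSON-RPC 2.0: a client invokes a server-hosted tool with a `tools/call` request. A minimal sketch of that wire shape follows; the tool name `lookup_ticket` and its arguments are purely illustrative, not a built-in OBTO tool.

```python
import json

def mcp_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 `tools/call` request, the MCP method an
    AI client sends to invoke a tool on an MCP server."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical tool name, for illustration only.
payload = mcp_tool_call(1, "lookup_ticket", {"ticket_id": "INC-1234"})
print(payload)
```

Because the envelope is an open standard, the same request works against any conforming MCP server, whichever model sits on the client side.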

3. Orchestrate

Build autonomous Agentic workflows with visual guardrails and versioned deployments.

4. Port Anywhere

Containerized by default. Seamlessly migrate your entire AI runtime from our SaaS to your infrastructure.
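"Containerized by default" means the runtime ships as an ordinary container image you can schedule yourself. A hedged sketch of what self-hosting could look like as a standard Kubernetes Deployment; the image name, namespace, and replica count here are illustrative placeholders, not official OBTO artifacts.

```yaml
# Hypothetical manifest: image, namespace, and replicas are illustrative.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: obto-runtime
  namespace: ai-platform
spec:
  replicas: 2
  selector:
    matchLabels:
      app: obto-runtime
  template:
    metadata:
      labels:
        app: obto-runtime
    spec:
      containers:
        - name: runtime
          image: registry.example.com/obto/runtime:latest
          ports:
            - containerPort: 8080
```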

What can you build with transparent AI?

IT Helpdesk That Resolves Itself

Deploy an agent that reads tickets, queries your CMDB, and resolves common ITSM requests — without human intervention.

MCP Servers Your Whole Team Uses

Build, host, and version observable MCP tools that Claude, GPT, or Codex can call — all from one deployment target.

Spreadsheets That Clean Themselves

Extract, normalize, and load messy data using LLMs as the transformation layer. No brittle ETL scripts.
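One way to use an LLM as the transformation layer is to hand it the messy row plus a target schema and demand JSON back. The sketch below only builds the chat messages for an OpenAI-compatible endpoint; actually sending them is left to the caller, and the column names and schema are hypothetical.

```python
import json

def build_normalize_messages(raw_row: dict, schema: dict) -> list:
    """Build chat messages asking an LLM to map a messy spreadsheet row
    onto a target schema. Dispatching them to an OpenAI-compatible
    /v1/chat/completions endpoint is left to the caller."""
    system = (
        "You normalize tabular data. Return only a JSON object whose "
        f"keys and types match this schema: {json.dumps(schema)}"
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": json.dumps(raw_row)},
    ]

# Hypothetical columns, for illustration.
messages = build_normalize_messages(
    {"Cust Name": "  acme corp ", "signup": "03/07/24"},
    {"customer_name": "string", "signup_date": "YYYY-MM-DD"},
)
```

The schema lives in the prompt rather than in a brittle per-column script, so adding a field means editing one dict, not rewriting an ETL job.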

Customer Portal With AI Assistants

Role-based access portals where each user gets a tailored AI assistant scoped to their data and permissions.

Live Token Cost & Audit Dashboard

Real-time Glass Box dashboard showing every agent decision, token spend, and API call — no hidden operations.

Multi-LLM Collaborative Workflows

Claude architects the backend, Codex extends the frontend — in the same app, on the same platform. Try that anywhere else.

Cure your SaaS fatigue

The Hybrid-Native Flex

Start fast on our managed cloud. When compliance or cost demands it, shift your workloads to your own Kubernetes infrastructure with zero code changes.

Open Standard Integrations

Connect to ServiceNow, databases, and internal APIs using the Model Context Protocol. No more brittle, proprietary vendor connectors.

No Data Hostages

You built the workflow; it belongs to you. We provide the engine and the built-in storage, but you retain full ownership of your IP and logic.

We believe in Open Source Democracy

True intelligence shouldn't be hidden behind a proprietary API. OBTO forces AI to show its work, giving enterprises the observability they need to trust autonomous agents in production.

30M+
Monthly requests served in production
150+
Institutions onboarded on OBTO
7+
Years of production-grade infrastructure

The OBTO Ecosystem

SOFOS
Education

Transparent, AI-assisted learning and tutoring flows for universities.

Pelatis
ITSM

Agentic copilot for ServiceOps. Resolve IT tickets with fully audited actions.

MCP Tool Builder
Developers

Design, publish, and observe MCP tools for any LLM. Version-controlled.

OBTO Platform
PaaS

The core runtime to build, monitor, and scale transparent AI apps.

Platform Highlights

Multi-Model Agnostic

Run OpenAI, Groq, Ollama, and open-weight models side-by-side.

Agentic Orchestration

Compose tool-using agents with strict, auditable guardrails.

MCP Native

Built from the ground up to support the Model Context Protocol.

Observable Runtime

Secure execution environments with built-in logging and evals.

Deploy Agents You Can Actually Trust

Design purpose-built agents for research, automation, and ETL. Deploy them with versioned releases, enforce strict policy guardrails, and track every action with the Glass Box dashboard.
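The guardrail-plus-audit pattern described above can be reduced to a small sketch: every tool call is checked against a policy allowlist and recorded, whether it was permitted or blocked. The allowlist contents and the in-memory log are stand-ins for a real policy engine and trace store.

```python
import functools
import time

AUDIT_LOG = []  # stand-in for a persistent Glass Box trace store
ALLOWED_TOOLS = {"search_docs", "create_ticket"}  # hypothetical policy

def guarded(tool_name):
    """Block tools outside the policy allowlist and audit every call."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            if tool_name not in ALLOWED_TOOLS:
                AUDIT_LOG.append({"tool": tool_name, "allowed": False, "ts": time.time()})
                raise PermissionError(f"policy blocks tool: {tool_name}")
            result = fn(*args, **kwargs)
            AUDIT_LOG.append({"tool": tool_name, "allowed": True, "ts": time.time()})
            return result
        return wrapper
    return decorator

@guarded("create_ticket")
def create_ticket(summary: str) -> str:
    return f"ticket created: {summary}"
```

Blocked calls are still logged, so the audit trail shows what an agent attempted, not just what it was allowed to do.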

The Infrastructure of Intelligence

A complete, opinionated runtime for Agentic apps. Designed for the enterprise that wants speed today, and total architectural ownership tomorrow.

AI Workflow Builder
Orchestrate tools, models, and logic explicitly.
Glass Box Auditing
Granular telemetry, traces, and rate limits built-in.
MCP Tool Hosting
Design, host, and version MCP servers effortlessly.
Hybrid Deployment
Run in our cloud or self-host on your K8s clusters.
Enterprise Security
Secrets vault, RBAC, and strict policy engines.
Transparent Pricing
True utility billing. See exactly what each agent costs.
Look inside the Glass Box

Sign in and get your personal MCP endpoint in under a minute. Connect your AI client and start building immediately.

Get your MCP endpoint →

No seat licenses. No setup. Pay only for what your agents use.

Why OBTO?

Visibility

The "Glass Receipt" means you never have to wonder why an agent made a decision or what it cost.

Standardization

Embrace MCP to break free from proprietary tool-calling silos.

Portability

Containerized architecture means your workflows can migrate in-house seamlessly.

Reliability

Policy sandboxing and deep audit trails protect your enterprise data.

Radical Price Transparency

No arbitrary seat licenses. No hidden model markups. You pay strictly for the compute and tokens your agents consume. Start free, scale predictably.

Builder
$0/mo
Prove the concept
  • ✓ Glass Box visual builder
  • ✓ 100K tokens/mo
  • ✓ 1 active MCP server
  • ✓ Community support
Start free
Team
Utility Billing
$49/mo
Scale your agents
  • ✓ Advanced observability traces
  • ✓ 10M tokens/mo pooled
  • ✓ SSO, roles & audit trails
  • ✓ Up to 10 Agentic workflows
Get Team
Enterprise
Custom pricing
Total architectural control
  • ✓ Self-hosted / On-prem porting
  • ✓ Bring your own models (BYOM)
  • ✓ SLA & dedicated support
  • ✓ Custom MCP integrations
Contact sales

Frequently asked

How is pricing calculated?

We meter tokens, requests, and storage like a true utility. You get a "Glass Receipt" itemizing which models were called and exactly what each call cost.
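Token metering itself is simple arithmetic: per-model input and output token counts multiplied by per-model rates. A minimal sketch of such an itemized receipt; the model name and per-1K-token rates are made-up example figures, not OBTO's actual tariff.

```python
def glass_receipt(usage: list, rates: dict) -> dict:
    """Itemize cost per model from token usage.
    `rates` holds hypothetical USD prices per 1K tokens."""
    receipt = {}
    for item in usage:
        rate = rates[item["model"]]
        cost = (item["input_tokens"] * rate["input"]
                + item["output_tokens"] * rate["output"]) / 1000
        receipt[item["model"]] = round(receipt.get(item["model"], 0.0) + cost, 6)
    receipt["total"] = round(sum(receipt.values()), 6)
    return receipt

# Example figures only.
rates = {"model-a": {"input": 0.003, "output": 0.015}}
usage = [{"model": "model-a", "input_tokens": 2000, "output_tokens": 500}]
print(glass_receipt(usage, rates))  # → {'model-a': 0.0135, 'total': 0.0135}
```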

Can I host on-prem to avoid SaaS lock-in?

Yes. Our Enterprise plan is built for hybrid deployment. You can port the runtime entirely to your own private cloud or bare-metal servers.

Which models and tools are supported?

Any OpenAI-compatible endpoint, plus local open-weight models. We use the Model Context Protocol (MCP) to standardize tool connections.
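"OpenAI-compatible" means the same request shape works against any conforming base URL, including a local server. The sketch below builds (but does not send) such a request with only the standard library; the URL and model name are illustrative, e.g. a local Ollama instance.

```python
import json
import urllib.request

def chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Build a POST request for any OpenAI-compatible
    /chat/completions endpoint. Sending it is left to the caller."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/chat/completions",
        data=body,
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# Illustrative local endpoint and model name.
req = chat_request("http://localhost:11434/v1", "unused", "qwen2.5", "Hello")
# urllib.request.urlopen(req) would dispatch it; omitted here.
```

Swapping providers is then a matter of changing `base_url` and `model`, not rewriting the integration.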

Is the "Glass Box" secure?

Visibility doesn't mean vulnerability. We never train on your data, all connections are encrypted, and we enforce strict Role-Based Access Controls.

Stop renting. Start owning.

Deploy your first observable Agentic workflow today. Scale transparently tomorrow.