Protocol · Real vs Roadmap

What ships today. What's still a sketch. No lies.

Hackathon judging rewards honesty. A protocol that overclaims loses trust faster than it earns it. This page is the source of truth — what works on mainnet right now, and what's documented but still on the roadmap.

The principle

Every feature on this site falls into one of three buckets:

1. Real
2. Stubbed but contract-stable
3. Roadmap

The first means it works on Aristotle mainnet today. The second means the public-facing API is final, but the backend is a deterministic stub (judges can verify the contract is stable; production wiring lands without changing the API). The third means it's documented but not implemented yet.

What's real today (as of submission)

| Feature | Notes |
| --- | --- |
| 6 contracts deployed on 0G Aristotle mainnet | Forge state machine, Ingot ERC-721, RevenueSplitter, ContributionRegistry |
| @foundryprotocol/sdk on npm | 1.0.0-rc.1, frozen public surface |
| OpenAI-compatible inference proxy | /v1/chat/completions with streaming + /v1/models endpoint |
| Vercel AI SDK adapter | Works with generateText, streamText, generateObject |
| LangChain adapter | BaseChatModel-compatible; works with LCEL chains |
| Forge in Public dashboard | Reads from indexer; counters tick within 4s of on-chain events |
| Lineage Graph | Interactive radial tree of every minted Ingot and its parents |
| AI-assisted Forge wizard | Drafts modelSpec + evalSpec from a natural-language description |
| TEE attestation viewer | Animated visualization of the attestation lifecycle |
| Indexer | Watches all 6 contracts; fans out via WebSocket |
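Because the proxy speaks the standard OpenAI chat-completions wire format, any OpenAI-compatible client can target it by swapping the base URL. A minimal sketch of building such a request in TypeScript — `BASE_URL` is a placeholder, and the request-builder name is illustrative, not part of the SDK:

```typescript
// Sketch: constructing an OpenAI-compatible /v1/chat/completions request.
// BASE_URL below is hypothetical -- substitute the proxy's actual URL.

type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

interface ChatCompletionRequest {
  model: string;
  messages: ChatMessage[];
  stream?: boolean; // the proxy supports streaming per the table above
}

function buildChatRequest(
  model: string,
  prompt: string,
  stream = false
): ChatCompletionRequest {
  return { model, messages: [{ role: "user", content: prompt }], stream };
}

// Sending it is a plain POST (not executed here):
// const res = await fetch(`${BASE_URL}/v1/chat/completions`, {
//   method: "POST",
//   headers: { "content-type": "application/json" },
//   body: JSON.stringify(buildChatRequest("some-model", "hello")),
// });
```

The same shape is what the Vercel AI SDK and LangChain adapters emit under the hood, which is why both work against the proxy unmodified.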

What's on the roadmap

| Feature | Notes |
| --- | --- |
| 0G Compute TEE integration | Non-TEE fallback in place; production TEE landing post-hackathon |
| Reforging | Forge contract supports it; UI wizard ships in v1.1 |
| Permissionless eval coordinators | Currently one foundry-operated coordinator; opening up in v1.2 |
| Smith profiles with rich activity | Profile page exists; activity feed lands when indexer adds more events |
| OG card per Ingot/Smith | Generic OG card live; per-entity programmatic cards land mid-Sprint 4 |
| zk-proof of LOO attribution | Research track. End-state is replacing the TEE attestation with a SNARK |
| Multi-chain deployment | Aristotle-only for v1. Base + Arbitrum tracked |
| Foundry CLI | Documented; binary ships in 1.0.0 |
| Audit | Self-review documented at /docs/contracts; external audit budgeted for v1.0 |

Hackathon submission state

Honest submission

The submission demonstrates the full protocol loop on mainnet with the inference backend running as a deterministic stub. The stub returns the canonical OpenAI-compatible response shape and exercises every code path except the actual 0G Compute dispatch and on-chain revenue deposit.
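A deterministic stub of this kind can be sketched in a few lines: derive every varying field from a hash of the request, so identical requests always produce byte-identical, canonically shaped responses. All names and internals here are illustrative assumptions, not the submission's actual stub:

```typescript
// Sketch of a deterministic chat-completion stub (illustrative only).
// Everything is derived from the input, so the same request always
// yields the same response -- which is what makes the stub verifiable.
import { createHash } from "node:crypto";

type ChatMessage = { role: string; content: string };

function stubChatCompletion(model: string, messages: ChatMessage[]) {
  const seed = createHash("sha256")
    .update(JSON.stringify({ model, messages }))
    .digest("hex");
  return {
    id: `chatcmpl-${seed.slice(0, 12)}`, // id derived from the request hash
    object: "chat.completion" as const,
    created: 0, // fixed timestamp keeps responses byte-identical
    model,
    choices: [
      {
        index: 0,
        message: { role: "assistant", content: `stub:${seed.slice(0, 8)}` },
        finish_reason: "stop" as const,
      },
    ],
    usage: { prompt_tokens: 0, completion_tokens: 0, total_tokens: 0 },
  };
}
```

Because the response matches the canonical OpenAI shape, every downstream consumer (SDK, adapters, dashboard) exercises its real parsing path against the stub.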

Why the stub: 0G Compute's TEE-attested training is still maturing on Aristotle, and the eval coordinator's full LOO loop requires a live model trained under attestation. Both ship post-hackathon. The judges' ability to verify the protocol works does not depend on either — every contract is deployed, every state transition is exercised by the SDK, and the inference HTTP surface is the same shape it will be in prod.