Get Started
Sign up, create your first project, and send your first trace in five minutes.
This guide walks you from a blank Muster install to a trace visible in the UI. You'll need a running Muster instance — either the public hosted version or your own self-hosted deployment (see Self-hosting).
Step 1 — Create a project and grab API keys
- Log in to Muster (the URL is whatever your operator gave you, or `http://localhost:3000` for a local install).
- Create an organization, then create a project inside it.
- Open Settings → API Keys in the project sidebar.
- Click Create new API key.
- Copy both keys somewhere safe:
  - Public key (`pk-lf-...`) — used in client SDKs
  - Secret key (`sk-lf-...`) — used in server SDKs; never expose it to browsers

The secret key is shown only once. If you lose it, rotate the key and create a new one.
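Rather than hardcoding keys in source, you can keep them in environment variables; the upstream Langfuse SDKs read `LANGFUSE_PUBLIC_KEY`, `LANGFUSE_SECRET_KEY`, and `LANGFUSE_HOST` by convention. A minimal sketch (the key values below are placeholders, not real keys):

```python
import os

# Placeholder values for illustration only; in a real deployment, export these
# from your shell or a secrets manager instead of setting them in code.
os.environ.setdefault("LANGFUSE_PUBLIC_KEY", "pk-lf-...")
os.environ.setdefault("LANGFUSE_SECRET_KEY", "sk-lf-...")
os.environ.setdefault("LANGFUSE_HOST", "http://localhost:3000")

# Read the keys back the same way a server-side app would.
public_key = os.environ["LANGFUSE_PUBLIC_KEY"]
secret_key = os.environ["LANGFUSE_SECRET_KEY"]
host = os.environ["LANGFUSE_HOST"]
```

Keeping the secret key out of source control also makes rotation painless: update the variable, restart the app.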
Step 2 — Install the SDK
Pick the language you're instrumenting.
Python:

```bash
pip install langfuse
```

TypeScript / JavaScript:

```bash
npm install langfuse
```

The SDKs are the same packages used by upstream Langfuse — Muster speaks the identical ingestion API.
Step 3 — Send a trace
Python:
```python
from langfuse import Langfuse

langfuse = Langfuse(
    public_key="pk-lf-...",
    secret_key="sk-lf-...",
    host="http://localhost:3000",  # your Muster URL
)

trace = langfuse.trace(
    name="user_question",
    user_id="user_123",
    metadata={"environment": "production"},
)

trace.generation(
    name="llm_response",
    model="gpt-4",
    input="What is AI?",
    output="AI stands for Artificial Intelligence...",
)

langfuse.flush()
```

TypeScript:
```typescript
import { Langfuse } from "langfuse";

const langfuse = new Langfuse({
  publicKey: "pk-lf-...",
  secretKey: "sk-lf-...",
  baseUrl: "http://localhost:3000", // your Muster URL
});

const trace = langfuse.trace({
  name: "user_question",
  userId: "user_123",
  metadata: { environment: "production" },
});

trace.generation({
  name: "llm_response",
  model: "gpt-4",
  input: "What is AI?",
  output: "AI stands for Artificial Intelligence...",
});

await langfuse.flush();
```

`flush()` blocks until queued events are delivered. Don't skip it in short-lived scripts — the SDK batches by default, and a process can exit before the buffer sends.
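Why flushing matters can be sketched with a toy buffer. This is an illustrative model of batching, not the real SDK's internals:

```python
class TinyBatcher:
    """Toy model of an SDK event buffer; illustrative only, not the real SDK."""

    def __init__(self):
        self._buffer = []      # events queued locally, not yet sent
        self.delivered = []    # stands in for events the server has received

    def enqueue(self, event):
        # Real SDKs queue events like this and send them in the background.
        self._buffer.append(event)

    def flush(self):
        # Deliver everything still buffered; call this before the process exits.
        self.delivered.extend(self._buffer)
        self._buffer.clear()

batcher = TinyBatcher()
batcher.enqueue({"name": "user_question"})
# Nothing has been "sent" yet; a script that exits here would lose the event.
batcher.flush()
# After flush(), the buffered event has been delivered.
```

Long-running servers usually don't need explicit flushes on every request, because the background sender catches up; one-shot scripts and serverless handlers do.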
Step 4 — View the trace
Open the Traces page in the project sidebar. Your trace should appear within a few seconds. Click into it to see the nested observation, input, output, and timing.
If nothing appears:
- Check the SDK logs for HTTP errors (4xx usually means the API key is wrong; 5xx usually means the server-side ingestion worker isn't running).
- Confirm the `host`/`baseUrl` matches your Muster URL exactly, including protocol.
- Self-hosters: confirm the `muster-worker` container is running. Traces are ingested asynchronously — if the worker is down, traces never arrive.
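A quick way to separate "server unreachable" from "worker down" is to probe the server over HTTP first. A minimal sketch, assuming your Muster deployment exposes the same public health endpoint as upstream Langfuse (`/api/public/health`) and runs at `http://localhost:3000`:

```python
import urllib.error
import urllib.request

def health_url(base_url: str) -> str:
    # Assumes Muster serves Langfuse's public health endpoint at this path.
    return base_url.rstrip("/") + "/api/public/health"

def check_server(base_url: str, timeout: float = 5.0) -> bool:
    """Return True if the server answers the health check with HTTP 200."""
    try:
        with urllib.request.urlopen(health_url(base_url), timeout=timeout) as resp:
            return resp.status == 200
    except urllib.error.URLError:
        # Connection refused, DNS failure, timeout, etc.
        return False

if __name__ == "__main__":
    ok = check_server("http://localhost:3000")  # your Muster URL
    print("server reachable" if ok else "server unreachable")
```

If this check passes but traces still don't appear, the web server is up and the problem is most likely on the ingestion side (wrong keys or a stopped worker).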
What's next
- Read Concepts to understand the data model: trace, observation, score, session, dataset.
- Once you have traces flowing, set up Agent Inventory to register the agents producing those traces.
- If you also want Muster to find agents you didn't instrument, configure the Discovery Engine.