Features
Privacy-first
Your data stays in your environment. GDPR compliant.
One-line setup
A single line of code to instrument all your LLM calls.
Auto-instrumentation
Automatic support for the OpenAI SDK; Anthropic and Gemini are coming soon.
Flexible backends
Local JSONL export or real-time Google Sheets sync.
Quick Start
```python
from hilt import instrument, uninstrument
from openai import OpenAI

# Enable automatic logging
instrument(backend="local", filepath="logs/chat.jsonl")

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)

# Stop logging when done
uninstrument()
```

After calling `instrument()`, every prompt and completion is logged automatically, along with latency, token counts, cost, and status codes.
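Because the local backend is plain JSONL, each event is one JSON object per line in the file you passed to `instrument()`. As a quick sanity check, here is a minimal sketch that reads the most recent event back; it assumes the default field names listed in the next section.

```python
import json

# The local backend appends one JSON object per line to logs/chat.jsonl.
with open("logs/chat.jsonl") as f:
    last_event = json.loads(f.readlines()[-1])

# Field names follow the columns table below.
print(last_event["model"], last_event["tokens_in"], last_event["tokens_out"], last_event["cost_usd"])
```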
Available columns
Select the columns you need for compliance or dashboards; a pandas sketch for slicing the local JSONL export follows the table.
| Column | Description |
|---|---|
| timestamp | ISO timestamp (UTC) |
| conversation_id | Stable thread identifier |
| event_id | Unique event UUID |
| speaker | Message source: `human` or `agent` |
| action | Event type: `prompt`, `completion`, or `system` |
| message | Normalized message content (truncated to 500 chars) |
| tokens_in | Prompt tokens |
| tokens_out | Completion tokens |
| cost_usd | Cost in USD (6 decimal places) |
| latency_ms | Wall-clock latency |
| model | Provider/model label |
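For dashboards or compliance exports built on the local backend, the JSONL file loads straight into pandas. A minimal sketch, assuming the default column names above and the `logs/chat.jsonl` path from the Quick Start:

```python
import pandas as pd

# One event per line; lines=True reads the JSONL export as a DataFrame.
events = pd.read_json("logs/chat.jsonl", lines=True)

# Keep only what a cost dashboard needs, then aggregate per model.
cols = ["timestamp", "model", "tokens_in", "tokens_out", "cost_usd", "latency_ms"]
summary = events[cols].groupby("model")[["cost_usd", "tokens_out"]].sum()
print(summary)
```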
Storage options
Local JSONL (default)
```python
instrument(backend="local", filepath="logs/app.jsonl")
```
- Privacy-first: data stays on your own disk
- Compatible with pandas, Spark, and any other tool that reads JSONL (see the sketch below)
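On the Spark side, `spark.read.json` treats each line as one record, so the export loads without conversion. A sketch assuming the `model`, `cost_usd`, and `latency_ms` fields from the columns table:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("hilt-logs").getOrCreate()

# spark.read.json reads JSON Lines: one record per line.
events = spark.read.json("logs/app.jsonl")

# Spend and latency per model.
events.groupBy("model").agg(
    F.sum("cost_usd").alias("total_cost_usd"),
    F.avg("latency_ms").alias("avg_latency_ms"),
).show()
```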
Google Sheets (real-time)
```python
instrument(
    backend="sheets",
    sheet_id="1abc...",
    credentials_path="creds.json",
)
```
- Ideal for support review, QA, and cost tracking
- Install the extra: `pip install "hilt[sheets]"`
Roadmap
- OpenAI SDK auto-instrumentation
- Anthropic Claude auto-instrumentation
- Google Gemini auto-instrumentation
- LangGraph auto-instrumentation
