// one-time pricing

PAY ONCE. OWN IT.

No subscriptions. No tiers. Every utility is a one-time purchase. Buy the code, keep the code, modify the code — it lives in your repo.

// 36 free utilities

🔢
Token Optimiser
Count tokens and estimate cost across every major model.
FREE
🗜️
Prompt Compressor
Shrink prompts without losing meaning.
FREE
🧭
Model Router
Pick the right model for the task and the budget.
FREE
🔗
Prompt Chain Builder
Compose multi-step LLM workflows without a framework.
FREE
📊
Agent Monitor
Watch what your agents are actually doing.
FREE
🔧
JSON Repair
Fix the #1 LLM output problem: broken JSON.
FREE
✅
Response Validator
Validate LLM JSON output against a schema and auto-retry on failure.
FREE
🧼
Output Sanitizer
Strip the garbage LLMs add to their output.
FREE
🔁
Retry with Backoff
Universal retry wrapper for LLM API calls.
FREE
🚦
Rate Limit Handler
Queue LLM requests and respect per-provider limits.
FREE
🌊
Streaming Handler
Clean SSE parser for streaming LLM responses.
FREE
🏗️
System Prompt Builder
Build modular, composable system prompts from reusable blocks.
FREE
📚
Prompt Templates
Battle-tested system prompts for common LLM tasks.
FREE
🎯
Few-Shot Builder
Pack the most effective examples into the fewest tokens.
FREE
💭
Chain-of-Thought Wrapper
Wrap any prompt in a CoT reasoning frame.
FREE
💰
Token Budget Allocator
Plan token spend across a multi-turn conversation.
FREE
📏
Context Gauge
See how close you are to blowing the context window.
FREE
🧮
Cost Calculator
What will this LLM call actually cost?
FREE
🔮
Cost Forecaster
Project monthly LLM spend before you ship.
FREE
⏱️
Latency Tester
Benchmark LLM endpoint latency.
FREE
🔁
Prompt Translator
Translate prompts between model families.
FREE
🔀
Prompt Version Diff
Diff two prompts and see what actually changed.
FREE
🧲
Embedding Similarity
Cosine similarity without numpy.
FREE
🛡️
Prompt Injection Scanner
Scan untrusted input for prompt-injection patterns.
FREE
🕶️
PII Scrubber
Strip PII before sending to a model provider.
FREE
📚
Recursive Summarizer
Summarize massive codebases for LLM context.
FREE
📉
Dependency Minimizer
Strip unused imports from bloated files.
FREE
🧪
Synthetic Data Generator
Generate training data from a schema.
FREE
🔄
JSONL Converter
Convert between JSON, JSONL, and CSV at scale.
FREE
📑
JSON to CSV
Flatten any JSON blob into CSV.
FREE
📋
Markdown Table Parser
Extract structured tables from LLM-generated markdown.
FREE
📐
Markdown to Schema
Extract a JSON schema from a markdown spec.
FREE
📄
PDF Text Stripper
Extract text from PDFs without external deps.
FREE
💳
Stripe Webhook Handler
Verified, idempotent, retry-safe Stripe webhooks in one file.
FREE
🔘
Tailwind Button
Accessible, variant-driven Tailwind button — drop in and ship.
FREE
🔐
Clerk + Prisma User Sync
Clerk webhooks → Prisma upsert. Drop-in Next.js route handler.
FREE

// 5 paid utilities

💾
Response Cache
Stop paying twice for the same LLM call.
$3
📦
Batch Processor
Process thousands of prompts in parallel without rate-limit pain.
$4
🛡️
Agent Error Recovery
Catch agent failures before they spiral.
$4
✂️
Context Window Manager
Smart trimming so long conversations stay under the limit.
$4
🔍
Hallucination Checker
Catch fabricated facts before they reach the user.
$5

// one-time purchase. no recurring charges.