LLM API Cost Calculator

Estimate monthly API cost for any LLM workload across Anthropic and OpenAI models. Includes prompt-cache and batch-API math so you see what you actually pay, not just the headline rate.

Domain: Programming | Version: v1.0.0 | Added: 2026-05-17
Inputs
Model
Provider and model. Per-Mtok rates are baked in; see pricing_as_of for freshness.
Input Tokens / Call (tok)
Average prompt size: system + user + retrieved context combined.
Output Tokens / Call (tok)
Calls / Month
Cached Input Fraction
Share of input tokens served from the prompt cache (0 = no cache, 1 = fully cached). Anthropic and GPT-5.x models get roughly 90% off cache reads; GPT-4o-class models get 50% off.
Use Batch API
Both providers offer 50% off both input and output rates for async batch jobs.
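The cost math behind these inputs can be sketched as follows. This is a minimal illustration, not the tool's actual pricing table: the per-Mtok rates passed in are placeholders (check pricing_as_of for real figures), and the 90% cache and 50% batch discounts come from the descriptions above.

```python
def monthly_cost(
    input_rate,                 # $/Mtok for uncached input (illustrative, not real pricing)
    output_rate,                # $/Mtok for output
    input_tokens_per_call,
    output_tokens_per_call,
    calls_per_month,
    cached_input_fraction=0.0,  # 0 = no cache, 1 = fully cached
    cache_discount=0.90,        # ~90% off cache reads (Anthropic / GPT-5.x class)
    use_batch_api=False,
    batch_discount=0.50,        # 50% off both rates for async batch jobs
):
    """Estimate monthly spend in dollars. A sketch of the calculator's math."""
    cached_rate = input_rate * (1 - cache_discount)
    in_tok = input_tokens_per_call * calls_per_month
    out_tok = output_tokens_per_call * calls_per_month
    cost = (
        in_tok * (1 - cached_input_fraction) * input_rate
        + in_tok * cached_input_fraction * cached_rate
        + out_tok * output_rate
    ) / 1_000_000  # rates are per million tokens
    if use_batch_api:
        cost *= 1 - batch_discount
    return cost

# Same workload as the curl example below, with illustrative $3/Mtok in, $15/Mtok out:
print(monthly_cost(3, 15, 2000, 500, 100_000))  # → 1350.0
```

Note how a high cached fraction only discounts the input side; output tokens are always billed at the full (or batch) rate, so output-heavy workloads see less benefit from caching.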
Result
Version: 1.0.0
POST /v1/programming-dev/llm-api-cost-calculator
curl -X POST https://api.toolsamurai.com/v1/programming-dev/llm-api-cost-calculator \
  -H "Authorization: Bearer sk_live_•••••••••••••••" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "anthropic/claude-sonnet-4.6",
    "input_tokens_per_call": 2000,
    "output_tokens_per_call": 500,
    "calls_per_month": 100000,
    "cached_input_fraction": 0,
    "use_batch_api": false
  }'
Tags: llm, openai, anthropic, cost, api-pricing, tokens, prompt-cache, batch-api, gpt, claude