Overview
Track costs and monitor usage for Amazon Bedrock by sending Converse requests through LLM Ops. This integration is not a generic “swap the OpenAI base URL” flow: you call a dedicated path on the LLM Ops API that forwards to bedrock-runtime.{region}.amazonaws.com with AWS Signature Version 4 (SigV4) signing.
This guide shows Python (requests), JavaScript (fetch), and cURL. Use the official AWS Converse API request body shape (messages, inferenceConfig, etc.).
Security: LLM Ops does not store your Cloudidr tracking token, AWS secret keys, request prompts, or response content in the analytics database—only usage metadata needed for cost analytics. The proxy must forward the JSON body to AWS to complete the call. AWS credentials are used in memory for the request to sign the upstream call; they are not written to our analytics tables. Never commit real keys; use IAM least privilege and environment variables.
Quick Start
- LLM Ops API host: https://api.llm-ops.cloudidr.com
- Endpoint pattern: POST /bedrock/model/{modelId}/converse
- Example model ID in the path: amazon.nova-lite-v1:0 (dots and colons stay in the path segment)
- Upstream API: Converse only—not InvokeModel or third-party SDK defaults that talk to AWS without the proxy.
- Required headers:
  - X-Cloudidr-Key — LLM Ops tracking token (trk_...)
  - AWS credentials — X-Aws-Access-Key-Id and X-Aws-Secret-Access-Key (and optionally an STS session token) so the proxy can sign the Bedrock request.
API Keys and configuration
| Credential | Purpose |
|---|---|
| Cloudidr Key | From the LLM Ops dashboard; typically trk_... |
| AWS Access Key ID | IAM user or role with bedrock:InvokeModel (and model access) in the target account |
| AWS Secret Access Key | Paired with the access key |
| AWS Session Token (optional) | For temporary STS credentials |
Integration Examples
- Python
- JavaScript
- cURL
Install
pip install requests
Note: the AWS SDK (boto3) calls AWS directly. To route through LLM Ops, use HTTP (e.g. requests) with the URL and headers below—matching how our smoke tests exercise the proxy.
Basic Example
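A minimal sketch using requests. All credential values below are placeholders read from environment variables, and the request body follows the AWS Converse shape; the POST only runs when CLOUDIDR_KEY is actually set, so the builder can be inspected on its own:

```python
import os
import requests

API_HOST = "https://api.llm-ops.cloudidr.com"

def build_converse_request(model_id, user_text, region="us-east-1"):
    """Build the URL, headers, and Converse-shaped body for the LLM Ops proxy."""
    url = f"{API_HOST}/bedrock/model/{model_id}/converse"
    headers = {
        "Content-Type": "application/json",
        "X-Cloudidr-Key": os.environ.get("CLOUDIDR_KEY", "trk_example"),
        "X-Aws-Access-Key-Id": os.environ.get("AWS_ACCESS_KEY_ID", ""),
        "X-Aws-Secret-Access-Key": os.environ.get("AWS_SECRET_ACCESS_KEY", ""),
        "X-Aws-Region": region,
    }
    body = {
        "messages": [{"role": "user", "content": [{"text": user_text}]}],
        "inferenceConfig": {"maxTokens": 256, "temperature": 0.5},
    }
    return url, headers, body

url, headers, body = build_converse_request(
    "amazon.nova-lite-v1:0", "Say hello in one sentence."
)

if os.environ.get("CLOUDIDR_KEY"):  # only call out when real credentials are set
    resp = requests.post(url, headers=headers, json=body, timeout=60)
    resp.raise_for_status()
    data = resp.json()
    print(data["usage"]["inputTokens"], data["usage"]["outputTokens"])
```

The same builder is reused in the metadata example below by merging extra headers before posting.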
With Metadata (Department / Project / Agent)
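One way to attach the attribution tags, sketched with a small helper; the department, project, and agent values here are made-up examples:

```python
def with_metadata(headers, department=None, project=None, agent=None):
    """Return a copy of the base headers with optional cost-attribution tags."""
    tagged = dict(headers)
    if department:
        tagged["X-Department"] = department
    if project:
        tagged["X-Project"] = project
    if agent:
        tagged["X-Agent"] = agent
    return tagged

base = {"Content-Type": "application/json", "X-Cloudidr-Key": "trk_example"}
headers = with_metadata(base, department="engineering",
                        project="chat-assistant", agent="support-bot")
# headers now carries X-Department, X-Project, and X-Agent alongside the
# required headers; send it with the same POST as the basic example.
```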
Parsing the response (Converse shape)
Successful responses include output, usage, and stopReason, for example:
- Token counts: data["usage"]["inputTokens"], data["usage"]["outputTokens"]
- Assistant text: data["output"]["message"]["content"] (list of blocks; often a block with "text")
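For instance, pulling the text out of the content blocks (the sample response below is illustrative, not a real API reply):

```python
def extract_text(data):
    """Concatenate the text blocks from a Converse-shaped response."""
    blocks = data["output"]["message"]["content"]
    return "".join(block["text"] for block in blocks if "text" in block)

sample = {
    "output": {"message": {"role": "assistant", "content": [{"text": "Hello!"}]}},
    "usage": {"inputTokens": 12, "outputTokens": 5},
    "stopReason": "end_turn",
}
print(extract_text(sample))             # Hello!
print(sample["usage"]["outputTokens"])  # 5
```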
Iterate over the content blocks as needed.
Required and optional headers
| Header | Required | Description |
|---|---|---|
| X-Cloudidr-Key | Yes | LLM Ops tracking token (trk_...) |
| X-Aws-Access-Key-Id | Yes | AWS access key ID |
| X-Aws-Secret-Access-Key | Yes | AWS secret access key |
| X-Aws-Session-Token | No | STS session token when using temporary credentials |
| X-Aws-Region | No | AWS region for the Bedrock runtime (default us-east-1 if omitted) |
| X-Department | No | Cost attribution: department |
| X-Project | No | Cost attribution: project/team (preferred) |
| X-Team | No | Legacy alias for the same tag as X-Project |
| X-Agent | No | Cost attribution: agent or app name |
Model IDs and cross-region inference
The {modelId} in the URL is passed to AWS (after normalization). For some providers, Bedrock expects a cross-region inference profile ID (prefix us., eu., or ap.). The proxy may automatically prepend the right prefix when you use a plain ID for:
- anthropic.*
- meta.*
- deepseek.*
The prefix is chosen from X-Aws-Region (e.g. eu-* → eu., ap-* → ap., otherwise often us.). If your model ID already starts with us., eu., or ap., it is left unchanged.
Many Amazon models (e.g. Amazon Nova) use the plain model ID without a us. prefix. Always confirm the exact ID in the AWS Bedrock console for your account and region.
Anthropic models on Bedrock: Your account must have model access enabled; Anthropic often requires completing the use-case form under Bedrock → Model access → Anthropic in the AWS console.
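The prefixing rules above could be sketched like this; it is an illustration of the documented behavior, not the proxy's actual code:

```python
CROSS_REGION_PROVIDERS = ("anthropic.", "meta.", "deepseek.")
GEO_PREFIXES = ("us.", "eu.", "ap.")

def normalize_model_id(model_id: str, region: str = "us-east-1") -> str:
    """Sketch of cross-region inference-profile prefixing (assumed logic)."""
    if model_id.startswith(GEO_PREFIXES):
        return model_id  # already a cross-region profile ID; left unchanged
    if not model_id.startswith(CROSS_REGION_PROVIDERS):
        return model_id  # e.g. Amazon Nova uses the plain model ID
    if region.startswith("eu-"):
        return "eu." + model_id
    if region.startswith("ap-"):
        return "ap." + model_id
    return "us." + model_id
```

Always confirm the effective ID against the Bedrock console, since the real mapping may differ by account and region.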
Converse features (streaming, tools)
The proxy forwards the Converse JSON body to AWS. Features such as streaming (stream in the Converse request) or tool configuration follow the AWS Converse API specification. Refer to the latest AWS documentation for field names and behavior; LLM Ops records usage from successful responses when token counts are present.
Supported Models
Models available to your AWS account in the chosen region can be used. See the Supported Models page for pricing alignment in LLM Ops.
What Gets Tracked
LLM Ops automatically captures:
✅ Token usage — Input and output tokens from the Converse usage object where available
✅ Cost — Estimated cost from LLM Ops pricing
✅ Latency — Request duration and provider timing
✅ Model — Effective model ID used for Bedrock
✅ Metadata — Department, project/team, agent
✅ Errors — HTTP status and summarized error messages
✅ Optimizer — When enabled, routing metadata for cost-optimizer decisions
View Your Data
After making requests, view costs in the LLM Ops Dashboard:
- Agent Explorer — Costs by agent
- Department Breakdown — Department spending
- Team Analysis — Project/team-level costs
- Model Comparison — Compare models (including Bedrock IDs)
- Time Series — Spend over time
Migration from calling Bedrock directly
Previously you might have called: POST https://bedrock-runtime.{region}.amazonaws.com/model/{modelId}/converse with SigV4 from your code.
With LLM Ops:
- Change the URL to https://api.llm-ops.cloudidr.com/bedrock/model/{modelId}/converse.
- Add X-Cloudidr-Key: trk_... on every request.
- Pass AWS credentials using the headers above so the proxy can sign the upstream Bedrock call (you can remove local SigV4 signing when using the proxy, unless you keep a different architecture).
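In code, the migration is mostly a URL and header change; a before/after sketch with placeholder values:

```python
region = "us-east-1"
model_id = "amazon.nova-lite-v1:0"

# Before: direct to AWS, SigV4-signed by your code or SDK
old_url = f"https://bedrock-runtime.{region}.amazonaws.com/model/{model_id}/converse"

# After: via LLM Ops; the proxy signs the upstream call from the X-Aws-* headers
new_url = f"https://api.llm-ops.cloudidr.com/bedrock/model/{model_id}/converse"
added_headers = {
    "X-Cloudidr-Key": "trk_example",        # placeholder token
    "X-Aws-Access-Key-Id": "AKIA_EXAMPLE",  # placeholder credentials
    "X-Aws-Secret-Access-Key": "example-secret",
    "X-Aws-Region": region,
}
# The request body (Converse JSON) is unchanged.
```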
Troubleshooting
400 Missing X-Cloudidr-Key
Every request must include X-Cloudidr-Key with a valid LLM Ops tracking token. Get or rotate tokens in the dashboard.
401 AWS authentication failed
Check X-Aws-Access-Key-Id, X-Aws-Secret-Access-Key, and the optional X-Aws-Session-Token. Ensure the IAM principal can invoke Bedrock in X-Aws-Region. Keys must not be swapped with the Cloudidr token.
403 / 404 Model access
Enable the model in Amazon Bedrock → Model access for your account and region. For Anthropic, complete the console use-case step if required.
Wrong region or model ID
Set X-Aws-Region to the region where the model is available. Verify the model ID string (including version suffixes like :0) matches AWS documentation.
Cost data not appearing
Allow 10–30 seconds for dashboard updates. Confirm HTTP 200 from the proxy and a valid X-Cloudidr-Key so the request is attributed to your org.
Next Steps
View Dashboard
See your Amazon Bedrock costs in real time
Supported Models
View all supported Amazon Bedrock models
Anthropic Integration
Add cost tracking for Claude models
Set Budgets
Configure spending alerts and limits

