Snippets

Code snippets and quick examples from the `_snippets` folder.

Getting Started with Amazon S3 Vectors and Bedrock Embeddings

This library is licensed under the MIT-0 License. See the [LICENSE](../../LICENSE) file.

Tags: bedrock, s3-vectors, jupyter, embeddings, python, vector-search
Tech: Python, AWS SDK (Boto3), Amazon S3 Vectors, Amazon Bedrock (Cohere model), Jupyter Notebooks
Difficulty: easy

06 Invoke Model With Caching

06 - Bedrock InvokeModel API + Prompt Caching + AgentCore Code Interpreter

Prompt caching reduces cost and latency when the same system prompt and tool definitions are sent across multiple calls — a common pattern in agentic workflows where the model is called repeatedly in a tool-use loop.

How it works:

1. Mark reusable content (system prompt, tools) with `cache_control`
2. First call: tokens are written to the cache (slightly higher write cost)
3. Subsequent calls: tokens are read from the cache (lower cost, lower latency)

IMPORTANT — When to use caching:

- USE for content that repeats across calls: system prompts, tool definitions, long reference documents, few-shot examples
- AVOID caching content that changes every call (user messages, dynamic context) — cache writes cost MORE than uncached tokens, so caching volatile content increases cost without benefit

NOTE: Prompt caching requires a REGIONAL model ID (e.g., us.anthropic.claude-*). Cross-region model IDs (global.*) route requests to different regions, preventing cache hits.

Requirements: `pip install boto3 bedrock-agentcore`

Usage:

    python 06_invoke_model_with_caching.py
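The caching setup can be sketched as an InvokeModel request body in the native Anthropic Messages format. A minimal sketch: the model ID and system prompt are placeholders, and the `cache_control` placement follows Anthropic's prompt-caching documentation.

```python
# Placeholder REGIONAL model ID (not global.*), per the note above.
MODEL_ID = "us.anthropic.claude-sonnet-4-20250514-v1:0"

def build_request(user_message: str, system_prompt: str, tools: list) -> dict:
    """Build an InvokeModel body that caches the system prompt (and any
    tool definitions, which sit before it in the cached prefix)."""
    return {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 1024,
        "system": [
            {
                "type": "text",
                "text": system_prompt,
                # Everything up to and including this block is cached.
                "cache_control": {"type": "ephemeral"},
            }
        ],
        "tools": tools,
        "messages": [{"role": "user", "content": user_message}],
    }

body = build_request("What is 2 + 2?", "You are a careful math assistant.", [])
# The actual call (requires AWS credentials and model access):
# client = boto3.client("bedrock-runtime")
# response = client.invoke_model(modelId=MODEL_ID, body=json.dumps(body))
```

Only the final user message changes between calls, so repeated calls read the system prompt and tools from the cache.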

04 Code Interpreter Highlevel Api

04 - AgentCore Code Interpreter High-Level API (Direct, No Model)

This uses the AgentCore SDK's high-level CodeInterpreter class to execute code directly — no Claude, no agentic loop. Useful when your application already knows what code to run (batch jobs, testing, pipelines).

Requirements: `pip install bedrock-agentcore`

Usage:

    # Configure AWS credentials (IAM, SSO, or environment variables)
    python 04_code_interpreter_highlevel_api.py
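A sketch of the direct path. The class and method names (`CodeInterpreter`, `start`, `invoke`, `stop`) and the parameter shape follow the AgentCore SDK samples; verify them against your installed version. The region is a placeholder.

```python
def run_code(code: str, region: str = "us-west-2") -> dict:
    """Execute code directly in an AgentCore sandbox; no model in the loop."""
    # Imported inside the function so the sketch reads without the SDK present.
    from bedrock_agentcore.tools.code_interpreter_client import CodeInterpreter

    params = {"language": "python", "code": code}  # shape from the SDK samples
    interpreter = CodeInterpreter(region)
    interpreter.start()
    try:
        return interpreter.invoke("executeCode", params)
    finally:
        interpreter.stop()  # always release the sandbox session
```

Because there is no model call, this is the cheapest and most predictable option when the code to run is already known.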

03 Strands Agent With Code Interpreter

03 - Strands SDK + AgentCore Code Interpreter (Recommended)

This is the RECOMMENDED migration path. Strands handles the agentic loop automatically — you define a tool, create an agent, and call it. The result is as simple as the Anthropic experience: one call does everything.

Requirements: `pip install strands-agents bedrock-agentcore`

Usage:

    # Configure AWS credentials (IAM, SSO, or environment variables)
    python 03_strands_agent_with_code_interpreter.py
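The define-a-tool, create-an-agent flow might look like this sketch. Assumed APIs: `strands.Agent` and the `@tool` decorator from the Strands SDK, plus the AgentCore `CodeInterpreter` client; check the names against current docs.

```python
def build_agent():
    # Imported inside the function so the sketch reads without the SDKs present.
    from strands import Agent, tool
    from bedrock_agentcore.tools.code_interpreter_client import CodeInterpreter

    @tool
    def execute_python(code: str) -> str:
        """Run Python code in an AgentCore sandbox and return the result."""
        interpreter = CodeInterpreter("us-west-2")  # placeholder region
        interpreter.start()
        try:
            return str(
                interpreter.invoke(
                    "executeCode", {"language": "python", "code": code}
                )
            )
        finally:
            interpreter.stop()

    # Strands runs the tool-use loop for us; no manual message handling.
    return Agent(tools=[execute_python])

# agent = build_agent()
# print(agent("Use Python to compute the 20th Fibonacci number."))
```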

02 Invoke Model With Code Interpreter

02 - Bedrock InvokeModel API + AgentCore Code Interpreter

This approach uses the InvokeModel API with the native Anthropic Messages format. The request body is nearly identical to what you'd send to api.anthropic.com, making it the easiest path to minimize changes to existing Anthropic code. The agentic loop is the same as in the Converse API version, but the request/response format matches the native Anthropic format.

Requirements: `pip install boto3 bedrock-agentcore`

Usage:

    # Configure AWS credentials (IAM, SSO, or environment variables)
    python 02_invoke_model_with_code_interpreter.py
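As a sketch of how little the body changes: it keeps the Messages shape you would POST to api.anthropic.com, except that `model` moves into the `modelId` call parameter and an `anthropic_version` field is added. The tool name `execute_python` is hypothetical and the model ID in the commented call is a placeholder.

```python
import json

# Same shape as a native Anthropic /v1/messages request, except:
#   - "model" becomes the modelId parameter of invoke_model
#   - "anthropic_version" identifies the Bedrock-hosted Messages schema
body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 1024,
    "tools": [
        {
            "name": "execute_python",  # hypothetical tool wired to AgentCore
            "description": "Run Python code in a sandbox.",
            "input_schema": {
                "type": "object",
                "properties": {"code": {"type": "string"}},
                "required": ["code"],
            },
        }
    ],
    "messages": [{"role": "user", "content": "Compute 2**10 with code."}],
}
payload = json.dumps(body)
# client = boto3.client("bedrock-runtime")
# response = client.invoke_model(
#     modelId="us.anthropic.claude-sonnet-4-20250514-v1:0", body=payload
# )
```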

01 Converse Api With Code Interpreter

01 - Bedrock Converse API + AgentCore Code Interpreter

This is the custom agentic loop approach using the Converse API. It wires together two AWS services:

- Bedrock Runtime (Converse API) for Claude
- AgentCore Code Interpreter for sandboxed code execution

The loop: Claude → tool_use? → execute code → return result → Claude → final answer.

Requirements: `pip install boto3 bedrock-agentcore`

Usage:

    # Configure AWS credentials (IAM, SSO, or environment variables)
    python 01_converse_api_with_code_interpreter.py
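The loop can be sketched against the Converse API message shapes. A minimal sketch: `execute_code` stands in for the AgentCore call (any callable that runs code and returns its output as text), and `client` is a `bedrock-runtime` boto3 client.

```python
def agentic_loop(client, model_id, messages, tool_config, execute_code,
                 max_turns=10):
    """Converse loop: call the model, run any requested code, feed results back."""
    for _ in range(max_turns):
        resp = client.converse(
            modelId=model_id, messages=messages, toolConfig=tool_config
        )
        msg = resp["output"]["message"]
        messages.append(msg)
        if resp["stopReason"] != "tool_use":
            return msg  # final answer
        # Execute every tool request in the assistant message.
        results = []
        for block in msg["content"]:
            if "toolUse" in block:
                tool = block["toolUse"]
                output = execute_code(tool["input"]["code"])
                results.append({
                    "toolResult": {
                        "toolUseId": tool["toolUseId"],
                        "content": [{"text": output}],
                    }
                })
        messages.append({"role": "user", "content": results})
    raise RuntimeError("tool-use loop did not terminate")
```

The `stopReason`/`toolUse`/`toolResult` shapes follow the Converse API reference; swapping in the real AgentCore executor for `execute_code` gives the full snippet.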

00 Anthropic Code Execution

00 - Anthropic Code Execution Tool (BEFORE - Reference)

This is the ORIGINAL Anthropic approach. One API call — Claude decides to execute code, runs it in a managed container, and returns the answer inline. No agentic loop. No tool-use handling. No session management. This is what we are migrating FROM.

Requirements: `pip install anthropic`

Usage:

    export ANTHROPIC_API_KEY=your_key
    python 00_anthropic_code_execution.py
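For reference, the one-call pattern might look like this sketch. The tool type string, beta flag, and model name are taken from Anthropic's documentation at the time of writing and may have changed; verify before use.

```python
# Server-side tool: Anthropic runs the code in its own managed container.
CODE_EXECUTION_TOOL = {"type": "code_execution_20250522", "name": "code_execution"}

def ask(prompt: str):
    """One call: the model writes, executes, and reports on code inline."""
    import anthropic  # requires ANTHROPIC_API_KEY in the environment

    client = anthropic.Anthropic()
    return client.beta.messages.create(
        model="claude-sonnet-4-20250514",      # placeholder model name
        max_tokens=1024,
        betas=["code-execution-2025-05-22"],   # opt-in beta flag
        tools=[CODE_EXECUTION_TOOL],
        messages=[{"role": "user", "content": prompt}],
    )
```

Note the contrast with the snippets above: no loop, no `toolResult` round-trips, and no sandbox session to start or stop.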