strands-agents 1.16.0
A model-driven approach to building AI agents in just a few lines of code
pip install strands-agents
Requires Python: >=3.10
Dependencies
- boto3 <2.0.0,>=1.26.0
- botocore <2.0.0,>=1.29.0
- docstring-parser <1.0,>=0.15
- jsonschema <5.0.0,>=4.0.0
- mcp <2.0.0,>=1.11.0
- opentelemetry-api <2.0.0,>=1.30.0
- opentelemetry-instrumentation-threading <1.00b0,>=0.51b0
- opentelemetry-sdk <2.0.0,>=1.30.0
- pydantic <3.0.0,>=2.4.0
- typing-extensions <5.0.0,>=4.13.2
- watchdog <7.0.0,>=6.0.0
- a2a-sdk <0.4.0,>=0.3.0; extra == "a2a"
- a2a-sdk[sql] <0.4.0,>=0.3.0; extra == "a2a"
- fastapi <1.0.0,>=0.115.12; extra == "a2a"
- httpx <1.0.0,>=0.28.1; extra == "a2a"
- starlette <1.0.0,>=0.46.2; extra == "a2a"
- uvicorn <1.0.0,>=0.34.2; extra == "a2a"
- a2a-sdk <0.4.0,>=0.3.0; extra == "all"
- a2a-sdk[sql] <0.4.0,>=0.3.0; extra == "all"
- anthropic <1.0.0,>=0.21.0; extra == "all"
- boto3-stubs[sagemaker-runtime] <2.0.0,>=1.26.0; extra == "all"
- fastapi <1.0.0,>=0.115.12; extra == "all"
- google-genai <2.0.0,>=1.32.0; extra == "all"
- httpx <1.0.0,>=0.28.1; extra == "all"
- litellm <2.0.0,>=1.75.9; extra == "all"
- llama-api-client <1.0.0,>=0.1.0; extra == "all"
- mistralai >=1.8.2; extra == "all"
- ollama <1.0.0,>=0.4.8; extra == "all"
- openai <1.110.0,>=1.68.0; extra == "all"
- openai <2.0.0,>=1.68.0; extra == "all"
- opentelemetry-exporter-otlp-proto-http <2.0.0,>=1.30.0; extra == "all"
- sphinx-autodoc-typehints <4.0.0,>=1.12.0; extra == "all"
- sphinx-rtd-theme <2.0.0,>=1.0.0; extra == "all"
- sphinx <9.0.0,>=5.0.0; extra == "all"
- starlette <1.0.0,>=0.46.2; extra == "all"
- uvicorn <1.0.0,>=0.34.2; extra == "all"
- writer-sdk <3.0.0,>=2.2.0; extra == "all"
- anthropic <1.0.0,>=0.21.0; extra == "anthropic"
- commitizen <5.0.0,>=4.4.0; extra == "dev"
- hatch <2.0.0,>=1.0.0; extra == "dev"
- moto <6.0.0,>=5.1.0; extra == "dev"
- mypy <2.0.0,>=1.15.0; extra == "dev"
- pre-commit <4.4.0,>=3.2.0; extra == "dev"
- pytest-asyncio <1.3.0,>=1.0.0; extra == "dev"
- pytest-cov <8.0.0,>=7.0.0; extra == "dev"
- pytest-xdist <4.0.0,>=3.0.0; extra == "dev"
- pytest <9.0.0,>=8.0.0; extra == "dev"
- ruff <0.14.0,>=0.13.0; extra == "dev"
- sphinx-autodoc-typehints <4.0.0,>=1.12.0; extra == "docs"
- sphinx-rtd-theme <2.0.0,>=1.0.0; extra == "docs"
- sphinx <9.0.0,>=5.0.0; extra == "docs"
- google-genai <2.0.0,>=1.32.0; extra == "gemini"
- litellm <2.0.0,>=1.75.9; extra == "litellm"
- openai <1.110.0,>=1.68.0; extra == "litellm"
- llama-api-client <1.0.0,>=0.1.0; extra == "llamaapi"
- mistralai >=1.8.2; extra == "mistral"
- ollama <1.0.0,>=0.4.8; extra == "ollama"
- openai <2.0.0,>=1.68.0; extra == "openai"
- opentelemetry-exporter-otlp-proto-http <2.0.0,>=1.30.0; extra == "otel"
- boto3-stubs[sagemaker-runtime] <2.0.0,>=1.26.0; extra == "sagemaker"
- openai <2.0.0,>=1.68.0; extra == "sagemaker"
- writer-sdk <3.0.0,>=2.2.0; extra == "writer"
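Provider-specific and development dependencies are grouped into optional extras (the `; extra == "..."` markers above). Installing with an extra pulls in only that group's packages, for example:

```shell
# Everything, or a single provider's extra (names taken from the markers above)
pip install 'strands-agents[all]'
pip install 'strands-agents[litellm]'
pip install 'strands-agents[ollama]'
```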
Strands Agents
A model-driven approach to building AI agents in just a few lines of code.
Documentation ◆ Samples ◆ Python SDK ◆ Tools ◆ Agent Builder ◆ MCP Server
Strands Agents is a simple yet powerful SDK that takes a model-driven approach to building and running AI agents. From simple conversational assistants to complex autonomous workflows, from local development to production deployment, Strands Agents scales with your needs.
Feature Overview
- Lightweight & Flexible: Simple agent loop that just works and is fully customizable
- Model Agnostic: Support for Amazon Bedrock, Anthropic, Gemini, LiteLLM, Llama, Ollama, OpenAI, Writer, and custom providers
- Advanced Capabilities: Multi-agent systems, autonomous agents, and streaming support
- Built-in MCP: Native support for Model Context Protocol (MCP) servers, enabling access to thousands of pre-built tools
Quick Start
```shell
# Install Strands Agents
pip install strands-agents strands-agents-tools
```

```python
from strands import Agent
from strands_tools import calculator

agent = Agent(tools=[calculator])
agent("What is the square root of 1764")
```
Note: For the default Amazon Bedrock model provider, you'll need AWS credentials configured and model access enabled for Claude 4 Sonnet in the us-west-2 region. See the Quickstart Guide for details on configuring other model providers.
Installation
Ensure you have Python 3.10+ installed, then:
```shell
# Create and activate virtual environment
python -m venv .venv
source .venv/bin/activate  # On Windows use: .venv\Scripts\activate

# Install Strands and tools
pip install strands-agents strands-agents-tools
```
Features at a Glance
Python-Based Tools
Easily build tools using Python decorators:
```python
from strands import Agent, tool

@tool
def word_count(text: str) -> int:
    """Count words in text.

    This docstring is used by the LLM to understand the tool's purpose.
    """
    return len(text.split())

agent = Agent(tools=[word_count])
response = agent("How many words are in this sentence?")
```
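The decorator approach works because a function's signature and docstring carry enough information to describe the tool to the model (the SDK depends on docstring-parser for exactly this). As a rough, stdlib-only sketch of the idea, a hypothetical `tool_spec` helper (not part of the SDK) might derive a description like so:

```python
import inspect
from typing import get_type_hints

def tool_spec(fn):
    """Build a minimal tool description from a function's signature and
    docstring -- a sketch of what a @tool-style decorator can derive."""
    hints = get_type_hints(fn)
    params = {
        name: hints.get(name, str).__name__
        for name in inspect.signature(fn).parameters
    }
    return {
        "name": fn.__name__,
        "description": (inspect.getdoc(fn) or "").split("\n")[0],
        "parameters": params,
    }

def word_count(text: str) -> int:
    """Count words in text."""
    return len(text.split())

spec = tool_spec(word_count)
print(spec)
# {'name': 'word_count', 'description': 'Count words in text.', 'parameters': {'text': 'str'}}
```

The real decorator produces a richer JSON-schema-style specification, but the ingredients are the same: name, docstring, and type hints.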
Hot Reloading from Directory:
Enable automatic tool loading and reloading from the ./tools/ directory:
```python
from strands import Agent

# Agent will watch ./tools/ directory for changes
agent = Agent(load_tools_from_directory=True)
response = agent("Use any tools you find in the tools directory")
```
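Hot reloading of this kind boils down to noticing when files in the tools directory appear or change (the SDK depends on watchdog for event-driven watching). A minimal polling sketch, with hypothetical names, looks like this:

```python
import os
import tempfile

def changed_tools(directory: str, seen: dict[str, float]) -> list[str]:
    """Return tool files that are new or modified since the last scan,
    updating the mtime cache in place. Illustrative only -- the SDK uses
    watchdog's filesystem events rather than polling."""
    changed = []
    for name in sorted(os.listdir(directory)):
        if not name.endswith(".py"):
            continue
        path = os.path.join(directory, name)
        mtime = os.path.getmtime(path)
        if seen.get(path) != mtime:
            seen[path] = mtime
            changed.append(name)
    return changed

# Demo against a throwaway directory standing in for ./tools/
tools_dir = tempfile.mkdtemp()
with open(os.path.join(tools_dir, "calc.py"), "w") as f:
    f.write("# a tool module\n")

seen: dict[str, float] = {}
print(changed_tools(tools_dir, seen))  # ['calc.py']
print(changed_tools(tools_dir, seen))  # []
```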
MCP Support
Seamlessly integrate Model Context Protocol (MCP) servers:
```python
from strands import Agent
from strands.tools.mcp import MCPClient
from mcp import stdio_client, StdioServerParameters

aws_docs_client = MCPClient(
    lambda: stdio_client(
        StdioServerParameters(
            command="uvx", args=["awslabs.aws-documentation-mcp-server@latest"]
        )
    )
)

with aws_docs_client:
    agent = Agent(tools=aws_docs_client.list_tools_sync())
    response = agent("Tell me about Amazon Bedrock and how to use it with Python")
```
Multiple Model Providers
Support for various model providers:
```python
from strands import Agent
from strands.models import BedrockModel
from strands.models.gemini import GeminiModel
from strands.models.llamaapi import LlamaAPIModel
from strands.models.ollama import OllamaModel

# Bedrock
bedrock_model = BedrockModel(
    model_id="us.amazon.nova-pro-v1:0",
    temperature=0.3,
    streaming=True,  # Enable/disable streaming
)
agent = Agent(model=bedrock_model)
agent("Tell me about Agentic AI")

# Google Gemini
gemini_model = GeminiModel(
    client_args={"api_key": "your_gemini_api_key"},
    model_id="gemini-2.5-flash",
    params={"temperature": 0.7},
)
agent = Agent(model=gemini_model)
agent("Tell me about Agentic AI")

# Ollama
ollama_model = OllamaModel(
    host="http://localhost:11434",
    model_id="llama3",
)
agent = Agent(model=ollama_model)
agent("Tell me about Agentic AI")

# Llama API
llama_model = LlamaAPIModel(
    model_id="Llama-4-Maverick-17B-128E-Instruct-FP8",
)
agent = Agent(model=llama_model)
response = agent("Tell me about Agentic AI")
```
Built-in providers:
- Amazon Bedrock
- Anthropic
- Gemini
- Cohere
- LiteLLM
- llama.cpp
- LlamaAPI
- MistralAI
- Ollama
- OpenAI
- SageMaker
- Writer
Custom providers can be implemented by following the Custom Providers guide.
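Conceptually, a custom provider is a class that turns a list of chat messages into a stream of response chunks. The sketch below illustrates that shape only; the `Model` base class and `stream` method here are illustrative assumptions, not the SDK's actual interface, which is defined in the Custom Providers guide:

```python
from abc import ABC, abstractmethod
from typing import Any, Iterable

class Model(ABC):
    """Hypothetical provider interface -- the real base class and method
    names live in the Strands Custom Providers guide."""

    @abstractmethod
    def stream(self, messages: list[dict[str, Any]]) -> Iterable[str]:
        """Yield response chunks for a list of chat messages."""

class EchoModel(Model):
    """Toy provider that streams the last user message back word by word."""

    def stream(self, messages: list[dict[str, Any]]) -> Iterable[str]:
        last = messages[-1]["content"]
        for word in last.split():
            yield word + " "

chunks = list(EchoModel().stream([{"role": "user", "content": "hello agent"}]))
print("".join(chunks).strip())  # hello agent
```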
Example tools
Strands offers an optional strands-agents-tools package with pre-built tools for quick experimentation:
```python
from strands import Agent
from strands_tools import calculator

agent = Agent(tools=[calculator])
agent("What is the square root of 1764")
```
It's also available on GitHub via strands-agents/tools.
Documentation
For detailed guidance and examples, explore our documentation.
Contributing ❤️
We welcome contributions! See our Contributing Guide for details on:
- Reporting bugs & features
- Development setup
- Contributing via Pull Requests
- Code of Conduct
- Reporting of security issues
License
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
Security
See CONTRIBUTING for more information.