langchain-azure-ai 1.0.3
An integration package to support Azure AI Foundry capabilities in LangChain/LangGraph ecosystem.
pip install langchain-azure-ai
Requires Python <4.0,>=3.10.0
Dependencies
- aiohttp<4.0,>=3.10
- azure-ai-agents<1.3.0,>=1.2.0b3
- azure-ai-documentintelligence<2.0.0,>=1.0.2; extra == "tools"
- azure-ai-inference[opentelemetry]<2.0,>=1.0.0b9
- azure-ai-projects<2.0,>=1.0
- azure-ai-textanalytics<6.0.0,>=5.3.0; extra == "tools"
- azure-ai-vision-imageanalysis<2.0.0,>=1.0.0; extra == "tools"
- azure-core<2.0,>=1.32
- azure-cosmos<5.0,>=4.14.0b1
- azure-identity<2.0,>=1.15
- azure-mgmt-logic<11.0.0,>=10.0.0; extra == "tools"
- azure-monitor-opentelemetry<2.0,>=1.6; extra == "opentelemetry"
- azure-search-documents<12.0,>=11.4
- langchain<2.0.0,>=1.0.0
- langchain-openai<2.0.0,>=1.0.0
- numpy>=1.26.2; python_version < "3.13"
- numpy>=2.1.0; python_version >= "3.13"
- opentelemetry-api>=1.37; extra == "opentelemetry"
- opentelemetry-instrumentation>=0.58b0; extra == "opentelemetry"
- opentelemetry-instrumentation-threading>=0.58b0; extra == "opentelemetry"
- opentelemetry-semantic-conventions>=0.58b0; extra == "opentelemetry"
- opentelemetry-semantic-conventions-ai<0.5.0,>=0.4.2; extra == "opentelemetry"
- six<2.0.0,>=1.17.0
langchain-azure-ai
This package contains the LangChain integration for Azure AI Foundry. To learn more about how to use this package, see the LangChain documentation in Azure AI Foundry.
Installation
pip install -U langchain-azure-ai
To use tools, including Azure AI Document Intelligence, Azure AI Text Analytics for Health, or Azure Logic Apps, install the package with the tools extra:
pip install -U langchain-azure-ai[tools]
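As an illustration only, here is a hedged sketch of constructing one of these tools. The module path and constructor arguments shown below are assumptions rather than documented API, so check them against your installed version:

from azure.identity import DefaultAzureCredential
from langchain_azure_ai.tools import AzureAIDocumentIntelligenceTool  # assumed module path

# Assumed constructor arguments; verify against your version of the package.
document_tool = AzureAIDocumentIntelligenceTool(
    endpoint="https://{your-resource-name}.cognitiveservices.azure.com",
    credential=DefaultAzureCredential(),
)

# Once constructed, it can be bound to a chat model like any LangChain tool:
# model.bind_tools([document_tool])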
To use tracing capabilities with OpenTelemetry, install the package with the opentelemetry extra:
pip install -U langchain-azure-ai[opentelemetry]
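Once installed, the package's AzureAIOpenTelemetryTracer (see the changelog below) can be attached as a LangChain callback. A minimal sketch, assuming the tracer lives under langchain_azure_ai.callbacks.tracers and accepts an Application Insights connection string; both assumptions should be verified against your installed version:

from langchain_azure_ai.callbacks.tracers import AzureAIOpenTelemetryTracer  # assumed module path

# Assumed argument: an Azure Application Insights connection string.
tracer = AzureAIOpenTelemetryTracer(
    connection_string="InstrumentationKey=...;IngestionEndpoint=...",
)

# Passing callbacks through the run config is standard LangChain usage:
# model.invoke(messages, config={"callbacks": [tracer]})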
Quick Start with langchain-azure-ai
The langchain-azure-ai package uses the Azure AI Foundry family of SDKs and client libraries for Azure to provide first-class support for Azure AI Foundry capabilities in LangChain and LangGraph.
This package includes:
- Azure AI Agent Service
- Azure AI Foundry Models inference
- Azure AI Search
- Azure AI Services tools
- Cosmos DB
Here's a quick start example showing how to use the Chat Completions model. For more details and tutorials, see Develop with LangChain and LangGraph and models from Azure AI Foundry.
Azure AI Chat Completions Model with Azure OpenAI
from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel
from langchain_core.messages import HumanMessage, SystemMessage
model = AzureAIChatCompletionsModel(
    endpoint="https://{your-resource-name}.services.ai.azure.com/openai/v1",
    credential="your-api-key",  # if using Entra ID, use DefaultAzureCredential() instead
    model="gpt-4o",
)

messages = [
    SystemMessage(content="Translate the following from English into Italian"),
    HumanMessage(content="hi!"),
]
model.invoke(messages)
AIMessage(content='Ciao!', additional_kwargs={}, response_metadata={'model': 'gpt-4o', 'token_usage': {'input_tokens': 20, 'output_tokens': 3, 'total_tokens': 23}, 'finish_reason': 'stop'}, id='run-0758e7ec-99cd-440b-bfa2-3a1078335133-0', usage_metadata={'input_tokens': 20, 'output_tokens': 3, 'total_tokens': 23})
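As noted in the comment above, Microsoft Entra ID authentication uses DefaultAzureCredential in place of an API key; the rest of the call is unchanged:

from azure.identity import DefaultAzureCredential
from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel

model = AzureAIChatCompletionsModel(
    endpoint="https://{your-resource-name}.services.ai.azure.com/openai/v1",
    credential=DefaultAzureCredential(),  # Entra ID instead of an API key
    model="gpt-4o",
)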
Azure AI Chat Completions Model with DeepSeek-R1
from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel
from langchain_core.messages import HumanMessage, SystemMessage
model = AzureAIChatCompletionsModel(
    endpoint="https://{your-resource-name}.services.ai.azure.com/models",
    credential="your-api-key",  # if using Entra ID, use DefaultAzureCredential() instead
    model="DeepSeek-R1",
)

messages = [
    HumanMessage(content="Translate the following from English into Italian: \"hi!\"")
]
message_stream = model.stream(messages)
print(' '.join(chunk.content for chunk in message_stream))
<think>
Okay , the user just sent " hi !" and I need to translate that into Italian . Let me think . " Hi " is an informal greeting , so in Italian , the equivalent would be " C iao !" But wait , there are other options too . Sometimes people use " Sal ve ," which is a bit more neutral , but " C iao " is more common in casual settings . The user probably wants a straightforward translation , so " C iao !" is the safest bet here . Let me double -check to make sure there 's no nuance I 'm missing . N ope , " C iao " is definitely the right choice for translating " hi !" in an informal context . I 'll go with that .
</think>
C iao !
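Embedding generation follows the same pattern through AzureAIEmbeddingsModel (introduced in the changelog below). A minimal sketch; the import path, endpoint shape, and model name are assumptions to adapt to your deployment:

from langchain_azure_ai.embeddings import AzureAIEmbeddingsModel  # assumed module path

embeddings = AzureAIEmbeddingsModel(
    endpoint="https://{your-resource-name}.services.ai.azure.com/models",  # assumed endpoint shape
    credential="your-api-key",  # or DefaultAzureCredential() with Entra ID
    model="text-embedding-3-small",  # hypothetical model name
)

# embed_query is the standard LangChain embeddings interface
vector = embeddings.embed_query("hi!")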
Changelog
- 1.0.2:
  - We updated the AzureAIOpenTelemetryTracer to create a parent trace for multi-agent scenarios. Previously, you were required to do this manually, which was unnecessary.
- 1.0.0:
  - We introduce support for LangChain and LangGraph 1.0.
- 0.1.8:
  - We fixed some issues with AzureAIOpenTelemetryTracer, including compliant hierarchy, tool spans under chat, finish reason normalization, and conversation id. See PR #167.
  - We fixed an issue with taking image inputs for declarative agents created with the Azure AI Foundry Agents service.
  - We enhanced tool descriptions to improve tool call accuracy.
- 0.1.7:
  - [NEW]: We introduce LangGraph support for declarative agents created in Azure AI Foundry. You can now compose complex graphs in LangGraph and add nodes that take advantage of Azure AI Agent Service. See AgentServiceFactory.
  - We fix an issue with the interface of AzureAIEmbeddingsModel #158.
  - We improve the signatures of the tools AzureAIDocumentIntelligenceTool, AzureAIImageAnalysisTool, and AzureAITextAnalyticsHealthTool. PR #160.
- 0.1.6:
  - [Breaking change]: Using the parameter project_connection_string to create AzureAIEmbeddingsModel and AzureAIChatCompletionsModel is no longer supported. Use project_endpoint instead.
  - [Breaking change]: The class AzureAIInferenceTracer has been removed in favor of AzureAIOpenTelemetryTracer, which has better support for OpenTelemetry and the new semantic conventions for GenAI.
  - Adding the following tools to the package: AzureAIDocumentIntelligenceTool, AzureAIImageAnalysisTool, and AzureAITextAnalyticsHealthTool. You can also use AIServicesToolkit to access all the tools in Azure AI Services.
- 0.1.4:
  - Bug fix #91.
- 0.1.3:
  - [Breaking change]: We renamed the parameter model_name in AzureAIEmbeddingsModel and AzureAIChatCompletionsModel to model, which is the parameter expected by the method langchain.chat_models.init_chat_model.
  - We fixed an issue with JSON mode in chat models #81.
  - We fixed the dependencies for NumPy #70.
  - We fixed an issue when tracing Pydantic objects in the inputs #65.
  - We made the connection_string parameter optional as suggested at #65.
- 0.1.2:
  - Bug fix #35.
- 0.1.1:
  - Adding AzureCosmosDBNoSqlVectorSearch and AzureCosmosDBNoSqlSemanticCache for vector search and full-text search.
  - Adding AzureCosmosDBMongoVCoreVectorSearch and AzureCosmosDBMongoVCoreSemanticCache for vector search.
  - You can now create AzureAIEmbeddingsModel and AzureAIChatCompletionsModel clients directly from your AI project's connection string using the parameter project_connection_string. Your default Azure AI Services connection is used to find the model requested. This requires the azure-ai-projects package to be installed.
  - Support for native LLM structured outputs. Use with_structured_output(method="json_schema") to use native structured schema support. Use with_structured_output(method="json_mode") to use native JSON output capabilities. By default, LangChain uses method="function_calling", which uses tool calling capabilities to generate valid structured JSON payloads. This requires azure-ai-inference >= 1.0.0b7. A hedged usage sketch follows this entry.
  - Bug fix #18 and #31.
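A minimal sketch of the structured-output option described above. The schema is a hypothetical example and the endpoint, key, and model values are placeholders taken from the quick start; with_structured_output itself is the standard LangChain interface.

from pydantic import BaseModel
from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel

class Translation(BaseModel):
    # Hypothetical schema used only for illustration.
    text: str
    language: str

model = AzureAIChatCompletionsModel(
    endpoint="https://{your-resource-name}.services.ai.azure.com/openai/v1",
    credential="your-api-key",
    model="gpt-4o",
)

# method="json_schema" requests native structured output;
# method="function_calling" (the default) would use tool calling instead.
structured = model.with_structured_output(Translation, method="json_schema")
result = structured.invoke("Translate 'hi!' into Italian.")
# result is a Translation instance, e.g. Translation(text='Ciao!', language='Italian')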
- 0.1.0:
  - Introduce AzureAIEmbeddingsModel for embedding generation and AzureAIChatCompletionsModel for chat completions generation using the Azure AI Inference API. This client also supports the GitHub Models endpoint.
  - Introduce AzureAIOpenTelemetryTracer for tracing with OpenTelemetry and Azure Application Insights.