langchain-azure-ai 0.1.4
An integration package to support Azure AI Foundry capabilities for model inference in LangChain.
pip install langchain-azure-ai
Requires Python: <4.0,>=3.9
Dependencies
- aiohttp <4.0.0,>=3.10.0
- azure-ai-inference[opentelemetry] <2.0.0,>=1.0.0b7
- azure-core <2.0.0,>=1.32.0
- azure-cosmos <5.0.0,>=4.9.0
- azure-identity <2.0.0,>=1.15.0
- azure-monitor-opentelemetry <2.0.0,>=1.6.4; extra == "opentelemetry"
- langchain-core <0.4.0,>=0.3.0
- langchain-openai <0.4.0,>=0.3.0
- numpy >=1.26.2; python_version < "3.13"
- numpy >=2.1.0; python_version >= "3.13"
- opentelemetry-instrumentation-threading <0.50,>=0.49b2; extra == "opentelemetry"
- opentelemetry-semantic-conventions-ai <0.5.0,>=0.4.2; extra == "opentelemetry"
langchain-azure-ai
This package contains the LangChain integration for Azure AI Foundry. To learn more about how to use this package, see the LangChain documentation in Azure AI Foundry.
> [!NOTE]
> This package is in Public Preview. For more information, see Supplemental Terms of Use for Microsoft Azure Previews.
Installation
pip install -U langchain-azure-ai
To use the tracing capabilities with OpenTelemetry, install the package with the opentelemetry extra:
pip install -U langchain-azure-ai[opentelemetry]
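Once installed, a chat-completions client can be created as in the minimal sketch below. The endpoint URL, API key, and model name are placeholders, and the import path follows the package's documented layout; see the LangChain documentation in Azure AI Foundry for authoritative examples. Note that the model parameter is named `model` as of 0.1.3 (previously `model_name`).

```python
from azure.core.credentials import AzureKeyCredential
from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel

# Placeholder endpoint, key, and model -- replace with your
# Azure AI Foundry project's values.
llm = AzureAIChatCompletionsModel(
    endpoint="https://<your-resource>.services.ai.azure.com/models",
    credential=AzureKeyCredential("<your-api-key>"),
    model="gpt-4o",  # renamed from `model_name` in 0.1.3
)

response = llm.invoke("What is the capital of France?")
print(response.content)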
Changelog
- 0.1.4:
  - Bug fix #91.
- 0.1.3:
  - [Breaking change]: We renamed the parameter model_name in AzureAIEmbeddingsModel and AzureAIChatCompletionsModel to model, which is the parameter expected by the method langchain.chat_models.init_chat_model.
  - We fixed an issue with JSON mode in chat models #81.
  - We fixed the dependencies for NumPy #70.
  - We fixed an issue when tracing Pydantic objects in the inputs #65.
  - We made the connection_string parameter optional, as suggested in #65.
- 0.1.2:
  - Bug fix #35.
- 0.1.1:
  - Adding AzureCosmosDBNoSqlVectorSearch and AzureCosmosDBNoSqlSemanticCache for vector search and full text search.
  - Adding AzureCosmosDBMongoVCoreVectorSearch and AzureCosmosDBMongoVCoreSemanticCache for vector search.
  - You can now create AzureAIEmbeddingsModel and AzureAIChatCompletionsModel clients directly from your AI project's connection string using the parameter project_connection_string. Your default Azure AI Services connection is used to find the requested model. This requires the azure-ai-projects package to be installed.
  - Support for native LLM structured outputs. Use with_structured_output(method="json_schema") for native structured schema support. Use with_structured_output(method="json_mode") for native JSON output capabilities. By default, LangChain uses method="function_calling", which uses tool-calling capabilities to generate valid structured JSON payloads. This requires azure-ai-inference >= 1.0.0b7.
  - Bug fix #18 and #31.
- 0.1.0:
  - Introduce AzureAIEmbeddingsModel for embedding generation and AzureAIChatCompletionsModel for chat completions generation using the Azure AI Inference API. This client also supports the GitHub Models endpoint.
  - Introduce AzureAIInferenceTracer for tracing with OpenTelemetry and Azure Application Insights.
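The structured-output options introduced in 0.1.1 can be sketched as follows. The schema, endpoint, key, and model name are illustrative placeholders; the method values ("json_schema", "json_mode", and the default "function_calling") are the ones listed in the changelog, and "json_schema" requires azure-ai-inference >= 1.0.0b7.

```python
from azure.core.credentials import AzureKeyCredential
from pydantic import BaseModel
from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel

# Illustrative schema the model's output should conform to.
class Joke(BaseModel):
    setup: str
    punchline: str

llm = AzureAIChatCompletionsModel(
    endpoint="https://<your-resource>.services.ai.azure.com/models",
    credential=AzureKeyCredential("<your-api-key>"),
    model="gpt-4o",
)

# method="json_schema" uses the model's native structured-schema support;
# omitting method falls back to "function_calling" (tool calling).
structured_llm = llm.with_structured_output(Joke, method="json_schema")
joke = structured_llm.invoke("Tell me a joke about cats")
```

With method="json_mode" the model emits free-form JSON instead of validating against the schema, so the default "function_calling" remains the safest choice on models without native structured-output support.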