# AWS

The `LangChain` integrations related to the [Amazon AWS](https://aws.amazon.com/) platform.

First-party AWS integrations are available in the `langchain_aws` package.

```bash
pip install langchain-aws
```

Some community integrations are also available in the `langchain_community` package with the optional `boto3` dependency.

```bash
pip install langchain-community boto3
```

## Chat models

### Bedrock Chat

>[Amazon Bedrock](https://aws.amazon.com/bedrock/) is a fully managed service that offers a choice of
> high-performing foundation models (FMs) from leading AI companies like `AI21 Labs`, `Anthropic`, `Cohere`,
> `Meta`, `Stability AI`, and `Amazon` via a single API, along with a broad set of capabilities you need to
> build generative AI applications with security, privacy, and responsible AI. Using `Amazon Bedrock`,
> you can easily experiment with and evaluate top FMs for your use case, privately customize them with
> your data using techniques such as fine-tuning and `Retrieval Augmented Generation` (`RAG`), and build
> agents that execute tasks using your enterprise systems and data sources. Since `Amazon Bedrock` is
> serverless, you don't have to manage any infrastructure, and you can securely integrate and deploy
> generative AI capabilities into your applications using the AWS services you are already familiar with.

See a [usage example](/docs/integrations/chat/bedrock).

```python
from langchain_aws import ChatBedrock
```

### Bedrock Converse

AWS has recently released the Bedrock Converse API, which provides a unified conversational interface for Bedrock models. This API does not yet support custom models. You can see a list of all [models that are supported here](https://docs.aws.amazon.com/bedrock/latest/userguide/conversation-inference.html).

To improve reliability, the `ChatBedrock` integration will switch to using the Bedrock Converse API as soon as it has feature parity with the existing Bedrock API. Until then, a separate [ChatBedrockConverse](https://python.langchain.com/api_reference/aws/chat_models/langchain_aws.chat_models.bedrock_converse.ChatBedrockConverse.html) integration has been released. We recommend `ChatBedrockConverse` for users who do not need to use custom models.

See the [docs](/docs/integrations/chat/bedrock/#bedrock-converse-api) and [API reference](https://python.langchain.com/api_reference/aws/chat_models/langchain_aws.chat_models.bedrock_converse.ChatBedrockConverse.html) for more detail.

```python
from langchain_aws import ChatBedrockConverse
```
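A minimal sketch of calling a Bedrock model through `ChatBedrockConverse`; the model ID and region below are example values, and AWS credentials are assumed to be configured through the usual mechanisms (environment variables, a named profile, or an IAM role):

```python
from langchain_aws import ChatBedrockConverse

# Example model ID and region; use any Bedrock chat model enabled in your account.
llm = ChatBedrockConverse(
    model="anthropic.claude-3-sonnet-20240229-v1:0",
    region_name="us-east-1",
)

response = llm.invoke("What is Amazon Bedrock?")
print(response.content)
```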
## LLMs

### Bedrock

See a [usage example](/docs/integrations/llms/bedrock).

```python
from langchain_aws import BedrockLLM
```

### Amazon API Gateway

>[Amazon API Gateway](https://aws.amazon.com/api-gateway/) is a fully managed service that makes it easy for
> developers to create, publish, maintain, monitor, and secure APIs at any scale. APIs act as the "front door"
> for applications to access data, business logic, or functionality from your backend services. Using
> `API Gateway`, you can create RESTful APIs and WebSocket APIs that enable real-time two-way communication
> applications. `API Gateway` supports containerized and serverless workloads, as well as web applications.
>
> `API Gateway` handles all the tasks involved in accepting and processing up to hundreds of thousands of
> concurrent API calls, including traffic management, CORS support, authorization and access control,
> throttling, monitoring, and API version management. `API Gateway` has no minimum fees or startup costs.
> You pay for the API calls you receive and the amount of data transferred out and, with the `API Gateway`
> tiered pricing model, you can reduce your cost as your API usage scales.

See a [usage example](/docs/integrations/llms/amazon_api_gateway).

```python
from langchain_community.llms import AmazonAPIGateway
```

### SageMaker Endpoint

>[Amazon SageMaker](https://aws.amazon.com/sagemaker/) is a system that can build, train, and deploy
> machine learning (ML) models with fully managed infrastructure, tools, and workflows.

We use `SageMaker` to host our model and expose it as a `SageMaker Endpoint`.

See a [usage example](/docs/integrations/llms/sagemaker).

```python
from langchain_aws import SagemakerEndpoint
```

## Embedding models

### Bedrock

See a [usage example](/docs/integrations/text_embedding/bedrock).

```python
from langchain_aws import BedrockEmbeddings
```
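A minimal sketch of embedding text with `BedrockEmbeddings`; the model ID and region are example values:

```python
from langchain_aws import BedrockEmbeddings

# Example model ID and region; any Bedrock embedding model you have access to works.
embeddings = BedrockEmbeddings(
    model_id="amazon.titan-embed-text-v2:0",
    region_name="us-east-1",
)

query_vector = embeddings.embed_query("What is Amazon Bedrock?")
doc_vectors = embeddings.embed_documents(["First document.", "Second document."])
```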
### SageMaker Endpoint

See a [usage example](/docs/integrations/text_embedding/sagemaker-endpoint).

```python
from langchain_community.embeddings import SagemakerEndpointEmbeddings
from langchain_community.llms.sagemaker_endpoint import ContentHandlerBase
```

## Document loaders

### AWS S3 Directory and File

>[Amazon Simple Storage Service (Amazon S3)](https://docs.aws.amazon.com/AmazonS3/latest/userguide/using-folders.html)
> is an object storage service.

>[AWS S3 Directory](https://docs.aws.amazon.com/AmazonS3/latest/userguide/using-folders.html)

>[AWS S3 Buckets](https://docs.aws.amazon.com/AmazonS3/latest/userguide/UsingBucket.html)

See a [usage example for S3DirectoryLoader](/docs/integrations/document_loaders/aws_s3_directory).

See a [usage example for S3FileLoader](/docs/integrations/document_loaders/aws_s3_file).

```python
from langchain_community.document_loaders import S3DirectoryLoader, S3FileLoader
```

### Amazon Textract

>[Amazon Textract](https://docs.aws.amazon.com/managedservices/latest/userguide/textract.html) is a machine
> learning (ML) service that automatically extracts text, handwriting, and data from scanned documents.

See a [usage example](/docs/integrations/document_loaders/amazon_textract).

```python
from langchain_community.document_loaders import AmazonTextractPDFLoader
```

### Amazon Athena

>[Amazon Athena](https://aws.amazon.com/athena/) is a serverless, interactive analytics service built
> on open-source frameworks, supporting open-table and file formats.

See a [usage example](/docs/integrations/document_loaders/athena).

```python
from langchain_community.document_loaders.athena import AthenaLoader
```

### AWS Glue

>The [AWS Glue Data Catalog](https://docs.aws.amazon.com/en_en/glue/latest/dg/catalog-and-crawler.html) is a centralized metadata
> repository that allows you to manage, access, and share metadata about
> your data stored in AWS. It acts as a metadata store for your data assets,
> enabling various AWS services and your applications to query and connect
> to the data they need efficiently.

See a [usage example](/docs/integrations/document_loaders/glue_catalog).

```python
from langchain_community.document_loaders.glue_catalog import GlueCatalogLoader
```

## Vector stores

### Amazon OpenSearch Service

> [Amazon OpenSearch Service](https://aws.amazon.com/opensearch-service/) performs
> interactive log analytics, real-time application monitoring, website search, and more. `OpenSearch` is
> an open source, distributed search and analytics suite derived from `Elasticsearch`.
> `Amazon OpenSearch Service` offers the latest versions of `OpenSearch`, support for many versions of
> `Elasticsearch`, as well as visualization capabilities powered by `OpenSearch Dashboards` and `Kibana`.

We need to install several Python libraries.

```bash
pip install boto3 requests requests-aws4auth
```

See a [usage example](/docs/integrations/vectorstores/opensearch#using-aos-amazon-opensearch-service).

```python
from langchain_community.vectorstores import OpenSearchVectorSearch
```

### Amazon DocumentDB Vector Search

>[Amazon DocumentDB (with MongoDB Compatibility)](https://docs.aws.amazon.com/documentdb/) makes it easy to set up, operate, and scale MongoDB-compatible databases in the cloud.
> With Amazon DocumentDB, you can run the same application code and use the same drivers and tools that you use with MongoDB.
> Vector search for Amazon DocumentDB combines the flexibility and rich querying capability of a JSON-based document database with the power of vector search.

#### Installation and Setup

See [detailed configuration instructions](/docs/integrations/vectorstores/documentdb).

We need to install the `pymongo` Python package.

```bash
pip install pymongo
```

#### Deploy DocumentDB on AWS

[Amazon DocumentDB (with MongoDB Compatibility)](https://docs.aws.amazon.com/documentdb/) is a fast, reliable, and fully managed database service. Amazon DocumentDB makes it easy to set up, operate, and scale MongoDB-compatible databases in the cloud.

AWS offers services for computing, databases, storage, analytics, and other functionality. For an overview of all AWS services, see [Cloud Computing with Amazon Web Services](https://aws.amazon.com/what-is-aws/).

See a [usage example](/docs/integrations/vectorstores/documentdb).

```python
from langchain_community.vectorstores import DocumentDBVectorSearch
```

### Amazon MemoryDB

[Amazon MemoryDB](https://aws.amazon.com/memorydb/) is a durable, in-memory database service that delivers ultra-fast performance. MemoryDB is compatible with Redis OSS, a popular open-source data store, so you can quickly build applications using the same flexible and friendly Redis OSS APIs and commands that you already use today.

The `InMemoryVectorStore` class provides a vector store backed by Amazon MemoryDB.

```python
from langchain_aws.vectorstores.inmemorydb import InMemoryVectorStore

vds = InMemoryVectorStore.from_documents(
    chunks,
    embeddings,
    redis_url="rediss://cluster_endpoint:6379/ssl=True ssl_cert_reqs=none",
    vector_schema=vector_schema,
    index_name=INDEX_NAME,
)
```

See a [usage example](/docs/integrations/vectorstores/memorydb).

## Retrievers

### Amazon Kendra

> [Amazon Kendra](https://docs.aws.amazon.com/kendra/latest/dg/what-is-kendra.html) is an intelligent search service
> provided by `Amazon Web Services` (`AWS`). It utilizes advanced natural language processing (NLP) and machine
> learning algorithms to enable powerful search capabilities across various data sources within an organization.
> `Kendra` is designed to help users find the information they need quickly and accurately,
> improving productivity and decision-making.

> With `Kendra`, we can search across a wide range of content types, including documents, FAQs, knowledge bases,
> manuals, and websites. It supports multiple languages and can understand complex queries, synonyms, and
> contextual meanings to provide highly relevant search results.

We need to install the `langchain-aws` library.

```bash
pip install langchain-aws
```

See a [usage example](/docs/integrations/retrievers/amazon_kendra_retriever).

```python
from langchain_aws import AmazonKendraRetriever
```
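A minimal sketch of retrieving documents with `AmazonKendraRetriever`; the index ID is a placeholder for your own Kendra index, and the region is an example value:

```python
from langchain_aws import AmazonKendraRetriever

# Placeholder index ID; replace it with the ID of your Kendra index.
retriever = AmazonKendraRetriever(
    index_id="your-kendra-index-id",
    top_k=3,
    region_name="us-east-1",
)

docs = retriever.invoke("How do I configure single sign-on?")
for doc in docs:
    print(doc.metadata.get("source"), doc.page_content[:100])
```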
### Amazon Bedrock (Knowledge Bases)

> [Knowledge bases for Amazon Bedrock](https://aws.amazon.com/bedrock/knowledge-bases/) is an
> `Amazon Web Services` (`AWS`) offering which lets you quickly build RAG applications by using your
> private data to customize foundation model responses.

We need to install the `langchain-aws` library.

```bash
pip install langchain-aws
```

See a [usage example](/docs/integrations/retrievers/bedrock).

```python
from langchain_aws import AmazonKnowledgeBasesRetriever
```

## Tools

### AWS Lambda

>[`AWS Lambda`](https://aws.amazon.com/pm/lambda/) is a serverless computing service provided by
> `Amazon Web Services` (`AWS`). It helps developers build and run applications and services without
> provisioning or managing servers. This serverless architecture enables you to focus on writing and
> deploying code, while AWS automatically takes care of scaling, patching, and managing the
> infrastructure required to run your applications.

We need to install the `boto3` Python library.

```bash
pip install boto3
```

See a [usage example](/docs/integrations/tools/awslambda).

## Memory

### AWS DynamoDB

>[AWS DynamoDB](https://awscli.amazonaws.com/v2/documentation/api/latest/reference/dynamodb/index.html)
> is a fully managed `NoSQL` database service that provides fast and predictable performance with seamless scalability.

We have to configure the [AWS CLI](https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-configure.html).

We need to install the `boto3` library.

```bash
pip install boto3
```

See a [usage example](/docs/integrations/memory/aws_dynamodb).

```python
from langchain_community.chat_message_histories import DynamoDBChatMessageHistory
```
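A minimal sketch of persisting chat history with `DynamoDBChatMessageHistory`; the table name and session ID are example values, and the table is assumed to already exist with a partition key named `SessionId`:

```python
from langchain_community.chat_message_histories import DynamoDBChatMessageHistory

# Example table name and session ID; the table must have a "SessionId" partition key.
history = DynamoDBChatMessageHistory(
    table_name="SessionTable",
    session_id="user-123",
)

history.add_user_message("Hi, my name is Ada.")
history.add_ai_message("Hello Ada! How can I help you today?")
print(history.messages)
```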
## Graphs

### Amazon Neptune

>[Amazon Neptune](https://aws.amazon.com/neptune/)
> is a high-performance graph analytics and serverless database for superior scalability and availability.

For the Cypher and SPARQL integrations below, we need to install the `langchain-aws` library.

```bash
pip install langchain-aws
```

### Amazon Neptune with Cypher

See a [usage example](/docs/integrations/graphs/amazon_neptune_open_cypher).

```python
from langchain_aws.graphs import NeptuneGraph
from langchain_aws.graphs import NeptuneAnalyticsGraph
from langchain_aws.chains import create_neptune_opencypher_qa_chain
```

### Amazon Neptune with SPARQL

See a [usage example](/docs/integrations/graphs/amazon_neptune_sparql).

```python
from langchain_aws.graphs import NeptuneRdfGraph
from langchain_aws.chains import create_neptune_sparql_qa_chain
```

## Callbacks

### Bedrock token usage

```python
from langchain_community.callbacks.bedrock_anthropic_callback import BedrockAnthropicTokenUsageCallbackHandler
```

### SageMaker Tracking

>[Amazon SageMaker](https://aws.amazon.com/sagemaker/) is a fully managed service that is used to quickly
> and easily build, train, and deploy machine learning (ML) models.

>[Amazon SageMaker Experiments](https://docs.aws.amazon.com/sagemaker/latest/dg/experiments.html) is a capability
> of `Amazon SageMaker` that lets you organize, track,
> compare, and evaluate ML experiments and model versions.

We need to install several Python libraries.

```bash
pip install google-search-results sagemaker
```

See a [usage example](/docs/integrations/callbacks/sagemaker_tracking).

```python
from langchain_community.callbacks import SageMakerCallbackHandler
```

## Chains

### Amazon Comprehend Moderation Chain

>[Amazon Comprehend](https://aws.amazon.com/comprehend/) is a natural-language processing (NLP) service that
> uses machine learning to uncover valuable insights and connections in text.

We need to install the `boto3` and `nltk` libraries.

```bash
pip install boto3 nltk
```

See a [usage example](https://python.langchain.com/v0.1/docs/guides/productionization/safety/amazon_comprehend_chain/).

```python
from langchain_experimental.comprehend_moderation import AmazonComprehendModerationChain
```
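A rough sketch of constructing the moderation chain with its default settings; the region is an example value, and the chain is typically composed with prompts and LLMs as shown in the usage example linked above:

```python
import boto3

from langchain_experimental.comprehend_moderation import AmazonComprehendModerationChain

# Example region; the chain calls Amazon Comprehend through the boto3 client you pass in.
comprehend_client = boto3.client("comprehend", region_name="us-east-1")

comprehend_moderation = AmazonComprehendModerationChain(
    client=comprehend_client,
    verbose=True,  # optionally log each moderation step
)

# The moderation chain can then wrap the inputs and outputs of an LLM chain;
# see the usage example linked above for the full composition.
```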