| Column | Dtype | Values / lengths |
| --- | --- | --- |
| status | stringclasses | 1 value |
| repo_name | stringclasses | 31 values |
| repo_url | stringclasses | 31 values |
| issue_id | int64 | 1 to 104k |
| title | stringlengths | 4 to 233 |
| body | stringlengths | 0 to 186k |
| issue_url | stringlengths | 38 to 56 |
| pull_url | stringlengths | 37 to 54 |
| before_fix_sha | stringlengths | 40 to 40 |
| after_fix_sha | stringlengths | 40 to 40 |
| report_datetime | unknown | |
| language | stringclasses | 5 values |
| commit_datetime | unknown | |
| updated_file | stringlengths | 7 to 188 |
| chunk_content | stringlengths | 1 to 1.03M |
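Each record below repeats the issue-level fields (status, repo, issue, PR, SHAs, timestamps) and carries one chunk of the updated file in `chunk_content`. As a minimal sketch of how a split with this schema could be read with the `datasets` library — the dataset id here is a placeholder, not the actual repository name:

```python
from datasets import load_dataset

# Placeholder dataset id; substitute the real Hub repository name.
ds = load_dataset("your-org/issue-fix-chunks", split="train")

row = ds[0]
print(row["repo_name"], row["issue_id"], row["updated_file"])
print(row["chunk_content"][:200])  # first part of the code chunk
```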
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
7,034
Loading online PDFs gives temporary file path as source in metadata
Hi, first up, thank you for making langchain! I was playing around a little and found a minor issue with loading online PDFs, and would like to start contributing to langchain maybe by fixing this. ### System Info langchain 0.0.220, google collab, python 3.10 ### Who can help? _No response_ ### Information - [ ] The official example notebooks/scripts - [X] My own modified scripts ### Related Components - [ ] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [X] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction ```python from langchain.document_loaders import PyMuPDFLoader loader = PyMuPDFLoader('https://www.w3.org/WAI/ER/tests/xhtml/testfiles/resources/pdf/dummy.pdf') pages = loader.load() pages[0].metadata ``` <img width="977" alt="image" src="https://github.com/hwchase17/langchain/assets/21276922/4ededc60-bb03-4502-a8c8-3c221ab109c4"> ### Expected behavior Instead of giving the temporary file path, which is not useful and deleted shortly after, it could be more helpful if the source is set to be the URL passed to it. This would require some fixes in the `langchain/document_loaders/pdf.py` file.
https://github.com/langchain-ai/langchain/issues/7034
https://github.com/langchain-ai/langchain/pull/13274
6f64cb5078bb71007d25fff847541fd8f7713c0c
9bd6e9df365e966938979511237c035a02fb4fa9
"2023-07-01T23:24:53Z"
python
"2023-11-29T20:07:46Z"
libs/langchain/langchain/document_loaders/pdf.py
```python
        """Load file."""

        parser = PDFPlumberParser(
            text_kwargs=self.text_kwargs,
            dedupe=self.dedupe,
            extract_images=self.extract_images,
        )
        blob = Blob.from_path(self.file_path)
        return parser.parse(blob)


class AmazonTextractPDFLoader(BasePDFLoader):
    """Load `PDF` files from a local file system, HTTP or S3.

    To authenticate, the AWS client uses the following methods to
    automatically load credentials:
    https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html

    If a specific credential profile should be used, you must pass
    the name of the profile from the ~/.aws/credentials file that is to be used.

    Make sure the credentials / roles used have the required policies to
    access the Amazon Textract service.

    Example:
        .. code-block:: python

            from langchain.document_loaders import AmazonTextractPDFLoader
            loader = AmazonTextractPDFLoader(
                file_path="s3://pdfs/myfile.pdf"
            )
            document = loader.load()
    """

    def __init__(
```
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
7,034
Loading online PDFs gives temporary file path as source in metadata
Hi, first up, thank you for making langchain! I was playing around a little and found a minor issue with loading online PDFs, and would like to start contributing to langchain maybe by fixing this. ### System Info langchain 0.0.220, google collab, python 3.10 ### Who can help? _No response_ ### Information - [ ] The official example notebooks/scripts - [X] My own modified scripts ### Related Components - [ ] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [X] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction ```python from langchain.document_loaders import PyMuPDFLoader loader = PyMuPDFLoader('https://www.w3.org/WAI/ER/tests/xhtml/testfiles/resources/pdf/dummy.pdf') pages = loader.load() pages[0].metadata ``` <img width="977" alt="image" src="https://github.com/hwchase17/langchain/assets/21276922/4ededc60-bb03-4502-a8c8-3c221ab109c4"> ### Expected behavior Instead of giving the temporary file path, which is not useful and deleted shortly after, it could be more helpful if the source is set to be the URL passed to it. This would require some fixes in the `langchain/document_loaders/pdf.py` file.
https://github.com/langchain-ai/langchain/issues/7034
https://github.com/langchain-ai/langchain/pull/13274
6f64cb5078bb71007d25fff847541fd8f7713c0c
9bd6e9df365e966938979511237c035a02fb4fa9
"2023-07-01T23:24:53Z"
python
"2023-11-29T20:07:46Z"
libs/langchain/langchain/document_loaders/pdf.py
```python
        self,
        file_path: str,
        textract_features: Optional[Sequence[str]] = None,
        client: Optional[Any] = None,
        credentials_profile_name: Optional[str] = None,
        region_name: Optional[str] = None,
        endpoint_url: Optional[str] = None,
        headers: Optional[Dict] = None,
    ) -> None:
        """Initialize the loader.

        Args:
            file_path: A file, url or s3 path for input file
            textract_features: Features to be used for extraction, each feature
                should be passed as a str that conforms to the enum
                `Textract_Features`, see `amazon-textract-caller` pkg
            client: boto3 textract client (Optional)
            credentials_profile_name: AWS profile name, if not default (Optional)
            region_name: AWS region, eg us-east-1 (Optional)
            endpoint_url: endpoint url for the textract service (Optional)
        """
        super().__init__(file_path, headers=headers)

        try:
            import textractcaller as tc
        except ImportError:
            raise ModuleNotFoundError(
                "Could not import amazon-textract-caller python package. "
                "Please install it with `pip install amazon-textract-caller`."
            )

        if textract_features:
```
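The issue captured in these rows reports that `pages[0].metadata["source"]` ends up pointing at a temporary download path rather than the original URL. Purely as a workaround sketch, grounded in the reproduction snippet from the issue body (the actual fix landed in the linked PR and may differ), the metadata can simply be rewritten after loading:

```python
from langchain.document_loaders import PyMuPDFLoader

url = "https://www.w3.org/WAI/ER/tests/xhtml/testfiles/resources/pdf/dummy.pdf"
pages = PyMuPDFLoader(url).load()

# Replace the temporary file path recorded by the loader with the original URL.
for page in pages:
    page.metadata["source"] = url

print(pages[0].metadata["source"])  # now the URL, not a /tmp/... path
```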
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
7,034
Loading online PDFs gives temporary file path as source in metadata
Hi, first up, thank you for making langchain! I was playing around a little and found a minor issue with loading online PDFs, and would like to start contributing to langchain maybe by fixing this. ### System Info langchain 0.0.220, google collab, python 3.10 ### Who can help? _No response_ ### Information - [ ] The official example notebooks/scripts - [X] My own modified scripts ### Related Components - [ ] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [X] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction ```python from langchain.document_loaders import PyMuPDFLoader loader = PyMuPDFLoader('https://www.w3.org/WAI/ER/tests/xhtml/testfiles/resources/pdf/dummy.pdf') pages = loader.load() pages[0].metadata ``` <img width="977" alt="image" src="https://github.com/hwchase17/langchain/assets/21276922/4ededc60-bb03-4502-a8c8-3c221ab109c4"> ### Expected behavior Instead of giving the temporary file path, which is not useful and deleted shortly after, it could be more helpful if the source is set to be the URL passed to it. This would require some fixes in the `langchain/document_loaders/pdf.py` file.
https://github.com/langchain-ai/langchain/issues/7034
https://github.com/langchain-ai/langchain/pull/13274
6f64cb5078bb71007d25fff847541fd8f7713c0c
9bd6e9df365e966938979511237c035a02fb4fa9
"2023-07-01T23:24:53Z"
python
"2023-11-29T20:07:46Z"
libs/langchain/langchain/document_loaders/pdf.py
```python
            features = [tc.Textract_Features[x] for x in textract_features]
        else:
            features = []

        if credentials_profile_name or region_name or endpoint_url:
            try:
                import boto3

                if credentials_profile_name is not None:
                    session = boto3.Session(profile_name=credentials_profile_name)
                else:
                    session = boto3.Session()

                client_params = {}
                if region_name:
                    client_params["region_name"] = region_name
                if endpoint_url:
                    client_params["endpoint_url"] = endpoint_url

                client = session.client("textract", **client_params)
            except ImportError:
                raise ModuleNotFoundError(
                    "Could not import boto3 python package. "
                    "Please install it with `pip install boto3`."
                )
            except Exception as e:
                raise ValueError(
                    "Could not load credentials to authenticate with AWS client. "
                    "Please check that credentials in the specified "
                    "profile name are valid."
                ) from e
        self.parser = AmazonTextractPDFParser(textract_features=features, client=client)

    def load(self) -> List[Document]:
```
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
7,034
Loading online PDFs gives temporary file path as source in metadata
Hi, first up, thank you for making langchain! I was playing around a little and found a minor issue with loading online PDFs, and would like to start contributing to langchain maybe by fixing this. ### System Info langchain 0.0.220, google collab, python 3.10 ### Who can help? _No response_ ### Information - [ ] The official example notebooks/scripts - [X] My own modified scripts ### Related Components - [ ] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [X] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction ```python from langchain.document_loaders import PyMuPDFLoader loader = PyMuPDFLoader('https://www.w3.org/WAI/ER/tests/xhtml/testfiles/resources/pdf/dummy.pdf') pages = loader.load() pages[0].metadata ``` <img width="977" alt="image" src="https://github.com/hwchase17/langchain/assets/21276922/4ededc60-bb03-4502-a8c8-3c221ab109c4"> ### Expected behavior Instead of giving the temporary file path, which is not useful and deleted shortly after, it could be more helpful if the source is set to be the URL passed to it. This would require some fixes in the `langchain/document_loaders/pdf.py` file.
https://github.com/langchain-ai/langchain/issues/7034
https://github.com/langchain-ai/langchain/pull/13274
6f64cb5078bb71007d25fff847541fd8f7713c0c
9bd6e9df365e966938979511237c035a02fb4fa9
"2023-07-01T23:24:53Z"
python
"2023-11-29T20:07:46Z"
libs/langchain/langchain/document_loaders/pdf.py
```python
        """Load given path as pages."""
        return list(self.lazy_load())

    def lazy_load(
        self,
    ) -> Iterator[Document]:
        """Lazy load documents"""
        if self.web_path and self._is_s3_url(self.web_path):
            blob = Blob(path=self.web_path)
        else:
            blob = Blob.from_path(self.file_path)
            if AmazonTextractPDFLoader._get_number_of_pages(blob) > 1:
                raise ValueError(
                    f"the file {blob.path} is a multi-page document, \
                    but not stored on S3. \
                    Textract requires multi-page documents to be on S3."
                )

        yield from self.parser.parse(blob)

    @staticmethod
    def _get_number_of_pages(blob: Blob) -> int:
```
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
7,034
Loading online PDFs gives temporary file path as source in metadata
Hi, first up, thank you for making langchain! I was playing around a little and found a minor issue with loading online PDFs, and would like to start contributing to langchain maybe by fixing this. ### System Info langchain 0.0.220, google collab, python 3.10 ### Who can help? _No response_ ### Information - [ ] The official example notebooks/scripts - [X] My own modified scripts ### Related Components - [ ] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [X] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction ```python from langchain.document_loaders import PyMuPDFLoader loader = PyMuPDFLoader('https://www.w3.org/WAI/ER/tests/xhtml/testfiles/resources/pdf/dummy.pdf') pages = loader.load() pages[0].metadata ``` <img width="977" alt="image" src="https://github.com/hwchase17/langchain/assets/21276922/4ededc60-bb03-4502-a8c8-3c221ab109c4"> ### Expected behavior Instead of giving the temporary file path, which is not useful and deleted shortly after, it could be more helpful if the source is set to be the URL passed to it. This would require some fixes in the `langchain/document_loaders/pdf.py` file.
https://github.com/langchain-ai/langchain/issues/7034
https://github.com/langchain-ai/langchain/pull/13274
6f64cb5078bb71007d25fff847541fd8f7713c0c
9bd6e9df365e966938979511237c035a02fb4fa9
"2023-07-01T23:24:53Z"
python
"2023-11-29T20:07:46Z"
libs/langchain/langchain/document_loaders/pdf.py
```python
        try:
            import pypdf
            from PIL import Image, ImageSequence

        except ImportError:
            raise ModuleNotFoundError(
                "Could not import pypdf or Pilloe python package. "
                "Please install it with `pip install pypdf Pillow`."
            )
        if blob.mimetype == "application/pdf":
            with blob.as_bytes_io() as input_pdf_file:
                pdf_reader = pypdf.PdfReader(input_pdf_file)
                return len(pdf_reader.pages)
        elif blob.mimetype == "image/tiff":
            num_pages = 0
            img = Image.open(blob.as_bytes())
            for _, _ in enumerate(ImageSequence.Iterator(img)):
                num_pages += 1
            return num_pages
        elif blob.mimetype in ["image/png", "image/jpeg"]:
            return 1
        else:
            raise ValueError(f"unsupported mime type: {blob.mimetype}")


class DocumentIntelligenceLoader(BasePDFLoader):
```
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
7,034
Loading online PDFs gives temporary file path as source in metadata
Hi, first up, thank you for making langchain! I was playing around a little and found a minor issue with loading online PDFs, and would like to start contributing to langchain maybe by fixing this. ### System Info langchain 0.0.220, google collab, python 3.10 ### Who can help? _No response_ ### Information - [ ] The official example notebooks/scripts - [X] My own modified scripts ### Related Components - [ ] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [X] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction ```python from langchain.document_loaders import PyMuPDFLoader loader = PyMuPDFLoader('https://www.w3.org/WAI/ER/tests/xhtml/testfiles/resources/pdf/dummy.pdf') pages = loader.load() pages[0].metadata ``` <img width="977" alt="image" src="https://github.com/hwchase17/langchain/assets/21276922/4ededc60-bb03-4502-a8c8-3c221ab109c4"> ### Expected behavior Instead of giving the temporary file path, which is not useful and deleted shortly after, it could be more helpful if the source is set to be the URL passed to it. This would require some fixes in the `langchain/document_loaders/pdf.py` file.
https://github.com/langchain-ai/langchain/issues/7034
https://github.com/langchain-ai/langchain/pull/13274
6f64cb5078bb71007d25fff847541fd8f7713c0c
9bd6e9df365e966938979511237c035a02fb4fa9
"2023-07-01T23:24:53Z"
python
"2023-11-29T20:07:46Z"
libs/langchain/langchain/document_loaders/pdf.py
```python
    """Loads a PDF with Azure Document Intelligence"""

    def __init__(
        self,
        file_path: str,
        client: Any,
        model: str = "prebuilt-document",
        headers: Optional[Dict] = None,
    ) -> None:
        """
        Initialize the object for file processing with Azure Document Intelligence
        (formerly Form Recognizer).

        This constructor initializes a DocumentIntelligenceParser object to be used
        for parsing files using the Azure Document Intelligence API. The load method
```
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
7,034
Loading online PDFs gives temporary file path as source in metadata
Hi, first up, thank you for making langchain! I was playing around a little and found a minor issue with loading online PDFs, and would like to start contributing to langchain maybe by fixing this. ### System Info langchain 0.0.220, google collab, python 3.10 ### Who can help? _No response_ ### Information - [ ] The official example notebooks/scripts - [X] My own modified scripts ### Related Components - [ ] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [X] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction ```python from langchain.document_loaders import PyMuPDFLoader loader = PyMuPDFLoader('https://www.w3.org/WAI/ER/tests/xhtml/testfiles/resources/pdf/dummy.pdf') pages = loader.load() pages[0].metadata ``` <img width="977" alt="image" src="https://github.com/hwchase17/langchain/assets/21276922/4ededc60-bb03-4502-a8c8-3c221ab109c4"> ### Expected behavior Instead of giving the temporary file path, which is not useful and deleted shortly after, it could be more helpful if the source is set to be the URL passed to it. This would require some fixes in the `langchain/document_loaders/pdf.py` file.
https://github.com/langchain-ai/langchain/issues/7034
https://github.com/langchain-ai/langchain/pull/13274
6f64cb5078bb71007d25fff847541fd8f7713c0c
9bd6e9df365e966938979511237c035a02fb4fa9
"2023-07-01T23:24:53Z"
python
"2023-11-29T20:07:46Z"
libs/langchain/langchain/document_loaders/pdf.py
```python
        generates a Document node including metadata (source blob and page number)
        for each page.

        Parameters:
        -----------
        file_path : str
            The path to the file that needs to be parsed.
        client: Any
            A DocumentAnalysisClient to perform the analysis of the blob
        model : str
            The model name or ID to be used for form recognition in Azure.

        Examples:
        ---------
        >>> obj = DocumentIntelligenceLoader(
        ...     file_path="path/to/file",
        ...     client=client,
        ...     model="prebuilt-document"
        ... )
        """

        self.parser = DocumentIntelligenceParser(client=client, model=model)
        super().__init__(file_path, headers=headers)

    def load(self) -> List[Document]:
        """Load given path as pages."""
        return list(self.lazy_load())

    def lazy_load(
        self,
    ) -> Iterator[Document]:
        """Lazy load given path as pages."""
        blob = Blob.from_path(self.file_path)
        yield from self.parser.parse(blob)
```
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/agents/load_tools.py
```python
"""Tools provide access to various resources and services.

LangChain has a large ecosystem of integrations with various external resources
like local and remote file systems, APIs and databases.

These integrations allow developers to create versatile applications that combine
the power of LLMs with the ability to access, interact with and manipulate
external resources.

When developing an application, developers should inspect the capabilities and
permissions of the tools that underlie the given agent toolkit, and determine
whether permissions of the given toolkit are appropriate for the application.

See [Security](https://python.langchain.com/docs/security) for more information.
"""
import warnings
from typing import Any, Dict, List, Optional, Callable, Tuple

from mypy_extensions import Arg, KwArg

from langchain.agents.tools import Tool
from langchain_core.language_models import BaseLanguageModel
from langchain.callbacks.base import BaseCallbackManager
from langchain.callbacks.manager import Callbacks
from langchain.chains.api import news_docs, open_meteo_docs, podcast_docs, tmdb_docs
from langchain.chains.api.base import APIChain
from langchain.chains.llm_math.base import LLMMathChain
from langchain.utilities.dalle_image_generator import DallEAPIWrapper
from langchain.utilities.requests import TextRequestsWrapper
from langchain.tools.arxiv.tool import ArxivQueryRun
from langchain.tools.golden_query.tool import GoldenQueryRun
from langchain.tools.pubmed.tool import PubmedQueryRun
from langchain.tools.base import BaseTool
```
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/agents/load_tools.py
```python
from langchain.tools.bing_search.tool import BingSearchRun
from langchain.tools.ddg_search.tool import DuckDuckGoSearchRun
from langchain.tools.google_cloud.texttospeech import GoogleCloudTextToSpeechTool
from langchain.tools.google_lens.tool import GoogleLensQueryRun
from langchain.tools.google_search.tool import GoogleSearchResults, GoogleSearchRun
from langchain.tools.google_scholar.tool import GoogleScholarQueryRun
from langchain.tools.google_finance.tool import GoogleFinanceQueryRun
from langchain.tools.google_trends.tool import GoogleTrendsQueryRun
from langchain.tools.metaphor_search.tool import MetaphorSearchResults
from langchain.tools.google_jobs.tool import GoogleJobsQueryRun
from langchain.tools.google_serper.tool import GoogleSerperResults, GoogleSerperRun
from langchain.tools.searchapi.tool import SearchAPIResults, SearchAPIRun
from langchain.tools.graphql.tool import BaseGraphQLTool
from langchain.tools.human.tool import HumanInputRun
from langchain.tools.requests.tool import (
    RequestsDeleteTool,
    RequestsGetTool,
    RequestsPatchTool,
    RequestsPostTool,
    RequestsPutTool,
)
from langchain.tools.eleven_labs.text2speech import ElevenLabsText2SpeechTool
from langchain.tools.scenexplain.tool import SceneXplainTool
from langchain.tools.searx_search.tool import SearxSearchResults, SearxSearchRun
from langchain.tools.shell.tool import ShellTool
from langchain.tools.sleep.tool import SleepTool
from langchain.tools.stackexchange.tool import StackExchangeTool
from langchain.tools.wikipedia.tool import WikipediaQueryRun
from langchain.tools.wolfram_alpha.tool import WolframAlphaQueryRun
from langchain.tools.openweathermap.tool import OpenWeatherMapQueryRun
```
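The feature request repeated in these rows asks for a Merriam-Webster dictionary tool. Purely as an illustration, not the implementation merged in the linked PR, a minimal lookup against the Collegiate Dictionary API could look like the sketch below; the v3 endpoint format follows dictionaryapi.com's published URL scheme:

```python
import requests


def merriam_webster_definitions(word: str, api_key: str) -> list[str]:
    """Return short definitions for `word` from the Collegiate Dictionary API."""
    url = f"https://www.dictionaryapi.com/api/v3/references/collegiate/json/{word}"
    response = requests.get(url, params={"key": api_key}, timeout=10)
    response.raise_for_status()
    entries = response.json()
    # Plain strings in the response are spelling suggestions, not entries.
    return [
        definition
        for entry in entries
        if isinstance(entry, dict)
        for definition in entry.get("shortdef", [])
    ]
```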
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/agents/load_tools.py
```python
from langchain.tools.dataforseo_api_search import DataForSeoAPISearchRun
from langchain.tools.dataforseo_api_search import DataForSeoAPISearchResults
from langchain.tools.memorize.tool import Memorize
from langchain.tools.reddit_search.tool import RedditSearchRun
from langchain.utilities.arxiv import ArxivAPIWrapper
from langchain.utilities.golden_query import GoldenQueryAPIWrapper
from langchain.utilities.pubmed import PubMedAPIWrapper
from langchain.utilities.bing_search import BingSearchAPIWrapper
from langchain.utilities.duckduckgo_search import DuckDuckGoSearchAPIWrapper
from langchain.utilities.google_lens import GoogleLensAPIWrapper
from langchain.utilities.google_jobs import GoogleJobsAPIWrapper
from langchain.utilities.google_search import GoogleSearchAPIWrapper
from langchain.utilities.google_serper import GoogleSerperAPIWrapper
from langchain.utilities.google_scholar import GoogleScholarAPIWrapper
from langchain.utilities.google_finance import GoogleFinanceAPIWrapper
from langchain.utilities.google_trends import GoogleTrendsAPIWrapper
from langchain.utilities.metaphor_search import MetaphorSearchAPIWrapper
from langchain.utilities.awslambda import LambdaWrapper
from langchain.utilities.graphql import GraphQLAPIWrapper
from langchain.utilities.searchapi import SearchApiAPIWrapper
from langchain.utilities.searx_search import SearxSearchWrapper
from langchain.utilities.serpapi import SerpAPIWrapper
from langchain.utilities.stackexchange import StackExchangeAPIWrapper
from langchain.utilities.twilio import TwilioAPIWrapper
from langchain.utilities.wikipedia import WikipediaAPIWrapper
from langchain.utilities.wolfram_alpha import WolframAlphaAPIWrapper
from langchain.utilities.openweathermap import OpenWeatherMapAPIWrapper
from langchain.utilities.dataforseo_api_search import DataForSeoAPIWrapper
from langchain.utilities.reddit_search import RedditSearchAPIWrapper


def _get_python_repl() -> BaseTool:
```
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/agents/load_tools.py
```python
    raise ImportError(
        "This tool has been moved to langchain experiment. "
        "This tool has access to a python REPL. "
        "For best practices make sure to sandbox this tool. "
        "Read https://github.com/langchain-ai/langchain/blob/master/SECURITY.md "
        "To keep using this code as is, install langchain experimental and "
        "update relevant imports replacing 'langchain' with 'langchain_experimental'"
    )


def _get_tools_requests_get() -> BaseTool:
```
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/agents/load_tools.py
```python
    return RequestsGetTool(requests_wrapper=TextRequestsWrapper())


def _get_tools_requests_post() -> BaseTool:
    return RequestsPostTool(requests_wrapper=TextRequestsWrapper())


def _get_tools_requests_patch() -> BaseTool:
    return RequestsPatchTool(requests_wrapper=TextRequestsWrapper())


def _get_tools_requests_put() -> BaseTool:
    return RequestsPutTool(requests_wrapper=TextRequestsWrapper())


def _get_tools_requests_delete() -> BaseTool:
    return RequestsDeleteTool(requests_wrapper=TextRequestsWrapper())


def _get_terminal() -> BaseTool:
    return ShellTool()


def _get_sleep() -> BaseTool:
    return SleepTool()


_BASE_TOOLS: Dict[str, Callable[[], BaseTool]] = {
    "requests": _get_tools_requests_get,
    "requests_get": _get_tools_requests_get,
    "requests_post": _get_tools_requests_post,
    "requests_patch": _get_tools_requests_patch,
    "requests_put": _get_tools_requests_put,
    "requests_delete": _get_tools_requests_delete,
    "terminal": _get_terminal,
    "sleep": _get_sleep,
}


def _get_llm_math(llm: BaseLanguageModel) -> BaseTool:
```
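The chunk above shows the zero-argument factories collected in `_BASE_TOOLS`. With a langchain release from the period shown in these records installed, those registry keys can be passed straight to `load_tools`; a usage sketch, not part of the dataset:

```python
from langchain.agents import load_tools

# "requests_get" and "sleep" are _BASE_TOOLS keys, so no LLM or API keys are needed.
tools = load_tools(["requests_get", "sleep"])
for tool in tools:
    print(tool.name, "-", tool.description)
```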
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/agents/load_tools.py
```python
    return Tool(
        name="Calculator",
        description="Useful for when you need to answer questions about math.",
        func=LLMMathChain.from_llm(llm=llm).run,
        coroutine=LLMMathChain.from_llm(llm=llm).arun,
    )


def _get_open_meteo_api(llm: BaseLanguageModel) -> BaseTool:
    chain = APIChain.from_llm_and_api_docs(
        llm,
        open_meteo_docs.OPEN_METEO_DOCS,
        limit_to_domains=["https://api.open-meteo.com/"],
    )
    return Tool(
        name="Open-Meteo-API",
        description="Useful for when you want to get weather information from the OpenMeteo API. The input should be a question in natural language that this API can answer.",
        func=chain.run,
    )


_LLM_TOOLS: Dict[str, Callable[[BaseLanguageModel], BaseTool]] = {
    "llm-math": _get_llm_math,
    "open-meteo-api": _get_open_meteo_api,
}


def _get_news_api(llm: BaseLanguageModel, **kwargs: Any) -> BaseTool:
```
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/agents/load_tools.py
```python
    news_api_key = kwargs["news_api_key"]
    chain = APIChain.from_llm_and_api_docs(
        llm,
        news_docs.NEWS_DOCS,
        headers={"X-Api-Key": news_api_key},
        limit_to_domains=["https://newsapi.org/"],
    )
    return Tool(
        name="News-API",
        description="Use this when you want to get information about the top headlines of current news stories. The input should be a question in natural language that this API can answer.",
        func=chain.run,
    )


def _get_tmdb_api(llm: BaseLanguageModel, **kwargs: Any) -> BaseTool:
    tmdb_bearer_token = kwargs["tmdb_bearer_token"]
    chain = APIChain.from_llm_and_api_docs(
        llm,
        tmdb_docs.TMDB_DOCS,
        headers={"Authorization": f"Bearer {tmdb_bearer_token}"},
        limit_to_domains=["https://api.themoviedb.org/"],
    )
    return Tool(
        name="TMDB-API",
        description="Useful for when you want to get information from The Movie Database. The input should be a question in natural language that this API can answer.",
        func=chain.run,
    )


def _get_podcast_api(llm: BaseLanguageModel, **kwargs: Any) -> BaseTool:
```
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/agents/load_tools.py
```python
    listen_api_key = kwargs["listen_api_key"]
    chain = APIChain.from_llm_and_api_docs(
        llm,
        podcast_docs.PODCAST_DOCS,
        headers={"X-ListenAPI-Key": listen_api_key},
        limit_to_domains=["https://listen-api.listennotes.com/"],
    )
    return Tool(
        name="Podcast-API",
        description="Use the Listen Notes Podcast API to search all podcasts or episodes. The input should be a question in natural language that this API can answer.",
        func=chain.run,
    )


def _get_lambda_api(**kwargs: Any) -> BaseTool:
    return Tool(
        name=kwargs["awslambda_tool_name"],
        description=kwargs["awslambda_tool_description"],
        func=LambdaWrapper(**kwargs).run,
    )


def _get_wolfram_alpha(**kwargs: Any) -> BaseTool:
```
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/agents/load_tools.py
```python
    return WolframAlphaQueryRun(api_wrapper=WolframAlphaAPIWrapper(**kwargs))


def _get_google_search(**kwargs: Any) -> BaseTool:
    return GoogleSearchRun(api_wrapper=GoogleSearchAPIWrapper(**kwargs))


def _get_wikipedia(**kwargs: Any) -> BaseTool:
    return WikipediaQueryRun(api_wrapper=WikipediaAPIWrapper(**kwargs))


def _get_arxiv(**kwargs: Any) -> BaseTool:
    return ArxivQueryRun(api_wrapper=ArxivAPIWrapper(**kwargs))


def _get_golden_query(**kwargs: Any) -> BaseTool:
    return GoldenQueryRun(api_wrapper=GoldenQueryAPIWrapper(**kwargs))


def _get_pubmed(**kwargs: Any) -> BaseTool:
    return PubmedQueryRun(api_wrapper=PubMedAPIWrapper(**kwargs))


def _get_google_jobs(**kwargs: Any) -> BaseTool:
    return GoogleJobsQueryRun(api_wrapper=GoogleJobsAPIWrapper(**kwargs))


def _get_google_lens(**kwargs: Any) -> BaseTool:
    return GoogleLensQueryRun(api_wrapper=GoogleLensAPIWrapper(**kwargs))


def _get_google_serper(**kwargs: Any) -> BaseTool:
    return GoogleSerperRun(api_wrapper=GoogleSerperAPIWrapper(**kwargs))


def _get_google_scholar(**kwargs: Any) -> BaseTool:
    return GoogleScholarQueryRun(api_wrapper=GoogleScholarAPIWrapper(**kwargs))


def _get_google_finance(**kwargs: Any) -> BaseTool:
    return GoogleFinanceQueryRun(api_wrapper=GoogleFinanceAPIWrapper(**kwargs))


def _get_google_trends(**kwargs: Any) -> BaseTool:
    return GoogleTrendsQueryRun(api_wrapper=GoogleTrendsAPIWrapper(**kwargs))


def _get_google_serper_results_json(**kwargs: Any) -> BaseTool:
```
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/agents/load_tools.py
```python
    return GoogleSerperResults(api_wrapper=GoogleSerperAPIWrapper(**kwargs))


def _get_google_search_results_json(**kwargs: Any) -> BaseTool:
    return GoogleSearchResults(api_wrapper=GoogleSearchAPIWrapper(**kwargs))


def _get_searchapi(**kwargs: Any) -> BaseTool:
    return SearchAPIRun(api_wrapper=SearchApiAPIWrapper(**kwargs))


def _get_searchapi_results_json(**kwargs: Any) -> BaseTool:
    return SearchAPIResults(api_wrapper=SearchApiAPIWrapper(**kwargs))


def _get_serpapi(**kwargs: Any) -> BaseTool:
    return Tool(
        name="Search",
        description="A search engine. Useful for when you need to answer questions about current events. Input should be a search query.",
        func=SerpAPIWrapper(**kwargs).run,
        coroutine=SerpAPIWrapper(**kwargs).arun,
    )


def _get_stackexchange(**kwargs: Any) -> BaseTool:
    return StackExchangeTool(api_wrapper=StackExchangeAPIWrapper(**kwargs))


def _get_dalle_image_generator(**kwargs: Any) -> Tool:
    return Tool(
        "Dall-E-Image-Generator",
        DallEAPIWrapper(**kwargs).run,
        "A wrapper around OpenAI DALL-E API. Useful for when you need to generate images from a text description. Input should be an image description.",
    )


def _get_twilio(**kwargs: Any) -> BaseTool:
    return Tool(
        name="Text-Message",
        description="Useful for when you need to send a text message to a provided phone number.",
        func=TwilioAPIWrapper(**kwargs).run,
    )


def _get_searx_search(**kwargs: Any) -> BaseTool:
```
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/agents/load_tools.py
```python
    return SearxSearchRun(wrapper=SearxSearchWrapper(**kwargs))


def _get_searx_search_results_json(**kwargs: Any) -> BaseTool:
    wrapper_kwargs = {k: v for k, v in kwargs.items() if k != "num_results"}
    return SearxSearchResults(wrapper=SearxSearchWrapper(**wrapper_kwargs), **kwargs)


def _get_bing_search(**kwargs: Any) -> BaseTool:
    return BingSearchRun(api_wrapper=BingSearchAPIWrapper(**kwargs))


def _get_metaphor_search(**kwargs: Any) -> BaseTool:
    return MetaphorSearchResults(api_wrapper=MetaphorSearchAPIWrapper(**kwargs))


def _get_ddg_search(**kwargs: Any) -> BaseTool:
    return DuckDuckGoSearchRun(api_wrapper=DuckDuckGoSearchAPIWrapper(**kwargs))


def _get_human_tool(**kwargs: Any) -> BaseTool:
    return HumanInputRun(**kwargs)


def _get_scenexplain(**kwargs: Any) -> BaseTool:
    return SceneXplainTool(**kwargs)


def _get_graphql_tool(**kwargs: Any) -> BaseTool:
    graphql_endpoint = kwargs["graphql_endpoint"]
    wrapper = GraphQLAPIWrapper(graphql_endpoint=graphql_endpoint)
    return BaseGraphQLTool(graphql_wrapper=wrapper)


def _get_openweathermap(**kwargs: Any) -> BaseTool:
    return OpenWeatherMapQueryRun(api_wrapper=OpenWeatherMapAPIWrapper(**kwargs))


def _get_dataforseo_api_search(**kwargs: Any) -> BaseTool:
    return DataForSeoAPISearchRun(api_wrapper=DataForSeoAPIWrapper(**kwargs))


def _get_dataforseo_api_search_json(**kwargs: Any) -> BaseTool:
    return DataForSeoAPISearchResults(api_wrapper=DataForSeoAPIWrapper(**kwargs))


def _get_eleven_labs_text2speech(**kwargs: Any) -> BaseTool:
    return ElevenLabsText2SpeechTool(**kwargs)


def _get_memorize(llm: BaseLanguageModel, **kwargs: Any) -> BaseTool:
    return Memorize(llm=llm)


def _get_google_cloud_texttospeech(**kwargs: Any) -> BaseTool:
```
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/agents/load_tools.py
```python
    return GoogleCloudTextToSpeechTool(**kwargs)


def _get_reddit_search(**kwargs: Any) -> BaseTool:
    return RedditSearchRun(api_wrapper=RedditSearchAPIWrapper(**kwargs))


_EXTRA_LLM_TOOLS: Dict[
    str,
    Tuple[Callable[[Arg(BaseLanguageModel, "llm"), KwArg(Any)], BaseTool], List[str]],
] = {
    "news-api": (_get_news_api, ["news_api_key"]),
    "tmdb-api": (_get_tmdb_api, ["tmdb_bearer_token"]),
    "podcast-api": (_get_podcast_api, ["listen_api_key"]),
    "memorize": (_get_memorize, []),
}

_EXTRA_OPTIONAL_TOOLS: Dict[str, Tuple[Callable[[KwArg(Any)], BaseTool], List[str]]] = {
    "wolfram-alpha": (_get_wolfram_alpha, ["wolfram_alpha_appid"]),
    "google-search": (_get_google_search, ["google_api_key", "google_cse_id"]),
    "google-search-results-json": (
        _get_google_search_results_json,
        ["google_api_key", "google_cse_id", "num_results"],
    ),
    "searx-search-results-json": (
        _get_searx_search_results_json,
        ["searx_host", "engines", "num_results", "aiosession"],
    ),
    "bing-search": (_get_bing_search, ["bing_subscription_key", "bing_search_url"]),
    "metaphor-search": (_get_metaphor_search, ["metaphor_api_key"]),
    "ddg-search": (_get_ddg_search, []),
    "google-lens": (_get_google_lens, ["serp_api_key"]),
    "google-serper": (_get_google_serper, ["serper_api_key", "aiosession"]),
    "google-scholar": (
```
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/agents/load_tools.py
```python
        _get_google_scholar,
        ["top_k_results", "hl", "lr", "serp_api_key"],
    ),
    "google-finance": (
        _get_google_finance,
        ["serp_api_key"],
    ),
    "google-trends": (
        _get_google_trends,
        ["serp_api_key"],
    ),
    "google-jobs": (
        _get_google_jobs,
        ["serp_api_key"],
    ),
    "google-serper-results-json": (
        _get_google_serper_results_json,
        ["serper_api_key", "aiosession"],
    ),
    "searchapi": (_get_searchapi, ["searchapi_api_key", "aiosession"]),
    "searchapi-results-json": (
        _get_searchapi_results_json,
        ["searchapi_api_key", "aiosession"],
    ),
    "serpapi": (_get_serpapi, ["serpapi_api_key", "aiosession"]),
    "dalle-image-generator": (_get_dalle_image_generator, ["openai_api_key"]),
    "twilio": (_get_twilio, ["account_sid", "auth_token", "from_number"]),
    "searx-search": (_get_searx_search, ["searx_host", "engines", "aiosession"]),
    "wikipedia": (_get_wikipedia, ["top_k_results", "lang"]),
    "arxiv": (
```
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/agents/load_tools.py
```python
        _get_arxiv,
        ["top_k_results", "load_max_docs", "load_all_available_meta"],
    ),
    "golden-query": (_get_golden_query, ["golden_api_key"]),
    "pubmed": (_get_pubmed, ["top_k_results"]),
    "human": (_get_human_tool, ["prompt_func", "input_func"]),
    "awslambda": (
        _get_lambda_api,
        ["awslambda_tool_name", "awslambda_tool_description", "function_name"],
    ),
    "stackexchange": (_get_stackexchange, []),
    "sceneXplain": (_get_scenexplain, []),
    "graphql": (_get_graphql_tool, ["graphql_endpoint"]),
    "openweathermap-api": (_get_openweathermap, ["openweathermap_api_key"]),
    "dataforseo-api-search": (
        _get_dataforseo_api_search,
        ["api_login", "api_password", "aiosession"],
    ),
    "dataforseo-api-search-json": (
        _get_dataforseo_api_search_json,
        ["api_login", "api_password", "aiosession"],
    ),
    "eleven_labs_text2speech": (_get_eleven_labs_text2speech, ["eleven_api_key"]),
    "google_cloud_texttospeech": (_get_google_cloud_texttospeech, []),
    "reddit_search": (
        _get_reddit_search,
        ["reddit_client_id", "reddit_client_secret", "reddit_user_agent"],
    ),
}


def _handle_callbacks(
```
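The `_EXTRA_OPTIONAL_TOOLS` mapping above pairs each tool name with a factory and the configuration keys it expects. A hypothetical sketch of how the requested dictionary tool could be registered in the same style, reusing the lookup helper sketched earlier; the names and key are illustrative, not the merged implementation:

```python
from langchain.agents import Tool


def _get_dictionary_tool(**kwargs):
    # Hypothetical factory: wraps a plain definition lookup as a Tool.
    api_key = kwargs["merriam_webster_api_key"]
    return Tool(
        name="Merriam-Webster",
        description="Look up dictionary definitions for an English word.",
        func=lambda word: str(merriam_webster_definitions(word, api_key)),
    )


# Registration entry mirroring the pattern above:
# "merriam-webster": (_get_dictionary_tool, ["merriam_webster_api_key"])
```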
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/agents/load_tools.py
```python
    callback_manager: Optional[BaseCallbackManager], callbacks: Callbacks
) -> Callbacks:
    if callback_manager is not None:
        warnings.warn(
            "callback_manager is deprecated. Please use callbacks instead.",
            DeprecationWarning,
        )
        if callbacks is not None:
            raise ValueError(
                "Cannot specify both callback_manager and callbacks arguments."
            )
        return callback_manager
    return callbacks


def load_huggingface_tool(
    task_or_repo_id: str,
    model_repo_id: Optional[str] = None,
    token: Optional[str] = None,
    remote: bool = False,
    **kwargs: Any,
) -> BaseTool:
    """Loads a tool from the HuggingFace Hub.

    Args:
        task_or_repo_id: Task or model repo id.
        model_repo_id: Optional model repo id.
        token: Optional token.
        remote: Optional remote. Defaults to False.
```
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/agents/load_tools.py
**kwargs: Returns: A tool. """ try: from transformers import load_tool except ImportError: raise ImportError( "HuggingFace tools require the libraries `transformers>=4.29.0`" " and `huggingface_hub>=0.14.1` to be installed." " Please install it with" " `pip install --upgrade transformers huggingface_hub`." ) hf_tool = load_tool( task_or_repo_id, model_repo_id=model_repo_id, token=token, remote=remote, **kwargs, ) outputs = hf_tool.outputs if set(outputs) != {"text"}: raise NotImplementedError("Multimodal outputs not supported yet.") inputs = hf_tool.inputs if set(inputs) != {"text"}: raise NotImplementedError("Multimodal inputs not supported yet.") return Tool.from_function( hf_tool.__call__, name=hf_tool.name, description=hf_tool.description ) def load_tools(
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/agents/load_tools.py
tool_names: List[str], llm: Optional[BaseLanguageModel] = None, callbacks: Callbacks = None, **kwargs: Any, ) -> List[BaseTool]: """Load tools based on their name. Tools allow agents to interact with various resources and services like APIs, databases, file systems, etc. Please scope the permissions of each tools to the minimum required for the application. For example, if an application only needs to read from a database, the database tool should not be given write permissions. Moreover consider scoping the permissions to only allow accessing specific tables and impose user-level quota for limiting resource usage. Please read the APIs of the individual tools to determine which configuration they support. See [Security](https://python.langchain.com/docs/security) for more information. Args: tool_names: name of tools to load. llm: An optional language model, may be needed to initialize certain tools. callbacks: Optional callback manager or list of callback handlers. If not provided, default global callback manager will be used. Returns:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/agents/load_tools.py
List of tools. """ tools = [] callbacks = _handle_callbacks( callback_manager=kwargs.get("callback_manager"), callbacks=callbacks ) for name in tool_names: if name == "requests": warnings.warn( "tool name `requests` is deprecated - " "please use `requests_all` or specify the requests method" ) if name == "requests_all": requests_method_tools = [ _tool for _tool in _BASE_TOOLS if _tool.startswith("requests_") ] tool_names.extend(requests_method_tools) elif name in _BASE_TOOLS: tools.append(_BASE_TOOLS[name]()) elif name in _LLM_TOOLS: if llm is None: raise ValueError(f"Tool {name} requires an LLM to be provided") tool = _LLM_TOOLS[name](llm) tools.append(tool) elif name in _EXTRA_LLM_TOOLS: if llm is None: raise ValueError(f"Tool {name} requires an LLM to be provided")
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/agents/load_tools.py
_get_llm_tool_func, extra_keys = _EXTRA_LLM_TOOLS[name] missing_keys = set(extra_keys).difference(kwargs) if missing_keys: raise ValueError( f"Tool {name} requires some parameters that were not " f"provided: {missing_keys}" ) sub_kwargs = {k: kwargs[k] for k in extra_keys} tool = _get_llm_tool_func(llm=llm, **sub_kwargs) tools.append(tool) elif name in _EXTRA_OPTIONAL_TOOLS: _get_tool_func, extra_keys = _EXTRA_OPTIONAL_TOOLS[name] sub_kwargs = {k: kwargs[k] for k in extra_keys if k in kwargs} tool = _get_tool_func(**sub_kwargs) tools.append(tool) else: raise ValueError(f"Got unknown tool {name}") if callbacks is not None: for tool in tools: tool.callbacks = callbacks return tools def get_all_tool_names() -> List[str]: """Get a list of all possible tool names.""" return ( list(_BASE_TOOLS) + list(_EXTRA_OPTIONAL_TOOLS) + list(_EXTRA_LLM_TOOLS) + list(_LLM_TOOLS) )
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/tools/__init__.py
"""**Tools** are classes that an Agent uses to interact with the world. Each tool has a **description**. Agent uses the description to choose the right tool for the job. **Class hierarchy:** .. code-block:: ToolMetaclass --> BaseTool --> <name>Tool # Examples: AIPluginTool, BaseGraphQLTool <name> # Examples: BraveSearch, HumanInputRun **Main helpers:** .. code-block:: CallbackManagerForToolRun, AsyncCallbackManagerForToolRun """ from typing import Any from langchain.tools.base import BaseTool, StructuredTool, Tool, tool _DEPRECATED_TOOLS = {"PythonAstREPLTool", "PythonREPLTool"} def _import_ainetwork_app() -> Any:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/tools/__init__.py
from langchain.tools.ainetwork.app import AINAppOps return AINAppOps def _import_ainetwork_owner() -> Any: from langchain.tools.ainetwork.owner import AINOwnerOps return AINOwnerOps def _import_ainetwork_rule() -> Any: from langchain.tools.ainetwork.rule import AINRuleOps return AINRuleOps def _import_ainetwork_transfer() -> Any: from langchain.tools.ainetwork.transfer import AINTransfer return AINTransfer def _import_ainetwork_value() -> Any: from langchain.tools.ainetwork.value import AINValueOps return AINValueOps def _import_arxiv_tool() -> Any: from langchain.tools.arxiv.tool import ArxivQueryRun return ArxivQueryRun def _import_azure_cognitive_services_AzureCogsFormRecognizerTool() -> Any:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/tools/__init__.py
from langchain.tools.azure_cognitive_services import AzureCogsFormRecognizerTool return AzureCogsFormRecognizerTool def _import_azure_cognitive_services_AzureCogsImageAnalysisTool() -> Any: from langchain.tools.azure_cognitive_services import AzureCogsImageAnalysisTool return AzureCogsImageAnalysisTool def _import_azure_cognitive_services_AzureCogsSpeech2TextTool() -> Any: from langchain.tools.azure_cognitive_services import AzureCogsSpeech2TextTool return AzureCogsSpeech2TextTool def _import_azure_cognitive_services_AzureCogsText2SpeechTool() -> Any: from langchain.tools.azure_cognitive_services import AzureCogsText2SpeechTool return AzureCogsText2SpeechTool def _import_azure_cognitive_services_AzureCogsTextAnalyticsHealthTool() -> Any: from langchain.tools.azure_cognitive_services import ( AzureCogsTextAnalyticsHealthTool, ) return AzureCogsTextAnalyticsHealthTool def _import_bing_search_tool_BingSearchResults() -> Any: from langchain.tools.bing_search.tool import BingSearchResults return BingSearchResults def _import_bing_search_tool_BingSearchRun() -> Any: from langchain.tools.bing_search.tool import BingSearchRun return BingSearchRun def _import_brave_search_tool() -> Any: from langchain.tools.brave_search.tool import BraveSearch return BraveSearch def _import_ddg_search_tool_DuckDuckGoSearchResults() -> Any: from langchain.tools.ddg_search.tool import DuckDuckGoSearchResults return DuckDuckGoSearchResults def _import_ddg_search_tool_DuckDuckGoSearchRun() -> Any:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/tools/__init__.py
from langchain.tools.ddg_search.tool import DuckDuckGoSearchRun return DuckDuckGoSearchRun def _import_edenai_EdenAiExplicitImageTool() -> Any: from langchain.tools.edenai import EdenAiExplicitImageTool return EdenAiExplicitImageTool def _import_edenai_EdenAiObjectDetectionTool() -> Any: from langchain.tools.edenai import EdenAiObjectDetectionTool return EdenAiObjectDetectionTool def _import_edenai_EdenAiParsingIDTool() -> Any: from langchain.tools.edenai import EdenAiParsingIDTool return EdenAiParsingIDTool def _import_edenai_EdenAiParsingInvoiceTool() -> Any: from langchain.tools.edenai import EdenAiParsingInvoiceTool return EdenAiParsingInvoiceTool def _import_edenai_EdenAiSpeechToTextTool() -> Any: from langchain.tools.edenai import EdenAiSpeechToTextTool return EdenAiSpeechToTextTool def _import_edenai_EdenAiTextModerationTool() -> Any: from langchain.tools.edenai import EdenAiTextModerationTool return EdenAiTextModerationTool def _import_edenai_EdenAiTextToSpeechTool() -> Any: from langchain.tools.edenai import EdenAiTextToSpeechTool return EdenAiTextToSpeechTool def _import_edenai_EdenaiTool() -> Any: from langchain.tools.edenai import EdenaiTool return EdenaiTool def _import_eleven_labs_text2speech() -> Any: from langchain.tools.eleven_labs.text2speech import ElevenLabsText2SpeechTool return ElevenLabsText2SpeechTool def _import_file_management_CopyFileTool() -> Any:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/tools/__init__.py
from langchain.tools.file_management import CopyFileTool return CopyFileTool def _import_file_management_DeleteFileTool() -> Any: from langchain.tools.file_management import DeleteFileTool return DeleteFileTool def _import_file_management_FileSearchTool() -> Any: from langchain.tools.file_management import FileSearchTool return FileSearchTool def _import_file_management_ListDirectoryTool() -> Any: from langchain.tools.file_management import ListDirectoryTool return ListDirectoryTool def _import_file_management_MoveFileTool() -> Any: from langchain.tools.file_management import MoveFileTool return MoveFileTool def _import_file_management_ReadFileTool() -> Any: from langchain.tools.file_management import ReadFileTool return ReadFileTool def _import_file_management_WriteFileTool() -> Any: from langchain.tools.file_management import WriteFileTool return WriteFileTool def _import_gmail_GmailCreateDraft() -> Any: from langchain.tools.gmail import GmailCreateDraft return GmailCreateDraft def _import_gmail_GmailGetMessage() -> Any: from langchain.tools.gmail import GmailGetMessage return GmailGetMessage def _import_gmail_GmailGetThread() -> Any: from langchain.tools.gmail import GmailGetThread return GmailGetThread def _import_gmail_GmailSearch() -> Any:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/tools/__init__.py
from langchain.tools.gmail import GmailSearch return GmailSearch def _import_gmail_GmailSendMessage() -> Any: from langchain.tools.gmail import GmailSendMessage return GmailSendMessage def _import_google_cloud_texttospeech() -> Any: from langchain.tools.google_cloud.texttospeech import GoogleCloudTextToSpeechTool return GoogleCloudTextToSpeechTool def _import_google_places_tool() -> Any: from langchain.tools.google_places.tool import GooglePlacesTool return GooglePlacesTool def _import_google_search_tool_GoogleSearchResults() -> Any: from langchain.tools.google_search.tool import GoogleSearchResults return GoogleSearchResults def _import_google_search_tool_GoogleSearchRun() -> Any: from langchain.tools.google_search.tool import GoogleSearchRun return GoogleSearchRun def _import_google_serper_tool_GoogleSerperResults() -> Any: from langchain.tools.google_serper.tool import GoogleSerperResults return GoogleSerperResults def _import_google_serper_tool_GoogleSerperRun() -> Any: from langchain.tools.google_serper.tool import GoogleSerperRun return GoogleSerperRun def _import_graphql_tool() -> Any: from langchain.tools.graphql.tool import BaseGraphQLTool return BaseGraphQLTool def _import_human_tool() -> Any: from langchain.tools.human.tool import HumanInputRun return HumanInputRun def _import_ifttt() -> Any:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/tools/__init__.py
from langchain.tools.ifttt import IFTTTWebhook return IFTTTWebhook def _import_interaction_tool() -> Any: from langchain.tools.interaction.tool import StdInInquireTool return StdInInquireTool def _import_jira_tool() -> Any: from langchain.tools.jira.tool import JiraAction return JiraAction def _import_json_tool_JsonGetValueTool() -> Any: from langchain.tools.json.tool import JsonGetValueTool return JsonGetValueTool def _import_json_tool_JsonListKeysTool() -> Any: from langchain.tools.json.tool import JsonListKeysTool return JsonListKeysTool def _import_metaphor_search() -> Any: from langchain.tools.metaphor_search import MetaphorSearchResults return MetaphorSearchResults def _import_office365_create_draft_message() -> Any: from langchain.tools.office365.create_draft_message import O365CreateDraftMessage return O365CreateDraftMessage def _import_office365_events_search() -> Any: from langchain.tools.office365.events_search import O365SearchEvents return O365SearchEvents def _import_office365_messages_search() -> Any: from langchain.tools.office365.messages_search import O365SearchEmails return O365SearchEmails def _import_office365_send_event() -> Any: from langchain.tools.office365.send_event import O365SendEvent return O365SendEvent def _import_office365_send_message() -> Any:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/tools/__init__.py
from langchain.tools.office365.send_message import O365SendMessage return O365SendMessage def _import_office365_utils() -> Any: from langchain.tools.office365.utils import authenticate return authenticate def _import_openapi_utils_api_models() -> Any: from langchain.tools.openapi.utils.api_models import APIOperation return APIOperation def _import_openapi_utils_openapi_utils() -> Any: from langchain.tools.openapi.utils.openapi_utils import OpenAPISpec return OpenAPISpec def _import_openweathermap_tool() -> Any: from langchain.tools.openweathermap.tool import OpenWeatherMapQueryRun return OpenWeatherMapQueryRun def _import_playwright_ClickTool() -> Any: from langchain.tools.playwright import ClickTool return ClickTool def _import_playwright_CurrentWebPageTool() -> Any: from langchain.tools.playwright import CurrentWebPageTool return CurrentWebPageTool def _import_playwright_ExtractHyperlinksTool() -> Any: from langchain.tools.playwright import ExtractHyperlinksTool return ExtractHyperlinksTool def _import_playwright_ExtractTextTool() -> Any: from langchain.tools.playwright import ExtractTextTool return ExtractTextTool def _import_playwright_GetElementsTool() -> Any: from langchain.tools.playwright import GetElementsTool return GetElementsTool def _import_playwright_NavigateBackTool() -> Any:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/tools/__init__.py
from langchain.tools.playwright import NavigateBackTool return NavigateBackTool def _import_playwright_NavigateTool() -> Any: from langchain.tools.playwright import NavigateTool return NavigateTool def _import_plugin() -> Any: from langchain.tools.plugin import AIPluginTool return AIPluginTool def _import_powerbi_tool_InfoPowerBITool() -> Any: from langchain.tools.powerbi.tool import InfoPowerBITool return InfoPowerBITool def _import_powerbi_tool_ListPowerBITool() -> Any: from langchain.tools.powerbi.tool import ListPowerBITool return ListPowerBITool def _import_powerbi_tool_QueryPowerBITool() -> Any: from langchain.tools.powerbi.tool import QueryPowerBITool return QueryPowerBITool def _import_pubmed_tool() -> Any: from langchain.tools.pubmed.tool import PubmedQueryRun return PubmedQueryRun def _import_python_tool_PythonAstREPLTool() -> Any: raise ImportError( "This tool has been moved to langchain experiment. " "This tool has access to a python REPL. " "For best practices make sure to sandbox this tool. " "Read https://github.com/langchain-ai/langchain/blob/master/SECURITY.md " "To keep using this code as is, install langchain experimental and " "update relevant imports replacing 'langchain' with 'langchain_experimental'" ) def _import_python_tool_PythonREPLTool() -> Any:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/tools/__init__.py
raise ImportError( "This tool has been moved to langchain experiment. " "This tool has access to a python REPL. " "For best practices make sure to sandbox this tool. " "Read https://github.com/langchain-ai/langchain/blob/master/SECURITY.md " "To keep using this code as is, install langchain experimental and " "update relevant imports replacing 'langchain' with 'langchain_experimental'" ) def _import_reddit_search_RedditSearchRun() -> Any: from langchain.tools.reddit_search.tool import RedditSearchRun return RedditSearchRun def _import_render() -> Any: from langchain.tools.render import format_tool_to_openai_function return format_tool_to_openai_function def _import_requests_tool_BaseRequestsTool() -> Any: from langchain.tools.requests.tool import BaseRequestsTool return BaseRequestsTool def _import_requests_tool_RequestsDeleteTool() -> Any: from langchain.tools.requests.tool import RequestsDeleteTool return RequestsDeleteTool def _import_requests_tool_RequestsGetTool() -> Any: from langchain.tools.requests.tool import RequestsGetTool return RequestsGetTool def _import_requests_tool_RequestsPatchTool() -> Any: from langchain.tools.requests.tool import RequestsPatchTool return RequestsPatchTool def _import_requests_tool_RequestsPostTool() -> Any: from langchain.tools.requests.tool import RequestsPostTool return RequestsPostTool def _import_requests_tool_RequestsPutTool() -> Any:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/tools/__init__.py
from langchain.tools.requests.tool import RequestsPutTool return RequestsPutTool def _import_scenexplain_tool() -> Any: from langchain.tools.scenexplain.tool import SceneXplainTool return SceneXplainTool def _import_searx_search_tool_SearxSearchResults() -> Any: from langchain.tools.searx_search.tool import SearxSearchResults return SearxSearchResults def _import_searx_search_tool_SearxSearchRun() -> Any: from langchain.tools.searx_search.tool import SearxSearchRun return SearxSearchRun def _import_shell_tool() -> Any: from langchain.tools.shell.tool import ShellTool return ShellTool def _import_sleep_tool() -> Any: from langchain.tools.sleep.tool import SleepTool return SleepTool def _import_spark_sql_tool_BaseSparkSQLTool() -> Any: from langchain.tools.spark_sql.tool import BaseSparkSQLTool return BaseSparkSQLTool def _import_spark_sql_tool_InfoSparkSQLTool() -> Any: from langchain.tools.spark_sql.tool import InfoSparkSQLTool return InfoSparkSQLTool def _import_spark_sql_tool_ListSparkSQLTool() -> Any: from langchain.tools.spark_sql.tool import ListSparkSQLTool return ListSparkSQLTool def _import_spark_sql_tool_QueryCheckerTool() -> Any: from langchain.tools.spark_sql.tool import QueryCheckerTool return QueryCheckerTool def _import_spark_sql_tool_QuerySparkSQLTool() -> Any:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/tools/__init__.py
from langchain.tools.spark_sql.tool import QuerySparkSQLTool return QuerySparkSQLTool def _import_sql_database_tool_BaseSQLDatabaseTool() -> Any: from langchain.tools.sql_database.tool import BaseSQLDatabaseTool return BaseSQLDatabaseTool def _import_sql_database_tool_InfoSQLDatabaseTool() -> Any: from langchain.tools.sql_database.tool import InfoSQLDatabaseTool return InfoSQLDatabaseTool def _import_sql_database_tool_ListSQLDatabaseTool() -> Any: from langchain.tools.sql_database.tool import ListSQLDatabaseTool return ListSQLDatabaseTool def _import_sql_database_tool_QuerySQLCheckerTool() -> Any: from langchain.tools.sql_database.tool import QuerySQLCheckerTool return QuerySQLCheckerTool def _import_sql_database_tool_QuerySQLDataBaseTool() -> Any: from langchain.tools.sql_database.tool import QuerySQLDataBaseTool return QuerySQLDataBaseTool def _import_stackexchange_tool() -> Any: from langchain.tools.stackexchange.tool import StackExchangeTool return StackExchangeTool def _import_steamship_image_generation() -> Any: from langchain.tools.steamship_image_generation import SteamshipImageGenerationTool return SteamshipImageGenerationTool def _import_vectorstore_tool_VectorStoreQATool() -> Any: from langchain.tools.vectorstore.tool import VectorStoreQATool return VectorStoreQATool def _import_vectorstore_tool_VectorStoreQAWithSourcesTool() -> Any: from langchain.tools.vectorstore.tool import VectorStoreQAWithSourcesTool return VectorStoreQAWithSourcesTool def _import_wikipedia_tool() -> Any:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/tools/__init__.py
from langchain.tools.wikipedia.tool import WikipediaQueryRun return WikipediaQueryRun def _import_wolfram_alpha_tool() -> Any: from langchain.tools.wolfram_alpha.tool import WolframAlphaQueryRun return WolframAlphaQueryRun def _import_yahoo_finance_news() -> Any: from langchain.tools.yahoo_finance_news import YahooFinanceNewsTool return YahooFinanceNewsTool def _import_youtube_search() -> Any: from langchain.tools.youtube.search import YouTubeSearchTool return YouTubeSearchTool def _import_zapier_tool_ZapierNLAListActions() -> Any: from langchain.tools.zapier.tool import ZapierNLAListActions return ZapierNLAListActions def _import_zapier_tool_ZapierNLARunAction() -> Any: from langchain.tools.zapier.tool import ZapierNLARunAction return ZapierNLARunAction def _import_bearly_tool() -> Any: from langchain.tools.bearly.tool import BearlyInterpreterTool return BearlyInterpreterTool def _import_e2b_data_analysis() -> Any: from langchain.tools.e2b_data_analysis.tool import E2BDataAnalysisTool return E2BDataAnalysisTool def __getattr__(name: str) -> Any: if name == "AINAppOps": return _import_ainetwork_app() elif name == "AINOwnerOps": return _import_ainetwork_owner()
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/tools/__init__.py
elif name == "AINRuleOps": return _import_ainetwork_rule() elif name == "AINTransfer": return _import_ainetwork_transfer() elif name == "AINValueOps": return _import_ainetwork_value() elif name == "ArxivQueryRun": return _import_arxiv_tool() elif name == "AzureCogsFormRecognizerTool": return _import_azure_cognitive_services_AzureCogsFormRecognizerTool() elif name == "AzureCogsImageAnalysisTool": return _import_azure_cognitive_services_AzureCogsImageAnalysisTool() elif name == "AzureCogsSpeech2TextTool": return _import_azure_cognitive_services_AzureCogsSpeech2TextTool() elif name == "AzureCogsText2SpeechTool": return _import_azure_cognitive_services_AzureCogsText2SpeechTool() elif name == "AzureCogsTextAnalyticsHealthTool": return _import_azure_cognitive_services_AzureCogsTextAnalyticsHealthTool() elif name == "BingSearchResults": return _import_bing_search_tool_BingSearchResults() elif name == "BingSearchRun": return _import_bing_search_tool_BingSearchRun() elif name == "BraveSearch": return _import_brave_search_tool() elif name == "DuckDuckGoSearchResults": return _import_ddg_search_tool_DuckDuckGoSearchResults() elif name == "DuckDuckGoSearchRun": return _import_ddg_search_tool_DuckDuckGoSearchRun() elif name == "EdenAiExplicitImageTool": return _import_edenai_EdenAiExplicitImageTool()
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/tools/__init__.py
elif name == "EdenAiObjectDetectionTool": return _import_edenai_EdenAiObjectDetectionTool() elif name == "EdenAiParsingIDTool": return _import_edenai_EdenAiParsingIDTool() elif name == "EdenAiParsingInvoiceTool": return _import_edenai_EdenAiParsingInvoiceTool() elif name == "EdenAiSpeechToTextTool": return _import_edenai_EdenAiSpeechToTextTool() elif name == "EdenAiTextModerationTool": return _import_edenai_EdenAiTextModerationTool() elif name == "EdenAiTextToSpeechTool": return _import_edenai_EdenAiTextToSpeechTool() elif name == "EdenaiTool": return _import_edenai_EdenaiTool() elif name == "ElevenLabsText2SpeechTool": return _import_eleven_labs_text2speech() elif name == "CopyFileTool": return _import_file_management_CopyFileTool() elif name == "DeleteFileTool": return _import_file_management_DeleteFileTool() elif name == "FileSearchTool": return _import_file_management_FileSearchTool() elif name == "ListDirectoryTool": return _import_file_management_ListDirectoryTool() elif name == "MoveFileTool": return _import_file_management_MoveFileTool() elif name == "ReadFileTool": return _import_file_management_ReadFileTool() elif name == "WriteFileTool": return _import_file_management_WriteFileTool()
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/tools/__init__.py
elif name == "GmailCreateDraft": return _import_gmail_GmailCreateDraft() elif name == "GmailGetMessage": return _import_gmail_GmailGetMessage() elif name == "GmailGetThread": return _import_gmail_GmailGetThread() elif name == "GmailSearch": return _import_gmail_GmailSearch() elif name == "GmailSendMessage": return _import_gmail_GmailSendMessage() elif name == "GoogleCloudTextToSpeechTool": return _import_google_cloud_texttospeech() elif name == "GooglePlacesTool": return _import_google_places_tool() elif name == "GoogleSearchResults": return _import_google_search_tool_GoogleSearchResults() elif name == "GoogleSearchRun": return _import_google_search_tool_GoogleSearchRun() elif name == "GoogleSerperResults": return _import_google_serper_tool_GoogleSerperResults() elif name == "GoogleSerperRun": return _import_google_serper_tool_GoogleSerperRun() elif name == "BaseGraphQLTool": return _import_graphql_tool() elif name == "HumanInputRun": return _import_human_tool() elif name == "IFTTTWebhook": return _import_ifttt() elif name == "StdInInquireTool": return _import_interaction_tool()
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/tools/__init__.py
elif name == "JiraAction": return _import_jira_tool() elif name == "JsonGetValueTool": return _import_json_tool_JsonGetValueTool() elif name == "JsonListKeysTool": return _import_json_tool_JsonListKeysTool() elif name == "MetaphorSearchResults": return _import_metaphor_search() elif name == "O365CreateDraftMessage": return _import_office365_create_draft_message() elif name == "O365SearchEvents": return _import_office365_events_search() elif name == "O365SearchEmails": return _import_office365_messages_search() elif name == "O365SendEvent": return _import_office365_send_event() elif name == "O365SendMessage": return _import_office365_send_message() elif name == "authenticate": return _import_office365_utils() elif name == "APIOperation": return _import_openapi_utils_api_models() elif name == "OpenAPISpec": return _import_openapi_utils_openapi_utils() elif name == "OpenWeatherMapQueryRun": return _import_openweathermap_tool() elif name == "ClickTool": return _import_playwright_ClickTool() elif name == "CurrentWebPageTool": return _import_playwright_CurrentWebPageTool()
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/tools/__init__.py
elif name == "ExtractHyperlinksTool": return _import_playwright_ExtractHyperlinksTool() elif name == "ExtractTextTool": return _import_playwright_ExtractTextTool() elif name == "GetElementsTool": return _import_playwright_GetElementsTool() elif name == "NavigateBackTool": return _import_playwright_NavigateBackTool() elif name == "NavigateTool": return _import_playwright_NavigateTool() elif name == "AIPluginTool": return _import_plugin() elif name == "InfoPowerBITool": return _import_powerbi_tool_InfoPowerBITool() elif name == "ListPowerBITool": return _import_powerbi_tool_ListPowerBITool() elif name == "QueryPowerBITool": return _import_powerbi_tool_QueryPowerBITool() elif name == "PubmedQueryRun": return _import_pubmed_tool() elif name == "PythonAstREPLTool": return _import_python_tool_PythonAstREPLTool() elif name == "PythonREPLTool": return _import_python_tool_PythonREPLTool() elif name == "RedditSearchRun": return _import_reddit_search_RedditSearchRun() elif name == "format_tool_to_openai_function": return _import_render() elif name == "BaseRequestsTool": return _import_requests_tool_BaseRequestsTool()
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/tools/__init__.py
elif name == "RequestsDeleteTool": return _import_requests_tool_RequestsDeleteTool() elif name == "RequestsGetTool": return _import_requests_tool_RequestsGetTool() elif name == "RequestsPatchTool": return _import_requests_tool_RequestsPatchTool() elif name == "RequestsPostTool": return _import_requests_tool_RequestsPostTool() elif name == "RequestsPutTool": return _import_requests_tool_RequestsPutTool() elif name == "SceneXplainTool": return _import_scenexplain_tool() elif name == "SearxSearchResults": return _import_searx_search_tool_SearxSearchResults() elif name == "SearxSearchRun": return _import_searx_search_tool_SearxSearchRun() elif name == "ShellTool": return _import_shell_tool() elif name == "SleepTool": return _import_sleep_tool() elif name == "BaseSparkSQLTool": return _import_spark_sql_tool_BaseSparkSQLTool() elif name == "InfoSparkSQLTool": return _import_spark_sql_tool_InfoSparkSQLTool() elif name == "ListSparkSQLTool": return _import_spark_sql_tool_ListSparkSQLTool() elif name == "QueryCheckerTool": return _import_spark_sql_tool_QueryCheckerTool() elif name == "QuerySparkSQLTool": return _import_spark_sql_tool_QuerySparkSQLTool()
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/tools/__init__.py
elif name == "BaseSQLDatabaseTool": return _import_sql_database_tool_BaseSQLDatabaseTool() elif name == "InfoSQLDatabaseTool": return _import_sql_database_tool_InfoSQLDatabaseTool() elif name == "ListSQLDatabaseTool": return _import_sql_database_tool_ListSQLDatabaseTool() elif name == "QuerySQLCheckerTool": return _import_sql_database_tool_QuerySQLCheckerTool() elif name == "QuerySQLDataBaseTool": return _import_sql_database_tool_QuerySQLDataBaseTool() elif name == "StackExchangeTool": return _import_stackexchange_tool() elif name == "SteamshipImageGenerationTool": return _import_steamship_image_generation() elif name == "VectorStoreQATool": return _import_vectorstore_tool_VectorStoreQATool() elif name == "VectorStoreQAWithSourcesTool": return _import_vectorstore_tool_VectorStoreQAWithSourcesTool() elif name == "WikipediaQueryRun": return _import_wikipedia_tool() elif name == "WolframAlphaQueryRun": return _import_wolfram_alpha_tool() elif name == "YahooFinanceNewsTool": return _import_yahoo_finance_news() elif name == "YouTubeSearchTool": return _import_youtube_search() elif name == "ZapierNLAListActions": return _import_zapier_tool_ZapierNLAListActions() elif name == "ZapierNLARunAction": return _import_zapier_tool_ZapierNLARunAction()
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/tools/__init__.py
elif name == "BearlyInterpreterTool": return _import_bearly_tool() elif name == "E2BDataAnalysisTool": return _import_e2b_data_analysis() else: raise AttributeError(f"Could not find: {name}") __all__ = [ "AINAppOps", "AINOwnerOps", "AINRuleOps", "AINTransfer", "AINValueOps", "AIPluginTool", "APIOperation", "ArxivQueryRun", "AzureCogsFormRecognizerTool", "AzureCogsImageAnalysisTool", "AzureCogsSpeech2TextTool", "AzureCogsText2SpeechTool", "AzureCogsTextAnalyticsHealthTool", "BaseGraphQLTool", "BaseRequestsTool", "BaseSQLDatabaseTool", "BaseSparkSQLTool", "BaseTool", "BearlyInterpreterTool", "BingSearchResults", "BingSearchRun", "BraveSearch", "ClickTool",
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/tools/__init__.py
"CopyFileTool", "CurrentWebPageTool", "DeleteFileTool", "DuckDuckGoSearchResults", "DuckDuckGoSearchRun", "E2BDataAnalysisTool", "EdenAiExplicitImageTool", "EdenAiObjectDetectionTool", "EdenAiParsingIDTool", "EdenAiParsingInvoiceTool", "EdenAiSpeechToTextTool", "EdenAiTextModerationTool", "EdenAiTextToSpeechTool", "EdenaiTool", "ElevenLabsText2SpeechTool", "ExtractHyperlinksTool", "ExtractTextTool", "FileSearchTool", "GetElementsTool", "GmailCreateDraft", "GmailGetMessage", "GmailGetThread", "GmailSearch", "GmailSendMessage", "GoogleCloudTextToSpeechTool", "GooglePlacesTool", "GoogleSearchResults", "GoogleSearchRun", "GoogleSerperResults", "GoogleSerperRun",
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/tools/__init__.py
"HumanInputRun", "IFTTTWebhook", "InfoPowerBITool", "InfoSQLDatabaseTool", "InfoSparkSQLTool", "JiraAction", "JsonGetValueTool", "JsonListKeysTool", "ListDirectoryTool", "ListPowerBITool", "ListSQLDatabaseTool", "ListSparkSQLTool", "MetaphorSearchResults", "MoveFileTool", "NavigateBackTool", "NavigateTool", "O365CreateDraftMessage", "O365SearchEmails", "O365SearchEvents", "O365SendEvent", "O365SendMessage", "OpenAPISpec", "OpenWeatherMapQueryRun", "PubmedQueryRun", "RedditSearchRun", "QueryCheckerTool", "QueryPowerBITool", "QuerySQLCheckerTool", "QuerySQLDataBaseTool", "QuerySparkSQLTool",
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/tools/__init__.py
"ReadFileTool", "RequestsDeleteTool", "RequestsGetTool", "RequestsPatchTool", "RequestsPostTool", "RequestsPutTool", "SceneXplainTool", "SearxSearchResults", "SearxSearchRun", "ShellTool", "SleepTool", "StdInInquireTool", "StackExchangeTool", "SteamshipImageGenerationTool", "StructuredTool", "Tool", "VectorStoreQATool", "VectorStoreQAWithSourcesTool", "WikipediaQueryRun", "WolframAlphaQueryRun", "WriteFileTool", "YahooFinanceNewsTool", "YouTubeSearchTool", "ZapierNLAListActions", "ZapierNLARunAction", "authenticate", "format_tool_to_openai_function", "tool", ]
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/utilities/__init__.py
"""**Utilities** are the integrations with third-part systems and packages. Other LangChain classes use **Utilities** to interact with third-part systems and packages. """ from typing import Any from langchain.utilities.requests import Requests, RequestsWrapper, TextRequestsWrapper def _import_alpha_vantage() -> Any: from langchain.utilities.alpha_vantage import AlphaVantageAPIWrapper return AlphaVantageAPIWrapper def _import_apify() -> Any: from langchain.utilities.apify import ApifyWrapper return ApifyWrapper def _import_arcee() -> Any: from langchain.utilities.arcee import ArceeWrapper return ArceeWrapper def _import_arxiv() -> Any:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/utilities/__init__.py
    from langchain.utilities.arxiv import ArxivAPIWrapper

    return ArxivAPIWrapper


def _import_awslambda() -> Any:
    from langchain.utilities.awslambda import LambdaWrapper

    return LambdaWrapper


def _import_bibtex() -> Any:
    from langchain.utilities.bibtex import BibtexparserWrapper

    return BibtexparserWrapper


def _import_bing_search() -> Any:
    from langchain.utilities.bing_search import BingSearchAPIWrapper

    return BingSearchAPIWrapper


def _import_brave_search() -> Any:
    from langchain.utilities.brave_search import BraveSearchWrapper

    return BraveSearchWrapper


def _import_duckduckgo_search() -> Any:
    from langchain.utilities.duckduckgo_search import DuckDuckGoSearchAPIWrapper

    return DuckDuckGoSearchAPIWrapper


def _import_golden_query() -> Any:
    from langchain.utilities.golden_query import GoldenQueryAPIWrapper

    return GoldenQueryAPIWrapper


def _import_google_lens() -> Any:
    from langchain.utilities.google_lens import GoogleLensAPIWrapper

    return GoogleLensAPIWrapper


def _import_google_places_api() -> Any:
    from langchain.utilities.google_places_api import GooglePlacesAPIWrapper

    return GooglePlacesAPIWrapper


def _import_google_jobs() -> Any:
    from langchain.utilities.google_jobs import GoogleJobsAPIWrapper

    return GoogleJobsAPIWrapper


def _import_google_scholar() -> Any:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/utilities/__init__.py
    from langchain.utilities.google_scholar import GoogleScholarAPIWrapper

    return GoogleScholarAPIWrapper


def _import_google_trends() -> Any:
    from langchain.utilities.google_trends import GoogleTrendsAPIWrapper

    return GoogleTrendsAPIWrapper


def _import_google_finance() -> Any:
    from langchain.utilities.google_finance import GoogleFinanceAPIWrapper

    return GoogleFinanceAPIWrapper


def _import_google_search() -> Any:
    from langchain.utilities.google_search import GoogleSearchAPIWrapper

    return GoogleSearchAPIWrapper


def _import_google_serper() -> Any:
    from langchain.utilities.google_serper import GoogleSerperAPIWrapper

    return GoogleSerperAPIWrapper


def _import_graphql() -> Any:
    from langchain.utilities.graphql import GraphQLAPIWrapper

    return GraphQLAPIWrapper


def _import_jira() -> Any:
    from langchain.utilities.jira import JiraAPIWrapper

    return JiraAPIWrapper


def _import_max_compute() -> Any:
    from langchain.utilities.max_compute import MaxComputeAPIWrapper

    return MaxComputeAPIWrapper


def _import_metaphor_search() -> Any:
    from langchain.utilities.metaphor_search import MetaphorSearchAPIWrapper

    return MetaphorSearchAPIWrapper


def _import_openweathermap() -> Any:
    from langchain.utilities.openweathermap import OpenWeatherMapAPIWrapper

    return OpenWeatherMapAPIWrapper


def _import_outline() -> Any:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/utilities/__init__.py
    from langchain.utilities.outline import OutlineAPIWrapper

    return OutlineAPIWrapper


def _import_portkey() -> Any:
    from langchain.utilities.portkey import Portkey

    return Portkey


def _import_powerbi() -> Any:
    from langchain.utilities.powerbi import PowerBIDataset

    return PowerBIDataset


def _import_pubmed() -> Any:
    from langchain.utilities.pubmed import PubMedAPIWrapper

    return PubMedAPIWrapper


def _import_python() -> Any:
    from langchain.utilities.python import PythonREPL

    return PythonREPL


def _import_scenexplain() -> Any:
    from langchain.utilities.scenexplain import SceneXplainAPIWrapper

    return SceneXplainAPIWrapper


def _import_searchapi() -> Any:
    from langchain.utilities.searchapi import SearchApiAPIWrapper

    return SearchApiAPIWrapper


def _import_searx_search() -> Any:
    from langchain.utilities.searx_search import SearxSearchWrapper

    return SearxSearchWrapper


def _import_serpapi() -> Any:
    from langchain.utilities.serpapi import SerpAPIWrapper

    return SerpAPIWrapper


def _import_spark_sql() -> Any:
    from langchain.utilities.spark_sql import SparkSQL

    return SparkSQL


def _import_sql_database() -> Any:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/utilities/__init__.py
    from langchain.utilities.sql_database import SQLDatabase

    return SQLDatabase


def _import_stackexchange() -> Any:
    from langchain.utilities.stackexchange import StackExchangeAPIWrapper

    return StackExchangeAPIWrapper


def _import_tensorflow_datasets() -> Any:
    from langchain.utilities.tensorflow_datasets import TensorflowDatasets

    return TensorflowDatasets


def _import_twilio() -> Any:
    from langchain.utilities.twilio import TwilioAPIWrapper

    return TwilioAPIWrapper


def _import_wikipedia() -> Any:
    from langchain.utilities.wikipedia import WikipediaAPIWrapper

    return WikipediaAPIWrapper


def _import_wolfram_alpha() -> Any:
    from langchain.utilities.wolfram_alpha import WolframAlphaAPIWrapper

    return WolframAlphaAPIWrapper


def _import_zapier() -> Any:
    from langchain.utilities.zapier import ZapierNLAWrapper

    return ZapierNLAWrapper


def __getattr__(name: str) -> Any:
    if name == "AlphaVantageAPIWrapper":
        return _import_alpha_vantage()
    elif name == "ApifyWrapper":
        return _import_apify()
    elif name == "ArceeWrapper":
        return _import_arcee()
    elif name == "ArxivAPIWrapper":
        return _import_arxiv()
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/utilities/__init__.py
elif name == "LambdaWrapper": return _import_awslambda() elif name == "BibtexparserWrapper": return _import_bibtex() elif name == "BingSearchAPIWrapper": return _import_bing_search() elif name == "BraveSearchWrapper": return _import_brave_search() elif name == "DuckDuckGoSearchAPIWrapper": return _import_duckduckgo_search() elif name == "GoogleLensAPIWrapper": return _import_google_lens() elif name == "GoldenQueryAPIWrapper": return _import_golden_query() elif name == "GoogleJobsAPIWrapper": return _import_google_jobs() elif name == "GoogleScholarAPIWrapper": return _import_google_scholar() elif name == "GoogleFinanceAPIWrapper": return _import_google_finance() elif name == "GoogleTrendsAPIWrapper": return _import_google_trends() elif name == "GooglePlacesAPIWrapper": return _import_google_places_api() elif name == "GoogleSearchAPIWrapper": return _import_google_search() elif name == "GoogleSerperAPIWrapper": return _import_google_serper() elif name == "GraphQLAPIWrapper": return _import_graphql()
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/utilities/__init__.py
elif name == "JiraAPIWrapper": return _import_jira() elif name == "MaxComputeAPIWrapper": return _import_max_compute() elif name == "MetaphorSearchAPIWrapper": return _import_metaphor_search() elif name == "OpenWeatherMapAPIWrapper": return _import_openweathermap() elif name == "OutlineAPIWrapper": return _import_outline() elif name == "Portkey": return _import_portkey() elif name == "PowerBIDataset": return _import_powerbi() elif name == "PubMedAPIWrapper": return _import_pubmed() elif name == "PythonREPL": return _import_python() elif name == "SceneXplainAPIWrapper": return _import_scenexplain() elif name == "SearchApiAPIWrapper": return _import_searchapi() elif name == "SearxSearchWrapper": return _import_searx_search() elif name == "SerpAPIWrapper": return _import_serpapi() elif name == "SparkSQL": return _import_spark_sql() elif name == "StackExchangeAPIWrapper": return _import_stackexchange()
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/utilities/__init__.py
elif name == "SQLDatabase": return _import_sql_database() elif name == "TensorflowDatasets": return _import_tensorflow_datasets() elif name == "TwilioAPIWrapper": return _import_twilio() elif name == "WikipediaAPIWrapper": return _import_wikipedia() elif name == "WolframAlphaAPIWrapper": return _import_wolfram_alpha() elif name == "ZapierNLAWrapper": return _import_zapier() else: raise AttributeError(f"Could not find: {name}") __all__ = [ "AlphaVantageAPIWrapper", "ApifyWrapper", "ArceeWrapper", "ArxivAPIWrapper", "BibtexparserWrapper", "BingSearchAPIWrapper", "BraveSearchWrapper", "DuckDuckGoSearchAPIWrapper", "GoldenQueryAPIWrapper", "GoogleFinanceAPIWrapper", "GoogleLensAPIWrapper", "GoogleJobsAPIWrapper", "GooglePlacesAPIWrapper", "GoogleScholarAPIWrapper", "GoogleTrendsAPIWrapper",
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/langchain/utilities/__init__.py
"GoogleSearchAPIWrapper", "GoogleSerperAPIWrapper", "GraphQLAPIWrapper", "JiraAPIWrapper", "LambdaWrapper", "MaxComputeAPIWrapper", "MetaphorSearchAPIWrapper", "OpenWeatherMapAPIWrapper", "OutlineAPIWrapper", "Portkey", "PowerBIDataset", "PubMedAPIWrapper", "PythonREPL", "Requests", "RequestsWrapper", "SQLDatabase", "SceneXplainAPIWrapper", "SearchApiAPIWrapper", "SearxSearchWrapper", "SerpAPIWrapper", "SparkSQL", "StackExchangeAPIWrapper", "TensorflowDatasets", "TextRequestsWrapper", "TwilioAPIWrapper", "WikipediaAPIWrapper", "WolframAlphaAPIWrapper", "ZapierNLAWrapper", ]
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/tests/unit_tests/tools/test_imports.py
from langchain.tools import __all__

EXPECTED_ALL = [
    "AINAppOps",
    "AINOwnerOps",
    "AINRuleOps",
    "AINTransfer",
    "AINValueOps",
    "AIPluginTool",
    "APIOperation",
    "ArxivQueryRun",
    "AzureCogsFormRecognizerTool",
    "AzureCogsImageAnalysisTool",
    "AzureCogsSpeech2TextTool",
    "AzureCogsText2SpeechTool",
    "AzureCogsTextAnalyticsHealthTool",
    "BaseGraphQLTool",
    "BaseRequestsTool",
    "BaseSQLDatabaseTool",
    "BaseSparkSQLTool",
    "BaseTool",
    "BearlyInterpreterTool",
    "BingSearchResults",
    "BingSearchRun",
    "BraveSearch",
    "ClickTool",
    "CopyFileTool",
    "CurrentWebPageTool",
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/tests/unit_tests/tools/test_imports.py
"DeleteFileTool", "DuckDuckGoSearchResults", "DuckDuckGoSearchRun", "E2BDataAnalysisTool", "EdenAiExplicitImageTool", "EdenAiObjectDetectionTool", "EdenAiParsingIDTool", "EdenAiParsingInvoiceTool", "EdenAiSpeechToTextTool", "EdenAiTextModerationTool", "EdenAiTextToSpeechTool", "EdenaiTool", "ElevenLabsText2SpeechTool", "ExtractHyperlinksTool", "ExtractTextTool", "FileSearchTool", "GetElementsTool", "GmailCreateDraft", "GmailGetMessage", "GmailGetThread", "GmailSearch", "GmailSendMessage", "GoogleCloudTextToSpeechTool", "GooglePlacesTool", "GoogleSearchResults", "GoogleSearchRun", "GoogleSerperResults", "GoogleSerperRun", "HumanInputRun", "IFTTTWebhook",
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/tests/unit_tests/tools/test_imports.py
"InfoPowerBITool", "InfoSQLDatabaseTool", "InfoSparkSQLTool", "JiraAction", "JsonGetValueTool", "JsonListKeysTool", "ListDirectoryTool", "ListPowerBITool", "ListSQLDatabaseTool", "ListSparkSQLTool", "MetaphorSearchResults", "MoveFileTool", "NavigateBackTool", "NavigateTool", "O365CreateDraftMessage", "O365SearchEmails", "O365SearchEvents", "O365SendEvent", "O365SendMessage", "OpenAPISpec", "OpenWeatherMapQueryRun", "PubmedQueryRun", "RedditSearchRun", "QueryCheckerTool", "QueryPowerBITool", "QuerySQLCheckerTool", "QuerySQLDataBaseTool", "QuerySparkSQLTool", "ReadFileTool", "RequestsDeleteTool",
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/tests/unit_tests/tools/test_imports.py
"RequestsGetTool", "RequestsPatchTool", "RequestsPostTool", "RequestsPutTool", "SceneXplainTool", "SearxSearchResults", "SearxSearchRun", "ShellTool", "SleepTool", "StackExchangeTool", "StdInInquireTool", "SteamshipImageGenerationTool", "StructuredTool", "Tool", "VectorStoreQATool", "VectorStoreQAWithSourcesTool", "WikipediaQueryRun", "WolframAlphaQueryRun", "WriteFileTool", "YahooFinanceNewsTool", "YouTubeSearchTool", "ZapierNLAListActions", "ZapierNLARunAction", "authenticate", "format_tool_to_openai_function", "tool", ] def test_all_imports() -> None: assert set(__all__) == set(EXPECTED_ALL)
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/tests/unit_tests/tools/test_public_api.py
"""Test the public API of the tools package.""" from langchain.tools import __all__ as public_api _EXPECTED = [ "AINAppOps", "AINOwnerOps", "AINRuleOps", "AINTransfer", "AINValueOps", "AIPluginTool", "APIOperation", "ArxivQueryRun", "AzureCogsFormRecognizerTool", "AzureCogsImageAnalysisTool", "AzureCogsSpeech2TextTool", "AzureCogsText2SpeechTool", "AzureCogsTextAnalyticsHealthTool", "BaseGraphQLTool", "BaseRequestsTool", "BaseSQLDatabaseTool", "BaseSparkSQLTool", "BaseTool", "BearlyInterpreterTool", "BingSearchResults", "BingSearchRun", "BraveSearch", "ClickTool", "CopyFileTool", "CurrentWebPageTool", "DeleteFileTool", "DuckDuckGoSearchResults",
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/tests/unit_tests/tools/test_public_api.py
"DuckDuckGoSearchRun", "E2BDataAnalysisTool", "EdenAiExplicitImageTool", "EdenAiObjectDetectionTool", "EdenAiParsingIDTool", "EdenAiParsingInvoiceTool", "EdenAiSpeechToTextTool", "EdenAiTextModerationTool", "EdenAiTextToSpeechTool", "EdenaiTool", "ElevenLabsText2SpeechTool", "ExtractHyperlinksTool", "ExtractTextTool", "FileSearchTool", "GetElementsTool", "GmailCreateDraft", "GmailGetMessage", "GmailGetThread", "GmailSearch", "GmailSendMessage", "GoogleCloudTextToSpeechTool", "GooglePlacesTool", "GoogleSearchResults", "GoogleSearchRun", "GoogleSerperResults", "GoogleSerperRun", "HumanInputRun", "IFTTTWebhook", "InfoPowerBITool", "InfoSQLDatabaseTool",
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/tests/unit_tests/tools/test_public_api.py
"InfoSparkSQLTool", "JiraAction", "JsonGetValueTool", "JsonListKeysTool", "ListDirectoryTool", "ListPowerBITool", "ListSQLDatabaseTool", "ListSparkSQLTool", "MetaphorSearchResults", "MoveFileTool", "NavigateBackTool", "NavigateTool", "O365CreateDraftMessage", "O365SearchEmails", "O365SearchEvents", "O365SendEvent", "O365SendMessage", "OpenAPISpec", "OpenWeatherMapQueryRun", "PubmedQueryRun", "RedditSearchRun", "QueryCheckerTool", "QueryPowerBITool", "QuerySQLCheckerTool", "QuerySQLDataBaseTool", "QuerySparkSQLTool", "ReadFileTool", "RequestsDeleteTool", "RequestsGetTool", "RequestsPatchTool",
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/tests/unit_tests/tools/test_public_api.py
"RequestsPostTool", "RequestsPutTool", "SceneXplainTool", "SearxSearchResults", "SearxSearchRun", "ShellTool", "SleepTool", "StdInInquireTool", "StackExchangeTool", "SteamshipImageGenerationTool", "StructuredTool", "Tool", "VectorStoreQATool", "VectorStoreQAWithSourcesTool", "WikipediaQueryRun", "WolframAlphaQueryRun", "WriteFileTool", "YahooFinanceNewsTool", "YouTubeSearchTool", "ZapierNLAListActions", "ZapierNLARunAction", "authenticate", "format_tool_to_openai_function", "tool", ] def test_public_api() -> None: """Test for regressions or changes in the public API.""" assert set(public_api) == set(_EXPECTED)
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/tests/unit_tests/utilities/test_imports.py
from langchain.utilities import __all__

EXPECTED_ALL = [
    "AlphaVantageAPIWrapper",
    "ApifyWrapper",
    "ArceeWrapper",
    "ArxivAPIWrapper",
    "BibtexparserWrapper",
    "BingSearchAPIWrapper",
    "BraveSearchWrapper",
    "DuckDuckGoSearchAPIWrapper",
    "GoldenQueryAPIWrapper",
    "GoogleFinanceAPIWrapper",
    "GoogleJobsAPIWrapper",
    "GoogleLensAPIWrapper",
    "GooglePlacesAPIWrapper",
    "GoogleScholarAPIWrapper",
    "GoogleSearchAPIWrapper",
    "GoogleSerperAPIWrapper",
    "GoogleTrendsAPIWrapper",
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
12,039
Tools for Dictionary APIs
### Feature request It would be nice to have agents that could access dictionary APIs such as the Merriam-Webster API or Urban Dictionary API (for slang). ### Motivation It can be useful to be able to look up definitions for words using a dictionary to provide additional context. With no current dictionary tools available, it would be beneficial for there to be an implemented dictionary tool available at all. ### Your contribution We will open a PR that adds a new tool for accessing the Merriam-Webster Collegiate Dictionary API (https://dictionaryapi.com/products/api-collegiate-dictionary[/](https://www.dictionaryapi.com/)), which provides definitions for English words, as soon as possible. In the future this could be extended to support other Merriam-Webster APIs such as their Medical Dictionary API (https://dictionaryapi.com/products/api-medical-dictionary) or Spanish-English Dictionary API (https://dictionaryapi.com/products/api-spanish-dictionary). We may also open another PR for Urban Dictionary API integration.
https://github.com/langchain-ai/langchain/issues/12039
https://github.com/langchain-ai/langchain/pull/12044
f3dd4a10cffd507a1300abf0f7729e95072f44eb
c2e3963da4b7c6650fc37acfa8ea39a355e7dae9
"2023-10-19T18:31:45Z"
python
"2023-11-30T01:28:29Z"
libs/langchain/tests/unit_tests/utilities/test_imports.py
"GraphQLAPIWrapper", "JiraAPIWrapper", "LambdaWrapper", "MaxComputeAPIWrapper", "MetaphorSearchAPIWrapper", "OpenWeatherMapAPIWrapper", "OutlineAPIWrapper", "Portkey", "PowerBIDataset", "PubMedAPIWrapper", "PythonREPL", "Requests", "RequestsWrapper", "SQLDatabase", "SceneXplainAPIWrapper", "SearchApiAPIWrapper", "SearxSearchWrapper", "SerpAPIWrapper", "SparkSQL", "StackExchangeAPIWrapper", "TensorflowDatasets", "TextRequestsWrapper", "TwilioAPIWrapper", "WikipediaAPIWrapper", "WolframAlphaAPIWrapper", "ZapierNLAWrapper", ] def test_all_imports() -> None: assert set(__all__) == set(EXPECTED_ALL)
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,127
Volc Engine MaaS has wrong entry in LLM type to class dict (causing SpaCy to not work with LangChain anymore)
### System Info * Windows 11 Home (build 22621.2715) * Python 3.12.0 * Clean virtual environment using Poetry with following dependencies: ``` python = "3.12.0" langchain = "0.0.344" spacy = "3.7.2" spacy-llm = "0.6.4" ``` ### Who can help? @h3l As the creator of the pull request where VolcEngine was introduced @baskaryan As tag handler of that pull request ### Information - [ ] The official example notebooks/scripts - [X] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction Anything that triggers spaCy's registry to make an inventory, for example: ```python import spacy spacy.blank("en") ``` With the last part of the Traceback being: ``` File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain\llms\__init__.py", line 699, in __getattr__ k: v() for k, v in get_type_to_cls_dict().items() ^^^ File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain_core\load\serializable.py", line 97, in __init__ super().__init__(**kwargs) File "PROJECT_FOLDER\.venv\Lib\site-packages\pydantic\v1\main.py", line 341, in __init__ raise validation_error pydantic.v1.error_wrappers.ValidationError: 1 validation error for VolcEngineMaasLLM __root__ Did not find volc_engine_maas_ak, please add an environment variable `VOLC_ACCESSKEY` which contains it, or pass `volc_engine_maas_ak` as a named parameter. (type=value_error) ``` #### What I think causes this I am quite certain that this is caused by [`langchain.llms.__init__.py:869 (for commit b161f30)`](https://github.com/langchain-ai/langchain/blob/b161f302ff56a14d8d0331cbec4a3efa23d06e1a/libs/langchain/langchain/llms/__init__.py#L869C51-L869C51): ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # Line below is the only that actually calls the import function, returning a class instead of an import function "VolcEngineMaasLLM": _import_volcengine_maas(), } ``` The Volc Engine Maas LLM is the only in this dict to actually call the import function, while all other entries only the function itself, and do not call it. ### Expected behavior Class to type dict only returns import functions, not actual classes: ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # What I think would be correct (now without function call) "VolcEngineMaasLLM": _import_volcengine_maas, } ``` Unfortunately I don't have time to put in a PR myself, but I hope this helps finding the solution!
https://github.com/langchain-ai/langchain/issues/14127
https://github.com/langchain-ai/langchain/pull/14194
6ae0194dc70119d8b05a0624a6cc4950f9f84608
818252b1f8b9ac9af6bb80d43b21c5e95d6b2e11
"2023-12-01T13:58:13Z"
python
"2023-12-03T16:43:23Z"
libs/langchain/langchain/llms/__init__.py
""" **LLM** classes provide access to the large language model (**LLM**) APIs and services. **Class hierarchy:** .. code-block:: BaseLanguageModel --> BaseLLM --> LLM --> <name> # Examples: AI21, HuggingFaceHub, OpenAI **Main helpers:** .. code-block:: LLMResult, PromptValue, CallbackManagerForLLMRun, AsyncCallbackManagerForLLMRun, CallbackManager, AsyncCallbackManager, AIMessage, BaseMessage """ from typing import Any, Callable, Dict, Type from langchain.llms.base import BaseLLM def _import_ai21() -> Any: from langchain.llms.ai21 import AI21 return AI21 def _import_aleph_alpha() -> Any: from langchain.llms.aleph_alpha import AlephAlpha return AlephAlpha def _import_amazon_api_gateway() -> Any:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,127
Volc Engine MaaS has wrong entry in LLM type to class dict (causing SpaCy to not work with LangChain anymore)
### System Info * Windows 11 Home (build 22621.2715) * Python 3.12.0 * Clean virtual environment using Poetry with following dependencies: ``` python = "3.12.0" langchain = "0.0.344" spacy = "3.7.2" spacy-llm = "0.6.4" ``` ### Who can help? @h3l As the creator of the pull request where VolcEngine was introduced @baskaryan As tag handler of that pull request ### Information - [ ] The official example notebooks/scripts - [X] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction Anything that triggers spaCy's registry to make an inventory, for example: ```python import spacy spacy.blank("en") ``` With the last part of the Traceback being: ``` File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain\llms\__init__.py", line 699, in __getattr__ k: v() for k, v in get_type_to_cls_dict().items() ^^^ File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain_core\load\serializable.py", line 97, in __init__ super().__init__(**kwargs) File "PROJECT_FOLDER\.venv\Lib\site-packages\pydantic\v1\main.py", line 341, in __init__ raise validation_error pydantic.v1.error_wrappers.ValidationError: 1 validation error for VolcEngineMaasLLM __root__ Did not find volc_engine_maas_ak, please add an environment variable `VOLC_ACCESSKEY` which contains it, or pass `volc_engine_maas_ak` as a named parameter. (type=value_error) ``` #### What I think causes this I am quite certain that this is caused by [`langchain.llms.__init__.py:869 (for commit b161f30)`](https://github.com/langchain-ai/langchain/blob/b161f302ff56a14d8d0331cbec4a3efa23d06e1a/libs/langchain/langchain/llms/__init__.py#L869C51-L869C51): ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # Line below is the only that actually calls the import function, returning a class instead of an import function "VolcEngineMaasLLM": _import_volcengine_maas(), } ``` The Volc Engine Maas LLM is the only in this dict to actually call the import function, while all other entries only the function itself, and do not call it. ### Expected behavior Class to type dict only returns import functions, not actual classes: ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # What I think would be correct (now without function call) "VolcEngineMaasLLM": _import_volcengine_maas, } ``` Unfortunately I don't have time to put in a PR myself, but I hope this helps finding the solution!
https://github.com/langchain-ai/langchain/issues/14127
https://github.com/langchain-ai/langchain/pull/14194
6ae0194dc70119d8b05a0624a6cc4950f9f84608
818252b1f8b9ac9af6bb80d43b21c5e95d6b2e11
"2023-12-01T13:58:13Z"
python
"2023-12-03T16:43:23Z"
libs/langchain/langchain/llms/__init__.py
    from langchain.llms.amazon_api_gateway import AmazonAPIGateway
    return AmazonAPIGateway

def _import_anthropic() -> Any:
    from langchain.llms.anthropic import Anthropic
    return Anthropic

def _import_anyscale() -> Any:
    from langchain.llms.anyscale import Anyscale
    return Anyscale

def _import_arcee() -> Any:
    from langchain.llms.arcee import Arcee
    return Arcee

def _import_aviary() -> Any:
    from langchain.llms.aviary import Aviary
    return Aviary

def _import_azureml_endpoint() -> Any:
    from langchain.llms.azureml_endpoint import AzureMLOnlineEndpoint
    return AzureMLOnlineEndpoint

def _import_baidu_qianfan_endpoint() -> Any:
    from langchain.llms.baidu_qianfan_endpoint import QianfanLLMEndpoint
    return QianfanLLMEndpoint

def _import_bananadev() -> Any:
    from langchain.llms.bananadev import Banana
    return Banana

def _import_baseten() -> Any:
    from langchain.llms.baseten import Baseten
    return Baseten

def _import_beam() -> Any:
    from langchain.llms.beam import Beam
    return Beam

def _import_bedrock() -> Any:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,127
Volc Engine MaaS has wrong entry in LLM type to class dict (causing SpaCy to not work with LangChain anymore)
### System Info * Windows 11 Home (build 22621.2715) * Python 3.12.0 * Clean virtual environment using Poetry with following dependencies: ``` python = "3.12.0" langchain = "0.0.344" spacy = "3.7.2" spacy-llm = "0.6.4" ``` ### Who can help? @h3l As the creator of the pull request where VolcEngine was introduced @baskaryan As tag handler of that pull request ### Information - [ ] The official example notebooks/scripts - [X] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction Anything that triggers spaCy's registry to make an inventory, for example: ```python import spacy spacy.blank("en") ``` With the last part of the Traceback being: ``` File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain\llms\__init__.py", line 699, in __getattr__ k: v() for k, v in get_type_to_cls_dict().items() ^^^ File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain_core\load\serializable.py", line 97, in __init__ super().__init__(**kwargs) File "PROJECT_FOLDER\.venv\Lib\site-packages\pydantic\v1\main.py", line 341, in __init__ raise validation_error pydantic.v1.error_wrappers.ValidationError: 1 validation error for VolcEngineMaasLLM __root__ Did not find volc_engine_maas_ak, please add an environment variable `VOLC_ACCESSKEY` which contains it, or pass `volc_engine_maas_ak` as a named parameter. (type=value_error) ``` #### What I think causes this I am quite certain that this is caused by [`langchain.llms.__init__.py:869 (for commit b161f30)`](https://github.com/langchain-ai/langchain/blob/b161f302ff56a14d8d0331cbec4a3efa23d06e1a/libs/langchain/langchain/llms/__init__.py#L869C51-L869C51): ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # Line below is the only that actually calls the import function, returning a class instead of an import function "VolcEngineMaasLLM": _import_volcengine_maas(), } ``` The Volc Engine Maas LLM is the only in this dict to actually call the import function, while all other entries only the function itself, and do not call it. ### Expected behavior Class to type dict only returns import functions, not actual classes: ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # What I think would be correct (now without function call) "VolcEngineMaasLLM": _import_volcengine_maas, } ``` Unfortunately I don't have time to put in a PR myself, but I hope this helps finding the solution!
https://github.com/langchain-ai/langchain/issues/14127
https://github.com/langchain-ai/langchain/pull/14194
6ae0194dc70119d8b05a0624a6cc4950f9f84608
818252b1f8b9ac9af6bb80d43b21c5e95d6b2e11
"2023-12-01T13:58:13Z"
python
"2023-12-03T16:43:23Z"
libs/langchain/langchain/llms/__init__.py
    from langchain.llms.bedrock import Bedrock
    return Bedrock

def _import_bittensor() -> Any:
    from langchain.llms.bittensor import NIBittensorLLM
    return NIBittensorLLM

def _import_cerebriumai() -> Any:
    from langchain.llms.cerebriumai import CerebriumAI
    return CerebriumAI

def _import_chatglm() -> Any:
    from langchain.llms.chatglm import ChatGLM
    return ChatGLM

def _import_clarifai() -> Any:
    from langchain.llms.clarifai import Clarifai
    return Clarifai

def _import_cohere() -> Any:
    from langchain.llms.cohere import Cohere
    return Cohere

def _import_ctransformers() -> Any:
    from langchain.llms.ctransformers import CTransformers
    return CTransformers

def _import_ctranslate2() -> Any:
    from langchain.llms.ctranslate2 import CTranslate2
    return CTranslate2

def _import_databricks() -> Any:
    from langchain.llms.databricks import Databricks
    return Databricks

def _import_databricks_chat() -> Any:
    from langchain.chat_models.databricks import ChatDatabricks
    return ChatDatabricks

def _import_deepinfra() -> Any:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,127
Volc Engine MaaS has wrong entry in LLM type to class dict (causing SpaCy to not work with LangChain anymore)
### System Info * Windows 11 Home (build 22621.2715) * Python 3.12.0 * Clean virtual environment using Poetry with following dependencies: ``` python = "3.12.0" langchain = "0.0.344" spacy = "3.7.2" spacy-llm = "0.6.4" ``` ### Who can help? @h3l As the creator of the pull request where VolcEngine was introduced @baskaryan As tag handler of that pull request ### Information - [ ] The official example notebooks/scripts - [X] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction Anything that triggers spaCy's registry to make an inventory, for example: ```python import spacy spacy.blank("en") ``` With the last part of the Traceback being: ``` File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain\llms\__init__.py", line 699, in __getattr__ k: v() for k, v in get_type_to_cls_dict().items() ^^^ File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain_core\load\serializable.py", line 97, in __init__ super().__init__(**kwargs) File "PROJECT_FOLDER\.venv\Lib\site-packages\pydantic\v1\main.py", line 341, in __init__ raise validation_error pydantic.v1.error_wrappers.ValidationError: 1 validation error for VolcEngineMaasLLM __root__ Did not find volc_engine_maas_ak, please add an environment variable `VOLC_ACCESSKEY` which contains it, or pass `volc_engine_maas_ak` as a named parameter. (type=value_error) ``` #### What I think causes this I am quite certain that this is caused by [`langchain.llms.__init__.py:869 (for commit b161f30)`](https://github.com/langchain-ai/langchain/blob/b161f302ff56a14d8d0331cbec4a3efa23d06e1a/libs/langchain/langchain/llms/__init__.py#L869C51-L869C51): ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # Line below is the only that actually calls the import function, returning a class instead of an import function "VolcEngineMaasLLM": _import_volcengine_maas(), } ``` The Volc Engine Maas LLM is the only in this dict to actually call the import function, while all other entries only the function itself, and do not call it. ### Expected behavior Class to type dict only returns import functions, not actual classes: ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # What I think would be correct (now without function call) "VolcEngineMaasLLM": _import_volcengine_maas, } ``` Unfortunately I don't have time to put in a PR myself, but I hope this helps finding the solution!
https://github.com/langchain-ai/langchain/issues/14127
https://github.com/langchain-ai/langchain/pull/14194
6ae0194dc70119d8b05a0624a6cc4950f9f84608
818252b1f8b9ac9af6bb80d43b21c5e95d6b2e11
"2023-12-01T13:58:13Z"
python
"2023-12-03T16:43:23Z"
libs/langchain/langchain/llms/__init__.py
    from langchain.llms.deepinfra import DeepInfra
    return DeepInfra

def _import_deepsparse() -> Any:
    from langchain.llms.deepsparse import DeepSparse
    return DeepSparse

def _import_edenai() -> Any:
    from langchain.llms.edenai import EdenAI
    return EdenAI

def _import_fake() -> Any:
    from langchain.llms.fake import FakeListLLM
    return FakeListLLM

def _import_fireworks() -> Any:
    from langchain.llms.fireworks import Fireworks
    return Fireworks

def _import_forefrontai() -> Any:
    from langchain.llms.forefrontai import ForefrontAI
    return ForefrontAI

def _import_gigachat() -> Any:
    from langchain.llms.gigachat import GigaChat
    return GigaChat

def _import_google_palm() -> Any:
    from langchain.llms.google_palm import GooglePalm
    return GooglePalm

def _import_gooseai() -> Any:
    from langchain.llms.gooseai import GooseAI
    return GooseAI

def _import_gpt4all() -> Any:
    from langchain.llms.gpt4all import GPT4All
    return GPT4All

def _import_gradient_ai() -> Any:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,127
Volc Engine MaaS has wrong entry in LLM type to class dict (causing SpaCy to not work with LangChain anymore)
### System Info * Windows 11 Home (build 22621.2715) * Python 3.12.0 * Clean virtual environment using Poetry with following dependencies: ``` python = "3.12.0" langchain = "0.0.344" spacy = "3.7.2" spacy-llm = "0.6.4" ``` ### Who can help? @h3l As the creator of the pull request where VolcEngine was introduced @baskaryan As tag handler of that pull request ### Information - [ ] The official example notebooks/scripts - [X] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction Anything that triggers spaCy's registry to make an inventory, for example: ```python import spacy spacy.blank("en") ``` With the last part of the Traceback being: ``` File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain\llms\__init__.py", line 699, in __getattr__ k: v() for k, v in get_type_to_cls_dict().items() ^^^ File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain_core\load\serializable.py", line 97, in __init__ super().__init__(**kwargs) File "PROJECT_FOLDER\.venv\Lib\site-packages\pydantic\v1\main.py", line 341, in __init__ raise validation_error pydantic.v1.error_wrappers.ValidationError: 1 validation error for VolcEngineMaasLLM __root__ Did not find volc_engine_maas_ak, please add an environment variable `VOLC_ACCESSKEY` which contains it, or pass `volc_engine_maas_ak` as a named parameter. (type=value_error) ``` #### What I think causes this I am quite certain that this is caused by [`langchain.llms.__init__.py:869 (for commit b161f30)`](https://github.com/langchain-ai/langchain/blob/b161f302ff56a14d8d0331cbec4a3efa23d06e1a/libs/langchain/langchain/llms/__init__.py#L869C51-L869C51): ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # Line below is the only that actually calls the import function, returning a class instead of an import function "VolcEngineMaasLLM": _import_volcengine_maas(), } ``` The Volc Engine Maas LLM is the only in this dict to actually call the import function, while all other entries only the function itself, and do not call it. ### Expected behavior Class to type dict only returns import functions, not actual classes: ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # What I think would be correct (now without function call) "VolcEngineMaasLLM": _import_volcengine_maas, } ``` Unfortunately I don't have time to put in a PR myself, but I hope this helps finding the solution!
https://github.com/langchain-ai/langchain/issues/14127
https://github.com/langchain-ai/langchain/pull/14194
6ae0194dc70119d8b05a0624a6cc4950f9f84608
818252b1f8b9ac9af6bb80d43b21c5e95d6b2e11
"2023-12-01T13:58:13Z"
python
"2023-12-03T16:43:23Z"
libs/langchain/langchain/llms/__init__.py
    from langchain.llms.gradient_ai import GradientLLM
    return GradientLLM

def _import_huggingface_endpoint() -> Any:
    from langchain.llms.huggingface_endpoint import HuggingFaceEndpoint
    return HuggingFaceEndpoint

def _import_huggingface_hub() -> Any:
    from langchain.llms.huggingface_hub import HuggingFaceHub
    return HuggingFaceHub

def _import_huggingface_pipeline() -> Any:
    from langchain.llms.huggingface_pipeline import HuggingFacePipeline
    return HuggingFacePipeline

def _import_huggingface_text_gen_inference() -> Any:
    from langchain.llms.huggingface_text_gen_inference import (
        HuggingFaceTextGenInference,
    )
    return HuggingFaceTextGenInference

def _import_human() -> Any:
    from langchain.llms.human import HumanInputLLM
    return HumanInputLLM

def _import_javelin_ai_gateway() -> Any:
    from langchain.llms.javelin_ai_gateway import JavelinAIGateway
    return JavelinAIGateway

def _import_koboldai() -> Any:
    from langchain.llms.koboldai import KoboldApiLLM
    return KoboldApiLLM

def _import_llamacpp() -> Any:
    from langchain.llms.llamacpp import LlamaCpp
    return LlamaCpp

def _import_manifest() -> Any:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,127
Volc Engine MaaS has wrong entry in LLM type to class dict (causing SpaCy to not work with LangChain anymore)
### System Info * Windows 11 Home (build 22621.2715) * Python 3.12.0 * Clean virtual environment using Poetry with following dependencies: ``` python = "3.12.0" langchain = "0.0.344" spacy = "3.7.2" spacy-llm = "0.6.4" ``` ### Who can help? @h3l As the creator of the pull request where VolcEngine was introduced @baskaryan As tag handler of that pull request ### Information - [ ] The official example notebooks/scripts - [X] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction Anything that triggers spaCy's registry to make an inventory, for example: ```python import spacy spacy.blank("en") ``` With the last part of the Traceback being: ``` File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain\llms\__init__.py", line 699, in __getattr__ k: v() for k, v in get_type_to_cls_dict().items() ^^^ File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain_core\load\serializable.py", line 97, in __init__ super().__init__(**kwargs) File "PROJECT_FOLDER\.venv\Lib\site-packages\pydantic\v1\main.py", line 341, in __init__ raise validation_error pydantic.v1.error_wrappers.ValidationError: 1 validation error for VolcEngineMaasLLM __root__ Did not find volc_engine_maas_ak, please add an environment variable `VOLC_ACCESSKEY` which contains it, or pass `volc_engine_maas_ak` as a named parameter. (type=value_error) ``` #### What I think causes this I am quite certain that this is caused by [`langchain.llms.__init__.py:869 (for commit b161f30)`](https://github.com/langchain-ai/langchain/blob/b161f302ff56a14d8d0331cbec4a3efa23d06e1a/libs/langchain/langchain/llms/__init__.py#L869C51-L869C51): ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # Line below is the only that actually calls the import function, returning a class instead of an import function "VolcEngineMaasLLM": _import_volcengine_maas(), } ``` The Volc Engine Maas LLM is the only in this dict to actually call the import function, while all other entries only the function itself, and do not call it. ### Expected behavior Class to type dict only returns import functions, not actual classes: ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # What I think would be correct (now without function call) "VolcEngineMaasLLM": _import_volcengine_maas, } ``` Unfortunately I don't have time to put in a PR myself, but I hope this helps finding the solution!
https://github.com/langchain-ai/langchain/issues/14127
https://github.com/langchain-ai/langchain/pull/14194
6ae0194dc70119d8b05a0624a6cc4950f9f84608
818252b1f8b9ac9af6bb80d43b21c5e95d6b2e11
"2023-12-01T13:58:13Z"
python
"2023-12-03T16:43:23Z"
libs/langchain/langchain/llms/__init__.py
    from langchain.llms.manifest import ManifestWrapper
    return ManifestWrapper

def _import_minimax() -> Any:
    from langchain.llms.minimax import Minimax
    return Minimax

def _import_mlflow() -> Any:
    from langchain.llms.mlflow import Mlflow
    return Mlflow

def _import_mlflow_chat() -> Any:
    from langchain.chat_models.mlflow import ChatMlflow
    return ChatMlflow

def _import_mlflow_ai_gateway() -> Any:
    from langchain.llms.mlflow_ai_gateway import MlflowAIGateway
    return MlflowAIGateway

def _import_modal() -> Any:
    from langchain.llms.modal import Modal
    return Modal

def _import_mosaicml() -> Any:
    from langchain.llms.mosaicml import MosaicML
    return MosaicML

def _import_nlpcloud() -> Any:
    from langchain.llms.nlpcloud import NLPCloud
    return NLPCloud

def _import_octoai_endpoint() -> Any:
    from langchain.llms.octoai_endpoint import OctoAIEndpoint
    return OctoAIEndpoint

def _import_ollama() -> Any:
    from langchain.llms.ollama import Ollama
    return Ollama

def _import_opaqueprompts() -> Any:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,127
Volc Engine MaaS has wrong entry in LLM type to class dict (causing SpaCy to not work with LangChain anymore)
### System Info * Windows 11 Home (build 22621.2715) * Python 3.12.0 * Clean virtual environment using Poetry with following dependencies: ``` python = "3.12.0" langchain = "0.0.344" spacy = "3.7.2" spacy-llm = "0.6.4" ``` ### Who can help? @h3l As the creator of the pull request where VolcEngine was introduced @baskaryan As tag handler of that pull request ### Information - [ ] The official example notebooks/scripts - [X] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction Anything that triggers spaCy's registry to make an inventory, for example: ```python import spacy spacy.blank("en") ``` With the last part of the Traceback being: ``` File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain\llms\__init__.py", line 699, in __getattr__ k: v() for k, v in get_type_to_cls_dict().items() ^^^ File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain_core\load\serializable.py", line 97, in __init__ super().__init__(**kwargs) File "PROJECT_FOLDER\.venv\Lib\site-packages\pydantic\v1\main.py", line 341, in __init__ raise validation_error pydantic.v1.error_wrappers.ValidationError: 1 validation error for VolcEngineMaasLLM __root__ Did not find volc_engine_maas_ak, please add an environment variable `VOLC_ACCESSKEY` which contains it, or pass `volc_engine_maas_ak` as a named parameter. (type=value_error) ``` #### What I think causes this I am quite certain that this is caused by [`langchain.llms.__init__.py:869 (for commit b161f30)`](https://github.com/langchain-ai/langchain/blob/b161f302ff56a14d8d0331cbec4a3efa23d06e1a/libs/langchain/langchain/llms/__init__.py#L869C51-L869C51): ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # Line below is the only that actually calls the import function, returning a class instead of an import function "VolcEngineMaasLLM": _import_volcengine_maas(), } ``` The Volc Engine Maas LLM is the only in this dict to actually call the import function, while all other entries only the function itself, and do not call it. ### Expected behavior Class to type dict only returns import functions, not actual classes: ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # What I think would be correct (now without function call) "VolcEngineMaasLLM": _import_volcengine_maas, } ``` Unfortunately I don't have time to put in a PR myself, but I hope this helps finding the solution!
https://github.com/langchain-ai/langchain/issues/14127
https://github.com/langchain-ai/langchain/pull/14194
6ae0194dc70119d8b05a0624a6cc4950f9f84608
818252b1f8b9ac9af6bb80d43b21c5e95d6b2e11
"2023-12-01T13:58:13Z"
python
"2023-12-03T16:43:23Z"
libs/langchain/langchain/llms/__init__.py
    from langchain.llms.opaqueprompts import OpaquePrompts
    return OpaquePrompts

def _import_azure_openai() -> Any:
    from langchain.llms.openai import AzureOpenAI
    return AzureOpenAI

def _import_openai() -> Any:
    from langchain.llms.openai import OpenAI
    return OpenAI

def _import_openai_chat() -> Any:
    from langchain.llms.openai import OpenAIChat
    return OpenAIChat

def _import_openllm() -> Any:
    from langchain.llms.openllm import OpenLLM
    return OpenLLM

def _import_openlm() -> Any:
    from langchain.llms.openlm import OpenLM
    return OpenLM

def _import_pai_eas_endpoint() -> Any:
    from langchain.llms.pai_eas_endpoint import PaiEasEndpoint
    return PaiEasEndpoint

def _import_petals() -> Any:
    from langchain.llms.petals import Petals
    return Petals

def _import_pipelineai() -> Any:
    from langchain.llms.pipelineai import PipelineAI
    return PipelineAI

def _import_predibase() -> Any:
    from langchain.llms.predibase import Predibase
    return Predibase

def _import_predictionguard() -> Any:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,127
Volc Engine MaaS has wrong entry in LLM type to class dict (causing SpaCy to not work with LangChain anymore)
### System Info * Windows 11 Home (build 22621.2715) * Python 3.12.0 * Clean virtual environment using Poetry with following dependencies: ``` python = "3.12.0" langchain = "0.0.344" spacy = "3.7.2" spacy-llm = "0.6.4" ``` ### Who can help? @h3l As the creator of the pull request where VolcEngine was introduced @baskaryan As tag handler of that pull request ### Information - [ ] The official example notebooks/scripts - [X] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction Anything that triggers spaCy's registry to make an inventory, for example: ```python import spacy spacy.blank("en") ``` With the last part of the Traceback being: ``` File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain\llms\__init__.py", line 699, in __getattr__ k: v() for k, v in get_type_to_cls_dict().items() ^^^ File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain_core\load\serializable.py", line 97, in __init__ super().__init__(**kwargs) File "PROJECT_FOLDER\.venv\Lib\site-packages\pydantic\v1\main.py", line 341, in __init__ raise validation_error pydantic.v1.error_wrappers.ValidationError: 1 validation error for VolcEngineMaasLLM __root__ Did not find volc_engine_maas_ak, please add an environment variable `VOLC_ACCESSKEY` which contains it, or pass `volc_engine_maas_ak` as a named parameter. (type=value_error) ``` #### What I think causes this I am quite certain that this is caused by [`langchain.llms.__init__.py:869 (for commit b161f30)`](https://github.com/langchain-ai/langchain/blob/b161f302ff56a14d8d0331cbec4a3efa23d06e1a/libs/langchain/langchain/llms/__init__.py#L869C51-L869C51): ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # Line below is the only that actually calls the import function, returning a class instead of an import function "VolcEngineMaasLLM": _import_volcengine_maas(), } ``` The Volc Engine Maas LLM is the only in this dict to actually call the import function, while all other entries only the function itself, and do not call it. ### Expected behavior Class to type dict only returns import functions, not actual classes: ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # What I think would be correct (now without function call) "VolcEngineMaasLLM": _import_volcengine_maas, } ``` Unfortunately I don't have time to put in a PR myself, but I hope this helps finding the solution!
https://github.com/langchain-ai/langchain/issues/14127
https://github.com/langchain-ai/langchain/pull/14194
6ae0194dc70119d8b05a0624a6cc4950f9f84608
818252b1f8b9ac9af6bb80d43b21c5e95d6b2e11
"2023-12-01T13:58:13Z"
python
"2023-12-03T16:43:23Z"
libs/langchain/langchain/llms/__init__.py
    from langchain.llms.predictionguard import PredictionGuard
    return PredictionGuard

def _import_promptlayer() -> Any:
    from langchain.llms.promptlayer_openai import PromptLayerOpenAI
    return PromptLayerOpenAI

def _import_promptlayer_chat() -> Any:
    from langchain.llms.promptlayer_openai import PromptLayerOpenAIChat
    return PromptLayerOpenAIChat

def _import_replicate() -> Any:
    from langchain.llms.replicate import Replicate
    return Replicate

def _import_rwkv() -> Any:
    from langchain.llms.rwkv import RWKV
    return RWKV

def _import_sagemaker_endpoint() -> Any:
    from langchain.llms.sagemaker_endpoint import SagemakerEndpoint
    return SagemakerEndpoint

def _import_self_hosted() -> Any:
    from langchain.llms.self_hosted import SelfHostedPipeline
    return SelfHostedPipeline

def _import_self_hosted_hugging_face() -> Any:
    from langchain.llms.self_hosted_hugging_face import SelfHostedHuggingFaceLLM
    return SelfHostedHuggingFaceLLM

def _import_stochasticai() -> Any:
    from langchain.llms.stochasticai import StochasticAI
    return StochasticAI

def _import_symblai_nebula() -> Any:
    from langchain.llms.symblai_nebula import Nebula
    return Nebula

def _import_textgen() -> Any:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,127
Volc Engine MaaS has wrong entry in LLM type to class dict (causing SpaCy to not work with LangChain anymore)
### System Info * Windows 11 Home (build 22621.2715) * Python 3.12.0 * Clean virtual environment using Poetry with following dependencies: ``` python = "3.12.0" langchain = "0.0.344" spacy = "3.7.2" spacy-llm = "0.6.4" ``` ### Who can help? @h3l As the creator of the pull request where VolcEngine was introduced @baskaryan As tag handler of that pull request ### Information - [ ] The official example notebooks/scripts - [X] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction Anything that triggers spaCy's registry to make an inventory, for example: ```python import spacy spacy.blank("en") ``` With the last part of the Traceback being: ``` File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain\llms\__init__.py", line 699, in __getattr__ k: v() for k, v in get_type_to_cls_dict().items() ^^^ File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain_core\load\serializable.py", line 97, in __init__ super().__init__(**kwargs) File "PROJECT_FOLDER\.venv\Lib\site-packages\pydantic\v1\main.py", line 341, in __init__ raise validation_error pydantic.v1.error_wrappers.ValidationError: 1 validation error for VolcEngineMaasLLM __root__ Did not find volc_engine_maas_ak, please add an environment variable `VOLC_ACCESSKEY` which contains it, or pass `volc_engine_maas_ak` as a named parameter. (type=value_error) ``` #### What I think causes this I am quite certain that this is caused by [`langchain.llms.__init__.py:869 (for commit b161f30)`](https://github.com/langchain-ai/langchain/blob/b161f302ff56a14d8d0331cbec4a3efa23d06e1a/libs/langchain/langchain/llms/__init__.py#L869C51-L869C51): ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # Line below is the only that actually calls the import function, returning a class instead of an import function "VolcEngineMaasLLM": _import_volcengine_maas(), } ``` The Volc Engine Maas LLM is the only in this dict to actually call the import function, while all other entries only the function itself, and do not call it. ### Expected behavior Class to type dict only returns import functions, not actual classes: ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # What I think would be correct (now without function call) "VolcEngineMaasLLM": _import_volcengine_maas, } ``` Unfortunately I don't have time to put in a PR myself, but I hope this helps finding the solution!
https://github.com/langchain-ai/langchain/issues/14127
https://github.com/langchain-ai/langchain/pull/14194
6ae0194dc70119d8b05a0624a6cc4950f9f84608
818252b1f8b9ac9af6bb80d43b21c5e95d6b2e11
"2023-12-01T13:58:13Z"
python
"2023-12-03T16:43:23Z"
libs/langchain/langchain/llms/__init__.py
    from langchain.llms.textgen import TextGen
    return TextGen

def _import_titan_takeoff() -> Any:
    from langchain.llms.titan_takeoff import TitanTakeoff
    return TitanTakeoff

def _import_titan_takeoff_pro() -> Any:
    from langchain.llms.titan_takeoff_pro import TitanTakeoffPro
    return TitanTakeoffPro

def _import_together() -> Any:
    from langchain.llms.together import Together
    return Together

def _import_tongyi() -> Any:
    from langchain.llms.tongyi import Tongyi
    return Tongyi

def _import_vertex() -> Any:
    from langchain.llms.vertexai import VertexAI
    return VertexAI

def _import_vertex_model_garden() -> Any:
    from langchain.llms.vertexai import VertexAIModelGarden
    return VertexAIModelGarden

def _import_vllm() -> Any:
    from langchain.llms.vllm import VLLM
    return VLLM

def _import_vllm_openai() -> Any:
    from langchain.llms.vllm import VLLMOpenAI
    return VLLMOpenAI

def _import_watsonxllm() -> Any:
    from langchain.llms.watsonxllm import WatsonxLLM
    return WatsonxLLM

def _import_writer() -> Any:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,127
Volc Engine MaaS has wrong entry in LLM type to class dict (causing SpaCy to not work with LangChain anymore)
### System Info * Windows 11 Home (build 22621.2715) * Python 3.12.0 * Clean virtual environment using Poetry with following dependencies: ``` python = "3.12.0" langchain = "0.0.344" spacy = "3.7.2" spacy-llm = "0.6.4" ``` ### Who can help? @h3l As the creator of the pull request where VolcEngine was introduced @baskaryan As tag handler of that pull request ### Information - [ ] The official example notebooks/scripts - [X] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction Anything that triggers spaCy's registry to make an inventory, for example: ```python import spacy spacy.blank("en") ``` With the last part of the Traceback being: ``` File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain\llms\__init__.py", line 699, in __getattr__ k: v() for k, v in get_type_to_cls_dict().items() ^^^ File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain_core\load\serializable.py", line 97, in __init__ super().__init__(**kwargs) File "PROJECT_FOLDER\.venv\Lib\site-packages\pydantic\v1\main.py", line 341, in __init__ raise validation_error pydantic.v1.error_wrappers.ValidationError: 1 validation error for VolcEngineMaasLLM __root__ Did not find volc_engine_maas_ak, please add an environment variable `VOLC_ACCESSKEY` which contains it, or pass `volc_engine_maas_ak` as a named parameter. (type=value_error) ``` #### What I think causes this I am quite certain that this is caused by [`langchain.llms.__init__.py:869 (for commit b161f30)`](https://github.com/langchain-ai/langchain/blob/b161f302ff56a14d8d0331cbec4a3efa23d06e1a/libs/langchain/langchain/llms/__init__.py#L869C51-L869C51): ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # Line below is the only that actually calls the import function, returning a class instead of an import function "VolcEngineMaasLLM": _import_volcengine_maas(), } ``` The Volc Engine Maas LLM is the only in this dict to actually call the import function, while all other entries only the function itself, and do not call it. ### Expected behavior Class to type dict only returns import functions, not actual classes: ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # What I think would be correct (now without function call) "VolcEngineMaasLLM": _import_volcengine_maas, } ``` Unfortunately I don't have time to put in a PR myself, but I hope this helps finding the solution!
https://github.com/langchain-ai/langchain/issues/14127
https://github.com/langchain-ai/langchain/pull/14194
6ae0194dc70119d8b05a0624a6cc4950f9f84608
818252b1f8b9ac9af6bb80d43b21c5e95d6b2e11
"2023-12-01T13:58:13Z"
python
"2023-12-03T16:43:23Z"
libs/langchain/langchain/llms/__init__.py
    from langchain.llms.writer import Writer
    return Writer

def _import_xinference() -> Any:
    from langchain.llms.xinference import Xinference
    return Xinference

def _import_yandex_gpt() -> Any:
    from langchain.llms.yandex import YandexGPT
    return YandexGPT

def _import_volcengine_maas() -> Any:
    from langchain.llms.volcengine_maas import VolcEngineMaasLLM
    return VolcEngineMaasLLM

def __getattr__(name: str) -> Any:
    if name == "AI21":
        return _import_ai21()
    elif name == "AlephAlpha":
        return _import_aleph_alpha()
    elif name == "AmazonAPIGateway":
        return _import_amazon_api_gateway()
    elif name == "Anthropic":
        return _import_anthropic()
    elif name == "Anyscale":
        return _import_anyscale()
    elif name == "Arcee":
        return _import_arcee()
    elif name == "Aviary":
        return _import_aviary()
    elif name == "AzureMLOnlineEndpoint":
        return _import_azureml_endpoint()
    elif name == "QianfanLLMEndpoint":
        return _import_baidu_qianfan_endpoint()
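Given the fix the issue proposes (map `"VolcEngineMaasLLM"` to `_import_volcengine_maas` without calling it), a small regression test could guard the invariant that every value in `get_type_to_cls_dict()` is an import function rather than an already-imported class. This is only a hedged sketch assuming `get_type_to_cls_dict` remains importable from `langchain.llms`; no such test is shown in the records here.

```python
# Hypothetical regression test (not from the repository): every registry value
# should be a lazy import function, never a class that was imported eagerly.
import inspect

from langchain.llms import get_type_to_cls_dict


def test_type_to_cls_dict_values_are_import_functions() -> None:
    for name, value in get_type_to_cls_dict().items():
        assert callable(value), f"{name} is not callable"
        assert not inspect.isclass(value), f"{name} maps to a class, not an import function"
```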
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,127
Volc Engine MaaS has wrong entry in LLM type to class dict (causing SpaCy to not work with LangChain anymore)
### System Info * Windows 11 Home (build 22621.2715) * Python 3.12.0 * Clean virtual environment using Poetry with following dependencies: ``` python = "3.12.0" langchain = "0.0.344" spacy = "3.7.2" spacy-llm = "0.6.4" ``` ### Who can help? @h3l As the creator of the pull request where VolcEngine was introduced @baskaryan As tag handler of that pull request ### Information - [ ] The official example notebooks/scripts - [X] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction Anything that triggers spaCy's registry to make an inventory, for example: ```python import spacy spacy.blank("en") ``` With the last part of the Traceback being: ``` File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain\llms\__init__.py", line 699, in __getattr__ k: v() for k, v in get_type_to_cls_dict().items() ^^^ File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain_core\load\serializable.py", line 97, in __init__ super().__init__(**kwargs) File "PROJECT_FOLDER\.venv\Lib\site-packages\pydantic\v1\main.py", line 341, in __init__ raise validation_error pydantic.v1.error_wrappers.ValidationError: 1 validation error for VolcEngineMaasLLM __root__ Did not find volc_engine_maas_ak, please add an environment variable `VOLC_ACCESSKEY` which contains it, or pass `volc_engine_maas_ak` as a named parameter. (type=value_error) ``` #### What I think causes this I am quite certain that this is caused by [`langchain.llms.__init__.py:869 (for commit b161f30)`](https://github.com/langchain-ai/langchain/blob/b161f302ff56a14d8d0331cbec4a3efa23d06e1a/libs/langchain/langchain/llms/__init__.py#L869C51-L869C51): ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # Line below is the only that actually calls the import function, returning a class instead of an import function "VolcEngineMaasLLM": _import_volcengine_maas(), } ``` The Volc Engine Maas LLM is the only in this dict to actually call the import function, while all other entries only the function itself, and do not call it. ### Expected behavior Class to type dict only returns import functions, not actual classes: ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # What I think would be correct (now without function call) "VolcEngineMaasLLM": _import_volcengine_maas, } ``` Unfortunately I don't have time to put in a PR myself, but I hope this helps finding the solution!
https://github.com/langchain-ai/langchain/issues/14127
https://github.com/langchain-ai/langchain/pull/14194
6ae0194dc70119d8b05a0624a6cc4950f9f84608
818252b1f8b9ac9af6bb80d43b21c5e95d6b2e11
"2023-12-01T13:58:13Z"
python
"2023-12-03T16:43:23Z"
libs/langchain/langchain/llms/__init__.py
elif name == "Banana": return _import_bananadev() elif name == "Baseten": return _import_baseten() elif name == "Beam": return _import_beam() elif name == "Bedrock": return _import_bedrock() elif name == "NIBittensorLLM": return _import_bittensor() elif name == "CerebriumAI": return _import_cerebriumai() elif name == "ChatGLM": return _import_chatglm() elif name == "Clarifai": return _import_clarifai() elif name == "Cohere": return _import_cohere() elif name == "CTransformers": return _import_ctransformers() elif name == "CTranslate2": return _import_ctranslate2() elif name == "Databricks": return _import_databricks() elif name == "DeepInfra": return _import_deepinfra() elif name == "DeepSparse": return _import_deepsparse() elif name == "EdenAI": return _import_edenai()
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,127
Volc Engine MaaS has wrong entry in LLM type to class dict (causing SpaCy to not work with LangChain anymore)
### System Info * Windows 11 Home (build 22621.2715) * Python 3.12.0 * Clean virtual environment using Poetry with following dependencies: ``` python = "3.12.0" langchain = "0.0.344" spacy = "3.7.2" spacy-llm = "0.6.4" ``` ### Who can help? @h3l As the creator of the pull request where VolcEngine was introduced @baskaryan As tag handler of that pull request ### Information - [ ] The official example notebooks/scripts - [X] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction Anything that triggers spaCy's registry to make an inventory, for example: ```python import spacy spacy.blank("en") ``` With the last part of the Traceback being: ``` File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain\llms\__init__.py", line 699, in __getattr__ k: v() for k, v in get_type_to_cls_dict().items() ^^^ File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain_core\load\serializable.py", line 97, in __init__ super().__init__(**kwargs) File "PROJECT_FOLDER\.venv\Lib\site-packages\pydantic\v1\main.py", line 341, in __init__ raise validation_error pydantic.v1.error_wrappers.ValidationError: 1 validation error for VolcEngineMaasLLM __root__ Did not find volc_engine_maas_ak, please add an environment variable `VOLC_ACCESSKEY` which contains it, or pass `volc_engine_maas_ak` as a named parameter. (type=value_error) ``` #### What I think causes this I am quite certain that this is caused by [`langchain.llms.__init__.py:869 (for commit b161f30)`](https://github.com/langchain-ai/langchain/blob/b161f302ff56a14d8d0331cbec4a3efa23d06e1a/libs/langchain/langchain/llms/__init__.py#L869C51-L869C51): ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # Line below is the only that actually calls the import function, returning a class instead of an import function "VolcEngineMaasLLM": _import_volcengine_maas(), } ``` The Volc Engine Maas LLM is the only in this dict to actually call the import function, while all other entries only the function itself, and do not call it. ### Expected behavior Class to type dict only returns import functions, not actual classes: ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # What I think would be correct (now without function call) "VolcEngineMaasLLM": _import_volcengine_maas, } ``` Unfortunately I don't have time to put in a PR myself, but I hope this helps finding the solution!
https://github.com/langchain-ai/langchain/issues/14127
https://github.com/langchain-ai/langchain/pull/14194
6ae0194dc70119d8b05a0624a6cc4950f9f84608
818252b1f8b9ac9af6bb80d43b21c5e95d6b2e11
"2023-12-01T13:58:13Z"
python
"2023-12-03T16:43:23Z"
libs/langchain/langchain/llms/__init__.py
elif name == "FakeListLLM": return _import_fake() elif name == "Fireworks": return _import_fireworks() elif name == "ForefrontAI": return _import_forefrontai() elif name == "GigaChat": return _import_gigachat() elif name == "GooglePalm": return _import_google_palm() elif name == "GooseAI": return _import_gooseai() elif name == "GPT4All": return _import_gpt4all() elif name == "GradientLLM": return _import_gradient_ai() elif name == "HuggingFaceEndpoint": return _import_huggingface_endpoint() elif name == "HuggingFaceHub": return _import_huggingface_hub() elif name == "HuggingFacePipeline": return _import_huggingface_pipeline() elif name == "HuggingFaceTextGenInference": return _import_huggingface_text_gen_inference() elif name == "HumanInputLLM": return _import_human() elif name == "JavelinAIGateway": return _import_javelin_ai_gateway() elif name == "KoboldApiLLM": return _import_koboldai()
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,127
Volc Engine MaaS has wrong entry in LLM type to class dict (causing SpaCy to not work with LangChain anymore)
### System Info * Windows 11 Home (build 22621.2715) * Python 3.12.0 * Clean virtual environment using Poetry with following dependencies: ``` python = "3.12.0" langchain = "0.0.344" spacy = "3.7.2" spacy-llm = "0.6.4" ``` ### Who can help? @h3l As the creator of the pull request where VolcEngine was introduced @baskaryan As tag handler of that pull request ### Information - [ ] The official example notebooks/scripts - [X] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction Anything that triggers spaCy's registry to make an inventory, for example: ```python import spacy spacy.blank("en") ``` With the last part of the Traceback being: ``` File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain\llms\__init__.py", line 699, in __getattr__ k: v() for k, v in get_type_to_cls_dict().items() ^^^ File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain_core\load\serializable.py", line 97, in __init__ super().__init__(**kwargs) File "PROJECT_FOLDER\.venv\Lib\site-packages\pydantic\v1\main.py", line 341, in __init__ raise validation_error pydantic.v1.error_wrappers.ValidationError: 1 validation error for VolcEngineMaasLLM __root__ Did not find volc_engine_maas_ak, please add an environment variable `VOLC_ACCESSKEY` which contains it, or pass `volc_engine_maas_ak` as a named parameter. (type=value_error) ``` #### What I think causes this I am quite certain that this is caused by [`langchain.llms.__init__.py:869 (for commit b161f30)`](https://github.com/langchain-ai/langchain/blob/b161f302ff56a14d8d0331cbec4a3efa23d06e1a/libs/langchain/langchain/llms/__init__.py#L869C51-L869C51): ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # Line below is the only that actually calls the import function, returning a class instead of an import function "VolcEngineMaasLLM": _import_volcengine_maas(), } ``` The Volc Engine Maas LLM is the only in this dict to actually call the import function, while all other entries only the function itself, and do not call it. ### Expected behavior Class to type dict only returns import functions, not actual classes: ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # What I think would be correct (now without function call) "VolcEngineMaasLLM": _import_volcengine_maas, } ``` Unfortunately I don't have time to put in a PR myself, but I hope this helps finding the solution!
https://github.com/langchain-ai/langchain/issues/14127
https://github.com/langchain-ai/langchain/pull/14194
6ae0194dc70119d8b05a0624a6cc4950f9f84608
818252b1f8b9ac9af6bb80d43b21c5e95d6b2e11
"2023-12-01T13:58:13Z"
python
"2023-12-03T16:43:23Z"
libs/langchain/langchain/llms/__init__.py
elif name == "LlamaCpp": return _import_llamacpp() elif name == "ManifestWrapper": return _import_manifest() elif name == "Minimax": return _import_minimax() elif name == "Mlflow": return _import_mlflow() elif name == "MlflowAIGateway": return _import_mlflow_ai_gateway() elif name == "Modal": return _import_modal() elif name == "MosaicML": return _import_mosaicml() elif name == "NLPCloud": return _import_nlpcloud() elif name == "OctoAIEndpoint": return _import_octoai_endpoint() elif name == "Ollama": return _import_ollama() elif name == "OpaquePrompts": return _import_opaqueprompts() elif name == "AzureOpenAI": return _import_azure_openai() elif name == "OpenAI": return _import_openai() elif name == "OpenAIChat": return _import_openai_chat() elif name == "OpenLLM": return _import_openllm()
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,127
Volc Engine MaaS has wrong entry in LLM type to class dict (causing SpaCy to not work with LangChain anymore)
### System Info * Windows 11 Home (build 22621.2715) * Python 3.12.0 * Clean virtual environment using Poetry with following dependencies: ``` python = "3.12.0" langchain = "0.0.344" spacy = "3.7.2" spacy-llm = "0.6.4" ``` ### Who can help? @h3l As the creator of the pull request where VolcEngine was introduced @baskaryan As tag handler of that pull request ### Information - [ ] The official example notebooks/scripts - [X] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction Anything that triggers spaCy's registry to make an inventory, for example: ```python import spacy spacy.blank("en") ``` With the last part of the Traceback being: ``` File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain\llms\__init__.py", line 699, in __getattr__ k: v() for k, v in get_type_to_cls_dict().items() ^^^ File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain_core\load\serializable.py", line 97, in __init__ super().__init__(**kwargs) File "PROJECT_FOLDER\.venv\Lib\site-packages\pydantic\v1\main.py", line 341, in __init__ raise validation_error pydantic.v1.error_wrappers.ValidationError: 1 validation error for VolcEngineMaasLLM __root__ Did not find volc_engine_maas_ak, please add an environment variable `VOLC_ACCESSKEY` which contains it, or pass `volc_engine_maas_ak` as a named parameter. (type=value_error) ``` #### What I think causes this I am quite certain that this is caused by [`langchain.llms.__init__.py:869 (for commit b161f30)`](https://github.com/langchain-ai/langchain/blob/b161f302ff56a14d8d0331cbec4a3efa23d06e1a/libs/langchain/langchain/llms/__init__.py#L869C51-L869C51): ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # Line below is the only that actually calls the import function, returning a class instead of an import function "VolcEngineMaasLLM": _import_volcengine_maas(), } ``` The Volc Engine Maas LLM is the only in this dict to actually call the import function, while all other entries only the function itself, and do not call it. ### Expected behavior Class to type dict only returns import functions, not actual classes: ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # What I think would be correct (now without function call) "VolcEngineMaasLLM": _import_volcengine_maas, } ``` Unfortunately I don't have time to put in a PR myself, but I hope this helps finding the solution!
https://github.com/langchain-ai/langchain/issues/14127
https://github.com/langchain-ai/langchain/pull/14194
6ae0194dc70119d8b05a0624a6cc4950f9f84608
818252b1f8b9ac9af6bb80d43b21c5e95d6b2e11
"2023-12-01T13:58:13Z"
python
"2023-12-03T16:43:23Z"
libs/langchain/langchain/llms/__init__.py
    elif name == "OpenLM":
        return _import_openlm()
    elif name == "PaiEasEndpoint":
        return _import_pai_eas_endpoint()
    elif name == "Petals":
        return _import_petals()
    elif name == "PipelineAI":
        return _import_pipelineai()
    elif name == "Predibase":
        return _import_predibase()
    elif name == "PredictionGuard":
        return _import_predictionguard()
    elif name == "PromptLayerOpenAI":
        return _import_promptlayer()
    elif name == "PromptLayerOpenAIChat":
        return _import_promptlayer_chat()
    elif name == "Replicate":
        return _import_replicate()
    elif name == "RWKV":
        return _import_rwkv()
    elif name == "SagemakerEndpoint":
        return _import_sagemaker_endpoint()
    elif name == "SelfHostedPipeline":
        return _import_self_hosted()
    elif name == "SelfHostedHuggingFaceLLM":
        return _import_self_hosted_hugging_face()
    elif name == "StochasticAI":
        return _import_stochasticai()
    elif name == "Nebula":
        return _import_symblai_nebula()
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,127
Volc Engine MaaS has wrong entry in LLM type to class dict (causing SpaCy to not work with LangChain anymore)
### System Info * Windows 11 Home (build 22621.2715) * Python 3.12.0 * Clean virtual environment using Poetry with following dependencies: ``` python = "3.12.0" langchain = "0.0.344" spacy = "3.7.2" spacy-llm = "0.6.4" ``` ### Who can help? @h3l As the creator of the pull request where VolcEngine was introduced @baskaryan As tag handler of that pull request ### Information - [ ] The official example notebooks/scripts - [X] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction Anything that triggers spaCy's registry to make an inventory, for example: ```python import spacy spacy.blank("en") ``` With the last part of the Traceback being: ``` File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain\llms\__init__.py", line 699, in __getattr__ k: v() for k, v in get_type_to_cls_dict().items() ^^^ File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain_core\load\serializable.py", line 97, in __init__ super().__init__(**kwargs) File "PROJECT_FOLDER\.venv\Lib\site-packages\pydantic\v1\main.py", line 341, in __init__ raise validation_error pydantic.v1.error_wrappers.ValidationError: 1 validation error for VolcEngineMaasLLM __root__ Did not find volc_engine_maas_ak, please add an environment variable `VOLC_ACCESSKEY` which contains it, or pass `volc_engine_maas_ak` as a named parameter. (type=value_error) ``` #### What I think causes this I am quite certain that this is caused by [`langchain.llms.__init__.py:869 (for commit b161f30)`](https://github.com/langchain-ai/langchain/blob/b161f302ff56a14d8d0331cbec4a3efa23d06e1a/libs/langchain/langchain/llms/__init__.py#L869C51-L869C51): ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # Line below is the only that actually calls the import function, returning a class instead of an import function "VolcEngineMaasLLM": _import_volcengine_maas(), } ``` The Volc Engine Maas LLM is the only in this dict to actually call the import function, while all other entries only the function itself, and do not call it. ### Expected behavior Class to type dict only returns import functions, not actual classes: ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # What I think would be correct (now without function call) "VolcEngineMaasLLM": _import_volcengine_maas, } ``` Unfortunately I don't have time to put in a PR myself, but I hope this helps finding the solution!
https://github.com/langchain-ai/langchain/issues/14127
https://github.com/langchain-ai/langchain/pull/14194
6ae0194dc70119d8b05a0624a6cc4950f9f84608
818252b1f8b9ac9af6bb80d43b21c5e95d6b2e11
"2023-12-01T13:58:13Z"
python
"2023-12-03T16:43:23Z"
libs/langchain/langchain/llms/__init__.py
    elif name == "TextGen":
        return _import_textgen()
    elif name == "TitanTakeoff":
        return _import_titan_takeoff()
    elif name == "TitanTakeoffPro":
        return _import_titan_takeoff_pro()
    elif name == "Together":
        return _import_together()
    elif name == "Tongyi":
        return _import_tongyi()
    elif name == "VertexAI":
        return _import_vertex()
    elif name == "VertexAIModelGarden":
        return _import_vertex_model_garden()
    elif name == "VLLM":
        return _import_vllm()
    elif name == "VLLMOpenAI":
        return _import_vllm_openai()
    elif name == "WatsonxLLM":
        return _import_watsonxllm()
    elif name == "Writer":
        return _import_writer()
    elif name == "Xinference":
        return _import_xinference()
    elif name == "YandexGPT":
        return _import_yandex_gpt()
    elif name == "VolcEngineMaasLLM":
        return _import_volcengine_maas()
    elif name == "type_to_cls_dict":
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,127
Volc Engine MaaS has wrong entry in LLM type to class dict (causing SpaCy to not work with LangChain anymore)
### System Info * Windows 11 Home (build 22621.2715) * Python 3.12.0 * Clean virtual environment using Poetry with following dependencies: ``` python = "3.12.0" langchain = "0.0.344" spacy = "3.7.2" spacy-llm = "0.6.4" ``` ### Who can help? @h3l As the creator of the pull request where VolcEngine was introduced @baskaryan As tag handler of that pull request ### Information - [ ] The official example notebooks/scripts - [X] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction Anything that triggers spaCy's registry to make an inventory, for example: ```python import spacy spacy.blank("en") ``` With the last part of the Traceback being: ``` File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain\llms\__init__.py", line 699, in __getattr__ k: v() for k, v in get_type_to_cls_dict().items() ^^^ File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain_core\load\serializable.py", line 97, in __init__ super().__init__(**kwargs) File "PROJECT_FOLDER\.venv\Lib\site-packages\pydantic\v1\main.py", line 341, in __init__ raise validation_error pydantic.v1.error_wrappers.ValidationError: 1 validation error for VolcEngineMaasLLM __root__ Did not find volc_engine_maas_ak, please add an environment variable `VOLC_ACCESSKEY` which contains it, or pass `volc_engine_maas_ak` as a named parameter. (type=value_error) ``` #### What I think causes this I am quite certain that this is caused by [`langchain.llms.__init__.py:869 (for commit b161f30)`](https://github.com/langchain-ai/langchain/blob/b161f302ff56a14d8d0331cbec4a3efa23d06e1a/libs/langchain/langchain/llms/__init__.py#L869C51-L869C51): ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # Line below is the only that actually calls the import function, returning a class instead of an import function "VolcEngineMaasLLM": _import_volcengine_maas(), } ``` The Volc Engine Maas LLM is the only in this dict to actually call the import function, while all other entries only the function itself, and do not call it. ### Expected behavior Class to type dict only returns import functions, not actual classes: ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # What I think would be correct (now without function call) "VolcEngineMaasLLM": _import_volcengine_maas, } ``` Unfortunately I don't have time to put in a PR myself, but I hope this helps finding the solution!
https://github.com/langchain-ai/langchain/issues/14127
https://github.com/langchain-ai/langchain/pull/14194
6ae0194dc70119d8b05a0624a6cc4950f9f84608
818252b1f8b9ac9af6bb80d43b21c5e95d6b2e11
"2023-12-01T13:58:13Z"
python
"2023-12-03T16:43:23Z"
libs/langchain/langchain/llms/__init__.py
        type_to_cls_dict: Dict[str, Type[BaseLLM]] = {
            k: v() for k, v in get_type_to_cls_dict().items()
        }
        return type_to_cls_dict
    else:
        raise AttributeError(f"Could not find: {name}")

__all__ = [
    "AI21",
    "AlephAlpha",
    "AmazonAPIGateway",
    "Anthropic",
    "Anyscale",
    "Arcee",
    "Aviary",
    "AzureMLOnlineEndpoint",
    "AzureOpenAI",
    "Banana",
    "Baseten",
    "Beam",
    "Bedrock",
    "CTransformers",
    "CTranslate2",
    "CerebriumAI",
    "ChatGLM",
    "Clarifai",
    "Cohere",
    "Databricks",
    "DeepInfra",
    "DeepSparse",
    "EdenAI",
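The chunk above shows the contract at the heart of issue 14,127: `get_type_to_cls_dict()` is meant to map type names to zero-argument import functions, and the module-level `__getattr__` builds `type_to_cls_dict` by calling each value exactly once. Below is a minimal, self-contained sketch of why storing the *result* of an import function breaks that contract; every name in it is a simplified stand-in, not the real LangChain code.

```python
from typing import Callable, Dict


class VolcEngineMaasLLM:
    """Stand-in for the real class: constructing it requires credentials."""

    def __init__(self) -> None:
        raise ValueError("Did not find volc_engine_maas_ak ...")


def _import_volcengine_maas() -> type:
    # A real lazy importer would import the class here and return it.
    return VolcEngineMaasLLM


def get_type_to_cls_dict() -> Dict[str, Callable[[], type]]:
    # Correct shape: the value is the import function itself, not its result.
    return {"VolcEngineMaasLLM": _import_volcengine_maas}


# Consumers (the __getattr__ above, or spaCy taking inventory) call every value once:
registry = {k: v() for k, v in get_type_to_cls_dict().items()}
assert registry["VolcEngineMaasLLM"] is VolcEngineMaasLLM

# With the buggy entry `_import_volcengine_maas()`, the stored value is already the
# class, so `v()` instantiates it and the credential check raises, which is exactly
# the ValidationError in the traceback quoted in the issue body.
```

This matches the expected behavior spelled out in the issue body: keep the entry as the bare function and let consumers decide when to call it.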
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,127
Volc Engine MaaS has wrong entry in LLM type to class dict (causing SpaCy to not work with LangChain anymore)
### System Info * Windows 11 Home (build 22621.2715) * Python 3.12.0 * Clean virtual environment using Poetry with following dependencies: ``` python = "3.12.0" langchain = "0.0.344" spacy = "3.7.2" spacy-llm = "0.6.4" ``` ### Who can help? @h3l As the creator of the pull request where VolcEngine was introduced @baskaryan As tag handler of that pull request ### Information - [ ] The official example notebooks/scripts - [X] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction Anything that triggers spaCy's registry to make an inventory, for example: ```python import spacy spacy.blank("en") ``` With the last part of the Traceback being: ``` File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain\llms\__init__.py", line 699, in __getattr__ k: v() for k, v in get_type_to_cls_dict().items() ^^^ File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain_core\load\serializable.py", line 97, in __init__ super().__init__(**kwargs) File "PROJECT_FOLDER\.venv\Lib\site-packages\pydantic\v1\main.py", line 341, in __init__ raise validation_error pydantic.v1.error_wrappers.ValidationError: 1 validation error for VolcEngineMaasLLM __root__ Did not find volc_engine_maas_ak, please add an environment variable `VOLC_ACCESSKEY` which contains it, or pass `volc_engine_maas_ak` as a named parameter. (type=value_error) ``` #### What I think causes this I am quite certain that this is caused by [`langchain.llms.__init__.py:869 (for commit b161f30)`](https://github.com/langchain-ai/langchain/blob/b161f302ff56a14d8d0331cbec4a3efa23d06e1a/libs/langchain/langchain/llms/__init__.py#L869C51-L869C51): ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # Line below is the only that actually calls the import function, returning a class instead of an import function "VolcEngineMaasLLM": _import_volcengine_maas(), } ``` The Volc Engine Maas LLM is the only in this dict to actually call the import function, while all other entries only the function itself, and do not call it. ### Expected behavior Class to type dict only returns import functions, not actual classes: ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # What I think would be correct (now without function call) "VolcEngineMaasLLM": _import_volcengine_maas, } ``` Unfortunately I don't have time to put in a PR myself, but I hope this helps finding the solution!
https://github.com/langchain-ai/langchain/issues/14127
https://github.com/langchain-ai/langchain/pull/14194
6ae0194dc70119d8b05a0624a6cc4950f9f84608
818252b1f8b9ac9af6bb80d43b21c5e95d6b2e11
"2023-12-01T13:58:13Z"
python
"2023-12-03T16:43:23Z"
libs/langchain/langchain/llms/__init__.py
    "FakeListLLM",
    "Fireworks",
    "ForefrontAI",
    "GigaChat",
    "GPT4All",
    "GooglePalm",
    "GooseAI",
    "GradientLLM",
    "HuggingFaceEndpoint",
    "HuggingFaceHub",
    "HuggingFacePipeline",
    "HuggingFaceTextGenInference",
    "HumanInputLLM",
    "KoboldApiLLM",
    "LlamaCpp",
    "TextGen",
    "ManifestWrapper",
    "Minimax",
    "MlflowAIGateway",
    "Modal",
    "MosaicML",
    "Nebula",
    "NIBittensorLLM",
    "NLPCloud",
    "Ollama",
    "OpenAI",
    "OpenAIChat",
    "OpenLLM",
    "OpenLM",
    "PaiEasEndpoint",
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,127
Volc Engine MaaS has wrong entry in LLM type to class dict (causing SpaCy to not work with LangChain anymore)
### System Info * Windows 11 Home (build 22621.2715) * Python 3.12.0 * Clean virtual environment using Poetry with following dependencies: ``` python = "3.12.0" langchain = "0.0.344" spacy = "3.7.2" spacy-llm = "0.6.4" ``` ### Who can help? @h3l As the creator of the pull request where VolcEngine was introduced @baskaryan As tag handler of that pull request ### Information - [ ] The official example notebooks/scripts - [X] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction Anything that triggers spaCy's registry to make an inventory, for example: ```python import spacy spacy.blank("en") ``` With the last part of the Traceback being: ``` File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain\llms\__init__.py", line 699, in __getattr__ k: v() for k, v in get_type_to_cls_dict().items() ^^^ File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain_core\load\serializable.py", line 97, in __init__ super().__init__(**kwargs) File "PROJECT_FOLDER\.venv\Lib\site-packages\pydantic\v1\main.py", line 341, in __init__ raise validation_error pydantic.v1.error_wrappers.ValidationError: 1 validation error for VolcEngineMaasLLM __root__ Did not find volc_engine_maas_ak, please add an environment variable `VOLC_ACCESSKEY` which contains it, or pass `volc_engine_maas_ak` as a named parameter. (type=value_error) ``` #### What I think causes this I am quite certain that this is caused by [`langchain.llms.__init__.py:869 (for commit b161f30)`](https://github.com/langchain-ai/langchain/blob/b161f302ff56a14d8d0331cbec4a3efa23d06e1a/libs/langchain/langchain/llms/__init__.py#L869C51-L869C51): ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # Line below is the only that actually calls the import function, returning a class instead of an import function "VolcEngineMaasLLM": _import_volcengine_maas(), } ``` The Volc Engine Maas LLM is the only in this dict to actually call the import function, while all other entries only the function itself, and do not call it. ### Expected behavior Class to type dict only returns import functions, not actual classes: ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # What I think would be correct (now without function call) "VolcEngineMaasLLM": _import_volcengine_maas, } ``` Unfortunately I don't have time to put in a PR myself, but I hope this helps finding the solution!
https://github.com/langchain-ai/langchain/issues/14127
https://github.com/langchain-ai/langchain/pull/14194
6ae0194dc70119d8b05a0624a6cc4950f9f84608
818252b1f8b9ac9af6bb80d43b21c5e95d6b2e11
"2023-12-01T13:58:13Z"
python
"2023-12-03T16:43:23Z"
libs/langchain/langchain/llms/__init__.py
    "Petals",
    "PipelineAI",
    "Predibase",
    "PredictionGuard",
    "PromptLayerOpenAI",
    "PromptLayerOpenAIChat",
    "OpaquePrompts",
    "RWKV",
    "Replicate",
    "SagemakerEndpoint",
    "SelfHostedHuggingFaceLLM",
    "SelfHostedPipeline",
    "StochasticAI",
    "TitanTakeoff",
    "TitanTakeoffPro",
    "Tongyi",
    "VertexAI",
    "VertexAIModelGarden",
    "VLLM",
    "VLLMOpenAI",
    "WatsonxLLM",
    "Writer",
    "OctoAIEndpoint",
    "Xinference",
    "JavelinAIGateway",
    "QianfanLLMEndpoint",
    "YandexGPT",
    "VolcEngineMaasLLM",
]

def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,127
Volc Engine MaaS has wrong entry in LLM type to class dict (causing SpaCy to not work with LangChain anymore)
### System Info * Windows 11 Home (build 22621.2715) * Python 3.12.0 * Clean virtual environment using Poetry with following dependencies: ``` python = "3.12.0" langchain = "0.0.344" spacy = "3.7.2" spacy-llm = "0.6.4" ``` ### Who can help? @h3l As the creator of the pull request where VolcEngine was introduced @baskaryan As tag handler of that pull request ### Information - [ ] The official example notebooks/scripts - [X] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction Anything that triggers spaCy's registry to make an inventory, for example: ```python import spacy spacy.blank("en") ``` With the last part of the Traceback being: ``` File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain\llms\__init__.py", line 699, in __getattr__ k: v() for k, v in get_type_to_cls_dict().items() ^^^ File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain_core\load\serializable.py", line 97, in __init__ super().__init__(**kwargs) File "PROJECT_FOLDER\.venv\Lib\site-packages\pydantic\v1\main.py", line 341, in __init__ raise validation_error pydantic.v1.error_wrappers.ValidationError: 1 validation error for VolcEngineMaasLLM __root__ Did not find volc_engine_maas_ak, please add an environment variable `VOLC_ACCESSKEY` which contains it, or pass `volc_engine_maas_ak` as a named parameter. (type=value_error) ``` #### What I think causes this I am quite certain that this is caused by [`langchain.llms.__init__.py:869 (for commit b161f30)`](https://github.com/langchain-ai/langchain/blob/b161f302ff56a14d8d0331cbec4a3efa23d06e1a/libs/langchain/langchain/llms/__init__.py#L869C51-L869C51): ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # Line below is the only that actually calls the import function, returning a class instead of an import function "VolcEngineMaasLLM": _import_volcengine_maas(), } ``` The Volc Engine Maas LLM is the only in this dict to actually call the import function, while all other entries only the function itself, and do not call it. ### Expected behavior Class to type dict only returns import functions, not actual classes: ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # What I think would be correct (now without function call) "VolcEngineMaasLLM": _import_volcengine_maas, } ``` Unfortunately I don't have time to put in a PR myself, but I hope this helps finding the solution!
https://github.com/langchain-ai/langchain/issues/14127
https://github.com/langchain-ai/langchain/pull/14194
6ae0194dc70119d8b05a0624a6cc4950f9f84608
818252b1f8b9ac9af6bb80d43b21c5e95d6b2e11
"2023-12-01T13:58:13Z"
python
"2023-12-03T16:43:23Z"
libs/langchain/langchain/llms/__init__.py
    return {
        "ai21": _import_ai21,
        "aleph_alpha": _import_aleph_alpha,
        "amazon_api_gateway": _import_amazon_api_gateway,
        "amazon_bedrock": _import_bedrock,
        "anthropic": _import_anthropic,
        "anyscale": _import_anyscale,
        "arcee": _import_arcee,
        "aviary": _import_aviary,
        "azure": _import_azure_openai,
        "azureml_endpoint": _import_azureml_endpoint,
        "bananadev": _import_bananadev,
        "baseten": _import_baseten,
        "beam": _import_beam,
        "cerebriumai": _import_cerebriumai,
        "chat_glm": _import_chatglm,
        "clarifai": _import_clarifai,
        "cohere": _import_cohere,
        "ctransformers": _import_ctransformers,
        "ctranslate2": _import_ctranslate2,
        "databricks": _import_databricks,
        "databricks-chat": _import_databricks_chat,
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,127
Volc Engine MaaS has wrong entry in LLM type to class dict (causing SpaCy to not work with LangChain anymore)
### System Info * Windows 11 Home (build 22621.2715) * Python 3.12.0 * Clean virtual environment using Poetry with following dependencies: ``` python = "3.12.0" langchain = "0.0.344" spacy = "3.7.2" spacy-llm = "0.6.4" ``` ### Who can help? @h3l As the creator of the pull request where VolcEngine was introduced @baskaryan As tag handler of that pull request ### Information - [ ] The official example notebooks/scripts - [X] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction Anything that triggers spaCy's registry to make an inventory, for example: ```python import spacy spacy.blank("en") ``` With the last part of the Traceback being: ``` File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain\llms\__init__.py", line 699, in __getattr__ k: v() for k, v in get_type_to_cls_dict().items() ^^^ File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain_core\load\serializable.py", line 97, in __init__ super().__init__(**kwargs) File "PROJECT_FOLDER\.venv\Lib\site-packages\pydantic\v1\main.py", line 341, in __init__ raise validation_error pydantic.v1.error_wrappers.ValidationError: 1 validation error for VolcEngineMaasLLM __root__ Did not find volc_engine_maas_ak, please add an environment variable `VOLC_ACCESSKEY` which contains it, or pass `volc_engine_maas_ak` as a named parameter. (type=value_error) ``` #### What I think causes this I am quite certain that this is caused by [`langchain.llms.__init__.py:869 (for commit b161f30)`](https://github.com/langchain-ai/langchain/blob/b161f302ff56a14d8d0331cbec4a3efa23d06e1a/libs/langchain/langchain/llms/__init__.py#L869C51-L869C51): ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # Line below is the only that actually calls the import function, returning a class instead of an import function "VolcEngineMaasLLM": _import_volcengine_maas(), } ``` The Volc Engine Maas LLM is the only in this dict to actually call the import function, while all other entries only the function itself, and do not call it. ### Expected behavior Class to type dict only returns import functions, not actual classes: ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # What I think would be correct (now without function call) "VolcEngineMaasLLM": _import_volcengine_maas, } ``` Unfortunately I don't have time to put in a PR myself, but I hope this helps finding the solution!
https://github.com/langchain-ai/langchain/issues/14127
https://github.com/langchain-ai/langchain/pull/14194
6ae0194dc70119d8b05a0624a6cc4950f9f84608
818252b1f8b9ac9af6bb80d43b21c5e95d6b2e11
"2023-12-01T13:58:13Z"
python
"2023-12-03T16:43:23Z"
libs/langchain/langchain/llms/__init__.py
        "deepinfra": _import_deepinfra,
        "deepsparse": _import_deepsparse,
        "edenai": _import_edenai,
        "fake-list": _import_fake,
        "forefrontai": _import_forefrontai,
        "giga-chat-model": _import_gigachat,
        "google_palm": _import_google_palm,
        "gooseai": _import_gooseai,
        "gradient": _import_gradient_ai,
        "gpt4all": _import_gpt4all,
        "huggingface_endpoint": _import_huggingface_endpoint,
        "huggingface_hub": _import_huggingface_hub,
        "huggingface_pipeline": _import_huggingface_pipeline,
        "huggingface_textgen_inference": _import_huggingface_text_gen_inference,
        "human-input": _import_human,
        "koboldai": _import_koboldai,
        "llamacpp": _import_llamacpp,
        "textgen": _import_textgen,
        "minimax": _import_minimax,
        "mlflow": _import_mlflow,
        "mlflow-chat": _import_mlflow_chat,
        "mlflow-ai-gateway": _import_mlflow_ai_gateway,
        "modal": _import_modal,
        "mosaic": _import_mosaicml,
        "nebula": _import_symblai_nebula,
        "nibittensor": _import_bittensor,
        "nlpcloud": _import_nlpcloud,
        "ollama": _import_ollama,
        "openai": _import_openai,
        "openlm": _import_openlm,
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,127
Volc Engine MaaS has wrong entry in LLM type to class dict (causing SpaCy to not work with LangChain anymore)
### System Info * Windows 11 Home (build 22621.2715) * Python 3.12.0 * Clean virtual environment using Poetry with following dependencies: ``` python = "3.12.0" langchain = "0.0.344" spacy = "3.7.2" spacy-llm = "0.6.4" ``` ### Who can help? @h3l As the creator of the pull request where VolcEngine was introduced @baskaryan As tag handler of that pull request ### Information - [ ] The official example notebooks/scripts - [X] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction Anything that triggers spaCy's registry to make an inventory, for example: ```python import spacy spacy.blank("en") ``` With the last part of the Traceback being: ``` File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain\llms\__init__.py", line 699, in __getattr__ k: v() for k, v in get_type_to_cls_dict().items() ^^^ File "PROJECT_FOLDER\.venv\Lib\site-packages\langchain_core\load\serializable.py", line 97, in __init__ super().__init__(**kwargs) File "PROJECT_FOLDER\.venv\Lib\site-packages\pydantic\v1\main.py", line 341, in __init__ raise validation_error pydantic.v1.error_wrappers.ValidationError: 1 validation error for VolcEngineMaasLLM __root__ Did not find volc_engine_maas_ak, please add an environment variable `VOLC_ACCESSKEY` which contains it, or pass `volc_engine_maas_ak` as a named parameter. (type=value_error) ``` #### What I think causes this I am quite certain that this is caused by [`langchain.llms.__init__.py:869 (for commit b161f30)`](https://github.com/langchain-ai/langchain/blob/b161f302ff56a14d8d0331cbec4a3efa23d06e1a/libs/langchain/langchain/llms/__init__.py#L869C51-L869C51): ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # Line below is the only that actually calls the import function, returning a class instead of an import function "VolcEngineMaasLLM": _import_volcengine_maas(), } ``` The Volc Engine Maas LLM is the only in this dict to actually call the import function, while all other entries only the function itself, and do not call it. ### Expected behavior Class to type dict only returns import functions, not actual classes: ```python def get_type_to_cls_dict() -> Dict[str, Callable[[], Type[BaseLLM]]]: return { "ai21": _import_ai21, "aleph_alpha": _import_aleph_alpha, "amazon_api_gateway": _import_amazon_api_gateway, ... "qianfan_endpoint": _import_baidu_qianfan_endpoint, "yandex_gpt": _import_yandex_gpt, # What I think would be correct (now without function call) "VolcEngineMaasLLM": _import_volcengine_maas, } ``` Unfortunately I don't have time to put in a PR myself, but I hope this helps finding the solution!
https://github.com/langchain-ai/langchain/issues/14127
https://github.com/langchain-ai/langchain/pull/14194
6ae0194dc70119d8b05a0624a6cc4950f9f84608
818252b1f8b9ac9af6bb80d43b21c5e95d6b2e11
"2023-12-01T13:58:13Z"
python
"2023-12-03T16:43:23Z"
libs/langchain/langchain/llms/__init__.py
        "pai_eas_endpoint": _import_pai_eas_endpoint,
        "petals": _import_petals,
        "pipelineai": _import_pipelineai,
        "predibase": _import_predibase,
        "opaqueprompts": _import_opaqueprompts,
        "replicate": _import_replicate,
        "rwkv": _import_rwkv,
        "sagemaker_endpoint": _import_sagemaker_endpoint,
        "self_hosted": _import_self_hosted,
        "self_hosted_hugging_face": _import_self_hosted_hugging_face,
        "stochasticai": _import_stochasticai,
        "together": _import_together,
        "tongyi": _import_tongyi,
        "titan_takeoff": _import_titan_takeoff,
        "titan_takeoff_pro": _import_titan_takeoff_pro,
        "vertexai": _import_vertex,
        "vertexai_model_garden": _import_vertex_model_garden,
        "openllm": _import_openllm,
        "openllm_client": _import_openllm,
        "vllm": _import_vllm,
        "vllm_openai": _import_vllm_openai,
        "watsonxllm": _import_watsonxllm,
        "writer": _import_writer,
        "xinference": _import_xinference,
        "javelin-ai-gateway": _import_javelin_ai_gateway,
        "qianfan_endpoint": _import_baidu_qianfan_endpoint,
        "yandex_gpt": _import_yandex_gpt,
        "VolcEngineMaasLLM": _import_volcengine_maas(),
    }
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,069
AzureOpenAI azure_ad_token_provider Keyerror
### System Info When I use below snippet of code ``` import os from azure.identity import DefaultAzureCredential from azure.identity import get_bearer_token_provider from langchain.llms import AzureOpenAI from langchain.chat_models import AzureChatOpenAI credential = DefaultAzureCredential(interactive_browser_tenant_id=tenant_id, interactive_browser_client_id=client_id, client_secret=client_secret) token_provider = get_bearer_token_provider(credential, "https://cognitiveservices.azure.com/.default") endpoint = "https://xxxx.openai.azure.com" client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) ``` I get error : ```--------------------------------------------------------------------------- KeyError Traceback (most recent call last) Cell In[36], line 21 18 # api_version = "2023-05-15" 19 endpoint = "https://xxxx.openai.azure.com" ---> 21 client = AzureOpenAI( 22 azure_endpoint=endpoint, 23 api_version="2023-05-15", 24 azure_deployment="example-gpt-4", 25 azure_ad_token_provider=token_provider, 26 ) File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/langchain_core/load/serializable.py:97, in Serializable.__init__(self, **kwargs) 96 def __init__(self, **kwargs: Any) -> None: ---> 97 super().__init__(**kwargs) 98 self._lc_kwargs = kwargs File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/pydantic/v1/main.py:339, in BaseModel.__init__(__pydantic_self__, **data) 333 """ 334 Create a new model by parsing and validating input data from keyword arguments. 335 336 Raises ValidationError if the input data cannot be parsed to form a valid model. 337 """ 338 # Uses something other than `self` the first arg to allow "self" as a settable attribute --> 339 values, fields_set, validation_error = validate_model(__pydantic_self__.__class__, data) 340 if validation_error: 341 raise validation_error File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/pydantic/v1/main.py:1102, in validate_model(model, input_data, cls) 1100 continue 1101 try: -> 1102 values = validator(cls_, values) 1103 except (ValueError, TypeError, AssertionError) as exc: 1104 errors.append(ErrorWrapper(exc, loc=ROOT_KEY)) File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/langchain/llms/openai.py:887, in AzureOpenAI.validate_environment(cls, values) 877 values["openai_api_base"] += ( 878 "/deployments/" + values["deployment_name"] 879 ) 880 values["deployment_name"] = None 881 client_params = { 882 "api_version": values["openai_api_version"], 883 "azure_endpoint": values["azure_endpoint"], 884 "azure_deployment": values["deployment_name"], 885 "api_key": values["openai_api_key"], 886 "azure_ad_token": values["azure_ad_token"], --> 887 "azure_ad_token_provider": values["azure_ad_token_provider"], 888 "organization": values["openai_organization"], 889 "base_url": values["openai_api_base"], 890 "timeout": values["request_timeout"], 891 "max_retries": values["max_retries"], 892 "default_headers": values["default_headers"], 893 "default_query": values["default_query"], 894 "http_client": values["http_client"], 895 } 896 values["client"] = openai.AzureOpenAI(**client_params).completions 897 values["async_client"] = openai.AsyncAzureOpenAI( 898 **client_params 899 ).completions KeyError: 'azure_ad_token_provider' ``` Ive also tried AzureChatOpenAI , and I get the same error back. The error is not reproduced when I use openai library AzureOpenAI . 
Also on openai the azure_ad_token_provider has type azure_ad_token_provider: 'AzureADTokenProvider | None' = None while in langchain it has type azure_ad_token_provider: Optional[str] = None which also makes me wonder if it would take as input a different type than string to work with. any ideas on how to fix this? Im actually using Azure Service principal authentication, and if I use as alternative field azure_ad_token = credential.get_token(“https://cognitiveservices.azure.com/.default”).token I get token expired after 60min which does not happen with a bearer token, so It is important to me to make the token_provider work. libraries : pydantic 1.10.12 pydantic_core 2.10.1 openai 1.2.0 langchain 0.0.342 langchain-core 0.0.7 ### Who can help? @hwchase17 @agola11 ### Information - [X] The official example notebooks/scripts - [ ] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction import os from azure.identity import DefaultAzureCredential from azure.identity import get_bearer_token_provider from langchain.llms import AzureOpenAI from langchain.chat_models import AzureChatOpenAI credential = DefaultAzureCredential(interactive_browser_tenant_id=tenant_id, interactive_browser_client_id=client_id, client_secret=client_secret) token_provider = get_bearer_token_provider(credential, "https://cognitiveservices.azure.com/.default") endpoint = "https://xxxx.openai.azure.com" client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) ### Expected behavior client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) should return a Runnable instance which I can use for LLMChain
https://github.com/langchain-ai/langchain/issues/14069
https://github.com/langchain-ai/langchain/pull/14166
9938086df07d69d24f9770209ea9087d3b906155
62505043be20cf8af491e30785a6ca0eeb1d276e
"2023-11-30T13:39:55Z"
python
"2023-12-03T16:55:25Z"
libs/langchain/langchain/chat_models/azure_openai.py
"""Azure OpenAI chat wrapper."""
from __future__ import annotations

import logging
import os
import warnings
from typing import Any, Dict, Union

from langchain_core.outputs import ChatResult
from langchain_core.pydantic_v1 import BaseModel, Field, root_validator

from langchain.chat_models.openai import ChatOpenAI
from langchain.utils import get_from_dict_or_env
from langchain.utils.openai import is_openai_v1

logger = logging.getLogger(__name__)


class AzureChatOpenAI(ChatOpenAI):
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,069
AzureOpenAI azure_ad_token_provider Keyerror
### System Info When I use below snippet of code ``` import os from azure.identity import DefaultAzureCredential from azure.identity import get_bearer_token_provider from langchain.llms import AzureOpenAI from langchain.chat_models import AzureChatOpenAI credential = DefaultAzureCredential(interactive_browser_tenant_id=tenant_id, interactive_browser_client_id=client_id, client_secret=client_secret) token_provider = get_bearer_token_provider(credential, "https://cognitiveservices.azure.com/.default") endpoint = "https://xxxx.openai.azure.com" client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) ``` I get error : ```--------------------------------------------------------------------------- KeyError Traceback (most recent call last) Cell In[36], line 21 18 # api_version = "2023-05-15" 19 endpoint = "https://xxxx.openai.azure.com" ---> 21 client = AzureOpenAI( 22 azure_endpoint=endpoint, 23 api_version="2023-05-15", 24 azure_deployment="example-gpt-4", 25 azure_ad_token_provider=token_provider, 26 ) File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/langchain_core/load/serializable.py:97, in Serializable.__init__(self, **kwargs) 96 def __init__(self, **kwargs: Any) -> None: ---> 97 super().__init__(**kwargs) 98 self._lc_kwargs = kwargs File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/pydantic/v1/main.py:339, in BaseModel.__init__(__pydantic_self__, **data) 333 """ 334 Create a new model by parsing and validating input data from keyword arguments. 335 336 Raises ValidationError if the input data cannot be parsed to form a valid model. 337 """ 338 # Uses something other than `self` the first arg to allow "self" as a settable attribute --> 339 values, fields_set, validation_error = validate_model(__pydantic_self__.__class__, data) 340 if validation_error: 341 raise validation_error File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/pydantic/v1/main.py:1102, in validate_model(model, input_data, cls) 1100 continue 1101 try: -> 1102 values = validator(cls_, values) 1103 except (ValueError, TypeError, AssertionError) as exc: 1104 errors.append(ErrorWrapper(exc, loc=ROOT_KEY)) File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/langchain/llms/openai.py:887, in AzureOpenAI.validate_environment(cls, values) 877 values["openai_api_base"] += ( 878 "/deployments/" + values["deployment_name"] 879 ) 880 values["deployment_name"] = None 881 client_params = { 882 "api_version": values["openai_api_version"], 883 "azure_endpoint": values["azure_endpoint"], 884 "azure_deployment": values["deployment_name"], 885 "api_key": values["openai_api_key"], 886 "azure_ad_token": values["azure_ad_token"], --> 887 "azure_ad_token_provider": values["azure_ad_token_provider"], 888 "organization": values["openai_organization"], 889 "base_url": values["openai_api_base"], 890 "timeout": values["request_timeout"], 891 "max_retries": values["max_retries"], 892 "default_headers": values["default_headers"], 893 "default_query": values["default_query"], 894 "http_client": values["http_client"], 895 } 896 values["client"] = openai.AzureOpenAI(**client_params).completions 897 values["async_client"] = openai.AsyncAzureOpenAI( 898 **client_params 899 ).completions KeyError: 'azure_ad_token_provider' ``` Ive also tried AzureChatOpenAI , and I get the same error back. The error is not reproduced when I use openai library AzureOpenAI . 
Also on openai the azure_ad_token_provider has type azure_ad_token_provider: 'AzureADTokenProvider | None' = None while in langchain it has type azure_ad_token_provider: Optional[str] = None which also makes me wonder if it would take as input a different type than string to work with. any ideas on how to fix this? Im actually using Azure Service principal authentication, and if I use as alternative field azure_ad_token = credential.get_token(“https://cognitiveservices.azure.com/.default”).token I get token expired after 60min which does not happen with a bearer token, so It is important to me to make the token_provider work. libraries : pydantic 1.10.12 pydantic_core 2.10.1 openai 1.2.0 langchain 0.0.342 langchain-core 0.0.7 ### Who can help? @hwchase17 @agola11 ### Information - [X] The official example notebooks/scripts - [ ] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction import os from azure.identity import DefaultAzureCredential from azure.identity import get_bearer_token_provider from langchain.llms import AzureOpenAI from langchain.chat_models import AzureChatOpenAI credential = DefaultAzureCredential(interactive_browser_tenant_id=tenant_id, interactive_browser_client_id=client_id, client_secret=client_secret) token_provider = get_bearer_token_provider(credential, "https://cognitiveservices.azure.com/.default") endpoint = "https://xxxx.openai.azure.com" client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) ### Expected behavior client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) should return a Runnable instance which I can use for LLMChain
https://github.com/langchain-ai/langchain/issues/14069
https://github.com/langchain-ai/langchain/pull/14166
9938086df07d69d24f9770209ea9087d3b906155
62505043be20cf8af491e30785a6ca0eeb1d276e
"2023-11-30T13:39:55Z"
python
"2023-12-03T16:55:25Z"
libs/langchain/langchain/chat_models/azure_openai.py
    """`Azure OpenAI` Chat Completion API.

    To use this class you must have a deployed model on Azure OpenAI. Use
    `deployment_name` in the constructor to refer to the "Model deployment name"
    in the Azure portal.

    In addition, you should have the ``openai`` python package installed, and the
    following environment variables set or passed in constructor in lower case:
    - ``AZURE_OPENAI_API_KEY``
    - ``AZURE_OPENAI_API_ENDPOINT``
    - ``AZURE_OPENAI_AD_TOKEN``
    - ``OPENAI_API_VERSION``
    - ``OPENAI_PROXY``

    For example, if you have `gpt-35-turbo` deployed, with the deployment name
    `35-turbo-dev`, the constructor should look like:

    .. code-block:: python

        AzureChatOpenAI(
            azure_deployment="35-turbo-dev",
            openai_api_version="2023-05-15",
        )

    Be aware the API version may change.

    You can also specify the version of the model using ``model_version``
    constructor parameter, as Azure OpenAI doesn't return model version with the
    response.

    Default is empty. When you specify the version, it will be appended to the
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,069
AzureOpenAI azure_ad_token_provider Keyerror
### System Info When I use below snippet of code ``` import os from azure.identity import DefaultAzureCredential from azure.identity import get_bearer_token_provider from langchain.llms import AzureOpenAI from langchain.chat_models import AzureChatOpenAI credential = DefaultAzureCredential(interactive_browser_tenant_id=tenant_id, interactive_browser_client_id=client_id, client_secret=client_secret) token_provider = get_bearer_token_provider(credential, "https://cognitiveservices.azure.com/.default") endpoint = "https://xxxx.openai.azure.com" client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) ``` I get error : ```--------------------------------------------------------------------------- KeyError Traceback (most recent call last) Cell In[36], line 21 18 # api_version = "2023-05-15" 19 endpoint = "https://xxxx.openai.azure.com" ---> 21 client = AzureOpenAI( 22 azure_endpoint=endpoint, 23 api_version="2023-05-15", 24 azure_deployment="example-gpt-4", 25 azure_ad_token_provider=token_provider, 26 ) File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/langchain_core/load/serializable.py:97, in Serializable.__init__(self, **kwargs) 96 def __init__(self, **kwargs: Any) -> None: ---> 97 super().__init__(**kwargs) 98 self._lc_kwargs = kwargs File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/pydantic/v1/main.py:339, in BaseModel.__init__(__pydantic_self__, **data) 333 """ 334 Create a new model by parsing and validating input data from keyword arguments. 335 336 Raises ValidationError if the input data cannot be parsed to form a valid model. 337 """ 338 # Uses something other than `self` the first arg to allow "self" as a settable attribute --> 339 values, fields_set, validation_error = validate_model(__pydantic_self__.__class__, data) 340 if validation_error: 341 raise validation_error File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/pydantic/v1/main.py:1102, in validate_model(model, input_data, cls) 1100 continue 1101 try: -> 1102 values = validator(cls_, values) 1103 except (ValueError, TypeError, AssertionError) as exc: 1104 errors.append(ErrorWrapper(exc, loc=ROOT_KEY)) File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/langchain/llms/openai.py:887, in AzureOpenAI.validate_environment(cls, values) 877 values["openai_api_base"] += ( 878 "/deployments/" + values["deployment_name"] 879 ) 880 values["deployment_name"] = None 881 client_params = { 882 "api_version": values["openai_api_version"], 883 "azure_endpoint": values["azure_endpoint"], 884 "azure_deployment": values["deployment_name"], 885 "api_key": values["openai_api_key"], 886 "azure_ad_token": values["azure_ad_token"], --> 887 "azure_ad_token_provider": values["azure_ad_token_provider"], 888 "organization": values["openai_organization"], 889 "base_url": values["openai_api_base"], 890 "timeout": values["request_timeout"], 891 "max_retries": values["max_retries"], 892 "default_headers": values["default_headers"], 893 "default_query": values["default_query"], 894 "http_client": values["http_client"], 895 } 896 values["client"] = openai.AzureOpenAI(**client_params).completions 897 values["async_client"] = openai.AsyncAzureOpenAI( 898 **client_params 899 ).completions KeyError: 'azure_ad_token_provider' ``` Ive also tried AzureChatOpenAI , and I get the same error back. The error is not reproduced when I use openai library AzureOpenAI . 
Also on openai the azure_ad_token_provider has type azure_ad_token_provider: 'AzureADTokenProvider | None' = None while in langchain it has type azure_ad_token_provider: Optional[str] = None which also makes me wonder if it would take as input a different type than string to work with. any ideas on how to fix this? Im actually using Azure Service principal authentication, and if I use as alternative field azure_ad_token = credential.get_token(“https://cognitiveservices.azure.com/.default”).token I get token expired after 60min which does not happen with a bearer token, so It is important to me to make the token_provider work. libraries : pydantic 1.10.12 pydantic_core 2.10.1 openai 1.2.0 langchain 0.0.342 langchain-core 0.0.7 ### Who can help? @hwchase17 @agola11 ### Information - [X] The official example notebooks/scripts - [ ] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction import os from azure.identity import DefaultAzureCredential from azure.identity import get_bearer_token_provider from langchain.llms import AzureOpenAI from langchain.chat_models import AzureChatOpenAI credential = DefaultAzureCredential(interactive_browser_tenant_id=tenant_id, interactive_browser_client_id=client_id, client_secret=client_secret) token_provider = get_bearer_token_provider(credential, "https://cognitiveservices.azure.com/.default") endpoint = "https://xxxx.openai.azure.com" client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) ### Expected behavior client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) should return a Runnable instance which I can use for LLMChain
https://github.com/langchain-ai/langchain/issues/14069
https://github.com/langchain-ai/langchain/pull/14166
9938086df07d69d24f9770209ea9087d3b906155
62505043be20cf8af491e30785a6ca0eeb1d276e
"2023-11-30T13:39:55Z"
python
"2023-12-03T16:55:25Z"
libs/langchain/langchain/chat_models/azure_openai.py
    model name in the response. Setting correct version will help you to
    calculate the cost properly. Model version is not validated, so make sure
    you set it correctly to get the correct cost.

    Any parameters that are valid to be passed to the openai.create call can be
    passed in, even if not explicitly saved on this class.
    """

    azure_endpoint: Union[str, None] = None
    """Your Azure endpoint, including the resource.

        Automatically inferred from env var `AZURE_OPENAI_ENDPOINT` if not provided.

        Example: `https://example-resource.azure.openai.com/`
    """
    deployment_name: Union[str, None] = Field(default=None, alias="azure_deployment")
    """A model deployment.

        If given sets the base client URL to include `/deployments/{azure_deployment}`.
        Note: this means you won't be able to use non-deployment endpoints.
    """
    openai_api_version: str = Field(default="", alias="api_version")
    """Automatically inferred from env var `OPENAI_API_VERSION` if not provided."""
    openai_api_key: Union[str, None] = Field(default=None, alias="api_key")
    """Automatically inferred from env var `AZURE_OPENAI_API_KEY` if not provided."""
    azure_ad_token: Union[str, None] = None
    """Your Azure Active Directory token.

        Automatically inferred from env var `AZURE_OPENAI_AD_TOKEN` if not provided.

        For more:
        https://www.microsoft.com/en-us/security/business/identity-access/microsoft-entra-id.
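Two things in this record's body are worth unpacking. First, the KeyError in the quoted traceback is an ordinary dictionary lookup on a key that was never written into `values` before `client_params` is assembled. Second, the reporter notes that the openai SDK types the token provider as a zero-argument callable returning a string, not as `Optional[str]`. The sketch below illustrates both points; `build_client_params` and `make_static_token_provider` are hypothetical helpers, not the actual LangChain code path, and real code would normally wrap `azure.identity.get_bearer_token_provider`.

```python
from typing import Any, Callable, Dict, Optional

# Zero-argument callable returning a bearer token string, per the issue body.
AzureADTokenProvider = Callable[[], str]


def make_static_token_provider(token: str) -> AzureADTokenProvider:
    """Illustrative provider that always hands back the same token."""

    def _provider() -> str:
        return token

    return _provider


def build_client_params(values: Dict[str, Any]) -> Dict[str, Any]:
    # A direct values["azure_ad_token_provider"] lookup raises KeyError when the
    # field was never populated; .get() degrades to None and lets the underlying
    # OpenAI client apply its own default instead.
    return {
        "azure_ad_token": values.get("azure_ad_token"),
        "azure_ad_token_provider": values.get("azure_ad_token_provider"),
    }


provider: Optional[AzureADTokenProvider] = make_static_token_provider("example-token")
assert provider is not None and provider() == "example-token"

values: Dict[str, Any] = {"azure_ad_token": None}  # no "azure_ad_token_provider" key
assert build_client_params(values)["azure_ad_token_provider"] is None
```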
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,069
AzureOpenAI azure_ad_token_provider Keyerror
### System Info When I use below snippet of code ``` import os from azure.identity import DefaultAzureCredential from azure.identity import get_bearer_token_provider from langchain.llms import AzureOpenAI from langchain.chat_models import AzureChatOpenAI credential = DefaultAzureCredential(interactive_browser_tenant_id=tenant_id, interactive_browser_client_id=client_id, client_secret=client_secret) token_provider = get_bearer_token_provider(credential, "https://cognitiveservices.azure.com/.default") endpoint = "https://xxxx.openai.azure.com" client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) ``` I get error : ```--------------------------------------------------------------------------- KeyError Traceback (most recent call last) Cell In[36], line 21 18 # api_version = "2023-05-15" 19 endpoint = "https://xxxx.openai.azure.com" ---> 21 client = AzureOpenAI( 22 azure_endpoint=endpoint, 23 api_version="2023-05-15", 24 azure_deployment="example-gpt-4", 25 azure_ad_token_provider=token_provider, 26 ) File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/langchain_core/load/serializable.py:97, in Serializable.__init__(self, **kwargs) 96 def __init__(self, **kwargs: Any) -> None: ---> 97 super().__init__(**kwargs) 98 self._lc_kwargs = kwargs File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/pydantic/v1/main.py:339, in BaseModel.__init__(__pydantic_self__, **data) 333 """ 334 Create a new model by parsing and validating input data from keyword arguments. 335 336 Raises ValidationError if the input data cannot be parsed to form a valid model. 337 """ 338 # Uses something other than `self` the first arg to allow "self" as a settable attribute --> 339 values, fields_set, validation_error = validate_model(__pydantic_self__.__class__, data) 340 if validation_error: 341 raise validation_error File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/pydantic/v1/main.py:1102, in validate_model(model, input_data, cls) 1100 continue 1101 try: -> 1102 values = validator(cls_, values) 1103 except (ValueError, TypeError, AssertionError) as exc: 1104 errors.append(ErrorWrapper(exc, loc=ROOT_KEY)) File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/langchain/llms/openai.py:887, in AzureOpenAI.validate_environment(cls, values) 877 values["openai_api_base"] += ( 878 "/deployments/" + values["deployment_name"] 879 ) 880 values["deployment_name"] = None 881 client_params = { 882 "api_version": values["openai_api_version"], 883 "azure_endpoint": values["azure_endpoint"], 884 "azure_deployment": values["deployment_name"], 885 "api_key": values["openai_api_key"], 886 "azure_ad_token": values["azure_ad_token"], --> 887 "azure_ad_token_provider": values["azure_ad_token_provider"], 888 "organization": values["openai_organization"], 889 "base_url": values["openai_api_base"], 890 "timeout": values["request_timeout"], 891 "max_retries": values["max_retries"], 892 "default_headers": values["default_headers"], 893 "default_query": values["default_query"], 894 "http_client": values["http_client"], 895 } 896 values["client"] = openai.AzureOpenAI(**client_params).completions 897 values["async_client"] = openai.AsyncAzureOpenAI( 898 **client_params 899 ).completions KeyError: 'azure_ad_token_provider' ``` Ive also tried AzureChatOpenAI , and I get the same error back. The error is not reproduced when I use openai library AzureOpenAI . 
Also on openai the azure_ad_token_provider has type azure_ad_token_provider: 'AzureADTokenProvider | None' = None while in langchain it has type azure_ad_token_provider: Optional[str] = None which also makes me wonder if it would take as input a different type than string to work with. any ideas on how to fix this? Im actually using Azure Service principal authentication, and if I use as alternative field azure_ad_token = credential.get_token(“https://cognitiveservices.azure.com/.default”).token I get token expired after 60min which does not happen with a bearer token, so It is important to me to make the token_provider work. libraries : pydantic 1.10.12 pydantic_core 2.10.1 openai 1.2.0 langchain 0.0.342 langchain-core 0.0.7 ### Who can help? @hwchase17 @agola11 ### Information - [X] The official example notebooks/scripts - [ ] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction import os from azure.identity import DefaultAzureCredential from azure.identity import get_bearer_token_provider from langchain.llms import AzureOpenAI from langchain.chat_models import AzureChatOpenAI credential = DefaultAzureCredential(interactive_browser_tenant_id=tenant_id, interactive_browser_client_id=client_id, client_secret=client_secret) token_provider = get_bearer_token_provider(credential, "https://cognitiveservices.azure.com/.default") endpoint = "https://xxxx.openai.azure.com" client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) ### Expected behavior client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) should return a Runnable instance which I can use for LLMChain
https://github.com/langchain-ai/langchain/issues/14069
https://github.com/langchain-ai/langchain/pull/14166
9938086df07d69d24f9770209ea9087d3b906155
62505043be20cf8af491e30785a6ca0eeb1d276e
"2023-11-30T13:39:55Z"
python
"2023-12-03T16:55:25Z"
libs/langchain/langchain/chat_models/azure_openai.py
""" azure_ad_token_provider: Union[str, None] = None """A function that returns an Azure Active Directory token. Will be invoked on every request. """ model_version: str = "" """Legacy, for openai<1.0.0 support.""" openai_api_type: str = "" """Legacy, for openai<1.0.0 support.""" validate_base_url: bool = True """For backwards compatibility. If legacy val openai_api_base is passed in, try to infer if it is a base_url or azure_endpoint and update accordingly. """ @root_validator() def validate_environment(cls, values: Dict) -> Dict: """Validate that api key and python package exists in environment.""" if values["n"] < 1: raise ValueError("n must be at least 1.") if values["n"] > 1 and values["streaming"]: raise ValueError("n must be 1 when streaming.") values["openai_api_key"] = ( values["openai_api_key"] or os.getenv("AZURE_OPENAI_API_KEY") or os.getenv("OPENAI_API_KEY") ) values["openai_api_base"] = values["openai_api_base"] or os.getenv(
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,069
AzureOpenAI azure_ad_token_provider Keyerror
### System Info When I use below snippet of code ``` import os from azure.identity import DefaultAzureCredential from azure.identity import get_bearer_token_provider from langchain.llms import AzureOpenAI from langchain.chat_models import AzureChatOpenAI credential = DefaultAzureCredential(interactive_browser_tenant_id=tenant_id, interactive_browser_client_id=client_id, client_secret=client_secret) token_provider = get_bearer_token_provider(credential, "https://cognitiveservices.azure.com/.default") endpoint = "https://xxxx.openai.azure.com" client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) ``` I get error : ```--------------------------------------------------------------------------- KeyError Traceback (most recent call last) Cell In[36], line 21 18 # api_version = "2023-05-15" 19 endpoint = "https://xxxx.openai.azure.com" ---> 21 client = AzureOpenAI( 22 azure_endpoint=endpoint, 23 api_version="2023-05-15", 24 azure_deployment="example-gpt-4", 25 azure_ad_token_provider=token_provider, 26 ) File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/langchain_core/load/serializable.py:97, in Serializable.__init__(self, **kwargs) 96 def __init__(self, **kwargs: Any) -> None: ---> 97 super().__init__(**kwargs) 98 self._lc_kwargs = kwargs File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/pydantic/v1/main.py:339, in BaseModel.__init__(__pydantic_self__, **data) 333 """ 334 Create a new model by parsing and validating input data from keyword arguments. 335 336 Raises ValidationError if the input data cannot be parsed to form a valid model. 337 """ 338 # Uses something other than `self` the first arg to allow "self" as a settable attribute --> 339 values, fields_set, validation_error = validate_model(__pydantic_self__.__class__, data) 340 if validation_error: 341 raise validation_error File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/pydantic/v1/main.py:1102, in validate_model(model, input_data, cls) 1100 continue 1101 try: -> 1102 values = validator(cls_, values) 1103 except (ValueError, TypeError, AssertionError) as exc: 1104 errors.append(ErrorWrapper(exc, loc=ROOT_KEY)) File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/langchain/llms/openai.py:887, in AzureOpenAI.validate_environment(cls, values) 877 values["openai_api_base"] += ( 878 "/deployments/" + values["deployment_name"] 879 ) 880 values["deployment_name"] = None 881 client_params = { 882 "api_version": values["openai_api_version"], 883 "azure_endpoint": values["azure_endpoint"], 884 "azure_deployment": values["deployment_name"], 885 "api_key": values["openai_api_key"], 886 "azure_ad_token": values["azure_ad_token"], --> 887 "azure_ad_token_provider": values["azure_ad_token_provider"], 888 "organization": values["openai_organization"], 889 "base_url": values["openai_api_base"], 890 "timeout": values["request_timeout"], 891 "max_retries": values["max_retries"], 892 "default_headers": values["default_headers"], 893 "default_query": values["default_query"], 894 "http_client": values["http_client"], 895 } 896 values["client"] = openai.AzureOpenAI(**client_params).completions 897 values["async_client"] = openai.AsyncAzureOpenAI( 898 **client_params 899 ).completions KeyError: 'azure_ad_token_provider' ``` Ive also tried AzureChatOpenAI , and I get the same error back. The error is not reproduced when I use openai library AzureOpenAI . 
Also on openai the azure_ad_token_provider has type azure_ad_token_provider: 'AzureADTokenProvider | None' = None while in langchain it has type azure_ad_token_provider: Optional[str] = None which also makes me wonder if it would take as input a different type than string to work with. any ideas on how to fix this? Im actually using Azure Service principal authentication, and if I use as alternative field azure_ad_token = credential.get_token(“https://cognitiveservices.azure.com/.default”).token I get token expired after 60min which does not happen with a bearer token, so It is important to me to make the token_provider work. libraries : pydantic 1.10.12 pydantic_core 2.10.1 openai 1.2.0 langchain 0.0.342 langchain-core 0.0.7 ### Who can help? @hwchase17 @agola11 ### Information - [X] The official example notebooks/scripts - [ ] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction import os from azure.identity import DefaultAzureCredential from azure.identity import get_bearer_token_provider from langchain.llms import AzureOpenAI from langchain.chat_models import AzureChatOpenAI credential = DefaultAzureCredential(interactive_browser_tenant_id=tenant_id, interactive_browser_client_id=client_id, client_secret=client_secret) token_provider = get_bearer_token_provider(credential, "https://cognitiveservices.azure.com/.default") endpoint = "https://xxxx.openai.azure.com" client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) ### Expected behavior client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) should return a Runnable instance which I can use for LLMChain
https://github.com/langchain-ai/langchain/issues/14069
https://github.com/langchain-ai/langchain/pull/14166
9938086df07d69d24f9770209ea9087d3b906155
62505043be20cf8af491e30785a6ca0eeb1d276e
"2023-11-30T13:39:55Z"
python
"2023-12-03T16:55:25Z"
libs/langchain/langchain/chat_models/azure_openai.py
"OPENAI_API_BASE" ) values["openai_api_version"] = values["openai_api_version"] or os.getenv( "OPENAI_API_VERSION" ) values["openai_organization"] = ( values["openai_organization"] or os.getenv("OPENAI_ORG_ID") or os.getenv("OPENAI_ORGANIZATION") ) values["azure_endpoint"] = values["azure_endpoint"] or os.getenv( "AZURE_OPENAI_ENDPOINT" ) values["azure_ad_token"] = values["azure_ad_token"] or os.getenv( "AZURE_OPENAI_AD_TOKEN" ) values["openai_api_type"] = get_from_dict_or_env( values, "openai_api_type", "OPENAI_API_TYPE", default="azure" ) values["openai_proxy"] = get_from_dict_or_env( values, "openai_proxy", "OPENAI_PROXY", default="" ) try: import openai except ImportError: raise ImportError( "Could not import openai python package. " "Please install it with `pip install openai`." )
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,069
AzureOpenAI azure_ad_token_provider Keyerror
### System Info When I use below snippet of code ``` import os from azure.identity import DefaultAzureCredential from azure.identity import get_bearer_token_provider from langchain.llms import AzureOpenAI from langchain.chat_models import AzureChatOpenAI credential = DefaultAzureCredential(interactive_browser_tenant_id=tenant_id, interactive_browser_client_id=client_id, client_secret=client_secret) token_provider = get_bearer_token_provider(credential, "https://cognitiveservices.azure.com/.default") endpoint = "https://xxxx.openai.azure.com" client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) ``` I get error : ```--------------------------------------------------------------------------- KeyError Traceback (most recent call last) Cell In[36], line 21 18 # api_version = "2023-05-15" 19 endpoint = "https://xxxx.openai.azure.com" ---> 21 client = AzureOpenAI( 22 azure_endpoint=endpoint, 23 api_version="2023-05-15", 24 azure_deployment="example-gpt-4", 25 azure_ad_token_provider=token_provider, 26 ) File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/langchain_core/load/serializable.py:97, in Serializable.__init__(self, **kwargs) 96 def __init__(self, **kwargs: Any) -> None: ---> 97 super().__init__(**kwargs) 98 self._lc_kwargs = kwargs File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/pydantic/v1/main.py:339, in BaseModel.__init__(__pydantic_self__, **data) 333 """ 334 Create a new model by parsing and validating input data from keyword arguments. 335 336 Raises ValidationError if the input data cannot be parsed to form a valid model. 337 """ 338 # Uses something other than `self` the first arg to allow "self" as a settable attribute --> 339 values, fields_set, validation_error = validate_model(__pydantic_self__.__class__, data) 340 if validation_error: 341 raise validation_error File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/pydantic/v1/main.py:1102, in validate_model(model, input_data, cls) 1100 continue 1101 try: -> 1102 values = validator(cls_, values) 1103 except (ValueError, TypeError, AssertionError) as exc: 1104 errors.append(ErrorWrapper(exc, loc=ROOT_KEY)) File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/langchain/llms/openai.py:887, in AzureOpenAI.validate_environment(cls, values) 877 values["openai_api_base"] += ( 878 "/deployments/" + values["deployment_name"] 879 ) 880 values["deployment_name"] = None 881 client_params = { 882 "api_version": values["openai_api_version"], 883 "azure_endpoint": values["azure_endpoint"], 884 "azure_deployment": values["deployment_name"], 885 "api_key": values["openai_api_key"], 886 "azure_ad_token": values["azure_ad_token"], --> 887 "azure_ad_token_provider": values["azure_ad_token_provider"], 888 "organization": values["openai_organization"], 889 "base_url": values["openai_api_base"], 890 "timeout": values["request_timeout"], 891 "max_retries": values["max_retries"], 892 "default_headers": values["default_headers"], 893 "default_query": values["default_query"], 894 "http_client": values["http_client"], 895 } 896 values["client"] = openai.AzureOpenAI(**client_params).completions 897 values["async_client"] = openai.AsyncAzureOpenAI( 898 **client_params 899 ).completions KeyError: 'azure_ad_token_provider' ``` Ive also tried AzureChatOpenAI , and I get the same error back. The error is not reproduced when I use openai library AzureOpenAI . 
Also on openai the azure_ad_token_provider has type azure_ad_token_provider: 'AzureADTokenProvider | None' = None while in langchain it has type azure_ad_token_provider: Optional[str] = None which also makes me wonder if it would take as input a different type than string to work with. any ideas on how to fix this? Im actually using Azure Service principal authentication, and if I use as alternative field azure_ad_token = credential.get_token(“https://cognitiveservices.azure.com/.default”).token I get token expired after 60min which does not happen with a bearer token, so It is important to me to make the token_provider work. libraries : pydantic 1.10.12 pydantic_core 2.10.1 openai 1.2.0 langchain 0.0.342 langchain-core 0.0.7 ### Who can help? @hwchase17 @agola11 ### Information - [X] The official example notebooks/scripts - [ ] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction import os from azure.identity import DefaultAzureCredential from azure.identity import get_bearer_token_provider from langchain.llms import AzureOpenAI from langchain.chat_models import AzureChatOpenAI credential = DefaultAzureCredential(interactive_browser_tenant_id=tenant_id, interactive_browser_client_id=client_id, client_secret=client_secret) token_provider = get_bearer_token_provider(credential, "https://cognitiveservices.azure.com/.default") endpoint = "https://xxxx.openai.azure.com" client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) ### Expected behavior client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) should return a Runnable instance which I can use for LLMChain
https://github.com/langchain-ai/langchain/issues/14069
https://github.com/langchain-ai/langchain/pull/14166
9938086df07d69d24f9770209ea9087d3b906155
62505043be20cf8af491e30785a6ca0eeb1d276e
"2023-11-30T13:39:55Z"
python
"2023-12-03T16:55:25Z"
libs/langchain/langchain/chat_models/azure_openai.py
if is_openai_v1(): openai_api_base = values["openai_api_base"] if openai_api_base and values["validate_base_url"]: if "/openai" not in openai_api_base: values["openai_api_base"] = ( values["openai_api_base"].rstrip("/") + "/openai" ) warnings.warn( "As of openai>=1.0.0, Azure endpoints should be specified via " f"the `azure_endpoint` param not `openai_api_base` " f"(or alias `base_url`). Updating `openai_api_base` from " f"{openai_api_base} to {values['openai_api_base']}." ) if values["deployment_name"]: warnings.warn( "As of openai>=1.0.0, if `deployment_name` (or alias " "`azure_deployment`) is specified then " "`openai_api_base` (or alias `base_url`) should not be. " "Instead use `deployment_name` (or alias `azure_deployment`) " "and `azure_endpoint`." ) if values["deployment_name"] not in values["openai_api_base"]: warnings.warn( "As of openai>=1.0.0, if `openai_api_base` " "(or alias `base_url`) is specified it is expected to be " "of the form " "https://example-resource.azure.openai.com/openai/deployments/example-deployment. " f"Updating {openai_api_base} to "
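The branch above migrates legacy openai_api_base configurations to the openai>=1.0.0 style. A sketch of the two ways to target the same deployment, using placeholder endpoint, key, and deployment values:
```
from langchain.chat_models import AzureChatOpenAI

# Preferred with openai>=1.0.0: give the resource endpoint and deployment separately.
llm = AzureChatOpenAI(
    azure_endpoint="https://my-resource.openai.azure.com",
    azure_deployment="example-gpt-4",
    openai_api_version="2023-05-15",
    openai_api_key="my-api-key",
)

# Legacy style kept for backwards compatibility: a fully assembled base URL.
legacy_llm = AzureChatOpenAI(
    openai_api_base="https://my-resource.openai.azure.com/openai/deployments/example-gpt-4",
    openai_api_version="2023-05-15",
    openai_api_key="my-api-key",
)
```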
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,069
AzureOpenAI azure_ad_token_provider Keyerror
### System Info When I use below snippet of code ``` import os from azure.identity import DefaultAzureCredential from azure.identity import get_bearer_token_provider from langchain.llms import AzureOpenAI from langchain.chat_models import AzureChatOpenAI credential = DefaultAzureCredential(interactive_browser_tenant_id=tenant_id, interactive_browser_client_id=client_id, client_secret=client_secret) token_provider = get_bearer_token_provider(credential, "https://cognitiveservices.azure.com/.default") endpoint = "https://xxxx.openai.azure.com" client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) ``` I get error : ```--------------------------------------------------------------------------- KeyError Traceback (most recent call last) Cell In[36], line 21 18 # api_version = "2023-05-15" 19 endpoint = "https://xxxx.openai.azure.com" ---> 21 client = AzureOpenAI( 22 azure_endpoint=endpoint, 23 api_version="2023-05-15", 24 azure_deployment="example-gpt-4", 25 azure_ad_token_provider=token_provider, 26 ) File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/langchain_core/load/serializable.py:97, in Serializable.__init__(self, **kwargs) 96 def __init__(self, **kwargs: Any) -> None: ---> 97 super().__init__(**kwargs) 98 self._lc_kwargs = kwargs File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/pydantic/v1/main.py:339, in BaseModel.__init__(__pydantic_self__, **data) 333 """ 334 Create a new model by parsing and validating input data from keyword arguments. 335 336 Raises ValidationError if the input data cannot be parsed to form a valid model. 337 """ 338 # Uses something other than `self` the first arg to allow "self" as a settable attribute --> 339 values, fields_set, validation_error = validate_model(__pydantic_self__.__class__, data) 340 if validation_error: 341 raise validation_error File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/pydantic/v1/main.py:1102, in validate_model(model, input_data, cls) 1100 continue 1101 try: -> 1102 values = validator(cls_, values) 1103 except (ValueError, TypeError, AssertionError) as exc: 1104 errors.append(ErrorWrapper(exc, loc=ROOT_KEY)) File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/langchain/llms/openai.py:887, in AzureOpenAI.validate_environment(cls, values) 877 values["openai_api_base"] += ( 878 "/deployments/" + values["deployment_name"] 879 ) 880 values["deployment_name"] = None 881 client_params = { 882 "api_version": values["openai_api_version"], 883 "azure_endpoint": values["azure_endpoint"], 884 "azure_deployment": values["deployment_name"], 885 "api_key": values["openai_api_key"], 886 "azure_ad_token": values["azure_ad_token"], --> 887 "azure_ad_token_provider": values["azure_ad_token_provider"], 888 "organization": values["openai_organization"], 889 "base_url": values["openai_api_base"], 890 "timeout": values["request_timeout"], 891 "max_retries": values["max_retries"], 892 "default_headers": values["default_headers"], 893 "default_query": values["default_query"], 894 "http_client": values["http_client"], 895 } 896 values["client"] = openai.AzureOpenAI(**client_params).completions 897 values["async_client"] = openai.AsyncAzureOpenAI( 898 **client_params 899 ).completions KeyError: 'azure_ad_token_provider' ``` Ive also tried AzureChatOpenAI , and I get the same error back. The error is not reproduced when I use openai library AzureOpenAI . 
Also on openai the azure_ad_token_provider has type azure_ad_token_provider: 'AzureADTokenProvider | None' = None while in langchain it has type azure_ad_token_provider: Optional[str] = None which also makes me wonder if it would take as input a different type than string to work with. any ideas on how to fix this? Im actually using Azure Service principal authentication, and if I use as alternative field azure_ad_token = credential.get_token(“https://cognitiveservices.azure.com/.default”).token I get token expired after 60min which does not happen with a bearer token, so It is important to me to make the token_provider work. libraries : pydantic 1.10.12 pydantic_core 2.10.1 openai 1.2.0 langchain 0.0.342 langchain-core 0.0.7 ### Who can help? @hwchase17 @agola11 ### Information - [X] The official example notebooks/scripts - [ ] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction import os from azure.identity import DefaultAzureCredential from azure.identity import get_bearer_token_provider from langchain.llms import AzureOpenAI from langchain.chat_models import AzureChatOpenAI credential = DefaultAzureCredential(interactive_browser_tenant_id=tenant_id, interactive_browser_client_id=client_id, client_secret=client_secret) token_provider = get_bearer_token_provider(credential, "https://cognitiveservices.azure.com/.default") endpoint = "https://xxxx.openai.azure.com" client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) ### Expected behavior client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) should return a Runnable instance which I can use for LLMChain
https://github.com/langchain-ai/langchain/issues/14069
https://github.com/langchain-ai/langchain/pull/14166
9938086df07d69d24f9770209ea9087d3b906155
62505043be20cf8af491e30785a6ca0eeb1d276e
"2023-11-30T13:39:55Z"
python
"2023-12-03T16:55:25Z"
libs/langchain/langchain/chat_models/azure_openai.py
f"{values['openai_api_base']}." ) values["openai_api_base"] += ( "/deployments/" + values["deployment_name"] ) values["deployment_name"] = None client_params = { "api_version": values["openai_api_version"], "azure_endpoint": values["azure_endpoint"], "azure_deployment": values["deployment_name"], "api_key": values["openai_api_key"], "azure_ad_token": values["azure_ad_token"], "azure_ad_token_provider": values["azure_ad_token_provider"], "organization": values["openai_organization"], "base_url": values["openai_api_base"], "timeout": values["request_timeout"], "max_retries": values["max_retries"], "default_headers": values["default_headers"], "default_query": values["default_query"], "http_client": values["http_client"], } values["client"] = openai.AzureOpenAI(**client_params).chat.completions values["async_client"] = openai.AsyncAzureOpenAI( **client_params ).chat.completions else: values["client"] = openai.ChatCompletion return values @property def _default_params(self) -> Dict[str, Any]:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,069
AzureOpenAI azure_ad_token_provider Keyerror
### System Info When I use below snippet of code ``` import os from azure.identity import DefaultAzureCredential from azure.identity import get_bearer_token_provider from langchain.llms import AzureOpenAI from langchain.chat_models import AzureChatOpenAI credential = DefaultAzureCredential(interactive_browser_tenant_id=tenant_id, interactive_browser_client_id=client_id, client_secret=client_secret) token_provider = get_bearer_token_provider(credential, "https://cognitiveservices.azure.com/.default") endpoint = "https://xxxx.openai.azure.com" client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) ``` I get error : ```--------------------------------------------------------------------------- KeyError Traceback (most recent call last) Cell In[36], line 21 18 # api_version = "2023-05-15" 19 endpoint = "https://xxxx.openai.azure.com" ---> 21 client = AzureOpenAI( 22 azure_endpoint=endpoint, 23 api_version="2023-05-15", 24 azure_deployment="example-gpt-4", 25 azure_ad_token_provider=token_provider, 26 ) File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/langchain_core/load/serializable.py:97, in Serializable.__init__(self, **kwargs) 96 def __init__(self, **kwargs: Any) -> None: ---> 97 super().__init__(**kwargs) 98 self._lc_kwargs = kwargs File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/pydantic/v1/main.py:339, in BaseModel.__init__(__pydantic_self__, **data) 333 """ 334 Create a new model by parsing and validating input data from keyword arguments. 335 336 Raises ValidationError if the input data cannot be parsed to form a valid model. 337 """ 338 # Uses something other than `self` the first arg to allow "self" as a settable attribute --> 339 values, fields_set, validation_error = validate_model(__pydantic_self__.__class__, data) 340 if validation_error: 341 raise validation_error File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/pydantic/v1/main.py:1102, in validate_model(model, input_data, cls) 1100 continue 1101 try: -> 1102 values = validator(cls_, values) 1103 except (ValueError, TypeError, AssertionError) as exc: 1104 errors.append(ErrorWrapper(exc, loc=ROOT_KEY)) File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/langchain/llms/openai.py:887, in AzureOpenAI.validate_environment(cls, values) 877 values["openai_api_base"] += ( 878 "/deployments/" + values["deployment_name"] 879 ) 880 values["deployment_name"] = None 881 client_params = { 882 "api_version": values["openai_api_version"], 883 "azure_endpoint": values["azure_endpoint"], 884 "azure_deployment": values["deployment_name"], 885 "api_key": values["openai_api_key"], 886 "azure_ad_token": values["azure_ad_token"], --> 887 "azure_ad_token_provider": values["azure_ad_token_provider"], 888 "organization": values["openai_organization"], 889 "base_url": values["openai_api_base"], 890 "timeout": values["request_timeout"], 891 "max_retries": values["max_retries"], 892 "default_headers": values["default_headers"], 893 "default_query": values["default_query"], 894 "http_client": values["http_client"], 895 } 896 values["client"] = openai.AzureOpenAI(**client_params).completions 897 values["async_client"] = openai.AsyncAzureOpenAI( 898 **client_params 899 ).completions KeyError: 'azure_ad_token_provider' ``` Ive also tried AzureChatOpenAI , and I get the same error back. The error is not reproduced when I use openai library AzureOpenAI . 
Also on openai the azure_ad_token_provider has type azure_ad_token_provider: 'AzureADTokenProvider | None' = None while in langchain it has type azure_ad_token_provider: Optional[str] = None which also makes me wonder if it would take as input a different type than string to work with. any ideas on how to fix this? Im actually using Azure Service principal authentication, and if I use as alternative field azure_ad_token = credential.get_token(“https://cognitiveservices.azure.com/.default”).token I get token expired after 60min which does not happen with a bearer token, so It is important to me to make the token_provider work. libraries : pydantic 1.10.12 pydantic_core 2.10.1 openai 1.2.0 langchain 0.0.342 langchain-core 0.0.7 ### Who can help? @hwchase17 @agola11 ### Information - [X] The official example notebooks/scripts - [ ] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction import os from azure.identity import DefaultAzureCredential from azure.identity import get_bearer_token_provider from langchain.llms import AzureOpenAI from langchain.chat_models import AzureChatOpenAI credential = DefaultAzureCredential(interactive_browser_tenant_id=tenant_id, interactive_browser_client_id=client_id, client_secret=client_secret) token_provider = get_bearer_token_provider(credential, "https://cognitiveservices.azure.com/.default") endpoint = "https://xxxx.openai.azure.com" client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) ### Expected behavior client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) should return a Runnable instance which I can use for LLMChain
https://github.com/langchain-ai/langchain/issues/14069
https://github.com/langchain-ai/langchain/pull/14166
9938086df07d69d24f9770209ea9087d3b906155
62505043be20cf8af491e30785a6ca0eeb1d276e
"2023-11-30T13:39:55Z"
python
"2023-12-03T16:55:25Z"
libs/langchain/langchain/chat_models/azure_openai.py
"""Get the default parameters for calling OpenAI API.""" if is_openai_v1(): return super()._default_params else: return { **super()._default_params, "engine": self.deployment_name, } @property def _identifying_params(self) -> Dict[str, Any]: """Get the identifying parameters.""" return {**self._default_params} @property def _client_params(self) -> Dict[str, Any]: """Get the config params used for the openai client.""" if is_openai_v1(): return super()._client_params else: return { **super()._client_params, "api_type": self.openai_api_type, "api_version": self.openai_api_version, } @property def _llm_type(self) -> str:
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,069
AzureOpenAI azure_ad_token_provider Keyerror
### System Info When I use below snippet of code ``` import os from azure.identity import DefaultAzureCredential from azure.identity import get_bearer_token_provider from langchain.llms import AzureOpenAI from langchain.chat_models import AzureChatOpenAI credential = DefaultAzureCredential(interactive_browser_tenant_id=tenant_id, interactive_browser_client_id=client_id, client_secret=client_secret) token_provider = get_bearer_token_provider(credential, "https://cognitiveservices.azure.com/.default") endpoint = "https://xxxx.openai.azure.com" client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) ``` I get error : ```--------------------------------------------------------------------------- KeyError Traceback (most recent call last) Cell In[36], line 21 18 # api_version = "2023-05-15" 19 endpoint = "https://xxxx.openai.azure.com" ---> 21 client = AzureOpenAI( 22 azure_endpoint=endpoint, 23 api_version="2023-05-15", 24 azure_deployment="example-gpt-4", 25 azure_ad_token_provider=token_provider, 26 ) File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/langchain_core/load/serializable.py:97, in Serializable.__init__(self, **kwargs) 96 def __init__(self, **kwargs: Any) -> None: ---> 97 super().__init__(**kwargs) 98 self._lc_kwargs = kwargs File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/pydantic/v1/main.py:339, in BaseModel.__init__(__pydantic_self__, **data) 333 """ 334 Create a new model by parsing and validating input data from keyword arguments. 335 336 Raises ValidationError if the input data cannot be parsed to form a valid model. 337 """ 338 # Uses something other than `self` the first arg to allow "self" as a settable attribute --> 339 values, fields_set, validation_error = validate_model(__pydantic_self__.__class__, data) 340 if validation_error: 341 raise validation_error File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/pydantic/v1/main.py:1102, in validate_model(model, input_data, cls) 1100 continue 1101 try: -> 1102 values = validator(cls_, values) 1103 except (ValueError, TypeError, AssertionError) as exc: 1104 errors.append(ErrorWrapper(exc, loc=ROOT_KEY)) File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/langchain/llms/openai.py:887, in AzureOpenAI.validate_environment(cls, values) 877 values["openai_api_base"] += ( 878 "/deployments/" + values["deployment_name"] 879 ) 880 values["deployment_name"] = None 881 client_params = { 882 "api_version": values["openai_api_version"], 883 "azure_endpoint": values["azure_endpoint"], 884 "azure_deployment": values["deployment_name"], 885 "api_key": values["openai_api_key"], 886 "azure_ad_token": values["azure_ad_token"], --> 887 "azure_ad_token_provider": values["azure_ad_token_provider"], 888 "organization": values["openai_organization"], 889 "base_url": values["openai_api_base"], 890 "timeout": values["request_timeout"], 891 "max_retries": values["max_retries"], 892 "default_headers": values["default_headers"], 893 "default_query": values["default_query"], 894 "http_client": values["http_client"], 895 } 896 values["client"] = openai.AzureOpenAI(**client_params).completions 897 values["async_client"] = openai.AsyncAzureOpenAI( 898 **client_params 899 ).completions KeyError: 'azure_ad_token_provider' ``` Ive also tried AzureChatOpenAI , and I get the same error back. The error is not reproduced when I use openai library AzureOpenAI . 
Also on openai the azure_ad_token_provider has type azure_ad_token_provider: 'AzureADTokenProvider | None' = None while in langchain it has type azure_ad_token_provider: Optional[str] = None which also makes me wonder if it would take as input a different type than string to work with. any ideas on how to fix this? Im actually using Azure Service principal authentication, and if I use as alternative field azure_ad_token = credential.get_token(“https://cognitiveservices.azure.com/.default”).token I get token expired after 60min which does not happen with a bearer token, so It is important to me to make the token_provider work. libraries : pydantic 1.10.12 pydantic_core 2.10.1 openai 1.2.0 langchain 0.0.342 langchain-core 0.0.7 ### Who can help? @hwchase17 @agola11 ### Information - [X] The official example notebooks/scripts - [ ] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction import os from azure.identity import DefaultAzureCredential from azure.identity import get_bearer_token_provider from langchain.llms import AzureOpenAI from langchain.chat_models import AzureChatOpenAI credential = DefaultAzureCredential(interactive_browser_tenant_id=tenant_id, interactive_browser_client_id=client_id, client_secret=client_secret) token_provider = get_bearer_token_provider(credential, "https://cognitiveservices.azure.com/.default") endpoint = "https://xxxx.openai.azure.com" client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) ### Expected behavior client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) should return a Runnable instance which I can use for LLMChain
https://github.com/langchain-ai/langchain/issues/14069
https://github.com/langchain-ai/langchain/pull/14166
9938086df07d69d24f9770209ea9087d3b906155
62505043be20cf8af491e30785a6ca0eeb1d276e
"2023-11-30T13:39:55Z"
python
"2023-12-03T16:55:25Z"
libs/langchain/langchain/chat_models/azure_openai.py
return "azure-openai-chat" @property def lc_attributes(self) -> Dict[str, Any]: return { "openai_api_type": self.openai_api_type, "openai_api_version": self.openai_api_version, } def _create_chat_result(self, response: Union[dict, BaseModel]) -> ChatResult: if not isinstance(response, dict): response = response.dict() for res in response["choices"]: if res.get("finish_reason", None) == "content_filter": raise ValueError( "Azure has not provided the response due to a content filter " "being triggered" ) chat_result = super()._create_chat_result(response) if "model" in response: model = response["model"] if self.model_version: model = f"{model}-{self.model_version}" if chat_result.llm_output is not None and isinstance( chat_result.llm_output, dict ): chat_result.llm_output["model_name"] = model return chat_result
closed
langchain-ai/langchain
https://github.com/langchain-ai/langchain
14,069
AzureOpenAI azure_ad_token_provider Keyerror
### System Info When I use below snippet of code ``` import os from azure.identity import DefaultAzureCredential from azure.identity import get_bearer_token_provider from langchain.llms import AzureOpenAI from langchain.chat_models import AzureChatOpenAI credential = DefaultAzureCredential(interactive_browser_tenant_id=tenant_id, interactive_browser_client_id=client_id, client_secret=client_secret) token_provider = get_bearer_token_provider(credential, "https://cognitiveservices.azure.com/.default") endpoint = "https://xxxx.openai.azure.com" client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) ``` I get error : ```--------------------------------------------------------------------------- KeyError Traceback (most recent call last) Cell In[36], line 21 18 # api_version = "2023-05-15" 19 endpoint = "https://xxxx.openai.azure.com" ---> 21 client = AzureOpenAI( 22 azure_endpoint=endpoint, 23 api_version="2023-05-15", 24 azure_deployment="example-gpt-4", 25 azure_ad_token_provider=token_provider, 26 ) File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/langchain_core/load/serializable.py:97, in Serializable.__init__(self, **kwargs) 96 def __init__(self, **kwargs: Any) -> None: ---> 97 super().__init__(**kwargs) 98 self._lc_kwargs = kwargs File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/pydantic/v1/main.py:339, in BaseModel.__init__(__pydantic_self__, **data) 333 """ 334 Create a new model by parsing and validating input data from keyword arguments. 335 336 Raises ValidationError if the input data cannot be parsed to form a valid model. 337 """ 338 # Uses something other than `self` the first arg to allow "self" as a settable attribute --> 339 values, fields_set, validation_error = validate_model(__pydantic_self__.__class__, data) 340 if validation_error: 341 raise validation_error File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/pydantic/v1/main.py:1102, in validate_model(model, input_data, cls) 1100 continue 1101 try: -> 1102 values = validator(cls_, values) 1103 except (ValueError, TypeError, AssertionError) as exc: 1104 errors.append(ErrorWrapper(exc, loc=ROOT_KEY)) File ~/PycharmProjects/aicc/env/lib/python3.9/site-packages/langchain/llms/openai.py:887, in AzureOpenAI.validate_environment(cls, values) 877 values["openai_api_base"] += ( 878 "/deployments/" + values["deployment_name"] 879 ) 880 values["deployment_name"] = None 881 client_params = { 882 "api_version": values["openai_api_version"], 883 "azure_endpoint": values["azure_endpoint"], 884 "azure_deployment": values["deployment_name"], 885 "api_key": values["openai_api_key"], 886 "azure_ad_token": values["azure_ad_token"], --> 887 "azure_ad_token_provider": values["azure_ad_token_provider"], 888 "organization": values["openai_organization"], 889 "base_url": values["openai_api_base"], 890 "timeout": values["request_timeout"], 891 "max_retries": values["max_retries"], 892 "default_headers": values["default_headers"], 893 "default_query": values["default_query"], 894 "http_client": values["http_client"], 895 } 896 values["client"] = openai.AzureOpenAI(**client_params).completions 897 values["async_client"] = openai.AsyncAzureOpenAI( 898 **client_params 899 ).completions KeyError: 'azure_ad_token_provider' ``` Ive also tried AzureChatOpenAI , and I get the same error back. The error is not reproduced when I use openai library AzureOpenAI . 
Also on openai the azure_ad_token_provider has type azure_ad_token_provider: 'AzureADTokenProvider | None' = None while in langchain it has type azure_ad_token_provider: Optional[str] = None which also makes me wonder if it would take as input a different type than string to work with. any ideas on how to fix this? Im actually using Azure Service principal authentication, and if I use as alternative field azure_ad_token = credential.get_token(“https://cognitiveservices.azure.com/.default”).token I get token expired after 60min which does not happen with a bearer token, so It is important to me to make the token_provider work. libraries : pydantic 1.10.12 pydantic_core 2.10.1 openai 1.2.0 langchain 0.0.342 langchain-core 0.0.7 ### Who can help? @hwchase17 @agola11 ### Information - [X] The official example notebooks/scripts - [ ] My own modified scripts ### Related Components - [X] LLMs/Chat Models - [ ] Embedding Models - [ ] Prompts / Prompt Templates / Prompt Selectors - [ ] Output Parsers - [ ] Document Loaders - [ ] Vector Stores / Retrievers - [ ] Memory - [ ] Agents / Agent Executors - [ ] Tools / Toolkits - [ ] Chains - [ ] Callbacks/Tracing - [ ] Async ### Reproduction import os from azure.identity import DefaultAzureCredential from azure.identity import get_bearer_token_provider from langchain.llms import AzureOpenAI from langchain.chat_models import AzureChatOpenAI credential = DefaultAzureCredential(interactive_browser_tenant_id=tenant_id, interactive_browser_client_id=client_id, client_secret=client_secret) token_provider = get_bearer_token_provider(credential, "https://cognitiveservices.azure.com/.default") endpoint = "https://xxxx.openai.azure.com" client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) ### Expected behavior client = AzureOpenAI( azure_endpoint=endpoint, api_version="2023-05-15", azure_deployment="example-gpt-4", azure_ad_token_provider=token_provider) should return a Runnable instance which I can use for LLMChain
https://github.com/langchain-ai/langchain/issues/14069
https://github.com/langchain-ai/langchain/pull/14166
9938086df07d69d24f9770209ea9087d3b906155
62505043be20cf8af491e30785a6ca0eeb1d276e
"2023-11-30T13:39:55Z"
python
"2023-12-03T16:55:25Z"
libs/langchain/langchain/llms/openai.py
from __future__ import annotations import logging import os import sys