# String-in, string-out LLMs
:::tip
You are probably looking for the [Chat Model Concept Guide](/docs/concepts/chat_models) page instead.
:::
LangChain has implementations for older language models that take a string as input and return a string as output. These models are typically named without the "Chat" prefix (e.g., `Ollama`, `Anthropic`, `OpenAI`, etc.), and may include the "LLM" suffix (e.g., `OllamaLLM`, `AnthropicLLM`, `OpenAILLM`, etc.). These models implement the [BaseLLM](https://python.langchain.com/api_reference/core/language_models/langchain_core.language_models.llms.BaseLLM.html#langchain_core.language_models.llms.BaseLLM) interface.
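
For example, an old-style LLM is invoked with a plain string and returns a plain string. The sketch below is illustrative only; it assumes the `langchain-ollama` package is installed and a local Ollama server is serving a `llama3` model.

```python
from langchain_ollama import OllamaLLM

# A string-in, string-out model (note the "LLM" suffix, no "Chat" prefix).
llm = OllamaLLM(model="llama3")

# `invoke` accepts a prompt string and returns the completion as a string.
completion = llm.invoke("The first person to walk on the moon was")
print(type(completion))  # <class 'str'>
print(completion)
```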
Users should almost exclusively use the newer [Chat Models](/docs/concepts/chat_models), as most
model providers have adopted a chat-like interface for interacting with language models.
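
For comparison, a chat model takes a list of messages and returns a message object rather than a raw string. A minimal sketch, again assuming `langchain-ollama` and the same local `llama3` model:

```python
from langchain_ollama import ChatOllama

# A chat model: input is a list of (role, content) messages,
# output is an AIMessage rather than a plain string.
chat = ChatOllama(model="llama3")

response = chat.invoke([("human", "Who was the first person to walk on the moon?")])
print(type(response))    # <class 'langchain_core.messages.ai.AIMessage'>
print(response.content)  # the text of the reply
```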