langchain.memory.summary_buffer.ConversationSummaryBufferMemory¶
clear() → None[source]¶
Clear memory contents.
classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) → Model¶
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data.
Default values are respected, but no other validation is performed.
Behaves as if Config.extra = ‘allow’ was set since it adds all passed values
copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) → Model¶
Duplicate a model, optionally choose which fields to include, exclude and change.
Parameters
include – fields to include in new model
exclude – fields to exclude from new model, as with values this takes precedence over include
update – values to change/add in the new model. Note: the data is not validated before creating
the new model: you should trust this data
deep – set to True to make a deep copy of the model
Returns
new model instance
dict(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) → DictStrAny¶
Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
classmethod from_orm(obj: Any) → Model¶
classmethod get_lc_namespace() → List[str]¶
Get the namespace of the langchain object.
For example, if the class is langchain.llms.openai.OpenAI, then the
namespace is [“langchain”, “llms”, “openai”]
classmethod is_lc_serializable() → bool¶
Is this class serializable?
json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) → unicode¶
Generate a JSON representation of the model, include and exclude arguments as per dict().
encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps().
classmethod lc_id() → List[str]¶
A unique identifier for this class for serialization purposes.
The unique identifier is a list of strings that describes the path
to the object.
load_memory_variables(inputs: Dict[str, Any]) → Dict[str, Any][source]¶
Return history buffer.
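load_memory_variables returns the pruned recent buffer, with the running summary prepended as a system-style message when one exists. A toy sketch of that shape (tuples stand in for LangChain message objects; this is an illustration, not the actual implementation):

```python
def load_memory_variables(summary: str, buffer: list, memory_key: str = "history"):
    """Prepend the running summary (if any) to the recent message buffer."""
    messages = ([("system", summary)] if summary else []) + buffer
    return {memory_key: messages}

out = load_memory_variables(
    "Earlier: user asked about pricing.",
    [("human", "and shipping?"), ("ai", "Free over $50.")],
)
# out["history"] starts with the summary message, then the recent turns.
```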
classmethod parse_file(path: Union[str, Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
classmethod parse_obj(obj: Any) → Model¶
classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
predict_new_summary(messages: List[BaseMessage], existing_summary: str) → str¶
prune() → None[source]¶
Prune the buffer if it exceeds the max token limit.
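The pruning policy above pops the oldest messages until the buffer fits under the token limit, then folds the popped messages into the running summary. A minimal pure-Python sketch of that policy (not the LangChain implementation; token counting and summarization are stubbed with toy stand-ins):

```python
def prune(buffer, summary, max_token_limit, count_tokens, summarize):
    """Pop oldest messages until the buffer fits, folding them into the summary."""
    pruned = []
    while buffer and sum(count_tokens(m) for m in buffer) > max_token_limit:
        pruned.append(buffer.pop(0))  # drop the oldest message first
    if pruned:
        summary = summarize(pruned, summary)  # stand-in for the LLM summarizer
    return buffer, summary

# Toy stand-ins: 1 token per word; "summarization" just concatenates texts.
count = lambda m: len(m.split())
summ = lambda msgs, s: (s + " | " + "; ".join(msgs)).strip(" |")

buf = ["hello there friend", "how are you", "fine thanks"]
buf, s = prune(buf, "", max_token_limit=6, count_tokens=count, summarize=summ)
# Only the most recent messages that fit within the limit remain in buf.
```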
save_context(inputs: Dict[str, Any], outputs: Dict[str, str]) → None[source]¶
Save context from this conversation to buffer.
classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') → DictStrAny¶
classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) → unicode¶
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
classmethod update_forward_refs(**localns: Any) → None¶
Try to update ForwardRefs on fields based on this Model, globalns and localns.
classmethod validate(value: Any) → Model¶
property buffer: List[langchain.schema.messages.BaseMessage]¶
property lc_attributes: Dict¶
List of attribute names that should be included in the serialized kwargs.
These attributes must be accepted by the constructor.
property lc_secrets: Dict[str, str]¶
A map of constructor argument names to secret ids.
For example,{“openai_api_key”: “OPENAI_API_KEY”}
Examples using ConversationSummaryBufferMemory¶
Set env var OPENAI_API_KEY or load from a .env file:
Conversation Summary Buffer
langchain.memory.chat_message_histories.file.FileChatMessageHistory¶
class langchain.memory.chat_message_histories.file.FileChatMessageHistory(file_path: str)[source]¶
Chat message history that stores history in a local file.
Parameters
file_path – path of the local file to store the messages.
Attributes
messages
Retrieve the messages from the local file
Methods
__init__(file_path)
add_ai_message(message)
Convenience method for adding an AI message string to the store.
add_message(message)
Append the message to the record in the local file
add_user_message(message)
Convenience method for adding a human message string to the store.
clear()
Clear session memory from the local file
__init__(file_path: str)[source]¶
add_ai_message(message: str) → None¶
Convenience method for adding an AI message string to the store.
Parameters
message – The string contents of an AI message.
add_message(message: BaseMessage) → None[source]¶
Append the message to the record in the local file
add_user_message(message: str) → None¶
Convenience method for adding a human message string to the store.
Parameters
message – The string contents of a human message.
clear() → None[source]¶
Clear session memory from the local file
Examples using FileChatMessageHistory¶
AutoGPT
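The class above amounts to a JSON file of serialized messages that is re-read and rewritten on each operation. A minimal sketch of the same pattern in plain Python (dicts stand in for BaseMessage objects; the real class serializes LangChain message schemas):

```python
import json
import os
import tempfile
from pathlib import Path

class FileHistory:
    """Chat history persisted to a local JSON file (sketch of the pattern)."""
    def __init__(self, file_path: str):
        self.path = Path(file_path)
        if not self.path.exists():
            self.path.write_text("[]")

    @property
    def messages(self):
        # Re-read the file on every access, like the documented `messages` attribute.
        return json.loads(self.path.read_text())

    def add_message(self, role: str, content: str) -> None:
        msgs = self.messages
        msgs.append({"type": role, "content": content})
        self.path.write_text(json.dumps(msgs))

    def add_user_message(self, content: str) -> None:
        self.add_message("human", content)

    def add_ai_message(self, content: str) -> None:
        self.add_message("ai", content)

    def clear(self) -> None:
        self.path.write_text("[]")

# Usage against a temporary file (hypothetical path):
path = os.path.join(tempfile.mkdtemp(), "chat.json")
history = FileHistory(path)
history.add_user_message("hi")
history.add_ai_message("hello")
```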
langchain.memory.chat_message_histories.xata.XataChatMessageHistory¶
class langchain.memory.chat_message_histories.xata.XataChatMessageHistory(session_id: str, db_url: str, api_key: str, branch_name: str = 'main', table_name: str = 'messages', create_table: bool = True)[source]¶
Chat message history stored in a Xata database.
Initialize with Xata client.
Attributes
messages
Methods
__init__(session_id, db_url, api_key[, ...])
Initialize with Xata client.
add_ai_message(message)
Convenience method for adding an AI message string to the store.
add_message(message)
Append the message to the Xata table
add_user_message(message)
Convenience method for adding a human message string to the store.
clear()
Delete session from Xata table.
__init__(session_id: str, db_url: str, api_key: str, branch_name: str = 'main', table_name: str = 'messages', create_table: bool = True) → None[source]¶
Initialize with Xata client.
add_ai_message(message: str) → None¶
Convenience method for adding an AI message string to the store.
Parameters
message – The string contents of an AI message.
add_message(message: BaseMessage) → None[source]¶
Append the message to the Xata table
add_user_message(message: str) → None¶
Convenience method for adding a human message string to the store.
Parameters
message – The string contents of a human message.
clear() → None[source]¶
Delete session from Xata table.
Examples using XataChatMessageHistory¶
Xata chat memory
langchain.memory.entity.RedisEntityStore¶
class langchain.memory.entity.RedisEntityStore[source]¶
Bases: BaseEntityStore
Redis-backed Entity store.
Entities get a TTL of 1 day by default, and
that TTL is extended by 3 days every time the entity is read back.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param key_prefix: str = 'memory_store'¶
param recall_ttl: Optional[int] = 259200¶
param redis_client: Any = None¶
param session_id: str = 'default'¶
param ttl: Optional[int] = 86400¶
clear() → None[source]¶
Delete all entities from store.
classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) → Model¶
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data.
Default values are respected, but no other validation is performed.
Behaves as if Config.extra = ‘allow’ was set since it adds all passed values
copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) → Model¶
Duplicate a model, optionally choose which fields to include, exclude and change.
Parameters
include – fields to include in new model
exclude – fields to exclude from new model, as with values this takes precedence over include
update – values to change/add in the new model. Note: the data is not validated before creating
the new model: you should trust this data
deep – set to True to make a deep copy of the model
Returns
new model instance
delete(key: str) → None[source]¶
Delete entity value from store.
dict(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) → DictStrAny¶
Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
exists(key: str) → bool[source]¶
Check if entity exists in store.
classmethod from_orm(obj: Any) → Model¶
get(key: str, default: Optional[str] = None) → Optional[str][source]¶
Get entity value from store.
json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) → unicode¶
Generate a JSON representation of the model, include and exclude arguments as per dict().
encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps().
classmethod parse_file(path: Union[str, Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
classmethod parse_obj(obj: Any) → Model¶
classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') → DictStrAny¶
classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) → unicode¶
set(key: str, value: Optional[str]) → None[source]¶
Set entity value in store.
classmethod update_forward_refs(**localns: Any) → None¶
Try to update ForwardRefs on fields based on this Model, globalns and localns.
classmethod validate(value: Any) → Model¶
property full_key_prefix: str¶
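The TTL policy described above (entries expire after ttl seconds; each successful read extends their life by recall_ttl) can be sketched without Redis, using a dict of (value, expiry) pairs and an injectable clock. Parameter names mirror the fields above, but this is an illustration of the policy, not the Redis-backed implementation:

```python
import time

class TTLEntityStore:
    """Dict-backed sketch of a TTL entity store: reads extend the expiry."""
    def __init__(self, ttl=86400, recall_ttl=259200, clock=time.time):
        self.ttl, self.recall_ttl, self.clock = ttl, recall_ttl, clock
        self._data = {}  # key -> (value, expires_at)

    def set(self, key, value):
        self._data[key] = (value, self.clock() + self.ttl)

    def get(self, key, default=None):
        item = self._data.get(key)
        if item is None or item[1] < self.clock():
            self._data.pop(key, None)  # expired entries are dropped
            return default
        value, _ = item
        # Reading an entity extends its lifetime, as described above.
        self._data[key] = (value, self.clock() + self.recall_ttl)
        return value

    def exists(self, key):
        return self.get(key) is not None

    def delete(self, key):
        self._data.pop(key, None)

    def clear(self):
        self._data.clear()

# Scenario with a fake clock so expiry is deterministic:
now = [0.0]
store = TTLEntityStore(ttl=10, recall_ttl=100, clock=lambda: now[0])
store.set("alice", "a friend")
now[0] = 5.0
first_read = store.get("alice")    # read extends expiry to 5 + 100
now[0] = 50.0
still_alive = store.exists("alice")  # alive thanks to recall_ttl
now[0] = 200.0
expired = store.get("alice")       # now past the extended expiry
```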
langchain.memory.entity.ConversationEntityMemory¶
class langchain.memory.entity.ConversationEntityMemory[source]¶
Bases: BaseChatMemory
Entity extractor & summarizer memory.
Extracts named entities from the recent chat history and generates summaries.
A swappable entity store persists entities across conversations; it defaults to an in-memory store and can be swapped out for a Redis, SQLite, or other entity store.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param ai_prefix: str = 'AI'¶
param chat_history_key: str = 'history'¶
param chat_memory: BaseChatMessageHistory [Optional]¶
param entity_cache: List[str] = []¶
param entity_extraction_prompt: langchain.schema.prompt_template.BasePromptTemplate = PromptTemplate(input_variables=['history', 'input'], template='You are an AI assistant reading the transcript of a conversation between an AI and a human. Extract all of the proper nouns from the last line of conversation. As a guideline, a proper noun is generally capitalized. You should definitely extract all names and places.\n\nThe conversation history is provided just in case of a coreference (e.g. "What do you know about him" where "him" is defined in a previous line) -- ignore items mentioned there that are not in the last line.\n\nReturn the output as a single comma-separated list, or NONE if there is nothing of note to return (e.g. the user is just issuing a greeting or having a simple conversation).\n\nEXAMPLE\nConversation history:\nPerson #1: how\'s it going today?\nAI: "It\'s going great! How about you?"\nPerson #1: good! busy working on Langchain. lots to do.\nAI: "That sounds like a lot of work! What kind of things are you doing to make Langchain better?"\nLast line:\nPerson #1: i\'m trying to improve Langchain\'s interfaces, the UX, its integrations with various products the user might want ... a lot of stuff.\nOutput: Langchain\nEND OF EXAMPLE\n\nEXAMPLE\nConversation history:\nPerson #1: how\'s it going today?\nAI: "It\'s going great! How about you?"\nPerson #1: good! busy working on Langchain. lots to do.\nAI: "That sounds like a lot of work! What kind of things are you doing to make Langchain better?"\nLast line:\nPerson #1: i\'m trying to improve Langchain\'s interfaces, the UX, its integrations with various products the user might want ... a lot of stuff. I\'m working with Person #2.\nOutput: Langchain, Person #2\nEND OF EXAMPLE\n\nConversation history (for reference only):\n{history}\nLast line of conversation (for extraction):\nHuman: {input}\n\nOutput:')¶
param entity_store: langchain.memory.entity.BaseEntityStore [Optional]¶
param entity_summarization_prompt: langchain.schema.prompt_template.BasePromptTemplate = PromptTemplate(input_variables=['entity', 'history', 'input', 'summary'], template='You are an AI assistant helping a human keep track of facts about relevant people, places, and concepts in their life. Update the summary of the provided entity in the "Entity" section based on the last line of your conversation with the human. If you are writing the summary for the first time, return a single sentence.\nThe update should only include facts that are relayed in the last line of conversation about the provided entity, and should only contain facts about the provided entity.\n\nIf there is no new information about the provided entity or the information is not worth noting (not an important or relevant fact to remember long-term), return the existing summary unchanged.\n\nFull conversation history (for context):\n{history}\n\nEntity to summarize:\n{entity}\n\nExisting summary of {entity}:\n{summary}\n\nLast line of conversation:\nHuman: {input}\nUpdated summary:')¶
param human_prefix: str = 'Human'¶
param input_key: Optional[str] = None¶
param k: int = 3¶
param llm: langchain.schema.language_model.BaseLanguageModel [Required]¶
param output_key: Optional[str] = None¶
param return_messages: bool = False¶
clear() → None[source]¶
Clear memory contents.
classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) → Model¶
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data.
Default values are respected, but no other validation is performed.
Behaves as if Config.extra = ‘allow’ was set since it adds all passed values
copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) → Model¶
Duplicate a model, optionally choose which fields to include, exclude and change.
Parameters
include – fields to include in new model
exclude – fields to exclude from new model, as with values this takes precedence over include
update – values to change/add in the new model. Note: the data is not validated before creating
the new model: you should trust this data
deep – set to True to make a deep copy of the model
Returns
new model instance
dict(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) → DictStrAny¶
Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
classmethod from_orm(obj: Any) → Model¶
classmethod get_lc_namespace() → List[str]¶
Get the namespace of the langchain object.
For example, if the class is langchain.llms.openai.OpenAI, then the
namespace is [“langchain”, “llms”, “openai”]
classmethod is_lc_serializable() → bool¶
Is this class serializable?
json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) → unicode¶
Generate a JSON representation of the model, include and exclude arguments as per dict().
encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps().
classmethod lc_id() → List[str]¶
A unique identifier for this class for serialization purposes.
The unique identifier is a list of strings that describes the path
to the object.
load_memory_variables(inputs: Dict[str, Any]) → Dict[str, Any][source]¶
Returns the chat history and all generated entities with summaries, if available, and updates or clears the recent entity cache.
New entity names may be found when this method is called, before their summaries are generated, so the entity cache values may be empty if no entity descriptions have been generated yet.
classmethod parse_file(path: Union[str, Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
classmethod parse_obj(obj: Any) → Model¶
classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
save_context(inputs: Dict[str, Any], outputs: Dict[str, str]) → None[source]¶
Save context from this conversation history to the entity store.
Generates a summary for each entity in the entity cache by prompting
the model, and saves these summaries to the entity store.
classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') → DictStrAny¶
classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) → unicode¶
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
classmethod update_forward_refs(**localns: Any) → None¶
Try to update ForwardRefs on fields based on this Model, globalns and localns.
classmethod validate(value: Any) → Model¶
property buffer: List[langchain.schema.messages.BaseMessage]¶
Access chat memory messages.
property lc_attributes: Dict¶
List of attribute names that should be included in the serialized kwargs.
These attributes must be accepted by the constructor.
property lc_secrets: Dict[str, str]¶
A map of constructor argument names to secret ids.
For example,{“openai_api_key”: “OPENAI_API_KEY”}
Examples using ConversationEntityMemory¶
Entity Memory with SQLite storage
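The flow described above — extract entities from the latest turn, cache them, then update a per-entity summary in the store — can be sketched with a naive capitalized-word heuristic standing in for the two LLM calls (entity_extraction_prompt and entity_summarization_prompt). This is a toy illustration of the control flow, not the real extraction logic:

```python
def extract_entities(line: str) -> list:
    """Toy stand-in for entity_extraction_prompt: capitalized words are entities."""
    words = [w.strip(".,!?") for w in line.split()]
    return [w for w in words if w[:1].isupper() and w.lower() != "i"]

def save_context(store: dict, line: str) -> list:
    """Toy stand-in for save_context: refresh the cache, update one summary per entity."""
    cache = extract_entities(line)
    for entity in cache:
        prior = store.get(entity, "")
        # Stand-in for entity_summarization_prompt: append the new fact.
        store[entity] = (prior + " " + line).strip()
    return cache

store = {}
cache = save_context(store, "Sam is building Langchain integrations.")
# cache holds the entities found in the last line; store maps each to a summary.
```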
langchain.memory.entity.UpstashRedisEntityStore¶
class langchain.memory.entity.UpstashRedisEntityStore[source]¶
Bases: BaseEntityStore
Upstash Redis backed Entity store.
Entities get a TTL of 1 day by default, and
that TTL is extended by 3 days every time the entity is read back.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
clear() → None[source]¶
Delete all entities from store.
classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) → Model¶
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data.
Default values are respected, but no other validation is performed.
Behaves as if Config.extra = ‘allow’ was set since it adds all passed values
copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) → Model¶
Duplicate a model, optionally choose which fields to include, exclude and change.
Parameters
include – fields to include in new model
exclude – fields to exclude from new model, as with values this takes precedence over include
update – values to change/add in the new model. Note: the data is not validated before creating
the new model: you should trust this data
deep – set to True to make a deep copy of the model
Returns
new model instance
delete(key: str) → None[source]¶
Delete entity value from store.
dict(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) → DictStrAny¶
Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
exists(key: str) → bool[source]¶
Check if entity exists in store.
classmethod from_orm(obj: Any) → Model¶
get(key: str, default: Optional[str] = None) → Optional[str][source]¶
Get entity value from store.
json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) → unicode¶
Generate a JSON representation of the model, include and exclude arguments as per dict().
encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps().
classmethod parse_file(path: Union[str, Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
classmethod parse_obj(obj: Any) → Model¶
classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') → DictStrAny¶
classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) → unicode¶
set(key: str, value: Optional[str]) → None[source]¶
Set entity value in store.
classmethod update_forward_refs(**localns: Any) → None¶
Try to update ForwardRefs on fields based on this Model, globalns and localns.
classmethod validate(value: Any) → Model¶
property full_key_prefix: str¶
langchain.memory.kg.ConversationKGMemory¶
class langchain.memory.kg.ConversationKGMemory[source]¶
Bases: BaseChatMemory
Knowledge graph conversation memory.
Integrates with external knowledge graph to store and retrieve
information about knowledge triples in the conversation.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param ai_prefix: str = 'AI'¶
param chat_memory: BaseChatMessageHistory [Optional]¶
param entity_extraction_prompt: langchain.schema.prompt_template.BasePromptTemplate = PromptTemplate(input_variables=['history', 'input'], template='You are an AI assistant reading the transcript of a conversation between an AI and a human. Extract all of the proper nouns from the last line of conversation. As a guideline, a proper noun is generally capitalized. You should definitely extract all names and places.\n\nThe conversation history is provided just in case of a coreference (e.g. "What do you know about him" where "him" is defined in a previous line) -- ignore items mentioned there that are not in the last line.\n\nReturn the output as a single comma-separated list, or NONE if there is nothing of note to return (e.g. the user is just issuing a greeting or having a simple conversation).\n\nEXAMPLE\nConversation history:\nPerson #1: how\'s it going today?\nAI: "It\'s going great! How about you?"\nPerson #1: good! busy working on Langchain. lots to do.\nAI: "That sounds like a lot of work! What kind of things are you doing to make Langchain better?"\nLast line:\nPerson #1: i\'m trying to improve Langchain\'s interfaces, the UX, its integrations with various products the user might want ... a lot of stuff.\nOutput: Langchain\nEND OF EXAMPLE\n\nEXAMPLE\nConversation history:\nPerson #1: how\'s it going today?\nAI: "It\'s going great! How about you?"\nPerson #1: good! busy working on Langchain. lots to do.\nAI: "That sounds like a lot of work! What kind of things are you doing to make Langchain better?"\nLast line:\nPerson #1: i\'m trying to improve Langchain\'s interfaces, the UX, its integrations with various products the user might want ... a lot of stuff. I\'m working with Person #2.\nOutput: Langchain, Person #2\nEND OF EXAMPLE\n\nConversation history (for reference only):\n{history}\nLast line of conversation (for extraction):\nHuman: {input}\n\nOutput:')¶
param human_prefix: str = 'Human'¶
param input_key: Optional[str] = None¶
param k: int = 2¶
Number of previous utterances to include in the context.
param kg: langchain.graphs.networkx_graph.NetworkxEntityGraph [Optional]¶
param knowledge_extraction_prompt: langchain.schema.prompt_template.BasePromptTemplate = PromptTemplate(input_variables=['history', 'input'], template="You are a networked intelligence helping a human track knowledge triples about all relevant people, things, concepts, etc. and integrating them with your knowledge stored within your weights as well as that stored in a knowledge graph. Extract all of the knowledge triples from the last line of conversation. A knowledge triple is a clause that contains a subject, a predicate, and an object. The subject is the entity being described, the predicate is the property of the subject that is being described, and the object is the value of the property.\n\nEXAMPLE\nConversation history:\nPerson #1: Did you hear aliens landed in Area 51?\nAI: No, I didn't hear that. What do you know about Area 51?\nPerson #1: It's a secret military base in Nevada.\nAI: What do you know about Nevada?\nLast line of conversation:\nPerson #1: It's a state in the US. It's also the number 1 producer of gold in the US.\n\nOutput: (Nevada, is a, state)<|>(Nevada, is in, US)<|>(Nevada, is the number 1 producer of, gold)\nEND OF EXAMPLE\n\nEXAMPLE\nConversation history:\nPerson #1: Hello.\nAI: Hi! How are you?\nPerson #1: I'm good. How are you?\nAI: I'm good too.\nLast line of conversation:\nPerson #1: I'm going to the store.\n\nOutput: NONE\nEND OF EXAMPLE\n\nEXAMPLE\nConversation history:\nPerson #1: What do you know about Descartes?\nAI: Descartes was a French philosopher, mathematician, and scientist who lived in the 17th century.\nPerson #1: The Descartes I'm referring to is a standup comedian and interior designer from Montreal.\nAI: Oh yes, He is a comedian and an interior designer. He has been in the industry for 30 years. His favorite food is baked bean pie.\nLast line of conversation:\nPerson #1: Oh huh. I know Descartes likes to drive antique scooters and play the mandolin.\nOutput: (Descartes, likes to drive, antique scooters)<|>(Descartes, plays, mandolin)\nEND OF EXAMPLE\n\nConversation history (for reference only):\n{history}\nLast line of conversation (for extraction):\nHuman: {input}\n\nOutput:")¶
param llm: langchain.schema.language_model.BaseLanguageModel [Required]¶
param output_key: Optional[str] = None¶
param return_messages: bool = False¶
param summary_message_cls: Type[langchain.schema.messages.BaseMessage] = <class 'langchain.schema.messages.SystemMessage'>¶
clear() → None[source]¶
Clear memory contents.
classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) → Model¶
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data.
Default values are respected, but no other validation is performed.
Behaves as if Config.extra = ‘allow’ was set since it adds all passed values
copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) → Model¶
Duplicate a model, optionally choose which fields to include, exclude and change.
Parameters
include – fields to include in new model
exclude – fields to exclude from new model, as with values this takes precedence over include
update – values to change/add in the new model. Note: the data is not validated before creating
the new model: you should trust this data
deep – set to True to make a deep copy of the model
Returns
new model instance
dict(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) → DictStrAny¶
Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
classmethod from_orm(obj: Any) → Model¶
get_current_entities(input_string: str) → List[str][source]¶
get_knowledge_triplets(input_string: str) → List[KnowledgeTriple][source]¶
classmethod get_lc_namespace() → List[str]¶
Get the namespace of the langchain object.
For example, if the class is langchain.llms.openai.OpenAI, then the
namespace is [“langchain”, “llms”, “openai”]
classmethod is_lc_serializable() → bool¶
Is this class serializable?
json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) → unicode¶
Generate a JSON representation of the model, include and exclude arguments as per dict().
encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps().
classmethod lc_id() → List[str]¶
A unique identifier for this class for serialization purposes.
The unique identifier is a list of strings that describes the path
to the object.
load_memory_variables(inputs: Dict[str, Any]) → Dict[str, Any][source]¶
Return history buffer.
classmethod parse_file(path: Union[str, Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
classmethod parse_obj(obj: Any) → Model¶
classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
save_context(inputs: Dict[str, Any], outputs: Dict[str, str]) → None[source]¶
Save context from this conversation to buffer.
classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') → DictStrAny¶
classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) → unicode¶
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
classmethod update_forward_refs(**localns: Any) → None¶
Try to update ForwardRefs on fields based on this Model, globalns and localns.
classmethod validate(value: Any) → Model¶
property lc_attributes: Dict¶
List of attribute names that should be included in the serialized kwargs.
These attributes must be accepted by the constructor.
property lc_secrets: Dict[str, str]¶
A map of constructor argument names to secret ids.
For example,{“openai_api_key”: “OPENAI_API_KEY”}
Examples using ConversationKGMemory¶
Conversation Knowledge Graph
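The extraction prompt above instructs the model to emit triples as `(subject, predicate, object)` clauses joined by `<|>`, or the literal `NONE`. A minimal, dependency-free sketch of parsing that output (the `parse_triples` helper and the simplified `KnowledgeTriple` tuple here are illustrative stand-ins, not the library's own parser, which lives in `langchain.graphs`):

```python
from typing import List, NamedTuple

# Simplified stand-in for the library's KnowledgeTriple type.
class KnowledgeTriple(NamedTuple):
    subject: str
    predicate: str
    object_: str

def parse_triples(llm_output: str) -> List[KnowledgeTriple]:
    """Parse output like '(Nevada, is a, state)<|>(Nevada, is in, US)'.

    Note: a naive comma split; an object containing a comma would need
    smarter handling.
    """
    if llm_output.strip() == "NONE":
        return []
    triples = []
    for chunk in llm_output.split("<|>"):
        chunk = chunk.strip().strip("()")
        parts = [p.strip() for p in chunk.split(",")]
        if len(parts) == 3:
            triples.append(KnowledgeTriple(*parts))
    return triples

print(parse_triples("(Nevada, is a, state)<|>(Nevada, is in, US)"))
```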
langchain.memory.chat_message_histories.sql.create_message_model¶
langchain.memory.chat_message_histories.sql.create_message_model(table_name, DynamicBase)[source]¶
Create a message model for a given table name.
Parameters
table_name – The name of the table to use.
DynamicBase – The base class to use for the model.
Returns
The model class.
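The point of taking `DynamicBase` as an argument is that the model class is built at runtime, so the table name can vary per deployment. A dependency-free sketch of that pattern using `type()` (the real function derives from a SQLAlchemy declarative base with real columns; `SimpleBase` and the attribute names here are stand-ins for illustration):

```python
# Build a model class at runtime so the table name is configurable.
class SimpleBase:
    pass

def create_message_model(table_name: str, dynamic_base: type) -> type:
    attrs = {
        "__tablename__": table_name,  # consumed by the ORM in the real version
        "id": None,
        "session_id": None,
        "message": None,
    }
    return type("Message", (dynamic_base,), attrs)

Message = create_message_model("message_store", SimpleBase)
print(Message.__tablename__)  # message_store
```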
langchain.memory.simple.SimpleMemory¶
class langchain.memory.simple.SimpleMemory[source]¶
Bases: BaseMemory
Simple memory for storing context or other information that shouldn’t
ever change between prompts.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param memories: Dict[str, Any] = {}¶
clear() → None[source]¶
Nothing to clear, got a memory like a vault.
classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) → Model¶
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data.
Default values are respected, but no other validation is performed.
Behaves as if Config.extra = ‘allow’ was set since it adds all passed values
copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) → Model¶
Duplicate a model, optionally choose which fields to include, exclude and change.
Parameters
include – fields to include in new model
exclude – fields to exclude from new model, as with values this takes precedence over include
update – values to change/add in the new model. Note: the data is not validated before creating
the new model: you should trust this data
deep – set to True to make a deep copy of the model
Returns
new model instance
dict(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) → DictStrAny¶
Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
classmethod from_orm(obj: Any) → Model¶
classmethod get_lc_namespace() → List[str]¶
Get the namespace of the langchain object.
For example, if the class is langchain.llms.openai.OpenAI, then the
namespace is [“langchain”, “llms”, “openai”]
classmethod is_lc_serializable() → bool¶
Is this class serializable?
json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) → unicode¶
Generate a JSON representation of the model, include and exclude arguments as per dict().
encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps().
classmethod lc_id() → List[str]¶
A unique identifier for this class for serialization purposes.
The unique identifier is a list of strings that describes the path
to the object.
load_memory_variables(inputs: Dict[str, Any]) → Dict[str, str][source]¶
Return key-value pairs given the text input to the chain.
classmethod parse_file(path: Union[str, Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
classmethod parse_obj(obj: Any) → Model¶
classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
save_context(inputs: Dict[str, Any], outputs: Dict[str, str]) → None[source]¶
Nothing should be saved or changed, my memory is set in stone.
classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') → DictStrAny¶
classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) → unicode¶
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
classmethod update_forward_refs(**localns: Any) → None¶
Try to update ForwardRefs on fields based on this Model, globalns and localns.
classmethod validate(value: Any) → Model¶
property lc_attributes: Dict¶
List of attribute names that should be included in the serialized kwargs.
These attributes must be accepted by the constructor.
property lc_secrets: Dict[str, str]¶
A map of constructor argument names to secret ids.
For example,{“openai_api_key”: “OPENAI_API_KEY”}
property memory_variables: List[str]¶
The string keys this memory class will add to chain inputs.
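The contract described above is small: fixed key/value memories that are returned unchanged on every load, with `save_context` and `clear` as no-ops. A dependency-free sketch of that behavior (the class name is hypothetical; the real `SimpleMemory` is a pydantic `BaseMemory` subclass):

```python
from typing import Any, Dict, List

class SimpleMemorySketch:
    """Static memories: never written, never cleared."""

    def __init__(self, memories: Dict[str, Any]):
        self.memories = dict(memories)

    @property
    def memory_variables(self) -> List[str]:
        return list(self.memories)

    def load_memory_variables(self, inputs: Dict[str, Any]) -> Dict[str, Any]:
        # Inputs are ignored; the same memories come back every time.
        return self.memories

    def save_context(self, inputs: Dict[str, Any], outputs: Dict[str, str]) -> None:
        pass  # "Nothing should be saved or changed"

    def clear(self) -> None:
        pass  # "Nothing to clear"

mem = SimpleMemorySketch({"team": "Langchain", "tone": "formal"})
print(mem.load_memory_variables({"input": "hi"}))
```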
langchain.memory.summary.ConversationSummaryMemory¶
class langchain.memory.summary.ConversationSummaryMemory[source]¶
Bases: BaseChatMemory, SummarizerMixin
Conversation summarizer to chat memory.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param ai_prefix: str = 'AI'¶
param buffer: str = ''¶
param chat_memory: BaseChatMessageHistory [Optional]¶
param human_prefix: str = 'Human'¶
param input_key: Optional[str] = None¶
param llm: BaseLanguageModel [Required]¶
param output_key: Optional[str] = None¶
param prompt: BasePromptTemplate = PromptTemplate(input_variables=['new_lines', 'summary'], template='Progressively summarize the lines of conversation provided, adding onto the previous summary returning a new summary.\n\nEXAMPLE\nCurrent summary:\nThe human asks what the AI thinks of artificial intelligence. The AI thinks artificial intelligence is a force for good.\n\nNew lines of conversation:\nHuman: Why do you think artificial intelligence is a force for good?\nAI: Because artificial intelligence will help humans reach their full potential.\n\nNew summary:\nThe human asks what the AI thinks of artificial intelligence. The AI thinks artificial intelligence is a force for good because it will help humans reach their full potential.\nEND OF EXAMPLE\n\nCurrent summary:\n{summary}\n\nNew lines of conversation:\n{new_lines}\n\nNew summary:')¶
param return_messages: bool = False¶
param summary_message_cls: Type[BaseMessage] = <class 'langchain.schema.messages.SystemMessage'>¶
clear() → None[source]¶
Clear memory contents.
classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) → Model¶
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data.
Default values are respected, but no other validation is performed.
Behaves as if Config.extra = ‘allow’ was set since it adds all passed values
copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) → Model¶
Duplicate a model, optionally choose which fields to include, exclude and change.
Parameters
include – fields to include in new model
exclude – fields to exclude from new model, as with values this takes precedence over include
update – values to change/add in the new model. Note: the data is not validated before creating
the new model: you should trust this data
deep – set to True to make a deep copy of the model
Returns
new model instance
dict(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) → DictStrAny¶
Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
classmethod from_messages(llm: BaseLanguageModel, chat_memory: BaseChatMessageHistory, *, summarize_step: int = 2, **kwargs: Any) → ConversationSummaryMemory[source]¶
classmethod from_orm(obj: Any) → Model¶
classmethod get_lc_namespace() → List[str]¶
Get the namespace of the langchain object.
For example, if the class is langchain.llms.openai.OpenAI, then the
namespace is [“langchain”, “llms”, “openai”]
classmethod is_lc_serializable() → bool¶
Is this class serializable?
json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) → unicode¶
Generate a JSON representation of the model, include and exclude arguments as per dict().
encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps().
classmethod lc_id() → List[str]¶
A unique identifier for this class for serialization purposes.
The unique identifier is a list of strings that describes the path
to the object.
load_memory_variables(inputs: Dict[str, Any]) → Dict[str, Any][source]¶
Return history buffer.
classmethod parse_file(path: Union[str, Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
classmethod parse_obj(obj: Any) → Model¶
classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
predict_new_summary(messages: List[BaseMessage], existing_summary: str) → str¶
save_context(inputs: Dict[str, Any], outputs: Dict[str, str]) → None[source]¶
Save context from this conversation to buffer.
classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') → DictStrAny¶
classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) → unicode¶
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
classmethod update_forward_refs(**localns: Any) → None¶
Try to update ForwardRefs on fields based on this Model, globalns and localns.
classmethod validate(value: Any) → Model¶
property lc_attributes: Dict¶
List of attribute names that should be included in the serialized kwargs.
These attributes must be accepted by the constructor.
property lc_secrets: Dict[str, str]¶
A map of constructor argument names to secret ids.
For example,{“openai_api_key”: “OPENAI_API_KEY”}
Examples using ConversationSummaryMemory¶
Set env var OPENAI_API_KEY or load from a .env file:
Set env var OPENAI_API_KEY or load from a .env file
Multiple Memory classes
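The `prompt` field shown above drives a progressive loop: the current summary plus the newest conversation lines are sent to the LLM, and its reply becomes the new summary. A sketch of that loop with `stub_llm` standing in for the real `BaseLanguageModel` call (an assumption for the demo; not the library implementation):

```python
from typing import List, Tuple

def stub_llm(prompt: str) -> str:
    # Placeholder: a real model would return an updated natural-language summary.
    return f"summary covering {prompt.count(': ')} utterances"

def predict_new_summary(messages: List[Tuple[str, str]], existing_summary: str,
                        llm=stub_llm) -> str:
    # Render the new turns the way the prompt template expects them.
    new_lines = "\n".join(f"{role}: {text}" for role, text in messages)
    prompt = (
        "Progressively summarize the lines of conversation provided, "
        "adding onto the previous summary returning a new summary.\n\n"
        f"Current summary:\n{existing_summary}\n\n"
        f"New lines of conversation:\n{new_lines}\n\nNew summary:"
    )
    return llm(prompt)

summary = predict_new_summary([("Human", "Hello"), ("AI", "Hi there!")], "")
print(summary)
```

Each call folds the latest turns into the running summary, so the memory stays a single short string no matter how long the conversation gets.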
langchain.memory.combined.CombinedMemory¶
class langchain.memory.combined.CombinedMemory[source]¶
Bases: BaseMemory
Combining multiple memories’ data together.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param memories: List[langchain.schema.memory.BaseMemory] [Required]¶
For tracking all the memories that should be accessed.
clear() → None[source]¶
Clear context from this session for every memory.
classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) → Model¶
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data.
Default values are respected, but no other validation is performed.
Behaves as if Config.extra = ‘allow’ was set since it adds all passed values
copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) → Model¶
Duplicate a model, optionally choose which fields to include, exclude and change.
Parameters
include – fields to include in new model
exclude – fields to exclude from new model, as with values this takes precedence over include
update – values to change/add in the new model. Note: the data is not validated before creating
the new model: you should trust this data
deep – set to True to make a deep copy of the model
Returns
new model instance
dict(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) → DictStrAny¶
Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
classmethod from_orm(obj: Any) → Model¶
classmethod get_lc_namespace() → List[str]¶
Get the namespace of the langchain object.
For example, if the class is langchain.llms.openai.OpenAI, then the
namespace is [“langchain”, “llms”, “openai”]
classmethod is_lc_serializable() → bool¶
Is this class serializable?
json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) → unicode¶
Generate a JSON representation of the model, include and exclude arguments as per dict().
encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps().
classmethod lc_id() → List[str]¶
A unique identifier for this class for serialization purposes.
The unique identifier is a list of strings that describes the path
to the object.
load_memory_variables(inputs: Dict[str, Any]) → Dict[str, str][source]¶
Load all vars from sub-memories.
classmethod parse_file(path: Union[str, Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
classmethod parse_obj(obj: Any) → Model¶
classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
save_context(inputs: Dict[str, Any], outputs: Dict[str, str]) → None[source]¶
Save context from this session for every memory.
classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') → DictStrAny¶
classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) → unicode¶
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
classmethod update_forward_refs(**localns: Any) → None¶
Try to update ForwardRefs on fields based on this Model, globalns and localns.
classmethod validate(value: Any) → Model¶
property lc_attributes: Dict¶
List of attribute names that should be included in the serialized kwargs.
These attributes must be accepted by the constructor.
property lc_secrets: Dict[str, str]¶
A map of constructor argument names to secret ids.
For example,{“openai_api_key”: “OPENAI_API_KEY”}
property memory_variables: List[str]¶
All the memory variables that this instance provides.
Examples using CombinedMemory¶
Zep
Multiple Memory classes
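The fan-out behavior described above (load merges every sub-memory's variables; save and clear are forwarded to all of them) can be sketched without the library. `FixedMemory` is a hypothetical stand-in sub-memory for the demo:

```python
from typing import Any, Dict, List

class FixedMemory:
    """Stand-in sub-memory that always returns the same variables."""

    def __init__(self, data: Dict[str, Any]):
        self.data = dict(data)

    @property
    def memory_variables(self) -> List[str]:
        return list(self.data)

    def load_memory_variables(self, inputs: Dict[str, Any]) -> Dict[str, Any]:
        return self.data

class CombinedSketch:
    def __init__(self, memories):
        self.memories = list(memories)

    @property
    def memory_variables(self) -> List[str]:
        # Union of every sub-memory's variables.
        return [v for m in self.memories for v in m.memory_variables]

    def load_memory_variables(self, inputs: Dict[str, Any]) -> Dict[str, Any]:
        merged: Dict[str, Any] = {}
        for m in self.memories:
            merged.update(m.load_memory_variables(inputs))
        return merged

combo = CombinedSketch([FixedMemory({"history": "Human: hi"}),
                        FixedMemory({"summary": "greeting"})])
print(combo.load_memory_variables({}))
```

Note the sub-memories must expose distinct variable names, otherwise a later memory silently overwrites an earlier one in the merged dict.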
langchain.memory.buffer.ConversationStringBufferMemory¶
class langchain.memory.buffer.ConversationStringBufferMemory[source]¶
Bases: BaseMemory
Buffer for storing conversation memory.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param ai_prefix: str = 'AI'¶
Prefix to use for AI generated responses.
param buffer: str = ''¶
param human_prefix: str = 'Human'¶
param input_key: Optional[str] = None¶
param output_key: Optional[str] = None¶
clear() → None[source]¶
Clear memory contents.
classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) → Model¶
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data.
Default values are respected, but no other validation is performed.
Behaves as if Config.extra = ‘allow’ was set since it adds all passed values
copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) → Model¶
Duplicate a model, optionally choose which fields to include, exclude and change.
Parameters
include – fields to include in new model
exclude – fields to exclude from new model, as with values this takes precedence over include
update – values to change/add in the new model. Note: the data is not validated before creating
the new model: you should trust this data
deep – set to True to make a deep copy of the model
Returns
new model instance
dict(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) → DictStrAny¶
Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
classmethod from_orm(obj: Any) → Model¶
classmethod get_lc_namespace() → List[str]¶
Get the namespace of the langchain object.
For example, if the class is langchain.llms.openai.OpenAI, then the
namespace is [“langchain”, “llms”, “openai”]
classmethod is_lc_serializable() → bool¶
Is this class serializable?
json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) → unicode¶
Generate a JSON representation of the model, include and exclude arguments as per dict().
encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps().
classmethod lc_id() → List[str]¶
A unique identifier for this class for serialization purposes.
The unique identifier is a list of strings that describes the path
to the object.
load_memory_variables(inputs: Dict[str, Any]) → Dict[str, str][source]¶
Return history buffer.
classmethod parse_file(path: Union[str, Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
classmethod parse_obj(obj: Any) → Model¶
classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
save_context(inputs: Dict[str, Any], outputs: Dict[str, str]) → None[source]¶
Save context from this conversation to buffer.
classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') → DictStrAny¶
classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) → unicode¶
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
classmethod update_forward_refs(**localns: Any) → None¶
Try to update ForwardRefs on fields based on this Model, globalns and localns.
classmethod validate(value: Any) → Model¶
property lc_attributes: Dict¶
List of attribute names that should be included in the serialized kwargs.
These attributes must be accepted by the constructor.
property lc_secrets: Dict[str, str]¶
A map of constructor argument names to secret ids.
For example,{“openai_api_key”: “OPENAI_API_KEY”}
property memory_variables: List[str]¶
Will always return list of memory variables.
:meta private:
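The buffer behavior documented above (each `save_context` appends prefixed Human/AI lines; `load_memory_variables` returns the whole transcript) can be sketched dependency-free. The class name and the `"history"` output key here are illustrative assumptions:

```python
from typing import Any, Dict

class StringBufferSketch:
    """Append-only string transcript with configurable speaker prefixes."""

    def __init__(self, human_prefix: str = "Human", ai_prefix: str = "AI"):
        self.human_prefix = human_prefix
        self.ai_prefix = ai_prefix
        self.buffer = ""

    def save_context(self, inputs: Dict[str, Any], outputs: Dict[str, str]) -> None:
        # Take the first value of each dict as the utterance text.
        human = f"{self.human_prefix}: " + str(next(iter(inputs.values())))
        ai = f"{self.ai_prefix}: " + str(next(iter(outputs.values())))
        self.buffer += "\n" + "\n".join([human, ai])

    def load_memory_variables(self, inputs: Dict[str, Any]) -> Dict[str, str]:
        return {"history": self.buffer}

    def clear(self) -> None:
        self.buffer = ""

m = StringBufferSketch()
m.save_context({"input": "Hi"}, {"output": "Hello!"})
print(m.load_memory_variables({}))
```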
langchain.memory.chat_message_histories.in_memory.ChatMessageHistory¶
class langchain.memory.chat_message_histories.in_memory.ChatMessageHistory[source]¶
Bases: BaseChatMessageHistory, BaseModel
In memory implementation of chat message history.
Stores messages in an in memory list.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param messages: List[langchain.schema.messages.BaseMessage] [Optional]¶
A list of Messages stored in-memory.
add_ai_message(message: str) → None¶
Convenience method for adding an AI message string to the store.
Parameters
message – The string contents of an AI message.
add_message(message: BaseMessage) → None[source]¶
Add a self-created message to the store
add_user_message(message: str) → None¶
Convenience method for adding a human message string to the store.
Parameters
message – The string contents of a human message.
clear() → None[source]¶
Remove all messages from the store
classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) → Model¶
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data.
Default values are respected, but no other validation is performed.
Behaves as if Config.extra = ‘allow’ was set since it adds all passed values
copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) → Model¶
Duplicate a model, optionally choose which fields to include, exclude and change.
Parameters
include – fields to include in new model
exclude – fields to exclude from new model, as with values this takes precedence over include
update – values to change/add in the new model. Note: the data is not validated before creating
the new model: you should trust this data
deep – set to True to make a deep copy of the model
Returns
new model instance
dict(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) → DictStrAny¶
Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
classmethod from_orm(obj: Any) → Model¶
json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) → unicode¶
Generate a JSON representation of the model, include and exclude arguments as per dict().
encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps().
classmethod parse_file(path: Union[str, Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
classmethod parse_obj(obj: Any) → Model¶
classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') → DictStrAny¶
classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) → unicode¶
classmethod update_forward_refs(**localns: Any) → None¶
Try to update ForwardRefs on fields based on this Model, globalns and localns.
classmethod validate(value: Any) → Model¶
Examples using ChatMessageHistory¶
Message Memory in Agent backed by a database
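The in-memory contract above is an append-only message list plus convenience helpers for the two speaker roles. A dependency-free sketch (role tuples stand in for the library's `BaseMessage` objects):

```python
from typing import List, Tuple

class InMemoryHistory:
    """Append-only chat history held in a plain Python list."""

    def __init__(self) -> None:
        self.messages: List[Tuple[str, str]] = []

    def add_message(self, message: Tuple[str, str]) -> None:
        self.messages.append(message)

    def add_user_message(self, text: str) -> None:
        # Convenience wrapper: tag the text with the human role.
        self.add_message(("human", text))

    def add_ai_message(self, text: str) -> None:
        self.add_message(("ai", text))

    def clear(self) -> None:
        self.messages = []

h = InMemoryHistory()
h.add_user_message("hi!")
h.add_ai_message("what's up?")
print(h.messages)
```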
langchain.memory.chat_message_histories.upstash_redis.UpstashRedisChatMessageHistory¶
class langchain.memory.chat_message_histories.upstash_redis.UpstashRedisChatMessageHistory(session_id: str, url: str = '', token: str = '', key_prefix: str = 'message_store:', ttl: Optional[int] = None)[source]¶
Chat message history stored in an Upstash Redis database.
Attributes
key
Construct the record key to use
messages
Retrieve the messages from Upstash Redis
Methods
__init__(session_id[, url, token, ...])
add_ai_message(message)
Convenience method for adding an AI message string to the store.
add_message(message)
Append the message to the record in Upstash Redis
add_user_message(message)
Convenience method for adding a human message string to the store.
clear()
Clear session memory from Upstash Redis
__init__(session_id: str, url: str = '', token: str = '', key_prefix: str = 'message_store:', ttl: Optional[int] = None)[source]¶
add_ai_message(message: str) → None¶
Convenience method for adding an AI message string to the store.
Parameters
message – The string contents of an AI message.
add_message(message: BaseMessage) → None[source]¶
Append the message to the record in Upstash Redis
add_user_message(message: str) → None¶
Convenience method for adding a human message string to the store.
Parameters
message – The string contents of a human message.
clear() → None[source]¶
Clear session memory from Upstash Redis | lang/api.python.langchain.com/en/latest/memory/langchain.memory.chat_message_histories.upstash_redis.UpstashRedisChatMessageHistory.html |
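The history stores each session's messages in a Redis list whose key is key_prefix plus session_id. The key construction and list semantics can be sketched with a plain dict standing in for the Upstash Redis client (FakeRedisStore below is an illustration only, not the real client):

```python
import json

class FakeRedisStore:
    """Dict-of-lists stand-in for the Upstash Redis client (illustration only)."""
    def __init__(self):
        self.data = {}

    def lpush(self, key, value):
        # New items go to the head of the list, like Redis LPUSH.
        self.data.setdefault(key, []).insert(0, value)

    def lrange(self, key, start, end):
        items = self.data.get(key, [])
        return items if end == -1 else items[start:end + 1]

    def delete(self, key):
        self.data.pop(key, None)

def record_key(key_prefix: str, session_id: str) -> str:
    # Mirrors the `key` attribute: prefix followed by the session id.
    return key_prefix + session_id

store = FakeRedisStore()
key = record_key("message_store:", "user-42")
store.lpush(key, json.dumps({"type": "human", "data": {"content": "hi!"}}))
store.lpush(key, json.dumps({"type": "ai", "data": {"content": "hello"}}))
# Newest message sits at the head of the list; reverse for chronological order.
messages = [json.loads(m) for m in reversed(store.lrange(key, 0, -1))]
```

The message-dict shape shown is a plausible serialization, not the exact wire format used by the real class.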
langchain.memory.utils.get_prompt_input_key¶
langchain.memory.utils.get_prompt_input_key(inputs: Dict[str, Any], memory_variables: List[str]) → str[source]¶
Get the prompt input key.
Parameters
inputs – Dict[str, Any]
memory_variables – List[str]
Returns
A prompt input key. | lang/api.python.langchain.com/en/latest/memory/langchain.memory.utils.get_prompt_input_key.html |
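The selection logic is small: the prompt input key is the one key in inputs that is neither a memory variable nor the reserved "stop" key. A self-contained sketch of that behavior (written out here rather than imported from langchain):

```python
from typing import Any, Dict, List

def get_prompt_input_key(inputs: Dict[str, Any], memory_variables: List[str]) -> str:
    # The prompt input is whatever remains after removing the memory
    # variables and the reserved "stop" key; exactly one key must remain.
    prompt_input_keys = list(set(inputs).difference(memory_variables + ["stop"]))
    if len(prompt_input_keys) != 1:
        raise ValueError(f"One input key expected, got {prompt_input_keys}")
    return prompt_input_keys[0]

key = get_prompt_input_key(
    {"question": "hi", "history": "", "stop": ["\n"]},
    ["history"],
)
```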
langchain.memory.entity.BaseEntityStore¶
class langchain.memory.entity.BaseEntityStore[source]¶
Bases: BaseModel, ABC
Abstract base class for Entity store.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
abstract clear() → None[source]¶
Delete all entities from store.
classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) → Model¶
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data.
Default values are respected, but no other validation is performed.
Behaves as if Config.extra = ‘allow’ was set since it adds all passed values
copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) → Model¶
Duplicate a model, optionally choose which fields to include, exclude and change.
Parameters
include – fields to include in new model
exclude – fields to exclude from new model, as with values this takes precedence over include
update – values to change/add in the new model. Note: the data is not validated before creating
the new model: you should trust this data
deep – set to True to make a deep copy of the model
Returns
new model instance
abstract delete(key: str) → None[source]¶
Delete entity value from store. | lang/api.python.langchain.com/en/latest/memory/langchain.memory.entity.BaseEntityStore.html |
dict(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) → DictStrAny¶
Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
abstract exists(key: str) → bool[source]¶
Check if entity exists in store.
classmethod from_orm(obj: Any) → Model¶
abstract get(key: str, default: Optional[str] = None) → Optional[str][source]¶
Get entity value from store.
json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) → unicode¶
Generate a JSON representation of the model, include and exclude arguments as per dict().
encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps().
classmethod parse_file(path: Union[str, Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
classmethod parse_obj(obj: Any) → Model¶ | lang/api.python.langchain.com/en/latest/memory/langchain.memory.entity.BaseEntityStore.html |
2137820817f9-2 | classmethod parse_obj(obj: Any) → Model¶
classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') → DictStrAny¶
classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) → unicode¶
abstract set(key: str, value: Optional[str]) → None[source]¶
Set entity value in store.
classmethod update_forward_refs(**localns: Any) → None¶
Try to update ForwardRefs on fields based on this Model, globalns and localns.
classmethod validate(value: Any) → Model¶ | lang/api.python.langchain.com/en/latest/memory/langchain.memory.entity.BaseEntityStore.html |
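A concrete store only has to implement get/set/delete/exists/clear. A minimal in-memory implementation of the same interface, written against abc.ABC rather than the pydantic base so it runs standalone (InMemoryEntityStore is a hypothetical name, not a langchain class):

```python
from abc import ABC, abstractmethod
from typing import Dict, Optional

class EntityStoreInterface(ABC):
    """Stand-in for BaseEntityStore's abstract surface."""
    @abstractmethod
    def get(self, key: str, default: Optional[str] = None) -> Optional[str]: ...
    @abstractmethod
    def set(self, key: str, value: Optional[str]) -> None: ...
    @abstractmethod
    def delete(self, key: str) -> None: ...
    @abstractmethod
    def exists(self, key: str) -> bool: ...
    @abstractmethod
    def clear(self) -> None: ...

class InMemoryEntityStore(EntityStoreInterface):
    def __init__(self) -> None:
        self.store: Dict[str, Optional[str]] = {}

    def get(self, key, default=None):
        return self.store.get(key, default)

    def set(self, key, value):
        self.store[key] = value

    def delete(self, key):
        self.store.pop(key, None)

    def exists(self, key):
        return key in self.store

    def clear(self):
        self.store.clear()

store = InMemoryEntityStore()
store.set("Alice", "Alice is an engineer.")
```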
langchain.memory.entity.SQLiteEntityStore¶
class langchain.memory.entity.SQLiteEntityStore[source]¶
Bases: BaseEntityStore
SQLite-backed Entity store
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param session_id: str = 'default'¶
param table_name: str = 'memory_store'¶
clear() → None[source]¶
Delete all entities from store.
classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) → Model¶
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data.
Default values are respected, but no other validation is performed.
Behaves as if Config.extra = ‘allow’ was set since it adds all passed values
copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) → Model¶
Duplicate a model, optionally choose which fields to include, exclude and change.
Parameters
include – fields to include in new model
exclude – fields to exclude from new model, as with values this takes precedence over include
update – values to change/add in the new model. Note: the data is not validated before creating
the new model: you should trust this data
deep – set to True to make a deep copy of the model
Returns
new model instance
delete(key: str) → None[source]¶
Delete entity value from store. | lang/api.python.langchain.com/en/latest/memory/langchain.memory.entity.SQLiteEntityStore.html |
dict(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) → DictStrAny¶
Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
exists(key: str) → bool[source]¶
Check if entity exists in store.
classmethod from_orm(obj: Any) → Model¶
get(key: str, default: Optional[str] = None) → Optional[str][source]¶
Get entity value from store.
json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) → unicode¶
Generate a JSON representation of the model, include and exclude arguments as per dict().
encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps().
classmethod parse_file(path: Union[str, Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
classmethod parse_obj(obj: Any) → Model¶ | lang/api.python.langchain.com/en/latest/memory/langchain.memory.entity.SQLiteEntityStore.html |
f3fc0ffa1d98-2 | classmethod parse_obj(obj: Any) → Model¶
classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') → DictStrAny¶
classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) → unicode¶
set(key: str, value: Optional[str]) → None[source]¶
Set entity value in store.
classmethod update_forward_refs(**localns: Any) → None¶
Try to update ForwardRefs on fields based on this Model, globalns and localns.
classmethod validate(value: Any) → Model¶
property full_table_name: str¶
Examples using SQLiteEntityStore¶
Entity Memory with SQLite storage | lang/api.python.langchain.com/en/latest/memory/langchain.memory.entity.SQLiteEntityStore.html |
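Under the hood the store keeps one key/value table per session (full_table_name combines table_name with session_id). The storage pattern can be reproduced with the stdlib sqlite3 module; the table name and schema below are illustrative, not the class's exact DDL:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Illustrative: table_name + "_" + session_id, e.g. "memory_store_default".
full_table_name = "memory_store_default"
conn.execute(
    f"CREATE TABLE IF NOT EXISTS {full_table_name} (key TEXT PRIMARY KEY, value TEXT)"
)

def set_entity(key: str, value: str) -> None:
    # Upsert: replace the row if the key already exists.
    conn.execute(
        f"INSERT OR REPLACE INTO {full_table_name} (key, value) VALUES (?, ?)",
        (key, value),
    )

def get_entity(key: str, default=None):
    row = conn.execute(
        f"SELECT value FROM {full_table_name} WHERE key = ?", (key,)
    ).fetchone()
    return row[0] if row else default

set_entity("Alice", "Alice is an engineer.")
set_entity("Alice", "Alice is a manager.")  # overwrites the previous value
```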
langchain.memory.chat_message_histories.cassandra.CassandraChatMessageHistory¶
class langchain.memory.chat_message_histories.cassandra.CassandraChatMessageHistory(session_id: str, session: Session, keyspace: str, table_name: str = 'message_store', ttl_seconds: Optional[int] = None)[source]¶
Chat message history that stores history in Cassandra.
Parameters
session_id – arbitrary key that is used to store the messages
of a single chat session.
session – a Cassandra Session object (an open DB connection)
keyspace – name of the keyspace to use.
table_name – name of the table to use.
ttl_seconds – time-to-live (seconds) for automatic expiration
of stored entries. None (default) for no expiration.
Attributes
messages
Retrieve all session messages from DB
Methods
__init__(session_id, session, keyspace[, ...])
add_ai_message(message)
Convenience method for adding an AI message string to the store.
add_message(message)
Write a message to the table
add_user_message(message)
Convenience method for adding a human message string to the store.
clear()
Clear session memory from DB
__init__(session_id: str, session: Session, keyspace: str, table_name: str = 'message_store', ttl_seconds: Optional[int] = None) → None[source]¶
add_ai_message(message: str) → None¶
Convenience method for adding an AI message string to the store.
Parameters
message – The string contents of an AI message.
add_message(message: BaseMessage) → None[source]¶
Write a message to the table
add_user_message(message: str) → None¶
Convenience method for adding a human message string to the store.
Parameters
message – The string contents of a human message.
clear() → None[source]¶ | lang/api.python.langchain.com/en/latest/memory/langchain.memory.chat_message_histories.cassandra.CassandraChatMessageHistory.html |
Clear session memory from DB
Examples using CassandraChatMessageHistory¶
Cassandra Chat Message History
Cassandra | lang/api.python.langchain.com/en/latest/memory/langchain.memory.chat_message_histories.cassandra.CassandraChatMessageHistory.html |
langchain.memory.vectorstore.VectorStoreRetrieverMemory¶
class langchain.memory.vectorstore.VectorStoreRetrieverMemory[source]¶
Bases: BaseMemory
VectorStoreRetriever-backed memory.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param exclude_input_keys: Sequence[str] [Optional]¶
Input keys to exclude in addition to memory key when constructing the document
param input_key: Optional[str] = None¶
Key name to index the inputs to load_memory_variables.
param memory_key: str = 'history'¶
Key name to locate the memories in the result of load_memory_variables.
param retriever: langchain.schema.vectorstore.VectorStoreRetriever [Required]¶
VectorStoreRetriever object to connect to.
param return_docs: bool = False¶
Whether or not to return the result of querying the database directly.
clear() → None[source]¶
Nothing to clear.
classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) → Model¶
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data.
Default values are respected, but no other validation is performed.
Behaves as if Config.extra = ‘allow’ was set since it adds all passed values
copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) → Model¶
Duplicate a model, optionally choose which fields to include, exclude and change.
Parameters
include – fields to include in new model
exclude – fields to exclude from new model, as with values this takes precedence over include | lang/api.python.langchain.com/en/latest/memory/langchain.memory.vectorstore.VectorStoreRetrieverMemory.html |
b0ddb716458f-1 | exclude – fields to exclude from new model, as with values this takes precedence over include
update – values to change/add in the new model. Note: the data is not validated before creating
the new model: you should trust this data
deep – set to True to make a deep copy of the model
Returns
new model instance
dict(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) → DictStrAny¶
Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
classmethod from_orm(obj: Any) → Model¶
classmethod get_lc_namespace() → List[str]¶
Get the namespace of the langchain object.
For example, if the class is langchain.llms.openai.OpenAI, then the
namespace is [“langchain”, “llms”, “openai”]
classmethod is_lc_serializable() → bool¶
Is this class serializable?
json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) → unicode¶
Generate a JSON representation of the model, include and exclude arguments as per dict(). | lang/api.python.langchain.com/en/latest/memory/langchain.memory.vectorstore.VectorStoreRetrieverMemory.html |
b0ddb716458f-2 | Generate a JSON representation of the model, include and exclude arguments as per dict().
encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps().
classmethod lc_id() → List[str]¶
A unique identifier for this class for serialization purposes.
The unique identifier is a list of strings that describes the path
to the object.
load_memory_variables(inputs: Dict[str, Any]) → Dict[str, Union[List[Document], str]][source]¶
Return history buffer.
classmethod parse_file(path: Union[str, Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
classmethod parse_obj(obj: Any) → Model¶
classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
save_context(inputs: Dict[str, Any], outputs: Dict[str, str]) → None[source]¶
Save context from this conversation to buffer.
classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') → DictStrAny¶
classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) → unicode¶
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
classmethod update_forward_refs(**localns: Any) → None¶
Try to update ForwardRefs on fields based on this Model, globalns and localns.
classmethod validate(value: Any) → Model¶
property lc_attributes: Dict¶
List of attribute names that should be included in the serialized kwargs.
These attributes must be accepted by the constructor. | lang/api.python.langchain.com/en/latest/memory/langchain.memory.vectorstore.VectorStoreRetrieverMemory.html |
b0ddb716458f-3 | These attributes must be accepted by the constructor.
property lc_secrets: Dict[str, str]¶
A map of constructor argument names to secret ids.
For example,{“openai_api_key”: “OPENAI_API_KEY”}
property memory_variables: List[str]¶
The list of keys emitted from the load_memory_variables method.
Examples using VectorStoreRetrieverMemory¶
Zep | lang/api.python.langchain.com/en/latest/memory/langchain.memory.vectorstore.VectorStoreRetrieverMemory.html |
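On save_context, the memory flattens the input/output key–value pairs (minus the memory key and any exclude_input_keys) into one text blob that is written to the vector store as a Document; load_memory_variables later retrieves the most relevant blobs for the current input. The document-construction step can be sketched without a vector store — the exact rendering below is a plausible one, not guaranteed byte-for-byte:

```python
from typing import Any, Dict, Sequence

def form_document_text(
    inputs: Dict[str, Any],
    outputs: Dict[str, str],
    memory_key: str = "history",
    exclude_input_keys: Sequence[str] = (),
) -> str:
    # Drop the memory key and any explicitly excluded input keys,
    # then flatten the remaining pairs into "key: value" lines.
    excluded = set(exclude_input_keys) | {memory_key}
    filtered = {k: v for k, v in inputs.items() if k not in excluded}
    texts = [f"{k}: {v}" for k, v in {**filtered, **outputs}.items()]
    return "\n".join(texts)

page_content = form_document_text(
    {"input": "My favorite food is pizza", "history": "..."},
    {"response": "That's good to know"},
)
```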
langchain.memory.zep_memory.ZepMemory¶
class langchain.memory.zep_memory.ZepMemory[source]¶
Bases: ConversationBufferMemory
Persist your chain history to the Zep MemoryStore.
The number of messages returned by Zep and when the Zep server summarizes chat
histories is configurable. See the Zep documentation for more details.
Documentation: https://docs.getzep.com
Example
memory = ZepMemory(
    session_id=session_id,   # Identifies your user or a user's session
    url=ZEP_API_URL,         # Your Zep server's URL
    api_key=<your_api_key>,  # Optional
    memory_key="history",    # Ensure this matches the key used in
                             # chain's prompt template
    return_messages=True,    # Does your prompt template expect a string
                             # or a list of Messages?
)
chain = LLMChain(memory=memory, ...)  # Configure your chain to use the ZepMemory instance
Note
To persist metadata alongside your chat history, you will need to create a
custom Chain class that overrides the prep_outputs method to include the metadata
in the call to self.memory.save_context.
Zep is an open source platform for productionizing LLM apps. Go from a prototype
built in LangChain or LlamaIndex, or a custom app, to production in minutes without
rewriting code.
For server installation instructions and more, see:
https://docs.getzep.com/deployment/quickstart/
For more information on the zep-python package, see:
https://github.com/getzep/zep-python
Initialize ZepMemory.
Parameters
session_id (str) – Identifies your user or a user’s session
url (str, optional) – Your Zep server’s URL. Defaults to
“http://localhost:8000”. | lang/api.python.langchain.com/en/latest/memory/langchain.memory.zep_memory.ZepMemory.html |
api_key (Optional[str], optional) – Your Zep API key. Defaults to None.
output_key (Optional[str], optional) – The key to use for the output message.
Defaults to None.
input_key (Optional[str], optional) – The key to use for the input message.
Defaults to None.
return_messages (bool, optional) – Does your prompt template expect a string
or a list of Messages? Defaults to False
i.e. return a string.
human_prefix (str, optional) – The prefix to use for human messages.
Defaults to “Human”.
ai_prefix (str, optional) – The prefix to use for AI messages.
Defaults to “AI”.
memory_key (str, optional) – The key to use for the memory.
Defaults to “history”.
Ensure that this matches the key used in
chain’s prompt template.
param ai_prefix: str = 'AI'¶
param chat_memory: ZepChatMessageHistory [Required]¶
param human_prefix: str = 'Human'¶
param input_key: Optional[str] = None¶
param output_key: Optional[str] = None¶
param return_messages: bool = False¶
clear() → None¶
Clear memory contents.
classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) → Model¶
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data.
Default values are respected, but no other validation is performed.
Behaves as if Config.extra = ‘allow’ was set since it adds all passed values | lang/api.python.langchain.com/en/latest/memory/langchain.memory.zep_memory.ZepMemory.html |
be7214fb8f60-2 | Behaves as if Config.extra = ‘allow’ was set since it adds all passed values
copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) → Model¶
Duplicate a model, optionally choose which fields to include, exclude and change.
Parameters
include – fields to include in new model
exclude – fields to exclude from new model, as with values this takes precedence over include
update – values to change/add in the new model. Note: the data is not validated before creating
the new model: you should trust this data
deep – set to True to make a deep copy of the model
Returns
new model instance
dict(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) → DictStrAny¶
Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
classmethod from_orm(obj: Any) → Model¶
classmethod get_lc_namespace() → List[str]¶
Get the namespace of the langchain object.
For example, if the class is langchain.llms.openai.OpenAI, then the
namespace is [“langchain”, “llms”, “openai”]
classmethod is_lc_serializable() → bool¶
Is this class serializable? | lang/api.python.langchain.com/en/latest/memory/langchain.memory.zep_memory.ZepMemory.html |
be7214fb8f60-3 | classmethod is_lc_serializable() → bool¶
Is this class serializable?
json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) → unicode¶
Generate a JSON representation of the model, include and exclude arguments as per dict().
encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps().
classmethod lc_id() → List[str]¶
A unique identifier for this class for serialization purposes.
The unique identifier is a list of strings that describes the path
to the object.
load_memory_variables(inputs: Dict[str, Any]) → Dict[str, Any]¶
Return history buffer.
classmethod parse_file(path: Union[str, Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
classmethod parse_obj(obj: Any) → Model¶
classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
save_context(inputs: Dict[str, Any], outputs: Dict[str, str], metadata: Optional[Dict[str, Any]] = None) → None[source]¶
Save context from this conversation to buffer.
Parameters
inputs (Dict[str, Any]) – The inputs to the chain.
outputs (Dict[str, str]) – The outputs from the chain. | lang/api.python.langchain.com/en/latest/memory/langchain.memory.zep_memory.ZepMemory.html |
be7214fb8f60-4 | outputs (Dict[str, str]) – The outputs from the chain.
metadata (Optional[Dict[str, Any]], optional) – Any metadata to save with
the context. Defaults to None
Returns
None
classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') → DictStrAny¶
classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) → unicode¶
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
classmethod update_forward_refs(**localns: Any) → None¶
Try to update ForwardRefs on fields based on this Model, globalns and localns.
classmethod validate(value: Any) → Model¶
property buffer: Any¶
String buffer of memory.
property buffer_as_messages: List[langchain.schema.messages.BaseMessage]¶
Exposes the buffer as a list of messages in case return_messages is False.
property buffer_as_str: str¶
Exposes the buffer as a string in case return_messages is True.
property lc_attributes: Dict¶
List of attribute names that should be included in the serialized kwargs.
These attributes must be accepted by the constructor.
property lc_secrets: Dict[str, str]¶
A map of constructor argument names to secret ids.
For example,{“openai_api_key”: “OPENAI_API_KEY”}
Examples using ZepMemory¶
Zep
Zep Memory | lang/api.python.langchain.com/en/latest/memory/langchain.memory.zep_memory.ZepMemory.html |
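The Note above about persisting metadata can be illustrated with a toy chain: override the output-preparation step so that the memory's save_context receives the extra metadata argument ZepMemory accepts. Everything below is a stdlib stand-in (ToyMemory and ToyChain are hypothetical classes, not langchain APIs):

```python
from typing import Any, Dict, Optional

class ToyMemory:
    """Stand-in for ZepMemory: save_context takes an optional metadata dict."""
    def __init__(self) -> None:
        self.saved = []

    def save_context(self, inputs, outputs, metadata: Optional[Dict[str, Any]] = None):
        self.saved.append({"inputs": inputs, "outputs": outputs, "metadata": metadata})

class ToyChain:
    def __init__(self, memory: ToyMemory) -> None:
        self.memory = memory

    def prep_outputs(self, inputs, outputs, metadata=None):
        # The override point: forward metadata into the memory's save_context,
        # instead of calling it with inputs/outputs alone.
        self.memory.save_context(inputs, outputs, metadata=metadata)
        return outputs

chain = ToyChain(ToyMemory())
chain.prep_outputs({"input": "hi"}, {"output": "hello"}, metadata={"user_id": "42"})
```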
langchain.memory.chat_memory.BaseChatMemory¶
class langchain.memory.chat_memory.BaseChatMemory[source]¶
Bases: BaseMemory, ABC
Abstract base class for chat memory.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param chat_memory: langchain.schema.chat_history.BaseChatMessageHistory [Optional]¶
param input_key: Optional[str] = None¶
param output_key: Optional[str] = None¶
param return_messages: bool = False¶
clear() → None[source]¶
Clear memory contents.
classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) → Model¶
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data.
Default values are respected, but no other validation is performed.
Behaves as if Config.extra = ‘allow’ was set since it adds all passed values
copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) → Model¶
Duplicate a model, optionally choose which fields to include, exclude and change.
Parameters
include – fields to include in new model
exclude – fields to exclude from new model, as with values this takes precedence over include
update – values to change/add in the new model. Note: the data is not validated before creating
the new model: you should trust this data
deep – set to True to make a deep copy of the model
Returns
new model instance | lang/api.python.langchain.com/en/latest/memory/langchain.memory.chat_memory.BaseChatMemory.html |
dict(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) → DictStrAny¶
Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
classmethod from_orm(obj: Any) → Model¶
classmethod get_lc_namespace() → List[str]¶
Get the namespace of the langchain object.
For example, if the class is langchain.llms.openai.OpenAI, then the
namespace is [“langchain”, “llms”, “openai”]
classmethod is_lc_serializable() → bool¶
Is this class serializable?
json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) → unicode¶
Generate a JSON representation of the model, include and exclude arguments as per dict().
encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps().
classmethod lc_id() → List[str]¶
A unique identifier for this class for serialization purposes.
The unique identifier is a list of strings that describes the path
to the object. | lang/api.python.langchain.com/en/latest/memory/langchain.memory.chat_memory.BaseChatMemory.html |
5c23c5f74ab4-2 | The unique identifier is a list of strings that describes the path
to the object.
abstract load_memory_variables(inputs: Dict[str, Any]) → Dict[str, Any]¶
Return key-value pairs given the text input to the chain.
classmethod parse_file(path: Union[str, Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
classmethod parse_obj(obj: Any) → Model¶
classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
save_context(inputs: Dict[str, Any], outputs: Dict[str, str]) → None[source]¶
Save context from this conversation to buffer.
classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') → DictStrAny¶
classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) → unicode¶
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
classmethod update_forward_refs(**localns: Any) → None¶
Try to update ForwardRefs on fields based on this Model, globalns and localns.
classmethod validate(value: Any) → Model¶
property lc_attributes: Dict¶
List of attribute names that should be included in the serialized kwargs.
These attributes must be accepted by the constructor.
property lc_secrets: Dict[str, str]¶
A map of constructor argument names to secret ids.
For example,{“openai_api_key”: “OPENAI_API_KEY”}
abstract property memory_variables: List[str]¶ | lang/api.python.langchain.com/en/latest/memory/langchain.memory.chat_memory.BaseChatMemory.html |
5c23c5f74ab4-3 | abstract property memory_variables: List[str]¶
The string keys this memory class will add to chain inputs. | lang/api.python.langchain.com/en/latest/memory/langchain.memory.chat_memory.BaseChatMemory.html |
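A concrete subclass only needs a backing chat message history plus load_memory_variables and save_context. A stdlib-only toy with the same shape (ListChatHistory and ToyChatMemory are stand-ins, not the real pydantic classes):

```python
from typing import Any, Dict, List

class ListChatHistory:
    """Stand-in for BaseChatMessageHistory backed by a plain list."""
    def __init__(self) -> None:
        self.messages: List[Dict[str, str]] = []

    def add_user_message(self, text: str) -> None:
        self.messages.append({"role": "human", "content": text})

    def add_ai_message(self, text: str) -> None:
        self.messages.append({"role": "ai", "content": text})

    def clear(self) -> None:
        self.messages.clear()

class ToyChatMemory:
    memory_key = "history"

    def __init__(self) -> None:
        self.chat_memory = ListChatHistory()

    def save_context(self, inputs: Dict[str, Any], outputs: Dict[str, str]) -> None:
        # Mirror BaseChatMemory: record the human input and the AI output.
        self.chat_memory.add_user_message(next(iter(inputs.values())))
        self.chat_memory.add_ai_message(next(iter(outputs.values())))

    def load_memory_variables(self, inputs: Dict[str, Any]) -> Dict[str, str]:
        buffer = "\n".join(
            f"{m['role']}: {m['content']}" for m in self.chat_memory.messages
        )
        return {self.memory_key: buffer}

mem = ToyChatMemory()
mem.save_context({"input": "hi"}, {"output": "hello"})
history = mem.load_memory_variables({})["history"]
```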
langchain.memory.chat_message_histories.rocksetdb.RocksetChatMessageHistory¶
class langchain.memory.chat_message_histories.rocksetdb.RocksetChatMessageHistory(session_id: str, client: ~typing.Any, collection: str, workspace: str = 'commons', messages_key: str = 'messages', sync: bool = False, message_uuid_method: ~typing.Callable[[], ~typing.Union[str, int]] = <function RocksetChatMessageHistory.<lambda>>)[source]¶
Uses Rockset to store chat messages.
To use, ensure that the rockset python package is installed.
Example
from langchain.memory.chat_message_histories import (
RocksetChatMessageHistory
)
from rockset import RocksetClient
history = RocksetChatMessageHistory(
session_id="MySession",
client=RocksetClient(),
collection="langchain_demo",
sync=True
)
history.add_user_message("hi!")
history.add_ai_message("whats up?")
print(history.messages)
Constructs a new RocksetChatMessageHistory.
Parameters
session_id (-) – The ID of the chat session
client (-) – The RocksetClient object to use to query
collection (-) – The name of the collection to use to store chat
messages. If a collection with the given name
does not exist in the workspace, it is created.
workspace (-) – The workspace containing collection. Defaults
to “commons”
messages_key (-) – The DB column containing message history.
Defaults to “messages”
sync (-) – Whether to wait for messages to be added. Defaults
to False. NOTE: setting this to True will slow
down performance.
message_uuid_method (-) – The method that generates message IDs.
If set, all messages will have an id field within the
additional_kwargs property. If this param is not set
and sync is False, message IDs will not be created.
If this param is not set and sync is True, the
uuid.uuid4 method will be used to create message IDs.
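The message_uuid_method parameter accepts any zero-argument callable returning a str or int. A minimal sketch of supplying one (the uuid-based callable here is an illustrative choice, not the class default):

```python
import uuid

# A zero-argument callable returning a string ID, matching the
# message_uuid_method signature above (illustrative choice).
message_uuid_method = lambda: str(uuid.uuid4())

# Each call produces a fresh ID for a stored message.
message_id = message_uuid_method()
```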
Attributes
ADD_TIMEOUT_MS
CREATE_TIMEOUT_MS
SLEEP_INTERVAL_MS
messages
Messages in this chat history.
Methods
__init__(session_id, client, collection[, ...])
Constructs a new RocksetChatMessageHistory.
add_ai_message(message)
Convenience method for adding an AI message string to the store.
add_message(message)
Add a Message object to the history.
add_user_message(message)
Convenience method for adding a human message string to the store.
clear()
Removes all messages from the chat history
__init__(session_id: str, client: ~typing.Any, collection: str, workspace: str = 'commons', messages_key: str = 'messages', sync: bool = False, message_uuid_method: ~typing.Callable[[], ~typing.Union[str, int]] = <function RocksetChatMessageHistory.<lambda>>) → None[source]¶
Constructs a new RocksetChatMessageHistory.
Parameters
session_id (-) – The ID of the chat session
client (-) – The RocksetClient object to use to query
collection (-) – The name of the collection to use to store chat
messages. If a collection with the given name
does not exist in the workspace, it is created.
workspace (-) – The workspace containing collection. Defaults
to “commons”
messages_key (-) – The DB column containing message history.
Defaults to “messages”
sync (-) – Whether to wait for messages to be added. Defaults
to False. NOTE: setting this to True will slow
down performance.
message_uuid_method (-) – The method that generates message IDs.
If set, all messages will have an id field within the
additional_kwargs property. If this param is not set
and sync is False, message IDs will not be created.
If this param is not set and sync is True, the
uuid.uuid4 method will be used to create message IDs.
add_ai_message(message: str) → None¶
Convenience method for adding an AI message string to the store.
Parameters
message – The string contents of an AI message.
add_message(message: BaseMessage) → None[source]¶
Add a Message object to the history.
Parameters
message – A BaseMessage object to store.
add_user_message(message: str) → None¶
Convenience method for adding a human message string to the store.
Parameters
message – The string contents of a human message.
clear() → None[source]¶
Removes all messages from the chat history
Examples using RocksetChatMessageHistory¶
Rockset Chat Message History
langchain.memory.chat_message_histories.singlestoredb.SingleStoreDBChatMessageHistory¶
class langchain.memory.chat_message_histories.singlestoredb.SingleStoreDBChatMessageHistory(session_id: str, *, table_name: str = 'message_store', id_field: str = 'id', session_id_field: str = 'session_id', message_field: str = 'message', pool_size: int = 5, max_overflow: int = 10, timeout: float = 30, **kwargs: Any)[source]¶
Chat message history stored in a SingleStoreDB database.
Initialize with necessary components.
Parameters
table_name (str, optional) – Specifies the name of the table in use.
Defaults to “message_store”.
id_field (str, optional) – Specifies the name of the id field in the table.
Defaults to “id”.
session_id_field (str, optional) – Specifies the name of the session_id
field in the table. Defaults to “session_id”.
message_field (str, optional) – Specifies the name of the message field
in the table. Defaults to “message”.
Following arguments pertain to the connection pool:
pool_size (int, optional) – Determines the number of active connections in
the pool. Defaults to 5.
max_overflow (int, optional) – Determines the maximum number of connections
allowed beyond the pool_size. Defaults to 10.
timeout (float, optional) – Specifies the maximum wait time in seconds for
establishing a connection. Defaults to 30.
Following arguments pertain to the database connection:
host (str, optional) – Specifies the hostname, IP address, or URL for the
database connection. The default scheme is “mysql”.
user (str, optional) – Database username.
password (str, optional) – Database password.
port (int, optional) – Database port. Defaults to 3306 for non-HTTP
connections, 80 for HTTP connections, and 443 for HTTPS connections.
database (str, optional) – Database name.
Additional optional arguments provide further customization over the connection:
pure_python (bool, optional) – Toggles the connector mode. If True,
operates in pure Python mode.
local_infile (bool, optional) – Allows local file uploads.
charset (str, optional) – Specifies the character set for string values.
ssl_key (str, optional) – Specifies the path of the file containing the SSL
key.
ssl_cert (str, optional) – Specifies the path of the file containing the SSL
certificate.
ssl_ca (str, optional) – Specifies the path of the file containing the SSL
certificate authority.
ssl_cipher (str, optional) – Sets the SSL cipher list.
ssl_disabled (bool, optional) – Disables SSL usage.
ssl_verify_cert (bool, optional) – Verifies the server’s certificate.
Automatically enabled if ssl_ca is specified.
ssl_verify_identity (bool, optional) – Verifies the server’s identity.
conv (dict[int, Callable], optional) – A dictionary of data conversion
functions.
credential_type (str, optional) – Specifies the type of authentication to
use: auth.PASSWORD, auth.JWT, or auth.BROWSER_SSO.
autocommit (bool, optional) – Enables autocommits.
results_type (str, optional) – Determines the structure of the query results:
tuples, namedtuples, dicts.
results_format (str, optional) – Deprecated. This option has been renamed to
results_type.
Examples
Basic Usage:
from langchain.memory.chat_message_histories import (
SingleStoreDBChatMessageHistory
)
message_history = SingleStoreDBChatMessageHistory(
session_id="my-session",
host="https://user:[email protected]:3306/database"
)
Advanced Usage:
from langchain.memory.chat_message_histories import (
SingleStoreDBChatMessageHistory
)
message_history = SingleStoreDBChatMessageHistory(
session_id="my-session",
host="127.0.0.1",
port=3306,
user="user",
password="password",
database="db",
table_name="my_custom_table",
pool_size=10,
timeout=60,
)
Using environment variables:
from langchain.memory.chat_message_histories import (
SingleStoreDBChatMessageHistory
)
os.environ['SINGLESTOREDB_URL'] = 'me:[email protected]/my_db'
message_history = SingleStoreDBChatMessageHistory("my-session")
Attributes
messages
Retrieve the messages from SingleStoreDB
Methods
__init__(session_id, *[, table_name, ...])
Initialize with necessary components.
add_ai_message(message)
Convenience method for adding an AI message string to the store.
add_message(message)
Append the message to the record in SingleStoreDB
add_user_message(message)
Convenience method for adding a human message string to the store.
clear()
Clear session memory from SingleStoreDB
__init__(session_id: str, *, table_name: str = 'message_store', id_field: str = 'id', session_id_field: str = 'session_id', message_field: str = 'message', pool_size: int = 5, max_overflow: int = 10, timeout: float = 30, **kwargs: Any)[source]¶
Initialize with necessary components.
Parameters
table_name (str, optional) – Specifies the name of the table in use.
Defaults to “message_store”.
id_field (str, optional) – Specifies the name of the id field in the table.
Defaults to “id”.
session_id_field (str, optional) – Specifies the name of the session_id
field in the table. Defaults to “session_id”.
message_field (str, optional) – Specifies the name of the message field
in the table. Defaults to “message”.
Following arguments pertain to the connection pool:
pool_size (int, optional) – Determines the number of active connections in
the pool. Defaults to 5.
max_overflow (int, optional) – Determines the maximum number of connections
allowed beyond the pool_size. Defaults to 10.
timeout (float, optional) – Specifies the maximum wait time in seconds for
establishing a connection. Defaults to 30.
Following arguments pertain to the database connection:
host (str, optional) – Specifies the hostname, IP address, or URL for the
database connection. The default scheme is “mysql”.
user (str, optional) – Database username.
password (str, optional) – Database password.
port (int, optional) – Database port. Defaults to 3306 for non-HTTP
connections, 80 for HTTP connections, and 443 for HTTPS connections.
database (str, optional) – Database name.
Additional optional arguments provide further customization over the connection:
pure_python (bool, optional) – Toggles the connector mode. If True,
operates in pure Python mode.
local_infile (bool, optional) – Allows local file uploads.
charset (str, optional) – Specifies the character set for string values.
ssl_key (str, optional) – Specifies the path of the file containing the SSL
key.
ssl_cert (str, optional) – Specifies the path of the file containing the SSL
certificate.
ssl_ca (str, optional) – Specifies the path of the file containing the SSL
certificate authority.
ssl_cipher (str, optional) – Sets the SSL cipher list.
ssl_disabled (bool, optional) – Disables SSL usage.
ssl_verify_cert (bool, optional) – Verifies the server’s certificate.
Automatically enabled if ssl_ca is specified.
ssl_verify_identity (bool, optional) – Verifies the server’s identity.
conv (dict[int, Callable], optional) – A dictionary of data conversion
functions.
credential_type (str, optional) – Specifies the type of authentication to
use: auth.PASSWORD, auth.JWT, or auth.BROWSER_SSO.
autocommit (bool, optional) – Enables autocommits.
results_type (str, optional) – Determines the structure of the query results:
tuples, namedtuples, dicts.
results_format (str, optional) – Deprecated. This option has been renamed to
results_type.
Examples
Basic Usage:
from langchain.memory.chat_message_histories import (
SingleStoreDBChatMessageHistory
)
message_history = SingleStoreDBChatMessageHistory(
session_id="my-session",
host="https://user:[email protected]:3306/database"
)
Advanced Usage:
from langchain.memory.chat_message_histories import (
SingleStoreDBChatMessageHistory
)
message_history = SingleStoreDBChatMessageHistory(
session_id="my-session",
host="127.0.0.1",
port=3306,
user="user",
password="password",
database="db",
table_name="my_custom_table",
pool_size=10,
timeout=60,
)
Using environment variables:
from langchain.memory.chat_message_histories import (
SingleStoreDBChatMessageHistory
)
os.environ['SINGLESTOREDB_URL'] = 'me:[email protected]/my_db'
message_history = SingleStoreDBChatMessageHistory("my-session")
add_ai_message(message: str) → None¶
Convenience method for adding an AI message string to the store.
Parameters
message – The string contents of an AI message.
add_message(message: BaseMessage) → None[source]¶
Append the message to the record in SingleStoreDB
add_user_message(message: str) → None¶
Convenience method for adding a human message string to the store.
Parameters
message – The string contents of a human message.
clear() → None[source]¶
Clear session memory from SingleStoreDB
langchain.memory.summary.SummarizerMixin¶
class langchain.memory.summary.SummarizerMixin[source]¶
Bases: BaseModel
Mixin for summarizer.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param ai_prefix: str = 'AI'¶
param human_prefix: str = 'Human'¶
param llm: langchain.schema.language_model.BaseLanguageModel [Required]¶
param prompt: langchain.schema.prompt_template.BasePromptTemplate = PromptTemplate(input_variables=['new_lines', 'summary'], template='Progressively summarize the lines of conversation provided, adding onto the previous summary returning a new summary.\n\nEXAMPLE\nCurrent summary:\nThe human asks what the AI thinks of artificial intelligence. The AI thinks artificial intelligence is a force for good.\n\nNew lines of conversation:\nHuman: Why do you think artificial intelligence is a force for good?\nAI: Because artificial intelligence will help humans reach their full potential.\n\nNew summary:\nThe human asks what the AI thinks of artificial intelligence. The AI thinks artificial intelligence is a force for good because it will help humans reach their full potential.\nEND OF EXAMPLE\n\nCurrent summary:\n{summary}\n\nNew lines of conversation:\n{new_lines}\n\nNew summary:')¶
param summary_message_cls: Type[langchain.schema.messages.BaseMessage] = <class 'langchain.schema.messages.SystemMessage'>¶
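To see what the default prompt above renders to, here is a sketch that fills its two input variables (summary, new_lines) with plain str.format; the template is abbreviated to its final section and the example strings are assumptions:

```python
# The tail of the default SummarizerMixin prompt template; the full
# template also includes the worked EXAMPLE shown above.
template = (
    "Current summary:\n{summary}\n\n"
    "New lines of conversation:\n{new_lines}\n\n"
    "New summary:"
)

# Example values (assumptions for illustration).
filled = template.format(
    summary="The human greets the AI.",
    new_lines="Human: How are you?\nAI: Doing well.",
)
print(filled)
```

predict_new_summary performs essentially this substitution before sending the prompt to the configured llm.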
classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) → Model¶
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data.
Default values are respected, but no other validation is performed.
Behaves as if Config.extra = ‘allow’ was set since it adds all passed values
copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) → Model¶
Duplicate a model, optionally choose which fields to include, exclude and change.
Parameters
include – fields to include in new model
exclude – fields to exclude from new model, as with values this takes precedence over include
update – values to change/add in the new model. Note: the data is not validated before creating
the new model: you should trust this data
deep – set to True to make a deep copy of the model
Returns
new model instance
dict(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) → DictStrAny¶
Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
classmethod from_orm(obj: Any) → Model¶
json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) → unicode¶
Generate a JSON representation of the model, include and exclude arguments as per dict().
encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps().
classmethod parse_file(path: Union[str, Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
classmethod parse_obj(obj: Any) → Model¶
classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
predict_new_summary(messages: List[BaseMessage], existing_summary: str) → str[source]¶
classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') → DictStrAny¶
classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) → unicode¶
classmethod update_forward_refs(**localns: Any) → None¶
Try to update ForwardRefs on fields based on this Model, globalns and localns.
classmethod validate(value: Any) → Model¶
langchain.memory.chat_message_histories.streamlit.StreamlitChatMessageHistory¶
class langchain.memory.chat_message_histories.streamlit.StreamlitChatMessageHistory(key: str = 'langchain_messages')[source]¶
Chat message history that stores messages in Streamlit session state.
Parameters
key – The key to use in Streamlit session state for storing messages.
Attributes
messages
Retrieve the current list of messages
Methods
__init__([key])
add_ai_message(message)
Convenience method for adding an AI message string to the store.
add_message(message)
Add a message to the session memory
add_user_message(message)
Convenience method for adding a human message string to the store.
clear()
Clear session memory
__init__(key: str = 'langchain_messages')[source]¶
add_ai_message(message: str) → None¶
Convenience method for adding an AI message string to the store.
Parameters
message – The string contents of an AI message.
add_message(message: BaseMessage) → None[source]¶
Add a message to the session memory
add_user_message(message: str) → None¶
Convenience method for adding a human message string to the store.
Parameters
message – The string contents of a human message.
clear() → None[source]¶
Clear session memory
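StreamlitChatMessageHistory keeps messages in Streamlit session state under the given key. A rough sketch of that storage pattern, with a plain dict standing in for st.session_state (a stand-in for illustration, not the real Streamlit API):

```python
# Plain dict standing in for streamlit's st.session_state.
session_state = {}
key = "langchain_messages"  # the default key documented above

# Initialize the message list on first use, then append messages.
session_state.setdefault(key, [])
session_state[key].append({"type": "human", "content": "hi!"})
session_state[key].append({"type": "ai", "content": "whats up?"})

print(len(session_state[key]))  # 2
```

Because session state is per-browser-session, each Streamlit user sees only their own history.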
Examples using StreamlitChatMessageHistory¶
Streamlit Chat Message History
langchain.memory.chat_message_histories.redis.RedisChatMessageHistory¶
class langchain.memory.chat_message_histories.redis.RedisChatMessageHistory(session_id: str, url: str = 'redis://localhost:6379/0', key_prefix: str = 'message_store:', ttl: Optional[int] = None)[source]¶
Chat message history stored in a Redis database.
Attributes
key
Construct the record key to use
messages
Retrieve the messages from Redis
Methods
__init__(session_id[, url, key_prefix, ttl])
add_ai_message(message)
Convenience method for adding an AI message string to the store.
add_message(message)
Append the message to the record in Redis
add_user_message(message)
Convenience method for adding a human message string to the store.
clear()
Clear session memory from Redis
__init__(session_id: str, url: str = 'redis://localhost:6379/0', key_prefix: str = 'message_store:', ttl: Optional[int] = None)[source]¶
add_ai_message(message: str) → None¶
Convenience method for adding an AI message string to the store.
Parameters
message – The string contents of an AI message.
add_message(message: BaseMessage) → None[source]¶
Append the message to the record in Redis
add_user_message(message: str) → None¶
Convenience method for adding a human message string to the store.
Parameters
message – The string contents of a human message.
clear() → None[source]¶
Clear session memory from Redis
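The key attribute above is built from key_prefix and session_id; a sketch of that composition, consistent with the defaults documented here (the exact implementation may differ):

```python
key_prefix = "message_store:"  # default key_prefix
session_id = "my-session"

# Record key as presumably composed by the `key` property.
record_key = key_prefix + session_id
print(record_key)  # message_store:my-session
```

Setting ttl makes Redis expire this record (and with it the whole session history) after the given number of seconds.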
Examples using RedisChatMessageHistory¶
Redis Chat Message History
Message Memory in Agent backed by a database
langchain.memory.chat_message_histories.mongodb.MongoDBChatMessageHistory¶
class langchain.memory.chat_message_histories.mongodb.MongoDBChatMessageHistory(connection_string: str, session_id: str, database_name: str = 'chat_history', collection_name: str = 'message_store')[source]¶
Chat message history that stores history in MongoDB.
Parameters
connection_string – connection string to connect to MongoDB
session_id – arbitrary key that is used to store the messages
of a single chat session.
database_name – name of the database to use
collection_name – name of the collection to use
Attributes
messages
Retrieve the messages from MongoDB
Methods
__init__(connection_string, session_id[, ...])
add_ai_message(message)
Convenience method for adding an AI message string to the store.
add_message(message)
Append the message to the record in MongoDB
add_user_message(message)
Convenience method for adding a human message string to the store.
clear()
Clear session memory from MongoDB
__init__(connection_string: str, session_id: str, database_name: str = 'chat_history', collection_name: str = 'message_store')[source]¶
add_ai_message(message: str) → None¶
Convenience method for adding an AI message string to the store.
Parameters
message – The string contents of an AI message.
add_message(message: BaseMessage) → None[source]¶
Append the message to the record in MongoDB
add_user_message(message: str) → None¶
Convenience method for adding a human message string to the store.
Parameters
message – The string contents of a human message.
clear() → None[source]¶
Clear session memory from MongoDB
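Each session's messages are appended to the MongoDB collection keyed by session_id. A sketch of what one stored document might look like, with the message serialized as JSON (the field names here are assumptions for illustration, not a guaranteed schema):

```python
import json

# Hypothetical shape of a stored chat-message document.
document = {
    "SessionId": "my-session",
    "History": json.dumps({"type": "human", "data": {"content": "hi!"}}),
}

# Reading back: parse the serialized message.
restored = json.loads(document["History"])
print(restored["data"]["content"])  # hi!
```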
Examples using MongoDBChatMessageHistory¶
Mongodb Chat Message History
langchain.memory.chat_message_histories.sql.BaseMessageConverter¶
class langchain.memory.chat_message_histories.sql.BaseMessageConverter[source]¶
The class responsible for converting BaseMessage to your SQLAlchemy model.
Methods
__init__()
from_sql_model(sql_message)
Convert a SQLAlchemy model to a BaseMessage instance.
get_sql_model_class()
Get the SQLAlchemy model class.
to_sql_model(message, session_id)
Convert a BaseMessage instance to a SQLAlchemy model.
__init__()¶
abstract from_sql_model(sql_message: Any) → BaseMessage[source]¶
Convert a SQLAlchemy model to a BaseMessage instance.
abstract get_sql_model_class() → Any[source]¶
Get the SQLAlchemy model class.
abstract to_sql_model(message: BaseMessage, session_id: str) → Any[source]¶
Convert a BaseMessage instance to a SQLAlchemy model.
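The three abstract methods define a round trip between BaseMessage and a SQLAlchemy row. A minimal sketch of that contract, using plain dicts as stand-ins for both the message and the SQLAlchemy model (assumptions for illustration only, not the real langchain or SQLAlchemy classes):

```python
class DictMessageConverter:
    """Stand-in converter: dicts play both BaseMessage and SQL model."""

    def to_sql_model(self, message, session_id):
        # message is assumed to look like {"type": "human", "content": "hi"}
        return {
            "session_id": session_id,
            "type": message["type"],
            "content": message["content"],
        }

    def from_sql_model(self, sql_message):
        return {"type": sql_message["type"], "content": sql_message["content"]}


converter = DictMessageConverter()
row = converter.to_sql_model({"type": "human", "content": "hi"}, "session-1")
message = converter.from_sql_model(row)
print(message["content"])  # hi
```

A real subclass would return a declarative SQLAlchemy model from get_sql_model_class and build instances of it in to_sql_model.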
Examples using BaseMessageConverter¶
SQL Chat Message History
langchain.memory.chat_message_histories.dynamodb.DynamoDBChatMessageHistory¶
class langchain.memory.chat_message_histories.dynamodb.DynamoDBChatMessageHistory(table_name: str, session_id: str, endpoint_url: Optional[str] = None, primary_key_name: str = 'SessionId', key: Optional[Dict[str, str]] = None, boto3_session: Optional[Session] = None, kms_key_id: Optional[str] = None)[source]¶
Chat message history that stores history in AWS DynamoDB.
This class expects that a DynamoDB table exists with name table_name
Parameters
table_name – name of the DynamoDB table
session_id – arbitrary key that is used to store the messages
of a single chat session.
endpoint_url – URL of the AWS endpoint to connect to. This argument
is optional and useful for test purposes, like using Localstack.
If you plan to use AWS cloud service, you normally don’t have to
worry about setting the endpoint_url.
primary_key_name – name of the primary key of the DynamoDB table. This argument
is optional, defaulting to “SessionId”.
key – an optional dictionary with a custom primary and secondary key.
This argument is optional, but useful when using composite dynamodb keys, or
isolating records based off of application details such as a user id.
This may also contain global and local secondary index keys.
kms_key_id – an optional AWS KMS Key ID, AWS KMS Key ARN, or AWS KMS Alias for
client-side encryption
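When a table uses a composite primary key, the key argument replaces the default single-attribute lookup. A sketch of both forms (the UserId attribute name is hypothetical, shown only to illustrate isolating records per user):

```python
# Default key when `key` is not supplied: primary_key_name -> session_id.
default_key = {"SessionId": "my-session"}

# Hypothetical composite key adding a sort/secondary attribute.
composite_key = {"SessionId": "my-session", "UserId": "user-42"}

print(sorted(composite_key))  # ['SessionId', 'UserId']
```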
Attributes
messages
Retrieve the messages from DynamoDB
Methods
__init__(table_name, session_id[, ...])
add_ai_message(message)
Convenience method for adding an AI message string to the store.
add_message(message)
Append the message to the record in DynamoDB
add_user_message(message)
Convenience method for adding a human message string to the store.
clear()
Clear session memory from DynamoDB
__init__(table_name: str, session_id: str, endpoint_url: Optional[str] = None, primary_key_name: str = 'SessionId', key: Optional[Dict[str, str]] = None, boto3_session: Optional[Session] = None, kms_key_id: Optional[str] = None)[source]¶
add_ai_message(message: str) → None¶
Convenience method for adding an AI message string to the store.
Parameters
message – The string contents of an AI message.
add_message(message: BaseMessage) → None[source]¶
Append the message to the record in DynamoDB
add_user_message(message: str) → None¶
Convenience method for adding a human message string to the store.
Parameters
message – The string contents of a human message.
clear() → None[source]¶
Clear session memory from DynamoDB
Examples using DynamoDBChatMessageHistory¶
Dynamodb Chat Message History
langchain.memory.chat_message_histories.neo4j.Neo4jChatMessageHistory¶
class langchain.memory.chat_message_histories.neo4j.Neo4jChatMessageHistory(session_id: Union[str, int], url: Optional[str] = None, username: Optional[str] = None, password: Optional[str] = None, database: str = 'neo4j', node_label: str = 'Session', window: int = 3)[source]¶
Chat message history stored in a Neo4j database.
Attributes
messages
Retrieve the messages from Neo4j
Methods
__init__(session_id[, url, username, ...])
add_ai_message(message)
Convenience method for adding an AI message string to the store.
add_message(message)
Append the message to the record in Neo4j
add_user_message(message)
Convenience method for adding a human message string to the store.
clear()
Clear session memory from Neo4j
__init__(session_id: Union[str, int], url: Optional[str] = None, username: Optional[str] = None, password: Optional[str] = None, database: str = 'neo4j', node_label: str = 'Session', window: int = 3)[source]¶
add_ai_message(message: str) → None¶
Convenience method for adding an AI message string to the store.
Parameters
message – The string contents of an AI message.
add_message(message: BaseMessage) → None[source]¶
Append the message to the record in Neo4j
add_user_message(message: str) → None¶
Convenience method for adding a human message string to the store.
Parameters
message – The string contents of a human message.
clear() → None[source]¶
Clear session memory from Neo4j
langchain.memory.buffer.ConversationBufferMemory¶
class langchain.memory.buffer.ConversationBufferMemory[source]¶
Bases: BaseChatMemory
Buffer for storing conversation memory.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param ai_prefix: str = 'AI'¶
param chat_memory: BaseChatMessageHistory [Optional]¶
param human_prefix: str = 'Human'¶
param input_key: Optional[str] = None¶
param output_key: Optional[str] = None¶
param return_messages: bool = False¶
clear() → None¶
Clear memory contents.
classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) → Model¶
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data.
Default values are respected, but no other validation is performed.
Behaves as if Config.extra = ‘allow’ was set since it adds all passed values
copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) → Model¶
Duplicate a model, optionally choose which fields to include, exclude and change.
Parameters
include – fields to include in new model
exclude – fields to exclude from new model, as with values this takes precedence over include
update – values to change/add in the new model. Note: the data is not validated before creating
the new model: you should trust this data
deep – set to True to make a deep copy of the model
Returns
new model instance
dict(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) → DictStrAny¶
Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
classmethod from_orm(obj: Any) → Model¶
classmethod get_lc_namespace() → List[str]¶
Get the namespace of the langchain object.
For example, if the class is langchain.llms.openai.OpenAI, then the
namespace is [“langchain”, “llms”, “openai”]
classmethod is_lc_serializable() → bool¶
Is this class serializable?
json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) → unicode¶
Generate a JSON representation of the model, include and exclude arguments as per dict().
encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps().
classmethod lc_id() → List[str]¶
A unique identifier for this class for serialization purposes.
The unique identifier is a list of strings that describes the path
to the object.
load_memory_variables(inputs: Dict[str, Any]) → Dict[str, Any][source]¶
Return history buffer.
classmethod parse_file(path: Union[str, Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
classmethod parse_obj(obj: Any) → Model¶
classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
save_context(inputs: Dict[str, Any], outputs: Dict[str, str]) → None¶
Save context from this conversation to buffer.
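The save_context / load_memory_variables contract can be sketched with a minimal in-memory stand-in (a hypothetical class, not the LangChain implementation; it assumes single-key inputs and outputs dicts, as the real memory does by default):

```python
from typing import Any, Dict, List, Tuple

class BufferMemorySketch:
    """Toy stand-in for ConversationBufferMemory's string buffer."""

    def __init__(self, memory_key: str = "history",
                 human_prefix: str = "Human", ai_prefix: str = "AI") -> None:
        self.memory_key = memory_key
        self.human_prefix = human_prefix
        self.ai_prefix = ai_prefix
        self.turns: List[Tuple[str, str]] = []

    def save_context(self, inputs: Dict[str, Any], outputs: Dict[str, str]) -> None:
        # Assumes exactly one key in each dict.
        (human,) = inputs.values()
        (ai,) = outputs.values()
        self.turns.append((human, ai))

    def load_memory_variables(self, inputs: Dict[str, Any]) -> Dict[str, Any]:
        lines = []
        for human, ai in self.turns:
            lines.append(f"{self.human_prefix}: {human}")
            lines.append(f"{self.ai_prefix}: {ai}")
        return {self.memory_key: "\n".join(lines)}

    def clear(self) -> None:
        self.turns = []

memory = BufferMemorySketch()
memory.save_context({"input": "hi"}, {"output": "hello"})
# memory.load_memory_variables({}) → {"history": "Human: hi\nAI: hello"}
```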
classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') → DictStrAny¶
classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) → unicode¶
to_json() → Union[SerializedConstructor, SerializedNotImplemented]¶
to_json_not_implemented() → SerializedNotImplemented¶
classmethod update_forward_refs(**localns: Any) → None¶
Try to update ForwardRefs on fields based on this Model, globalns and localns.
classmethod validate(value: Any) → Model¶
property buffer: Any¶
String buffer of memory.
property buffer_as_messages: List[langchain.schema.messages.BaseMessage]¶
Exposes the buffer as a list of messages in case return_messages is False.
property buffer_as_str: str¶
Exposes the buffer as a string in case return_messages is True.
property lc_attributes: Dict¶
List of attribute names that should be included in the serialized kwargs.
These attributes must be accepted by the constructor.
property lc_secrets: Dict[str, str]¶
A map of constructor argument names to secret ids.
For example, {"openai_api_key": "OPENAI_API_KEY"}
Examples using ConversationBufferMemory¶
Gradio
SceneXplain
Xata chat memory
Streamlit Chat Message History
Dynamodb Chat Message History
Chat Over Documents with Vectara
Bittensor
Bedrock
Set env var OPENAI_API_KEY or load from a .env file:
Structure answers with OpenAI functions
Agents
Agent Debates with Tools
Message Memory in Agent backed by a database
Memory in the Multi-Input Chain
Memory in LLMChain
Multiple Memory classes
Customizing Conversational Memory
Memory in Agent
Shared memory across agents and tools
Add Memory to OpenAI Functions Agent
First we add a step to load memory
Adding memory
langchain.memory.chat_message_histories.elasticsearch.ElasticsearchChatMessageHistory¶
class langchain.memory.chat_message_histories.elasticsearch.ElasticsearchChatMessageHistory(index: str, session_id: str, *, es_connection: Optional[Elasticsearch] = None, es_url: Optional[str] = None, es_cloud_id: Optional[str] = None, es_user: Optional[str] = None, es_api_key: Optional[str] = None, es_password: Optional[str] = None)[source]¶
Chat message history that stores history in Elasticsearch.
Parameters
es_url – URL of the Elasticsearch instance to connect to.
es_cloud_id – Cloud ID of the Elasticsearch instance to connect to.
es_user – Username to use when connecting to Elasticsearch.
es_password – Password to use when connecting to Elasticsearch.
es_api_key – API key to use when connecting to Elasticsearch.
es_connection – Optional pre-existing Elasticsearch connection.
index – Name of the index to use.
session_id – Arbitrary key that is used to store the messages
of a single chat session.
Attributes
messages
Retrieve the messages from Elasticsearch
Methods
__init__(index, session_id, *[, ...])
add_ai_message(message)
Convenience method for adding an AI message string to the store.
add_message(message)
Add a message to the chat session in Elasticsearch
add_user_message(message)
Convenience method for adding a human message string to the store.
clear()
Clear session memory in Elasticsearch
connect_to_elasticsearch(*[, es_url, ...])
get_user_agent()
__init__(index: str, session_id: str, *, es_connection: Optional[Elasticsearch] = None, es_url: Optional[str] = None, es_cloud_id: Optional[str] = None, es_user: Optional[str] = None, es_api_key: Optional[str] = None, es_password: Optional[str] = None)[source]¶
add_ai_message(message: str) → None¶
Convenience method for adding an AI message string to the store.
Parameters
message – The string contents of an AI message.
add_message(message: BaseMessage) → None[source]¶
Add a message to the chat session in Elasticsearch
add_user_message(message: str) → None¶
Convenience method for adding a human message string to the store.
Parameters
message – The string contents of a human message.
clear() → None[source]¶
Clear session memory in Elasticsearch
static connect_to_elasticsearch(*, es_url: Optional[str] = None, cloud_id: Optional[str] = None, api_key: Optional[str] = None, username: Optional[str] = None, password: Optional[str] = None) → Elasticsearch[source]¶
static get_user_agent() → str[source]¶
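The chat-history surface above can be illustrated with a purely in-memory sketch (a hypothetical class; the real implementation instead persists each message as a document in the configured Elasticsearch index, keyed by session_id):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Message:
    role: str  # "human" or "ai"
    content: str

@dataclass
class InMemoryChatHistorySketch:
    session_id: str
    messages: List[Message] = field(default_factory=list)

    def add_user_message(self, message: str) -> None:
        self.add_message(Message("human", message))

    def add_ai_message(self, message: str) -> None:
        self.add_message(Message("ai", message))

    def add_message(self, message: Message) -> None:
        # The Elasticsearch version indexes a document here instead.
        self.messages.append(message)

    def clear(self) -> None:
        self.messages = []

history = InMemoryChatHistorySketch(session_id="demo")
history.add_user_message("hi")
history.add_ai_message("hello")
# [m.role for m in history.messages] → ["human", "ai"]
```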
langchain.schema.runnable.utils.IsLocalDict¶
class langchain.schema.runnable.utils.IsLocalDict(name: str, keys: Set[str])[source]¶
Check if a name is a local dict.
Methods
__init__(name, keys)
generic_visit(node)
Called if no explicit visitor function exists for a node.
visit(node)
Visit a node.
visit_Call(node)
visit_Constant(node)
visit_Subscript(node)
__init__(name: str, keys: Set[str]) → None[source]¶
generic_visit(node)¶
Called if no explicit visitor function exists for a node.
visit(node)¶
Visit a node.
visit_Call(node: Call) → Any[source]¶
visit_Constant(node)¶
visit_Subscript(node: Subscript) → Any[source]¶
langchain.schema.runnable.utils.get_function_first_arg_dict_keys¶
langchain.schema.runnable.utils.get_function_first_arg_dict_keys(func: Callable) → Optional[List[str]][source]¶
Get the keys of the first argument of a function if it is a dict.
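A simplified sketch of the idea: walk the AST with a NodeVisitor, as IsLocalDict above does, and collect string subscripts on the first argument. Here the function's source is passed in directly (the real helper obtains it by introspection); the names are illustrative:

```python
import ast
import textwrap
from typing import List, Optional

def dict_keys_of_first_arg(source: str) -> Optional[List[str]]:
    tree = ast.parse(textwrap.dedent(source))
    func = tree.body[0]
    if not isinstance(func, (ast.FunctionDef, ast.AsyncFunctionDef)) or not func.args.args:
        return None
    first_arg = func.args.args[0].arg
    keys: List[str] = []

    class SubscriptCollector(ast.NodeVisitor):
        def visit_Subscript(self, node: ast.Subscript) -> None:
            # Record x["key"] where x is the first parameter
            # and "key" is a string literal (Python 3.9+ slice layout).
            if (isinstance(node.value, ast.Name) and node.value.id == first_arg
                    and isinstance(node.slice, ast.Constant)
                    and isinstance(node.slice.value, str)):
                keys.append(node.slice.value)
            self.generic_visit(node)

    SubscriptCollector().visit(func)
    return keys or None

src = """
def combine(payload, extra=None):
    return payload["question"] + payload["context"]
"""
# dict_keys_of_first_arg(src) → ["question", "context"]
```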
langchain.schema.runnable.utils.gated_coro¶
async langchain.schema.runnable.utils.gated_coro(semaphore: Semaphore, coro: Coroutine) → Any[source]¶
Run a coroutine with a semaphore.
Parameters
semaphore – The semaphore to use.
coro – The coroutine to run.
Returns
The result of the coroutine.
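The pattern can be sketched as a stand-alone re-implementation (illustrative names, not the library code): the semaphore caps how many coroutines run concurrently when combined with asyncio.gather.

```python
import asyncio
from typing import Any, Coroutine

async def gated(semaphore: asyncio.Semaphore, coro: Coroutine) -> Any:
    # Acquire the semaphore before awaiting, release it afterwards.
    async with semaphore:
        return await coro

async def main() -> list:
    async def work(i: int) -> int:
        await asyncio.sleep(0)  # stand-in for real IO
        return i * i

    sem = asyncio.Semaphore(2)  # at most two coroutines in flight at once
    # gather preserves input order regardless of completion order
    return await asyncio.gather(*(gated(sem, work(i)) for i in range(5)))

# asyncio.run(main()) → [0, 1, 4, 9, 16]
```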
langchain.schema.runnable.utils.indent_lines_after_first¶
langchain.schema.runnable.utils.indent_lines_after_first(text: str, prefix: str) → str[source]¶
Indent all lines of text after the first line.
Parameters
text – The text to indent
prefix – Used to determine the number of spaces to indent
Returns
The indented text
Return type
str
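A plausible sketch of the behavior (an illustrative re-implementation, not the library source): the first line is left untouched and every later line is padded with spaces matching the prefix's width.

```python
def indent_after_first(text: str, prefix: str) -> str:
    pad = " " * len(prefix)  # indent width is derived from the prefix
    first, *rest = text.split("\n")
    return "\n".join([first] + [pad + line for line in rest])

# indent_after_first("a\nb\nc", ">> ") → "a\n   b\n   c"
```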
langchain.schema.runnable.base.RunnableEachBase¶
class langchain.schema.runnable.base.RunnableEachBase[source]¶
Bases: RunnableSerializable[List[Input], List[Output]]
A runnable that delegates calls to another runnable
with each element of the input sequence.
Use only if creating a new RunnableEach subclass with different __init__ args.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
param bound: langchain.schema.runnable.base.Runnable[langchain.schema.runnable.utils.Input, langchain.schema.runnable.utils.Output] [Required]¶
async abatch(inputs: List[Input], config: Optional[Union[RunnableConfig, List[RunnableConfig]]] = None, *, return_exceptions: bool = False, **kwargs: Optional[Any]) → List[Output]¶
Default implementation runs ainvoke in parallel using asyncio.gather.
The default implementation of batch works well for IO bound runnables.
Subclasses should override this method if they can batch more efficiently;
e.g., if the underlying runnable uses an API which supports a batch mode.
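The default behavior described above can be sketched as follows (a simplified stand-in that ignores config handling and the return_exceptions flag; names are illustrative):

```python
import asyncio
from typing import Awaitable, Callable, List, TypeVar

In = TypeVar("In")
Out = TypeVar("Out")

async def default_abatch(
    inputs: List[In],
    ainvoke: Callable[[In], Awaitable[Out]],
) -> List[Out]:
    # Fan out one ainvoke call per input and await them all in parallel.
    return await asyncio.gather(*(ainvoke(x) for x in inputs))

async def double(x: int) -> int:
    return x * 2

# asyncio.run(default_abatch([1, 2, 3], double)) → [2, 4, 6]
```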
async ainvoke(input: List[Input], config: Optional[RunnableConfig] = None, **kwargs: Any) → List[Output][source]¶
Default implementation of ainvoke, which calls invoke from a thread.
The default implementation allows usage of async code even if
the runnable did not implement a native async version of invoke.
Subclasses should override this method if they can run asynchronously.
async astream(input: Input, config: Optional[RunnableConfig] = None, **kwargs: Optional[Any]) → AsyncIterator[Output]¶
Default implementation of astream, which calls ainvoke.
Subclasses should override this method if they support streaming output.