tool

Large language model tool.

Classes

class LLMTool(device='?', overwrite=False, verbose=True)[source]

Bases: TextTool

Parameters:
  • device (str | None)

  • overwrite (bool | None)

  • verbose (bool | int | None)

chain_from_yaml(llm, yaml_path)[source]

Return a chain from a YAML file.

Parameters:
  • llm (BaseChatModel) – A large language model.

  • yaml_path (Union[str, Path]) – Path to the .yaml config file.

Return type:

RunnableSerializable

Returns:

A chain.

chains_from_yaml(llm, yaml_path)[source]

Return a dictionary of chains from a YAML file.

Parameters:
  • llm (BaseChatModel) – A large language model.

  • yaml_path (Union[str, Path]) – Path to the .yaml config file.

Return type:

dict[str, RunnableSerializable]

Returns:

A dictionary of strings mapped to chains.
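
Since the YAML schema itself is not documented in this section, the per-key behavior can only be sketched starting from an already-loaded config dict. The layout below (one prompt string per top-level key) and the `echo_llm` stand-in are hypothetical assumptions, not the library's actual schema:

```python
# Hypothetical config as it might look after parsing the YAML file;
# the real schema may differ.
config = {
    "sentiment": "Classify the sentiment: {text}",
    "topic": "Name the topic of: {text}",
}

def chains_from_config(llm, config):
    """Sketch: build one chain per top-level key, as chains_from_yaml does."""
    chains = {}
    for name, prompt in config.items():
        # Each chain is a plain closure standing in for prompt | llm | parser.
        chains[name] = (lambda p: lambda inputs: llm(p.format(**inputs)))(prompt)
    return chains

echo_llm = lambda rendered: f"LLM saw: {rendered}"  # stand-in model
chains = chains_from_config(echo_llm, config)
print(chains["sentiment"]({"text": "great"}))
# prints "LLM saw: Classify the sentiment: great"
```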

default_parser(generation, data, start_after=None, regex=None, to_lower=False, expect=None, on_failure=None)[source]

Parse a message, optionally starting after start_after, and check whether the result is one of the labels in expect.

Parameters:
  • generation (AIMessage) – Message from an LLM.

  • data (dict) – Additional data from the chain.

  • start_after (Optional[str]) – If not None, keeps only the text after the last occurrence of start_after.

  • regex (Optional[str]) – If not None, performs a regex search on the output (applied after start_after).

  • to_lower (Optional[bool]) – If True, converts the output to lowercase (applied after regex).

  • expect (Optional[list[str]]) – If not None, prints a failure message when the final output is not one of the expected labels.

  • on_failure (Optional[str]) – Value returned on failure, i.e. when start_after is missing, the regex does not match, or the output is not one of the expected labels.

Return type:

str

Returns:

The parsed message.
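
The parsing pipeline above can be sketched in plain Python. This is an illustrative re-implementation of the documented steps on a bare string (the real method receives an AIMessage), not the library's code:

```python
import re

def parse_output(text, start_after=None, regex=None, to_lower=False,
                 expect=None, on_failure=None):
    """Sketch of the default_parser pipeline on a plain string."""
    out = text
    # 1. Keep only what follows the last occurrence of start_after.
    if start_after is not None:
        idx = out.rfind(start_after)
        if idx == -1:
            return on_failure
        out = out[idx + len(start_after):]
    # 2. Apply a regex search to the remaining text.
    if regex is not None:
        match = re.search(regex, out)
        if match is None:
            return on_failure
        out = match.group(0)
    # 3. Normalize case, then trim whitespace.
    if to_lower:
        out = out.lower()
    out = out.strip()
    # 4. Check against the expected labels.
    if expect is not None and out not in expect:
        return on_failure
    return out

print(parse_output("Reasoning... Answer: Positive",
                   start_after="Answer:", to_lower=True,
                   expect=["positive", "negative"], on_failure="unknown"))
# prints "positive"
```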

instantiate_llm(provider, **provider_kwargs)[source]

Return a large language model from a provider name and keyword arguments.

Parameters:
  • provider (str) – Name of the LLM provider.

  • provider_kwargs – Keyword arguments for instantiating the LLM.

Return type:

BaseChatModel

Returns:

A large language model.
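
A common way to implement this kind of factory is a provider registry; the sketch below assumes a hypothetical "dummy" provider and DummyChatModel class purely for illustration (the real method returns a LangChain BaseChatModel):

```python
class DummyChatModel:
    """Toy stand-in for a provider's chat model class."""
    def __init__(self, model="default", temperature=0.0):
        self.model = model
        self.temperature = temperature

# Hypothetical registry mapping provider names to constructors.
PROVIDERS = {
    "dummy": DummyChatModel,
}

def instantiate_llm(provider, **provider_kwargs):
    # Look up the provider and forward the keyword arguments.
    if provider not in PROVIDERS:
        raise ValueError(f"Unknown provider: {provider!r}")
    return PROVIDERS[provider](**provider_kwargs)

llm = instantiate_llm("dummy", model="tiny", temperature=0.2)
print(llm.model)  # prints "tiny"
```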

instantiate_parser(kind='default', **parser_kwargs)[source]

Return a parser of the given kind, configured with keyword arguments.

Parameters:
  • kind (str) – Name of the parser kind.

  • parser_kwargs – Keyword arguments for the parser.

Return type:

Callable

Returns:

A parser configured with the given keyword arguments.
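
One plausible implementation is binding the keyword arguments up front with functools.partial, so the chain can later call the parser with just the message. The toy default_parser below is a stand-in for illustration, not the library's parser:

```python
from functools import partial

def default_parser(text, start_after=None, to_lower=False):
    """Toy stand-in for the library's default parser."""
    if start_after is not None:
        text = text.split(start_after)[-1]
    if to_lower:
        text = text.lower()
    return text.strip()

# Registry of parser kinds, keyed by name.
PARSERS = {"default": default_parser}

def instantiate_parser(kind="default", **parser_kwargs):
    # Bind the keyword arguments now; the chain calls the result later.
    return partial(PARSERS[kind], **parser_kwargs)

parser = instantiate_parser("default", start_after="Answer:", to_lower=True)
print(parser("Answer: Positive"))  # prints "positive"
```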

static load_template(prompt)[source]

Return a chat prompt template from a string or a path to a .txt file.

Parameters:

prompt (Union[str, Path]) – String or path to a .txt file.

Return type:

ChatPromptTemplate

Returns:

A chat prompt template.
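
The str-or-path dispatch could look like the sketch below. How the real method distinguishes the two cases is an assumption here, and it wraps the text in a LangChain ChatPromptTemplate rather than returning a plain string:

```python
from pathlib import Path

def load_template_text(prompt):
    """Sketch of load_template's str-or-path dispatch, returning raw text."""
    path = Path(prompt)
    # Treat the argument as a file path when it points at an existing .txt file.
    if path.suffix == ".txt" and path.is_file():
        return path.read_text()
    # Otherwise the string itself is the template.
    return str(prompt)

print(load_template_text("Classify: {text}"))  # prints "Classify: {text}"
```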

static make_chain(llm, prompt, parser)[source]

Return a chain composed of the prompt, LLM, and parser.

Parameters:
  • llm (BaseChatModel) – A large language model.

  • prompt (ChatPromptTemplate) – A prompt template.

  • parser (Callable) – A parser in the form of a chain.

Return type:

RunnableSerializable

Returns:

A chain composed of the prompt, LLM, and parser.
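
In LangChain this composition is typically the LCEL pipe prompt | llm | parser. The sketch below mimics that left-to-right composition with plain callables; the three stages are canned stand-ins, not real LangChain objects:

```python
def make_chain(*stages):
    """Compose callables left to right, like the LCEL pipe operator."""
    def chain(value):
        for stage in stages:
            value = stage(value)
        return value
    return chain

# Hypothetical stages standing in for the template, model, and parser.
prompt = lambda inputs: f"Classify the sentiment: {inputs['text']}"
llm = lambda text: "Answer: positive"            # canned model response
parser = lambda msg: msg.split("Answer:")[-1].strip()

chain = make_chain(prompt, llm, parser)
print(chain({"text": "great movie"}))  # prints "positive"
```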