LangChain decorators
is a layer on top of LangChain that provides syntactic sugar 🍭 for writing custom LangChain prompts and chains.
For Feedback, Issues, Contributions - please raise an issue here:
ju-bezdek/langchain-decorators
Main principles and benefits:
- a more Pythonic way of writing code - write multiline prompts that won’t break your code flow with indentation
- making use of IDE built-in support for hinting, type checking, and documentation popups, so you can quickly peek at a function to see its prompt, the parameters it consumes, etc.
- leverage all the power of 🦜🔗 LangChain ecosystem
- adding support for optional parameters
- easily share parameters between the prompts by binding them to one class
Quick start
Installation
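Install from PyPI (assuming the distribution name matches the repository name):

```bash
pip install langchain_decorators
```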
Examples
A good way to start is to review the examples in the repository linked above.
Defining other parameters
Here we are just marking a function as a prompt with the llm_prompt decorator, effectively turning it into an LLMChain, instead of constructing and running the chain by hand.
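As a minimal sketch (the function name, prompt wording and parameters are illustrative; the prompt template lives in the docstring and the decorated function is called like any other Python function):

```python
from langchain_decorators import llm_prompt

@llm_prompt
def write_me_short_post(topic: str, platform: str = "twitter", audience: str = "developers") -> str:
    """
    Write me a short header for my post about {topic} for {platform} platform.
    It should be for {audience} audience.
    (Max 15 words)
    """
    return  # the body stays empty; the decorator builds and runs the chain

# the arguments fill the prompt template; the parsed LLM output is returned
print(write_me_short_post(topic="starwars"))
```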
A standard LLMChain takes many more init parameters than just input_variables and prompt… this implementation detail is hidden in the decorator.
Here is how it works:
- Using Global settings:
- Using predefined prompt types
- Define the settings directly in the decorator (a sketch of all three options follows this list)
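A sketch of the three options; the model choices, prompt texts and class names are illustrative, and the imports assume the LangChain version current at the time of this doc. GlobalSettings, PromptTypes and PromptTypeSettings are the helpers the package exposes for this:

```python
from langchain.chat_models import ChatOpenAI
from langchain_decorators import llm_prompt, GlobalSettings, PromptTypes, PromptTypeSettings

# 1) Global settings: defaults shared by every prompt
GlobalSettings.define_settings(
    default_llm=ChatOpenAI(temperature=0.0),
    default_streaming_llm=ChatOpenAI(temperature=0.0, streaming=True),
)

# 2) Predefined prompt types: a reusable, named bundle of settings
class MyPromptTypes(PromptTypes):
    GPT4 = PromptTypeSettings(llm=ChatOpenAI(model_name="gpt-4"))

@llm_prompt(prompt_type=MyPromptTypes.GPT4)
def write_complicated_code(app_idea: str) -> str:
    """Write Python code that implements: {app_idea}"""
    return

# 3) Settings passed directly to the decorator, for a single prompt
@llm_prompt(llm=ChatOpenAI(temperature=0.7))
def creative_writer(book_title: str) -> str:
    """Write a short, creative blurb for a book called {book_title}"""
    return
```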
Passing a memory and/or callbacks:
To pass any of these, just declare them in the function (or use kwargs to pass anything).
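For example (a sketch, assuming the decorator picks up memory and callbacks parameters declared on the function, as described above; the memory class, memory key and prompt are illustrative):

```python
from langchain.memory import ConversationBufferMemory
from langchain.callbacks import StdOutCallbackHandler
from langchain_decorators import llm_prompt

@llm_prompt
def chat_reply(human_input: str, memory: ConversationBufferMemory = None, callbacks: list = None):
    """
    {history}
    Reply to: {human_input}
    """
    return

# memory_key is assumed to match the {history} placeholder in the prompt
memory = ConversationBufferMemory(memory_key="history")
chat_reply(human_input="Hello!", memory=memory, callbacks=[StdOutCallbackHandler()])
```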
Simplified streaming
If we want to leverage streaming (see the sketch after this list):
- we need to define the prompt as an async function
- turn on streaming in the decorator, or define a PromptType with streaming on
- capture the stream using StreamingContext
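A sketch putting these three steps together (the callback wiring and prompt text are illustrative):

```python
import asyncio
from langchain_decorators import StreamingContext, llm_prompt

# only async prompt functions can be streamed
@llm_prompt(capture_stream=True)
async def write_me_short_post(topic: str, platform: str = "twitter"):
    """
    Write me a short header for my post about {topic} for {platform} platform.
    (Max 15 words)
    """
    return

async def on_token(token: str):
    # in a real application this could push tokens over a websocket
    print(token, end="", flush=True)

async def main():
    # only prompts marked for streaming are captured inside the context
    with StreamingContext(stream_to_stdout=True, callback=on_token):
        await write_me_short_post(topic="old movies")

asyncio.run(main())
```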
Prompt declarations
By default, the prompt is the whole function docstring, unless you mark your prompt explicitly (see below).
Documenting your prompt
We can specify which part of our docs is the prompt definition by marking it as a code block with the <prompt> language tag.
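For example (a sketch; only the marked block is sent to the model, the rest of the docstring stays as developer documentation):

````python
from langchain_decorators import llm_prompt

@llm_prompt
def write_me_short_post(topic: str, platform: str = "twitter"):
    """
    This part of the docstring is documentation for developers
    and is not part of the prompt.

    ```<prompt>
    Write me a short header for my post about {topic} for {platform} platform.
    (Max 15 words)
    ```
    """
    return
````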
Chat message roles (for example a human message) can be declared as well; we are using the real roles that are enforced by the LLM - GPT supports system, assistant, and user.
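A sketch of a chat-style prompt, assuming each message is declared as its own code block with the role appended to the language tag (e.g. <prompt:system>, <prompt:user>); the prompt content is illustrative:

````python
from langchain_decorators import llm_prompt

@llm_prompt
def simulate_conversation(human_input: str, agent_role: str = "a pirate"):
    """
    ```<prompt:system>
    You are {agent_role}. Stay in character.
    ```

    ```<prompt:user>
    {human_input}
    ```
    """
    return
````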
Output parsers
- the llm_prompt decorator natively tries to detect the best output parser based on the declared output type (if not set, it returns the raw string)
- list, dict and pydantic outputs are also supported natively (automatically) - see the sketch below
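A sketch of both cases; the prompts, field descriptions and the {FORMAT_INSTRUCTIONS} placeholder (where the parser injects its formatting instructions) are assumptions for illustration:

```python
from pydantic import BaseModel, Field
from langchain_decorators import llm_prompt

# return type annotation `list` -> a list output parser is picked automatically
@llm_prompt
def suggest_names(company_business: str, count: int) -> list:
    """
    Write me {count} good name suggestions for a company that {company_business}.
    """
    return

# a pydantic model as the return type -> structured (pydantic) output parsing
class Company(BaseModel):
    name: str = Field(description="The name of the company")
    headline: str = Field(description="A one-sentence tagline for the landing page")

@llm_prompt
def fake_company_generator(company_business: str) -> Company:
    """
    Generate a fake company that {company_business}.
    {FORMAT_INSTRUCTIONS}
    """
    return

print(suggest_names(company_business="sells fresh flowers", count=5))
print(fake_company_generator(company_business="sells fresh flowers"))
```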