
In this story we will describe how you can create complex chain workflows using LangChain (v. 0.0.190) with ChatGPT under the hood. This story builds on the ideas of a previous story on Medium.

LangChain has a set of foundational chains (see the short sketch after this list):

  • LLM: a simple chain with a prompt template that can process multiple inputs.
  • RouterChain: a gateway that uses the large language model (LLM) to select the most suitable processing chain.
  • Sequential: a family of chains which process input in a sequential manner. This means that the output of the first node in the chain becomes the input of the second node, the output of the second becomes the input of the third, and so on.
  • Transformation: a type of chain that allows Python function calls for customizable text manipulation.
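
As a minimal sketch of how these building blocks fit together, here is a tiny example in the spirit of the article. The prompts and function names below are our own illustration, not code from the repository; only the model name follows the article, and the router is covered later in the implementation section.

from langchain.chains import LLMChain, SimpleSequentialChain, TransformChain
from langchain.chat_models import ChatOpenAI
from langchain.prompts import PromptTemplate

llm = ChatOpenAI(model='gpt-3.5-turbo-0613', temperature=0)

# LLMChain: renders a prompt template and sends it to the LLM
summarise_chain = LLMChain(
    llm=llm,
    prompt=PromptTemplate(template='Summarise this text:\n{input}', input_variables=['input'])
)

# TransformChain: applies a plain Python function to the input
def to_upper(inputs: dict) -> dict:
    return {'input': inputs['input'].upper()}

upper_chain = TransformChain(
    input_variables=['input'], output_variables=['input'], transform=to_upper
)

# SimpleSequentialChain: the output of one chain becomes the input of the next
pipeline = SimpleSequentialChain(chains=[upper_chain, summarise_chain], verbose=True)
print(pipeline.run('LangChain lets you compose chains of LLM calls.'))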

A Complex Workflow

In this story we are going to use all foundational chains to create the following workflow as a simple command line application:

Complex LangChain Flow

The flow performs the following steps:

  • Receive the user input
  • The input is written to a file via a callback.
  • The router selects the most appropriate chain from seven options:
    - Python programmer (provides a solution and a unit test using a sequential chain)
    - Kotlin programmer (provides a solution and a unit test using a sequential chain)
    - Poet (a simple LLMChain that provides a single response)
    - Wikipedia expert (simple LLMChain)
    - Graphical artist (simple LLMChain)
    - UK and US legal expert (simple LLMChain)
    - Word gap filler (contains a sequential chain that transforms the input and fills the gaps)
  • The large language model (LLM) responds.
  • The output is again written to a file via a callback.

As you can see, the main router chain triggers not only simple LLMChains but also SimpleSequentialChains.

Flow Implementation

We have published a Python implementation of the flow described above in this Github repository:

GitHub - gilfernandes/complex_chain_playground: Playground project acting as an example for a…



If you want to play around with it you can clone the repository and then install the needed libraries using Conda with Mamba.

Here is the script we have used to install the necessary libraries:

conda create --name langchain2 python=3.10
conda activate langchain2
conda install -c conda-forge mamba
mamba install openai
mamba install langchain
mamba install prompt_toolkit

You will need a ChatGPT key installed in your environment for the script to work.

On Linux you can execute a script like this one to set the ChatGPT key:

export OPENAI_API_KEY=

You can then activate the Conda environment and run the script:

conda activate langchain2
python ./lang_chain_router_chain.py

Example Output

We executed the script with some questions and captured a transcript in this file.

Here are some prompts we used as input and the corresponding agents that were triggered:

Can you help me fill in the missing words in this text? Reinforcement learning (RL) is an area of machine 
learning concerned with how intelligent agents ought to take actions in an environment
in order to maximize the notion of cumulative reward.
Reinforcement learning is one of three basic machine learning paradigms,
alongside supervised learning and unsupervised learning.
  • word filler
    removes every third word and then fills in the gaps
What are the main differences between the UK and US legal systems 
in terms of inheritance tax?
  • legal expert
    produces an explanation comparing the two bodies of law
Can you write a Python function which returns the list of dates 
between two dates?
  • python programmer
    generates the code and then a unit test
Can you write a Python function which implements the Levenshtein distance 
between two words?
  • python programmer
    generates the code and then a unit test
Can you write a Kotlin function which converts two dates in ISO Format 
(like e.g. '2023-01-01') to LocalDate and then calculates the number of days
between both?
  • kotlin programmer
    generates the code and then a unit test
Can you please write a poem about the joys of software development 
in the English countryside?
  • poet
    generates a poem:

In the realm of code, where logic dwells,
in verdant fields the software rests.
Where bytes and bits dance with the breeze,
in the English countryside, programmers work with ease.

Can you generate an image with the output of the sigmoid function 
and its derivative?
  • graphical artist
    generates an SVG image (not very accurate)
Can you explain to me the concept of a QBit in Quantum Computing?
  • wikipedia expert
    generates a proper explanation of the topic

Implementation Details

The project contains a main script that sets up the chains and executes them: complex_chain.py. The project also contains other files, such as FileCallbackHandler.py, which is an implementation of a callback handler used to write the model input and output to an HTML file.

We are now going to focus on complex_chain.py.

complex_chain.py sets up the model first:

class Config():
    model = 'gpt-3.5-turbo-0613'
    llm = ChatOpenAI(model=model, temperature=0)

It declares a special variant of langchain.chains.router.MultiPromptChain, because we could not use it together with langchain.chains.SimpleSequentialChain:

class MyMultiPromptChain(MultiRouteChain):
    """A multi-route chain that uses an LLM router chain to choose amongst prompts."""

    router_chain: RouterChain
    """Chain for deciding the destination chain and its inputs."""
    destination_chains: Mapping[str, Union[LLMChain, SimpleSequentialChain]]
    """Map of name to candidate chains that inputs can be routed to."""
    default_chain: LLMChain
    """Default chain to use when router doesn't map input to one of the destinations."""

    @property
    def output_keys(self) -> List[str]:
        return ["text"]

It then generates all chains (including a default chain) and adds them to a list:

def generate_destination_chains():
    """
    Creates a list of LLM chains with different prompt templates.
    Note that some of the chains are sequential chains which are supposed to generate unit tests.
    """
    prompt_factory = PromptFactory()
    destination_chains = {}
    for p_info in prompt_factory.prompt_infos:
        name = p_info['name']
        prompt_template = p_info['prompt_template']

        chain = LLMChain(
            llm=cfg.llm,
            prompt=PromptTemplate(template=prompt_template, input_variables=['input']),
            output_key='text',
            callbacks=[file_ballback_handler]
        )
        if name not in prompt_factory.programmer_test_dict.keys() and name != prompt_factory.word_filler_name:
            destination_chains[name] = chain
        elif name == prompt_factory.word_filler_name:
            transform_chain = TransformChain(
                input_variables=['input'], output_variables=['input'],
                transform=create_transform_func(3), callbacks=[file_ballback_handler]
            )
            destination_chains[name] = SimpleSequentialChain(
                chains=[transform_chain, chain], verbose=True, output_key='text',
                callbacks=[file_ballback_handler]
            )
        else:
            # The normal chain is used to generate the code
            # and an additional chain generates the unit test
            template = prompt_factory.programmer_test_dict[name]
            prompt_template = PromptTemplate(input_variables=['input'], template=template)
            test_chain = LLMChain(llm=cfg.llm, prompt=prompt_template, output_key='text',
                                  callbacks=[file_ballback_handler])
            destination_chains[name] = SimpleSequentialChain(
                chains=[chain, test_chain], verbose=True, output_key='text',
                callbacks=[file_ballback_handler]
            )

    default_chain = ConversationChain(llm=cfg.llm, output_key="text")
    return prompt_factory.prompt_infos, destination_chains, default_chain
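
The word filler branch above relies on create_transform_func(3), which is defined in the repository. As a rough sketch of the idea only (this is our assumption, not the repository's exact code), such a factory could blank out every third word so that the downstream prompt has gaps to fill:

def create_transform_func(k: int):
    """Returns a transform for a TransformChain which replaces every k-th word with a gap."""
    def transform(inputs: dict) -> dict:
        words = inputs['input'].split()
        gapped = ['___' if (i + 1) % k == 0 else word for i, word in enumerate(words)]
        return {'input': ' '.join(gapped)}
    return transform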

It then sets up the router chain:

def generate_router_chain(prompt_infos, destination_chains, default_chain):
    """
    Generates the router chain from the prompt infos.
    :param prompt_infos The prompt infos generated above.
    :param destination_chains The LLM chains with different prompt templates.
    :param default_chain The default chain.
    """
    destinations = [f"{p['name']}: {p['description']}" for p in prompt_infos]
    destinations_str = '\n'.join(destinations)
    router_template = MULTI_PROMPT_ROUTER_TEMPLATE.format(destinations=destinations_str)
    router_prompt = PromptTemplate(
        template=router_template,
        input_variables=['input'],
        output_parser=RouterOutputParser()
    )
    router_chain = LLMRouterChain.from_llm(cfg.llm, router_prompt)
    multi_route_chain = MyMultiPromptChain(
        router_chain=router_chain,
        destination_chains=destination_chains,
        default_chain=default_chain,
        verbose=True,
        callbacks=[file_ballback_handler]
    )
    return multi_route_chain
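
Both functions rely on PromptFactory.prompt_infos, a list of dictionaries with a name, a description (which the router uses to pick a destination), and the prompt template itself. The entries below are illustrative only; the real names and templates live in the repository:

# Illustrative only: the actual prompt definitions are in PromptFactory in the repository
prompt_infos = [
    {
        'name': 'python programmer',
        'description': 'Good for questions about writing Python code',
        'prompt_template': 'You are a senior Python developer.\nPlease answer this request:\n{input}'
    },
    {
        'name': 'poet',
        'description': 'Good for writing poems',
        'prompt_template': 'You are a poet.\nPlease write a poem about:\n{input}'
    }
]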

Finally it contains a main method which allows some user interaction:

if __name__ == "__main__":
    # Put your API key here or define it in your environment
    # os.environ["OPENAI_API_KEY"] = ''

    prompt_infos, destination_chains, default_chain = generate_destination_chains()
    chain = generate_router_chain(prompt_infos, destination_chains, default_chain)
    with open('conversation.log', 'w') as f:
        while True:
            question = prompt(
                HTML("Type Your question ('q' to exit, 's' to save to html file): ")
            )
            if question == 'q':
                break
            if question in ['s', 'w']:
                file_ballback_handler.create_html()
                continue
            result = chain.run(question)
            f.write(f"Q: {question}\n\n")
            f.write(f"A: {result}")
            f.write('\n\n ====================================================================== \n\n')
            print(result)
            print()

Final Thoughts

LangChain allows the creation of really complex interaction flows with LLMs.

However, setting up the workflow was a bit more complicated than imagined, because langchain.chains.router.MultiPromptChain does not seem to play well with langchain.chains.SimpleSequentialChain. So we needed to create a custom class to create the complex flow.


Gil Fernandes, Onepoint Consulting
