
【LangChain】Custom Chain

Date: 2020-01-01 14:43:23


Overview

To implement a custom chain, subclass `Chain` and implement its abstract members, as follows:

Code

```python
from __future__ import annotations

from typing import Any, Dict, List, Optional

from pydantic import Extra

from langchain.base_language import BaseLanguageModel
from langchain.callbacks.manager import (
    AsyncCallbackManagerForChainRun,
    CallbackManagerForChainRun,
)
from langchain.chains.base import Chain
from langchain.prompts.base import BasePromptTemplate


# Subclass Chain
class MyCustomChain(Chain):
    """An example of a custom chain."""

    prompt: BasePromptTemplate
    """Prompt object to use."""
    llm: BaseLanguageModel
    output_key: str = "text"  #: :meta private:

    class Config:
        """Configuration for this pydantic object."""

        extra = Extra.forbid
        arbitrary_types_allowed = True

    # From the Chain abstract class; must be overridden
    @property
    def input_keys(self) -> List[str]:
        """Will be whatever keys the prompt expects.

        :meta private:
        """
        return self.prompt.input_variables

    # From the Chain abstract class; must be overridden
    @property
    def output_keys(self) -> List[str]:
        """Will always return text key.

        :meta private:
        """
        return [self.output_key]

    # From the Chain abstract class; must be overridden
    def _call(
        self,
        inputs: Dict[str, Any],
        run_manager: Optional[CallbackManagerForChainRun] = None,
    ) -> Dict[str, str]:
        # Your custom chain logic goes here
        # This is just an example that mimics LLMChain
        prompt_value = self.prompt.format_prompt(**inputs)

        # Whenever you call a language model, or another chain, you should pass
        # a callback manager to it. This allows the inner run to be tracked by
        # any callbacks that are registered on the outer run.
        # You can always obtain a callback manager for this by calling
        # `run_manager.get_child()` as shown below.
        response = self.llm.generate_prompt(
            [prompt_value], callbacks=run_manager.get_child() if run_manager else None
        )

        # If you want to log something about this run, you can do so by calling
        # methods on the `run_manager`, as shown below. This will trigger any
        # callbacks that are registered for that event.
        if run_manager:
            run_manager.on_text("Log something about this run")

        return {self.output_key: response.generations[0][0].text}

    async def _acall(
        self,
        inputs: Dict[str, Any],
        run_manager: Optional[AsyncCallbackManagerForChainRun] = None,
    ) -> Dict[str, str]:
        # Your custom chain logic goes here
        # This is just an example that mimics LLMChain
        prompt_value = self.prompt.format_prompt(**inputs)

        # Whenever you call a language model, or another chain, you should pass
        # a callback manager to it. This allows the inner run to be tracked by
        # any callbacks that are registered on the outer run.
        # You can always obtain a callback manager for this by calling
        # `run_manager.get_child()` as shown below.
        response = await self.llm.agenerate_prompt(
            [prompt_value], callbacks=run_manager.get_child() if run_manager else None
        )

        # If you want to log something about this run, you can do so by calling
        # methods on the `run_manager`, as shown below. This will trigger any
        # callbacks that are registered for that event.
        if run_manager:
            await run_manager.on_text("Log something about this run")

        return {self.output_key: response.generations[0][0].text}

    @property
    def _chain_type(self) -> str:
        return "my_custom_chain"
```

Using the custom chain:

```python
from langchain.callbacks.stdout import StdOutCallbackHandler
from langchain.chat_models.openai import ChatOpenAI
from langchain.prompts.prompt import PromptTemplate

chain = MyCustomChain(
    prompt=PromptTemplate.from_template("tell us a joke about {topic}"),
    llm=ChatOpenAI(),
)

chain.run({"topic": "callbacks"}, callbacks=[StdOutCallbackHandler()])
```

Output:

```
> Entering new MyCustomChain chain...
Log something about this run
> Finished chain.
'Why did the callback function feel lonely? Because it was always waiting for someone to call it back!'
```

Summary:

Define a class (e.g. `MyCustomChain`) that subclasses `Chain`, i.e. `class MyCustomChain(Chain):`. Because `Chain` is an abstract class, you must override three members: the `input_keys` and `output_keys` properties and the `_call()` method. Then instantiate `MyCustomChain` and execute it with `run()`.
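The subclass-and-override pattern described above can be sketched without LangChain installed, using only Python's `abc` module. The class names `MiniChain` and `EchoChain` here are illustrative stand-ins, not LangChain APIs; the point is the same contract of two abstract properties plus one abstract method:

```python
from abc import ABC, abstractmethod
from typing import Any, Dict, List


class MiniChain(ABC):
    """A stripped-down stand-in for LangChain's Chain base class."""

    @property
    @abstractmethod
    def input_keys(self) -> List[str]:
        """Keys the chain expects in its input dict."""

    @property
    @abstractmethod
    def output_keys(self) -> List[str]:
        """Keys the chain returns in its output dict."""

    @abstractmethod
    def _call(self, inputs: Dict[str, Any]) -> Dict[str, str]:
        """The chain's core logic."""

    def run(self, inputs: Dict[str, Any]) -> Dict[str, str]:
        # Validate inputs against input_keys before delegating to _call,
        # roughly mirroring what Chain does before invoking _call.
        missing = set(self.input_keys) - set(inputs)
        if missing:
            raise ValueError(f"Missing input keys: {missing}")
        return self._call(inputs)


class EchoChain(MiniChain):
    """A trivial concrete chain that overrides all three abstract members."""

    @property
    def input_keys(self) -> List[str]:
        return ["topic"]

    @property
    def output_keys(self) -> List[str]:
        return ["text"]

    def _call(self, inputs: Dict[str, Any]) -> Dict[str, str]:
        return {"text": f"joke about {inputs['topic']}"}


print(EchoChain().run({"topic": "callbacks"}))  # {'text': 'joke about callbacks'}
```

Forgetting any of the three overrides makes the subclass abstract as well, and instantiating it raises `TypeError`, which is exactly how the real `Chain` base class enforces its contract.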

Reference:

Custom chain
