# Robocorp
This notebook covers how to get started with the Robocorp Action Server action toolkit and LangChain.
## Installation
First, see the Robocorp Quickstart on how to set up Action Server and create your Actions.
In your LangChain application, install the `langchain-robocorp` package:

```python
# Install package
%pip install --upgrade --quiet langchain-robocorp
```
## Environment Setup
Optionally you can set the following environment variables:

- `LANGCHAIN_TRACING_V2=true`: Enables LangSmith log run tracing, which can also be bound to the respective Action Server action run logs. See the LangSmith documentation for more.
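For example, the variable can be set from Python before initializing the toolkit; a minimal sketch using only the standard library:

```python
import os

# Enable LangSmith tracing for this process (equivalent to exporting
# LANGCHAIN_TRACING_V2=true in the shell before starting the app).
os.environ["LANGCHAIN_TRACING_V2"] = "true"

print(os.environ["LANGCHAIN_TRACING_V2"])
```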
## Usage
```python
from langchain.agents import AgentExecutor, OpenAIFunctionsAgent
from langchain.chat_models import ChatOpenAI
from langchain_core.messages import SystemMessage
from langchain_robocorp import ActionServerToolkit

# Initialize LLM chat model
llm = ChatOpenAI(model="gpt-4", temperature=0)

# Initialize Action Server Toolkit
toolkit = ActionServerToolkit(url="http://localhost:8080", report_trace=True)
tools = toolkit.get_tools()

# Initialize Agent
system_message = SystemMessage(content="You are a helpful assistant")
prompt = OpenAIFunctionsAgent.create_prompt(system_message)
agent = OpenAIFunctionsAgent(llm=llm, prompt=prompt, tools=tools)

executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

executor.invoke({"input": "What is the current date?"})
```
## Single input tools
By default `toolkit.get_tools()` will return the actions as Structured Tools. To return single input tools, pass a chat model to be used for processing the inputs.
```python
# Initialize single input Action Server Toolkit
toolkit = ActionServerToolkit(url="http://localhost:8080")
tools = toolkit.get_tools(llm=llm)
```
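To illustrate the distinction, here is a plain-Python sketch (no LangChain involved; all names are hypothetical): a structured tool receives named arguments directly, while a single-input tool receives one free-form string and needs something to extract the arguments from it, which is the role the chat model plays in the toolkit.

```python
def structured_tool(city: str, unit: str) -> str:
    # A structured tool gets its arguments already separated.
    return f"weather in {city} ({unit})"

def single_input_tool(query: str, parse) -> str:
    # A single-input tool gets one string; `parse` (an LLM in the
    # toolkit, a naive function here) extracts the arguments from it.
    args = parse(query)
    return structured_tool(**args)

def naive_parse(query: str) -> dict:
    # Stand-in for the chat model: split "city, unit" on a comma.
    city, unit = (part.strip() for part in query.split(","))
    return {"city": city, "unit": unit}

print(single_input_tool("Helsinki, celsius", naive_parse))  # → weather in Helsinki (celsius)
```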