Merge branch 'main' into vb/react-agent-prompt
vbarda committed Jan 24, 2025
2 parents a2a1a77 + 48040d8 commit 4bbbdd3
Showing 9 changed files with 1,078 additions and 5,757 deletions.
4 changes: 1 addition & 3 deletions .github/workflows/run_notebooks.yml
@@ -16,9 +16,7 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
lib-version:
- "development"
- "latest"
lib-version: ${{ (github.event_name == 'workflow_dispatch' || github.event_name == 'schedule') && fromJSON('["development", "latest"]') || fromJSON('["development"]') }}

steps:
- uses: actions/checkout@v4
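The new `lib-version` expression means scheduled and manually dispatched runs exercise both library versions, while every other trigger tests only `development`. The selection logic, restated as a plain-Python sketch (the `lib_versions` helper is illustrative, not part of the workflow):

```python
import json

def lib_versions(event_name: str) -> list:
    """Mirror the workflow's conditional fromJSON(...) expression."""
    if event_name in ("workflow_dispatch", "schedule"):
        return json.loads('["development", "latest"]')  # full matrix
    return json.loads('["development"]')  # trimmed matrix for other triggers

print(lib_versions("schedule"))       # ['development', 'latest']
print(lib_versions("pull_request"))   # ['development']
```

This keeps routine CI runs fast while the nightly schedule still covers the released version.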
40 changes: 9 additions & 31 deletions docs/docs/how-tos/react-agent-from-scratch.ipynb
@@ -6,37 +6,15 @@
"source": [
"# How to create a ReAct agent from scratch\n",
"\n",
"<div class=\"admonition tip\">\n",
" <p class=\"admonition-title\">Prerequisites</p>\n",
" <p>\n",
" This guide assumes familiarity with the following:\n",
" <ul>\n",
" <li>\n",
" <a href=\"https://langchain-ai.github.io/langgraph/concepts/agentic_concepts/#tool-calling-agent\">\n",
" Tool calling agent\n",
" </a>\n",
" </li> \n",
" <li>\n",
" <a href=\"https://python.langchain.com/docs/concepts/chat_models\">\n",
" Chat Models\n",
" </a>\n",
" </li>\n",
" <li>\n",
" <a href=\"https://python.langchain.com/docs/concepts/messages\">\n",
" Messages\n",
" </a>\n",
" </li>\n",
" <li>\n",
" <a href=\"https://langchain-ai.github.io/langgraph/concepts/low_level/\">\n",
" LangGraph Glossary\n",
" </a>\n",
" </li>\n",
" </ul>\n",
" </p>\n",
"</div> \n",
"\n",
"\n",
"Using the prebuilt ReAct agent ([create_react_agent](https://langchain-ai.github.io/langgraph/reference/prebuilt/#langgraph.prebuilt.chat_agent_executor.create_react_agent)) is a great way to get started, but sometimes you might want more control and customization. In those cases, you can create a custom ReAct agent. This guide shows how to implement ReAct agent from scratch using LangGraph.\n",
"!!! info \"Prerequisites\"\n",
" This guide assumes familiarity with the following:\n",
" \n",
" - [Tool calling agent](../../concepts/agentic_concepts/#tool-calling-agent)\n",
" - [Chat Models](https://python.langchain.com/docs/concepts/chat_models/)\n",
" - [Messages](https://python.langchain.com/docs/concepts/messages/)\n",
" - [LangGraph Glossary](../../concepts/low_level/)\n",
"\n",
"Using the prebuilt ReAct agent [create_react_agent][langgraph.prebuilt.chat_agent_executor.create_react_agent] is a great way to get started, but sometimes you might want more control and customization. In those cases, you can create a custom ReAct agent. This guide shows how to implement a ReAct agent from scratch using LangGraph.\n",
"\n",
"## Setup\n",
"\n",
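The guide's custom agent ultimately reduces to the classic ReAct loop: call the model, run any tools it requests, append the results, and repeat until the model answers directly. A library-free sketch of that loop (the stub model, message dicts, and tool registry are illustrative, not LangGraph's API):

```python
# Library-free sketch of the ReAct loop; the stub model, message dicts,
# and tool registry are invented for illustration.
def fake_model(messages):
    """Pretend model: requests a tool once, then answers from its result."""
    if not any(m["role"] == "tool" for m in messages):
        return {"role": "ai", "content": "",
                "tool_calls": [{"name": "get_weather", "args": {"city": "sf"}}]}
    return {"role": "ai", "content": "It's sunny in sf.", "tool_calls": []}

tools = {"get_weather": lambda city: f"sunny in {city}"}

def react_agent(user_input):
    messages = [{"role": "user", "content": user_input}]
    while True:
        ai = fake_model(messages)
        messages.append(ai)
        if not ai["tool_calls"]:          # model answered directly -> stop
            return messages
        for call in ai["tool_calls"]:     # otherwise run each requested tool
            result = tools[call["name"]](**call["args"])
            messages.append({"role": "tool", "content": result})

history = react_agent("what is the weather in sf?")
print(history[-1]["content"])  # It's sunny in sf.
```

In the notebook, the same two steps become graph nodes (a model node and a tool node) with a conditional edge playing the role of the `if not ai["tool_calls"]` check.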
2 changes: 1 addition & 1 deletion docs/docs/how-tos/tool-calling.ipynb
@@ -217,7 +217,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"We'll be using a small chat model from Anthropic in our example. To use chat models with tool calling, we need to first ensure that the model is aware of the available tools. We do this by calling `.bind_tools` method on `ChatAnthropic` moodel"
"We'll be using a small chat model from Anthropic in our example. To use chat models with tool calling, we need to first ensure that the model is aware of the available tools. We do this by calling the `.bind_tools` method on the `ChatAnthropic` model"
]
},
{
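`bind_tools` returns a new model object that carries the tools' schemas along on every call, so the provider can decide when to invoke them. A rough pure-Python sketch of that idea (the `FakeChatModel` class is invented for illustration; it is not LangChain's implementation):

```python
# Pure-Python sketch of the bind_tools idea; FakeChatModel is invented
# for illustration and is not LangChain's implementation.
class FakeChatModel:
    def __init__(self, tools=()):
        self.tools = list(tools)

    def bind_tools(self, tools):
        # Return a new model carrying the tool schemas; the original is untouched.
        return FakeChatModel(tools)

    def invoke(self, prompt):
        # A real model would send the schemas to the provider with the request.
        names = [t.__name__ for t in self.tools]
        return f"model saw tools: {names}"

def get_weather(city: str) -> str:
    """Get the weather for a city."""
    return f"sunny in {city}"

model = FakeChatModel()
model_with_tools = model.bind_tools([get_weather])
print(model_with_tools.invoke("what is the weather in sf?"))
# model saw tools: ['get_weather']
```

Because binding returns a copy, the original model stays tool-free and can be reused elsewhere.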
25 changes: 4 additions & 21 deletions docs/docs/tutorials/plan-and-execute/plan-and-execute.ipynb
@@ -130,36 +130,19 @@
},
{
"cell_type": "code",
"execution_count": 7,
"execution_count": null,
"id": "72d233ca-1dbf-4b43-b680-b3bf39e3691f",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"================================\u001b[1m System Message \u001b[0m================================\n",
"\n",
"You are a helpful assistant.\n",
"\n",
"=============================\u001b[1m Messages Placeholder \u001b[0m=============================\n",
"\n",
"\u001b[33;1m\u001b[1;3m{messages}\u001b[0m\n"
]
}
],
"outputs": [],
"source": [
"from langchain import hub\n",
"from langchain_openai import ChatOpenAI\n",
"\n",
"from langgraph.prebuilt import create_react_agent\n",
"\n",
"# Get the prompt to use - you can modify this!\n",
"prompt = hub.pull(\"ih/ih-react-agent-executor\")\n",
"prompt.pretty_print()\n",
"\n",
"# Choose the LLM that will drive the agent\n",
"llm = ChatOpenAI(model=\"gpt-4-turbo-preview\")\n",
"prompt = \"You are a helpful assistant.\"\n",
"agent_executor = create_react_agent(llm, tools, state_modifier=prompt)"
]
},
@@ -546,7 +529,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.9"
"version": "3.12.3"
}
},
"nbformat": 4,
131 changes: 28 additions & 103 deletions libs/langgraph/langgraph/func/__init__.py
@@ -3,7 +3,6 @@
import functools
import inspect
import types
from collections.abc import Iterator
from dataclasses import dataclass
from typing import (
Any,
@@ -18,17 +17,12 @@
overload,
)

from langchain_core.runnables.base import Runnable
from langchain_core.runnables.config import RunnableConfig
from langchain_core.runnables.graph import Graph, Node

from langgraph.channels.ephemeral_value import EphemeralValue
from langgraph.channels.last_value import LastValue
from langgraph.checkpoint.base import BaseCheckpointSaver
from langgraph.constants import END, PREVIOUS, START, TAG_HIDDEN
from langgraph.pregel import Pregel
from langgraph.pregel.call import P, T, call, get_runnable_for_entrypoint
from langgraph.pregel.protocol import PregelProtocol
from langgraph.pregel.read import PregelNode
from langgraph.pregel.write import ChannelWrite, ChannelWriteEntry
from langgraph.store.base import BaseStore
@@ -164,8 +158,10 @@ class entrypoint:
to the function. This input parameter can be of any type. Use a dictionary
to pass multiple parameters to the function.
The decorated function also has access to these optional parameters:
The decorated function can request access to additional parameters
that will be injected automatically at run time. These parameters include:
- `store`: An instance of [BaseStore][langgraph.store.base.BaseStore]. Useful for long-term memory.
- `writer`: A `StreamWriter` instance for writing data to a stream.
- `config`: A configuration object for accessing workflow settings.
- `previous`: The previous return value for the given thread (available only when
@@ -255,7 +251,7 @@ def review_workflow(topic: str) -> dict:
```python
from langgraph.checkpoint.memory import MemorySaver
from langgraph.func import entrypoint, task
from langgraph.func import entrypoint
@entrypoint(checkpointer=MemorySaver())
def my_workflow(input_data: str, previous: Optional[str] = None) -> str:
@@ -288,6 +284,29 @@ class final(Generic[R, S]):
This primitive allows saving a value to the checkpointer that is distinct from the
return value of the entrypoint.
Example: Decoupling the return value and the save value
```python
from langgraph.checkpoint.memory import MemorySaver
from langgraph.func import entrypoint
@entrypoint(checkpointer=MemorySaver())
def my_workflow(number: int, *, previous: Any = None) -> entrypoint.final[int, int]:
previous = previous or 0
# This will return the previous value to the caller, saving
# 2 * number to the checkpoint, which will be used in the next invocation
# for the `previous` parameter.
return entrypoint.final(value=previous, save=2 * number)
config = {
"configurable": {
"thread_id": "1"
}
}
my_workflow.invoke(3, config) # 0 (previous was None)
my_workflow.invoke(1, config) # 6 (previous was 3 * 2 from the previous invocation)
```
"""

value: R
@@ -498,7 +517,7 @@ def _pluck_save_value(value: Any) -> Any:
else:
output_type = save_type = sig.return_annotation

return EntrypointPregel(
return Pregel(
nodes={
func.__name__: PregelNode(
bound=bound,
@@ -529,97 +548,3 @@ def _pluck_save_value(value: Any) -> Any:
store=self.store,
config_type=self.config_schema,
)


class EntrypointPregel(Pregel):
def get_graph(
self,
config: Optional[RunnableConfig] = None,
*,
xray: Union[int, bool] = False,
) -> Graph:
name, entrypoint = next(iter(self.nodes.items()))
graph = Graph()
node = Node(f"__{name}", name, entrypoint.bound, None)
graph.nodes[node.id] = node
candidates: list[tuple[Node, Union[Callable, PregelProtocol]]] = [
*_find_children(entrypoint.bound, node)
]
seen: set[Union[Callable, PregelProtocol]] = set()
for parent, child in candidates:
if child in seen:
continue
else:
seen.add(child)
if callable(child):
node = Node(f"__{child.__name__}", child.__name__, child, None) # type: ignore[arg-type]
graph.nodes[node.id] = node
graph.add_edge(parent, node, conditional=True)
graph.add_edge(node, parent)
candidates.extend(_find_children(child, node))
elif isinstance(child, Runnable):
if xray > 0:
graph = child.get_graph(config, xray=xray - 1 if xray else 0)
graph.trim_first_node()
graph.trim_last_node()
s, e = graph.extend(graph, prefix=child.name or "")
if s is None:
raise ValueError(
f"Could not extend subgraph '{child.name}' due to missing entrypoint"
)
else:
graph.add_edge(parent, s, conditional=True)
if e is not None:
graph.add_edge(e, parent)
else:
node = graph.add_node(child, child.name)
graph.add_edge(parent, node, conditional=True)
graph.add_edge(node, parent)
return graph


def _find_children(
candidate: Union[Callable, Runnable], parent: Node
) -> Iterator[tuple[Node, Union[Callable, PregelProtocol]]]:
from langchain_core.runnables.utils import get_function_nonlocals

from langgraph.utils.runnable import (
RunnableCallable,
RunnableLambda,
RunnableSeq,
RunnableSequence,
)

candidates: list[Union[Callable, Runnable]] = []
if callable(candidate) and getattr(candidate, "_is_pregel_task", False) is True:
candidates.extend(
nl.__self__ if hasattr(nl, "__self__") else nl
for nl in get_function_nonlocals(
candidate.__wrapped__
if hasattr(candidate, "__wrapped__") and callable(candidate.__wrapped__)
else candidate
)
)
else:
candidates.append(candidate)

for c in candidates:
if callable(c) and getattr(c, "_is_pregel_task", False) is True:
yield (parent, c)
elif isinstance(c, PregelProtocol):
yield (parent, c)
elif isinstance(c, RunnableSequence) or isinstance(c, RunnableSeq):
candidates.extend(c.steps)
elif isinstance(c, RunnableLambda):
candidates.extend(c.deps)
elif isinstance(c, RunnableCallable):
if c.func is not None:
candidates.extend(
nl.__self__ if hasattr(nl, "__self__") else nl
for nl in get_function_nonlocals(c.func)
)
elif c.afunc is not None:
candidates.extend(
nl.__self__ if hasattr(nl, "__self__") else nl
for nl in get_function_nonlocals(c.afunc)
)
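The `entrypoint.final` example in the docstring above relies on the checkpointer handing the saved value back as `previous` on the next invocation for the same thread. A pure-Python simulation of that contract (`Final` and `make_workflow` are hypothetical stand-ins, not LangGraph's implementation):

```python
# Pure-Python simulation of the value/save decoupling shown in the
# entrypoint.final docstring; Final and make_workflow are hypothetical.
class Final:
    def __init__(self, value, save):
        self.value = value   # returned to the caller
        self.save = save     # written to the checkpoint instead

def make_workflow(func):
    checkpoints = {}  # thread_id -> last saved value
    def invoke(inp, config):
        thread_id = config["configurable"]["thread_id"]
        result = func(inp, previous=checkpoints.get(thread_id))
        checkpoints[thread_id] = result.save
        return result.value
    return invoke

def my_workflow(number, *, previous=None):
    previous = previous or 0
    return Final(value=previous, save=2 * number)

wf = make_workflow(my_workflow)
config = {"configurable": {"thread_id": "1"}}
print(wf(3, config))  # 0 (previous was None)
print(wf(1, config))  # 6 (previous was 2 * 3 from the first invocation)
```

The key point the docstring makes, reproduced here: what the caller sees (`value`) and what the next invocation receives as `previous` (`save`) are independent.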
2 changes: 1 addition & 1 deletion libs/langgraph/pyproject.toml
@@ -1,6 +1,6 @@
[tool.poetry]
name = "langgraph"
version = "0.2.66"
version = "0.2.67"
description = "Building stateful, multi-actor applications with LLMs"
authors = []
license = "MIT"