Commit ea08015

docs: update
1 parent 8abb173 commit ea08015

1 file changed: +6 −6 lines

core-jdk8/src/site/markdown/how-tos/stream-tokens.md (+6 −6)
````diff
@@ -5,15 +5,15 @@ agent. We will use a ReAct agent as an example. The tl;dr is to use
 [streamEvents](https://js.langchain.com/v0.2/docs/how_to/chat_streaming/#stream-events)
 ([API Ref](https://api.js.langchain.com/classes/langchain_core_runnables.Runnable.html#streamEvents)).
 
-<div class="admonition info">
+<!-- <div class="admonition info">
 <p class="admonition-title">Note</p>
 <p>
 If you are using a version of <code>@langchain/core</code> < 0.2.3, when calling chat models or LLMs you need to call <code>await model.stream()</code> within your nodes to get token-by-token streaming events, and aggregate final outputs if needed to update the graph state. In later versions of <code>@langchain/core</code>, this occurs automatically, and you can call <code>await model.invoke()</code>.
 <br>
 For more on how to upgrade <code>@langchain/core</code>, check out <a href="https://js.langchain.com/v0.2/docs/how_to/installation/#installing-integration-packages">the instructions here</a>.
 </p>
 
-</div>
+</div> -->
 
 This how-to guide closely follows the others in this directory, showing how to
 incorporate the functionality into a prototypical agent in LangGraph.
@@ -23,7 +23,7 @@ This works for
 and all its subclasses, such as
 [MessageGraph](/langgraphjs/reference/classes/langgraph.MessageGraph.html).
 
-<div class="admonition info">
+<!-- <div class="admonition info">
 <p class="admonition-title">Streaming Support</p>
 <p>
 Token streaming is supported by many, but not all chat models. Check to see if your LLM integration supports token streaming <a href="https://js.langchain.com/v0.2/docs/integrations/chat/">here (doc)</a>. Note that some integrations may support <i>general</i> token streaming but lack support for streaming tool calls.
@@ -35,7 +35,7 @@ and all its subclasses, such as
 <p>
 In this how-to, we will create our agent from scratch to be transparent (but verbose). You can accomplish similar functionality using the <code>createReactAgent({ llm, tools })</code> (<a href="/langgraphjs/reference/functions/langgraph_prebuilt.createReactAgent.html">API doc</a>) constructor. This may be more appropriate if you are used to LangChain's <a href="https://js.langchain.com/v0.2/docs/how_to/agent_executor">AgentExecutor</a> class.
 </p>
-</div>
+</div> -->
 
 ## Setup
 
@@ -121,12 +121,12 @@ Now load the [chat model](https://js.langchain.com/v0.2/docs/concepts/#chat-mode
 [tool calling](https://js.langchain.com/v0.2/docs/how_to/tool_calling/#passing-tools-to-llms),
 meaning it can return function arguments in its response.
 
-<div class="admonition tip">
+<!-- <div class="admonition tip">
 <p class="admonition-title">Note</p>
 <p>
 These model requirements are not general requirements for using LangGraph - they are just requirements for this one example.
 </p>
-</div>
+</div> -->
 
 
 ```typescript
````

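For context on the note this commit comments out: on older `@langchain/core` versions it describes calling `await model.stream()` inside a node and aggregating the chunks into a final output for the graph state. A rough, dependency-free sketch of that consume-and-aggregate pattern is below; `fakeModelStream` is a hypothetical stand-in for a real chat model's token stream, not a LangChain API.

```typescript
// Illustrative sketch only, not the LangGraph/LangChain API.
// `fakeModelStream` is a hypothetical stand-in that yields one token per chunk,
// the way `model.stream()` yields message chunks.
async function* fakeModelStream(prompt: string): AsyncGenerator<string> {
  for (const token of ["Hello", ", ", "world", "!"]) {
    yield token; // each chunk is a single token
  }
}

// Consume the stream token by token, then return the aggregated output
// (which a node would use to update the graph state).
async function streamAndAggregate(prompt: string): Promise<string> {
  let finalOutput = "";
  for await (const token of fakeModelStream(prompt)) {
    // A real app could forward each token to the client here.
    finalOutput += token;
  }
  return finalOutput;
}
```

On `@langchain/core` >= 0.2.3, per the note, this manual loop is unnecessary: `await model.invoke()` inside a node still surfaces token-level events through `streamEvents`.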