
Commit 55973e9

improve streaming example in getting started guide (#691)
1 parent 856bbc2 commit 55973e9

File tree

4 files changed: +83 additions, −109 deletions

docs/docs/getting-started/guide-chat.mdx

Lines changed: 4 additions & 59 deletions
@@ -2,6 +2,9 @@
 sidebar_position: 3
 ---
 
+import CodeBlock from "@theme/CodeBlock";
+import Example from "@examples/models/chat/chat_streaming_stdout.ts";
+
 # Quickstart, using Chat Models
 
 Chat models are a variation on language models.
@@ -316,62 +319,4 @@ const chain = new ConversationChain({
 
 You can also use the streaming API to get words streamed back to you as they are generated. This is useful for eg. chatbots, where you want to show the user what is being generated as it is being generated. Note: OpenAI as of this writing does not support `tokenUsage` reporting while streaming is enabled.
 
-```typescript
-let s = "";
-const chatStreaming = new ChatOpenAI({
-  streaming: true,
-  callbackManager: CallbackManager.fromHandlers({
-    async handleLLMNewToken(token: string) {
-      console.clear();
-      s += token;
-      console.log(s);
-    },
-  }),
-});
-
-const responseD = await chatStreaming.call([
-  new HumanChatMessage("Write me a song about sparkling water."),
-]);
-```
-
-```
-Verse 1:
-Bubbles in the bottle,
-Light and refreshing,
-It's the drink that I love,
-My thirst quenching blessing.
-
-Chorus:
-Sparkling water, my fountain of youth,
-I can't get enough, it's the perfect truth,
-It's fizzy and fun, and oh so clear,
-Sparkling water, it's crystal clear.
-
-Verse 2:
-No calories or sugars,
-Just a burst of delight,
-It's the perfect cooler,
-On a hot summer night.
-
-Chorus:
-Sparkling water, my fountain of youth,
-I can't get enough, it's the perfect truth,
-It's fizzy and fun, and oh so clear,
-Sparkling water, it's crystal clear.
-
-Bridge:
-It's my happy place,
-In every situation,
-My daily dose of hydration,
-Always bringing satisfaction.
-
-Chorus:
-Sparkling water, my fountain of youth,
-I can't get enough, it's the perfect truth,
-It's fizzy and fun, and oh so clear,
-Sparkling water, it's crystal clear.
-
-Outro:
-Sparkling water, it's crystal clear,
-My love for you will never disappear.
-```
+<CodeBlock language="typescript">{Example}</CodeBlock>

docs/docs/getting-started/guide-llm.mdx

Lines changed: 4 additions & 50 deletions
@@ -2,6 +2,9 @@
 sidebar_position: 2
 ---
 
+import CodeBlock from "@theme/CodeBlock";
+import Example from "@examples/models/llm/llm_streaming_stdout.ts";
+
 # Quickstart, using LLMs
 
 This tutorial gives you a quick walkthrough about building an end-to-end language model application with LangChain.
@@ -228,53 +231,4 @@ console.log(res2);
 
 You can also use the streaming API to get words streamed back to you as they are generated. This is useful for eg. chatbots, where you want to show the user what is being generated as it is being generated. Note: OpenAI as of this writing does not support `tokenUsage` reporting while streaming is enabled.
 
-```typescript
-const chat = new OpenAI({
-  streaming: true,
-  callbackManager: CallbackManager.fromHandlers({
-    async handleLLMNewToken(token: string) {
-      console.log(token);
-    },
-  }),
-});
-
-const response = await chat.call("Write me a song about sparkling water.");
-console.log(response);
-```
-
-```
-Verse 1
-
-On a hot summer day, I'm looking for a treat
-I'm thirsty for something cool and sweet
-When I open up the fridge, what do I see?
-A bottle of sparkling water, it's calling out to me
-
-Chorus
-
-Sparkling water, it's so refreshing
-It quenches my thirst, it's the perfect thing
-It's so light and bubbly, it's like a dream
-And I'm loving every sip of sparkling water
-
-Verse 2
-
-I take it out of the fridge and pour some in a glass
-It's so light and bubbly, I can feel the fizz
-I take a sip and suddenly I'm revived
-This sparkling water's just what I need to survive
-
-Chorus
-
-Sparkling water, it's so refreshing
-It quenches my thirst, it's the perfect thing
-It's so light and bubbly, it's like a dream
-And I'm loving every sip of sparkling water
-
-Bridge
-
-It's like drinking sunshine between my hands
-It's so light and bubbly, I'm in a trance
-The summer heat's no match for sparkling water
-It's my favorite
-```
+<CodeBlock language="typescript">{Example}</CodeBlock>
@examples/models/chat/chat_streaming_stdout.ts (new file)

Lines changed: 38 additions & 0 deletions
@@ -0,0 +1,38 @@
+import { CallbackManager } from "langchain/callbacks";
+import { ChatOpenAI } from "langchain/chat_models";
+import { HumanChatMessage } from "langchain/schema";
+
+export const run = async () => {
+  const chat = new ChatOpenAI({
+    streaming: true,
+    callbackManager: CallbackManager.fromHandlers({
+      async handleLLMNewToken(token: string) {
+        process.stdout.write(token);
+      },
+    }),
+  });
+
+  await chat.call([
+    new HumanChatMessage("Write me a song about sparkling water."),
+  ]);
+  /*
+  Verse 1:
+  Bubbles rise, crisp and clear
+  Refreshing taste that brings us cheer
+  Sparkling water, so light and pure
+  Quenches our thirst, it's always secure
+
+  Chorus:
+  Sparkling water, oh how we love
+  Its fizzy bubbles and grace above
+  It's the perfect drink, anytime, anyplace
+  Refreshing as it gives us a taste
+
+  Verse 2:
+  From morning brunch to evening feast
+  It's the perfect drink for a treat
+  A sip of it brings a smile so bright
+  Our thirst is quenched in just one sip so light
+  ...
+  */
+};
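The callback wiring in the new example can be exercised without an API key or any LangChain dependency. The sketch below mirrors the pattern (a handler invoked once per token, as `CallbackManager.fromHandlers` does with `handleLLMNewToken`) using a hypothetical `emitTokens` generator in place of the model call; `streamWith` and `runMock` are illustrative names, not part of the commit:

```typescript
// A handler called once per streamed token, like handleLLMNewToken.
type TokenHandler = (token: string) => Promise<void>;

// Pretend-stream: yields the text one whitespace-delimited token at a time,
// standing in for the model's streaming response.
async function* emitTokens(text: string): AsyncGenerator<string> {
  for (const token of text.split(/(?<=\s)/)) {
    yield token;
  }
}

// Drives the handler for each token, mirroring what the callback manager does.
async function streamWith(handler: TokenHandler, text: string): Promise<void> {
  for await (const token of emitTokens(text)) {
    await handler(token);
  }
}

export const runMock = async (): Promise<string> => {
  let collected = "";
  await streamWith(async (token) => {
    process.stdout.write(token); // same sink as the example above
    collected += token;          // also keep the full text for later use
  }, "Write me a song about sparkling water.");
  return collected;
};
```

This also shows a common variant the new example omits: accumulating the streamed tokens so the complete response is still available after the stream ends.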
@examples/models/llm/llm_streaming_stdout.ts (new file)

Lines changed: 37 additions & 0 deletions
@@ -0,0 +1,37 @@
+import { CallbackManager } from "langchain/callbacks";
+import { OpenAI } from "langchain/llms";
+
+export const run = async () => {
+  // To enable streaming, we pass in `streaming: true` to the LLM constructor.
+  // Additionally, we pass in a `CallbackManager` with a handler set up for the `handleLLMNewToken` event.
+  const chat = new OpenAI({
+    streaming: true,
+    callbackManager: CallbackManager.fromHandlers({
+      async handleLLMNewToken(token: string) {
+        process.stdout.write(token);
+      },
+    }),
+  });
+
+  await chat.call("Write me a song about sparkling water.");
+  /*
+  Verse 1
+  Crystal clear and made with care
+  Sparkling water on my lips, so refreshing in the air
+  Fizzy bubbles, light and sweet
+  My favorite beverage I can’t help but repeat
+
+  Chorus
+  A toast to sparkling water, I’m feeling so alive
+  Let’s take a sip, and let’s take a drive
+  A toast to sparkling water, it’s the best I’ve had in my life
+  It’s the best way to start off the night
+
+  Verse 2
+  It’s the perfect drink to quench my thirst
+  It’s the best way to stay hydrated, it’s the first
+  A few ice cubes, a splash of lime
+  It will make any day feel sublime
+  ...
+  */
+};
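Worth noting why both new examples print with `process.stdout.write` instead of `console.log`: `console.log` appends a newline after every call, so each streamed token would land on its own line, while `write` emits the token exactly as received. A standalone illustration (the `renderStreamed` helper and the token list are invented for this sketch):

```typescript
// Demonstrates why process.stdout.write suits token streams: writing each
// part with console.log would put every token on its own line, whereas
// write keeps the tokens flowing onto a single line as they arrive.
export function renderStreamed(parts: string[]): string {
  let line = "";
  for (const part of parts) {
    process.stdout.write(part); // incremental display, no newline inserted
    line += part;               // accumulate what the reader ends up seeing
  }
  process.stdout.write("\n");   // a single newline once the stream completes
  return line;
}
```

For example, `renderStreamed(["Spark", "ling", " wa", "ter"])` displays the tokens as one continuous line and returns the joined text.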
