
Commit cbd111e

Add structured outputs
Closes #35.
Parent: 573760f

File tree

1 file changed (+38, −0 lines)


README.md (+38 lines)
@@ -173,6 +173,38 @@ console.log(await promptWithCalculator("What is 2 + 2?"));
We'll likely explore more specific APIs for tool- and function-calling in the future; follow along in [issue #7](https://github.com/webmachinelearning/prompt-api/issues/7).

### Structured output or JSON output
To help with programmatic processing of language model responses, the prompt API supports structured outputs defined by a JSON schema.

```js
const session = await ai.languageModel.create();

const responseJSONSchemaObj = new AILanguageModelResponseSchema({
  type: "object",
  required: ["Rating"],
  additionalProperties: false,
  properties: {
    Rating: {
      type: "number",
      minimum: 0,
      maximum: 5,
    },
  },
});

// Prompt the model and wait for the JSON response to come back.
const result = await session.prompt(
  "Summarize this feedback into a rating between 0-5: " +
    "The food was delicious, service was excellent, will recommend.",
  { responseJSONSchema: responseJSONSchemaObj }
);
console.log(result);
```
The `responseJSONSchema` option for `prompt()` and `promptStreaming()` can also accept a JSON schema directly as a JavaScript object. This is particularly useful for cases where the schema is not reused for other prompts.
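For illustration, here is a sketch of passing the schema inline as a plain object. The stubbed `session` below is an assumption standing in for a real session (which would come from `ai.languageModel.create()` in a supporting user agent), so the snippet is self-contained:

```javascript
// Stub session for illustration only; in a supporting browser this would be
// `const session = await ai.languageModel.create();`. The stub just returns a
// fixed schema-compliant string.
const session = {
  async prompt(text, options) {
    return JSON.stringify({ Rating: 5 });
  },
};

// Pass the JSON schema directly as a plain object, with no
// AILanguageModelResponseSchema wrapper.
session
  .prompt(
    "Summarize this feedback into a rating between 0-5: " +
      "The food was delicious, service was excellent, will recommend.",
    {
      responseJSONSchema: {
        type: "object",
        required: ["Rating"],
        additionalProperties: false,
        properties: {
          Rating: { type: "number", minimum: 0, maximum: 5 },
        },
      },
    }
  )
  .then((result) => console.log(result));
```

This avoids constructing a wrapper object for one-off schemas that won't be reused across prompts.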
While processing the JSON schema, if the user agent detects an unsupported schema, a `"NotSupportedError"` `DOMException` will be raised with an appropriate error message. The result value returned is a string that can be parsed with `JSON.parse()`. If the user agent is unable to produce a response that is compliant with the schema, a `"SyntaxError"` `DOMException` will be raised.
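The error handling and parsing described above might be combined as in the following sketch. The `promptForRating` helper is hypothetical, and the stubbed `session` (which just returns a fixed compliant string) stands in for a real one so the snippet is self-contained:

```javascript
// Stub standing in for a real session from `ai.languageModel.create()`.
const session = {
  async prompt(text, options) {
    return JSON.stringify({ Rating: 4 });
  },
};

// Hypothetical helper: prompt with a schema, then parse the string result.
async function promptForRating(session, feedback, schema) {
  try {
    const result = await session.prompt(
      "Summarize this feedback into a rating between 0-5: " + feedback,
      { responseJSONSchema: schema }
    );
    // The result is a string; parse it to get the structured value.
    return JSON.parse(result).Rating;
  } catch (err) {
    if (err.name === "NotSupportedError") {
      // The user agent does not support something in the supplied schema.
    } else if (err.name === "SyntaxError") {
      // The user agent could not produce a schema-compliant response.
    }
    throw err;
  }
}

promptForRating(session, "The food was delicious.", {
  type: "object",
  required: ["Rating"],
  properties: { Rating: { type: "number", minimum: 0, maximum: 5 } },
}).then((rating) => console.log(rating));
```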
### Configuration of per-session parameters
In addition to the `systemPrompt` and `initialPrompts` options shown above, the currently-configurable model parameters are [temperature](https://huggingface.co/blog/how-to-generate#sampling) and [top-K](https://huggingface.co/blog/how-to-generate#top-k-sampling). The `params()` API gives the default and maximum values for these parameters.
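As a sketch of how `params()` might be used when creating a session: the property names below (`defaultTopK`, `maxTopK`, `defaultTemperature`, `maxTemperature`) and the stubbed `ai` object are assumptions for illustration; in a supporting user agent `ai` would be provided globally:

```javascript
// Stub of the `ai` global for illustration; the values are invented.
const ai = {
  languageModel: {
    async params() {
      return {
        defaultTopK: 3,
        maxTopK: 8,
        defaultTemperature: 1.0,
        maxTemperature: 2.0,
      };
    },
    async create(options) {
      return { options };
    },
  },
};

// Read the advertised defaults and clamp a requested temperature to the
// advertised maximum before creating a session.
async function createCustomSession() {
  const params = await ai.languageModel.params();
  const temperature = Math.min(
    params.defaultTemperature * 1.5,
    params.maxTemperature
  );
  return ai.languageModel.create({ temperature, topK: params.defaultTopK });
}

createCustomSession().then((session) => console.log(session.options));
```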
@@ -475,6 +507,11 @@ interface AILanguageModelParams {
  readonly attribute float maxTemperature;
};

[Exposed=(Window,Worker)]
interface AILanguageModelResponseSchema {
  constructor(object responseJSONSchemaObject);
};

dictionary AILanguageModelCreateCoreOptions {
  // Note: these two have custom out-of-range handling behavior, not in the IDL layer.
  // They are unrestricted double so as to allow +Infinity without failing.
@@ -503,6 +540,7 @@ dictionary AILanguageModelPrompt {
};

dictionary AILanguageModelPromptOptions {
  object responseJSONSchema;
  AbortSignal signal;
};
