# Handy Prompt (hprompt)

Handy prompt files, or `.hprompt` files, are self-contained LLM request files in a markup format. See more examples below.
```hprompt
---
# frontmatter data
model: gpt-3.5-turbo
temperature: 0.5
meta:
  credential_path: .env
  var_map_path: substitute.txt
  output_path: out/%Y-%m-%d/result.%H-%M-%S.hprompt
---
$system$
You are a helpful assistant.
$user$
Your current context:
%context%
Please follow my instructions:
%instructions%
```
- Frontmatter:
  - Specify request arguments
  - Specify output path and other runtime configurations
- Body:
  - Construct chat messages with `$system$`, `$user$`, `$assistant$` and `$tool$`
`.hprompt` files are parsed into `HandyPrompt` objects by `hprompt.load_from("xxx.hprompt")`, and there are two kinds: `ChatPrompt` and `CompletionsPrompt`. The only difference is whether the body part contains message role keys (e.g. `$user$`). Check out the corresponding parts of CLI Usage and Dev Usage respectively.
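Conceptually, the distinction can be sketched in a few lines of plain Python. This is only an illustration of the rule, not the library's actual parser; `split_hprompt` and `is_chat_prompt` are hypothetical helpers:

```python
import re

# A role key such as $user$ at the start of a line marks a chat message.
ROLE_KEY = re.compile(r"^\$(system|user|assistant|tool)\$", re.MULTILINE)


def split_hprompt(text: str):
    """Split raw .hprompt text into (frontmatter, body).

    The frontmatter, if present, is delimited by '---' lines at the top.
    """
    match = re.match(r"^---\n(.*?)\n---\n(.*)$", text, re.DOTALL)
    if match:
        return match.group(1), match.group(2)
    return "", text


def is_chat_prompt(body: str) -> bool:
    """A body containing role keys would load as a ChatPrompt;
    otherwise it would be a CompletionsPrompt."""
    return ROLE_KEY.search(body) is not None


text = "---\nmodel: gpt-3.5-turbo\n---\n$user$\nHello!"
frontmatter, body = split_hprompt(text)
print(is_chat_prompt(body))  # True
```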
The root keywords of the frontmatter are request arguments that will be passed to the chat API or completions API. Examples:
> [!TIP]
> YAML is a superset of JSON, so you can use the JSON format for some fields.
```yaml
model: gpt-3.5-turbo
temperature: 0.5
response_format: { "type": "json_object" }
stream: true
timeout: 10
tools: [
  {
    "type": "function",
    "function": {
      "name": "get_current_weather",
      "description": "Get the current weather in a given location",
      "parameters": {
        "type": "object",
        "properties": {
          "location": {
            "type": "string",
            "description": "The city and state, e.g. San Francisco, CA"
          },
          "unit": {
            "type": "string",
            "enum": ["celsius", "fahrenheit"]
          }
        },
        "required": ["location"]
      }
    }
  }
]
```
Here you can also specify endpoint information like `api_key`, `organization`, `api_type`, `api_base`, and `api_version` (used for Azure endpoints). However, it is recommended to store these in a separate credential file and point to its path in the `meta` field (see below).
The `meta` field of the frontmatter is parsed into a `RunConfig` dataclass instance. All options are exemplified below:
> [!NOTE]
> All relative paths in the `meta` field are resolved relative to the hprompt file directory.
```yaml
meta:
  # record request arguments in the output file
  # options: blacklist, whitelist, none, all
  # if not specified, use blacklist
  record_request: blacklist
  # if record_request is blacklist, record all request arguments except these
  # if not specified, use DEFAULT_BLACKLIST
  record_blacklist: ["api_key", "organization"]
  # if record_request is whitelist, record only these
  record_whitelist: ["temperature", "model"]
  # variable map
  var_map:
    # wrap variable names with quotes because they contain special characters
    "%variable1%": data1
    "%variable2%": "data2"
  # alternatively, use a variable map file
  var_map_path: var_map.txt
  # output the result to a file; you can use strftime format
  output_path: out/%Y-%m-%d/result.%H-%M-%S.hprompt
  # output the evaluated prompt to a file; you can use strftime format
  output_evaled_prompt_path: out/%Y-%m-%d/evaled.%H-%M-%S.hprompt
  # credential file path
  credential_path: credential.env
  # credential type
  # options: env, json, yaml
  # if env, load environment variables from the credential file
  # if json or yaml, load the content of the file as request arguments
  # if not specified, guess from the file extension
  credential_type: env
  # verbose output to stderr
  verbose: false
```
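For instance, the strftime expansion of `output_path`, combined with the relative-path resolution described in the note above, might work roughly as follows. `resolve_output_path` is a hypothetical helper, and the library's own resolution may differ in details:

```python
import os
from datetime import datetime


def resolve_output_path(output_path: str, hprompt_dir: str, now: datetime) -> str:
    """Expand strftime codes in the configured path and resolve it
    relative to the directory containing the .hprompt file."""
    expanded = now.strftime(output_path)
    return os.path.normpath(os.path.join(hprompt_dir, expanded))


path = resolve_output_path(
    "out/%Y-%m-%d/result.%H-%M-%S.hprompt",
    "/prompts",
    datetime(2024, 5, 1, 12, 30, 45),
)
print(path)  # /prompts/out/2024-05-01/result.12-30-45.hprompt
```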
There are two subclasses of `HandyPrompt`: `ChatPrompt` and `CompletionsPrompt`. Both share the same frontmatter format.

For `ChatPrompt`, the body is a markup representation of chat messages. Each role key (e.g. `$system$` / `$user$` / `$assistant$` / `$tool$`) should be placed on its own line, followed by the message content:
```hprompt
$system$
You are a helpful assistant.
$user$
Please help me merge the following two JSON documents into one.
$assistant$
Sure, please give me the two JSON documents.
$user$
{
"item1": "It is really a good day."
}
{
"item2": "Indeed."
}
%output_format%
```
You can specify extra properties of a message after the role key:

```hprompt
$role$ {key1="value1" key2='value2'}
```

This gives you support for tool calls, image input, extra name fields, etc. See the examples below.
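Such a role line could be parsed with a small regex, for example. These are hypothetical helpers for illustration, not the library's internal parser:

```python
import re

# Matches a role line like: $assistant$ {type="tool_calls"}
ROLE_LINE = re.compile(r"^\$(?P<role>\w+)\$(?:\s*\{(?P<props>[^}]*)\})?\s*$")
# Matches one key="value" or key='value' pair inside the braces.
PROP = re.compile(r"""(\w+)=(?:"([^"]*)"|'([^']*)')""")


def parse_role_line(line: str):
    """Return (role, properties) for a role key line, or None if the
    line is not a role key."""
    m = ROLE_LINE.match(line)
    if not m:
        return None
    props = {key: dq or sq for key, dq, sq in PROP.findall(m.group("props") or "")}
    return m.group("role"), props


print(parse_role_line('$assistant$ {type="tool_calls"}'))
# ('assistant', {'type': 'tool_calls'})
```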
As for `CompletionsPrompt`, the only difference is that there are no role keys in the body. Variable substitution works the same.
You should place credentials in a separate file and specify its path in `meta.credential_path` in the frontmatter. `meta.credential_type` can be `env` / `json` / `yaml`; you can omit it and let it be inferred from the file extension.

Example `.env`:
```env
OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxxxxxxxxxxxxx
OPENAI_ORGANIZATION=org-xxxxxxxxxxxxxxxxxxxxx
```
Example `credential.yaml`:

```yaml
api_key: sk-xxxxxxxxxxxxxxxxxxxxxxxxxxx
organization: org-xxxxxxxxxxxxxxxxxxxxx
```
Example `credential.json`:

```json
{
  "api_key": "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxx",
  "organization": "org-xxxxxxxxxxxxxxxxxxxxx"
}
```
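The extension-based guessing and `.env` parsing described above can be sketched as follows. These are hypothetical helpers, and the exact mapping and parsing rules are assumptions:

```python
import os


def guess_credential_type(path: str) -> str:
    """Guess credential_type from the file extension (assumed mapping)."""
    ext = os.path.splitext(path)[1].lower()
    if ext == ".env":
        return "env"
    if ext == ".json":
        return "json"
    if ext in (".yaml", ".yml"):
        return "yaml"
    raise ValueError(f"cannot guess credential type for {path!r}")


def load_env_credentials(text: str) -> dict:
    """Parse simple KEY=VALUE lines from a .env file, skipping blanks
    and comments."""
    creds = {}
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            creds[key.strip()] = value.strip()
    return creds


print(guess_credential_type("credential.env"))  # env
print(load_env_credentials("OPENAI_API_KEY=sk-xxx"))
# {'OPENAI_API_KEY': 'sk-xxx'}
```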
You can substitute placeholder variables like `%output_format%`, which are replaced according to the `var_map` dict specified in the frontmatter `meta` field. Note that variable names need to be wrapped in quotes because they contain special characters.
```hprompt
---
meta:
  var_map:
    '%output_format%': Please output a single YAML object that contains all items from the two input JSON objects.
    '%variable2%': Placeholder text.
    '%variable1%': Placeholder text.
---
```
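The substitution itself amounts to plain string replacement over the body, roughly as below. This is a sketch of the behavior, not the library's code:

```python
def substitute(body: str, var_map: dict) -> str:
    """Replace each %variable% occurrence in the body with its mapped value."""
    for name, value in var_map.items():
        body = body.replace(name, value)
    return body


body = "Please follow my instructions:\n%instructions%"
print(substitute(body, {"%instructions%": "Summarize the text."}))
# Please follow my instructions:
# Summarize the text.
```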
You can also store them in text files (specify `meta.var_map_path` in the frontmatter) to make multiple prompts modular. A substitute map `substitute.txt` looks like this:

```text
%output_format%
Please output a single YAML object that contains all items from the two input JSON objects.
%variable1%
Placeholder text.
%variable2%
Placeholder text.
```
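A parser for this file format might look like the following sketch. It assumes each value runs until the next `%name%` line, which matches the example above but is otherwise an assumption:

```python
def parse_var_map_file(text: str) -> dict:
    """Parse a substitute.txt-style variable map: each %name% line is
    followed by its replacement text."""
    var_map, current = {}, None
    for line in text.splitlines():
        stripped = line.strip()
        if stripped.startswith("%") and stripped.endswith("%") and len(stripped) > 2:
            # A new variable name starts here.
            current = stripped
            var_map[current] = ""
        elif current is not None:
            # Accumulate value lines under the current variable.
            var_map[current] = (var_map[current] + "\n" + line).strip()
    return var_map


print(parse_var_map_file("%variable1%\nPlaceholder text."))
# {'%variable1%': 'Placeholder text.'}
```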
You need to:

1. Specify `tools` in the frontmatter.
2. The response type for `$assistant$` will then be `tool_calls`, marked as `$assistant$ {type="tool_calls"}`, with the tool calls written in YAML format.
3. Append `$tool$` messages with the corresponding `tool_call_id`s for further chatting.
Full example:

```hprompt
---
model: gpt-4o
tools:
  - function:
      description: Get the current weather in a given location
      name: get_current_weather
      parameters:
        properties:
          location:
            description: The city and state, e.g. San Francisco, CA
            type: string
          unit:
            enum:
              - celsius
              - fahrenheit
            type: string
        required:
          - location
          - unit
        type: object
    type: function
---
$user$
Please tell me the weather in SF and NY.
$assistant$ {type="tool_calls"}
- function:
    arguments: '{"location": "San Francisco, CA", "unit": "celsius"}'
    name: get_current_weather
  id: call_fwXuTQZSjPr966yMrrzymHb3
  index: 0
  type: function
- function:
    arguments: '{"location": "New York, NY", "unit": "fahrenheit"}'
    name: get_current_weather
  id: call_bHFVWmtBk0z8wsEn3k6VHzOo
  index: 1
  type: function
$tool$ {tool_call_id="call_fwXuTQZSjPr966yMrrzymHb3"}
24
$tool$ {tool_call_id="call_bHFVWmtBk0z8wsEn3k6VHzOo"}
75
```
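For reference, the conversation above corresponds to roughly this message list at the chat API level. Field names follow the OpenAI chat format; this is an illustration, not the library's serializer:

```python
# Each $tool$ message carries the tool_call_id of the call it answers,
# so the model can match results back to its requests.
messages = [
    {"role": "user", "content": "Please tell me the weather in SF and NY."},
    {
        "role": "assistant",
        "content": None,
        "tool_calls": [
            {
                "id": "call_fwXuTQZSjPr966yMrrzymHb3",
                "type": "function",
                "function": {
                    "name": "get_current_weather",
                    "arguments": '{"location": "San Francisco, CA", "unit": "celsius"}',
                },
            },
            {
                "id": "call_bHFVWmtBk0z8wsEn3k6VHzOo",
                "type": "function",
                "function": {
                    "name": "get_current_weather",
                    "arguments": '{"location": "New York, NY", "unit": "fahrenheit"}',
                },
            },
        ],
    },
    {"role": "tool", "tool_call_id": "call_fwXuTQZSjPr966yMrrzymHb3", "content": "24"},
    {"role": "tool", "tool_call_id": "call_bHFVWmtBk0z8wsEn3k6VHzOo", "content": "75"},
]
```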
You need to specify the `content_array` type for `$user$` and then write the content in YAML array format. Full example:
```hprompt
---
model: gpt-4o
---
$user$ {type="content_array"}
- text: What's in this image?
  type: text
- image_url:
    url: https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg
  type: image_url
```
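At the API level, this body corresponds to roughly the following message. The field names follow the OpenAI chat format; this is an illustration, not the library's output:

```python
# A content_array user message mixes text and image parts in one list.
message = {
    "role": "user",
    "content": [
        {"type": "text", "text": "What's in this image?"},
        {
            "type": "image_url",
            "image_url": {
                "url": "https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg"
            },
        },
    ],
}
```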