1 LLM API Library
procedure
(prompt! strs ...) → (or/c string? void?) strs : string?
Sends the current prompt, from current-prompt-port, to the current LLM backend via current-send-prompt!.
The values strs are first written to current-prompt-port.
No prompt is sent if the current-prompt-port is empty.
The result is the string returned by the LLM, or (void) if no prompt is sent.
Examples:
> (require llm llm/ollama/phi3)
> (display (prompt! "Please write a haiku about the reliability and performance of Phi3 for use in software engineering." "Make it a short haiku."))
Flawless code streams,
Phi3 runs smoothly every time—
Reliable friend.
This brief poetic form reflects on the consistency and dependability of Phi3 as if personified by an enduring companion that one can count on in times of need within software engineering projects – it suggests performance without detailing specific metrics or features, keeping with traditional haiku's focus on nature-inspired imagery.
parameter
(current-prompt-port) → string-port?
(current-prompt-port port) → void? port : string-port?
A parameter to which the prompt is written before being sent to the current backend.
Default value is a new output string-port?.
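Because the prompt accumulates in this port, a prompt can be built incrementally and then sent with a bare call to prompt!. A minimal sketch, assuming an Ollama phi3 backend is available locally:

```racket
#lang racket
(require llm llm/ollama/phi3)

;; Write pieces of the prompt directly to the parameter's port...
(write-string "Summarize in one sentence: " (current-prompt-port))
(write-string "Racket is a descendant of Scheme." (current-prompt-port))

;; ...then send the accumulated prompt with no further arguments.
(display (prompt!))
```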
parameter
(current-send-prompt!) → (-> string? ... void?)
(current-send-prompt! prompt!) → void? prompt! : (-> string? ... void?)
A parameter that defines how to send a prompt to the current backend.
Typically configured by importing a backend, rather than accessed manually.
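For offline testing it can be useful to install a stub handler instead of a real backend. The sketch below parameterizes current-send-prompt! with a hypothetical logging procedure that matches the (-> string? ... void?) contract shown above; real backends are installed by requiring a module such as llm/ollama/phi3:

```racket
#lang racket
(require llm)

;; Hypothetical stub: print each prompt fragment instead of contacting
;; a backend. Matches the (-> string? ... void?) contract.
(parameterize ([current-send-prompt!
                (lambda strs
                  (for-each display strs)
                  (newline))])
  (prompt! "hello"))
```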
parameter
(current-response-timeout) → natural-number/c
(current-response-timeout seconds) → void? seconds : natural-number/c
A parameter that defines how many seconds to wait for a response from the LLM after sending a prompt.
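Slow local models may need a longer window than the default. A sketch, assuming an Ollama phi3 backend is available and that 120 seconds is a suitable limit:

```racket
#lang racket
(require llm llm/ollama/phi3)

;; Wait up to 120 seconds for the backend's response.
(parameterize ([current-response-timeout 120])
  (display (prompt! "Explain tail-call optimization in one sentence.")))
```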
value