# Generators
Generators produce model outputs from message sequences. The Python SDK ships with a
Generator base class, GenerateParams for tuning inference, and get_generator()
for resolving backend identifiers.
## Create a generator

```python
from dreadnode.generators.generator import GenerateParams, get_generator
from dreadnode.generators.message import Message


async def main() -> None:
    generator = get_generator("gpt-4o-mini")
    params = GenerateParams(temperature=0.2, max_tokens=200)
    messages = [
        Message(role="system", content="You are a concise assistant."),
        Message(role="user", content="Give me a one-sentence summary of Dreadnode."),
    ]
    result = await generator.generate_messages([messages], [params])
    generated = result[0]
    print(generated.message.content)
```

## GenerateParams basics
GenerateParams maps to common inference settings such as temperature, max_tokens,
top_p, stop, and tool_choice. Per-call params are merged with the generator's
defaults when a request is made.
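The exact merge semantics live inside the SDK, but the intent is that a per-call value overrides the generator default field by field, while unset fields fall back. A minimal sketch of that idea with plain dicts (names here are illustrative, not the SDK's internals):

```python
# Illustrative sketch of field-by-field param merging; this is NOT the
# SDK's actual implementation, only the general pattern it describes.

def merge_params(defaults: dict, overrides: dict) -> dict:
    """Per-call values win; unset (None) values fall back to the defaults."""
    merged = dict(defaults)
    for key, value in overrides.items():
        if value is not None:
            merged[key] = value
    return merged


defaults = {"temperature": 0.0, "max_tokens": 512, "top_p": None}
per_call = {"temperature": 0.2, "max_tokens": None, "top_p": 0.9}

# temperature and top_p come from the call, max_tokens from the defaults
print(merge_params(defaults, per_call))
```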
```python
from dreadnode.generators.generator import GenerateParams, get_generator

generator = get_generator("gpt-4o-mini", params=GenerateParams(temperature=0.0))
```

## Backends and identifiers
get_generator() resolves backend identifiers. The default provider is LiteLLM, so
get_generator("gpt-4o-mini") returns a LiteLLMGenerator under the hood. Other
backends include VLLMGenerator, TransformersGenerator, and HTTPGenerator.
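Identifier strings pair a provider prefix with a model name (`provider!model`), optionally followed by comma-separated `key=value` options. The actual resolution logic is internal to get_generator(); a hypothetical sketch of how such strings could be split, purely to illustrate the format:

```python
# Hypothetical parser for "provider!model,key=value" identifiers.
# The real resolution happens inside get_generator(); this only
# illustrates the apparent structure of the identifier strings.

def parse_identifier(identifier: str, default_provider: str = "litellm"):
    provider, _, rest = identifier.partition("!")
    if not rest:  # no "!" present: the whole string is the model
        provider, rest = default_provider, identifier
    model, *pairs = rest.split(",")
    options = dict(pair.split("=", 1) for pair in pairs)
    return provider, model, options


print(parse_identifier("gpt-4o-mini"))
print(parse_identifier("http!my-endpoint,api_key=ENV_API_KEY"))
```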
```python
from dreadnode.generators.generator import get_generator

litellm = get_generator("litellm!openai/gpt-4o")
vllm = get_generator("vllm!meta-llama/Meta-Llama-3-8B-Instruct")
transformers = get_generator("transformers!microsoft/phi-2")
http = get_generator("http!my-endpoint,api_key=ENV_API_KEY")
```

## HTTP generator
Use HTTPGenerator when you want to map messages to a custom HTTP endpoint.
```python
from dreadnode.generators.generator import HTTPGenerator

generator = HTTPGenerator.for_json_endpoint(
    "https://api.example.com/v1/chat",
    request={
        "model": "{{ model }}",
        "messages": "$messages",
    },
    response={
        "content_path": "$.choices[0].message.content",
    },
)
```
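The content_path value reads like a JSONPath expression: it walks the parsed JSON response to extract the generated text. A minimal, hypothetical resolver for the simple `$.key[index].key` form used above (the SDK may support a richer path syntax; this only illustrates the concept):

```python
import re

# Hypothetical resolver for simple "$.a[0].b" paths against a parsed
# JSON response. The SDK's actual path handling may differ.

def resolve_path(data, path: str):
    # Tokens are either bare keys ("choices") or bracketed indices ("[0]").
    for token in re.findall(r"[A-Za-z_][A-Za-z_0-9]*|\[\d+\]", path):
        if token.startswith("["):
            data = data[int(token[1:-1])]  # list index
        else:
            data = data[token]  # dict key
    return data


response = {"choices": [{"message": {"content": "Hello!"}}]}
print(resolve_path(response, "$.choices[0].message.content"))  # Hello!
```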