Formats conversation messages using the model's built-in chat template or a custom template. This is essential for chat models that expect specific formatting for multi-turn conversations.

Usage

apply_chat_template(model, messages, template = NULL, add_assistant = TRUE)

Arguments

model

A model object created with model_load

messages

A list of chat messages, each a list with 'role' and 'content' fields. The role must be 'user', 'assistant', or 'system'

template

Optional custom template string (default: NULL, which uses the model's built-in template)

add_assistant

Whether to append the assistant prompt suffix so the model will generate a reply (default: TRUE); see the sketch below for formatting a transcript without it
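
Setting add_assistant = FALSE formats a finished transcript without appending the assistant prompt suffix, which is useful when you only want to inspect or log the templated text rather than generate from it. This is a minimal sketch, assuming the same model_load() workflow as the example below; the exact output depends on the model's built-in template.

# Format a finished transcript for inspection only (no generation)
model <- model_load("path/to/chat_model.gguf")

messages <- list(
  list(role = "user", content = "Hello!"),
  list(role = "assistant", content = "Hi! How can I help?")
)

# add_assistant = FALSE leaves off the trailing assistant prompt suffix
transcript <- apply_chat_template(model, messages, add_assistant = FALSE)
cat(transcript)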

Value

Formatted prompt string ready for text generation

Examples

if (FALSE) { # \dontrun{
# Load a chat model
model <- model_load("path/to/chat_model.gguf")

# Format a conversation
messages <- list(
  list(role = "system", content = "You are a helpful assistant."),
  list(role = "user", content = "What is machine learning?"),
  list(role = "assistant", content = "Machine learning is..."),
  list(role = "user", content = "Give me an example.")
)

# Apply chat template
formatted_prompt <- apply_chat_template(model, messages)

# Generate response
response <- quick_llama(formatted_prompt)
} # }
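
To keep a conversation going, re-apply the template each time the message list grows. The sketch below continues from the objects created in the example above (model, messages, response) and assumes quick_llama() returns the assistant's reply as a character string; it is an illustration, not part of the package's canonical examples.

# Append the model's reply and the next user turn, then reformat
messages <- c(
  messages,
  list(
    list(role = "assistant", content = response),
    list(role = "user", content = "How is that used in practice?")
  )
)

# Re-apply the template and generate the next reply
formatted_prompt <- apply_chat_template(model, messages)
response <- quick_llama(formatted_prompt)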