Vrin’s specialization feature lets you configure the AI’s behavior, tone, and reasoning focus to match your domain.
## specialize()

Set a custom system prompt and optional reasoning parameters.

```python
client.specialize(
    custom_prompt="You are a senior financial analyst specializing in tech sector M&A.",
    reasoning_focus=["cross_document_synthesis", "causal_chains"],
    analysis_depth="expert",
)
```
### Parameters

`custom_prompt` (`str`): System prompt that defines the AI’s persona, domain expertise, and response style.

Additional keyword arguments are forwarded to the backend. Common options:

| Parameter | Type | Description |
|---|---|---|
| `reasoning_focus` | `List[str]` | Reasoning strategies to emphasize |
| `analysis_depth` | `str` | `"basic"`, `"detailed"`, or `"expert"` |
### Example specializations

Legal research:

```python
client.specialize(
    custom_prompt="You are a senior M&A legal partner with 25+ years of experience. "
    "Cite specific clauses and precedents. Flag regulatory risks.",
)
```

Sales enablement:

```python
client.specialize(
    custom_prompt="You are a sales engineer. Answer questions using battle cards, "
    "case studies, and pricing documentation. Be concise and action-oriented.",
)
```

Technical documentation:

```python
client.specialize(
    custom_prompt="You are a senior software architect. Reference code examples, "
    "API docs, and architecture diagrams. Prefer precise technical language.",
)
```
## get_specialization()

Retrieve the current specialization settings.

```python
settings = client.get_specialization()
print(settings.get("custom_prompt"))
```
### Returns

A `dict` of the current settings, for example:

```python
{
    "custom_prompt": "You are a senior financial analyst...",
    "reasoning_focus": ["cross_document_synthesis"],
    "analysis_depth": "expert"
}
```
## Notes

- Specialization is persisted per user; it applies to all subsequent queries until changed.
- To reset to default behavior, call `specialize()` with a generic prompt.
- Specialization affects the LLM generation step only. Retrieval (graph traversal + vector search) is not modified.
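Because settings persist until changed, temporarily switching personas can follow a save-and-restore pattern. A minimal sketch; the `StubClient` below is a stand-in used only to make the example self-contained (the real `client` comes from the Vrin SDK), and `with_temporary_specialization` is an illustrative helper, not part of the API:

```python
class StubClient:
    """Stand-in mirroring the two documented calls, for illustration only."""

    def __init__(self):
        self._settings = {}

    def specialize(self, **kwargs):
        self._settings = dict(kwargs)

    def get_specialization(self):
        return dict(self._settings)


def with_temporary_specialization(client, **kwargs):
    """Apply new settings and return the previous ones so they can be restored."""
    previous = client.get_specialization()
    client.specialize(**kwargs)
    return previous


client = StubClient()
client.specialize(custom_prompt="You are a senior financial analyst.")

previous = with_temporary_specialization(
    client, custom_prompt="You are a sales engineer. Be concise."
)
# ... run sales-focused queries here ...

client.specialize(**previous)  # restore the original persona
print(client.get_specialization()["custom_prompt"])
# prints "You are a senior financial analyst."
```

Restoring from the saved settings avoids relying on hard-coded prompt strings when handing control back to the previous workflow.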