AI Service

The AI service provides AI-powered chat functionality, enabling users to query fund data using natural language.

Architecture

The AI service is a separate Django application using:
  • Django Channels for WebSocket support
  • pydantic-ai for agent orchestration
  • Multiple LLM providers with automatic fallback

Key Components

| Component | Description |
| --- | --- |
| `agents/` | Tool registry, model management, base classes |
| `apps/` | Domain-specific tools |
| `prompts/` | System prompt configuration |
| `settings/` | AI service configuration |

LLM Providers

The service supports multiple providers:
| Provider | Models |
| --- | --- |
| Anthropic | Claude Opus, Sonnet, Haiku |
| OpenAI | GPT-4, GPT-3.5 |
| Google | Gemini Pro |
| AWS Bedrock | Claude via Bedrock |
Automatic fallback keeps the service available if the primary model fails.
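The fallback path can be sketched roughly as follows. This is an illustrative stand-in, not the service's actual API: the real orchestration goes through pydantic-ai, and `complete`, `broken`, and `working` are made-up names.

```python
import asyncio
import logging

logger = logging.getLogger("ai.fallback")

async def complete(prompt: str, primary, fallback) -> str:
    """Try the primary model; on any provider error, use the fallback."""
    try:
        return await primary(prompt)
    except Exception:
        logger.warning("primary model failed; falling back", exc_info=True)
        return await fallback(prompt)

# Simulated providers: the primary is down, the fallback answers.
async def broken(prompt: str) -> str:
    raise RuntimeError("rate limited")

async def working(prompt: str) -> str:
    return f"answer to: {prompt}"

print(asyncio.run(complete("list funds", broken, working)))  # answer to: list funds
```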

Available Tools

Fund Family Tools

| Tool | Description |
| --- | --- |
| `list_fund_families` | List all accessible fund families |
| `search_fund_family` | Search by name |

Entity Tools

| Tool | Description |
| --- | --- |
| `search_entities` | Search by name or external ID |
| `list_investor_entities` | List investor entities |
| `list_investment_entities` | List investments |
| `list_fund_entities` | List funds |

Capital Activity Tools

| Tool | Description |
| --- | --- |
| `get_capital_activity_stats` | Statistics for an investor |
| `get_capital_activity_summary` | ITD summary with breakdown |
| `list_capital_activity_transactions` | Transaction list |

Fee Tools

| Tool | Description |
| --- | --- |
| `get_itd_fees_accrued_and_paid` | ITD fee summary |
| `search_fee_charge_event` | Find fee events by date |

Financial Reporting Tools

| Tool | Description |
| --- | --- |
| `list_hypothetical_waterfalls` | List waterfalls |
| `get_performance_stats_by_date` | Performance metrics |
| `get_hypothetical_performance_multiples` | TVPI, DPI, etc. |
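The multiples returned here follow the standard private-equity definitions: DPI is distributions over paid-in capital, and TVPI is total value (distributions plus residual value) over paid-in capital. A quick sketch of the arithmetic, not the service's actual implementation:

```python
from decimal import Decimal

def dpi(distributions: Decimal, paid_in: Decimal) -> Decimal:
    # Distributions to Paid-In: cash returned per dollar contributed
    return distributions / paid_in

def tvpi(distributions: Decimal, nav: Decimal, paid_in: Decimal) -> Decimal:
    # Total Value to Paid-In: realized + unrealized value per dollar contributed
    return (distributions + nav) / paid_in

print(dpi(Decimal("60"), Decimal("100")))                  # 0.6
print(tvpi(Decimal("60"), Decimal("70"), Decimal("100")))  # 1.3
```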

Credit Facility Tools

| Tool | Description |
| --- | --- |
| `list_credit_facilities` | List credit facilities |
| `get_credit_facility_stats` | Principal summary |

Calculation Tools

| Tool | Description |
| --- | --- |
| `list_calculations` | List MXL calculations |
| `explain_calculations` | MXL documentation |

Running the Service

```shell
just run-server-ai
```

The service runs on port 5050, accepting WebSocket connections at `/ws/chat/`.

Configuration

```shell
# Primary model
AI_MODEL=bedrock:us.anthropic.claude-opus-4-20250514-v1:0

# Fallback model
AI_FALLBACK_MODEL=openai:gpt-5.1

# API keys
ANTHROPIC_API_KEY=...
OPENAI_API_KEY=...
GEMINI_API_KEY=...
```

WebSocket Protocol

Client to Server

| Message | Description |
| --- | --- |
| `ChatMessageCommand` | Send a chat message |
| `HeartbeatCommand` | Keep connection alive |

Server to Client

| Message | Description |
| --- | --- |
| `ConnectedEvent` | Connection established |
| `AckMessageEvent` | Message received |
| `ChatMessageStreamStartEvent` | Response starting |
| `ChatMessageStreamChunkEvent` | Streaming chunk |
| `ChatMessageStreamEndEvent` | Response complete |
| `TitleGeneratedEvent` | Conversation title |
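On the wire these messages are presumably JSON envelopes. The sketch below is hypothetical: only the message type names come from the tables above, while the `type`, `message`, and `delta` field names are assumptions for illustration.

```python
import json

# Hypothetical ChatMessageCommand envelope (field names are assumptions).
command = {
    "type": "ChatMessageCommand",
    "message": "What were ITD distributions for Fund I?",
}
payload = json.dumps(command)

# Decoding a hypothetical streaming chunk from the server.
event = json.loads('{"type": "ChatMessageStreamChunkEvent", "delta": "ITD "}')
assert event["type"] == "ChatMessageStreamChunkEvent"
print(payload)
```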

Adding New Tools

1. Create schemas

Define input/output models in `tools/schemas.py`:

```python
from datetime import date
from decimal import Decimal

from pydantic import BaseModel

class MyToolInput(BaseModel):
    fund_family_id: str
    date: date

class MyToolOutput(BaseModel):
    result: Decimal
```
2. Implement function

Create the tool in `tools/functions.py`:

```python
async def my_tool(ctx: AgentContext, input: MyToolInput) -> MyToolOutput:
    # Call the main server API on the user's behalf
    result = await ctx.api_client.get_my_data(...)
    return MyToolOutput(result=result)
```
3. Register tool

Add the tool to `tools/loader.py`:

```python
register_tool(
    name="my_tool",
    description="Gets my data for...",
    function=my_tool,
    input_model=MyToolInput,
    output_model=MyToolOutput,
)
```
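Because a tool is just an async function taking a context, it can be unit-tested with a stubbed context. A sketch using plain dataclasses in place of the real pydantic models and `AgentContext`; `StubApiClient` and the value it returns are made up for illustration:

```python
import asyncio
from dataclasses import dataclass
from decimal import Decimal

# Plain stand-ins for the pydantic models in tools/schemas.py.
@dataclass
class MyToolInput:
    fund_family_id: str

@dataclass
class MyToolOutput:
    result: Decimal

class StubApiClient:
    """Fakes the main-server API client; the returned value is made up."""
    async def get_my_data(self, fund_family_id: str) -> Decimal:
        return Decimal("42.5")

@dataclass
class StubContext:  # stands in for AgentContext
    api_client: StubApiClient

async def my_tool(ctx: StubContext, input: MyToolInput) -> MyToolOutput:
    result = await ctx.api_client.get_my_data(input.fund_family_id)
    return MyToolOutput(result=result)

out = asyncio.run(my_tool(StubContext(StubApiClient()), MyToolInput("ff-1")))
print(out.result)  # 42.5
```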

Security

The AI agent:
  • Only accesses data the user has permission to view
  • Uses the user’s authentication context
  • Follows system prompt security policies
  • Logs all tool invocations