# Getting Started with the Agent Builder SDK

A quickstart guide to building your first AI agent with the Agent Builder Lib SDK. Install, configure, and deploy in minutes using Python or TypeScript.
## Why Agent Builder Lib?
Building AI agents from scratch means dealing with model routing, tool orchestration, memory management, and protocol compliance. Agent Builder Lib handles all of that so you can focus on what your agent actually does. The SDK is available for both Python and TypeScript, with identical APIs and full type safety.
## Installation

### Python

Install the SDK using pip or your preferred package manager:

```bash
pip install agent-builder-lib
```

### TypeScript

Install via npm, pnpm, or yarn:

```bash
npm install @agent-builder/sdk
```
## Creating Your First Agent
An agent needs three things: a name, at least one skill, and a model backend. Here's the minimal setup in both languages.
### Python

```python
from agent_builder import Agent, Skill

async def summarize(text: str) -> str:
    """Summarize the given text."""
    return await agent.llm.complete(
        f"Summarize this concisely: {text}"
    )

agent = Agent(
    name="summarizer",
    description="Summarizes long documents into key points",
    skills=[Skill(name="summarize", handler=summarize)],
)

agent.serve(port=8080)
```
### TypeScript

```typescript
import { Agent, Skill } from '@agent-builder/sdk';

const summarize: Skill = {
  name: 'summarize',
  handler: async (text: string) => {
    return agent.llm.complete(
      `Summarize this concisely: ${text}`
    );
  },
};

const agent = new Agent({
  name: 'summarizer',
  description: 'Summarizes long documents into key points',
  skills: [summarize],
});

agent.serve({ port: 8080 });
```
## Adding Tools
Agents become powerful when they can use tools. The SDK provides a simple decorator pattern (Python) or object pattern (TypeScript) for registering tools:
```python
import httpx

from agent_builder import tool

@tool(description="Fetch a web page and return its text content")
async def fetch_page(url: str) -> str:
    async with httpx.AsyncClient() as client:
        resp = await client.get(url)
        return resp.text
```
Pass tools to your agent and the SDK handles argument parsing, validation, and error recovery automatically. The LLM sees a typed schema for each tool and decides when to invoke them.
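To make the schema idea concrete, here is a rough, self-contained sketch of how a typed schema could be derived from a function's signature and annotations. This is illustrative only, not the SDK's actual implementation; the `tool_schema` helper and the JSON-Schema shape it emits are assumptions:

```python
import inspect
from typing import get_type_hints

# Illustrative mapping from Python annotations to JSON Schema types
PY_TO_JSON = {str: "string", int: "integer", float: "number", bool: "boolean"}

def tool_schema(fn, description: str) -> dict:
    """Build a JSON-Schema-style description of a function's parameters."""
    hints = get_type_hints(fn)
    hints.pop("return", None)
    params = inspect.signature(fn).parameters
    return {
        "name": fn.__name__,
        "description": description,
        "parameters": {
            "type": "object",
            "properties": {
                name: {"type": PY_TO_JSON.get(hints.get(name), "string")}
                for name in params
            },
            # Parameters without defaults are treated as required
            "required": [
                name for name, p in params.items()
                if p.default is inspect.Parameter.empty
            ],
        },
    }

async def fetch_page(url: str) -> str: ...

schema = tool_schema(fetch_page, "Fetch a web page and return its text content")
```

A schema like this is what lets the model see `fetch_page(url: str)` as a callable with one required string argument, rather than opaque Python code.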
## Running Locally

The SDK includes a built-in development server with hot reloading and an interactive console:

```bash
agent-builder dev --port 8080
```
Open http://localhost:8080 to see your agent's AgentCard, send test messages, and inspect traces in real time. No cloud account required for local development.
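You can also exercise the agent programmatically. The sketch below builds a hypothetical skill-invocation request against the local dev server; the endpoint path (`/skills/invoke`) and payload shape are assumptions for illustration, not the SDK's documented wire format:

```python
import json
import urllib.request

# Hypothetical payload for invoking the "summarize" skill; the real
# request format is defined by the SDK and the protocol it serves.
payload = {
    "skill": "summarize",
    "input": {"text": "Agent Builder Lib handles routing, tools, and memory."},
}

req = urllib.request.Request(
    "http://localhost:8080/skills/invoke",  # assumed endpoint path
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
# With the dev server running, urllib.request.urlopen(req) would send it.
```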
## Deploying to Oya.ai

When you're ready to go live, deploy directly to the Oya.ai platform:

```bash
agent-builder deploy --project my-agent
```
This publishes your agent with A2A protocol support, automatic scaling, and built-in monitoring. Your agent is instantly discoverable by other agents in the ecosystem.
## Next Steps
- Read the full SDK documentation for advanced configuration
- Explore the plugin system to add memory, caching, and custom middleware
- Check out the comparison guide to choose between Python and TypeScript SDKs