Key Benefits
- Deploy agents in seconds - Serverless infrastructure with automatic scaling
- Access from Python, JavaScript, Go, Rust, and more to come - Native-feeling SDKs for every language
- Built-in streaming - Real-time token streaming for responsive applications
Quick Actions
Deploy your first agent
Get started with a complete tutorial
Use frameworks
Integrate with popular AI frameworks
Explore SDKs
Access agents from any language
The Problem Every AI Developer Faces
Suppose you’ve built an incredible AI agent in Python. It uses LangGraph for complex reasoning, leverages powerful tools, and produces amazing results. Your whole team is excited to use it!
Then reality hits: the frontend team needs to access it from JavaScript, the mobile team wants it in Kotlin, the Unity team wants it in C#, and the systems team requires it in Rust.
The traditional approach? Build separate implementations, REST APIs, WebSocket handlers… Sound exhausting? That’s because it is!
What RunAgent Actually Does
RunAgent fundamentally changes how AI agent deployment works. Your Python function signatures automatically become API contracts (REST or streaming) for every supported language. The fastest way to experience the magic is with the RunAgent CLI:
1. Install the RunAgent CLI
This installs the powerful RunAgent CLI, which is used to deploy and manage your agents.
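A minimal sketch of the install, assuming the CLI ships as the runagent package on PyPI (check the install docs if the package name differs for your setup):

```bash
pip install runagent
```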
2. Initialize Your Agent
Let’s start with a minimal agent example. Initializing a project creates a new directory with the basic structure you need to get started.
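As a hedged sketch, assuming the CLI exposes an init command and using a hypothetical project name (the exact command and flags may differ; runagent --help lists what is available):

```bash
runagent init my-agent   # my-agent is a placeholder project name
cd my-agent
```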
3. Define Your Agent Function
In your agent codebase (for example, main.py), define a function that will serve as your agent’s entrypoint:
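A minimal sketch of such an entrypoint; the function name and parameters follow the description below, and the canned reply is purely illustrative:

```python
# main.py
def mock_response(message: str, role: str = "user") -> str:
    """Minimal entrypoint: takes a message and a role, returns the agent's reply.

    In a real agent this is where you would call your framework of choice
    (LangGraph, CrewAI, ...) instead of returning a canned string.
    """
    return f"[{role}] You said: {message!r}. This is a mock response."
```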
This mock_response function is one of the invocation functions for our agent. It takes a message and a role as parameters and returns the agent’s response.
4. Configure the Agent Entrypoint
Open the runagent.config.json file and add your function as an entrypoint:
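A sketch of what the entrypoint entry might look like; the exact nesting of your generated runagent.config.json may differ, but the fields described below (file, module, tag) are the ones that matter here:

```json
{
  "entrypoints": [
    {
      "file": "main.py",
      "module": "mock_response",
      "tag": "mock_response"
    }
  ]
}
```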
The file specifies where your function lives, module is the function name, and tag is a label you’ll use to call this specific entrypoint.
5. Run the Agent Locally
Start your agent locally to test it (the command is sketched below). Once it’s up, the CLI prints output confirming that it’s running. That’s it! Your agent is now accessible through a standard REST API as well as all RunAgent SDKs.
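As a sketch of this step’s command, assuming the CLI exposes a serve command for local runs (check runagent --help for the exact name and arguments):

```bash
runagent serve .   # serve the agent in the current project directory
```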
6. Use the Agent in Your Application
Using the RunAgent SDKs, you can access your agent from any supported language; you only need the agent ID and the entrypoint tag. Your mock_response function becomes accessible in multiple languages, with the same agent logic exposed through idiomatic syntax in each.
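For example, a hedged sketch of the Python SDK call; the client class, constructor arguments, and run method shown here are assumptions based on the flow above (agent ID plus entrypoint tag), so consult the SDK reference for the exact API:

```python
from runagent import RunAgentClient  # assumed client class name

# The agent ID comes from the local serve output; the tag matches runagent.config.json.
client = RunAgentClient(
    agent_id="<your-agent-id>",
    entrypoint_tag="mock_response",
    local=True,  # talk to the locally running agent instead of RunAgent Cloud
)

reply = client.run(message="Hello, agent!", role="user")
print(reply)
```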
From Local Testing to Production in Seconds
Once you’ve tested your agent locally and confirmed everything works, deploying to production is just one command away. RunAgent Cloud is the fastest AI agent deployment platform, designed to take your local agent from development to production instantly. No complex infrastructure setup, no deployment headaches.
Deploy to Production
- Automatic scaling - Handle one request or one million
- Global edge network - Low latency worldwide
- Zero infrastructure management - Focus on your agent, not servers
- Production-ready security - Enterprise-grade isolation and security
Learn More About Deployment
Explore advanced deployment options, environment variables, and monitoring capabilities.
Agent Framework Support
RunAgent works with any Python-based AI agent framework:
LangGraph
Deploy your LangGraph agents with built-in state management and workflow execution.
CrewAI
Deploy multi-agent CrewAI systems with coordinated execution and monitoring.
AutoGen
Deploy Microsoft AutoGen multi-agent conversations with step-by-step execution.
AG2
Deploy AG2 conversational agents with fact-checking and multi-turn interactions.
Agno
Deploy Agno AI agents with print response capabilities and streaming support.
Letta
Deploy Letta memory-enabled agents with persistent conversations and tool integration.
Custom Framework
Use any Python-based framework by defining simple entrypoint functions.
Multi-Language SDK Support
RunAgent provides native-like access to your deployed agents across multiple languages:
Python SDK
Python client with streaming capabilities. Access your agents like local functions with full type safety and Python idioms.
JavaScript SDK
Full TypeScript support with streaming and Promise-based APIs. Perfect for modern web applications with async/await patterns.
Rust SDK
High-performance async SDK with futures and streaming support. Zero-cost abstractions for systems programming and performance-critical applications.
Go SDK
Idiomatic Go client with context-aware operations and channel-based streaming. Built for concurrent, scalable applications.
All Language SDKs
We’re actively developing SDKs for additional languages including C++, C#, Java, and PHP. Want to contribute or request a specific language? Join our Discord community.
Real-Time Streaming Across Languages
In addition to standard REST API responses, you can also stream your agent’s responses seamlessly through our SDKs. When your entrypoint streams a response, RunAgent makes it feel native in every language SDK:
1. Define a Streaming Function
Create a function in your agent codebase that returns an Iterator (in this case in main.py):
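A minimal sketch, with the function name taken from the note below; the word-by-word chunking is purely illustrative:

```python
# main.py
from typing import Iterator


def mock_response_stream(message: str, role: str = "user") -> Iterator[str]:
    """Streaming entrypoint: yields the reply chunk by chunk instead of returning it at once."""
    reply = f"[{role}] You said: {message!r}. This is a mock streaming response."
    for word in reply.split():
        yield word + " "
```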
This mock_response_stream function returns an Iterator that yields response chunks as they’re generated.
2. Configure the Streaming Entrypoint
Add this function to your runagent.config.json file as another entrypoint:
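A sketch of the added entry, assuming the same file/module/tag shape as the earlier entrypoint; note the required _stream suffix on the tag:

```json
{
  "entrypoints": [
    {
      "file": "main.py",
      "module": "mock_response",
      "tag": "mock_response"
    },
    {
      "file": "main.py",
      "module": "mock_response_stream",
      "tag": "mock_response_stream"
    }
  ]
}
```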
The tag for a streaming entrypoint must end with the _stream suffix. This is how RunAgent identifies it as a streaming endpoint.
3. Run the Agent Locally
Spin up the agent just like before. Now you have an additional streaming entrypoint available.
4. Use Streaming in Your Application
Using the RunAgent SDKs, you can stream responses from your agent. The streaming experience feels natural in each language:
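As a hedged sketch in Python, assuming the same client shape as the earlier example and that invoking a _stream-tagged entrypoint returns an iterator of chunks:

```python
from runagent import RunAgentClient  # assumed client class name

client = RunAgentClient(
    agent_id="<your-agent-id>",
    entrypoint_tag="mock_response_stream",  # streaming tag from runagent.config.json
    local=True,
)

# Chunks arrive as the agent yields them, so output can be rendered incrementally.
for chunk in client.run(message="Hello, agent!", role="user"):
    print(chunk, end="", flush=True)
```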
Next Steps
- Head to the tutorials guide to see the full capabilities
- Browse our framework guides for LangGraph, CrewAI, and more
- Check out core concepts to understand how it all works
Still have a question?
- Join our Discord Community
- Email us: [email protected]
- Follow us on X
- New here? Sign up