
Commit 8f4dd2b

saqadri and andrew-lastmile authored and committed
Revise quickstart guide for mcp-agent setup
Focus on CLI path
1 parent ad0288d commit 8f4dd2b

File tree

1 file changed: +132 -105 lines changed

docs/get-started/quickstart.mdx

Lines changed: 132 additions & 105 deletions
@@ -5,122 +5,149 @@ description: "Copy, paste, and run your first mcp-agent in minutes."
 icon: rocket-launch
 ---

-<Info>
-Want the CLI to scaffold this for you? Run `uvx mcp-agent init --template basic --dir my-first-agent` and skip the copy-paste.
-</Info>
-
-## 1. Create a folder
-
-```bash
-mkdir mcp-basic-agent
-cd mcp-basic-agent
-```
-
-## 2. Install dependencies with uv
+Let's get you set up with a hello world mcp-agent!
+
+## Create the agent
+
+<Tabs>
+<Tab title="Use CLI (Recommended)">
+```bash
+mkdir mcp-basic-agent
+cd mcp-basic-agent
+uvx mcp-agent init
+uv init
+uv add mcp-agent
+uv sync
+```
+
+(Prefer pip? `pip install mcp-agent` works too.)
+</Tab>
+<Tab title="Do it manually">
+<Steps>
+<Step title="Create a folder">
+```bash
+mkdir mcp-basic-agent
+cd mcp-basic-agent
+```
+</Step>
+
+<Step title="Install dependencies with uv">
+```bash
+uv init
+uv add mcp-agent
+uv sync
+```
+
+(Prefer pip? `python -m venv .venv && pip install mcp-agent` works too.)
+</Step>
+
+<Step title="Add configuration files">
+`mcp_agent.config.yaml`
+```yaml
+execution_engine: asyncio
+logger:
+  transports: [console]
+  level: info
+
+mcp:
+  servers:
+    fetch:
+      command: "uvx"
+      args: ["mcp-server-fetch"]
+    filesystem:
+      command: "npx"
+      args: ["-y", "@modelcontextprotocol/server-filesystem"]
+
+openai:
+  default_model: gpt-4o-mini
+```
+
+`mcp_agent.secrets.yaml`
+```yaml
+openai:
+  api_key: "your-openai-api-key"
+```
+</Step>
+
+<Step title="Paste the hello world agent">
+`main.py`
+```python
+import asyncio
+import os
+import time
+
+from mcp_agent.app import MCPApp
+from mcp_agent.agents.agent import Agent
+from mcp_agent.workflows.llm.augmented_llm_openai import OpenAIAugmentedLLM
+
+app = MCPApp(name="mcp_basic_agent")
+
+@app.tool()
+async def example_usage() -> str:
+    async with app.run() as session:
+        logger = session.logger
+        context = session.context
+
+        # Let the filesystem server access the current directory
+        context.config.mcp.servers["filesystem"].args.extend([os.getcwd()])
+
+        agent = Agent(
+            name="finder",
+            instruction="""You can read local files or fetch URLs.
+            Return the requested information when asked.""",
+            server_names=["fetch", "filesystem"],
+        )
+
+        async with agent:
+            logger.info("Connected MCP servers", data=list(context.server_registry.registry.keys()))
+
+            llm = await agent.attach_llm(OpenAIAugmentedLLM)
+            result = await llm.generate_str(
+                "Print the contents of README.md verbatim; create it first if missing"
+            )
+            logger.info("README contents", data=result)
+
+            result = await llm.generate_str(
+                "Fetch the first two paragraphs from https://modelcontextprotocol.io/introduction"
+            )
+            logger.info("Fetched content", data=result)
+
+            tweet = await llm.generate_str(
+                "Summarize that content in a 140-character tweet"
+            )
+            logger.info("Tweet", data=tweet)
+            return tweet
+
+if __name__ == "__main__":
+    start = time.time()
+    asyncio.run(example_usage())
+    end = time.time()
+    print(f"Finished in {end - start:.2f}s")
+```
+</Step>
+</Steps>
+</Tab>
+</Tabs>
+
+## Run it locally

 ```bash
-uv init
-uv add mcp-agent
-uv sync
-```
-
-(Prefer pip? `python -m venv .venv && pip install mcp-agent` works too.)
-
-## 3. Add configuration files
-
-`mcp_agent.config.yaml`
-```yaml
-execution_engine: asyncio
-logger:
-  transports: [console]
-  level: info
-
-mcp:
-  servers:
-    fetch:
-      command: "uvx"
-      args: ["mcp-server-fetch"]
-    filesystem:
-      command: "npx"
-      args: ["-y", "@modelcontextprotocol/server-filesystem"]
-
-openai:
-  default_model: gpt-4o-mini
+uv run main.py
 ```

-`mcp_agent.secrets.yaml`
-```yaml
-openai:
-  api_key: "your-openai-api-key"
-```
+You should see log entries for tool discovery, file access, web fetches, and the final summary tweet. Try editing the instructions or adding new MCP servers to see how the agent evolves.

-## 4. Paste the example agent
-
-`main.py`
-```python
-import asyncio
-import os
-import time
-
-from mcp_agent.app import MCPApp
-from mcp_agent.agents.agent import Agent
-from mcp_agent.workflows.llm.augmented_llm_openai import OpenAIAugmentedLLM
-
-app = MCPApp(name="mcp_basic_agent")
-
-@app.tool()
-async def example_usage() -> str:
-    async with app.run() as session:
-        logger = session.logger
-        context = session.context
-
-        # Let the filesystem server access the current directory
-        context.config.mcp.servers["filesystem"].args.extend([os.getcwd()])
-
-        agent = Agent(
-            name="finder",
-            instruction="""You can read local files or fetch URLs.
-            Return the requested information when asked.""",
-            server_names=["fetch", "filesystem"],
-        )
-
-        async with agent:
-            logger.info("Connected MCP servers", data=list(context.server_registry.registry.keys()))
-
-            llm = await agent.attach_llm(OpenAIAugmentedLLM)
-            result = await llm.generate_str(
-                "Print the contents of README.md verbatim; create it first if missing"
-            )
-            logger.info("README contents", data=result)
-
-            result = await llm.generate_str(
-                "Fetch the first two paragraphs from https://modelcontextprotocol.io/introduction"
-            )
-            logger.info("Fetched content", data=result)
-
-            tweet = await llm.generate_str(
-                "Summarize that content in a 140-character tweet"
-            )
-            logger.info("Tweet", data=tweet)
-            return tweet
-
-if __name__ == "__main__":
-    start = time.time()
-    asyncio.run(example_usage())
-    end = time.time()
-    print(f"Finished in {end - start:.2f}s")
-```
+## Deploy it (optional)

-## 5. Run it
+You can deploy your agent as an MCP server.

 ```bash
-uv run main.py
+uvx mcp-agent login
+uvx mcp-agent deploy
 ```

-You should see log entries for tool discovery, file access, web fetches, and the final summary tweet. Try editing the instructions or adding new MCP servers to see how the agent evolves.
-
 ## Next steps

 - Check out the generated README (if you used the CLI) for tips on extending the agent.
 - Layer in more capabilities using the [Effective Patterns](/mcp-agent-sdk/effective-patterns/overview) guide.
-- Ready to share it? Deploy with `uvx mcp-agent deploy my-agent` by following [Deploy to Cloud](/get-started/deploy-to-cloud).
+- Ready to deploy your agent? Follow [Deploy to Cloud](/get-started/deploy-to-cloud).
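The revised quickstart's "Run it locally" section suggests trying "adding new MCP servers to see how the agent evolves." As a minimal sketch of what that edit might look like (the `time` server below, launched via `uvx mcp-server-time`, is an assumed example and not part of this commit), a new server is just one more entry under `mcp.servers` in `mcp_agent.config.yaml`:

```yaml
# mcp_agent.config.yaml -- sketch only; the "time" server entry is an assumed example
mcp:
  servers:
    fetch:
      command: "uvx"
      args: ["mcp-server-fetch"]
    filesystem:
      command: "npx"
      args: ["-y", "@modelcontextprotocol/server-filesystem"]
    time:
      command: "uvx"
      args: ["mcp-server-time"]
```

The agent only connects to servers named in its `server_names` list, so `main.py` would also need `server_names=["fetch", "filesystem", "time"]` for the finder agent to pick up the new tools.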
