2 files changed, +4 −4 lines changed

@@ -20,7 +20,6 @@ This example uses Docker Compose to orchestrate:
- **Inference Gateway** - Main API gateway with MCP support enabled
- **MCP Filesystem Server** - Provides file system operations (restricted to `/shared` and `/tmp`)
- **MCP Web Search Server** - Simulated web search and URL fetching
- - **Optional Ollama** - Local model inference (when using `--profile with-ollama`)

## Important: Filesystem Access
@@ -49,7 +48,8 @@ The `/shared` directory contains example files for testing:
Make sure the environment is configured:

```bash
- cp .env.example .env
+ # From the .env file, grab one of the provider keys and export it
+ export OPENAI_API_KEY=your_openai_api_key
```
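The added lines above replace a file copy with a direct export. A minimal sketch of checking that the export actually reached the environment (assuming `OPENAI_API_KEY` is the key you chose; the value here is a placeholder, as in the step above):

```shell
# Placeholder value from the step above; replace with a real key.
export OPENAI_API_KEY=your_openai_api_key
# Exported variables are inherited by child processes,
# which is how the gateway container will see the key.
sh -c 'test -n "$OPENAI_API_KEY" && echo "key exported"'
```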
### 1. Start the MCP Infrastructure
@@ -78,7 +78,7 @@ Set your preferred provider and model:
```bash
export PROVIDER=groq
- export LLM=meta-llama/llama-3.3-70b-versatile
+ export LLM=qwen-qwq-32b
```
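Since both variables are read from the environment, a quick sanity check that they are set and non-empty can save a confusing startup failure (a sketch using the `PROVIDER`/`LLM` names from this README):

```shell
export PROVIDER=groq
export LLM=qwen-qwq-32b
# Print each variable; an empty one simply produces no output line.
for v in PROVIDER LLM; do
  eval val=\$$v
  test -n "$val" && echo "$v=$val"
done
```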
Or for OpenAI:
@@ -93,7 +93,7 @@ export LLM=gpt-4o
Test that MCP tools are working correctly:

```bash
- npx tsx test-mcp-tools.ts
+ npm run example:mcp:remotetools
```
### 5. Run Examples