docs(examples): Enhance SDK with new examples and environment configuration (#2)
* docs(examples): Enhance SDK with new examples and environment configuration
Signed-off-by: Eden Reich <eden.reich@gmail.com>
* docs: Update README and examples to clarify tool listing and usage in SDK
Signed-off-by: Eden Reich <eden.reich@gmail.com>
* fix: Update client initialization URLs to include /v1 as the base URL for consistency
Signed-off-by: Eden Reich <eden.reich@gmail.com>
* docs(tools-use): Enhance README and examples with tool usage and definitions
Signed-off-by: Eden Reich <eden.reich@gmail.com>
* docs(examples): Enhance chat completion examples with error handling and streaming support
Signed-off-by: Eden Reich <eden.reich@gmail.com>
* docs(examples): Update README files to enhance clarity and usage instructions
Signed-off-by: Eden Reich <eden.reich@gmail.com>
* docs: Enhance MCP tools section in README with server-side management details
Signed-off-by: Eden Reich <eden.reich@gmail.com>
---------
Signed-off-by: Eden Reich <eden.reich@gmail.com>
A modern Python SDK for interacting with the [Inference Gateway](https://github.com/edenreich/inference-gateway), providing a unified interface to multiple AI providers.
@@ -41,17 +43,17 @@ pip install inference-gateway
### Basic Usage

```python
-from inference_gateway import InferenceGatewayClient, Message, MessageRole
+from inference_gateway import InferenceGatewayClient, Message
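The hunk above shows only the import change from this snippet. For orientation, here is a minimal basic-usage sketch consistent with those imports; the chat-completion method name, the `Message` field names, and the model identifier are assumptions for illustration and may differ from the SDK's actual API.

```python
from inference_gateway import InferenceGatewayClient, Message

# Use the /v1 base URL, matching the URL fix in this commit
client = InferenceGatewayClient("http://localhost:8080/v1")

# Role passed as a plain string -- an assumption, consistent with MessageRole
# being dropped from the import above
messages = [
    Message(role="system", content="You are a helpful assistant."),
    Message(role="user", content="Say hello in one sentence."),
]

# Hypothetical method and model name, shown for illustration only;
# see the repository's README and examples for the actual call
response = client.create_chat_completion(model="openai/gpt-4o", messages=messages)
print(response)
```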
@@ ... @@
+# List available MCP tools (requires MCP_ENABLE and MCP_EXPOSE to be set on the gateway)
+tools = client.list_tools()
+print("Available tools:", tools)
+```
+
+**Server-Side Tool Management**
+
+The SDK currently supports listing available MCP tools, which is particularly useful for UI applications that need to display connected tools to users. The key advantage is that tools are managed server-side:
+
+- **Automatic Tool Injection**: Tools are automatically inferred and injected into requests by the Inference Gateway server
+- **Simplified Client Code**: No need to manually manage or configure tools in your client application
+- **Transparent Tool Calls**: During streaming chat completions with configured MCP servers, tool calls appear in the response stream - no special handling required except optionally displaying them to users
+
+This architecture allows you to focus on LLM interactions while the gateway handles all tool management complexities behind the scenes.
+
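To make the three points above concrete, here is a rough client-side sketch: the tools are listed purely for display, and the streaming response is consumed as usual, with any tool calls simply appearing among the streamed chunks. The streaming method name and the model identifier are assumptions for illustration; only `InferenceGatewayClient`, `Message`, and `list_tools()` are taken from this diff.

```python
from inference_gateway import InferenceGatewayClient, Message

client = InferenceGatewayClient("http://localhost:8080/v1")

# Tools are configured on the gateway (MCP_ENABLE / MCP_EXPOSE);
# the client only lists them, e.g. to show users what is connected
tools = client.list_tools()
print("Connected tools:", tools)

# Stream a chat completion as usual; if the gateway invokes an MCP tool,
# the tool-call events arrive in the same stream as ordinary content chunks,
# so no special handling is needed beyond optionally displaying them.
# NOTE: method and model names below are assumed for illustration.
messages = [Message(role="user", content="Which files are in the workspace?")]
for chunk in client.create_chat_completion_stream(model="openai/gpt-4o", messages=messages):
    print(chunk)
```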
### Custom HTTP Configuration

```python
# With custom headers
client = InferenceGatewayClient(
-    "http://localhost:8080",
+    "http://localhost:8080/v1",
    headers={"X-Custom-Header": "value"}
)

# With proxy settings
client = InferenceGatewayClient(
-    "http://localhost:8080",
+    "http://localhost:8080/v1",
    proxies={"http": "http://proxy.example.com"}
)
```

+## Examples
+
+For comprehensive examples demonstrating various use cases, see the [examples](examples/) directory:
+
+- [List LLMs](examples/list/) - How to list available models
+- [Chat](examples/chat/) - Basic and advanced chat completion examples
+- [Tools](examples/tools/) - Working with function tools
+- [MCP](examples/mcp/) - Model Context Protocol integration examples
+
+Each example includes a detailed README with setup instructions and explanations.

## License

This SDK is distributed under the MIT License, see [LICENSE](LICENSE) for more information.