Add reference to LiteLLM Proxy
hungdtrn authored Feb 11, 2024
1 parent 7e36928 commit 661654f
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion llm_assistant/ollama/README.md
@@ -22,7 +22,7 @@ litellm --config llm_assistant/ollama/proxy_config.yaml
```


- Modify the OpenAI SDK to interact with our proxy server instead. In `openai_chat` we use the OpenAI SDK to talk to the proxy server. Although the SDK requires an API key, we **don't need a real one here**: we can use **any string value** for the API key, because we are interacting with the proxy server, not the OpenAI server.
+ Modify the OpenAI SDK to interact with our proxy server instead. In `openai_chat` we use the OpenAI SDK to talk to the proxy server. Although the SDK requires an API key, we **don't need a real one here**: we can use **any string value** for the API key, because we are interacting with the proxy server, not the OpenAI server. Please see the "Quick Start Proxy - CLI" section of the [LiteLLM repository](https://github.com/BerriAI/litellm) for more information.
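
For illustration, here is a minimal sketch of how the OpenAI SDK can be pointed at the LiteLLM proxy with a placeholder key. This is not the repository's `openai_chat.py`; the proxy address and port are assumptions and should match however you started `litellm`, and the model name should match your `proxy_config.yaml`.

```python
from openai import OpenAI

# Minimal sketch, not the repository's openai_chat.py.
client = OpenAI(
    base_url="http://0.0.0.0:4000",  # assumed LiteLLM proxy address, not api.openai.com
    api_key="any-string-works",      # placeholder; the proxy does not validate this value
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # model name as exposed by the proxy config
    messages=[{"role": "user", "content": "Hello from behind the LiteLLM proxy!"}],
)
print(response.choices[0].message.content)
```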

```shell
python openai_chat.py gpt-3.5-turbo
