
Updated the documentation and code on how to use LiteLLM to create a unified interface for both ChatGPT and Ollama #10

Merged: 13 commits into main, Feb 11, 2024

Conversation

hungdtrn
Contributor

Initially, I considered using LiteLLM to replace the OpenAI SDK for interacting with both ChatGPT and Ollama. However, as suggested by @phattantran1997, we can improve on this approach.

Instead, let's create a proxy server that wraps both ChatGPT and Ollama. We can then keep using the OpenAI SDK and simply point it at this proxy server instead of the base OpenAI server. This approach is better because it allows us to integrate Ollama with existing OpenAI applications without requiring any code changes.
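For illustration, here is a minimal client-side sketch of that idea: the unmodified OpenAI SDK talking to a LiteLLM proxy. The host, port, API key, and model name are assumptions for the example, not values taken from this PR.

```python
from openai import OpenAI

# Point the standard OpenAI SDK at the LiteLLM proxy instead of api.openai.com.
# base_url, port, and api_key here are placeholders.
client = OpenAI(base_url="http://localhost:4000", api_key="sk-anything")

# The proxy decides whether this model name is backed by ChatGPT or Ollama,
# so existing OpenAI SDK code works without modification.
response = client.chat.completions.create(
    model="llama2",
    messages=[{"role": "user", "content": "Hello from behind the proxy!"}],
)
print(response.choices[0].message.content)
```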

As for Docker, we need to set up the proxy server inside the Ollama container and interact with that proxy server instead. @samhwang @nqngo
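As a rough sketch of what that setup could look like (not the actual files in this PR), the LiteLLM proxy can be driven by a config.yaml that maps public model names to an OpenAI backend and the local Ollama backend, then started from the CLI inside the container. The model names, addresses, and port below are illustrative assumptions.

```yaml
# Hypothetical LiteLLM proxy config (config.yaml).
model_list:
  - model_name: gpt-3.5-turbo           # served by OpenAI / ChatGPT
    litellm_params:
      model: openai/gpt-3.5-turbo
      api_key: os.environ/OPENAI_API_KEY
  - model_name: llama2                  # served by the local Ollama instance
    litellm_params:
      model: ollama/llama2
      api_base: http://localhost:11434  # Ollama's default port

# Inside the Ollama container, the proxy could then be started with, e.g.:
#   litellm --config config.yaml --port 4000
```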

@samhwang
Contributor

samhwang commented Feb 10, 2024

Please keep the pre-commit-related changes out of this PR. They aren't related to the LiteLLM changes.

@hungdtrn
Contributor Author

> Please keep the pre-commit-related changes out of this PR. They aren't related to the LiteLLM changes.

Thanks for reminding me. I have reverted those changes back to the previous commit.

@samhwang
Contributor

One final nitpick from me and then it's good to go. @nqngo, wanna give it a once-over?

@hungdtrn merged commit d078140 into main on Feb 11, 2024
1 check passed
@hungdtrn
Contributor Author

@phattantran1997 Please have a look at [llm_assistant/ollama/README.md] to get a better understanding of how the LiteLLM proxy server works.
