
add kwargs support to Baseten LLMs #8082

Closed
wants to merge 0 commits into from

Conversation

philipkiely-baseten
Contributor

This bugfix PR adds kwargs support to Baseten model invocations so that, for example, the following script works properly:

```python
chatgpt_chain = LLMChain(
    llm=Baseten(model="MODEL_ID"),
    prompt=prompt,
    verbose=False,
    memory=ConversationBufferWindowMemory(k=2),
    llm_kwargs={"max_length": 4096}
)
```
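To illustrate the mechanism this PR enables, here is a minimal, self-contained sketch of how an LLM wrapper can forward per-call keyword arguments (such as `max_length`) into the model request payload. The class and method names below are illustrative stand-ins, not the actual LangChain or Baseten source.

```python
# Illustrative sketch: an LLM wrapper that forwards extra keyword
# arguments into the request payload sent to the model endpoint.
# "BasetenLikeLLM" is a hypothetical stand-in, not the real class.

class BasetenLikeLLM:
    """Stand-in for an LLM wrapper that accepts per-call kwargs."""

    def __init__(self, model: str):
        self.model = model

    def _call(self, prompt: str, **kwargs) -> dict:
        # Merge caller-supplied kwargs (e.g. max_length) into the
        # request body alongside the prompt.
        payload = {"prompt": prompt, **kwargs}
        # A real implementation would POST `payload` to the Baseten
        # model endpoint; here we just return it for demonstration.
        return payload

llm = BasetenLikeLLM(model="MODEL_ID")
result = llm._call("Hello", max_length=4096)
print(result)
```

Without kwargs forwarding, parameters like `max_length` passed via `llm_kwargs` would be silently dropped before reaching the model; with it, they travel through to the invocation.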

@dosubot dosubot bot added the 🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature label Jul 21, 2023
@philipkiely-baseten
Contributor Author

Closed to switch from /langchain/llms to /lib/langchain/llms.

New version: #8091

Labels
🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature