
Feature/lambdahook bedrock #11

Merged
merged 6 commits into develop from feature/lambdahook-bedrock on Oct 26, 2023

Conversation

rstrahan
Contributor

Description of changes:

Add the Plugin feature below to the Bedrock plugin:

(Optional) Use the LLM as a fallback source of answers, using Lambda hooks with CustomNoMatches/no_hits

Optionally configure QnABot to prompt the LLM directly by configuring the LLM Plugin LambdaHook function QnAItemLambdaHookFunctionName as a Lambda Hook for the QnABot CustomNoMatches no_hits item. When QnABot cannot answer a question by any other means, it reverts to the no_hits item, which, when configured with this Lambda Hook function, relays the question to the LLM.
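To make the relay behavior concrete, here is a minimal, hypothetical sketch of what such a Lambda Hook does. This is not the plugin's actual source: the event field paths (`req`/`res` keys, `llm_generated_query`) and the hard-coded prefix and model parameters are assumptions based on the description above and QnABot's hook conventions.

```python
import json
import boto3

# Assumption: "bedrock-runtime" client and the 2023 Anthropic text-completion
# API shape for anthropic.claude-instant-v1.
bedrock = boto3.client("bedrock-runtime")

def handler(event, context):
    # Assumption: prefer the LLM-generated (disambiguated) query when
    # LLM_QUERY_GENERATION is enabled, otherwise use the raw utterance.
    question = (
        event["req"].get("llm_generated_query", {}).get("result")
        or event["req"]["question"]
    )

    body = json.dumps({
        "prompt": f"\n\nHuman: {question}\n\nAssistant:",
        "max_tokens_to_sample": 256,
        "temperature": 0,
    })
    resp = bedrock.invoke_model(
        modelId="anthropic.claude-instant-v1", body=body
    )
    answer = json.loads(resp["body"].read())["completion"].strip()

    # Return the LLM's answer (with the configured prefix) as the
    # no_hits item's response.
    event["res"]["message"] = f"LLM Answer: {answer}"
    return event
```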

When your Plugin CloudFormation stack status is CREATE_COMPLETE, choose the Outputs tab. Look for the outputs QnAItemLambdaHookFunctionName and QnAItemLambdaHookArgs. Use these values in the LambdaHook section of your no_hits item. You can change the value of "Prefix", or use "None" if you don't want to prefix the LLM answer.
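For illustration, a configured no_hits item exported from Content Designer would look roughly like the sketch below. The "l" and "args" field names follow QnABot's item JSON schema as I understand it, and the function name is a placeholder for your stack's actual QnAItemLambdaHookFunctionName output value; treat the details as assumptions.

```json
{
  "qid": "CustomNoMatches",
  "q": ["no_hits"],
  "a": "Sorry, I don't know the answer to that.",
  "l": "<QnAItemLambdaHookFunctionName output value>",
  "args": ["{\"Prefix\": \"LLM Answer:\", \"Model_params\": {\"modelId\": \"anthropic.claude-instant-v1\", \"temperature\": 0}}"]
}
```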

The default behavior is to relay the user's query to the LLM as the prompt. If LLM_QUERY_GENERATION is enabled, the generated (disambiguated) query is used; otherwise the user's utterance is used. You can override this behavior by supplying an explicit "Prompt" key in the QnAItemLambdaHookArgs value. For example, setting QnAItemLambdaHookArgs to {"Prefix": "LLM Answer:", "Model_params": {"modelId": "anthropic.claude-instant-v1", "temperature": 0}, "Prompt": "Why is the sky blue?"} (shown formatted below) ignores the user's input and uses the configured prompt instead. Prompts supplied in this manner do not (yet) support variable substitution (e.g. to substitute user attributes, session attributes, etc. into the prompt). If you feel that would be a useful feature, please create a feature request issue in the repo, or, better yet, implement it and submit a Pull Request!
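Formatted for readability, that example override value is:

```json
{
  "Prefix": "LLM Answer:",
  "Model_params": {
    "modelId": "anthropic.claude-instant-v1",
    "temperature": 0
  },
  "Prompt": "Why is the sky blue?"
}
```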

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.

@rstrahan rstrahan merged commit 05a34b4 into develop Oct 26, 2023
@rstrahan rstrahan deleted the feature/lambdahook-bedrock branch October 26, 2023 19:40