Checklist

1. I have searched related issues but cannot get the expected help.
2. The bug has not been fixed in the latest version.
3. Please note that if the bug-related issue you submitted lacks corresponding environment information and a minimal reproducible demo, it will be challenging for us to reproduce and resolve the issue, reducing the likelihood of receiving feedback.
Describe the bug
The outputs are not consistent across different `max_prefill_token_num` values for long-context input on the PyTorch engine. It seems the model does not stop correctly when `max_prefill_token_num <= 512`.
Reproduction
Run script
Results
max_prefill_token_num=1024
max_prefill_token_num=512
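The run script itself is not shown above; a minimal sketch of the comparison it performs might look like the following, assuming LMDeploy's `pipeline` / `PytorchEngineConfig` API (the model path, prompt, and exact-match check are placeholders, not from the original report):

```python
def outputs_match(a: str, b: str) -> bool:
    # Consistency check used below: here simply exact match after
    # trimming whitespace. A looser check (e.g. common prefix) could
    # also be used, since sampling noise is not the issue reported.
    return a.strip() == b.strip()


def run_with_prefill(model_path: str, prompt: str,
                     max_prefill_token_num: int) -> str:
    # Assumed LMDeploy usage: build a PyTorch-engine pipeline with the
    # given max_prefill_token_num and generate once for the prompt.
    # Import is deferred so the helper above stays usable without lmdeploy.
    from lmdeploy import pipeline, PytorchEngineConfig

    cfg = PytorchEngineConfig(max_prefill_token_num=max_prefill_token_num)
    pipe = pipeline(model_path, backend_config=cfg)
    return pipe([prompt])[0].text
```

A reproduction would then call `run_with_prefill(model, long_prompt, 1024)` and `run_with_prefill(model, long_prompt, 512)` with the same long-context prompt and compare the two generations with `outputs_match`; per the report, the 512 run fails to stop correctly.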
Environment
Error traceback
No response